Compare commits

...

31 Commits

Author SHA1 Message Date
Yuan Teoh
2e1ac73c0c draft: add invoke interface to postgres source 2025-10-06 11:22:02 -07:00
Yuan Teoh
cbd72c2ea1 chore: update logger out stream to errStream in stdio transport (#1466)
## Description

---
Update the logger output stream for the `stdio` transport protocol.


**Before:** When `stdio=true`, the log level was automatically forced to
WARN (even if the user explicitly set DEBUG or INFO), because the DEBUG
and INFO loggers sent their output to outStream. Under `stdio`, the server
MUST NOT write anything to `stdout` that is not a valid MCP message.

The MCP specification states that "The server MAY write UTF-8 strings to
its standard error (stderr) for logging purposes. Clients MAY capture,
forward, or ignore this logging."

**After:** When `stdio=true`, the DEBUG and INFO loggers send their output
to errStream, so the `log-levels` setting chosen by the user (default INFO)
is still respected.
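
A minimal Go sketch of the idea, not the Toolbox logger implementation (names and structure here are illustrative): when the stdio transport is active, DEBUG and INFO output is routed to stderr so that stdout carries only MCP protocol messages, while the user-selected log level is still honored.

```go
package main

import (
	"log/slog"
	"os"
)

// newLogger is illustrative, not the Toolbox API: when the stdio transport is
// active, every level (including DEBUG and INFO) writes to stderr, leaving
// stdout free for valid MCP messages.
func newLogger(stdio bool, level slog.Level) *slog.Logger {
	out := os.Stdout
	if stdio {
		out = os.Stderr // stdout must carry only valid MCP messages
	}
	return slog.New(slog.NewTextHandler(out, &slog.HandlerOptions{Level: level}))
}

func main() {
	logger := newLogger(true, slog.LevelDebug) // user-selected level is respected
	logger.Debug("debug output goes to stderr under stdio transport")
	logger.Info("info output too")
}
```
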
2025-09-30 13:39:33 -07:00
Mend Renovate
d64b4812c3 chore(deps): update module cloud.google.com/go/bigquery to v1.71.0 (#1604)
This PR contains the following updates:

| Package | Change | Age | Confidence |
|---|---|---|---|
|
[cloud.google.com/go/bigquery](https://redirect.github.com/googleapis/google-cloud-go)
| `v1.70.0` -> `v1.71.0` |
[![age](https://developer.mend.io/api/mc/badges/age/go/cloud.google.com%2fgo%2fbigquery/v1.71.0?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|
[![confidence](https://developer.mend.io/api/mc/badges/confidence/go/cloud.google.com%2fgo%2fbigquery/v1.70.0/v1.71.0?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).

2025-09-30 19:31:43 +00:00
manuka rahul
dbc477ab0f fix: added google_ml_integration extension to use alloydb ai-nl support api (#1445)
Including the google_ml_integration extension made the add_template
function work because it enabled the AI Query Engine features within the
Cloud SQL for PostgreSQL database.


http://github.com/googleapis/genai-toolbox/blob/test-genai/docs/en/samples/alloydb/ai-nl/alloydb_ai_nl.ipynb

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: Manu Paliwal <paliwalmanu99@gmail.com>
Co-authored-by: Pete Hampton <pjhampton@protonmail.com>
Co-authored-by: Pete Hampton <pjhampton@users.noreply.github.com>
Co-authored-by: Averi Kitsch <akitsch@google.com>
Co-authored-by: Wenxin Du <117315983+duwenxin99@users.noreply.github.com>
Co-authored-by: Sri Varshitha <96117854+Myst9@users.noreply.github.com>
Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
Co-authored-by: Twisha Bansal <58483338+twishabansal@users.noreply.github.com>
Co-authored-by: Mend Renovate <bot@renovateapp.com>
Co-authored-by: prernakakkar-google <158031829+prernakakkar-google@users.noreply.github.com>
Co-authored-by: Huan Chen <142538604+Genesis929@users.noreply.github.com>
Co-authored-by: Ajaykumar Yadav <akryadav@google.com>
Co-authored-by: trehanshakuntG <trehanshakunt@google.com>
Co-authored-by: Dr. Strangelove <drstrangelove@google.com>
Co-authored-by: nester-neo4j <nester.marchenko@neo4j.com>
Co-authored-by: Haoming Chen <hmchen@google.com>
Co-authored-by: isaurabhuttam <118341467+isaurabhuttam@users.noreply.github.com>
Co-authored-by: Harsh Jha <83023263+rapid-killer-9@users.noreply.github.com>
Co-authored-by: shuzhou-gc <zhoushu@google.com>
Co-authored-by: Divyansh <dvbelieve.nitk@gmail.com>
Co-authored-by: Jo Alex <17249308+johanesalxd@users.noreply.github.com>
Co-authored-by: Anmol Shukla <shuklaanmol@google.com>
Co-authored-by: dishaprakash <57954147+dishaprakash@users.noreply.github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: release-please[bot] <55107282+release-please[bot]@users.noreply.github.com>
Co-authored-by: Harsh Jha <harshrjha@google.com>
Co-authored-by: Pranava B <pranava018@gmail.com>
Co-authored-by: duwenxin <duwenxin@google.com>
Co-authored-by: Anubhav Dhawan <anubhavdhawan@google.com>
2025-09-30 13:45:48 +05:30
Sri Varshitha
918753d396 chore(tests): Add database cleanup functions (#1528)
## Description

---
This change adds helper functions to clean up the test databases
(PostgreSQL, MySQL, and MSSQL) by dropping any orphan tables from
previous test runs. These functions can be called at the beginning of
each test run.
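
As a rough illustration of the approach (not the helper added in this PR; the function name and the table-prefix convention are assumptions), a PostgreSQL cleanup could look like:

```go
package cleanup

import (
	"context"
	"database/sql"
	"fmt"
)

// dropOrphanTestTables is an illustrative sketch: it drops leftover tables
// from previous test runs, assuming test tables share a recognizable prefix
// and db is an open connection with a registered PostgreSQL driver.
func dropOrphanTestTables(ctx context.Context, db *sql.DB, prefix string) error {
	rows, err := db.QueryContext(ctx,
		`SELECT tablename FROM pg_tables
		 WHERE schemaname = 'public' AND tablename LIKE $1`, prefix+"%")
	if err != nil {
		return fmt.Errorf("listing orphan tables: %w", err)
	}
	defer rows.Close()

	var orphans []string
	for rows.Next() {
		var name string
		if err := rows.Scan(&name); err != nil {
			return err
		}
		orphans = append(orphans, name)
	}
	if err := rows.Err(); err != nil {
		return err
	}

	for _, name := range orphans {
		// Names come from pg_tables and are quoted as SQL identifiers.
		stmt := fmt.Sprintf(`DROP TABLE IF EXISTS %q CASCADE`, name)
		if _, err := db.ExecContext(ctx, stmt); err != nil {
			return fmt.Errorf("dropping %s: %w", name, err)
		}
	}
	return nil
}
```
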

## PR Checklist

---
> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [x] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [x] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #1304

---------

Co-authored-by: Wenxin Du <117315983+duwenxin99@users.noreply.github.com>
2025-09-30 08:56:58 +05:30
Yuan Teoh
b43c94575d fix: remove duplicated build type in Dockerfile (#1598)
## Description

---
Removes the duplicated `container.` build type.

## PR Checklist

---
> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [x] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [x] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #1550
2025-09-29 15:34:30 -07:00
Anubhav Dhawan
87b385b2b5 docs: Remove obsolete Homebrew entry in README TOC (#1594)
Fixes https://github.com/googleapis/genai-toolbox/issues/1590

Removes the unused link for Homebrew users from the TOC in README.md.
2025-09-29 09:33:42 -07:00
Anubhav Dhawan
d154370cef docs: Improve binary installation instructions for multiple platforms (#1576)
# Overview
The previous installation instructions for the Toolbox binary only
provided a curl command for Linux AMD64 systems. This was confusing for
users on other platforms like macOS and Windows, who had to manually
figure out the correct download URL from the releases page.

This PR overhauls the installation process in both the `README.md` and
the official documentation site. It provides dedicated, copy-pasteable
commands for all major platforms and makes the download commands more
reliable.

# Changes
* Replaced the single, Linux-only installation command with a
multi-level tabbed/collapsible interface in both the `README.md` and the
official documentation.
* Added dedicated installation commands for multiple platforms,
including Windows (AMD64), which now uses the native `Invoke-WebRequest`
in PowerShell.
* Updated all `curl` commands to include the `-L` flag.
* This makes the download more robust by automatically following any
HTTP redirects.

# Before
## `README.md`
<img width="1870" height="868" alt="image"
src="https://github.com/user-attachments/assets/28b714b5-1938-42cb-904b-3a4d61f1c56a"
/>

## Docsite
<img width="1728" height="788" alt="image"
src="https://github.com/user-attachments/assets/b63d579d-ce8f-4974-9860-1fee5a8d0dc7"
/>

# After
## `README.md`
<img width="1814" height="1082" alt="image"
src="https://github.com/user-attachments/assets/54985eaf-c721-4f11-86d2-cc027aeea0ad"
/>

## Docsite
<img width="2066" height="1112" alt="image"
src="https://github.com/user-attachments/assets/a07380b7-3b38-4b99-a95e-f942356f2200"
/>
2025-09-29 12:22:16 +05:30
Yuan Teoh
0b3dac4132 feat: add metadata in MCP Manifest for Toolbox auth (#1395)
Add `_meta` for `tools/list` method in MCP Toolbox.

If there are authenticated parameters, the following will be returned in
`_meta`:
```
{
    "name":"my-tool-name",
    "description":"my tool description",
     "inputSchema":{
        "type":"object",
         "properties":{
             "user_id":{"type":"string","description":"user's name from google login"}
         },
         "required":["user_id"]
     },
     "_meta":{
         "toolbox/authParam":{"user_id":["my_auth"]}
     }
}
```

If there is an authorized invocation, the following will be returned in
`_meta`:
```
{
    "name":"my-tool-name",
    "description":"my tool description",
    "inputSchema":{
        "type":"object",
        "properties":{
            "sql":{"type":"string","description":"The sql to execute."}
        },
        "required":["sql"]
    },
    "_meta":{
        "toolbox/authInvoke":["my_auth"]
    }
}
```

If there is no authorized invocation or authenticated parameter, the
`_meta` field is omitted.


With this feature, the following were updated in the source code (a rough
sketch of the resulting `_meta` shape follows the list):
* Each `func(p CommonParameter) McpManifest()` now also returns a
`[]string` with the list of authenticated parameters, similar to how
Manifest() returns the list of authNames in the non-MCP Toolbox
manifest.
* `func(ps Parameters) McpManifest()` now returns a
`map[string][]string` keyed by the param's name, with the param's auth
services as the value.
* Added a new function `GetMcpManifest()` in `tools.go`. This function
constructs the McpManifest and adds the `Metadata` field.
* Associated tests were added or updated.
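
A rough Go sketch of the manifest shape described above; field and function names beyond `_meta`, `toolbox/authParam`, and `toolbox/authInvoke` are assumptions rather than the actual Toolbox types:

```go
package tools

// McpManifest sketches the shape only; the real struct lives in tools.go.
type McpManifest struct {
	Name        string         `json:"name"`
	Description string         `json:"description"`
	InputSchema map[string]any `json:"inputSchema"`
	Metadata    map[string]any `json:"_meta,omitempty"`
}

// buildMeta mirrors the description: authParams maps a parameter name to the
// auth services that may supply it (toolbox/authParam), and authInvoke lists
// the auth services allowed to invoke the tool (toolbox/authInvoke). When
// both are empty, nil is returned and `_meta` is omitted via omitempty.
func buildMeta(authParams map[string][]string, authInvoke []string) map[string]any {
	meta := map[string]any{}
	if len(authParams) > 0 {
		meta["toolbox/authParam"] = authParams
	}
	if len(authInvoke) > 0 {
		meta["toolbox/authInvoke"] = authInvoke
	}
	if len(meta) == 0 {
		return nil
	}
	return meta
}
```
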
2025-09-26 17:48:57 -07:00
Dr. Strangelove
62af39d751 fix(tools/looker): Refactor run-inline-query logic to helper function (#1497)
Inline queries are directed to an undocumented endpoint so that they can
be tracked in Looker System Activity separately. The logic is to try the
undocumented endpoint first, and if there is any error, fall back to the
documented endpoint.

This was done in multiple places. This PR combines that logic into one
helper function, reducing the duplication of code.
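
A hedged sketch of the consolidated fallback pattern (not the actual Looker tool code; the endpoint callables are placeholders):

```go
package looker

import "context"

// runInlineQuery illustrates the helper described above: try the
// undocumented endpoint first (so queries are tracked separately in Looker
// System Activity) and fall back to the documented endpoint on any error.
func runInlineQuery(
	ctx context.Context,
	undocumented func(context.Context) (string, error),
	documented func(context.Context) (string, error),
) (string, error) {
	if result, err := undocumented(ctx); err == nil {
		return result, nil
	}
	// Any failure on the undocumented endpoint falls through to the
	// documented one, matching the behavior described in the commit.
	return documented(ctx)
}
```
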
2025-09-26 11:11:50 -07:00
Dr. Strangelove
67d8221a2e fix(sources/looker): Allow Looker to be configured without setting a Client Id or Secret (#1496)
Fixed a minor oversight from previous work on oauth.
2025-09-26 10:10:56 -07:00
Mend Renovate
ab7d313f7f chore(deps): update dependency google-adk to v1.15.0 (#1575)
This PR contains the following updates:

| Package | Change | Age | Confidence |
|---|---|---|---|
| [google-adk](https://redirect.github.com/google/adk-python)
([changelog](https://redirect.github.com/google/adk-python/blob/main/CHANGELOG.md))
| `==1.14.1` -> `==1.15.0` |
[![age](https://developer.mend.io/api/mc/badges/age/pypi/google-adk/1.15.0?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|
[![confidence](https://developer.mend.io/api/mc/badges/confidence/pypi/google-adk/1.14.1/1.15.0?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|

---

### Release Notes

<details>
<summary>google/adk-python (google-adk)</summary>

###
[`v1.15.0`](https://redirect.github.com/google/adk-python/blob/HEAD/CHANGELOG.md#1150-2025-09-24)

[Compare
Source](https://redirect.github.com/google/adk-python/compare/v1.14.1...v1.15.0)

##### Features

- \[Core]
- Adding the ContextFilterPlugin
([a06bf27](a06bf278cb))
- Adds plugin to save artifacts for issue
[#&#8203;2176](https://redirect.github.com/google/adk-python/issues/2176)
([657369c](657369cffe))
- Expose log probs of candidates in LlmResponse
([f7bd3c1](f7bd3c111c))
- \[Context Caching]
- Support context caching
([c66245a](c66245a3b8))
- Support explicit context caching auto creation and lifecycle
management.

Usage: `App(root_agent=..., plugins=..., context_cache_config=...)`
- Support non-text content in static instruction
([61213ce](61213ce4d4))
- Support static instructions
([9be9cc2](9be9cc2fee))
- Support static instruction that won't change, put at the beginning of
      the instruction.
Static instruction support inline\_data and file\_data as contents.
Dynamic instruction moved to the end of LlmRequest, increasing prefix
      caching matching size.

      Usage:
`LlmAgent(model=...,static_instruction =types.Content(parts=...), ... )`
- \[Telemetry]
- Add --otel\_to\_cloud experimental support
([1ae0b82](1ae0b82f56),
[b131268](b1312680f4),
[7870480](7870480c63))
- Add GenAI Instrumentation if --otel\_to\_cloud is enabled
([cee365a](cee365a13d))
- Support standard OTel env variables for exporter endpoints
([f157b2e](f157b2ee4c))
- Temporarily disable Cloud Monitoring integration in --otel\_to\_cloud
([3b80337](3b80337faf))
- \[Services]
- Add endpoint to generate memory from session
([2595824](25958242db))
- \[Tools]
- Add Google Maps Grounding Tool to ADK
([6b49391](6b49391546))
- **MCP:** Initialize tool\_name\_prefix in MCPToolse
([86dea5b](86dea5b53a))
- \[Evals]
- Data model for storing App Details and data model for steps
([01923a9](01923a9227))
- Adds Rubric based final response evaluator
([5a485b0](5a485b01cd))
- Populate AppDetails to each Invocation
([d486795](d48679582d))
- \[Samples]
- Make the bigquery sample agent run with ADC out-of-the-box
([10cf377](10cf377494))

##### Bug Fixes

- Close runners after running eval
([86ee6e3](86ee6e3fa3))
- Filter out thought parts when saving agent output to state
([632bf8b](632bf8b0bc))
- Ignore empty function chunk in LiteLlm streaming response
([8a92fd1](8a92fd18b6))
- Introduces a `raw_mcp_tool` method in `McpTool` to provide direct
access to the underlying MCP tool
([6158075](6158075a65))
- Make a copy of the `columns` instead of modifying it in place
([aef1ee9](aef1ee97a5))
- Prevent escaping of Latin characters in LLM response
([c9ea80a](c9ea80af28))
- Retain the consumers and transport registry when recreating the
ClientFactory in remote\_a2a\_agent.py
([6bd33e1](6bd33e1be3))
- Remove unsupported 'type': 'unknown' in test\_common.py for fastapi
0.117.1
([3745221](374522197f))

##### Documentation

- Correct the documentation of `after_agent_callback`
([b9735b2](b9735b2193))

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).

2025-09-26 20:04:03 +05:30
release-please[bot]
964a82eb08 chore(main): release 0.16.0 (#1530)
🤖 I have created a release *beep* *boop*
---


##
[0.16.0](https://github.com/googleapis/genai-toolbox/compare/v0.15.0...v0.16.0)
(2025-09-25)


### ⚠ BREAKING CHANGES

* **tool/bigquery-execute-sql:** add allowed datasets support
([#1443](https://github.com/googleapis/genai-toolbox/issues/1443))
* **tool/bigquery-forecast:** add allowed datasets support
([#1412](https://github.com/googleapis/genai-toolbox/issues/1412))

### Features

* **cassandra:** Add Cassandra Source and Tool
([#1012](https://github.com/googleapis/genai-toolbox/issues/1012))
([6e42053](6e420534ee))
* **sources/postgres:** Add application_name
([#1504](https://github.com/googleapis/genai-toolbox/issues/1504))
([72a2366](72a2366b28))
* **tool/bigquery-execute-sql:** Add allowed datasets support
([#1443](https://github.com/googleapis/genai-toolbox/issues/1443))
([9501ebb](9501ebbdbc))
* **tool/bigquery-forecast:** Add allowed datasets support
([#1412](https://github.com/googleapis/genai-toolbox/issues/1412))
([88bac7e](88bac7e36f))
* **tools/clickhouse-list-tables:** Add list-tables tool
([#1446](https://github.com/googleapis/genai-toolbox/issues/1446))
([69a3caf](69a3cafabe))


### Bug Fixes

* **tool/mongodb-find:** Fix find tool `limit` field
([#1570](https://github.com/googleapis/genai-toolbox/issues/1570))
([4166bf7](4166bf7ab8))
* **tools/mongodb:** Concat filter params only once in mongodb update
tools ([#1545](https://github.com/googleapis/genai-toolbox/issues/1545))
([295f9dc](295f9dc41a))

---
This PR was generated with [Release
Please](https://github.com/googleapis/release-please). See
[documentation](https://github.com/googleapis/release-please#release-please).

---------

Co-authored-by: release-please[bot] <55107282+release-please[bot]@users.noreply.github.com>
Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2025-09-25 18:05:14 -07:00
Yuan Teoh
cb692b5883 tests: add mssql prebuilt tests to cloud sql mssql integration (#1501)
## Description

---
Add mssql's prebuilt tools tests to cloud-sql-mssql integration tests.

The Cloud SQL MSSQL integration tests check coverage against the mssql
package, since those tools are compatible. Hence, whenever we add new tools
to mssql, the corresponding integration tests must also be added for Cloud
SQL MSSQL.
2025-09-25 22:37:09 +00:00
Wenxin Du
4166bf7ab8 fix(tool/mongodb-find): fix find tool limit field (#1570)
The projection-checking block in `getOptions` previously exited early,
resulting in the limit never being set.

🛠️ Fixes https://github.com/googleapis/genai-toolbox/issues/1491
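
A simplified sketch of the corrected control flow (types and fields are stand-ins, not the real mongodb-find options):

```go
package mongodbfind

// findOptions is a stand-in for the real options struct; the shape is assumed.
type findOptions struct {
	Projection map[string]int
	Limit      *int64
}

// getOptions sketches the corrected flow: previously the projection branch
// returned early, so the limit handling below it was never reached.
func getOptions(projection map[string]int, limit *int64) findOptions {
	opts := findOptions{}
	if len(projection) > 0 {
		opts.Projection = projection
		// Bug: an early `return opts` here skipped setting the limit.
	}
	if limit != nil {
		opts.Limit = limit
	}
	return opts
}
```
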
2025-09-25 22:16:48 +00:00
Yuan Teoh
8dfcbfd5b3 chore: release 0.16.0 (#1569)
Release-As: 0.16.0
2025-09-25 22:03:54 +00:00
Averi Kitsch
72a2366b28 feat(sources/postgres): add application_name (#1504)
2025-09-25 14:12:07 -07:00
Huan Chen
9501ebbdbc feat(tool/bigquery-execute-sql)!: add allowed datasets support (#1443)
## Description
This introduces a breaking change. The bigquery-execute-sql tool will
now enforce the allowed datasets setting from its BigQuery source
configuration. Previously, this setting had no effect on the tool.
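
Conceptually, the enforcement looks something like the sketch below (a hypothetical illustration, not the tool's code; extracting the datasets referenced by the SQL is out of scope here):

```go
package bigquerytool

import "fmt"

// checkAllowedDatasets illustrates the idea: every dataset referenced by a
// query must appear in the source's allowed-datasets list before the SQL is
// executed. An empty allow-list means no restriction is configured.
func checkAllowedDatasets(referenced []string, allowed map[string]bool) error {
	if len(allowed) == 0 {
		return nil // no restriction configured on the source
	}
	for _, ds := range referenced {
		if !allowed[ds] {
			return fmt.Errorf("dataset %q is not in the allowed datasets for this source", ds)
		}
	}
	return nil
}
```
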

🛠️ Fixes https://github.com/googleapis/genai-toolbox/issues/873

---------

Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2025-09-25 19:08:50 +00:00
Mend Renovate
a3bd2e9927 chore(deps): update google.golang.org/genproto digest to 9219d12 (#1375)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
|
[google.golang.org/genproto](https://redirect.github.com/googleapis/go-genproto)
| require | digest | `ef028d9` -> `9219d12` |

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).


Co-authored-by: Averi Kitsch <akitsch@google.com>
2025-09-25 18:56:54 +00:00
trehanshakuntG
295f9dc41a fix(tools/mongodb): concat filter params only once in mongodb update tools (#1545)
## Description

---
### What

Fixed a bug where `FilterParams` was being concatenated twice in the
parameter list for MongoDB update operations.

### Why

The duplicate concatenation would cause parameter validation to fail
during tool initialization, preventing the MongoDB update tools from
working.

### Changes

- `mongodbupdateone.go`: Fixed `slices.Concat()` to properly combine
FilterParams and UpdateParams
- `mongodbupdatemany.go`: Fixed `slices.Concat()` to properly combine
FilterParams and UpdateParams
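
A minimal sketch of the corrected combination (the `Parameter` type is a stand-in for the Toolbox parameter type):

```go
package mongodbupdate

import "slices"

// Parameter is a stand-in type; the real tools use the Toolbox parameter type.
type Parameter struct{ Name string }

// buildParams sketches the fix: FilterParams and UpdateParams are combined
// exactly once. The bug was concatenating FilterParams a second time, which
// produced duplicate parameters and failed validation at tool initialization.
func buildParams(filterParams, updateParams []Parameter) []Parameter {
	return slices.Concat(filterParams, updateParams)
}
```
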

## PR Checklist

---
> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [x] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [x] Make sure to add `!` if this involves a breaking change
2025-09-25 11:37:07 -07:00
dependabot[bot]
58424a3f7f chore(deps): bump axios from 1.11.0 to 1.12.2 in /docs/en/getting-started/quickstart/js/llamaindex (#1554)
Bumps [axios](https://github.com/axios/axios) from 1.11.0 to 1.12.2.
<details>
<summary>Release notes</summary>

*Sourced from [axios's releases](https://github.com/axios/axios/releases).*

### v1.12.2

**Bug Fixes**

- **fetch:** use current global fetch instead of cached one when env fetch is not specified to keep MSW support ([#7030](https://redirect.github.com/axios/axios/issues/7030)) ([cf78825](cf78825e12))

**Contributors to this release:** Dmitriy Mozgovoy, Noritaka Kobayashi

### v1.12.1

**Bug Fixes**

- **types:** fixed env config types ([#7020](https://redirect.github.com/axios/axios/issues/7020)) ([b5f26b7](b5f26b75bd))

**Contributors to this release:** Dmitriy Mozgovoy

### v1.12.0

**Bug Fixes**

- adding build artifacts ([9ec86de](9ec86de257))
- dont add dist on release ([a2edc36](a2edc3606a))
- **fetch-adapter:** set correct Content-Type for Node FormData ([#6998](https://redirect.github.com/axios/axios/issues/6998)) ([a9f47af](a9f47afbf3))
- **node:** enforce maxContentLength for data: URLs ([#7011](https://redirect.github.com/axios/axios/issues/7011)) ([945435f](945435fc51))
- package exports ([#5627](https://redirect.github.com/axios/axios/issues/5627)) ([aa78ac2](aa78ac23fc))
- **params:** removing '[' and ']' from URL encode exclude characters ([#3316](https://redirect.github.com/axios/axios/issues/3316), [#5715](https://redirect.github.com/axios/axios/issues/5715)) ([6d84189](6d84189349))
- release pr run ([fd7f404](fd7f404488))
- **types:** change the type guard on isCancel ([#5595](https://redirect.github.com/axios/axios/issues/5595)) ([0dbb7fd](0dbb7fd4f6))

**Features**

- **adapter:** surface low-level network error details; attach original error via cause ([#6982](https://redirect.github.com/axios/axios/issues/6982)) ([78b290c](78b290c57c))
- **fetch:** add fetch, Request, Response env config variables for the adapter ([#7003](https://redirect.github.com/axios/axios/issues/7003)) ([c959ff2](c959ff2901))
- support reviver on JSON.parse ([#5926](https://redirect.github.com/axios/axios/issues/5926)) ([2a97634](2a9763426e)), closes [#5924](https://redirect.github.com/axios/axios/issues/5924)
- **types:** extend AxiosResponse interface to include custom headers type ([#6782](https://redirect.github.com/axios/axios/issues/6782)) ([7960d34](7960d34ede))

**Contributors to this release:** Willian Agostini, Dmitriy Mozgovoy, khani, Ameer Assadi, Emiedonmokumo Dick-Boro, Zeroday BYTE

... (truncated)
</details>
<details>
<summary>Changelog</summary>

*Sourced from [axios's changelog](https://github.com/axios/axios/blob/v1.x/CHANGELOG.md).* The entries for [1.12.2](https://github.com/axios/axios/compare/v1.12.1...v1.12.2) (2025-09-14), [1.12.1](https://github.com/axios/axios/compare/v1.12.0...v1.12.1) (2025-09-12), and [1.12.0](https://github.com/axios/axios/compare/v1.11.0...v1.12.0) (2025-09-11) mirror the release notes above. (truncated)

</details>
<details>
<summary>Commits</summary>

- [`e5a3336`](e5a33366d7) chore(release): v1.12.2 ([#7031](https://redirect.github.com/axios/axios/issues/7031))
- [`38726c7`](38726c7586) refactor: change if in else to else if ([#7028](https://redirect.github.com/axios/axios/issues/7028))
- [`cf78825`](cf78825e12) fix(fetch): use current global fetch instead of cached one when env fetch is ...
- [`c26d00f`](c26d00f451) refactor: remove redundant assignment ([#7029](https://redirect.github.com/axios/axios/issues/7029))
- [`9fb41a8`](9fb41a8fcd) chore(ci): add local HTTP server for Karma tests; ([#7022](https://redirect.github.com/axios/axios/issues/7022))
- [`19f9f36`](19f9f36850) docs(readme): add custom fetch section; ([#7024](https://redirect.github.com/axios/axios/issues/7024))
- [`3cac78c`](3cac78c2de) chore(release): v1.12.1 ([#7021](https://redirect.github.com/axios/axios/issues/7021))
- [`b5f26b7`](b5f26b75bd) fix(types): fixed env config types; ([#7020](https://redirect.github.com/axios/axios/issues/7020))
- [`0d8ad6e`](0d8ad6e1de) chore(release): v1.12.0 ([#7013](https://redirect.github.com/axios/axios/issues/7013))
- [`fd7f404`](fd7f404488) fix: release pr run
- Additional commits viewable in the [compare view](https://github.com/axios/axios/compare/v1.11.0...v1.12.2)

</details>


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=axios&package-manager=npm_and_yarn&previous-version=1.11.0&new-version=1.12.2)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)
You can disable automated security fix PRs for this repo from the
[Security Alerts
page](https://github.com/googleapis/genai-toolbox/network/alerts).

</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Twisha Bansal <58483338+twishabansal@users.noreply.github.com>
2025-09-25 21:17:36 +05:30
Mend Renovate
e5f643f929 chore(deps): update dependency llama-index-llms-google-genai to v0.6.0 (#1547)
This PR contains the following updates:

| Package | Change | Age | Confidence |
|---|---|---|---|
| llama-index-llms-google-genai | `==0.5.1` -> `==0.6.0` |
[![age](https://developer.mend.io/api/mc/badges/age/pypi/llama-index-llms-google-genai/0.6.0?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|
[![confidence](https://developer.mend.io/api/mc/badges/confidence/pypi/llama-index-llms-google-genai/0.5.1/0.6.0?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).


Co-authored-by: Harsh Jha <83023263+rapid-killer-9@users.noreply.github.com>
2025-09-25 14:41:38 +00:00
Mend Renovate
785be3d8a4 chore(deps): update dependency llama-index to v0.14.3 (#1548)
This PR contains the following updates:

| Package | Change | Age | Confidence |
|---|---|---|---|
| [llama-index](https://redirect.github.com/run-llama/llama_index) |
`==0.14.2` -> `==0.14.3` |
[![age](https://developer.mend.io/api/mc/badges/age/pypi/llama-index/0.14.3?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|
[![confidence](https://developer.mend.io/api/mc/badges/confidence/pypi/llama-index/0.14.2/0.14.3?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

---

### Release Notes

<details>
<summary>run-llama/llama_index (llama-index)</summary>

###
[`v0.14.3`](https://redirect.github.com/run-llama/llama_index/blob/HEAD/CHANGELOG.md#2025-09-24)

[Compare
Source](https://redirect.github.com/run-llama/llama_index/compare/v0.14.2...v0.14.3)

##### llama-index-core \[0.14.3]

- Fix Gemini thought signature serialization
([#&#8203;19891](https://redirect.github.com/run-llama/llama_index/pull/19891))
- Adding a ThinkingBlock among content blocks
([#&#8203;19919](https://redirect.github.com/run-llama/llama_index/pull/19919))

##### llama-index-llms-anthropic \[0.9.0]

- Adding a ThinkingBlock among content blocks
([#&#8203;19919](https://redirect.github.com/run-llama/llama_index/pull/19919))

##### llama-index-llms-baseten \[0.1.4]

- added kimik2 0905 and reordered list for validation
([#&#8203;19892](https://redirect.github.com/run-llama/llama_index/pull/19892))
- Baseten Dynamic Model APIs Validation
([#&#8203;19893](https://redirect.github.com/run-llama/llama_index/pull/19893))

##### llama-index-llms-google-genai \[0.6.0]

- Add missing FileAPI support for documents
([#&#8203;19897](https://redirect.github.com/run-llama/llama_index/pull/19897))
- Adding a ThinkingBlock among content blocks
([#&#8203;19919](https://redirect.github.com/run-llama/llama_index/pull/19919))

##### llama-index-llms-mistralai \[0.8.0]

- Adding a ThinkingBlock among content blocks
([#&#8203;19919](https://redirect.github.com/run-llama/llama_index/pull/19919))

##### llama-index-llms-openai \[0.6.0]

- Adding a ThinkingBlock among content blocks
([#&#8203;19919](https://redirect.github.com/run-llama/llama_index/pull/19919))

##### llama-index-protocols-ag-ui \[0.2.2]

- improve how state snapshotting works in AG-UI
([#&#8203;19934](https://redirect.github.com/run-llama/llama_index/pull/19934))

##### llama-index-readers-mongodb \[0.5.0]

- Use PyMongo Asynchronous API instead of Motor
([#&#8203;19875](https://redirect.github.com/run-llama/llama_index/pull/19875))

##### llama-index-readers-paddle-ocr \[0.1.0]

- \[New Package] Add PaddleOCR Reader for extracting text from images in
PDFs
([#&#8203;19827](https://redirect.github.com/run-llama/llama_index/pull/19827))

##### llama-index-readers-web \[0.5.4]

- feat(readers/web-firecrawl): migrate to Firecrawl v2 SDK
([#&#8203;19773](https://redirect.github.com/run-llama/llama_index/pull/19773))

##### llama-index-storage-chat-store-mongo \[0.3.0]

- Use PyMongo Asynchronous API instead of Motor
([#&#8203;19875](https://redirect.github.com/run-llama/llama_index/pull/19875))

##### llama-index-storage-kvstore-mongodb \[0.5.0]

- Use PyMongo Asynchronous API instead of Motor
([#&#8203;19875](https://redirect.github.com/run-llama/llama_index/pull/19875))

##### llama-index-tools-valyu \[0.5.0]

- Add Valyu Extractor and Fast mode
([#&#8203;19915](https://redirect.github.com/run-llama/llama_index/pull/19915))

##### llama-index-vector-stores-azureaisearch \[0.4.2]

- Fix/llama index vector stores azureaisearch fix
([#&#8203;19800](https://redirect.github.com/run-llama/llama_index/pull/19800))

##### llama-index-vector-stores-azurepostgresql \[0.1.0]

- Add support for Azure PostgreSQL
([#&#8203;19709](https://redirect.github.com/run-llama/llama_index/pull/19709))

##### llama-index-vector-stores-qdrant \[0.8.5]

- Add proper compat for old sparse vectors
([#&#8203;19882](https://redirect.github.com/run-llama/llama_index/pull/19882))

##### llama-index-vector-stores-singlestoredb \[0.4.2]

- Fix SQLi Vulnerability in SingleStore Db
([#&#8203;19914](https://redirect.github.com/run-llama/llama_index/pull/19914))

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).


Co-authored-by: Harsh Jha <83023263+rapid-killer-9@users.noreply.github.com>
2025-09-25 14:25:27 +00:00
Huan Chen
88bac7e36f feat(tool/bigquery-forecast)!: add allowed datasets support to forecast (#1412)
## Description
---

This introduces a breaking change. The bigquery-forecast tool will now
enforce the allowed datasets setting from its BigQuery source
configuration. Previously, this setting had no effect on the tool.


---------

Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2025-09-24 23:59:00 +00:00
Yuan Teoh
20cb43a249 docs: add gemini cli extensions (#1534)
Co-authored-by: Averi Kitsch <akitsch@google.com>
2025-09-23 22:39:28 +00:00
Pranava B
6e420534ee feat(cassandra): add Cassandra Source and Tool (#1012)
[Cassandra](https://cassandra.apache.org/_/cassandra-basics.html) is a
NoSQL distributed database. By design, NoSQL databases are lightweight,
open-source, non-relational, and largely distributed. Counted among
their strengths are horizontal scalability, distributed architectures,
and a flexible approach to schema definition.

Cassandra go driver link -
https://pkg.go.dev/github.com/apache/cassandra-gocql-driver/v2

This PR 
- adds a new source for cassandra
- adds a new tool _cassandra-cql_ with support for executing predefined
parameterized CQL queries on cassandra
- adds unit and integration tests for the tool and the source
- adds documentation for the cassandra source and cassandra-cql tool
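
For reference, a minimal sketch of executing a parameterized CQL query with the linked Go driver; the hosts, keyspace, table, and query are placeholders, and the classic gocql API surface (NewCluster / CreateSession / Query) is assumed to carry over to the v2 module.

```go
package main

import (
	"fmt"
	"log"

	gocql "github.com/apache/cassandra-gocql-driver/v2"
)

func main() {
	cluster := gocql.NewCluster("127.0.0.1") // placeholder contact point
	cluster.Keyspace = "hotels"              // placeholder keyspace
	session, err := cluster.CreateSession()
	if err != nil {
		log.Fatalf("connect: %v", err)
	}
	defer session.Close()

	var name string
	// Positional bind markers (?) keep the query parameterized, mirroring the
	// predefined parameterized CQL statements the cassandra-cql tool runs.
	if err := session.Query(`SELECT name FROM hotels WHERE id = ?`, 1).Scan(&name); err != nil {
		log.Fatalf("query: %v", err)
	}
	fmt.Println("hotel:", name)
}
```
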

---------

Co-authored-by: Wenxin Du <117315983+duwenxin99@users.noreply.github.com>
Co-authored-by: duwenxin <duwenxin@google.com>
2025-09-23 20:48:43 +00:00
Anmol Shukla
6387dd3efa ci: set up quickstart testing for js (#1365)
Co-authored-by: Harsh Jha <83023263+rapid-killer-9@users.noreply.github.com>
Co-authored-by: Harsh Jha <harshrjha@google.com>
Co-authored-by: Twisha Bansal <58483338+twishabansal@users.noreply.github.com>
2025-09-23 15:12:31 +00:00
Anmol Shukla
a63d7aae2b ci: set up quickstart testing for go (#1368) 2025-09-23 20:32:16 +05:30
Harsh Jha
4be5d165de ci: add files for py quickstart sample testing workflow (#1392)
Co-authored-by: Anmol Shukla <shuklaanmol@google.com>
Co-authored-by: Twisha Bansal <58483338+twishabansal@users.noreply.github.com>
2025-09-23 11:45:25 +00:00
Mend Renovate
a5ef166fcb chore(deps): update dependency llama-index-llms-google-genai to v0.5.1 (#1529)
This PR contains the following updates:

| Package | Change | Age | Confidence |
|---|---|---|---|
| llama-index-llms-google-genai | `==0.5.0` -> `==0.5.1` |
[![age](https://developer.mend.io/api/mc/badges/age/pypi/llama-index-llms-google-genai/0.5.1?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|
[![confidence](https://developer.mend.io/api/mc/badges/confidence/pypi/llama-index-llms-google-genai/0.5.0/0.5.1?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).


Co-authored-by: Harsh Jha <83023263+rapid-killer-9@users.noreply.github.com>
Co-authored-by: Twisha Bansal <58483338+twishabansal@users.noreply.github.com>
2025-09-23 11:04:08 +00:00
Mend Renovate
8c268b20ae chore(deps): update dependency toolbox-core to v0.5.2 (#1511)
This PR contains the following updates:

| Package | Change | Age | Confidence |
|---|---|---|---|
|
[toolbox-core](https://redirect.github.com/googleapis/mcp-toolbox-sdk-python)
([changelog](https://redirect.github.com/googleapis/mcp-toolbox-sdk-python/blob/main/packages/toolbox-core/CHANGELOG.md))
| `==0.5.0` -> `==0.5.2` |
[![age](https://developer.mend.io/api/mc/badges/age/pypi/toolbox-core/0.5.2?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|
[![confidence](https://developer.mend.io/api/mc/badges/confidence/pypi/toolbox-core/0.5.0/0.5.2?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|

---

### Release Notes

<details>
<summary>googleapis/mcp-toolbox-sdk-python (toolbox-core)</summary>

###
[`v0.5.2`](https://redirect.github.com/googleapis/mcp-toolbox-sdk-python/releases/tag/toolbox-core-v0.5.2):
toolbox-core: v0.5.2

[Compare
Source](https://redirect.github.com/googleapis/mcp-toolbox-sdk-python/compare/toolbox-core-v0.5.1...toolbox-core-v0.5.2)

##### Miscellaneous Chores

- **deps:** update python-nonmajor
([#&#8203;372](https://redirect.github.com/googleapis/mcp-toolbox-sdk-python/issues/372))
([d915624](d9156246fd))

###
[`v0.5.1`](https://redirect.github.com/googleapis/mcp-toolbox-sdk-python/releases/tag/toolbox-core-v0.5.1):
toolbox-core: v0.5.1

[Compare
Source](https://redirect.github.com/googleapis/mcp-toolbox-sdk-python/compare/toolbox-core-v0.5.0...toolbox-core-v0.5.1)

##### Bug Fixes

- **toolbox-core:** Use typing.Annotated for reliable parameter
descriptions instead of docstrings
([#&#8203;371](https://redirect.github.com/googleapis/mcp-toolbox-sdk-python/issues/371))
([eb76680](eb76680d24))

##### Documentation

- Update langgraph sample in toolbox-core
([#&#8203;366](https://redirect.github.com/googleapis/mcp-toolbox-sdk-python/issues/366))
([fe35082](fe35082104))

##### Miscellaneous Chores

- Remove redundant test for
test\_add\_auth\_token\_getter\_unused\_token
([#&#8203;347](https://redirect.github.com/googleapis/mcp-toolbox-sdk-python/issues/347))
([dccaf1b](dccaf1bd70))
- Remove duplicate header check during initialization
([#&#8203;357](https://redirect.github.com/googleapis/mcp-toolbox-sdk-python/issues/357))
([888170b](888170b3c3))

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).


---------

Co-authored-by: Harsh Jha <83023263+rapid-killer-9@users.noreply.github.com>
2025-09-23 10:49:12 +00:00
184 changed files with 4184 additions and 1333 deletions


@@ -662,6 +662,26 @@ steps:
- |
./yugabytedb.test -test.v
- id: "cassandra"
name: golang:1
waitFor: ["compile-test-binary"]
entrypoint: /bin/bash
env:
- "GOPATH=/gopath"
- "SERVICE_ACCOUNT_EMAIL=$SERVICE_ACCOUNT_EMAIL"
secretEnv: ["CLIENT_ID", "CASSANDRA_USER", "CASSANDRA_PASS", "CASSANDRA_HOST"]
volumes:
- name: "go"
path: "/gopath"
args:
- -c
- |
.ci/test_with_coverage.sh \
"Cassandra" \
cassandra \
cassandra
availableSecrets:
secretManager:
- versionName: projects/$PROJECT_ID/secrets/cloud_sql_pg_user/versions/latest
@@ -746,6 +766,12 @@ availableSecrets:
env: YUGABYTEDB_USER
- versionName: projects/$PROJECT_ID/secrets/yugabytedb_pass/versions/latest
env: YUGABYTEDB_PASS
- versionName: projects/$PROJECT_ID/secrets/cassandra_user/versions/latest
env: CASSANDRA_USER
- versionName: projects/$PROJECT_ID/secrets/cassandra_pass/versions/latest
env: CASSANDRA_PASS
- versionName: projects/$PROJECT_ID/secrets/cassandra_host/versions/latest
env: CASSANDRA_HOST
options:
logging: CLOUD_LOGGING_ONLY


@@ -0,0 +1,47 @@
# Copyright 2025 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
steps:
- name: 'golang:1.25.1'
id: 'go-quickstart-test'
entrypoint: 'bash'
args:
# The '-c' flag tells bash to execute the following string as a command.
# The 'set -ex' enables debug output and exits on error for easier troubleshooting.
- -c
- |
set -ex
export VERSION=$(cat ./cmd/version.txt)
chmod +x .ci/quickstart_test/run_go_tests.sh
.ci/quickstart_test/run_go_tests.sh
env:
- 'CLOUD_SQL_INSTANCE=${_CLOUD_SQL_INSTANCE}'
- 'GCP_PROJECT=${_GCP_PROJECT}'
- 'DATABASE_NAME=${_DATABASE_NAME}'
- 'DB_USER=${_DB_USER}'
secretEnv: ['TOOLS_YAML_CONTENT', 'GOOGLE_API_KEY', 'DB_PASSWORD']
availableSecrets:
secretManager:
- versionName: projects/${_GCP_PROJECT}/secrets/${_TOOLS_YAML_SECRET}/versions/7
env: 'TOOLS_YAML_CONTENT'
- versionName: projects/${_GCP_PROJECT_NUMBER}/secrets/${_API_KEY_SECRET}/versions/latest
env: 'GOOGLE_API_KEY'
- versionName: projects/${_GCP_PROJECT}/secrets/${_DB_PASS_SECRET}/versions/latest
env: 'DB_PASSWORD'
timeout: 1000s
options:
logging: CLOUD_LOGGING_ONLY


@@ -0,0 +1,47 @@
# Copyright 2025 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
steps:
- name: 'node:20'
id: 'js-quickstart-test'
entrypoint: 'bash'
args:
# The '-c' flag tells bash to execute the following string as a command.
# The 'set -ex' enables debug output and exits on error for easier troubleshooting.
- -c
- |
set -ex
export VERSION=$(cat ./cmd/version.txt)
chmod +x .ci/quickstart_test/run_js_tests.sh
.ci/quickstart_test/run_js_tests.sh
env:
- 'CLOUD_SQL_INSTANCE=${_CLOUD_SQL_INSTANCE}'
- 'GCP_PROJECT=${_GCP_PROJECT}'
- 'DATABASE_NAME=${_DATABASE_NAME}'
- 'DB_USER=${_DB_USER}'
secretEnv: ['TOOLS_YAML_CONTENT', 'GOOGLE_API_KEY', 'DB_PASSWORD']
availableSecrets:
secretManager:
- versionName: projects/${_GCP_PROJECT}/secrets/${_TOOLS_YAML_SECRET}/versions/6
env: 'TOOLS_YAML_CONTENT'
- versionName: projects/${_GCP_PROJECT_NUMBER}/secrets/${_API_KEY_SECRET}/versions/latest
env: 'GOOGLE_API_KEY'
- versionName: projects/${_GCP_PROJECT}/secrets/${_DB_PASS_SECRET}/versions/latest
env: 'DB_PASSWORD'
timeout: 1000s
options:
logging: CLOUD_LOGGING_ONLY


@@ -0,0 +1,47 @@
# Copyright 2025 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
steps:
- name: 'gcr.io/google.com/cloudsdktool/cloud-sdk:537.0.0'
id: 'python-quickstart-test'
entrypoint: 'bash'
args:
# The '-c' flag tells bash to execute the following string as a command.
# The 'set -ex' enables debug output and exits on error for easier troubleshooting.
- -c
- |
set -ex
export VERSION=$(cat ./cmd/version.txt)
chmod +x .ci/quickstart_test/run_py_tests.sh
.ci/quickstart_test/run_py_tests.sh
env:
- 'CLOUD_SQL_INSTANCE=${_CLOUD_SQL_INSTANCE}'
- 'GCP_PROJECT=${_GCP_PROJECT}'
- 'DATABASE_NAME=${_DATABASE_NAME}'
- 'DB_USER=${_DB_USER}'
secretEnv: ['TOOLS_YAML_CONTENT', 'GOOGLE_API_KEY', 'DB_PASSWORD']
availableSecrets:
secretManager:
- versionName: projects/${_GCP_PROJECT}/secrets/${_TOOLS_YAML_SECRET}/versions/5
env: 'TOOLS_YAML_CONTENT'
- versionName: projects/${_GCP_PROJECT_NUMBER}/secrets/${_API_KEY_SECRET}/versions/latest
env: 'GOOGLE_API_KEY'
- versionName: projects/${_GCP_PROJECT}/secrets/${_DB_PASS_SECRET}/versions/latest
env: 'DB_PASSWORD'
timeout: 1000s
options:
logging: CLOUD_LOGGING_ONLY

View File

@@ -0,0 +1,125 @@
# Copyright 2025 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#!/bin/bash
set -e
TABLE_NAME="hotels_go"
QUICKSTART_GO_DIR="docs/en/getting-started/quickstart/go"
SQL_FILE=".ci/quickstart_test/setup_hotels_sample.sql"
PROXY_PID=""
TOOLBOX_PID=""
install_system_packages() {
apt-get update && apt-get install -y \
postgresql-client \
wget \
gettext-base \
netcat-openbsd
}
start_cloud_sql_proxy() {
wget "https://storage.googleapis.com/cloud-sql-connectors/cloud-sql-proxy/v2.10.0/cloud-sql-proxy.linux.amd64" -O /usr/local/bin/cloud-sql-proxy
chmod +x /usr/local/bin/cloud-sql-proxy
cloud-sql-proxy "${CLOUD_SQL_INSTANCE}" &
PROXY_PID=$!
for i in {1..30}; do
if nc -z 127.0.0.1 5432; then
echo "Cloud SQL Proxy is up and running."
return
fi
sleep 1
done
echo "Cloud SQL Proxy failed to start within the timeout period."
exit 1
}
setup_toolbox() {
TOOLBOX_YAML="/tools.yaml"
echo "${TOOLS_YAML_CONTENT}" > "$TOOLBOX_YAML"
if [ ! -f "$TOOLBOX_YAML" ]; then echo "Failed to create tools.yaml"; exit 1; fi
wget "https://storage.googleapis.com/genai-toolbox/v${VERSION}/linux/amd64/toolbox" -O "/toolbox"
chmod +x "/toolbox"
/toolbox --tools-file "$TOOLBOX_YAML" &
TOOLBOX_PID=$!
sleep 2
}
setup_orch_table() {
export TABLE_NAME
envsubst < "$SQL_FILE" | psql -h "$PGHOST" -p "$PGPORT" -U "$DB_USER" -d "$DATABASE_NAME"
}
run_orch_test() {
local orch_dir="$1"
local orch_name
orch_name=$(basename "$orch_dir")
if [ "$orch_name" == "openAI" ]; then
echo -e "\nSkipping framework '${orch_name}': Temporarily excluded."
return
fi
(
set -e
setup_orch_table
echo "--- Preparing module for $orch_name ---"
cd "$orch_dir"
if [ -f "go.mod" ]; then
go mod tidy
fi
cd ..
export ORCH_NAME="$orch_name"
echo "--- Running tests for $orch_name ---"
go test -v ./...
)
}
cleanup_all() {
echo "--- Final cleanup: Shutting down processes and dropping table ---"
if [ -n "$TOOLBOX_PID" ]; then
kill $TOOLBOX_PID || true
fi
if [ -n "$PROXY_PID" ]; then
kill $PROXY_PID || true
fi
}
trap cleanup_all EXIT
# Main script execution
install_system_packages
start_cloud_sql_proxy
export PGHOST=127.0.0.1
export PGPORT=5432
export PGPASSWORD="$DB_PASSWORD"
export GOOGLE_API_KEY="$GOOGLE_API_KEY"
setup_toolbox
for ORCH_DIR in "$QUICKSTART_GO_DIR"/*/; do
if [ ! -d "$ORCH_DIR" ]; then
continue
fi
run_orch_test "$ORCH_DIR"
done

View File

@@ -0,0 +1,123 @@
# Copyright 2025 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#!/bin/bash
set -e
TABLE_NAME="hotels_js"
QUICKSTART_JS_DIR="docs/en/getting-started/quickstart/js"
SQL_FILE=".ci/quickstart_test/setup_hotels_sample.sql"
# Initialize process IDs to empty at the top of the script
PROXY_PID=""
TOOLBOX_PID=""
install_system_packages() {
apt-get update && apt-get install -y \
postgresql-client \
wget \
gettext-base \
netcat-openbsd
}
start_cloud_sql_proxy() {
wget "https://storage.googleapis.com/cloud-sql-connectors/cloud-sql-proxy/v2.10.0/cloud-sql-proxy.linux.amd64" -O /usr/local/bin/cloud-sql-proxy
chmod +x /usr/local/bin/cloud-sql-proxy
cloud-sql-proxy "${CLOUD_SQL_INSTANCE}" &
PROXY_PID=$!
for i in {1..30}; do
if nc -z 127.0.0.1 5432; then
echo "Cloud SQL Proxy is up and running."
return
fi
sleep 1
done
echo "Cloud SQL Proxy failed to start within the timeout period."
exit 1
}
setup_toolbox() {
TOOLBOX_YAML="/tools.yaml"
echo "${TOOLS_YAML_CONTENT}" > "$TOOLBOX_YAML"
if [ ! -f "$TOOLBOX_YAML" ]; then echo "Failed to create tools.yaml"; exit 1; fi
wget "https://storage.googleapis.com/genai-toolbox/v${VERSION}/linux/amd64/toolbox" -O "/toolbox"
chmod +x "/toolbox"
/toolbox --tools-file "$TOOLBOX_YAML" &
TOOLBOX_PID=$!
sleep 2
}
setup_orch_table() {
export TABLE_NAME
envsubst < "$SQL_FILE" | psql -h "$PGHOST" -p "$PGPORT" -U "$DB_USER" -d "$DATABASE_NAME"
}
run_orch_test() {
local orch_dir="$1"
local orch_name
orch_name=$(basename "$orch_dir")
(
set -e
echo "--- Preparing environment for $orch_name ---"
setup_orch_table
cd "$orch_dir"
if [ -f "package.json" ]; then
echo "Installing dependencies for $orch_name..."
npm install
fi
cd ..
echo "--- Running tests for $orch_name ---"
export ORCH_NAME="$orch_name"
node --test quickstart.test.js
echo "--- Cleaning environment for $orch_name ---"
rm -rf "${orch_name}/node_modules"
)
}
cleanup_all() {
echo "--- Final cleanup: Shutting down processes and dropping table ---"
if [ -n "$TOOLBOX_PID" ]; then
kill $TOOLBOX_PID || true
fi
if [ -n "$PROXY_PID" ]; then
kill $PROXY_PID || true
fi
}
trap cleanup_all EXIT
# Main script execution
install_system_packages
start_cloud_sql_proxy
export PGHOST=127.0.0.1
export PGPORT=5432
export PGPASSWORD="$DB_PASSWORD"
export GOOGLE_API_KEY="$GOOGLE_API_KEY"
setup_toolbox
for ORCH_DIR in "$QUICKSTART_JS_DIR"/*/; do
if [ ! -d "$ORCH_DIR" ]; then
continue
fi
run_orch_test "$ORCH_DIR"
done

View File

@@ -0,0 +1,115 @@
# Copyright 2025 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#!/bin/bash
set -e
TABLE_NAME="hotels_python"
QUICKSTART_PYTHON_DIR="docs/en/getting-started/quickstart/python"
SQL_FILE=".ci/quickstart_test/setup_hotels_sample.sql"
PROXY_PID=""
TOOLBOX_PID=""
install_system_packages() {
apt-get update && apt-get install -y \
postgresql-client \
python3-venv \
wget \
gettext-base \
netcat-openbsd
}
start_cloud_sql_proxy() {
wget "https://storage.googleapis.com/cloud-sql-connectors/cloud-sql-proxy/v2.10.0/cloud-sql-proxy.linux.amd64" -O /usr/local/bin/cloud-sql-proxy
chmod +x /usr/local/bin/cloud-sql-proxy
cloud-sql-proxy "${CLOUD_SQL_INSTANCE}" &
PROXY_PID=$!
for i in {1..30}; do
if nc -z 127.0.0.1 5432; then
echo "Cloud SQL Proxy is up and running."
return
fi
sleep 1
done
echo "Cloud SQL Proxy failed to start within the timeout period."
exit 1
}
setup_toolbox() {
TOOLBOX_YAML="/tools.yaml"
echo "${TOOLS_YAML_CONTENT}" > "$TOOLBOX_YAML"
if [ ! -f "$TOOLBOX_YAML" ]; then echo "Failed to create tools.yaml"; exit 1; fi
wget "https://storage.googleapis.com/genai-toolbox/v${VERSION}/linux/amd64/toolbox" -O "/toolbox"
chmod +x "/toolbox"
/toolbox --tools-file "$TOOLBOX_YAML" &
TOOLBOX_PID=$!
sleep 2
}
setup_orch_table() {
export TABLE_NAME
envsubst < "$SQL_FILE" | psql -h "$PGHOST" -p "$PGPORT" -U "$DB_USER" -d "$DATABASE_NAME"
}
run_orch_test() {
local orch_dir="$1"
local orch_name
orch_name=$(basename "$orch_dir")
(
set -e
setup_orch_table
cd "$orch_dir"
local VENV_DIR=".venv"
python3 -m venv "$VENV_DIR"
source "$VENV_DIR/bin/activate"
pip install -r requirements.txt
echo "--- Running tests for $orch_name ---"
cd ..
ORCH_NAME="$orch_name" pytest
rm -rf "$VENV_DIR"
)
}
cleanup_all() {
echo "--- Final cleanup: Shutting down processes and dropping table ---"
if [ -n "$TOOLBOX_PID" ]; then
kill $TOOLBOX_PID || true
fi
if [ -n "$PROXY_PID" ]; then
kill $PROXY_PID || true
fi
}
trap cleanup_all EXIT
# Main script execution
install_system_packages
start_cloud_sql_proxy
export PGHOST=127.0.0.1
export PGPORT=5432
export PGPASSWORD="$DB_PASSWORD"
export GOOGLE_API_KEY="$GOOGLE_API_KEY"
setup_toolbox
for ORCH_DIR in "$QUICKSTART_PYTHON_DIR"/*/; do
if [ ! -d "$ORCH_DIR" ]; then
continue
fi
run_orch_test "$ORCH_DIR"
done

View File

@@ -0,0 +1,28 @@
-- Copyright 2025 Google LLC
--
-- Licensed under the Apache License, Version 2.0 (the "License");
-- you may not use this file except in compliance with the License.
-- You may obtain a copy of the License at
--
-- http://www.apache.org/licenses/LICENSE-2.0
--
-- Unless required by applicable law or agreed to in writing, software
-- distributed under the License is distributed on an "AS IS" BASIS,
-- WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-- See the License for the specific language governing permissions and
-- limitations under the License.
TRUNCATE TABLE $TABLE_NAME;
INSERT INTO $TABLE_NAME (id, name, location, price_tier, checkin_date, checkout_date, booked)
VALUES
(1, 'Hilton Basel', 'Basel', 'Luxury', '2024-04-22', '2024-04-20', B'0'),
(2, 'Marriott Zurich', 'Zurich', 'Upscale', '2024-04-14', '2024-04-21', B'0'),
(3, 'Hyatt Regency Basel', 'Basel', 'Upper Upscale', '2024-04-02', '2024-04-20', B'0'),
(4, 'Radisson Blu Lucerne', 'Lucerne', 'Midscale', '2024-04-24', '2024-04-05', B'0'),
(5, 'Best Western Bern', 'Bern', 'Upper Midscale', '2024-04-23', '2024-04-01', B'0'),
(6, 'InterContinental Geneva', 'Geneva', 'Luxury', '2024-04-23', '2024-04-28', B'0'),
(7, 'Sheraton Zurich', 'Zurich', 'Upper Upscale', '2024-04-27', '2024-04-02', B'0'),
(8, 'Holiday Inn Basel', 'Basel', 'Upper Midscale', '2024-04-24', '2024-04-09', B'0'),
(9, 'Courtyard Zurich', 'Zurich', 'Upscale', '2024-04-03', '2024-04-13', B'0'),
(10, 'Comfort Inn Bern', 'Bern', 'Midscale', '2024-04-04', '2024-04-16', B'0');

View File

@@ -1,5 +1,27 @@
# Changelog
## [0.16.0](https://github.com/googleapis/genai-toolbox/compare/v0.15.0...v0.16.0) (2025-09-25)
### ⚠ BREAKING CHANGES
* **tool/bigquery-execute-sql:** add allowed datasets support ([#1443](https://github.com/googleapis/genai-toolbox/issues/1443))
* **tool/bigquery-forecast:** add allowed datasets support ([#1412](https://github.com/googleapis/genai-toolbox/issues/1412))
### Features
* **cassandra:** Add Cassandra Source and Tool ([#1012](https://github.com/googleapis/genai-toolbox/issues/1012)) ([6e42053](https://github.com/googleapis/genai-toolbox/commit/6e420534ee894da4a8d226acb6cdb63d0d5d9ce5))
* **sources/postgres:** Add application_name ([#1504](https://github.com/googleapis/genai-toolbox/issues/1504)) ([72a2366](https://github.com/googleapis/genai-toolbox/commit/72a2366b28870aa6f81c4f890f4770ec5ecffdba))
* **tool/bigquery-execute-sql:** Add allowed datasets support ([#1443](https://github.com/googleapis/genai-toolbox/issues/1443)) ([9501ebb](https://github.com/googleapis/genai-toolbox/commit/9501ebbdbcba871b98663185c690308dda1729b5))
* **tool/bigquery-forecast:** Add allowed datasets support ([#1412](https://github.com/googleapis/genai-toolbox/issues/1412)) ([88bac7e](https://github.com/googleapis/genai-toolbox/commit/88bac7e36f5ebb6ad18773bff30b85ef678431e7))
* **tools/clickhouse-list-tables:** Add list-tables tool ([#1446](https://github.com/googleapis/genai-toolbox/issues/1446)) ([69a3caf](https://github.com/googleapis/genai-toolbox/commit/69a3cafabec5a40e2776d71de3587c0d16c722a2))
### Bug Fixes
* **tool/mongodb-find:** Fix find tool `limit` field ([#1570](https://github.com/googleapis/genai-toolbox/issues/1570)) ([4166bf7](https://github.com/googleapis/genai-toolbox/commit/4166bf7ab85732f64b877d5f20235057df919049))
* **tools/mongodb:** Concat filter params only once in mongodb update tools ([#1545](https://github.com/googleapis/genai-toolbox/issues/1545)) ([295f9dc](https://github.com/googleapis/genai-toolbox/commit/295f9dc41a43f0a4bdbd99e465bf2be01249084e))
## [0.15.0](https://github.com/googleapis/genai-toolbox/compare/v0.14.0...v0.15.0) (2025-09-18)

View File

@@ -25,7 +25,7 @@ ARG COMMIT_SHA=""
RUN go get ./...
RUN CGO_ENABLED=0 GOOS=${TARGETOS} GOARCH=${TARGETARCH} \
go build -ldflags "-X github.com/googleapis/genai-toolbox/cmd.buildType=container.${BUILD_TYPE} -X github.com/googleapis/genai-toolbox/cmd.commitSha=${COMMIT_SHA}"
go build -ldflags "-X github.com/googleapis/genai-toolbox/cmd.buildType=${BUILD_TYPE} -X github.com/googleapis/genai-toolbox/cmd.commitSha=${COMMIT_SHA}"
# Final Stage
FROM gcr.io/distroless/static:nonroot

View File

@@ -33,7 +33,6 @@ documentation](https://googleapis.github.io/genai-toolbox/).
- [Getting Started](#getting-started)
- [Installing the server](#installing-the-server)
- [Running the server](#running-the-server)
- [Homebrew Users](#homebrew-users)
- [Integrating your application](#integrating-your-application)
- [Configuration](#configuration)
- [Sources](#sources)
@@ -115,13 +114,53 @@ following instructions for your OS and CPU architecture.
To install Toolbox as a binary:
<!-- {x-release-please-start-version} -->
```sh
# see releases page for other versions
export VERSION=0.15.0
curl -O https://storage.googleapis.com/genai-toolbox/v$VERSION/linux/amd64/toolbox
chmod +x toolbox
```
> <details>
> <summary>Linux (AMD64)</summary>
>
> To install Toolbox as a binary on Linux (AMD64):
> ```sh
> # see releases page for other versions
> export VERSION=0.16.0
> curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v$VERSION/linux/amd64/toolbox
> chmod +x toolbox
> ```
>
> </details>
> <details>
> <summary>macOS (Apple Silicon)</summary>
>
> To install Toolbox as a binary on macOS (Apple Silicon):
> ```sh
> # see releases page for other versions
> export VERSION=0.16.0
> curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v$VERSION/darwin/arm64/toolbox
> chmod +x toolbox
> ```
>
> </details>
> <details>
> <summary>macOS (Intel)</summary>
>
> To install Toolbox as a binary on macOS (Intel):
> ```sh
> # see releases page for other versions
> export VERSION=0.16.0
> curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v$VERSION/darwin/amd64/toolbox
> chmod +x toolbox
> ```
>
> </details>
> <details>
> <summary>Windows (AMD64)</summary>
>
> To install Toolbox as a binary on Windows (AMD64):
> ```powershell
> # see releases page for other versions
> $VERSION = "0.16.0"
> Invoke-WebRequest -Uri "https://storage.googleapis.com/genai-toolbox/v$VERSION/windows/amd64/toolbox.exe" -OutFile "toolbox.exe"
> ```
>
> </details>
</details>
<details>
@@ -130,7 +169,7 @@ You can also install Toolbox as a container:
```sh
# see releases page for other versions
export VERSION=0.15.0
export VERSION=0.16.0
docker pull us-central1-docker.pkg.dev/database-toolbox/toolbox/toolbox:$VERSION
```
@@ -154,12 +193,23 @@ To install from source, ensure you have the latest version of
[Go installed](https://go.dev/doc/install), and then run the following command:
```sh
go install github.com/googleapis/genai-toolbox@v0.15.0
go install github.com/googleapis/genai-toolbox@v0.16.0
```
<!-- {x-release-please-end} -->
</details>
<details>
<summary>Gemini CLI Extensions</summary>
To install Gemini CLI Extensions for MCP Toolbox, run the following command:
```sh
gemini extensions install https://github.com/gemini-cli-extensions/mcp-toolbox
```
</details>
### Running the server
[Configure](#configuration) a `tools.yaml` to define your tools, and then
@@ -233,6 +283,16 @@ toolbox --tools-file "tools.yaml"
</details>
<details>
<summary>Gemini CLI</summary>
Interact with your custom tools using natural language. Check
[gemini-cli-extensions/mcp-toolbox](https://github.com/gemini-cli-extensions/mcp-toolbox)
for more information.
</details>
You can use `toolbox help` for a full list of flags! To stop the server, send a
terminate signal (`ctrl+c` on most platforms).

View File

@@ -64,6 +64,7 @@ import (
_ "github.com/googleapis/genai-toolbox/internal/tools/bigquery/bigquerysearchcatalog"
_ "github.com/googleapis/genai-toolbox/internal/tools/bigquery/bigquerysql"
_ "github.com/googleapis/genai-toolbox/internal/tools/bigtable"
_ "github.com/googleapis/genai-toolbox/internal/tools/cassandra/cassandracql"
_ "github.com/googleapis/genai-toolbox/internal/tools/clickhouse/clickhouseexecutesql"
_ "github.com/googleapis/genai-toolbox/internal/tools/clickhouse/clickhouselistdatabases"
_ "github.com/googleapis/genai-toolbox/internal/tools/clickhouse/clickhouselisttables"
@@ -159,6 +160,7 @@ import (
_ "github.com/googleapis/genai-toolbox/internal/sources/alloydbpg"
_ "github.com/googleapis/genai-toolbox/internal/sources/bigquery"
_ "github.com/googleapis/genai-toolbox/internal/sources/bigtable"
_ "github.com/googleapis/genai-toolbox/internal/sources/cassandra"
_ "github.com/googleapis/genai-toolbox/internal/sources/clickhouse"
_ "github.com/googleapis/genai-toolbox/internal/sources/cloudmonitoring"
_ "github.com/googleapis/genai-toolbox/internal/sources/cloudsqladmin"
@@ -653,20 +655,6 @@ func watchChanges(ctx context.Context, watchDirs map[string]bool, watchedFiles m
}
}
// updateLogLevel checks if Toolbox has to update the existing log level set by users.
// stdio doesn't support "debug" and "info" logs.
func updateLogLevel(stdio bool, logLevel string) bool {
if stdio {
switch strings.ToUpper(logLevel) {
case log.Debug, log.Info:
return true
default:
return false
}
}
return false
}
func resolveWatcherInputs(toolsFile string, toolsFiles []string, toolsFolder string) (map[string]bool, map[string]bool) {
var relevantFiles []string
@@ -695,10 +683,6 @@ func resolveWatcherInputs(toolsFile string, toolsFiles []string, toolsFolder str
}
func run(cmd *Command) error {
if updateLogLevel(cmd.cfg.Stdio, cmd.cfg.LogLevel.String()) {
cmd.cfg.LogLevel = server.StringLevel(log.Warn)
}
ctx, cancel := context.WithCancel(cmd.Context())
defer cancel()
@@ -722,16 +706,22 @@ func run(cmd *Command) error {
cancel()
}(ctx)
// If stdio, set logger's out stream (usually DEBUG and INFO logs) to errStream
loggerOut := cmd.outStream
if cmd.cfg.Stdio {
loggerOut = cmd.errStream
}
// Handle logger separately from config
switch strings.ToLower(cmd.cfg.LoggingFormat.String()) {
case "json":
logger, err := log.NewStructuredLogger(cmd.outStream, cmd.errStream, cmd.cfg.LogLevel.String())
logger, err := log.NewStructuredLogger(loggerOut, cmd.errStream, cmd.cfg.LogLevel.String())
if err != nil {
return fmt.Errorf("unable to initialize logger: %w", err)
}
cmd.logger = logger
case "standard":
logger, err := log.NewStdLogger(cmd.outStream, cmd.errStream, cmd.cfg.LogLevel.String())
logger, err := log.NewStdLogger(loggerOut, cmd.errStream, cmd.cfg.LogLevel.String())
if err != nil {
return fmt.Errorf("unable to initialize logger: %w", err)
}

View File

@@ -1597,51 +1597,3 @@ func TestPrebuiltTools(t *testing.T) {
})
}
}
func TestUpdateLogLevel(t *testing.T) {
tcs := []struct {
desc string
stdio bool
logLevel string
want bool
}{
{
desc: "no stdio",
stdio: false,
logLevel: "info",
want: false,
},
{
desc: "stdio with info log",
stdio: true,
logLevel: "info",
want: true,
},
{
desc: "stdio with debug log",
stdio: true,
logLevel: "debug",
want: true,
},
{
desc: "stdio with warn log",
stdio: true,
logLevel: "warn",
want: false,
},
{
desc: "stdio with error log",
stdio: true,
logLevel: "error",
want: false,
},
}
for _, tc := range tcs {
t.Run(tc.desc, func(t *testing.T) {
got := updateLogLevel(tc.stdio, tc.logLevel)
if got != tc.want {
t.Fatalf("incorrect indication to update log level: got %t, want %t", got, tc.want)
}
})
}
}

View File

@@ -1 +1 @@
0.15.0
0.16.0

View File

@@ -234,7 +234,7 @@
},
"outputs": [],
"source": [
"version = \"0.15.0\" # x-release-please-version\n",
"version = \"0.16.0\" # x-release-please-version\n",
"! curl -O https://storage.googleapis.com/genai-toolbox/v{version}/linux/amd64/toolbox\n",
"\n",
"# Make the binary executable\n",

View File

@@ -81,23 +81,50 @@ following instructions for your OS and CPU architecture.
<!-- {x-release-please-start-version} -->
{{< tabpane text=true >}}
{{% tab header="Binary" lang="en" %}}
To install Toolbox as a binary:
{{< tabpane text=true >}}
{{% tab header="Linux (AMD64)" lang="en" %}}
To install Toolbox as a binary on Linux (AMD64):
```sh
# see releases page for other versions
export VERSION=0.15.0
curl -O https://storage.googleapis.com/genai-toolbox/v$VERSION/linux/amd64/toolbox
export VERSION=0.16.0
curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v$VERSION/linux/amd64/toolbox
chmod +x toolbox
```
{{% /tab %}}
{{% tab header="macOS (Apple Silicon)" lang="en" %}}
To install Toolbox as a binary on macOS (Apple Silicon):
```sh
# see releases page for other versions
export VERSION=0.16.0
curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v$VERSION/darwin/arm64/toolbox
chmod +x toolbox
```
{{% /tab %}}
{{% tab header="macOS (Intel)" lang="en" %}}
To install Toolbox as a binary on macOS (Intel):
```sh
# see releases page for other versions
export VERSION=0.16.0
curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v$VERSION/darwin/amd64/toolbox
chmod +x toolbox
```
{{% /tab %}}
{{% tab header="Windows (AMD64)" lang="en" %}}
To install Toolbox as a binary on Windows (AMD64):
```powershell
# see releases page for other versions
$VERSION = "0.16.0"
Invoke-WebRequest -Uri "https://storage.googleapis.com/genai-toolbox/v$VERSION/windows/amd64/toolbox.exe" -OutFile "toolbox.exe"
```
{{% /tab %}}
{{< /tabpane >}}
{{% /tab %}}
{{% tab header="Container image" lang="en" %}}
You can also install Toolbox as a container:
```sh
# see releases page for other versions
export VERSION=0.15.0
export VERSION=0.16.0
docker pull us-central1-docker.pkg.dev/database-toolbox/toolbox/toolbox:$VERSION
```
@@ -116,7 +143,7 @@ To install from source, ensure you have the latest version of
[Go installed](https://go.dev/doc/install), and then run the following command:
```sh
go install github.com/googleapis/genai-toolbox@v0.15.0
go install github.com/googleapis/genai-toolbox@v0.16.0
```
{{% /tab %}}

View File

@@ -105,7 +105,7 @@ In this section, we will download Toolbox, configure our tools in a
<!-- {x-release-please-start-version} -->
```bash
export OS="linux/amd64" # one of linux/amd64, darwin/arm64, darwin/amd64, or windows/amd64
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/$OS/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.16.0/$OS/toolbox
```
<!-- {x-release-please-end} -->

View File

@@ -345,9 +345,10 @@
"integrity": "sha512-Oei9OH4tRh0YqU3GxhX79dM/mwVgvbZJaSNaRk+bshkj0S5cfHcgYakreBjrHwatXKbz+IoIdYLxrKim2MjW0Q=="
},
"node_modules/axios": {
"version": "1.11.0",
"resolved": "https://registry.npmjs.org/axios/-/axios-1.11.0.tgz",
"integrity": "sha512-1Lx3WLFQWm3ooKDYZD1eXmoGO9fxYQjrycfHFC8P0sCfQVXyROp0p9PFWBehewBOdCwHc+f/b8I0fMto5eSfwA==",
"version": "1.12.2",
"resolved": "https://registry.npmjs.org/axios/-/axios-1.12.2.tgz",
"integrity": "sha512-vMJzPewAlRyOgxV2dU0Cuz2O8zzzx9VYtbJOaBgXFeLc4IV/Eg50n4LowmehOOR61S8ZMpc2K5Sa7g6A4jfkUw==",
"license": "MIT",
"dependencies": {
"follow-redirects": "^1.15.6",
"form-data": "^4.0.4",

View File

@@ -11,6 +11,7 @@ import os
# TODO(developer): replace this with your Google API key
api_key = os.environ.get("GOOGLE_API_KEY") or "your-api-key" # Set your API key here
os.environ["GOOGLE_API_KEY"] = api_key
async def main():
with ToolboxSyncClient("http://127.0.0.1:5000") as toolbox_client:

View File

@@ -1,3 +1,3 @@
google-adk==1.14.1
toolbox-core==0.5.0
google-adk==1.15.0
toolbox-core==0.5.2
pytest==8.4.2

View File

@@ -1,3 +1,3 @@
google-genai==1.38.0
toolbox-core==0.5.0
toolbox-core==0.5.2
pytest==8.4.2

View File

@@ -1,4 +1,4 @@
llama-index==0.14.2
llama-index-llms-google-genai==0.5.0
llama-index==0.14.3
llama-index-llms-google-genai==0.6.0
toolbox-llamaindex==0.5.2
pytest==8.4.2

View File

@@ -13,7 +13,7 @@ In this section, we will download Toolbox, configure our tools in a
<!-- {x-release-please-start-version} -->
```bash
export OS="linux/amd64" # one of linux/amd64, darwin/arm64, darwin/amd64, or windows/amd64
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/$OS/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.16.0/$OS/toolbox
```
<!-- {x-release-please-end} -->

View File

@@ -48,19 +48,19 @@ to expose your developer assistant tools to a Looker instance:
<!-- {x-release-please-start-version} -->
{{< tabpane persist=header >}}
{{< tab header="linux/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/linux/amd64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.16.0/linux/amd64/toolbox
{{< /tab >}}
{{< tab header="darwin/arm64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/darwin/arm64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.16.0/darwin/arm64/toolbox
{{< /tab >}}
{{< tab header="darwin/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/darwin/amd64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.16.0/darwin/amd64/toolbox
{{< /tab >}}
{{< tab header="windows/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/windows/amd64/toolbox.exe
curl -O https://storage.googleapis.com/genai-toolbox/v0.16.0/windows/amd64/toolbox.exe
{{< /tab >}}
{{< /tabpane >}}
<!-- {x-release-please-end} -->

View File

@@ -45,19 +45,19 @@ instance:
<!-- {x-release-please-start-version} -->
{{< tabpane persist=header >}}
{{< tab header="linux/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/linux/amd64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.16.0/linux/amd64/toolbox
{{< /tab >}}
{{< tab header="darwin/arm64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/darwin/arm64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.16.0/darwin/arm64/toolbox
{{< /tab >}}
{{< tab header="darwin/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/darwin/amd64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.16.0/darwin/amd64/toolbox
{{< /tab >}}
{{< tab header="windows/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/windows/amd64/toolbox.exe
curl -O https://storage.googleapis.com/genai-toolbox/v0.16.0/windows/amd64/toolbox.exe
{{< /tab >}}
{{< /tabpane >}}
<!-- {x-release-please-end} -->

View File

@@ -43,19 +43,19 @@ expose your developer assistant tools to a MySQL instance:
<!-- {x-release-please-start-version} -->
{{< tabpane persist=header >}}
{{< tab header="linux/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/linux/amd64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.16.0/linux/amd64/toolbox
{{< /tab >}}
{{< tab header="darwin/arm64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/darwin/arm64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.16.0/darwin/arm64/toolbox
{{< /tab >}}
{{< tab header="darwin/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/darwin/amd64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.16.0/darwin/amd64/toolbox
{{< /tab >}}
{{< tab header="windows/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/windows/amd64/toolbox.exe
curl -O https://storage.googleapis.com/genai-toolbox/v0.16.0/windows/amd64/toolbox.exe
{{< /tab >}}
{{< /tabpane >}}
<!-- {x-release-please-end} -->

View File

@@ -44,19 +44,19 @@ expose your developer assistant tools to a Neo4j instance:
<!-- {x-release-please-start-version} -->
{{< tabpane persist=header >}}
{{< tab header="linux/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/linux/amd64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.16.0/linux/amd64/toolbox
{{< /tab >}}
{{< tab header="darwin/arm64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/darwin/arm64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.16.0/darwin/arm64/toolbox
{{< /tab >}}
{{< tab header="darwin/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/darwin/amd64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.16.0/darwin/amd64/toolbox
{{< /tab >}}
{{< tab header="windows/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/windows/amd64/toolbox.exe
curl -O https://storage.googleapis.com/genai-toolbox/v0.16.0/windows/amd64/toolbox.exe
{{< /tab >}}
{{< /tabpane >}}
<!-- {x-release-please-end} -->

View File

@@ -56,19 +56,19 @@ Omni](https://cloud.google.com/alloydb/omni/current/docs/overview).
<!-- {x-release-please-start-version} -->
{{< tabpane persist=header >}}
{{< tab header="linux/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/linux/amd64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.16.0/linux/amd64/toolbox
{{< /tab >}}
{{< tab header="darwin/arm64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/darwin/arm64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.16.0/darwin/arm64/toolbox
{{< /tab >}}
{{< tab header="darwin/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/darwin/amd64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.16.0/darwin/amd64/toolbox
{{< /tab >}}
{{< tab header="windows/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/windows/amd64/toolbox.exe
curl -O https://storage.googleapis.com/genai-toolbox/v0.16.0/windows/amd64/toolbox.exe
{{< /tab >}}
{{< /tabpane >}}
<!-- {x-release-please-end} -->

View File

@@ -43,19 +43,19 @@ to expose your developer assistant tools to a SQLite instance:
<!-- {x-release-please-start-version} -->
{{< tabpane persist=header >}}
{{< tab header="linux/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/linux/amd64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.16.0/linux/amd64/toolbox
{{< /tab >}}
{{< tab header="darwin/arm64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/darwin/arm64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.16.0/darwin/arm64/toolbox
{{< /tab >}}
{{< tab header="darwin/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/darwin/amd64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.16.0/darwin/amd64/toolbox
{{< /tab >}}
{{< tab header="windows/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/windows/amd64/toolbox.exe
curl -O https://storage.googleapis.com/genai-toolbox/v0.16.0/windows/amd64/toolbox.exe
{{< /tab >}}
{{< /tabpane >}}
<!-- {x-release-please-end} -->

View File

@@ -0,0 +1,38 @@
---
title: Connect via Gemini CLI Extensions
type: docs
weight: 2
description: "Connect to Toolbox via Gemini CLI Extensions."
---
## Gemini CLI Extensions
[Gemini CLI][gemini-cli] is an open-source AI agent designed to assist with development workflows, including coding, debugging, data exploration, and content creation. Combined with the extensions listed below, it provides an agentic interface for interacting with database and analytics services and popular open-source databases.
### How extensions work
Gemini CLI is highly extensible, allowing for the addition of new tools and capabilities through extensions. You can load the extensions from a GitHub URL, a local directory, or a configurable registry. They provide new tools, slash commands, and prompts to assist with your workflow.
Use the Gemini CLI Extensions to load prebuilt or custom tools for interacting with your databases; an installation example follows the list below.
[gemini-cli]: https://google-gemini.github.io/gemini-cli/
Below is a list of Gemini CLI Extensions powered by MCP Toolbox:
* [alloydb](https://github.com/gemini-cli-extensions/alloydb)
* [alloydb-observability](https://github.com/gemini-cli-extensions/alloydb-observability)
* [bigquery-conversational-analytics](https://github.com/gemini-cli-extensions/bigquery-conversational-analytics)
* [bigquery-data-analytics](https://github.com/gemini-cli-extensions/bigquery-data-analytics)
* [cloud-sql-mysql](https://github.com/gemini-cli-extensions/cloud-sql-mysql)
* [cloud-sql-mysql-observability](https://github.com/gemini-cli-extensions/cloud-sql-mysql-observability)
* [cloud-sql-postgresql](https://github.com/gemini-cli-extensions/cloud-sql-postgresql)
* [cloud-sql-postgresql-observability](https://github.com/gemini-cli-extensions/cloud-sql-postgresql-observability)
* [cloud-sql-sqlserver](https://github.com/gemini-cli-extensions/cloud-sql-sqlserver)
* [cloud-sql-sqlserver-observability](https://github.com/gemini-cli-extensions/cloud-sql-sqlserver-observability)
* [dataplex](https://github.com/gemini-cli-extensions/dataplex)
* [firestore-native](https://github.com/gemini-cli-extensions/firestore-native)
* [looker](https://github.com/gemini-cli-extensions/looker)
* [mcp-toolbox](https://github.com/gemini-cli-extensions/mcp-toolbox)
* [mysql](https://github.com/gemini-cli-extensions/mysql)
* [postgres](https://github.com/gemini-cli-extensions/postgres)
* [spanner](https://github.com/gemini-cli-extensions/spanner)
* [sql-server](https://github.com/gemini-cli-extensions/sql-server)
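Each of these installs with a single `gemini extensions install` command pointing at its repository. For example, to add the `postgres` extension (illustrative; any URL from the list above works the same way):
```sh
gemini extensions install https://github.com/gemini-cli-extensions/postgres
```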

View File

@@ -0,0 +1,57 @@
---
title: "Cassandra"
type: docs
weight: 1
description: >
Cassandra is a NoSQL distributed database known for its horizontal scalability, distributed architecture, and flexible schema definition.
---
## About
[Cassandra][cassandra-docs] is an open-source, non-relational, distributed NoSQL database. Its strengths include horizontal scalability, a distributed architecture, and a flexible approach to schema definition.
[cassandra-docs]: https://cassandra.apache.org/
## Available Tools
- [`cassandra-cql`](../tools/cassandra/cassandra-cql.md)
Run parameterized CQL queries in Cassandra.
## Example
```yaml
sources:
my-cassandra-source:
kind: cassandra
hosts:
- 127.0.0.1
keyspace: my_keyspace
protoVersion: 4
username: ${USER_NAME}
password: ${PASSWORD}
caPath: /path/to/ca.crt # Optional: path to CA certificate
certPath: /path/to/client.crt # Optional: path to client certificate
keyPath: /path/to/client.key # Optional: path to client key
enableHostVerification: true # Optional: enable host verification
```
{{< notice tip >}}
Use environment variable replacement with the format ${ENV_NAME}
instead of hardcoding your secrets into the configuration file.
{{< /notice >}}
## Reference
| **field** | **type** | **required** | **description** |
|------------------------|:---------:|:------------:|-------------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "cassandra". |
| hosts | string[] | true | List of IP addresses to connect to (e.g., ["192.168.1.1:9042", "192.168.1.2:9042","192.168.1.3:9042"]). The default port is 9042 if not specified. |
| keyspace | string | true | Name of the Cassandra keyspace to connect to (e.g., "my_keyspace"). |
| protoVersion | integer | false | Protocol version for the Cassandra connection (e.g., 4). |
| username | string | false | Name of the Cassandra user to connect as (e.g., "my-cassandra-user"). |
| password | string | false | Password of the Cassandra user (e.g., "my-password"). |
| caPath | string | false | Path to the CA certificate for SSL/TLS (e.g., "/path/to/ca.crt"). |
| certPath | string | false | Path to the client certificate for SSL/TLS (e.g., "/path/to/client.crt"). |
| keyPath | string | false | Path to the client key for SSL/TLS (e.g., "/path/to/client.key"). |
| enableHostVerification | boolean | false | Enable host verification for SSL/TLS (e.g., true). By default, host verification is disabled. |

View File

@@ -15,9 +15,23 @@ It's compatible with the following sources:
- [bigquery](../../sources/bigquery.md)
`bigquery-execute-sql` takes a required `sql` input parameter and runs the SQL
statement against the configured `source`. It also supports an optional `dry_run`
parameter to validate a query without executing it.
`bigquery-execute-sql` accepts the following parameters:
- **`sql`** (required): The GoogleSQL statement to execute.
- **`dry_run`** (optional): If set to `true`, the query is validated but not run,
returning information about the execution instead. Defaults to `false`.
The tool's behavior is influenced by the `allowedDatasets` restriction on the
`bigquery` source:
- **Without `allowedDatasets` restriction:** The tool can execute any valid GoogleSQL
query.
- **With `allowedDatasets` restriction:** Before execution, the tool performs a dry run
to analyze the query.
It will reject the query if it attempts to access any table outside the
allowed `datasets` list. To enforce this restriction, the following operations
are also disallowed:
- **Dataset-level operations** (e.g., `CREATE SCHEMA`, `ALTER SCHEMA`).
- **Unanalyzable operations** where the accessed tables cannot be determined
statically (e.g., `EXECUTE IMMEDIATE`, `CREATE PROCEDURE`, `CALL`).
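The `allowedDatasets` restriction itself is declared on the `bigquery` source rather than on the tool. A minimal sketch of such a source, assuming a `project` field and using placeholder project and dataset names:
```yaml
sources:
  my-bigquery-source:
    kind: bigquery
    project: my-project-id      # placeholder GCP project
    allowedDatasets:            # tools bound to this source may only touch these datasets
      - analytics
      - reporting
```
Under a configuration like this, the dry-run analysis described above would reject any statement that references a table outside `analytics` or `reporting`.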
## Example

View File

@@ -33,6 +33,13 @@ query based on the provided parameters:
- **horizon** (integer, optional): The number of future time steps you want to
predict. It defaults to 10 if not specified.
The tool's behavior regarding these parameters is influenced by the `allowedDatasets` restriction on the `bigquery` source:
- **Without `allowedDatasets` restriction:** The tool can use any table or query for the `history_data` parameter.
- **With `allowedDatasets` restriction:** The tool verifies that the `history_data` parameter only accesses tables
within the allowed datasets. If `history_data` is a table ID, the tool checks if the table's dataset is in the
allowed list. If `history_data` is a query, the tool performs a dry run to analyze the query and rejects it
if it accesses any table outside the allowed list.
## Example
```yaml

View File

@@ -0,0 +1,7 @@
---
title: "Cassandra"
type: docs
weight: 1
description: >
Tools that work with Cassandra Sources.
---

View File

@@ -0,0 +1,96 @@
---
title: "cassandra-cql"
type: docs
weight: 1
description: >
A "cassandra-cql" tool executes a pre-defined CQL statement against a Cassandra
database.
aliases:
- /resources/tools/cassandra-cql
---
## About
A `cassandra-cql` tool executes a pre-defined CQL statement against a Cassandra
database. It's compatible with any of the following sources:
- [cassandra](../sources/cassandra.md)
The specified CQL statement is executed as a [prepared statement][cassandra-prepare],
and expects parameters in the CQL query to be in the form of placeholders `?`.
[cassandra-prepare]: https://docs.datastax.com/en/developer/go-driver/4.8/cql-prepared-statements/
## Example
> **Note:** This tool uses parameterized queries to prevent CQL injections.
> Query parameters can be used as substitutes for arbitrary expressions.
> Parameters cannot be used as substitutes for keyspaces, table names, column names,
> or other parts of the query.
```yaml
tools:
search_users_by_email:
kind: cassandra-cql
source: my-cassandra-cluster
statement: |
SELECT user_id, email, first_name, last_name, created_at
FROM users
WHERE email = ?
description: |
Use this tool to retrieve specific user information by their email address.
Takes an email address and returns user details including user ID, email,
first name, last name, and account creation timestamp.
Do NOT use this tool with a user ID or other identifiers.
Example:
{{
"email": "user@example.com",
}}
parameters:
- name: email
type: string
description: User's email address
```
### Example with Template Parameters
> **Note:** This tool allows direct modifications to the CQL statement,
> including keyspaces, table names, and column names. **This makes it more
> vulnerable to CQL injections**. Using basic parameters only (see above) is
> recommended for performance and safety reasons. For more details, please check
> [templateParameters](../#template-parameters).
```yaml
tools:
list_keyspace_table:
kind: cassandra-cql
source: my-cassandra-cluster
statement: |
SELECT * FROM {{.keyspace}}.{{.tableName}};
description: |
Use this tool to list all information from a specific table in a keyspace.
Example:
{{
"keyspace": "my_keyspace",
"tableName": "users",
}}
templateParameters:
- name: keyspace
type: string
description: Keyspace containing the table
- name: tableName
type: string
description: Table to select from
```
## Reference
| **field** | **type** | **required** | **description** |
|--------------------|:------------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "cassandra-cql". |
| source | string | true | Name of the source the CQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |
| statement | string | true | CQL statement to execute. |
| authRequired | []string | false | List of authentication requirements for the source. |
| parameters | [parameters](../#specifying-parameters) | false | List of [parameters](../#specifying-parameters) that will be inserted into the CQL statement. |
| templateParameters | [templateParameters](../#template-parameters) | false | List of [templateParameters](../#template-parameters) that will be inserted into the CQL statement before executing prepared statement. |

View File

@@ -297,7 +297,8 @@
"setup_queries = [\n",
" # Install required extension to use the AlloyDB AI natural language support API\n",
" \"\"\"CREATE EXTENSION IF NOT EXISTS alloydb_ai_nl cascade;\"\"\",\n",
"\n",
" \"\"\"CREATE EXTENSION IF NOT EXISTS google_ml_integration; \"\"\",\n",
" \n",
" # Create schema\n",
" \"\"\"CREATE SCHEMA IF NOT EXISTS nla_demo;\"\"\",\n",
"\n",
@@ -689,15 +690,18 @@
"source": [
"async def run_queries(pool):\n",
" async with pool.connect() as db_conn:\n",
"\n",
" # Verify the generated context for the tables\n",
" for query in verify_context_queries:\n",
" response = await db_conn.execute(sqlalchemy.text(query))\n",
" print(\"Verify the context:\", response.mappings().all())\n",
"\n",
" \n",
" # Update context that needs revision\n",
" for query in update_context_queries:\n",
" await db_conn.execute(sqlalchemy.text(query))\n",
"\n",
" \n",
" # The resulting context entries in the alloydb_ai_nl.generated_schema_context_view \n",
" # view are applied to the corresponding schema objects, and the comments are overwritten.\n",
" for query in apply_generated_context_queries:\n",
@@ -767,7 +771,7 @@
},
"outputs": [],
"source": [
"version = \"0.15.0\" # x-release-please-version\n",
"version = \"0.16.0\" # x-release-please-version\n",
"! curl -L -o /content/toolbox https://storage.googleapis.com/genai-toolbox/v{version}/linux/amd64/toolbox\n",
"\n",
"# Make the binary executable\n",

View File

@@ -123,7 +123,7 @@ In this section, we will download and install the Toolbox binary.
<!-- {x-release-please-start-version} -->
```bash
export OS="linux/amd64" # one of linux/amd64, darwin/arm64, darwin/amd64, or windows/amd64
export VERSION="0.15.0"
export VERSION="0.16.0"
curl -O https://storage.googleapis.com/genai-toolbox/v$VERSION/$OS/toolbox
```
<!-- {x-release-please-end} -->

View File

@@ -220,7 +220,7 @@
},
"outputs": [],
"source": [
"version = \"0.15.0\" # x-release-please-version\n",
"version = \"0.16.0\" # x-release-please-version\n",
"! curl -O https://storage.googleapis.com/genai-toolbox/v{version}/linux/amd64/toolbox\n",
"\n",
"# Make the binary executable\n",

View File

@@ -179,7 +179,7 @@ to use BigQuery, and then run the Toolbox server.
<!-- {x-release-please-start-version} -->
```bash
export OS="linux/amd64" # one of linux/amd64, darwin/arm64, darwin/amd64, or windows/amd64
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/$OS/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.16.0/$OS/toolbox
```
<!-- {x-release-please-end} -->

View File

@@ -98,7 +98,7 @@ In this section, we will download Toolbox, configure our tools in a
<!-- {x-release-please-start-version} -->
```bash
export OS="linux/amd64" # one of linux/amd64, darwin/arm64, darwin/amd64, or windows/amd64
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/$OS/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.16.0/$OS/toolbox
```
<!-- {x-release-please-end} -->

View File

@@ -34,7 +34,7 @@ In this section, we will download Toolbox and run the Toolbox server.
<!-- {x-release-please-start-version} -->
```bash
export OS="linux/amd64" # one of linux/amd64, darwin/arm64, darwin/amd64, or windows/amd64
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/$OS/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.16.0/$OS/toolbox
```
<!-- {x-release-please-end} -->

View File

@@ -48,7 +48,7 @@ In this section, we will download Toolbox and run the Toolbox server.
<!-- {x-release-please-start-version} -->
```bash
export OS="linux/amd64" # one of linux/amd64, darwin/arm64, darwin/amd64, or windows/amd64
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/$OS/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.16.0/$OS/toolbox
```
<!-- {x-release-please-end} -->
@@ -64,6 +64,7 @@ In this section, we will download Toolbox and run the Toolbox server.
```bash
export LOOKER_BASE_URL=https://looker.example.com
export LOOKER_VERIFY_SSL=true
export LOOKER_USE_CLIENT_OAUTH=true
```
In some instances you may need to append `:19999` to

View File

@@ -34,7 +34,7 @@ In this section, we will download Toolbox and run the Toolbox server.
<!-- {x-release-please-start-version} -->
```bash
export OS="linux/amd64" # one of linux/amd64, darwin/arm64, darwin/amd64, or windows/amd64
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/$OS/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.16.0/$OS/toolbox
```
<!-- {x-release-please-end} -->

go.mod
View File

@@ -6,10 +6,10 @@ toolchain go1.25.1
require (
cloud.google.com/go/alloydbconn v1.15.5
cloud.google.com/go/bigquery v1.70.0
cloud.google.com/go/bigquery v1.71.0
cloud.google.com/go/bigtable v1.40.0
cloud.google.com/go/cloudsqlconn v1.18.1
cloud.google.com/go/dataplex v1.27.0
cloud.google.com/go/dataplex v1.27.1
cloud.google.com/go/firestore v1.18.0
cloud.google.com/go/spanner v1.85.1
github.com/ClickHouse/clickhouse-go/v2 v2.40.3
@@ -26,6 +26,7 @@ require (
github.com/go-playground/validator/v10 v10.27.0
github.com/go-sql-driver/mysql v1.9.3
github.com/goccy/go-yaml v1.18.0
github.com/gocql/gocql v1.7.0
github.com/google/go-cmp v0.7.0
github.com/google/uuid v1.6.0
github.com/jackc/pgx/v5 v5.7.6
@@ -50,8 +51,8 @@ require (
go.opentelemetry.io/otel/sdk/metric v1.37.0
go.opentelemetry.io/otel/trace v1.38.0
golang.org/x/oauth2 v0.31.0
google.golang.org/api v0.249.0
google.golang.org/genproto v0.0.0-20250826171959-ef028d996bc1
google.golang.org/api v0.250.0
google.golang.org/genproto v0.0.0-20250922171735-9219d122eba9
modernc.org/sqlite v1.39.0
)
@@ -73,7 +74,7 @@ require (
cloud.google.com/go/alloydb v1.18.0 // indirect
cloud.google.com/go/auth v0.16.5 // indirect
cloud.google.com/go/auth/oauth2adapt v0.2.8 // indirect
cloud.google.com/go/compute/metadata v0.8.0 // indirect
cloud.google.com/go/compute/metadata v0.8.4 // indirect
cloud.google.com/go/iam v1.5.2 // indirect
cloud.google.com/go/longrunning v0.6.7 // indirect
cloud.google.com/go/monitoring v1.24.2 // indirect
@@ -115,6 +116,7 @@ require (
github.com/gorilla/websocket v1.5.3 // indirect
github.com/grpc-ecosystem/go-grpc-middleware v1.4.0 // indirect
github.com/grpc-ecosystem/grpc-gateway/v2 v2.27.1 // indirect
github.com/hailocab/go-hostpool v0.0.0-20160125115350-e80d13ce29ed // indirect
github.com/hashicorp/go-uuid v1.0.3 // indirect
github.com/inconshreveable/mousetrap v1.1.0 // indirect
github.com/jackc/pgpassfile v1.0.0 // indirect
@@ -169,13 +171,14 @@ require (
golang.org/x/sync v0.17.0 // indirect
golang.org/x/sys v0.36.0 // indirect
golang.org/x/text v0.29.0 // indirect
golang.org/x/time v0.12.0 // indirect
golang.org/x/time v0.13.0 // indirect
golang.org/x/tools v0.36.0 // indirect
golang.org/x/xerrors v0.0.0-20240903120638-7835f813f4da // indirect
google.golang.org/genproto/googleapis/api v0.0.0-20250818200422-3122310a409c // indirect
google.golang.org/genproto/googleapis/rpc v0.0.0-20250818200422-3122310a409c // indirect
google.golang.org/grpc v1.75.0 // indirect
google.golang.org/protobuf v1.36.8 // indirect
google.golang.org/genproto/googleapis/api v0.0.0-20250908214217-97024824d090 // indirect
google.golang.org/genproto/googleapis/rpc v0.0.0-20250908214217-97024824d090 // indirect
google.golang.org/grpc v1.75.1 // indirect
google.golang.org/protobuf v1.36.9 // indirect
gopkg.in/inf.v0 v0.9.1 // indirect
gopkg.in/ini.v1 v1.67.0 // indirect
modernc.org/libc v1.66.3 // indirect
modernc.org/mathutil v1.7.1 // indirect

go.sum
View File

@@ -137,8 +137,8 @@ cloud.google.com/go/bigquery v1.47.0/go.mod h1:sA9XOgy0A8vQK9+MWhEQTY6Tix87M/Zur
cloud.google.com/go/bigquery v1.48.0/go.mod h1:QAwSz+ipNgfL5jxiaK7weyOhzdoAy1zFm0Nf1fysJac=
cloud.google.com/go/bigquery v1.49.0/go.mod h1:Sv8hMmTFFYBlt/ftw2uN6dFdQPzBlREY9yBh7Oy7/4Q=
cloud.google.com/go/bigquery v1.50.0/go.mod h1:YrleYEh2pSEbgTBZYMJ5SuSr0ML3ypjRB1zgf7pvQLU=
cloud.google.com/go/bigquery v1.70.0 h1:V1OIhhOSionCOXWMmypXOvZu/ogkzosa7s1ArWJO/Yg=
cloud.google.com/go/bigquery v1.70.0/go.mod h1:6lEAkgTJN+H2JcaX1eKiuEHTKyqBaJq5U3SpLGbSvwI=
cloud.google.com/go/bigquery v1.71.0 h1:NvSZvXU1Hyb+YiRVKQPuQXGeZaw/0NP6M/WOrBqSx3g=
cloud.google.com/go/bigquery v1.71.0/go.mod h1:GUbRtmeCckOE85endLherHD9RsujY+gS7i++c1CqssQ=
cloud.google.com/go/bigtable v1.40.0 h1:iNeqGqkJvFdjg07Ku3F7KKfq5QZvBySisYHVsLB1RwE=
cloud.google.com/go/bigtable v1.40.0/go.mod h1:LtPzCcrAFaGRZ82Hs8xMueUeYW9Jw12AmNdUTMfDnh4=
cloud.google.com/go/billing v1.4.0/go.mod h1:g9IdKBEFlItS8bTtlrZdVLWSSdSyFUZKXNS02zKMOZY=
@@ -194,8 +194,8 @@ cloud.google.com/go/compute/metadata v0.1.0/go.mod h1:Z1VN+bulIf6bt4P/C37K4DyZYZ
cloud.google.com/go/compute/metadata v0.2.0/go.mod h1:zFmK7XCadkQkj6TtorcaGlCW1hT1fIilQDwofLpJ20k=
cloud.google.com/go/compute/metadata v0.2.1/go.mod h1:jgHgmJd2RKBGzXqF5LR2EZMGxBkeanZ9wwa75XHJgOM=
cloud.google.com/go/compute/metadata v0.2.3/go.mod h1:VAV5nSsACxMJvgaAuX6Pk2AawlZn8kiOGuCv6gTkwuA=
cloud.google.com/go/compute/metadata v0.8.0 h1:HxMRIbao8w17ZX6wBnjhcDkW6lTFpgcaobyVfZWqRLA=
cloud.google.com/go/compute/metadata v0.8.0/go.mod h1:sYOGTp851OV9bOFJ9CH7elVvyzopvWQFNNghtDQ/Biw=
cloud.google.com/go/compute/metadata v0.8.4 h1:oXMa1VMQBVCyewMIOm3WQsnVd9FbKBtm8reqWRaXnHQ=
cloud.google.com/go/compute/metadata v0.8.4/go.mod h1:E0bWwX5wTnLPedCKqk3pJmVgCBSM6qQI1yTBdEb3C10=
cloud.google.com/go/contactcenterinsights v1.3.0/go.mod h1:Eu2oemoePuEFc/xKFPjbTuPSj0fYJcPls9TFlPNnHHY=
cloud.google.com/go/contactcenterinsights v1.4.0/go.mod h1:L2YzkGbPsv+vMQMCADxJoT9YiTTnSEd6fEvCeHTYVck=
cloud.google.com/go/contactcenterinsights v1.6.0/go.mod h1:IIDlT6CLcDoyv79kDv8iWxMSTZhLxSCofVV5W6YFM/w=
@@ -216,8 +216,8 @@ cloud.google.com/go/datacatalog v1.8.0/go.mod h1:KYuoVOv9BM8EYz/4eMFxrr4DUKhGIOX
cloud.google.com/go/datacatalog v1.8.1/go.mod h1:RJ58z4rMp3gvETA465Vg+ag8BGgBdnRPEMMSTr5Uv+M=
cloud.google.com/go/datacatalog v1.12.0/go.mod h1:CWae8rFkfp6LzLumKOnmVh4+Zle4A3NXLzVJ1d1mRm0=
cloud.google.com/go/datacatalog v1.13.0/go.mod h1:E4Rj9a5ZtAxcQJlEBTLgMTphfP11/lNaAshpoBgemX8=
cloud.google.com/go/datacatalog v1.26.0 h1:eFgygb3DTufTWWUB8ARk+dSuXz+aefNJXTlkWlQcWwE=
cloud.google.com/go/datacatalog v1.26.0/go.mod h1:bLN2HLBAwB3kLTFT5ZKLHVPj/weNz6bR0c7nYp0LE14=
cloud.google.com/go/datacatalog v1.26.1 h1:bCRKA8uSQN8wGW3Tw0gwko4E9a64GRmbW1nCblhgC2k=
cloud.google.com/go/datacatalog v1.26.1/go.mod h1:2Qcq8vsHNxMDgjgadRFmFG47Y+uuIVsyEGUrlrKEdrg=
cloud.google.com/go/dataflow v0.6.0/go.mod h1:9QwV89cGoxjjSR9/r7eFDqqjtvbKxAK2BaYU6PVk9UM=
cloud.google.com/go/dataflow v0.7.0/go.mod h1:PX526vb4ijFMesO1o202EaUmouZKBpjHsTlCtB4parQ=
cloud.google.com/go/dataflow v0.8.0/go.mod h1:Rcf5YgTKPtQyYz8bLYhFoIV/vP39eL7fWNcSOyFfLJE=
@@ -236,8 +236,8 @@ cloud.google.com/go/dataplex v1.3.0/go.mod h1:hQuRtDg+fCiFgC8j0zV222HvzFQdRd+SVX
cloud.google.com/go/dataplex v1.4.0/go.mod h1:X51GfLXEMVJ6UN47ESVqvlsRplbLhcsAt0kZCCKsU0A=
cloud.google.com/go/dataplex v1.5.2/go.mod h1:cVMgQHsmfRoI5KFYq4JtIBEUbYwc3c7tXmIDhRmNNVQ=
cloud.google.com/go/dataplex v1.6.0/go.mod h1:bMsomC/aEJOSpHXdFKFGQ1b0TDPIeL28nJObeO1ppRs=
cloud.google.com/go/dataplex v1.27.0 h1:k6gf5DnX7YHD/hilShxVP9ExmGrEWZFjfBZ7rHt4JlM=
cloud.google.com/go/dataplex v1.27.0/go.mod h1:VB+xlYJiJ5kreonXsa2cHPj0A3CfPh/mgiHG4JFhbUA=
cloud.google.com/go/dataplex v1.27.1 h1:renSEYTQZMQ3ag7lM0BDmSj4FWqaTGW60YQ/lvAE5iA=
cloud.google.com/go/dataplex v1.27.1/go.mod h1:VB+xlYJiJ5kreonXsa2cHPj0A3CfPh/mgiHG4JFhbUA=
cloud.google.com/go/dataproc v1.7.0/go.mod h1:CKAlMjII9H90RXaMpSxQ8EU6dQx6iAYNPcYPOkSbi8s=
cloud.google.com/go/dataproc v1.8.0/go.mod h1:5OW+zNAH0pMpw14JVrPONsxMQYMBqJuzORhIBfBn9uI=
cloud.google.com/go/dataproc v1.12.0/go.mod h1:zrF3aX0uV3ikkMz6z4uBbIKyhRITnxvr4i3IjKsKrw4=
@@ -737,6 +737,10 @@ github.com/aws/aws-sdk-go-v2/service/sts v1.38.4/go.mod h1:Z+Gd23v97pX9zK97+tX4p
github.com/aws/smithy-go v1.23.0 h1:8n6I3gXzWJB2DxBDnfxgBaSX6oe0d/t10qGz7OKqMCE=
github.com/aws/smithy-go v1.23.0/go.mod h1:t1ufH5HMublsJYulve2RKmHDC15xu1f26kHCp/HgceI=
github.com/benbjohnson/clock v1.1.0/go.mod h1:J11/hYXuz8f4ySSvYwY0FKfm+ezbsZBKZxNJlLklBHA=
github.com/bitly/go-hostpool v0.0.0-20171023180738-a3a6125de932 h1:mXoPYz/Ul5HYEDvkta6I8/rnYM5gSdSV2tJ6XbZuEtY=
github.com/bitly/go-hostpool v0.0.0-20171023180738-a3a6125de932/go.mod h1:NOuUCSz6Q9T7+igc/hlvDOUdtWKryOrtFyIVABv/p7k=
github.com/bmizerany/assert v0.0.0-20160611221934-b7ed37b82869 h1:DDGfHa7BWjL4YnC6+E63dPcxHo2sUxDIu8g3QgEJdRY=
github.com/bmizerany/assert v0.0.0-20160611221934-b7ed37b82869/go.mod h1:Ekp36dRnpXw/yCqJaO+ZrUyxD+3VXMFFr56k5XYrpB4=
github.com/boombuler/barcode v1.0.0/go.mod h1:paBWMcWSl3LHKBqUq+rly7CNSldXjb2rDl3JlRe0mD8=
github.com/boombuler/barcode v1.0.1/go.mod h1:paBWMcWSl3LHKBqUq+rly7CNSldXjb2rDl3JlRe0mD8=
github.com/bsm/ginkgo/v2 v2.12.0 h1:Ny8MWAHyOepLGlLKYmXG4IEkioBysk6GpaRTLC8zwWs=
@@ -897,6 +901,8 @@ github.com/goccy/go-json v0.10.5 h1:Fq85nIqj+gXn/S5ahsiTlK3TmC85qgirsdTP/+DeaC4=
github.com/goccy/go-json v0.10.5/go.mod h1:oq7eo15ShAhp70Anwd5lgX2pLfOS3QCiwU/PULtXL6M=
github.com/goccy/go-yaml v1.18.0 h1:8W7wMFS12Pcas7KU+VVkaiCng+kG8QiFeFwzFb+rwuw=
github.com/goccy/go-yaml v1.18.0/go.mod h1:XBurs7gK8ATbW4ZPGKgcbrY1Br56PdM69F7LkFRi1kA=
github.com/gocql/gocql v1.7.0 h1:O+7U7/1gSN7QTEAaMEsJc1Oq2QHXvCWoF3DFK9HDHus=
github.com/gocql/gocql v1.7.0/go.mod h1:vnlvXyFZeLBF0Wy+RS8hrOdbn0UWsWtdg07XJnFxZ+4=
github.com/gogo/protobuf v1.3.2/go.mod h1:P1XiOD3dCwIKUDQYPy72D8LYyHL2YPYrpS2s69NZV8Q=
github.com/golang-jwt/jwt/v5 v5.3.0 h1:pv4AsKCKKZuqlgs5sUmn4x8UlGa0kEVt/puTpKx9vvo=
github.com/golang-jwt/jwt/v5 v5.3.0/go.mod h1:fxCRLWMO43lRc8nhHWY6LGqRcf+1gQWArsqaEUEa5bE=
@@ -1044,6 +1050,8 @@ github.com/grpc-ecosystem/grpc-gateway/v2 v2.7.0/go.mod h1:hgWBS7lorOAVIJEQMi4Zs
github.com/grpc-ecosystem/grpc-gateway/v2 v2.11.3/go.mod h1:o//XUCC/F+yRGJoPO/VU0GSB0f8Nhgmxx0VIRUvaC0w=
github.com/grpc-ecosystem/grpc-gateway/v2 v2.27.1 h1:X5VWvz21y3gzm9Nw/kaUeku/1+uBhcekkmy4IkffJww=
github.com/grpc-ecosystem/grpc-gateway/v2 v2.27.1/go.mod h1:Zanoh4+gvIgluNqcfMVTJueD4wSS5hT7zTt4Mrutd90=
github.com/hailocab/go-hostpool v0.0.0-20160125115350-e80d13ce29ed h1:5upAirOpQc1Q53c0bnx2ufif5kANL7bfZWcc6VJWJd8=
github.com/hailocab/go-hostpool v0.0.0-20160125115350-e80d13ce29ed/go.mod h1:tMWxXQ9wFIaZeTI9F+hmhFiGpFmhOHzyShyFUhRm0H4=
github.com/hashicorp/go-uuid v1.0.2/go.mod h1:6SBZvOh/SIDV7/2o3Jml5SYk/TvGqwFJ/bN7x4byOro=
github.com/hashicorp/go-uuid v1.0.3 h1:2gKiV6YVmrJ1i2CKKa9obLvRieoRGviZFL26PcT/Co8=
github.com/hashicorp/go-uuid v1.0.3/go.mod h1:6SBZvOh/SIDV7/2o3Jml5SYk/TvGqwFJ/bN7x4byOro=
@@ -1679,8 +1687,8 @@ golang.org/x/time v0.0.0-20191024005414-555d28b269f0/go.mod h1:tRJNPiyCQ0inRvYxb
golang.org/x/time v0.0.0-20220922220347-f3bd1da661af/go.mod h1:tRJNPiyCQ0inRvYxbN9jk5I+vvW/OXSQhTDSoE431IQ=
golang.org/x/time v0.1.0/go.mod h1:tRJNPiyCQ0inRvYxbN9jk5I+vvW/OXSQhTDSoE431IQ=
golang.org/x/time v0.3.0/go.mod h1:tRJNPiyCQ0inRvYxbN9jk5I+vvW/OXSQhTDSoE431IQ=
golang.org/x/time v0.12.0 h1:ScB/8o8olJvc+CQPWrK3fPZNfh7qgwCrY0zJmoEQLSE=
golang.org/x/time v0.12.0/go.mod h1:CDIdPxbZBQxdj6cxyCIdrNogrJKMJ7pr37NYpMcMDSg=
golang.org/x/time v0.13.0 h1:eUlYslOIt32DgYD6utsuUeHs4d7AsEYLuIAdg7FlYgI=
golang.org/x/time v0.13.0/go.mod h1:eL/Oa2bBBK0TkX57Fyni+NgnyQQN4LitPmob2Hjnqw4=
golang.org/x/tools v0.0.0-20180525024113-a5b4c53f6e8b/go.mod h1:n7NCudcB/nEzxVGmLbDWY5pfWTLqBcC2KZ6jyYvM4mQ=
golang.org/x/tools v0.0.0-20180917221912-90fa682c2a6e/go.mod h1:n7NCudcB/nEzxVGmLbDWY5pfWTLqBcC2KZ6jyYvM4mQ=
golang.org/x/tools v0.0.0-20190114222345-bf090417da8b/go.mod h1:n7NCudcB/nEzxVGmLbDWY5pfWTLqBcC2KZ6jyYvM4mQ=
@@ -1826,8 +1834,8 @@ google.golang.org/api v0.108.0/go.mod h1:2Ts0XTHNVWxypznxWOYUeI4g3WdP9Pk2Qk58+a/
google.golang.org/api v0.110.0/go.mod h1:7FC4Vvx1Mooxh8C5HWjzZHcavuS2f6pmJpZx60ca7iI=
google.golang.org/api v0.111.0/go.mod h1:qtFHvU9mhgTJegR31csQ+rwxyUTHOKFqCKWp1J0fdw0=
google.golang.org/api v0.114.0/go.mod h1:ifYI2ZsFK6/uGddGfAD5BMxlnkBqCmqHSDUVi45N5Yg=
google.golang.org/api v0.249.0 h1:0VrsWAKzIZi058aeq+I86uIXbNhm9GxSHpbmZ92a38w=
google.golang.org/api v0.249.0/go.mod h1:dGk9qyI0UYPwO/cjt2q06LG/EhUpwZGdAbYF14wHHrQ=
google.golang.org/api v0.250.0 h1:qvkwrf/raASj82UegU2RSDGWi/89WkLckn4LuO4lVXM=
google.golang.org/api v0.250.0/go.mod h1:Y9Uup8bDLJJtMzJyQnu+rLRJLA0wn+wTtc6vTlOvfXo=
google.golang.org/appengine v1.1.0/go.mod h1:EbEs0AVv82hx2wNQdGPgUI5lhzA/G0D9YwlJXL52JkM=
google.golang.org/appengine v1.4.0/go.mod h1:xpcJRLb0r/rnEns0DIKYYv+WjYCduHsrkT7/EB5XEv4=
google.golang.org/appengine v1.5.0/go.mod h1:xpcJRLb0r/rnEns0DIKYYv+WjYCduHsrkT7/EB5XEv4=
@@ -1968,12 +1976,12 @@ google.golang.org/genproto v0.0.0-20230323212658-478b75c54725/go.mod h1:UUQDJDOl
google.golang.org/genproto v0.0.0-20230330154414-c0448cd141ea/go.mod h1:UUQDJDOlWu4KYeJZffbWgBkS1YFobzKbLVfK69pe0Ak=
google.golang.org/genproto v0.0.0-20230331144136-dcfb400f0633/go.mod h1:UUQDJDOlWu4KYeJZffbWgBkS1YFobzKbLVfK69pe0Ak=
google.golang.org/genproto v0.0.0-20230410155749-daa745c078e1/go.mod h1:nKE/iIaLqn2bQwXBg8f1g2Ylh6r5MN5CmZvuzZCgsCU=
google.golang.org/genproto v0.0.0-20250826171959-ef028d996bc1 h1:Nm5SEGIguOIBDXs5rhfz2aKwEVWlgwC58UcmEnLDc8Y=
google.golang.org/genproto v0.0.0-20250826171959-ef028d996bc1/go.mod h1:Jz9LrroM7Mcm+a0QrLh4UpZ1B/WhjIbqwEcUf4y08nQ=
google.golang.org/genproto/googleapis/api v0.0.0-20250818200422-3122310a409c h1:AtEkQdl5b6zsybXcbz00j1LwNodDuH6hVifIaNqk7NQ=
google.golang.org/genproto/googleapis/api v0.0.0-20250818200422-3122310a409c/go.mod h1:ea2MjsO70ssTfCjiwHgI0ZFqcw45Ksuk2ckf9G468GA=
google.golang.org/genproto/googleapis/rpc v0.0.0-20250818200422-3122310a409c h1:qXWI/sQtv5UKboZ/zUk7h+mrf/lXORyI+n9DKDAusdg=
google.golang.org/genproto/googleapis/rpc v0.0.0-20250818200422-3122310a409c/go.mod h1:gw1tLEfykwDz2ET4a12jcXt4couGAm7IwsVaTy0Sflo=
google.golang.org/genproto v0.0.0-20250922171735-9219d122eba9 h1:LvZVVaPE0JSqL+ZWb6ErZfnEOKIqqFWUJE2D0fObSmc=
google.golang.org/genproto v0.0.0-20250922171735-9219d122eba9/go.mod h1:QFOrLhdAe2PsTp3vQY4quuLKTi9j3XG3r6JPPaw7MSc=
google.golang.org/genproto/googleapis/api v0.0.0-20250908214217-97024824d090 h1:d8Nakh1G+ur7+P3GcMjpRDEkoLUcLW2iU92XVqR+XMQ=
google.golang.org/genproto/googleapis/api v0.0.0-20250908214217-97024824d090/go.mod h1:U8EXRNSd8sUYyDfs/It7KVWodQr+Hf9xtxyxWudSwEw=
google.golang.org/genproto/googleapis/rpc v0.0.0-20250908214217-97024824d090 h1:/OQuEa4YWtDt7uQWHd3q3sUMb+QOLQUg1xa8CEsRv5w=
google.golang.org/genproto/googleapis/rpc v0.0.0-20250908214217-97024824d090/go.mod h1:GmFNa4BdJZ2a8G+wCe9Bg3wwThLrJun751XstdJt5Og=
google.golang.org/grpc v1.19.0/go.mod h1:mqu4LbDTu4XGKhr4mRzUsmM4RtVoemTSY81AxZiDr8c=
google.golang.org/grpc v1.20.1/go.mod h1:10oTOabMzJvdu6/UiuZezV6QK5dSlG84ov/aaiqXj38=
google.golang.org/grpc v1.21.1/go.mod h1:oYelfM1adQP15Ek0mdvEgi9Df8B9CZIaU1084ijfRaM=
@@ -2015,8 +2023,8 @@ google.golang.org/grpc v1.52.3/go.mod h1:pu6fVzoFb+NBYNAvQL08ic+lvB2IojljRYuun5v
google.golang.org/grpc v1.53.0/go.mod h1:OnIrk0ipVdj4N5d9IUoFUx72/VlD7+jUsHwZgwSMQpw=
google.golang.org/grpc v1.54.0/go.mod h1:PUSEXI6iWghWaB6lXM4knEgpJNu2qUcKfDtNci3EC2g=
google.golang.org/grpc v1.56.3/go.mod h1:I9bI3vqKfayGqPUAwGdOSu7kt6oIJLixfffKrpXqQ9s=
google.golang.org/grpc v1.75.0 h1:+TW+dqTd2Biwe6KKfhE5JpiYIBWq865PhKGSXiivqt4=
google.golang.org/grpc v1.75.0/go.mod h1:JtPAzKiq4v1xcAB2hydNlWI2RnF85XXcV0mhKXr2ecQ=
google.golang.org/grpc v1.75.1 h1:/ODCNEuf9VghjgO3rqLcfg8fiOP0nSluljWFlDxELLI=
google.golang.org/grpc v1.75.1/go.mod h1:JtPAzKiq4v1xcAB2hydNlWI2RnF85XXcV0mhKXr2ecQ=
google.golang.org/grpc/cmd/protoc-gen-go-grpc v1.1.0/go.mod h1:6Kw0yEErY5E/yWrBtf03jp27GLLJujG4z/JK95pnjjw=
google.golang.org/protobuf v0.0.0-20200109180630-ec00e32a8dfd/go.mod h1:DFci5gLYBciE7Vtevhsrf46CRTquxDuWsQurQQe4oz8=
google.golang.org/protobuf v0.0.0-20200221191635-4d8936d0db64/go.mod h1:kwYJMbMJ01Woi6D6+Kah6886xMZcty6N08ah7+eCXa0=
@@ -2035,13 +2043,15 @@ google.golang.org/protobuf v1.28.0/go.mod h1:HV8QOd/L58Z+nl8r43ehVNZIU/HEI6OcFqw
google.golang.org/protobuf v1.28.1/go.mod h1:HV8QOd/L58Z+nl8r43ehVNZIU/HEI6OcFqwMG9pJV4I=
google.golang.org/protobuf v1.29.1/go.mod h1:HV8QOd/L58Z+nl8r43ehVNZIU/HEI6OcFqwMG9pJV4I=
google.golang.org/protobuf v1.30.0/go.mod h1:HV8QOd/L58Z+nl8r43ehVNZIU/HEI6OcFqwMG9pJV4I=
google.golang.org/protobuf v1.36.8 h1:xHScyCOEuuwZEc6UtSOvPbAT4zRh0xcNRYekJwfqyMc=
google.golang.org/protobuf v1.36.8/go.mod h1:fuxRtAxBytpl4zzqUh6/eyUujkJdNiuEkXntxiD/uRU=
google.golang.org/protobuf v1.36.9 h1:w2gp2mA27hUeUzj9Ex9FBjsBm40zfaDtEWow293U7Iw=
google.golang.org/protobuf v1.36.9/go.mod h1:fuxRtAxBytpl4zzqUh6/eyUujkJdNiuEkXntxiD/uRU=
gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
gopkg.in/check.v1 v1.0.0-20180628173108-788fd7840127/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
gopkg.in/check.v1 v1.0.0-20201130134442-10cb98267c6c h1:Hei/4ADfdWqJk1ZMxUNpqntNwaWcugrBjAiHlqqRiVk=
gopkg.in/check.v1 v1.0.0-20201130134442-10cb98267c6c/go.mod h1:JHkPIbrfpd72SG/EVd6muEfDQjcINNoR0C8j2r3qZ4Q=
gopkg.in/errgo.v2 v2.1.0/go.mod h1:hNsd1EY+bozCKY1Ytp96fpM3vjJbqLJn88ws8XvfDNI=
gopkg.in/inf.v0 v0.9.1 h1:73M5CoZyi3ZLMOyDlQh031Cx6N9NDJ2Vvfl76EDAgDc=
gopkg.in/inf.v0 v0.9.1/go.mod h1:cWUDdTG/fYaXco+Dcufb5Vnc6Gp2YChqWtbxRZE0mXw=
gopkg.in/ini.v1 v1.67.0 h1:Dgnx+6+nfE+IfzjUEISNeydPJh9AXNNsWbGP9KzCsOA=
gopkg.in/ini.v1 v1.67.0/go.mod h1:pNLf8WUiyNEtQjuu5G5vTm06TEv9tsIgeAvK8hOrP4k=
gopkg.in/yaml.v2 v2.2.2/go.mod h1:hI93XBmqTisBFMUTm0b8Fm+jr3Dg1NNxqwp+5A1VGuI=

View File

@@ -75,11 +75,17 @@ func (t MockTool) RequiresClientAuthorization() bool {
func (t MockTool) McpManifest() tools.McpManifest {
properties := make(map[string]tools.ParameterMcpManifest)
required := make([]string, 0)
authParams := make(map[string][]string)
for _, p := range t.Params {
name := p.GetName()
properties[name] = p.McpManifest()
paramManifest, authParamList := p.McpManifest()
properties[name] = paramManifest
required = append(required, name)
if len(authParamList) > 0 {
authParams[name] = authParamList
}
}
toolsSchema := tools.McpToolsSchema{
@@ -88,11 +94,19 @@ func (t MockTool) McpManifest() tools.McpManifest {
Required: required,
}
return tools.McpManifest{
mcpManifest := tools.McpManifest{
Name: t.Name,
Description: t.Description,
InputSchema: toolsSchema,
}
if len(authParams) > 0 {
mcpManifest.Metadata = map[string]any{
"toolbox/authParams": authParams,
}
}
return mcpManifest
}
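The hunk above attaches each parameter's auth requirements to the manifest under a "toolbox/authParams" metadata key. A minimal, self-contained sketch of reading that metadata back; the mirrored struct, tool name, and "my-google-auth" service are hypothetical and only illustrate the shape assembled above:

package main

import "fmt"

// mcpManifest mirrors only the fields this sketch needs; the real type lives in the tools package.
type mcpManifest struct {
	Name     string
	Metadata map[string]any
}

func main() {
	m := mcpManifest{
		Name: "some_tool",
		Metadata: map[string]any{
			"toolbox/authParams": map[string][]string{
				"id": {"my-google-auth"},
			},
		},
	}
	// Recover the per-parameter auth services, matching the map built in McpManifest above.
	if ap, ok := m.Metadata["toolbox/authParams"].(map[string][]string); ok {
		for param, services := range ap {
			fmt.Printf("parameter %q requires auth services %v\n", param, services)
		}
	}
}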
var tool1 = MockTool{

View File

@@ -117,11 +117,15 @@ func getOpts(ipType, userAgent string, useIAM bool) ([]alloydbconn.Option, error
}
func getConnectionConfig(ctx context.Context, user, pass, dbname string) (string, bool, error) {
userAgent, err := util.UserAgentFromContext(ctx)
if err != nil {
userAgent = "genai-toolbox"
}
useIAM := true
// If username and password both provided, use password authentication
if user != "" && pass != "" {
dsn := fmt.Sprintf("user=%s password=%s dbname=%s sslmode=disable", user, pass, dbname)
dsn := fmt.Sprintf("user=%s password=%s dbname=%s sslmode=disable application_name=%s", user, pass, dbname, userAgent)
useIAM = false
return dsn, useIAM, nil
}
@@ -141,7 +145,7 @@ func getConnectionConfig(ctx context.Context, user, pass, dbname string) (string
}
// Construct IAM connection string with username
dsn := fmt.Sprintf("user=%s dbname=%s sslmode=disable", user, dbname)
dsn := fmt.Sprintf("user=%s dbname=%s sslmode=disable application_name=%s", user, dbname, userAgent)
return dsn, useIAM, nil
}
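Both branches above now append application_name to the keyword/value DSN so the configured user agent is visible in database logs. A minimal sketch of the resulting string, with hypothetical credentials:

package main

import "fmt"

func main() {
	// Hypothetical values, mirroring the password-authentication branch above.
	user, pass, dbname, userAgent := "app-user", "secret", "mydb", "genai-toolbox"
	dsn := fmt.Sprintf("user=%s password=%s dbname=%s sslmode=disable application_name=%s",
		user, pass, dbname, userAgent)
	fmt.Println(dsn)
	// Output: user=app-user password=secret dbname=mydb sslmode=disable application_name=genai-toolbox
}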

View File

@@ -0,0 +1,134 @@
// Copyright 2025 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package cassandra
import (
"context"
"fmt"
"github.com/goccy/go-yaml"
"github.com/gocql/gocql"
"github.com/googleapis/genai-toolbox/internal/sources"
"go.opentelemetry.io/otel/trace"
)
const SourceKind string = "cassandra"
func init() {
if !sources.Register(SourceKind, newConfig) {
panic(fmt.Sprintf("source kind %q already registered", SourceKind))
}
}
func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (sources.SourceConfig, error) {
actual := Config{Name: name}
if err := decoder.DecodeContext(ctx, &actual); err != nil {
return nil, err
}
return actual, nil
}
type Config struct {
Name string `yaml:"name" validate:"required"`
Kind string `yaml:"kind" validate:"required"`
Hosts []string `yaml:"hosts" validate:"required"`
Keyspace string `yaml:"keyspace"`
ProtoVersion int `yaml:"protoVersion"`
Username string `yaml:"username"`
Password string `yaml:"password"`
CAPath string `yaml:"caPath"`
CertPath string `yaml:"certPath"`
KeyPath string `yaml:"keyPath"`
EnableHostVerification bool `yaml:"enableHostVerification"`
}
// Initialize implements sources.SourceConfig.
func (c Config) Initialize(ctx context.Context, tracer trace.Tracer) (sources.Source, error) {
session, err := initCassandraSession(ctx, tracer, c)
if err != nil {
return nil, fmt.Errorf("unable to create session: %v", err)
}
s := &Source{
Name: c.Name,
Kind: SourceKind,
Session: session,
}
return s, nil
}
// SourceConfigKind implements sources.SourceConfig.
func (c Config) SourceConfigKind() string {
return SourceKind
}
var _ sources.SourceConfig = Config{}
type Source struct {
Name string `yaml:"name"`
Kind string `yaml:"kind"`
Session *gocql.Session
}
// CassandraSession implements cassandra.compatibleSource.
func (s *Source) CassandraSession() *gocql.Session {
return s.Session
}
// SourceKind implements sources.Source.
func (s Source) SourceKind() string {
return SourceKind
}
var _ sources.Source = &Source{}
func initCassandraSession(ctx context.Context, tracer trace.Tracer, c Config) (*gocql.Session, error) {
//nolint:all // Reassigned ctx
ctx, span := sources.InitConnectionSpan(ctx, tracer, SourceKind, c.Name)
defer span.End()
// Validate authentication configuration
if c.Password != "" && c.Username == "" {
return nil, fmt.Errorf("invalid Cassandra configuration: password provided without a username")
}
cluster := gocql.NewCluster(c.Hosts...)
cluster.ProtoVersion = c.ProtoVersion
cluster.Keyspace = c.Keyspace
// Configure authentication if username is provided
if c.Username != "" {
cluster.Authenticator = gocql.PasswordAuthenticator{
Username: c.Username,
Password: c.Password,
}
}
// Configure SSL options if any are specified
if c.CAPath != "" || c.CertPath != "" || c.KeyPath != "" || c.EnableHostVerification {
cluster.SslOpts = &gocql.SslOptions{
CaPath: c.CAPath,
CertPath: c.CertPath,
KeyPath: c.KeyPath,
EnableHostVerification: c.EnableHostVerification,
}
}
// Create session
session, err := cluster.CreateSession()
if err != nil {
return nil, fmt.Errorf("failed to create Cassandra session: %w", err)
}
return session, nil
}
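The source exposes its gocql session through CassandraSession(). A minimal sketch of running a CQL statement against such a session; the host, keyspace, table, and columns are hypothetical and not part of this change:

package main

import (
	"context"
	"fmt"
	"log"

	"github.com/gocql/gocql"
)

func main() {
	// Build a session roughly the way initCassandraSession does for an unauthenticated cluster.
	cluster := gocql.NewCluster("127.0.0.1")
	cluster.Keyspace = "example_keyspace"
	session, err := cluster.CreateSession()
	if err != nil {
		log.Fatalf("failed to create Cassandra session: %v", err)
	}
	defer session.Close()

	// Iterate rows as generic maps, one map per row.
	iter := session.Query("SELECT id, name FROM users").WithContext(context.Background()).Iter()
	row := map[string]interface{}{}
	for iter.MapScan(row) {
		fmt.Println(row)
		row = map[string]interface{}{}
	}
	if err := iter.Close(); err != nil {
		log.Fatalf("query failed: %v", err)
	}
}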

View File

@@ -0,0 +1,158 @@
// Copyright 2025 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package cassandra_test
import (
"testing"
yaml "github.com/goccy/go-yaml"
"github.com/google/go-cmp/cmp"
"github.com/googleapis/genai-toolbox/internal/server"
"github.com/googleapis/genai-toolbox/internal/sources/cassandra"
"github.com/googleapis/genai-toolbox/internal/testutils"
)
func TestParseFromYamlCassandra(t *testing.T) {
tcs := []struct {
desc string
in string
want server.SourceConfigs
}{
{
desc: "basic example (without optional fields)",
in: `
sources:
my-cassandra-instance:
kind: cassandra
hosts:
- "my-host1"
- "my-host2"
`,
want: server.SourceConfigs{
"my-cassandra-instance": cassandra.Config{
Name: "my-cassandra-instance",
Kind: cassandra.SourceKind,
Hosts: []string{"my-host1", "my-host2"},
Username: "",
Password: "",
ProtoVersion: 0,
CAPath: "",
CertPath: "",
KeyPath: "",
Keyspace: "",
EnableHostVerification: false,
},
},
},
{
desc: "with optional fields",
in: `
sources:
my-cassandra-instance:
kind: cassandra
hosts:
- "my-host1"
- "my-host2"
username: "user"
password: "pass"
keyspace: "example_keyspace"
protoVersion: 4
caPath: "path/to/ca.crt"
certPath: "path/to/cert"
keyPath: "path/to/key"
enableHostVerification: true
`,
want: server.SourceConfigs{
"my-cassandra-instance": cassandra.Config{
Name: "my-cassandra-instance",
Kind: cassandra.SourceKind,
Hosts: []string{"my-host1", "my-host2"},
Username: "user",
Password: "pass",
Keyspace: "example_keyspace",
ProtoVersion: 4,
CAPath: "path/to/ca.crt",
CertPath: "path/to/cert",
KeyPath: "path/to/key",
EnableHostVerification: true,
},
},
},
}
for _, tc := range tcs {
t.Run(tc.desc, func(t *testing.T) {
got := struct {
Sources server.SourceConfigs `yaml:"sources"`
}{}
// Parse contents
err := yaml.Unmarshal(testutils.FormatYaml(tc.in), &got)
if err != nil {
t.Fatalf("unable to unmarshal: %s", err)
}
if !cmp.Equal(tc.want, got.Sources) {
t.Fatalf("incorrect parse: want %v, got %v", tc.want, got.Sources)
}
})
}
}
func TestFailParseFromYaml(t *testing.T) {
tcs := []struct {
desc string
in string
err string
}{
{
desc: "extra field",
in: `
sources:
my-cassandra-instance:
kind: cassandra
hosts:
- "my-host"
foo: bar
`,
err: "unable to parse source \"my-cassandra-instance\" as \"cassandra\": [1:1] unknown field \"foo\"\n> 1 | foo: bar\n ^\n 2 | hosts:\n 3 | - my-host\n 4 | kind: cassandra",
},
{
desc: "missing required field",
in: `
sources:
my-cassandra-instance:
kind: cassandra
`,
err: "unable to parse source \"my-cassandra-instance\" as \"cassandra\": Key: 'Config.Hosts' Error:Field validation for 'Hosts' failed on the 'required' tag",
},
}
for _, tc := range tcs {
t.Run(tc.desc, func(t *testing.T) {
got := struct {
Sources server.SourceConfigs `yaml:"sources"`
}{}
// Parse contents
err := yaml.Unmarshal(testutils.FormatYaml(tc.in), &got)
if err == nil {
t.Fatalf("expect parsing to fail")
}
errStr := err.Error()
if errStr != tc.err {
t.Fatalf("unexpected error: got %q, want %q", errStr, tc.err)
}
})
}
}

View File

@@ -98,11 +98,15 @@ func (s *Source) PostgresPool() *pgxpool.Pool {
}
func getConnectionConfig(ctx context.Context, user, pass, dbname string) (string, bool, error) {
userAgent, err := util.UserAgentFromContext(ctx)
if err != nil {
userAgent = "genai-toolbox"
}
useIAM := true
// If username and password both provided, use password authentication
if user != "" && pass != "" {
dsn := fmt.Sprintf("user=%s password=%s dbname=%s sslmode=disable", user, pass, dbname)
dsn := fmt.Sprintf("user=%s password=%s dbname=%s sslmode=disable application_name=%s", user, pass, dbname, userAgent)
useIAM = false
return dsn, useIAM, nil
}
@@ -122,7 +126,7 @@ func getConnectionConfig(ctx context.Context, user, pass, dbname string) (string
}
// Construct IAM connection string with username
dsn := fmt.Sprintf("user=%s dbname=%s sslmode=disable", user, dbname)
dsn := fmt.Sprintf("user=%s dbname=%s sslmode=disable application_name=%s", user, dbname, userAgent)
return dsn, useIAM, nil
}

View File

@@ -58,8 +58,8 @@ type Config struct {
Name string `yaml:"name" validate:"required"`
Kind string `yaml:"kind" validate:"required"`
BaseURL string `yaml:"base_url" validate:"required"`
ClientId string `yaml:"client_id" validate:"required"`
ClientSecret string `yaml:"client_secret" validate:"required"`
ClientId string `yaml:"client_id"`
ClientSecret string `yaml:"client_secret"`
SslVerification bool `yaml:"verify_ssl"`
UseClientOAuth bool `yaml:"use_client_oauth"`
Timeout string `yaml:"timeout"`

View File

@@ -100,10 +100,8 @@ func TestFailParseFromYamlLooker(t *testing.T) {
sources:
my-looker-instance:
kind: looker
base_url: http://example.looker.com/
client_id: jasdl;k;tjl
`,
err: "unable to parse source \"my-looker-instance\" as \"looker\": Key: 'Config.ClientSecret' Error:Field validation for 'ClientSecret' failed on the 'required' tag",
err: "unable to parse source \"my-looker-instance\" as \"looker\": Key: 'Config.BaseURL' Error:Field validation for 'BaseURL' failed on the 'required' tag",
},
}
for _, tc := range tcs {

View File

@@ -22,6 +22,8 @@ import (
"github.com/goccy/go-yaml"
"github.com/googleapis/genai-toolbox/internal/sources"
"github.com/googleapis/genai-toolbox/internal/util"
"github.com/googleapis/genai-toolbox/internal/tools"
"github.com/jackc/pgx/v5/pgxpool"
"go.opentelemetry.io/otel/trace"
)
@@ -91,14 +93,46 @@ func (s *Source) SourceKind() string {
return SourceKind
}
func (s *Source) PostgresPool() *pgxpool.Pool {
return s.Pool
func (s *Source) RunSQL(ctx context.Context, statement string, params any) (any, error) {
sliceParams := params.(tools.ParamValues).AsSlice()
results, err := s.Pool.Query(ctx, statement, sliceParams...)
if err != nil {
return nil, fmt.Errorf("unable to execute query: %w", err)
}
fields := results.FieldDescriptions()
var out []any
for results.Next() {
v, err := results.Values()
if err != nil {
return nil, fmt.Errorf("unable to parse row: %w", err)
}
vMap := make(map[string]any)
for i, f := range fields {
vMap[f.Name] = v[i]
}
out = append(out, vMap)
}
return out, nil
}
func initPostgresConnectionPool(ctx context.Context, tracer trace.Tracer, name, host, port, user, pass, dbname string, queryParams map[string]string) (*pgxpool.Pool, error) {
//nolint:all // Reassigned ctx
ctx, span := sources.InitConnectionSpan(ctx, tracer, SourceKind, name)
defer span.End()
userAgent, err := util.UserAgentFromContext(ctx)
if err != nil {
userAgent = "genai-toolbox"
}
if queryParams == nil {
// Initialize the map before using it
queryParams = make(map[string]string)
}
if _, ok := queryParams["application_name"]; !ok {
queryParams["application_name"] = userAgent
}
// urlExample := "postgres://username:password@localhost:5432/database_name"
url := &url.URL{
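The excerpt is cut off while assembling the connection URL. Below is a standalone sketch, with a hypothetical host and credentials, of how the defaulted application_name ends up as a query parameter on the postgres URL:

package main

import (
	"fmt"
	"net/url"
)

func main() {
	// Default application_name the same way the hunk above does.
	queryParams := map[string]string{}
	userAgent := "genai-toolbox"
	if _, ok := queryParams["application_name"]; !ok {
		queryParams["application_name"] = userAgent
	}

	q := url.Values{}
	for k, v := range queryParams {
		q.Set(k, v)
	}
	u := &url.URL{
		Scheme:   "postgres",
		User:     url.UserPassword("my-user", "my-pass"),
		Host:     "localhost:5432",
		Path:     "my_db",
		RawQuery: q.Encode(),
	}
	fmt.Println(u.String())
	// Output: postgres://my-user:my-pass@localhost:5432/my_db?application_name=genai-toolbox
}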

View File

@@ -80,19 +80,11 @@ func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error)
}
paramManifest := allParameters.Manifest()
inputSchema := allParameters.McpManifest()
inputSchema.Required = []string{"project", "cluster", "password"}
description := cfg.Description
if description == "" {
description = "Creates a new AlloyDB cluster. This is a long-running operation, but the API call returns quickly. This will return operation id to be used by get operations tool. Take all parameters from user in one go."
}
mcpManifest := tools.McpManifest{
Name: cfg.Name,
Description: description,
InputSchema: inputSchema,
}
mcpManifest := tools.GetMcpManifest(cfg.Name, description, cfg.AuthRequired, allParameters)
return Tool{
Name: cfg.Name,

View File

@@ -81,19 +81,11 @@ func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error)
}
paramManifest := allParameters.Manifest()
inputSchema := allParameters.McpManifest()
inputSchema.Required = []string{"project", "location", "cluster", "instance"}
description := cfg.Description
if description == "" {
description = "Creates a new AlloyDB instance (PRIMARY or READ_POOL) within a cluster. This is a long-running operation. This will return operation id to be used by get operations tool. Take all parameters from user in one go."
}
mcpManifest := tools.McpManifest{
Name: cfg.Name,
Description: description,
InputSchema: inputSchema,
}
mcpManifest := tools.GetMcpManifest(cfg.Name, description, cfg.AuthRequired, allParameters)
return Tool{
Name: cfg.Name,

View File

@@ -81,19 +81,11 @@ func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error)
}
paramManifest := allParameters.Manifest()
inputSchema := allParameters.McpManifest()
inputSchema.Required = []string{"project", "location", "cluster", "user", "userType"}
description := cfg.Description
if description == "" {
description = "Creates a new AlloyDB user within a cluster. Takes the new user's name and a secure password. Optionally, a list of database roles can be assigned. Always ask the user for the type of user to create. ALLOYDB_IAM_USER is recommended."
}
mcpManifest := tools.McpManifest{
Name: cfg.Name,
Description: description,
InputSchema: inputSchema,
}
mcpManifest := tools.GetMcpManifest(cfg.Name, description, cfg.AuthRequired, allParameters)
return Tool{
Name: cfg.Name,

View File

@@ -77,19 +77,11 @@ func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error)
}
paramManifest := allParameters.Manifest()
inputSchema := allParameters.McpManifest()
inputSchema.Required = []string{"project", "location", "cluster"}
description := cfg.Description
if description == "" {
description = "Retrieves details about a specific AlloyDB cluster."
}
mcpManifest := tools.McpManifest{
Name: cfg.Name,
Description: description,
InputSchema: inputSchema,
}
mcpManifest := tools.GetMcpManifest(cfg.Name, description, cfg.AuthRequired, allParameters)
return Tool{
Name: cfg.Name,

View File

@@ -78,19 +78,11 @@ func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error)
}
paramManifest := allParameters.Manifest()
inputSchema := allParameters.McpManifest()
inputSchema.Required = []string{"project", "location", "cluster", "instance"}
description := cfg.Description
if description == "" {
description = "Retrieves details about a specific AlloyDB instance."
}
mcpManifest := tools.McpManifest{
Name: cfg.Name,
Description: description,
InputSchema: inputSchema,
}
mcpManifest := tools.GetMcpManifest(cfg.Name, description, cfg.AuthRequired, allParameters)
return Tool{
Name: cfg.Name,

View File

@@ -78,19 +78,11 @@ func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error)
}
paramManifest := allParameters.Manifest()
inputSchema := allParameters.McpManifest()
inputSchema.Required = []string{"project", "location", "cluster", "user"}
description := cfg.Description
if description == "" {
description = "Retrieves details about a specific AlloyDB user."
}
mcpManifest := tools.McpManifest{
Name: cfg.Name,
Description: description,
InputSchema: inputSchema,
}
mcpManifest := tools.GetMcpManifest(cfg.Name, description, cfg.AuthRequired, allParameters)
return Tool{
Name: cfg.Name,

View File

@@ -76,19 +76,11 @@ func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error)
}
paramManifest := allParameters.Manifest()
inputSchema := allParameters.McpManifest()
inputSchema.Required = []string{"project"}
description := cfg.Description
if description == "" {
description = "Lists all AlloyDB clusters in a given project and location."
}
mcpManifest := tools.McpManifest{
Name: cfg.Name,
Description: description,
InputSchema: inputSchema,
}
mcpManifest := tools.GetMcpManifest(cfg.Name, description, cfg.AuthRequired, allParameters)
return Tool{
Name: cfg.Name,

View File

@@ -77,19 +77,11 @@ func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error)
}
paramManifest := allParameters.Manifest()
inputSchema := allParameters.McpManifest()
inputSchema.Required = []string{"project"}
description := cfg.Description
if description == "" {
description = "Lists all AlloyDB instances in a given project, location and cluster."
}
mcpManifest := tools.McpManifest{
Name: cfg.Name,
Description: description,
InputSchema: inputSchema,
}
mcpManifest := tools.GetMcpManifest(cfg.Name, description, cfg.AuthRequired, allParameters)
return Tool{
Name: cfg.Name,

View File

@@ -77,19 +77,11 @@ func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error)
}
paramManifest := allParameters.Manifest()
inputSchema := allParameters.McpManifest()
inputSchema.Required = []string{"project", "location", "cluster"}
description := cfg.Description
if description == "" {
description = "Lists all AlloyDB users in a given project, location and cluster."
}
mcpManifest := tools.McpManifest{
Name: cfg.Name,
Description: description,
InputSchema: inputSchema,
}
mcpManifest := tools.GetMcpManifest(cfg.Name, description, cfg.AuthRequired, allParameters)
return Tool{
Name: cfg.Name,

View File

@@ -129,19 +129,12 @@ func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error)
}
paramManifest := allParameters.Manifest()
inputSchema := allParameters.McpManifest()
inputSchema.Required = []string{"project", "location", "operation"}
description := cfg.Description
if description == "" {
description = "This will poll on operations API until the operation is done. For checking operation status we need projectId, locationID and operationId. Once instance is created give follow up steps on how to use the variables to bring data plane MCP server up in local and remote setup."
}
mcpManifest := tools.McpManifest{
Name: cfg.Name,
Description: description,
InputSchema: inputSchema,
}
mcpManifest := tools.GetMcpManifest(cfg.Name, description, cfg.AuthRequired, allParameters)
var delay time.Duration
if cfg.Delay == "" {

View File

@@ -119,11 +119,7 @@ func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error)
cfg.NLConfigParameters = append([]tools.Parameter{newQuestionParam}, cfg.NLConfigParameters...)
mcpManifest := tools.McpManifest{
Name: cfg.Name,
Description: cfg.Description,
InputSchema: cfg.NLConfigParameters.McpManifest(),
}
mcpManifest := tools.GetMcpManifest(cfg.Name, cfg.Description, cfg.AuthRequired, cfg.NLConfigParameters)
t := Tool{
Name: cfg.Name,

View File

@@ -118,11 +118,7 @@ func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error)
pruningMethodParameter,
}
mcpManifest := tools.McpManifest{
Name: cfg.Name,
Description: cfg.Description,
InputSchema: parameters.McpManifest(),
}
mcpManifest := tools.GetMcpManifest(cfg.Name, cfg.Description, cfg.AuthRequired, parameters)
// finish tool setup
t := Tool{

View File

@@ -0,0 +1,431 @@
// Copyright 2025 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package bigquerycommon
import (
"fmt"
"strings"
"unicode"
)
// parserState defines the state of the SQL parser's state machine.
type parserState int
const (
stateNormal parserState = iota
// String states
stateInSingleQuoteString
stateInDoubleQuoteString
stateInTripleSingleQuoteString
stateInTripleDoubleQuoteString
stateInRawSingleQuoteString
stateInRawDoubleQuoteString
stateInRawTripleSingleQuoteString
stateInRawTripleDoubleQuoteString
// Comment states
stateInSingleLineCommentDash
stateInSingleLineCommentHash
stateInMultiLineComment
)
// SQL statement verbs
const (
verbCreate = "create"
verbAlter = "alter"
verbDrop = "drop"
verbSelect = "select"
verbInsert = "insert"
verbUpdate = "update"
verbDelete = "delete"
verbMerge = "merge"
)
var tableFollowsKeywords = map[string]bool{
"from": true,
"join": true,
"update": true,
"into": true, // INSERT INTO, MERGE INTO
"table": true, // CREATE TABLE, ALTER TABLE
"using": true, // MERGE ... USING
"insert": true, // INSERT my_table
"merge": true, // MERGE my_table
}
var tableContextExitKeywords = map[string]bool{
"where": true,
"group": true, // GROUP BY
"having": true,
"order": true, // ORDER BY
"limit": true,
"window": true,
"on": true, // JOIN ... ON
"set": true, // UPDATE ... SET
"when": true, // MERGE ... WHEN
}
// TableParser is the main entry point for parsing a SQL string to find all referenced table IDs.
// It handles multi-statement SQL, comments, and recursive parsing of EXECUTE IMMEDIATE statements.
func TableParser(sql, defaultProjectID string) ([]string, error) {
tableIDSet := make(map[string]struct{})
visitedSQLs := make(map[string]struct{})
if _, err := parseSQL(sql, defaultProjectID, tableIDSet, visitedSQLs, false); err != nil {
return nil, err
}
tableIDs := make([]string, 0, len(tableIDSet))
for id := range tableIDSet {
tableIDs = append(tableIDs, id)
}
return tableIDs, nil
}
// parseSQL is the core recursive function that processes SQL strings.
// It uses a state machine to find table names and recursively parse EXECUTE IMMEDIATE.
func parseSQL(sql, defaultProjectID string, tableIDSet map[string]struct{}, visitedSQLs map[string]struct{}, inSubquery bool) (int, error) {
// Prevent infinite recursion.
if _, ok := visitedSQLs[sql]; ok {
return len(sql), nil
}
visitedSQLs[sql] = struct{}{}
state := stateNormal
expectingTable := false
var lastTableKeyword, lastToken, statementVerb string
runes := []rune(sql)
for i := 0; i < len(runes); {
char := runes[i]
remaining := sql[i:]
switch state {
case stateNormal:
if strings.HasPrefix(remaining, "--") {
state = stateInSingleLineCommentDash
i += 2
continue
}
if strings.HasPrefix(remaining, "#") {
state = stateInSingleLineCommentHash
i++
continue
}
if strings.HasPrefix(remaining, "/*") {
state = stateInMultiLineComment
i += 2
continue
}
if char == '(' {
if expectingTable {
// The subquery starts after '('.
consumed, err := parseSQL(remaining[1:], defaultProjectID, tableIDSet, visitedSQLs, true)
if err != nil {
return 0, err
}
// Advance i by the length of the subquery + the opening parenthesis.
// The recursive call returns what it consumed, including the closing parenthesis.
i += consumed + 1
// For most keywords, we expect only one table. `from` can have multiple "tables" (subqueries).
if lastTableKeyword != "from" {
expectingTable = false
}
continue
}
}
if char == ')' {
if inSubquery {
return i + 1, nil
}
}
if char == ';' {
statementVerb = ""
lastToken = ""
i++
continue
}
// Raw strings must be checked before regular strings.
if strings.HasPrefix(remaining, "r'''") || strings.HasPrefix(remaining, "R'''") {
state = stateInRawTripleSingleQuoteString
i += 4
continue
}
if strings.HasPrefix(remaining, `r"""`) || strings.HasPrefix(remaining, `R"""`) {
state = stateInRawTripleDoubleQuoteString
i += 4
continue
}
if strings.HasPrefix(remaining, "r'") || strings.HasPrefix(remaining, "R'") {
state = stateInRawSingleQuoteString
i += 2
continue
}
if strings.HasPrefix(remaining, `r"`) || strings.HasPrefix(remaining, `R"`) {
state = stateInRawDoubleQuoteString
i += 2
continue
}
if strings.HasPrefix(remaining, "'''") {
state = stateInTripleSingleQuoteString
i += 3
continue
}
if strings.HasPrefix(remaining, `"""`) {
state = stateInTripleDoubleQuoteString
i += 3
continue
}
if char == '\'' {
state = stateInSingleQuoteString
i++
continue
}
if char == '"' {
state = stateInDoubleQuoteString
i++
continue
}
if unicode.IsLetter(char) || char == '`' {
parts, consumed, err := parseIdentifierSequence(remaining)
if err != nil {
return 0, err
}
if consumed == 0 {
i++
continue
}
if len(parts) == 1 {
keyword := strings.ToLower(parts[0])
switch keyword {
case "call":
return 0, fmt.Errorf("CALL is not allowed when dataset restrictions are in place, as the called procedure's contents cannot be safely analyzed")
case "immediate":
if lastToken == "execute" {
return 0, fmt.Errorf("EXECUTE IMMEDIATE is not allowed when dataset restrictions are in place, as its contents cannot be safely analyzed")
}
case "procedure", "function":
if lastToken == "create" || lastToken == "create or replace" {
return 0, fmt.Errorf("unanalyzable statements like '%s %s' are not allowed", strings.ToUpper(lastToken), strings.ToUpper(keyword))
}
case verbCreate, verbAlter, verbDrop, verbSelect, verbInsert, verbUpdate, verbDelete, verbMerge:
if statementVerb == "" {
statementVerb = keyword
}
}
if statementVerb == verbCreate || statementVerb == verbAlter || statementVerb == verbDrop {
if keyword == "schema" || keyword == "dataset" {
return 0, fmt.Errorf("dataset-level operations like '%s %s' are not allowed when dataset restrictions are in place", strings.ToUpper(statementVerb), strings.ToUpper(keyword))
}
}
if _, ok := tableFollowsKeywords[keyword]; ok {
expectingTable = true
lastTableKeyword = keyword
} else if _, ok := tableContextExitKeywords[keyword]; ok {
expectingTable = false
lastTableKeyword = ""
}
if lastToken == "create" && keyword == "or" {
lastToken = "create or"
} else if lastToken == "create or" && keyword == "replace" {
lastToken = "create or replace"
} else {
lastToken = keyword
}
} else if len(parts) >= 2 {
// This is a multi-part identifier. If we were expecting a table, this is it.
if expectingTable {
tableID, err := formatTableID(parts, defaultProjectID)
if err != nil {
return 0, err
}
if tableID != "" {
tableIDSet[tableID] = struct{}{}
}
// For most keywords, we expect only one table.
if lastTableKeyword != "from" {
expectingTable = false
}
}
lastToken = ""
}
i += consumed
continue
}
i++
case stateInSingleQuoteString:
if char == '\\' {
i += 2 // Skip backslash and the escaped character.
continue
}
if char == '\'' {
state = stateNormal
}
i++
case stateInDoubleQuoteString:
if char == '\\' {
i += 2 // Skip backslash and the escaped character.
continue
}
if char == '"' {
state = stateNormal
}
i++
case stateInTripleSingleQuoteString:
if strings.HasPrefix(remaining, "'''") {
state = stateNormal
i += 3
} else {
i++
}
case stateInTripleDoubleQuoteString:
if strings.HasPrefix(remaining, `"""`) {
state = stateNormal
i += 3
} else {
i++
}
case stateInSingleLineCommentDash, stateInSingleLineCommentHash:
if char == '\n' {
state = stateNormal
}
i++
case stateInMultiLineComment:
if strings.HasPrefix(remaining, "*/") {
state = stateNormal
i += 2
} else {
i++
}
case stateInRawSingleQuoteString:
if char == '\'' {
state = stateNormal
}
i++
case stateInRawDoubleQuoteString:
if char == '"' {
state = stateNormal
}
i++
case stateInRawTripleSingleQuoteString:
if strings.HasPrefix(remaining, "'''") {
state = stateNormal
i += 3
} else {
i++
}
case stateInRawTripleDoubleQuoteString:
if strings.HasPrefix(remaining, `"""`) {
state = stateNormal
i += 3
} else {
i++
}
}
}
if inSubquery {
return 0, fmt.Errorf("unclosed subquery parenthesis")
}
return len(sql), nil
}
// parseIdentifierSequence parses a sequence of dot-separated identifiers.
// It returns the parts of the identifier, the number of characters consumed, and an error.
func parseIdentifierSequence(s string) ([]string, int, error) {
var parts []string
var totalConsumed int
for {
remaining := s[totalConsumed:]
trimmed := strings.TrimLeftFunc(remaining, unicode.IsSpace)
totalConsumed += len(remaining) - len(trimmed)
current := s[totalConsumed:]
if len(current) == 0 {
break
}
var part string
var consumed int
if current[0] == '`' {
end := strings.Index(current[1:], "`")
if end == -1 {
return nil, 0, fmt.Errorf("unclosed backtick identifier")
}
part = current[1 : end+1]
consumed = end + 2
} else if len(current) > 0 && unicode.IsLetter(rune(current[0])) {
end := strings.IndexFunc(current, func(r rune) bool {
return !unicode.IsLetter(r) && !unicode.IsNumber(r) && r != '_' && r != '-'
})
if end == -1 {
part = current
consumed = len(current)
} else {
part = current[:end]
consumed = end
}
} else {
break
}
if current[0] == '`' && strings.Contains(part, ".") {
// This handles cases like `project.dataset.table` but not `project.dataset`.table.
// If the character after the quoted identifier is not a dot, we treat it as a full name.
if len(current) <= consumed || current[consumed] != '.' {
parts = append(parts, strings.Split(part, ".")...)
totalConsumed += consumed
break
}
}
parts = append(parts, strings.Split(part, ".")...)
totalConsumed += consumed
if len(s) <= totalConsumed || s[totalConsumed] != '.' {
break
}
totalConsumed++
}
return parts, totalConsumed, nil
}
func formatTableID(parts []string, defaultProjectID string) (string, error) {
if len(parts) < 2 || len(parts) > 3 {
// Not a table identifier (could be a CTE, column, etc.).
// Return an empty table ID so the caller skips this identifier.
return "", nil
}
var tableID string
if len(parts) == 3 { // project.dataset.table
tableID = strings.Join(parts, ".")
} else { // dataset.table
if defaultProjectID == "" {
return "", fmt.Errorf("query contains table '%s' without project ID, and no default project ID is provided", strings.Join(parts, "."))
}
tableID = fmt.Sprintf("%s.%s", defaultProjectID, strings.Join(parts, "."))
}
return tableID, nil
}
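A minimal usage sketch of TableParser; the SQL, default project, and table names are hypothetical, and since the package is internal it is only importable from inside this module:

package main

import (
	"fmt"
	"log"

	"github.com/googleapis/genai-toolbox/internal/tools/bigquery/bigquerycommon"
)

func main() {
	sql := "SELECT * FROM `proj1.data1.tbl1` JOIN data2.tbl2 ON id -- `fake.table.x` in a comment"
	ids, err := bigquerycommon.TableParser(sql, "default-proj")
	if err != nil {
		log.Fatal(err)
	}
	// Order is not guaranteed; expect proj1.data1.tbl1 and default-proj.data2.tbl2,
	// while the table mentioned in the comment is ignored.
	fmt.Println(ids)
}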

View File

@@ -0,0 +1,496 @@
// Copyright 2025 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package bigquerycommon_test
import (
"sort"
"strings"
"testing"
"github.com/google/go-cmp/cmp"
"github.com/googleapis/genai-toolbox/internal/tools/bigquery/bigquerycommon"
)
func TestTableParser(t *testing.T) {
testCases := []struct {
name string
sql string
defaultProjectID string
want []string
wantErr bool
wantErrMsg string
}{
{
name: "single fully qualified table",
sql: "SELECT * FROM `my-project.my_dataset.my_table`",
defaultProjectID: "default-proj",
want: []string{"my-project.my_dataset.my_table"},
wantErr: false,
},
{
name: "multiple statements with same table",
sql: "select * from proj1.data1.tbl1 limit 1; select A.b from proj1.data1.tbl1 as A limit 1;",
defaultProjectID: "default-proj",
want: []string{"proj1.data1.tbl1"},
wantErr: false,
},
{
name: "multiple fully qualified tables",
sql: "SELECT * FROM `proj1.data1`.`tbl1` JOIN proj2.`data2.tbl2` ON id",
defaultProjectID: "default-proj",
want: []string{"proj1.data1.tbl1", "proj2.data2.tbl2"},
wantErr: false,
},
{
name: "duplicate tables",
sql: "SELECT * FROM `proj1.data1.tbl1` JOIN proj1.data1.tbl1 ON id",
defaultProjectID: "default-proj",
want: []string{"proj1.data1.tbl1"},
wantErr: false,
},
{
name: "partial table with default project",
sql: "SELECT * FROM `my_dataset`.my_table",
defaultProjectID: "default-proj",
want: []string{"default-proj.my_dataset.my_table"},
wantErr: false,
},
{
name: "partial table without default project",
sql: "SELECT * FROM `my_dataset.my_table`",
defaultProjectID: "",
want: nil,
wantErr: true,
},
{
name: "mixed fully qualified and partial tables",
sql: "SELECT t1.*, t2.* FROM `proj1.data1.tbl1` AS t1 JOIN `data2.tbl2` AS t2 ON t1.id = t2.id",
defaultProjectID: "default-proj",
want: []string{"proj1.data1.tbl1", "default-proj.data2.tbl2"},
wantErr: false,
},
{
name: "no tables",
sql: "SELECT 1+1",
defaultProjectID: "default-proj",
want: []string{},
wantErr: false,
},
{
name: "ignore single part identifiers (like CTEs)",
sql: "WITH my_cte AS (SELECT 1) SELECT * FROM `my_cte`",
defaultProjectID: "default-proj",
want: []string{},
wantErr: false,
},
{
name: "complex CTE",
sql: "WITH cte1 AS (SELECT * FROM `real.table.one`), cte2 AS (SELECT * FROM cte1) SELECT * FROM cte2 JOIN `real.table.two` ON true",
defaultProjectID: "default-proj",
want: []string{"real.table.one", "real.table.two"},
wantErr: false,
},
{
name: "nested subquery should be parsed",
sql: "SELECT * FROM (SELECT a FROM (SELECT A.b FROM `real.table.nested` AS A))",
defaultProjectID: "default-proj",
want: []string{"real.table.nested"},
wantErr: false,
},
{
name: "from clause with unnest",
sql: "SELECT event.name FROM `my-project.my_dataset.my_table` AS A, UNNEST(A.events) AS event",
defaultProjectID: "default-proj",
want: []string{"my-project.my_dataset.my_table"},
wantErr: false,
},
{
name: "ignore more than 3 parts",
sql: "SELECT * FROM `proj.data.tbl.col`",
defaultProjectID: "default-proj",
want: []string{},
wantErr: false,
},
{
name: "complex query",
sql: "SELECT name FROM (SELECT name FROM `proj1.data1.tbl1`) UNION ALL SELECT name FROM `data2.tbl2`",
defaultProjectID: "default-proj",
want: []string{"proj1.data1.tbl1", "default-proj.data2.tbl2"},
wantErr: false,
},
{
name: "empty sql",
sql: "",
defaultProjectID: "default-proj",
want: []string{},
wantErr: false,
},
{
name: "with comments",
sql: "SELECT * FROM `proj1.data1.tbl1`; -- comment `fake.table.one` \n SELECT * FROM `proj2.data2.tbl2`; # comment `fake.table.two`",
defaultProjectID: "default-proj",
want: []string{"proj1.data1.tbl1", "proj2.data2.tbl2"},
wantErr: false,
},
{
name: "multi-statement with semicolon",
sql: "SELECT * FROM `proj1.data1.tbl1`; SELECT * FROM `proj2.data2.tbl2`",
defaultProjectID: "default-proj",
want: []string{"proj1.data1.tbl1", "proj2.data2.tbl2"},
wantErr: false,
},
{
name: "simple execute immediate",
sql: "EXECUTE IMMEDIATE 'SELECT * FROM `exec.proj.tbl`'",
defaultProjectID: "default-proj",
want: nil,
wantErr: true,
wantErrMsg: "EXECUTE IMMEDIATE is not allowed when dataset restrictions are in place",
},
{
name: "execute immediate with multiple spaces",
sql: "EXECUTE IMMEDIATE 'SELECT 1'",
defaultProjectID: "default-proj",
want: nil,
wantErr: true,
wantErrMsg: "EXECUTE IMMEDIATE is not allowed when dataset restrictions are in place",
},
{
name: "execute immediate with newline",
sql: "EXECUTE\nIMMEDIATE 'SELECT 1'",
defaultProjectID: "default-proj",
want: nil,
wantErr: true,
wantErrMsg: "EXECUTE IMMEDIATE is not allowed when dataset restrictions are in place",
},
{
name: "execute immediate with comment",
sql: "EXECUTE -- some comment\n IMMEDIATE 'SELECT * FROM `exec.proj.tbl`'",
defaultProjectID: "default-proj",
want: nil,
wantErr: true,
wantErrMsg: "EXECUTE IMMEDIATE is not allowed when dataset restrictions are in place",
},
{
name: "nested execute immediate",
sql: "EXECUTE IMMEDIATE \"EXECUTE IMMEDIATE '''SELECT * FROM `nested.exec.tbl`'''\"",
defaultProjectID: "default-proj",
want: nil,
wantErr: true,
wantErrMsg: "EXECUTE IMMEDIATE is not allowed when dataset restrictions are in place",
},
{
name: "begin execute immediate",
sql: "BEGIN EXECUTE IMMEDIATE 'SELECT * FROM `exec.proj.tbl`'; END;",
defaultProjectID: "default-proj",
want: nil,
wantErr: true,
wantErrMsg: "EXECUTE IMMEDIATE is not allowed when dataset restrictions are in place",
},
{
name: "table inside string literal should be ignored",
sql: "SELECT * FROM `real.table.one` WHERE name = 'select * from `fake.table.two`'",
defaultProjectID: "default-proj",
want: []string{"real.table.one"},
wantErr: false,
},
{
name: "string with escaped single quote",
sql: "SELECT 'this is a string with an escaped quote \\' and a fake table `fake.table.one`' FROM `real.table.two`",
defaultProjectID: "default-proj",
want: []string{"real.table.two"},
wantErr: false,
},
{
name: "string with escaped double quote",
sql: `SELECT "this is a string with an escaped quote \" and a fake table ` + "`fake.table.one`" + `" FROM ` + "`real.table.two`",
defaultProjectID: "default-proj",
want: []string{"real.table.two"},
wantErr: false,
},
{
name: "multi-line comment",
sql: "/* `fake.table.1` */ SELECT * FROM `real.table.2`",
defaultProjectID: "default-proj",
want: []string{"real.table.2"},
wantErr: false,
},
{
name: "raw string with backslash should be ignored",
sql: "SELECT * FROM `real.table.one` WHERE name = r'a raw string with a \\ and a fake table `fake.table.two`'",
defaultProjectID: "default-proj",
want: []string{"real.table.one"},
wantErr: false,
},
{
name: "capital R raw string with quotes inside should be ignored",
sql: `SELECT * FROM ` + "`real.table.one`" + ` WHERE name = R"""a raw string with a ' and a " and a \ and a fake table ` + "`fake.table.two`" + `"""`,
defaultProjectID: "default-proj",
want: []string{"real.table.one"},
wantErr: false,
},
{
name: "triple quoted raw string should be ignored",
sql: "SELECT * FROM `real.table.one` WHERE name = r'''a raw string with a ' and a \" and a \\ and a fake table `fake.table.two`'''",
defaultProjectID: "default-proj",
want: []string{"real.table.one"},
wantErr: false,
},
{
name: "triple quoted capital R raw string should be ignored",
sql: `SELECT * FROM ` + "`real.table.one`" + ` WHERE name = R"""a raw string with a ' and a " and a \ and a fake table ` + "`fake.table.two`" + `"""`,
defaultProjectID: "default-proj",
want: []string{"real.table.one"},
wantErr: false,
},
{
name: "unquoted fully qualified table",
sql: "SELECT * FROM my-project.my_dataset.my_table",
defaultProjectID: "default-proj",
want: []string{"my-project.my_dataset.my_table"},
wantErr: false,
},
{
name: "unquoted partial table with default project",
sql: "SELECT * FROM my_dataset.my_table",
defaultProjectID: "default-proj",
want: []string{"default-proj.my_dataset.my_table"},
wantErr: false,
},
{
name: "unquoted partial table without default project",
sql: "SELECT * FROM my_dataset.my_table",
defaultProjectID: "",
want: nil,
wantErr: true,
},
{
name: "mixed quoting style 1",
sql: "SELECT * FROM `my-project`.my_dataset.my_table",
defaultProjectID: "default-proj",
want: []string{"my-project.my_dataset.my_table"},
wantErr: false,
},
{
name: "mixed quoting style 2",
sql: "SELECT * FROM `my-project`.`my_dataset`.my_table",
defaultProjectID: "default-proj",
want: []string{"my-project.my_dataset.my_table"},
wantErr: false,
},
{
name: "mixed quoting style 3",
sql: "SELECT * FROM `my-project`.`my_dataset`.`my_table`",
defaultProjectID: "default-proj",
want: []string{"my-project.my_dataset.my_table"},
wantErr: false,
},
{
name: "mixed quoted and unquoted tables",
sql: "SELECT * FROM `proj1.data1.tbl1` JOIN proj2.data2.tbl2 ON id",
defaultProjectID: "default-proj",
want: []string{"proj1.data1.tbl1", "proj2.data2.tbl2"},
wantErr: false,
},
{
name: "create table statement",
sql: "CREATE TABLE `my-project.my_dataset.my_table` (x INT64)",
defaultProjectID: "default-proj",
want: []string{"my-project.my_dataset.my_table"},
wantErr: false,
},
{
name: "insert into statement",
sql: "INSERT INTO `my-project.my_dataset.my_table` (x) VALUES (1)",
defaultProjectID: "default-proj",
want: []string{"my-project.my_dataset.my_table"},
wantErr: false,
},
{
name: "update statement",
sql: "UPDATE `my-project.my_dataset.my_table` SET x = 2 WHERE true",
defaultProjectID: "default-proj",
want: []string{"my-project.my_dataset.my_table"},
wantErr: false,
},
{
name: "delete from statement",
sql: "DELETE FROM `my-project.my_dataset.my_table` WHERE true",
defaultProjectID: "default-proj",
want: []string{"my-project.my_dataset.my_table"},
wantErr: false,
},
{
name: "merge into statement",
sql: "MERGE `proj.data.target` T USING `proj.data.source` S ON T.id = S.id WHEN NOT MATCHED THEN INSERT ROW",
defaultProjectID: "default-proj",
want: []string{"proj.data.source", "proj.data.target"},
wantErr: false,
},
{
name: "create schema statement",
sql: "CREATE SCHEMA `my-project.my_dataset`",
defaultProjectID: "default-proj",
want: nil,
wantErr: true,
wantErrMsg: "dataset-level operations like 'CREATE SCHEMA' are not allowed",
},
{
name: "create dataset statement",
sql: "CREATE DATASET `my-project.my_dataset`",
defaultProjectID: "default-proj",
want: nil,
wantErr: true,
wantErrMsg: "dataset-level operations like 'CREATE DATASET' are not allowed",
},
{
name: "drop schema statement",
sql: "DROP SCHEMA `my-project.my_dataset`",
defaultProjectID: "default-proj",
want: nil,
wantErr: true,
wantErrMsg: "dataset-level operations like 'DROP SCHEMA' are not allowed",
},
{
name: "drop dataset statement",
sql: "DROP DATASET `my-project.my_dataset`",
defaultProjectID: "default-proj",
want: nil,
wantErr: true,
wantErrMsg: "dataset-level operations like 'DROP DATASET' are not allowed",
},
{
name: "alter schema statement",
sql: "ALTER SCHEMA my_dataset SET OPTIONS(description='new description')",
defaultProjectID: "default-proj",
want: nil,
wantErr: true,
wantErrMsg: "dataset-level operations like 'ALTER SCHEMA' are not allowed",
},
{
name: "alter dataset statement",
sql: "ALTER DATASET my_dataset SET OPTIONS(description='new description')",
defaultProjectID: "default-proj",
want: nil,
wantErr: true,
wantErrMsg: "dataset-level operations like 'ALTER DATASET' are not allowed",
},
{
name: "begin...end block",
sql: "BEGIN CREATE TABLE `proj.data.tbl1` (x INT64); INSERT `proj.data.tbl2` (y) VALUES (1); END;",
defaultProjectID: "default-proj",
want: []string{"proj.data.tbl1", "proj.data.tbl2"},
wantErr: false,
},
{
name: "complex begin...end block with comments and different quoting",
sql: `
BEGIN
-- Create a new table
CREATE TABLE proj.data.tbl1 (x INT64);
/* Insert some data from another table */
INSERT INTO ` + "`proj.data.tbl2`" + ` (y) SELECT y FROM proj.data.source;
END;`,
defaultProjectID: "default-proj",
want: []string{"proj.data.source", "proj.data.tbl1", "proj.data.tbl2"},
wantErr: false,
},
{
name: "call fully qualified procedure",
sql: "CALL my-project.my_dataset.my_procedure()",
defaultProjectID: "default-proj",
want: nil,
wantErr: true,
wantErrMsg: "CALL is not allowed when dataset restrictions are in place",
},
{
name: "call partially qualified procedure",
sql: "CALL my_dataset.my_procedure()",
defaultProjectID: "default-proj",
want: nil,
wantErr: true,
wantErrMsg: "CALL is not allowed when dataset restrictions are in place",
},
{
name: "call procedure in begin...end block",
sql: "BEGIN CALL proj.data.proc1(); SELECT * FROM proj.data.tbl1; END;",
defaultProjectID: "default-proj",
want: nil,
wantErr: true,
wantErrMsg: "CALL is not allowed when dataset restrictions are in place",
},
{
name: "call procedure with newline",
sql: "CALL\nmy_dataset.my_procedure()",
defaultProjectID: "default-proj",
want: nil,
wantErr: true,
wantErrMsg: "CALL is not allowed when dataset restrictions are in place",
},
{
name: "call procedure without default project should fail",
sql: "CALL my_dataset.my_procedure()",
defaultProjectID: "",
want: nil,
wantErr: true,
wantErrMsg: "CALL is not allowed when dataset restrictions are in place",
},
{
name: "create procedure statement",
sql: "CREATE PROCEDURE my_dataset.my_procedure() BEGIN SELECT 1; END;",
defaultProjectID: "default-proj",
want: nil,
wantErr: true,
wantErrMsg: "unanalyzable statements like 'CREATE PROCEDURE' are not allowed",
},
{
name: "create or replace procedure statement",
sql: "CREATE\n OR \nREPLACE \nPROCEDURE my_dataset.my_procedure() BEGIN SELECT 1; END;",
defaultProjectID: "default-proj",
want: nil,
wantErr: true,
wantErrMsg: "unanalyzable statements like 'CREATE OR REPLACE PROCEDURE' are not allowed",
},
{
name: "create function statement",
sql: "CREATE FUNCTION my_dataset.my_function() RETURNS INT64 AS (1);",
defaultProjectID: "default-proj",
want: nil,
wantErr: true,
wantErrMsg: "unanalyzable statements like 'CREATE FUNCTION' are not allowed",
},
}
for _, tc := range testCases {
t.Run(tc.name, func(t *testing.T) {
got, err := bigquerycommon.TableParser(tc.sql, tc.defaultProjectID)
if (err != nil) != tc.wantErr {
t.Errorf("TableParser() error = %v, wantErr %v", err, tc.wantErr)
return
}
if tc.wantErr && tc.wantErrMsg != "" {
if err == nil || !strings.Contains(err.Error(), tc.wantErrMsg) {
t.Errorf("TableParser() error = %v, want err containing %q", err, tc.wantErrMsg)
}
}
// Sort slices to ensure comparison is order-independent.
sort.Strings(got)
sort.Strings(tc.want)
if diff := cmp.Diff(tc.want, got); diff != "" {
t.Errorf("TableParser() mismatch (-want +got):\n%s", diff)
}
})
}
}

View File

@@ -0,0 +1,55 @@
// Copyright 2025 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package bigquerycommon
import (
"context"
"fmt"
bigqueryapi "cloud.google.com/go/bigquery"
bigqueryrestapi "google.golang.org/api/bigquery/v2"
)
// DryRunQuery performs a dry run of the SQL query to validate it and get metadata.
func DryRunQuery(ctx context.Context, restService *bigqueryrestapi.Service, projectID string, location string, sql string, params []*bigqueryrestapi.QueryParameter, connProps []*bigqueryapi.ConnectionProperty) (*bigqueryrestapi.Job, error) {
useLegacySql := false
restConnProps := make([]*bigqueryrestapi.ConnectionProperty, len(connProps))
for i, prop := range connProps {
restConnProps[i] = &bigqueryrestapi.ConnectionProperty{Key: prop.Key, Value: prop.Value}
}
jobToInsert := &bigqueryrestapi.Job{
JobReference: &bigqueryrestapi.JobReference{
ProjectId: projectID,
Location: location,
},
Configuration: &bigqueryrestapi.JobConfiguration{
DryRun: true,
Query: &bigqueryrestapi.JobConfigurationQuery{
Query: sql,
UseLegacySql: &useLegacySql,
ConnectionProperties: restConnProps,
QueryParameters: params,
},
},
}
insertResponse, err := restService.Jobs.Insert(projectID, jobToInsert).Context(ctx).Do()
if err != nil {
return nil, fmt.Errorf("failed to insert dry run job: %w", err)
}
return insertResponse, nil
}
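A minimal sketch of calling DryRunQuery through the BigQuery v2 REST client; the project, location, and SQL are hypothetical, and ambient Application Default Credentials are assumed:

package main

import (
	"context"
	"fmt"
	"log"

	"github.com/googleapis/genai-toolbox/internal/tools/bigquery/bigquerycommon"
	bigqueryrestapi "google.golang.org/api/bigquery/v2"
)

func main() {
	ctx := context.Background()
	restService, err := bigqueryrestapi.NewService(ctx)
	if err != nil {
		log.Fatalf("failed to create BigQuery REST service: %v", err)
	}
	job, err := bigquerycommon.DryRunQuery(ctx, restService, "my-project", "US",
		"SELECT 1 FROM `my-project.my_dataset.my_table`", nil, nil)
	if err != nil {
		log.Fatalf("dry run failed: %v", err)
	}
	// The statement type reported here drives the dataset-restriction checks in the execute-sql tool.
	fmt.Println(job.Statistics.Query.StatementType)
}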

View File

@@ -149,12 +149,7 @@ func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error)
tableRefsParameter := tools.NewStringParameter("table_references", tableRefsDescription)
parameters := tools.Parameters{userQueryParameter, tableRefsParameter}
mcpManifest := tools.McpManifest{
Name: cfg.Name,
Description: cfg.Description,
InputSchema: parameters.McpManifest(),
}
mcpManifest := tools.GetMcpManifest(cfg.Name, cfg.Description, cfg.AuthRequired, parameters)
// Get cloud-platform token source for Gemini Data Analytics API during initialization
var bigQueryTokenSourceWithScope oauth2.TokenSource

View File

@@ -18,12 +18,14 @@ import (
"context"
"encoding/json"
"fmt"
"strings"
bigqueryapi "cloud.google.com/go/bigquery"
yaml "github.com/goccy/go-yaml"
"github.com/googleapis/genai-toolbox/internal/sources"
bigqueryds "github.com/googleapis/genai-toolbox/internal/sources/bigquery"
"github.com/googleapis/genai-toolbox/internal/tools"
bqutil "github.com/googleapis/genai-toolbox/internal/tools/bigquery/bigquerycommon"
"github.com/googleapis/genai-toolbox/internal/util"
bigqueryrestapi "google.golang.org/api/bigquery/v2"
"google.golang.org/api/iterator"
@@ -50,6 +52,8 @@ type compatibleSource interface {
BigQueryRestService() *bigqueryrestapi.Service
BigQueryClientCreator() bigqueryds.BigqueryClientCreator
UseClientAuthorization() bool
IsDatasetAllowed(projectID, datasetID string) bool
BigQueryAllowedDatasets() []string
}
// validate compatible sources are still compatible
@@ -85,7 +89,28 @@ func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error)
return nil, fmt.Errorf("invalid source for %q tool: source kind must be one of %q", kind, compatibleSources)
}
sqlParameter := tools.NewStringParameter("sql", "The sql to execute.")
sqlDescription := "The sql to execute."
allowedDatasets := s.BigQueryAllowedDatasets()
if len(allowedDatasets) > 0 {
datasetIDs := []string{}
for _, ds := range allowedDatasets {
datasetIDs = append(datasetIDs, fmt.Sprintf("`%s`", ds))
}
if len(datasetIDs) == 1 {
parts := strings.Split(allowedDatasets[0], ".")
if len(parts) < 2 {
return nil, fmt.Errorf("expected split to have 2 parts: %s", allowedDatasets[0])
}
datasetID := parts[1]
sqlDescription += fmt.Sprintf(" The query must only access the %s dataset. "+
"To query a table within this dataset (e.g., `my_table`), "+
"qualify it with the dataset id (e.g., `%s.my_table`).", datasetIDs[0], datasetID)
} else {
sqlDescription += fmt.Sprintf(" The query must only access datasets from the following list: %s.", strings.Join(datasetIDs, ", "))
}
}
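// Illustration with a hypothetical allow-list of ["my-project.sales"]: the single-entry branch above yields
// "The sql to execute. The query must only access the `my-project.sales` dataset. To query a table within
// this dataset (e.g., `my_table`), qualify it with the dataset id (e.g., `sales.my_table`)."
// With multiple entries, only the "must only access datasets from the following list" sentence is appended.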
sqlParameter := tools.NewStringParameter("sql", sqlDescription)
dryRunParameter := tools.NewBooleanParameterWithDefault(
"dry_run",
false,
@@ -93,25 +118,22 @@ func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error)
"will be returned without running the query. Defaults to false.",
)
parameters := tools.Parameters{sqlParameter, dryRunParameter}
mcpManifest := tools.McpManifest{
Name: cfg.Name,
Description: cfg.Description,
InputSchema: parameters.McpManifest(),
}
mcpManifest := tools.GetMcpManifest(cfg.Name, cfg.Description, cfg.AuthRequired, parameters)
// finish tool setup
t := Tool{
Name: cfg.Name,
Kind: kind,
Parameters: parameters,
AuthRequired: cfg.AuthRequired,
UseClientOAuth: s.UseClientAuthorization(),
ClientCreator: s.BigQueryClientCreator(),
Client: s.BigQueryClient(),
RestService: s.BigQueryRestService(),
manifest: tools.Manifest{Description: cfg.Description, Parameters: parameters.Manifest(), AuthRequired: cfg.AuthRequired},
mcpManifest: mcpManifest,
Name: cfg.Name,
Kind: kind,
Parameters: parameters,
AuthRequired: cfg.AuthRequired,
UseClientOAuth: s.UseClientAuthorization(),
ClientCreator: s.BigQueryClientCreator(),
Client: s.BigQueryClient(),
RestService: s.BigQueryRestService(),
IsDatasetAllowed: s.IsDatasetAllowed,
AllowedDatasets: allowedDatasets,
manifest: tools.Manifest{Description: cfg.Description, Parameters: parameters.Manifest(), AuthRequired: cfg.AuthRequired},
mcpManifest: mcpManifest,
}
return t, nil
}
@@ -126,11 +148,13 @@ type Tool struct {
UseClientOAuth bool `yaml:"useClientOAuth"`
Parameters tools.Parameters `yaml:"parameters"`
Client *bigqueryapi.Client
RestService *bigqueryrestapi.Service
ClientCreator bigqueryds.BigqueryClientCreator
manifest tools.Manifest
mcpManifest tools.McpManifest
Client *bigqueryapi.Client
RestService *bigqueryrestapi.Service
ClientCreator bigqueryds.BigqueryClientCreator
IsDatasetAllowed func(projectID, datasetID string) bool
AllowedDatasets []string
manifest tools.Manifest
mcpManifest tools.McpManifest
}
func (t Tool) Invoke(ctx context.Context, params tools.ParamValues, accessToken tools.AccessToken) (any, error) {
@@ -160,10 +184,65 @@ func (t Tool) Invoke(ctx context.Context, params tools.ParamValues, accessToken
}
}
dryRunJob, err := dryRunQuery(ctx, restService, bqClient.Project(), bqClient.Location, sql)
dryRunJob, err := bqutil.DryRunQuery(ctx, restService, bqClient.Project(), bqClient.Location, sql, nil, nil)
if err != nil {
return nil, fmt.Errorf("query validation failed during dry run: %w", err)
}
statementType := dryRunJob.Statistics.Query.StatementType
if len(t.AllowedDatasets) > 0 {
switch statementType {
case "CREATE_SCHEMA", "DROP_SCHEMA", "ALTER_SCHEMA":
return nil, fmt.Errorf("dataset-level operations like '%s' are not allowed when dataset restrictions are in place", statementType)
case "CREATE_FUNCTION", "CREATE_TABLE_FUNCTION", "CREATE_PROCEDURE":
return nil, fmt.Errorf("creating stored routines ('%s') is not allowed when dataset restrictions are in place, as their contents cannot be safely analyzed", statementType)
case "CALL":
return nil, fmt.Errorf("calling stored procedures ('%s') is not allowed when dataset restrictions are in place, as their contents cannot be safely analyzed", statementType)
}
// Use a map to avoid duplicate table names.
tableIDSet := make(map[string]struct{})
// Get all tables from the dry run result. This is the most reliable method.
queryStats := dryRunJob.Statistics.Query
if queryStats != nil {
for _, tableRef := range queryStats.ReferencedTables {
tableIDSet[fmt.Sprintf("%s.%s.%s", tableRef.ProjectId, tableRef.DatasetId, tableRef.TableId)] = struct{}{}
}
if tableRef := queryStats.DdlTargetTable; tableRef != nil {
tableIDSet[fmt.Sprintf("%s.%s.%s", tableRef.ProjectId, tableRef.DatasetId, tableRef.TableId)] = struct{}{}
}
if tableRef := queryStats.DdlDestinationTable; tableRef != nil {
tableIDSet[fmt.Sprintf("%s.%s.%s", tableRef.ProjectId, tableRef.DatasetId, tableRef.TableId)] = struct{}{}
}
}
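// At this point tableIDSet holds the fully qualified project.dataset.table IDs reported by the dry run,
// including any DDL target and destination tables.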
var tableNames []string
if len(tableIDSet) > 0 {
for tableID := range tableIDSet {
tableNames = append(tableNames, tableID)
}
} else if statementType != "SELECT" {
// If dry run yields no tables, fall back to the parser for non-SELECT statements
// to catch unsafe operations like EXECUTE IMMEDIATE.
parsedTables, parseErr := bqutil.TableParser(sql, t.Client.Project())
if parseErr != nil {
// If parsing fails (e.g., EXECUTE IMMEDIATE), we cannot guarantee safety, so we must fail.
return nil, fmt.Errorf("could not parse tables from query to validate against allowed datasets: %w", parseErr)
}
tableNames = parsedTables
}
for _, tableID := range tableNames {
parts := strings.Split(tableID, ".")
if len(parts) == 3 {
projectID, datasetID := parts[0], parts[1]
if !t.IsDatasetAllowed(projectID, datasetID) {
return nil, fmt.Errorf("query accesses dataset '%s.%s', which is not in the allowed list", projectID, datasetID)
}
}
}
}
if dryRun {
if dryRunJob != nil {
@@ -177,8 +256,6 @@ func (t Tool) Invoke(ctx context.Context, params tools.ParamValues, accessToken
return "Dry run was requested, but no job information was returned.", nil
}
statementType := dryRunJob.Statistics.Query.StatementType
// JobStatistics.QueryStatistics.StatementType
query := bqClient.Query(sql)
query.Location = bqClient.Location
@@ -248,27 +325,3 @@ func (t Tool) Authorized(verifiedAuthServices []string) bool {
func (t Tool) RequiresClientAuthorization() bool {
return t.UseClientOAuth
}
// dryRunQuery performs a dry run of the SQL query to validate it and get metadata.
func dryRunQuery(ctx context.Context, restService *bigqueryrestapi.Service, projectID string, location string, sql string) (*bigqueryrestapi.Job, error) {
useLegacySql := false
jobToInsert := &bigqueryrestapi.Job{
JobReference: &bigqueryrestapi.JobReference{
ProjectId: projectID,
Location: location,
},
Configuration: &bigqueryrestapi.JobConfiguration{
DryRun: true,
Query: &bigqueryrestapi.JobConfigurationQuery{
Query: sql,
UseLegacySql: &useLegacySql,
},
},
}
insertResponse, err := restService.Jobs.Insert(projectID, jobToInsert).Context(ctx).Do()
if err != nil {
return nil, fmt.Errorf("failed to insert dry run job: %w", err)
}
return insertResponse, nil
}
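For orientation, a condensed, hypothetical sketch of the allow-list enforcement added to Invoke above: collect the datasets of the tables the dry run reports and reject the query if any of them falls outside the allow-list (checkDatasets is an assumed name and omits the DDL-target and parser-fallback handling shown in the diff):

package example

import (
	"fmt"

	bigqueryrestapi "google.golang.org/api/bigquery/v2"
)

// checkDatasets fails if any table referenced by the dry-run job lives outside an allowed dataset.
func checkDatasets(job *bigqueryrestapi.Job, isAllowed func(projectID, datasetID string) bool) error {
	stats := job.Statistics.Query
	if stats == nil {
		return fmt.Errorf("dry run returned no query statistics to validate against allowed datasets")
	}
	for _, ref := range stats.ReferencedTables {
		if !isAllowed(ref.ProjectId, ref.DatasetId) {
			return fmt.Errorf("query accesses dataset '%s.%s', which is not in the allowed list", ref.ProjectId, ref.DatasetId)
		}
	}
	return nil
}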

View File

@@ -24,6 +24,7 @@ import (
"github.com/googleapis/genai-toolbox/internal/sources"
bigqueryds "github.com/googleapis/genai-toolbox/internal/sources/bigquery"
"github.com/googleapis/genai-toolbox/internal/tools"
bqutil "github.com/googleapis/genai-toolbox/internal/tools/bigquery/bigquerycommon"
"github.com/googleapis/genai-toolbox/internal/util"
bigqueryrestapi "google.golang.org/api/bigquery/v2"
"google.golang.org/api/iterator"
@@ -50,6 +51,8 @@ type compatibleSource interface {
BigQueryRestService() *bigqueryrestapi.Service
BigQueryClientCreator() bigqueryds.BigqueryClientCreator
UseClientAuthorization() bool
IsDatasetAllowed(projectID, datasetID string) bool
BigQueryAllowedDatasets() []string
}
// validate compatible sources are still compatible
@@ -85,8 +88,17 @@ func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error)
return nil, fmt.Errorf("invalid source for %q tool: source kind must be one of %q", kind, compatibleSources)
}
historyDataParameter := tools.NewStringParameter("history_data",
"The table id or the query of the history time series data.")
allowedDatasets := s.BigQueryAllowedDatasets()
historyDataDescription := "The table id or the query of the history time series data."
if len(allowedDatasets) > 0 {
datasetIDs := []string{}
for _, ds := range allowedDatasets {
datasetIDs = append(datasetIDs, fmt.Sprintf("`%s`", ds))
}
historyDataDescription += fmt.Sprintf(" The query or table must only access datasets from the following list: %s.", strings.Join(datasetIDs, ", "))
}
historyDataParameter := tools.NewStringParameter("history_data", historyDataDescription)
timestampColumnNameParameter := tools.NewStringParameter("timestamp_col",
"The name of the time series timestamp column.")
dataColumnNameParameter := tools.NewStringParameter("data_col",
@@ -98,24 +110,22 @@ func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error)
parameters := tools.Parameters{historyDataParameter,
timestampColumnNameParameter, dataColumnNameParameter, idColumnNameParameter, horizonParameter}
mcpManifest := tools.McpManifest{
Name: cfg.Name,
Description: cfg.Description,
InputSchema: parameters.McpManifest(),
}
mcpManifest := tools.GetMcpManifest(cfg.Name, cfg.Description, cfg.AuthRequired, parameters)
// finish tool setup
t := Tool{
Name: cfg.Name,
Kind: kind,
Parameters: parameters,
AuthRequired: cfg.AuthRequired,
UseClientOAuth: s.UseClientAuthorization(),
ClientCreator: s.BigQueryClientCreator(),
Client: s.BigQueryClient(),
RestService: s.BigQueryRestService(),
manifest: tools.Manifest{Description: cfg.Description, Parameters: parameters.Manifest(), AuthRequired: cfg.AuthRequired},
mcpManifest: mcpManifest,
Name: cfg.Name,
Kind: kind,
Parameters: parameters,
AuthRequired: cfg.AuthRequired,
UseClientOAuth: s.UseClientAuthorization(),
ClientCreator: s.BigQueryClientCreator(),
Client: s.BigQueryClient(),
RestService: s.BigQueryRestService(),
IsDatasetAllowed: s.IsDatasetAllowed,
AllowedDatasets: allowedDatasets,
manifest: tools.Manifest{Description: cfg.Description, Parameters: parameters.Manifest(), AuthRequired: cfg.AuthRequired},
mcpManifest: mcpManifest,
}
return t, nil
}
@@ -130,11 +140,13 @@ type Tool struct {
UseClientOAuth bool `yaml:"useClientOAuth"`
Parameters tools.Parameters `yaml:"parameters"`
Client *bigqueryapi.Client
RestService *bigqueryrestapi.Service
ClientCreator bigqueryds.BigqueryClientCreator
manifest tools.Manifest
mcpManifest tools.McpManifest
Client *bigqueryapi.Client
RestService *bigqueryrestapi.Service
ClientCreator bigqueryds.BigqueryClientCreator
IsDatasetAllowed func(projectID, datasetID string) bool
AllowedDatasets []string
manifest tools.Manifest
mcpManifest tools.McpManifest
}
func (t Tool) Invoke(ctx context.Context, params tools.ParamValues, accessToken tools.AccessToken) (any, error) {
@@ -175,8 +187,48 @@ func (t Tool) Invoke(ctx context.Context, params tools.ParamValues, accessToken
var historyDataSource string
trimmedUpperHistoryData := strings.TrimSpace(strings.ToUpper(historyData))
if strings.HasPrefix(trimmedUpperHistoryData, "SELECT") || strings.HasPrefix(trimmedUpperHistoryData, "WITH") {
if len(t.AllowedDatasets) > 0 {
dryRunJob, err := bqutil.DryRunQuery(ctx, t.RestService, t.Client.Project(), t.Client.Location, historyData, nil, nil)
if err != nil {
return nil, fmt.Errorf("query validation failed during dry run: %w", err)
}
statementType := dryRunJob.Statistics.Query.StatementType
if statementType != "SELECT" {
return nil, fmt.Errorf("the 'history_data' parameter only supports a table ID or a SELECT query. The provided query has statement type '%s'", statementType)
}
queryStats := dryRunJob.Statistics.Query
if queryStats != nil {
for _, tableRef := range queryStats.ReferencedTables {
if !t.IsDatasetAllowed(tableRef.ProjectId, tableRef.DatasetId) {
return nil, fmt.Errorf("query in history_data accesses dataset '%s.%s', which is not in the allowed list", tableRef.ProjectId, tableRef.DatasetId)
}
}
} else {
return nil, fmt.Errorf("could not analyze query in history_data to validate against allowed datasets")
}
}
historyDataSource = fmt.Sprintf("(%s)", historyData)
} else {
if len(t.AllowedDatasets) > 0 {
parts := strings.Split(historyData, ".")
var projectID, datasetID string
switch len(parts) {
case 3: // project.dataset.table
projectID = parts[0]
datasetID = parts[1]
case 2: // dataset.table
projectID = t.Client.Project()
datasetID = parts[0]
default:
return nil, fmt.Errorf("invalid table ID format for 'history_data': %q. Expected 'dataset.table' or 'project.dataset.table'", historyData)
}
if !t.IsDatasetAllowed(projectID, datasetID) {
return nil, fmt.Errorf("access to dataset '%s.%s' (from table '%s') is not allowed", projectID, datasetID, historyData)
}
}
historyDataSource = fmt.Sprintf("TABLE `%s`", historyData)
}
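To make the table-ID branch above concrete, a small hypothetical helper that simply restates its parsing rules (resolveDataset does not exist in the change):

package example

import (
	"fmt"
	"strings"
)

// resolveDataset extracts the project and dataset from a 'dataset.table' or 'project.dataset.table' ID.
func resolveDataset(defaultProject, tableID string) (projectID, datasetID string, err error) {
	parts := strings.Split(tableID, ".")
	switch len(parts) {
	case 3: // project.dataset.table
		return parts[0], parts[1], nil
	case 2: // dataset.table
		return defaultProject, parts[0], nil
	default:
		return "", "", fmt.Errorf("invalid table ID format %q: expected 'dataset.table' or 'project.dataset.table'", tableID)
	}
}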

View File

@@ -87,11 +87,7 @@ func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error)
datasetParameter := tools.NewStringParameter(datasetKey, "The dataset to get metadata information.")
parameters := tools.Parameters{projectParameter, datasetParameter}
mcpManifest := tools.McpManifest{
Name: cfg.Name,
Description: cfg.Description,
InputSchema: parameters.McpManifest(),
}
mcpManifest := tools.GetMcpManifest(cfg.Name, cfg.Description, cfg.AuthRequired, parameters)
// finish tool setup
t := Tool{

View File

@@ -89,11 +89,7 @@ func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error)
tableParameter := tools.NewStringParameter(tableKey, "The table to get metadata information.")
parameters := tools.Parameters{projectParameter, datasetParameter, tableParameter}
mcpManifest := tools.McpManifest{
Name: cfg.Name,
Description: cfg.Description,
InputSchema: parameters.McpManifest(),
}
mcpManifest := tools.GetMcpManifest(cfg.Name, cfg.Description, cfg.AuthRequired, parameters)
// finish tool setup
t := Tool{

View File

@@ -87,11 +87,7 @@ func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error)
parameters := tools.Parameters{projectParameter}
mcpManifest := tools.McpManifest{
Name: cfg.Name,
Description: cfg.Description,
InputSchema: parameters.McpManifest(),
}
mcpManifest := tools.GetMcpManifest(cfg.Name, cfg.Description, cfg.AuthRequired, parameters)
// finish tool setup
t := Tool{

View File

@@ -128,11 +128,7 @@ func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error)
parameters := tools.Parameters{projectParameter, datasetParameter}
mcpManifest := tools.McpManifest{
Name: cfg.Name,
Description: cfg.Description,
InputSchema: parameters.McpManifest(),
}
mcpManifest := tools.GetMcpManifest(cfg.Name, cfg.Description, cfg.AuthRequired, parameters)
// finish tool setup
t := Tool{

View File

@@ -96,12 +96,7 @@ func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error)
if cfg.Description != "" {
description = cfg.Description
}
mcpManifest := tools.McpManifest{
Name: cfg.Name,
Description: description,
InputSchema: parameters.McpManifest(),
}
mcpManifest := tools.GetMcpManifest(cfg.Name, description, cfg.AuthRequired, parameters)
t := Tool{
Name: cfg.Name,

View File

@@ -26,6 +26,7 @@ import (
bigqueryds "github.com/googleapis/genai-toolbox/internal/sources/bigquery"
"github.com/googleapis/genai-toolbox/internal/tools"
bqutil "github.com/googleapis/genai-toolbox/internal/tools/bigquery/bigquerycommon"
bigqueryrestapi "google.golang.org/api/bigquery/v2"
"google.golang.org/api/iterator"
)
@@ -89,16 +90,12 @@ func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error)
return nil, fmt.Errorf("invalid source for %q tool: source kind must be one of %q", kind, compatibleSources)
}
allParameters, paramManifest, paramMcpManifest, err := tools.ProcessParameters(cfg.TemplateParameters, cfg.Parameters)
allParameters, paramManifest, err := tools.ProcessParameters(cfg.TemplateParameters, cfg.Parameters)
if err != nil {
return nil, err
}
mcpManifest := tools.McpManifest{
Name: cfg.Name,
Description: cfg.Description,
InputSchema: paramMcpManifest,
}
mcpManifest := tools.GetMcpManifest(cfg.Name, cfg.Description, cfg.AuthRequired, allParameters)
// finish tool setup
t := Tool{
@@ -236,7 +233,7 @@ func (t Tool) Invoke(ctx context.Context, params tools.ParamValues, accessToken
query.Parameters = highLevelParams
query.Location = bqClient.Location
dryRunJob, err := dryRunQuery(ctx, restService, bqClient.Project(), bqClient.Location, newStatement, lowLevelParams, query.ConnectionProperties)
dryRunJob, err := bqutil.DryRunQuery(ctx, restService, bqClient.Project(), bqClient.Location, newStatement, lowLevelParams, query.ConnectionProperties)
if err != nil {
// This is a fallback check in case the switch logic was bypassed.
return nil, fmt.Errorf("final query validation failed: %w", err)
@@ -319,42 +316,3 @@ func BQTypeStringFromToolType(toolType string) (string, error) {
return "", fmt.Errorf("unsupported tool parameter type for BigQuery: %s", toolType)
}
}
func dryRunQuery(
ctx context.Context,
restService *bigqueryrestapi.Service,
projectID string,
location string,
sql string,
params []*bigqueryrestapi.QueryParameter,
connProps []*bigqueryapi.ConnectionProperty,
) (*bigqueryrestapi.Job, error) {
useLegacySql := false
restConnProps := make([]*bigqueryrestapi.ConnectionProperty, len(connProps))
for i, prop := range connProps {
restConnProps[i] = &bigqueryrestapi.ConnectionProperty{Key: prop.Key, Value: prop.Value}
}
jobToInsert := &bigqueryrestapi.Job{
JobReference: &bigqueryrestapi.JobReference{
ProjectId: projectID,
Location: location,
},
Configuration: &bigqueryrestapi.JobConfiguration{
DryRun: true,
Query: &bigqueryrestapi.JobConfigurationQuery{
Query: sql,
UseLegacySql: &useLegacySql,
ConnectionProperties: restConnProps,
QueryParameters: params,
},
},
}
insertResponse, err := restService.Jobs.Insert(projectID, jobToInsert).Context(ctx).Do()
if err != nil {
return nil, fmt.Errorf("failed to insert dry run job: %w", err)
}
return insertResponse, nil
}

View File

@@ -81,16 +81,12 @@ func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error)
return nil, fmt.Errorf("invalid source for %q tool: source kind must be one of %q", kind, compatibleSources)
}
allParameters, paramManifest, paramMcpManifest, err := tools.ProcessParameters(cfg.TemplateParameters, cfg.Parameters)
allParameters, paramManifest, err := tools.ProcessParameters(cfg.TemplateParameters, cfg.Parameters)
if err != nil {
return nil, err
}
mcpManifest := tools.McpManifest{
Name: cfg.Name,
Description: cfg.Description,
InputSchema: paramMcpManifest,
}
mcpManifest := tools.GetMcpManifest(cfg.Name, cfg.Description, cfg.AuthRequired, allParameters)
// finish tool setup
t := Tool{

View File

@@ -0,0 +1,178 @@
// Copyright 2025 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package cassandracql
import (
"context"
"fmt"
yaml "github.com/goccy/go-yaml"
"github.com/gocql/gocql"
"github.com/googleapis/genai-toolbox/internal/sources"
"github.com/googleapis/genai-toolbox/internal/sources/cassandra"
"github.com/googleapis/genai-toolbox/internal/tools"
)
const kind string = "cassandra-cql"
func init() {
if !tools.Register(kind, newConfig) {
panic(fmt.Sprintf("tool kind %q already registered", kind))
}
}
func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.ToolConfig, error) {
actual := Config{Name: name}
if err := decoder.DecodeContext(ctx, &actual); err != nil {
return nil, err
}
return actual, nil
}
type compatibleSource interface {
CassandraSession() *gocql.Session
}
var _ compatibleSource = &cassandra.Source{}
var compatibleSources = [...]string{cassandra.SourceKind}
type Config struct {
Name string `yaml:"name" validate:"required"`
Kind string `yaml:"kind" validate:"required"`
Source string `yaml:"source" validate:"required"`
Description string `yaml:"description" validate:"required"`
Statement string `yaml:"statement" validate:"required"`
AuthRequired []string `yaml:"authRequired"`
Parameters tools.Parameters `yaml:"parameters"`
TemplateParameters tools.Parameters `yaml:"templateParameters"`
}
// Initialize implements tools.ToolConfig.
func (c Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error) {
// verify source exists
rawS, ok := srcs[c.Source]
if !ok {
return nil, fmt.Errorf("no source named %q configured", c.Source)
}
// verify the source is compatible
s, ok := rawS.(compatibleSource)
if !ok {
return nil, fmt.Errorf("invalid source for %q tool: source kind must be one of %q", kind, compatibleSources)
}
allParameters, paramManifest, err := tools.ProcessParameters(c.TemplateParameters, c.Parameters)
if err != nil {
return nil, err
}
mcpManifest := tools.GetMcpManifest(c.Name, c.Description, c.AuthRequired, allParameters)
t := Tool{
Name: c.Name,
Kind: kind,
Parameters: c.Parameters,
TemplateParameters: c.TemplateParameters,
AllParams: allParameters,
Statement: c.Statement,
AuthRequired: c.AuthRequired,
Session: s.CassandraSession(),
manifest: tools.Manifest{Description: c.Description, Parameters: paramManifest, AuthRequired: c.AuthRequired},
mcpManifest: mcpManifest,
}
return t, nil
}
// ToolConfigKind implements tools.ToolConfig.
func (c Config) ToolConfigKind() string {
return kind
}
var _ tools.ToolConfig = Config{}
type Tool struct {
Name string `yaml:"name"`
Kind string `yaml:"kind"`
AuthRequired []string `yaml:"authRequired"`
Parameters tools.Parameters `yaml:"parameters"`
TemplateParameters tools.Parameters `yaml:"templateParameters"`
AllParams tools.Parameters `yaml:"allParams"`
Session *gocql.Session
Statement string
manifest tools.Manifest
mcpManifest tools.McpManifest
}
// RequiresClientAuthorization implements tools.Tool.
func (t Tool) RequiresClientAuthorization() bool {
return false
}
// Authorized implements tools.Tool.
func (t Tool) Authorized(verifiedAuthServices []string) bool {
return tools.IsAuthorized(t.AuthRequired, verifiedAuthServices)
}
// Invoke implements tools.Tool.
func (t Tool) Invoke(ctx context.Context, params tools.ParamValues, accessToken tools.AccessToken) (any, error) {
paramsMap := params.AsMap()
newStatement, err := tools.ResolveTemplateParams(t.TemplateParameters, t.Statement, paramsMap)
if err != nil {
return nil, fmt.Errorf("unable to extract template params %w", err)
}
newParams, err := tools.GetParams(t.Parameters, paramsMap)
if err != nil {
return nil, fmt.Errorf("unable to extract standard params %w", err)
}
sliceParams := newParams.AsSlice()
iter := t.Session.Query(newStatement, sliceParams...).WithContext(ctx).Iter()
// Create a slice to store the output rows
var out []map[string]interface{}
// Scan results into a map and append to the slice
for {
row := make(map[string]interface{}) // Create a new map for each row
if !iter.MapScan(row) {
break // No more rows
}
out = append(out, row)
}
if err := iter.Close(); err != nil {
return nil, fmt.Errorf("unable to parse rows: %w", err)
}
return out, nil
}
// Manifest implements tools.Tool.
func (t Tool) Manifest() tools.Manifest {
return t.manifest
}
// McpManifest implements tools.Tool.
func (t Tool) McpManifest() tools.McpManifest {
return t.mcpManifest
}
// ParseParams implements tools.Tool.
func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (tools.ParamValues, error) {
return tools.ParseParams(t.AllParams, data, claims)
}
var _ tools.Tool = Tool{}

View File

@@ -0,0 +1,171 @@
// Copyright 2025 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package cassandracql_test
import (
"testing"
yaml "github.com/goccy/go-yaml"
"github.com/google/go-cmp/cmp"
"github.com/googleapis/genai-toolbox/internal/server"
"github.com/googleapis/genai-toolbox/internal/testutils"
"github.com/googleapis/genai-toolbox/internal/tools"
"github.com/googleapis/genai-toolbox/internal/tools/cassandra/cassandracql"
)
func TestParseFromYamlCassandra(t *testing.T) {
ctx, err := testutils.ContextWithNewLogger()
if err != nil {
t.Fatalf("unexpected error: %s", err)
}
tcs := []struct {
desc string
in string
want server.ToolConfigs
}{
{
desc: "basic example",
in: `
tools:
example_tool:
kind: cassandra-cql
source: my-cassandra-instance
description: some description
statement: |
SELECT * FROM CQL_STATEMENT;
authRequired:
- my-google-auth-service
- other-auth-service
parameters:
- name: country
type: string
description: some description
authServices:
- name: my-google-auth-service
field: user_id
- name: other-auth-service
field: user_id
`,
want: server.ToolConfigs{
"example_tool": cassandracql.Config{
Name: "example_tool",
Kind: "cassandra-cql",
Source: "my-cassandra-instance",
Description: "some description",
Statement: "SELECT * FROM CQL_STATEMENT;\n",
AuthRequired: []string{"my-google-auth-service", "other-auth-service"},
Parameters: []tools.Parameter{
tools.NewStringParameterWithAuth("country", "some description",
[]tools.ParamAuthService{{Name: "my-google-auth-service", Field: "user_id"},
{Name: "other-auth-service", Field: "user_id"}}),
},
},
},
},
{
desc: "with template parameters",
in: `
tools:
example_tool:
kind: cassandra-cql
source: my-cassandra-instance
description: some description
statement: |
SELECT * FROM CQL_STATEMENT;
authRequired:
- my-google-auth-service
- other-auth-service
parameters:
- name: country
type: string
description: some description
authServices:
- name: my-google-auth-service
field: user_id
- name: other-auth-service
field: user_id
templateParameters:
- name: tableName
type: string
description: some description.
- name: fieldArray
type: array
description: The columns to return for the query.
items:
name: column
type: string
description: A column name that will be returned from the query.
`,
want: server.ToolConfigs{
"example_tool": cassandracql.Config{
Name: "example_tool",
Kind: "cassandra-cql",
Source: "my-cassandra-instance",
Description: "some description",
Statement: "SELECT * FROM CQL_STATEMENT;\n",
AuthRequired: []string{"my-google-auth-service", "other-auth-service"},
Parameters: []tools.Parameter{
tools.NewStringParameterWithAuth("country", "some description",
[]tools.ParamAuthService{{Name: "my-google-auth-service", Field: "user_id"},
{Name: "other-auth-service", Field: "user_id"}}),
},
TemplateParameters: []tools.Parameter{
tools.NewStringParameter("tableName", "some description."),
tools.NewArrayParameter("fieldArray", "The columns to return for the query.", tools.NewStringParameter("column", "A column name that will be returned from the query.")),
},
},
},
},
{
desc: "without optional fields",
in: `
tools:
example_tool:
kind: cassandra-cql
source: my-cassandra-instance
description: some description
statement: |
SELECT * FROM CQL_STATEMENT;
`,
want: server.ToolConfigs{
"example_tool": cassandracql.Config{
Name: "example_tool",
Kind: "cassandra-cql",
Source: "my-cassandra-instance",
Description: "some description",
Statement: "SELECT * FROM CQL_STATEMENT;\n",
AuthRequired: []string{},
Parameters: nil,
TemplateParameters: nil,
},
},
},
}
for _, tc := range tcs {
t.Run(tc.desc, func(t *testing.T) {
got := struct {
Tools server.ToolConfigs `yaml:"tools"`
}{}
// Parse contents
err := yaml.UnmarshalContext(ctx, testutils.FormatYaml(tc.in), &got)
if err != nil {
t.Fatalf("unable to unmarshal: %s", err)
}
if diff := cmp.Diff(tc.want, got.Tools); diff != "" {
t.Fatalf("incorrect parse: diff %v", diff)
}
})
}
}

View File

@@ -74,11 +74,7 @@ func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error)
sqlParameter := tools.NewStringParameter("sql", "The SQL statement to execute.")
parameters := tools.Parameters{sqlParameter}
mcpManifest := tools.McpManifest{
Name: cfg.Name,
Description: cfg.Description,
InputSchema: parameters.McpManifest(),
}
mcpManifest := tools.GetMcpManifest(cfg.Name, cfg.Description, cfg.AuthRequired, parameters)
t := ExecuteSQLTool{
Name: cfg.Name,

View File

@@ -72,13 +72,8 @@ func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error)
return nil, fmt.Errorf("invalid source for %q tool: source kind must be one of %q", listDatabasesKind, compatibleSources)
}
allParameters, paramManifest, paramMcpManifest, _ := tools.ProcessParameters(nil, cfg.Parameters)
mcpManifest := tools.McpManifest{
Name: cfg.Name,
Description: cfg.Description,
InputSchema: paramMcpManifest,
}
allParameters, paramManifest, _ := tools.ProcessParameters(nil, cfg.Parameters)
mcpManifest := tools.GetMcpManifest(cfg.Name, cfg.Description, cfg.AuthRequired, allParameters)
t := Tool{
Name: cfg.Name,

View File

@@ -76,13 +76,8 @@ func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error)
databaseParameter := tools.NewStringParameter(databaseKey, "The database to list tables from.")
parameters := tools.Parameters{databaseParameter}
allParameters, paramManifest, paramMcpManifest, _ := tools.ProcessParameters(nil, parameters)
mcpManifest := tools.McpManifest{
Name: cfg.Name,
Description: cfg.Description,
InputSchema: paramMcpManifest,
}
allParameters, paramManifest, _ := tools.ProcessParameters(nil, parameters)
mcpManifest := tools.GetMcpManifest(cfg.Name, cfg.Description, cfg.AuthRequired, allParameters)
t := Tool{
Name: cfg.Name,

View File

@@ -74,13 +74,8 @@ func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error)
return nil, fmt.Errorf("invalid source for %q tool: source kind must be one of %q", sqlKind, compatibleSources)
}
allParameters, paramManifest, paramMcpManifest, _ := tools.ProcessParameters(cfg.TemplateParameters, cfg.Parameters)
mcpManifest := tools.McpManifest{
Name: cfg.Name,
Description: cfg.Description,
InputSchema: paramMcpManifest,
}
allParameters, paramManifest, _ := tools.ProcessParameters(cfg.TemplateParameters, cfg.Parameters)
mcpManifest := tools.GetMcpManifest(cfg.Name, cfg.Description, cfg.AuthRequired, allParameters)
t := Tool{
Name: cfg.Name,

View File

@@ -77,6 +77,8 @@ func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error)
tools.NewStringParameterWithRequired("query", "The promql query to execute.", true),
}
mcpManifest := tools.GetMcpManifest(cfg.Name, cfg.Description, cfg.AuthRequired, allParameters)
return Tool{
Name: cfg.Name,
Kind: kind,
@@ -86,7 +88,7 @@ func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error)
UserAgent: s.UserAgent,
Client: s.Client,
manifest: tools.Manifest{Description: cfg.Description, Parameters: allParameters.Manifest()},
mcpManifest: tools.McpManifest{Name: cfg.Name, Description: cfg.Description, InputSchema: allParameters.McpManifest()},
mcpManifest: mcpManifest,
}, nil
}

View File

@@ -76,19 +76,11 @@ func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error)
}
paramManifest := allParameters.Manifest()
inputSchema := allParameters.McpManifest()
inputSchema.Required = []string{"project", "instance", "name"}
description := cfg.Description
if description == "" {
description = "Creates a new database in a Cloud SQL instance."
}
mcpManifest := tools.McpManifest{
Name: cfg.Name,
Description: description,
InputSchema: inputSchema,
}
mcpManifest := tools.GetMcpManifest(cfg.Name, description, cfg.AuthRequired, allParameters)
return Tool{
Name: cfg.Name,

View File

@@ -78,18 +78,11 @@ func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error)
}
paramManifest := allParameters.Manifest()
inputSchema := allParameters.McpManifest()
description := cfg.Description
if description == "" {
description = "Creates a new user in a Cloud SQL instance. Both built-in and IAM users are supported. IAM users require an email account as the user name. IAM is the more secure and recommended way to manage users. The agent should always ask the user what type of user they want to create. For more information, see https://cloud.google.com/sql/docs/postgres/add-manage-iam-users"
}
mcpManifest := tools.McpManifest{
Name: cfg.Name,
Description: description,
InputSchema: inputSchema,
}
mcpManifest := tools.GetMcpManifest(cfg.Name, description, cfg.AuthRequired, allParameters)
return Tool{
Name: cfg.Name,

View File

@@ -75,19 +75,11 @@ func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error)
}
paramManifest := allParameters.Manifest()
inputSchema := allParameters.McpManifest()
inputSchema.Required = []string{"projectId", "instanceId"}
description := cfg.Description
if description == "" {
description = "Gets a particular cloud sql instance."
}
mcpManifest := tools.McpManifest{
Name: cfg.Name,
Description: description,
InputSchema: inputSchema,
}
mcpManifest := tools.GetMcpManifest(cfg.Name, description, cfg.AuthRequired, allParameters)
return Tool{
Name: cfg.Name,

View File

@@ -74,19 +74,11 @@ func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error)
}
paramManifest := allParameters.Manifest()
inputSchema := allParameters.McpManifest()
inputSchema.Required = []string{"project", "instance"}
description := cfg.Description
if description == "" {
description = "Lists all databases for a Cloud SQL instance."
}
mcpManifest := tools.McpManifest{
Name: cfg.Name,
Description: description,
InputSchema: inputSchema,
}
mcpManifest := tools.GetMcpManifest(cfg.Name, description, cfg.AuthRequired, allParameters)
return Tool{
Name: cfg.Name,

View File

@@ -73,19 +73,11 @@ func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error)
}
paramManifest := allParameters.Manifest()
inputSchema := allParameters.McpManifest()
inputSchema.Required = []string{"project"}
description := cfg.Description
if description == "" {
description = "Lists all type of Cloud SQL instances for a project."
}
mcpManifest := tools.McpManifest{
Name: cfg.Name,
Description: description,
InputSchema: inputSchema,
}
mcpManifest := tools.GetMcpManifest(cfg.Name, description, cfg.AuthRequired, allParameters)
return Tool{
Name: cfg.Name,

View File

@@ -128,19 +128,11 @@ func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error)
}
paramManifest := allParameters.Manifest()
inputSchema := allParameters.McpManifest()
inputSchema.Required = []string{"project", "operation"}
description := cfg.Description
if description == "" {
description = "This will poll on operations API until the operation is done. For checking operation status we need projectId and operationId. Once instance is created give follow up steps on how to use the variables to bring data plane MCP server up in local and remote setup."
}
mcpManifest := tools.McpManifest{
Name: cfg.Name,
Description: description,
InputSchema: inputSchema,
}
mcpManifest := tools.GetMcpManifest(cfg.Name, description, cfg.AuthRequired, allParameters)
var delay time.Duration
if cfg.Delay == "" {

View File

@@ -79,19 +79,11 @@ func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error)
}
paramManifest := allParameters.Manifest()
inputSchema := allParameters.McpManifest()
inputSchema.Required = []string{"project", "name", "editionPreset", "rootPassword"}
description := cfg.Description
if description == "" {
description = "Creates a SQL Server instance using `Production` and `Development` presets. For the `Development` template, it chooses a 2 vCPU, 8 GiB RAM (`db-custom-2-8192`) configuration with Non-HA/zonal availability. For the `Production` template, it chooses a 4 vCPU, 26 GiB RAM (`db-custom-4-26624`) configuration with HA/regional availability. The Enterprise edition is used in both cases. The default database version is `SQLSERVER_2022_STANDARD`. The agent should ask the user if they want to use a different version."
}
mcpManifest := tools.McpManifest{
Name: cfg.Name,
Description: description,
InputSchema: inputSchema,
}
mcpManifest := tools.GetMcpManifest(cfg.Name, description, cfg.AuthRequired, allParameters)
return Tool{
Name: cfg.Name,

View File

@@ -79,19 +79,11 @@ func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error)
}
paramManifest := allParameters.Manifest()
inputSchema := allParameters.McpManifest()
inputSchema.Required = []string{"project", "name", "editionPreset", "rootPassword"}
description := cfg.Description
if description == "" {
description = "Creates a MySQL instance using `Production` and `Development` presets. For the `Development` template, it chooses a 2 vCPU, 16 GiB RAM, 100 GiB SSD configuration with Non-HA/zonal availability. For the `Production` template, it chooses an 8 vCPU, 64 GiB RAM, 250 GiB SSD configuration with HA/regional availability. The Enterprise Plus edition is used in both cases. The default database version is `MYSQL_8_4`. The agent should ask the user if they want to use a different version."
}
mcpManifest := tools.McpManifest{
Name: cfg.Name,
Description: description,
InputSchema: inputSchema,
}
mcpManifest := tools.GetMcpManifest(cfg.Name, description, cfg.AuthRequired, allParameters)
return Tool{
Name: cfg.Name,

View File

@@ -79,19 +79,11 @@ func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error)
}
paramManifest := allParameters.Manifest()
inputSchema := allParameters.McpManifest()
inputSchema.Required = []string{"project", "name", "editionPreset", "rootPassword"}
description := cfg.Description
if description == "" {
description = "Creates a Postgres instance using `Production` and `Development` presets. For the `Development` template, it chooses a 2 vCPU, 16 GiB RAM, 100 GiB SSD configuration with Non-HA/zonal availability. For the `Production` template, it chooses an 8 vCPU, 64 GiB RAM, 250 GiB SSD configuration with HA/regional availability. The Enterprise Plus edition is used in both cases. The default database version is `POSTGRES_17`. The agent should ask the user if they want to use a different version."
}
mcpManifest := tools.McpManifest{
Name: cfg.Name,
Description: description,
InputSchema: inputSchema,
}
mcpManifest := tools.GetMcpManifest(cfg.Name, description, cfg.AuthRequired, allParameters)
return Tool{
Name: cfg.Name,

View File

@@ -83,16 +83,12 @@ func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error)
return nil, fmt.Errorf("invalid source for %q tool: source kind must be one of %q", kind, compatibleSources)
}
allParameters, paramManifest, paramMcpManifest, err := tools.ProcessParameters(cfg.TemplateParameters, cfg.Parameters)
allParameters, paramManifest, err := tools.ProcessParameters(cfg.TemplateParameters, cfg.Parameters)
if err != nil {
return nil, err
}
mcpManifest := tools.McpManifest{
Name: cfg.Name,
Description: cfg.Description,
InputSchema: paramMcpManifest,
}
mcpManifest := tools.GetMcpManifest(cfg.Name, cfg.Description, cfg.AuthRequired, allParameters)
// finish tool setup
t := Tool{
Name: cfg.Name,

View File

@@ -100,11 +100,7 @@ func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error)
entry := tools.NewStringParameter("entry", "The resource name of the Entry in the following form: projects/{project}/locations/{location}/entryGroups/{entryGroup}/entries/{entry}.")
parameters := tools.Parameters{name, view, aspectTypes, entry}
mcpManifest := tools.McpManifest{
Name: cfg.Name,
Description: cfg.Description,
InputSchema: parameters.McpManifest(),
}
mcpManifest := tools.GetMcpManifest(cfg.Name, cfg.Description, cfg.AuthRequired, parameters)
t := Tool{
Name: cfg.Name,

View File

@@ -85,11 +85,7 @@ func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error)
orderBy := tools.NewStringParameterWithDefault("orderBy", "relevance", "Specifies the ordering of results. Supported values are: relevance, last_modified_timestamp, last_modified_timestamp asc")
parameters := tools.Parameters{query, pageSize, orderBy}
mcpManifest := tools.McpManifest{
Name: cfg.Name,
Description: cfg.Description,
InputSchema: parameters.McpManifest(),
}
mcpManifest := tools.GetMcpManifest(cfg.Name, cfg.Description, cfg.AuthRequired, parameters)
t := Tool{
Name: cfg.Name,

View File

@@ -84,11 +84,7 @@ func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error)
orderBy := tools.NewStringParameterWithDefault("orderBy", "relevance", "Specifies the ordering of results. Supported values are: relevance, last_modified_timestamp, last_modified_timestamp asc")
parameters := tools.Parameters{query, pageSize, orderBy}
mcpManifest := tools.McpManifest{
Name: cfg.Name,
Description: cfg.Description,
InputSchema: parameters.McpManifest(),
}
mcpManifest := tools.GetMcpManifest(cfg.Name, cfg.Description, cfg.AuthRequired, parameters)
t := Tool{
Name: cfg.Name,

View File

@@ -82,11 +82,7 @@ func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error)
return nil, fmt.Errorf("invalid source for %q tool: source kind must be one of %q", kind, compatibleSources)
}
mcpManifest := tools.McpManifest{
Name: cfg.Name,
Description: cfg.Description,
InputSchema: cfg.Parameters.McpManifest(),
}
mcpManifest := tools.GetMcpManifest(cfg.Name, cfg.Description, cfg.AuthRequired, cfg.Parameters)
// finish tool setup
t := Tool{

View File

@@ -78,23 +78,14 @@ func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error)
sqlParameter := tools.NewStringParameter("sql", "The sql to execute.")
parameters := tools.Parameters{sqlParameter}
_, paramManifest, paramMcpManifest, err := tools.ProcessParameters(nil, parameters)
if err != nil {
return nil, err
}
mcpManifest := tools.McpManifest{
Name: cfg.Name,
Description: cfg.Description,
InputSchema: paramMcpManifest,
}
mcpManifest := tools.GetMcpManifest(cfg.Name, cfg.Description, cfg.AuthRequired, parameters)
t := &Tool{
Name: cfg.Name,
Parameters: parameters,
AuthRequired: cfg.AuthRequired,
Db: s.FirebirdDB(),
manifest: tools.Manifest{Description: cfg.Description, Parameters: paramManifest, AuthRequired: cfg.AuthRequired},
manifest: tools.Manifest{Description: cfg.Description, Parameters: parameters.Manifest(), AuthRequired: cfg.AuthRequired},
mcpManifest: mcpManifest,
}
return t, nil

Some files were not shown because too many files have changed in this diff.