Compare commits


190 Commits

Author SHA1 Message Date
Harsh Jha
baa070d631 Merge branch 'sts-python-ci' of https://github.com/googleapis/genai-toolbox into sts-python-poc 2025-09-23 16:19:29 +05:30
Harsh Jha
2a49d93c31 Merge branch 'main' into sts-python-ci 2025-09-23 16:14:40 +05:30
Harsh Jha
e2147d0623 chore: removed deps.json variable from script 2025-09-23 16:14:18 +05:30
Harsh Jha
cdf648c83f chore: removed deps.json 2025-09-23 16:13:40 +05:30
Harsh Jha
db3abc67b5 Merge branch 'sts-python-ci' of https://github.com/googleapis/genai-toolbox into sts-python-poc 2025-09-23 15:09:48 +05:30
Harsh Jha
fb524b91e0 Merge branch 'main' into sts-python-ci 2025-09-23 15:09:01 +05:30
Harsh Jha
48ed91c8d1 Merge branch 'sts-python-ci' of https://github.com/googleapis/genai-toolbox into sts-python-poc 2025-09-19 14:13:53 +05:30
Harsh Jha
73e196d3e6 nit: renamed files 2025-09-19 14:12:19 +05:30
Harsh Jha
2c9a27f24f Merge branch 'sts-python-ci' of https://github.com/googleapis/genai-toolbox into sts-python-poc 2025-09-19 11:59:38 +05:30
Harsh Jha
36b9d1ac3c chore: revamped folder structure 2025-09-19 11:58:49 +05:30
Harsh Jha
4dc9cf8e69 Merge branch 'main' into sts-python-ci 2025-09-19 11:50:44 +05:30
Harsh Jha
d11b83a7fe Merge branch 'main' into sts-python-ci 2025-09-19 10:49:33 +05:30
Harsh Jha
beef5d0e22 Merge branch 'main' into sts-python-ci 2025-09-18 14:17:12 +05:30
Harsh Jha
dcaac13994 Merge branch 'main' into sts-python-ci 2025-09-18 12:36:28 +05:30
Harsh Jha
e67839b23c Merge branch 'main' into sts-python-ci 2025-09-18 12:36:06 +05:30
Twisha Bansal
93a591036d Update run_py_quickstart_tests.sh 2025-09-18 10:33:49 +05:30
Harsh Jha
02755b8799 Merge branch 'main' into sts-python-ci 2025-09-17 19:57:34 +05:30
Harsh Jha
f3808a57e1 chore: removed cloudsql public ip 2025-09-17 14:24:15 +05:30
Harsh Jha
8b26eea73c Merge branch 'main' into sts-python-ci 2025-09-17 14:14:06 +05:30
Harsh Jha
46fb0e9c07 chore: reverted unnecessary changes 2025-09-17 12:45:48 +05:30
Harsh Jha
f12f1aa3bf Merge branch 'sts-python-ci' of https://github.com/googleapis/genai-toolbox into sts-python-poc 2025-09-17 12:26:49 +05:30
Harsh Jha
6c35298be4 chore: rename script file 2025-09-17 12:24:42 +05:30
Harsh Jha
41f34dcf65 Merge branch 'main' into sts-python-ci 2025-09-17 12:12:10 +05:30
Harsh Jha
0fb0cc42f1 Merge branch 'main' into sts-python-ci 2025-09-17 12:01:46 +05:30
Harsh Jha
3b34766283 Merge branch 'sts-python-ci' of https://github.com/googleapis/genai-toolbox into sts-python-poc 2025-09-17 11:43:26 +05:30
Harsh Jha
01667447f7 Merge branch 'main' into sts-python-ci 2025-09-17 11:35:57 +05:30
Harsh Jha
8d18e6c835 Update .ci/quickstart_py.integration.cloudbuild.yaml
Co-authored-by: Twisha Bansal <58483338+twishabansal@users.noreply.github.com>
2025-09-16 23:15:12 +05:30
Harsh Jha
6b2cf28243 Merge branch 'sts-python-ci' of https://github.com/googleapis/genai-toolbox into sts-python-poc 2025-09-16 17:23:41 +05:30
Harsh Jha
c581a2c48e chore: addressed pr comments 2025-09-16 17:20:40 +05:30
Harsh Jha
d186003456 Merge branch 'sts-python-test' of https://github.com/googleapis/genai-toolbox into sts-python-ci 2025-09-16 17:12:51 +05:30
Harsh Jha
cad7305209 Merge branch 'sts-python-ci' of https://github.com/googleapis/genai-toolbox into sts-python-ci 2025-09-16 17:12:24 +05:30
Harsh Jha
4fd4387faf Update .ci/quickstart_py.sh
Co-authored-by: Twisha Bansal <58483338+twishabansal@users.noreply.github.com>
2025-09-16 17:11:35 +05:30
Harsh Jha
0e2541f1c4 chore: added header for test file 2025-09-16 17:05:47 +05:30
Harsh Jha
aa8a886a76 Merge branch 'main' into sts-python-test 2025-09-16 17:02:46 +05:30
Harsh Jha
42e8cd4c07 cloudsqlproxy polling mechanism 2025-09-16 16:49:03 +05:30
Harsh Jha
2a4bfb1045 nc 2025-09-16 16:42:30 +05:30
Harsh Jha
9cc85125b2 nc 2025-09-16 16:37:46 +05:30
Harsh Jha
81365831b5 netcat 2025-09-16 16:33:03 +05:30
Harsh Jha
3a8911fe96 netcat 2025-09-16 16:29:45 +05:30
Harsh Jha
05834f1cbf testing nc 2025-09-16 16:25:23 +05:30
Harsh Jha
4b94c0bf3e nc version 2025-09-16 16:20:56 +05:30
Harsh Jha
1a51a4c6ed Update netcat version to 1.10-46 2025-09-16 15:02:02 +05:30
Harsh Jha
17a845e10d Update netcat version to 1.10-46 2025-09-16 14:59:50 +05:30
Harsh Jha
f9cf22a824 chore: removed public ip connection method for cloudsql 2025-09-16 14:56:05 +05:30
Harsh Jha
b00f2b254c chore: added public ip of cloudsql 2025-09-16 14:41:23 +05:30
Harsh Jha
0450b83438 chore: removed cloudsql proxy 2025-09-16 13:04:19 +05:30
Harsh Jha
3d072c88ac chore: removed cloudsql proxy 2025-09-16 13:01:38 +05:30
Harsh Jha
a694b71331 chore: removed create table and substitution 2025-09-16 12:24:05 +05:30
Harsh Jha
8a316ecf4b chore: testing moving substitution at the end 2025-09-16 12:16:04 +05:30
Harsh Jha
e3842651c1 Merge branch 'sts-python-test' of https://github.com/googleapis/genai-toolbox into sts-python-ci 2025-09-16 12:02:17 +05:30
Harsh Jha
9315428714 Merge branch 'main' into sts-python-test 2025-09-16 11:58:59 +05:30
Harsh Jha
88d18ef21e testing: ci setup 2025-09-16 11:35:47 +05:30
Harsh Jha
9875d94fde applying changes for gettextbase 2025-09-15 21:37:40 +05:30
Harsh Jha
20e50ee003 update: gettext-base version 2025-09-15 21:32:49 +05:30
Harsh Jha
d8ad68b16c chore: jq not found 2025-09-15 21:26:57 +05:30
Harsh Jha
674372b774 chore: rename variable 2025-09-15 21:25:24 +05:30
Harsh Jha
af6f42ab0d chore: testing json format for requirement 2025-09-15 21:23:44 +05:30
Harsh Jha
4e082ea6ac chore: fix of cloudsql version variable 2025-09-15 20:59:06 +05:30
Harsh Jha
611789b440 chore: testing for script requirement file 2025-09-15 20:55:33 +05:30
Harsh Jha
ab9213c78a Update docs/en/getting-started/quickstart/python/__init__.py
Co-authored-by: Twisha Bansal <58483338+twishabansal@users.noreply.github.com>
2025-09-15 20:52:57 +05:30
Harsh Jha
3b81df3b33 Merge branch 'sts-python-ci' of https://github.com/googleapis/genai-toolbox into sts-python-poc 2025-09-15 15:39:08 +05:30
Harsh Jha
c9d6a29ab1 Merge branch 'sts-python-test' of https://github.com/googleapis/genai-toolbox into sts-python-ci 2025-09-15 15:37:48 +05:30
Harsh Jha
531df33c85 Merge branch 'main' into sts-python-test 2025-09-15 15:36:59 +05:30
Harsh Jha
a6125f0116 test: add dynamic module import and initialization for orchestration 2025-09-15 15:36:28 +05:30
Harsh Jha
f6a0a17c91 Merge branch 'sts-python-ci' of https://github.com/googleapis/genai-toolbox into sts-python-poc 2025-09-15 13:17:57 +05:30
Harsh Jha
2cc43faa7d ci: python sample testing 2025-09-15 13:17:23 +05:30
Harsh Jha
a62e59e8de chore: testing simplifying script file 2025-09-15 13:09:00 +05:30
Harsh Jha
d4b98918d6 chore: fix version of tools secret 2025-09-15 12:58:22 +05:30
Harsh Jha
d8d7c3e323 chore: testing approaches for sql file 2025-09-15 12:45:24 +05:30
Harsh Jha
67b829d8da Merge branch 'sts-python-ci' of https://github.com/googleapis/genai-toolbox into sts-python-poc 2025-09-15 12:37:12 +05:30
Harsh Jha
4b271cd196 chore: added gettext package 2025-09-15 12:36:33 +05:30
Harsh Jha
d8c5d1eb63 Merge branch 'sts-python-ci' of https://github.com/googleapis/genai-toolbox into sts-python-poc 2025-09-15 12:29:57 +05:30
Harsh Jha
c143427e85 chore: fix if statement in script file 2025-09-15 12:29:27 +05:30
Harsh Jha
85d4b26dd3 Merge branch 'sts-python-ci' of https://github.com/googleapis/genai-toolbox into sts-python-poc 2025-09-15 12:25:41 +05:30
Harsh Jha
9f11962714 chore: fix table creation 2025-09-15 12:25:05 +05:30
Harsh Jha
c0cc26b709 fix: comments 2025-09-15 12:14:59 +05:30
Harsh Jha
78cdf94700 Merge branch 'sts-python-test' of https://github.com/googleapis/genai-toolbox into sts-python-ci 2025-09-15 12:05:49 +05:30
Harsh Jha
29fbeef928 chore: removed unnecessary changes 2025-09-15 12:04:22 +05:30
Harsh Jha
0a79f2a116 chore: removed extra file 2025-09-15 11:56:24 +05:30
Harsh Jha
7ae06781e7 chore(python/quickstart): address multiple review comments in sample code 2025-09-15 11:56:24 +05:30
Harsh Jha
1ef9d1ff08 feat: Implement language-agnostic snippet shortcode 2025-09-15 11:56:23 +05:30
Harsh Jha
cf677a23d0 refactor(test): update quickstart tests to only check for error-free execution 2025-09-15 11:56:23 +05:30
Harsh Jha
0c0204cda4 refactor(docs): Replace duplicated content with a shared shortcode 2025-09-15 11:56:23 +05:30
Harsh Jha
3fb36e3fa5 feat: Add shortcode for including file regions 2025-09-15 11:56:23 +05:30
Harsh Jha
e07e5c8d0f refactor: ci setup for python sample 2025-09-15 11:35:08 +05:30
Ajaykumar Yadav
82ae0fbc76 feat(prebuilt/sqlite): prebuilt tools for the sqlite. (#1227)
## Description
Prebuilt tools for SQLite (see the configuration sketch after this list):
- [x] `list_table`, with simple and detailed (trigger, index, column) output for each table
- [x] `execute-sql` for executing any SQL statement against SQLite
- [x] added tests and made the required changes in config
- [x] **Documentation update**: done
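
As a rough illustration only, here is a minimal, hypothetical `tools.yaml` sketch for wiring these up by hand; the `sqlite` source fields and the exact tool kind names are assumptions and may differ from the shipped prebuilt configuration.

```yaml
sources:
  my-sqlite:
    kind: sqlite
    database: ./sample.db   # field name assumed for illustration

tools:
  execute_sql:
    kind: sqlite-execute-sql   # hypothetical kind mirroring the prebuilt execute-sql tool
    source: my-sqlite
    description: Execute any SQL statement against the SQLite database.
```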

## PR Checklist

- [x] Make sure you reviewed [CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [x] Make sure to open an issue as a [bug/issue](https://github.com/googleapis/langchain-google-alloydb-pg-python/issues/new/choose) before writing your code! That way we can discuss the change, evaluate designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [ ] Appropriate docs were updated (if necessary)
- [x] Make sure to add `!` if this involves a breaking change

🛠️ Fixes: https://github.com/googleapis/genai-toolbox/issues/1226

---------

Co-authored-by: Averi Kitsch <akitsch@google.com>
2025-09-15 11:35:08 +05:30
Huan Chen
f692cb7aa8 chore: add usage tracker for bigquery-conversational-analytics (#1442)
## Description

---
Add client_id_enum for usage tracking.

## PR Checklist

---
> Thank you for opening a Pull Request! Before submitting your PR, there are a few things you can do to make sure it goes smoothly:

- [ ] Make sure you reviewed [CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [ ] Make sure to open an issue as a [bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose) before writing your code! That way we can discuss the change, evaluate designs, and agree on the general idea
- [ ] Ensure the tests and linter pass
- [ ] Code coverage does not decrease (if any source code was changed)
- [ ] Appropriate docs were updated (if necessary)
- [ ] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #<issue_number_goes_here>

---------

Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2025-09-15 11:35:08 +05:30
Huan Chen
36e0f32ee5 chore: add retry to flaky integration tests in conversational analytics api (#1430)
## Description

Added retries to the bigquery-conversational-analytics API tests for timeout errors (25s in tests) and service-unavailable errors (code 503). Other errors will not be retried.

## PR Checklist

---
> Thank you for opening a Pull Request! Before submitting your PR, there are a few things you can do to make sure it goes smoothly:

- [ ] Make sure you reviewed [CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [ ] Make sure to open an issue as a [bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose) before writing your code! That way we can discuss the change, evaluate designs, and agree on the general idea
- [ ] Ensure the tests and linter pass
- [ ] Code coverage does not decrease (if any source code was changed)
- [ ] Appropriate docs were updated (if necessary)
- [ ] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #<issue_number_goes_here>
2025-09-15 11:35:08 +05:30
Wenxin Du
19364a6108 feat(tools/bigquery): Add useClientOAuth to BigQuery prebuilt source config (#1431)
Allow users to set `useClientOAuth` in the BQ prebuilt config
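
A minimal sketch of what the corresponding source entry might look like, assuming a standard `bigquery` source with a `project` field; only the `useClientOAuth` flag comes from this change.

```yaml
sources:
  my-bigquery:
    kind: bigquery
    project: my-gcp-project   # illustrative project id
    useClientOAuth: true      # forward the end user's OAuth credentials instead of ADC
```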
2025-09-15 11:35:08 +05:30
prernakakkar-google
079259ccb7 feat(prebuilt/cloudsql): Add create user tool for cloud sql (#1406)
## Description

---
This pull request introduces a new tool, cloud-sql-create-users, which
allows for the creation of both built-in and IAM users in a Cloud SQL
instance.
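
A hypothetical configuration sketch for the new tool; the source name below is illustrative and assumes the `cloud-sql-admin` source added in a related change.

```yaml
tools:
  create_user:
    kind: cloud-sql-create-users
    source: my-cloud-sql-admin   # assumed source name
    description: Create a built-in or IAM user in a Cloud SQL instance.
```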
<img width="518" height="846" alt="image"
src="https://github.com/user-attachments/assets/2f96f0be-658b-46d1-9de6-f47db2804274"
/>
<img width="518" height="956" alt="image"
src="https://github.com/user-attachments/assets/2a7d80d4-eab2-4e91-b08b-5fb78c150319"
/>


## PR Checklist

---
> Thank you for opening a Pull Request! Before submitting your PR, there are a few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed [CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [x] Make sure to open an issue as a [bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose) before writing your code! That way we can discuss the change, evaluate designs, and agree on the general idea
- [ ] Ensure the tests and linter pass
- [ ] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [ ] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #<issue_number_goes_here>
2025-09-15 11:35:08 +05:30
prernakakkar-google
68ffa75817 feat(prebuilt/cloudsql): Add cloud-sql-get-instances tool (#1383)
## Description

---
This pull request introduces the `cloud-sql-get-instances` tool, which
enables users to retrieve detailed information about a specified Cloud
SQL instance. This tool enhances the toolbox by providing a direct and
authenticated way to interact with the Google Cloud SQL Admin API.
Authentication is handled automatically by generating a bearer token
from the environment's Application Default Credentials with the
`https://www.googleapis.com/auth/sqlservice.admin` scope.
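
A minimal, hypothetical configuration sketch; only the tool kind comes from this PR, and the source name is illustrative.

```yaml
tools:
  get_instance:
    kind: cloud-sql-get-instances
    source: my-cloud-sql-admin   # assumed source name
    description: Retrieve detailed information about a specific Cloud SQL instance.
```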


<img width="282" height="1064" alt="image"
src="https://github.com/user-attachments/assets/253d3939-7de2-4324-bc2b-8a2eb20eb133"
/>



####


## PR Checklist

---
> Thank you for opening a Pull Request! Before submitting your PR, there are a few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed [CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [x] Make sure to open an issue as a [bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose) before writing your code! That way we can discuss the change, evaluate designs, and agree on the general idea - Tracked internally
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [ ] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #<issue_number_goes_here>
2025-09-15 11:35:08 +05:30
prernakakkar-google
8560dfb597 feat(prebuilt/cloudsql): Add list instances tool for cloudsql (#1310)
## Description
---
This change introduces a new tool, `cloudsqllistinstance`, to the
`cloudsql` toolset. This tool allows users to list all Cloud SQL
instances within a specified GCP project.

The implementation includes:

- The `cloudsqllistinstance` tool definition and logic, which makes an
authenticated GET request to the `sqladmin.googleapis.com` API.
- The tool takes a single required parameter: `project`.
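
Building on the points above, a hypothetical `tools.yaml` entry might look like the sketch below; only the kind name and the required `project` parameter come from this PR, the rest is assumed.

```yaml
tools:
  list_instances:
    kind: cloudsqllistinstance
    source: my-cloud-sql-admin   # assumed source name
    description: List all Cloud SQL instances in the given GCP project (required parameter: project).
```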
<img width="654" height="1406" alt="image"
src="https://github.com/user-attachments/assets/7c129a54-acb7-4695-9a0b-215914a6a273"
/>



## PR Checklist
---
> Thank you for opening a Pull Request! Before submitting your PR, there are a few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed [CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [x] Make sure to open an issue as a [bug/issue](https://github.com/googleapis/langchain-google-alloydb-pg-python/issues/new/choose) before writing your code! That way we can discuss the change, evaluate designs, and agree on the general idea - tracked internally
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [ ] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #<issue_number_goes_here>
2025-09-15 11:35:08 +05:30
Sri Varshitha
ff5442e273 feat(tools/alloydb-get-cluster): Add get-cluster tool for alloydb (#1420)
## Description

This pull request introduces a new custom tool kind
`alloydb-get-cluster` that retrieves detailed information for a specific
AlloyDB cluster.

### Example Configuration

```yaml
tools:
  get_cluster:
    kind: alloydb-get-cluster
    source: alloydb-admin-source
    description: Use this tool to retrieve detailed information for a specific AlloyDB cluster.
```

### Example Request
``` 
curl -X POST http://127.0.0.1:5000/api/tool/get_cluster/invoke \
-H "Content-Type: application/json" \
-d '{
    "projectId": "example-project",
    "locationId": "us-central1",
    "clusterId": "my-alloydb-cluster",
}'
```
## PR Checklist

---
> Thank you for opening a Pull Request! Before submitting your PR, there are a few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed [CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [ ] Make sure to open an issue as a [bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose) before writing your code! That way we can discuss the change, evaluate designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [x] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #<issue_number_goes_here>
2025-09-15 11:35:07 +05:30
Sri Varshitha
bed5eaff32 feat(tools/alloydb-list-instances): Add custom tool kind for AlloyDB list_instances (#1357)
## Description

---
This pull request introduces a new custom tool kind
`alloydb-list-instances` that allows users to list the AlloyDB instances
in a given project, cluster and location.

### Example Configuration

```yaml
tools:
  list_instances:
    kind: alloydb-list-instances
    source: alloydb-admin-source
    description: Use this tool to list all AlloyDB instances for a given project, cluster and location.
```

### Example Request
``` 
curl -X POST http://127.0.0.1:5000/api/tool/list_instances/invoke \
-H "Content-Type: application/json" \
-d '{
    "projectId": "example-project",
    "locationId": "us-central1",
    "clusterId": "example-cluster",
}'
```

## PR Checklist

---
> Thank you for opening a Pull Request! Before submitting your PR, there are a few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed [CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [ ] Make sure to open an issue as a [bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose) before writing your code! That way we can discuss the change, evaluate designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [x] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #<issue_number_goes_here>
2025-09-15 11:35:07 +05:30
Sri Varshitha
3baee129d7 feat(tools/alloydb-list-users): Add new custom tool kind for AlloyDB list_users (#1377)
## Description

---
This pull request introduces a new custom tool kind `alloydb-list-users`
that allows users to list all database users within an AlloyDB cluster.

### Example Configuration

```yaml
tools:
  list_users:
    kind: alloydb-list-users
    source: alloydb-admin-source
    description: Use this tool to list all database users within an AlloyDB cluster
```

### Example Request
``` 
curl -X POST http://127.0.0.1:5000/api/tool/list_users/invoke \
-H "Content-Type: application/json" \
-d '{
    "projectId": "example-project",
    "locationId": "us-central1",
    "clusterId": "example-cluster",
}'
```
## PR Checklist

---
> Thank you for opening a Pull Request! Before submitting your PR, there are a few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed [CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [ ] Make sure to open an issue as a [bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose) before writing your code! That way we can discuss the change, evaluate designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [x] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #<issue_number_goes_here>

---------

Co-authored-by: Averi Kitsch <akitsch@google.com>
2025-09-15 11:35:07 +05:30
Manu Paliwal
85f3154671 feat(observability): add cloud sql observability tools (#1425)
## Description
---
> This PR introduces new observability prebuilt configs that allow fetching system and query level metrics from the Cloud Monitoring API for CloudSQL PG, MySQL and SQLServer.
>
> The key changes include:
> - 3 new configs using the cloud-monitoring-query-prometheus tool and source cloud-monitoring
> - Manual testing is also done by locally running the server/tools and integrating with Gemini CLI

## Followup Changes

---
> - Documentation around the tools.

## PR Checklist

---
> Thank you for opening a Pull Request! Before submitting your PR, there are a few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed [CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [x] Make sure to open an issue as a [bug/issue](https://github.com/googleapis/langchain-google-alloydb-pg-python/issues/new/choose) before writing your code! That way we can discuss the change, evaluate designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [ ] Appropriate docs were updated (if necessary)
- [ ] Make sure to add `!` if this involves a breaking change

---------

Co-authored-by: prernakakkar-google <158031829+prernakakkar-google@users.noreply.github.com>
2025-09-15 11:35:07 +05:30
prernakakkar-google
cce2d5e0bc feat(tool/cloudsql): Add cloud sql wait for operation tool with exponential backoff (#1306)
## Description
---
This pull request introduces a new tool, `cloudsql-wait-for-operation`,
to improve the handling of long-running operations in Google Cloud SQL.

__Key Features:__

- __Asynchronous Operation Polling:__ The `cloudsql-wait-for-operation`
tool polls the Cloud SQL operations API at a specified interval until
the operation completes or fails. This is essential for managing
asynchronous tasks like instance and database creation, which can take
several minutes.
- __Configurable Retries:__ The tool includes configurable retry logic
with exponential backoff (`delay`, `maxDelay`, `multiplier`,
`maxRetries`) to handle transient network issues and make the polling
mechanism more resilient.
- __Improved User Experience:__ By waiting for operations to complete,
this tool provides a more synchronous-like experience for the user, who
can be confident that a resource is ready before the next step in a
workflow is executed.
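
A hypothetical configuration sketch showing the retry knobs named above; the field names come from this PR, while the values and the duration format are illustrative assumptions.

```yaml
tools:
  wait_for_operation:
    kind: cloudsql-wait-for-operation
    source: my-cloud-sql-admin   # assumed source name
    description: Poll a Cloud SQL operation until it completes or fails.
    delay: 1s        # initial polling delay (format assumed)
    maxDelay: 30s    # cap on the backoff delay
    multiplier: 2    # exponential backoff factor
    maxRetries: 10   # give up after this many attempts
```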

Tested (screenshot omitted).



## PR Checklist
---
> Thank you for opening a Pull Request! Before submitting your PR, there are a few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed [CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [x] Make sure to open an issue as a [bug/issue](https://github.com/googleapis/langchain-google-alloydb-pg-python/issues/new/choose) before writing your code! That way we can discuss the change, evaluate designs, and agree on the general idea - (Internal bug)
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [ ] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #<issue_number_goes_here>
2025-09-15 11:35:07 +05:30
Mend Renovate
bf57e26736 chore(deps): update module google.golang.org/genai to v1.24.0 (#1417)
This PR contains the following updates:

| Package | Change | Age | Confidence |
|---|---|---|---|
| [google.golang.org/genai](https://redirect.github.com/googleapis/go-genai) | `v1.23.0` -> `v1.24.0` | [![age](https://developer.mend.io/api/mc/badges/age/go/google.golang.org%2fgenai/v1.24.0?slim=true)](https://docs.renovatebot.com/merge-confidence/) | [![confidence](https://developer.mend.io/api/mc/badges/confidence/go/google.golang.org%2fgenai/v1.23.0/v1.24.0?slim=true)](https://docs.renovatebot.com/merge-confidence/) |

---

### Release Notes

<details>
<summary>googleapis/go-genai (google.golang.org/genai)</summary>

###
[`v1.24.0`](https://redirect.github.com/googleapis/go-genai/releases/tag/v1.24.0)

[Compare
Source](https://redirect.github.com/googleapis/go-genai/compare/v1.23.0...v1.24.0)

##### Features

- \[Python] Implement async embedding batches for MLDev.
([f32fb26](f32fb26a12))
- Add labels to create tuning job config
([c13a2a5](c13a2a5f68))
- generate the function\_call class's converters
([995a3ac](995a3acc0a))
- Support Veo 2 Editing on Vertex
([7fd6940](7fd694074b))

##### Bug Fixes

- Enable `id` field in `FunctionCall` for Vertex AI.
([a3f3c2b](a3f3c2b37e))

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] If you want to rebase/retry this PR, check this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).


Co-authored-by: Twisha Bansal <58483338+twishabansal@users.noreply.github.com>
2025-09-15 11:35:07 +05:30
Mend Renovate
e9410556bf chore(deps): update module cloud.google.com/go/dataplex to v1.27.0 (#1429)
This PR contains the following updates:

| Package | Change | Age | Confidence |
|---|---|---|---|
| [cloud.google.com/go/dataplex](https://redirect.github.com/googleapis/google-cloud-go) | `v1.26.0` -> `v1.27.0` | [![age](https://developer.mend.io/api/mc/badges/age/go/cloud.google.com%2fgo%2fdataplex/v1.27.0?slim=true)](https://docs.renovatebot.com/merge-confidence/) | [![confidence](https://developer.mend.io/api/mc/badges/confidence/go/cloud.google.com%2fgo%2fdataplex/v1.26.0/v1.27.0?slim=true)](https://docs.renovatebot.com/merge-confidence/) |

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] If you want to rebase/retry this PR, check this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).

2025-09-15 11:35:07 +05:30
Mend Renovate
627734ff01 chore(deps): update module github.com/redis/go-redis/v9 to v9.14.0 (#1405)
This PR contains the following updates:

| Package | Change | Age | Confidence |
|---|---|---|---|
| [github.com/redis/go-redis/v9](https://redirect.github.com/redis/go-redis) | `v9.13.0` -> `v9.14.0` | [![age](https://developer.mend.io/api/mc/badges/age/go/github.com%2fredis%2fgo-redis%2fv9/v9.14.0?slim=true)](https://docs.renovatebot.com/merge-confidence/) | [![confidence](https://developer.mend.io/api/mc/badges/confidence/go/github.com%2fredis%2fgo-redis%2fv9/v9.13.0/v9.14.0?slim=true)](https://docs.renovatebot.com/merge-confidence/) |

---

### Release Notes

<details>
<summary>redis/go-redis (github.com/redis/go-redis/v9)</summary>

###
[`v9.14.0`](https://redirect.github.com/redis/go-redis/releases/tag/v9.14.0):
9.14.0

[Compare
Source](https://redirect.github.com/redis/go-redis/compare/v9.13.0...v9.14.0)

#### Highlights

- Added batch process method to the pipeline
([#&#8203;3510](https://redirect.github.com/redis/go-redis/pull/3510))

### Changes

#### 🚀 New Features

- Added batch process method to the pipeline
([#&#8203;3510](https://redirect.github.com/redis/go-redis/pull/3510))

#### 🐛 Bug Fixes

- fix: SetErr on Cmd if the command cannot be queued correctly in
multi/exec
([#&#8203;3509](https://redirect.github.com/redis/go-redis/pull/3509))

#### 🧰 Maintenance

- Updates release drafter config to exclude dependabot
([#&#8203;3511](https://redirect.github.com/redis/go-redis/pull/3511))
- chore(deps): bump actions/setup-go from 5 to 6
([#&#8203;3504](https://redirect.github.com/redis/go-redis/pull/3504))

#### Contributors

We'd like to thank all the contributors who worked on this release!

[@&#8203;elena-kolevska](https://redirect.github.com/elena-kolevksa),
[@&#8203;htemelski-redis](https://redirect.github.com/htemelski-redis)
and [@&#8203;ndyakov](https://redirect.github.com/ndyakov)

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] If you want to rebase/retry this PR, check this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).


Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2025-09-15 11:35:07 +05:30
Twisha Bansal
c11bb06cb6 fix: update mcp implementation comments (#1285)
Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2025-09-15 11:35:07 +05:30
Yuan Teoh
f8d03bd974 docs: fix broken links in python quickstart (#1427) 2025-09-15 11:35:07 +05:30
Sri Varshitha
21bdaf4045 feat(tools/alloydb-list-cluster): Add custom tool kind for AlloyDB list_clusters (#1319)
## Description
---
This pull request introduces a new custom tool kind
`alloydb-list-clusters` that lists all AlloyDB clusters in a given
project and location.

### Example Configuration

```yaml
tools:
  list_clusters:
    kind: alloydb-list-clusters
    source: alloydb-admin-source
    description: Use this tool to list all AlloyDB clusters in a given project and location.
```

### Example Request
``` 
curl -X POST http://127.0.0.1:5000/api/tool/list_clusters/invoke \
-H "Content-Type: application/json" \
-d '{
    "projectId": "example-project",
    "locationId": "us-central1"
}'
```

## PR Checklist
---
> Thank you for opening a Pull Request! Before submitting your PR, there are a few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed [CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [ ] Make sure to open an issue as a [bug/issue](https://github.com/googleapis/langchain-google-alloydb-pg-python/issues/new/choose) before writing your code! That way we can discuss the change, evaluate designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [x] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #<issue_number_goes_here>

---------

Co-authored-by: Averi Kitsch <akitsch@google.com>
2025-09-15 11:35:07 +05:30
Wenxin Du
f3f92eeac3 docs(bigquery): Update end-user OAuth doc (#1373)
Now all BQ tools support end-user credentials passthrough.
2025-09-15 11:35:07 +05:30
Pete Hampton
47302f313e feat(clickhouse): add list-databases tool (#1274)
## Description
---

Adds a tool to list the databases in a ClickHouse cluster
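
A minimal, hypothetical configuration sketch; the source name and the exact kind string are assumptions, and only the list-databases capability comes from this PR.

```yaml
tools:
  list_databases:
    kind: clickhouse-list-databases   # assumed kind name
    source: my-clickhouse
    description: List all databases in the ClickHouse cluster.
```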

<img width="1440" height="813" alt="Screenshot 2025-08-28 at 09 58 15"
src="https://github.com/user-attachments/assets/73643f5d-0c37-4e58-a81c-47bc3a2a5f8e"
/>


## PR Checklist
---
> Thank you for opening a Pull Request! Before submitting your PR, there are a few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed [CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [x] Make sure to open an issue as a [bug/issue](https://github.com/googleapis/langchain-google-alloydb-pg-python/issues/new/choose) before writing your code! That way we can discuss the change, evaluate designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [ ] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #<issue_number_goes_here>

---------

Co-authored-by: Pete Hampton <pjhampton@users.noreply.github.com>
Co-authored-by: Averi Kitsch <akitsch@google.com>
2025-09-15 11:35:07 +05:30
Manu Paliwal
a30531a4e9 feat: Add cloud monitoring tool (#1311)
## Description
---
> This PR introduces a new observability tool that allows fetching system and query level metrics from the Cloud Monitoring API for AlloyDB instances.
>
> The key changes include:
> - A new observability tool that can make authenticated requests to the monitoring API.
> - Pre-built configurations in observability.yaml for fetching AlloyDB system and query metrics.
> - Unit and integration tests for the new observability tool.
> - List of metrics is fetched from the public documentation available at https://cloud.google.com/alloydb/docs/reference/system-insights-metrics
> - Manual testing is also done by locally running the server/tools and integrating with Gemini CLI

## Followup Changes

---
> - Similar observability tools for Postgres, MySQL and SQLServer.
> - Documentation around the tools.
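
As a rough sketch of how the pieces fit together, a hypothetical configuration using the `cloud-monitoring` source and the `cloud-monitoring-query-prometheus` tool kind named in the follow-up Cloud SQL change could look like this; the tool name and description are illustrative.

```yaml
sources:
  my-cloud-monitoring:
    kind: cloud-monitoring

tools:
  alloydb_system_metrics:
    kind: cloud-monitoring-query-prometheus
    source: my-cloud-monitoring
    description: Fetch AlloyDB system metrics (for example, CPU utilization) from the Cloud Monitoring API.
```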

## PR Checklist

---
> Thank you for opening a Pull Request! Before submitting your PR, there are a few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed [CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [x] Make sure to open an issue as a [bug/issue](https://github.com/googleapis/langchain-google-alloydb-pg-python/issues/new/choose) before writing your code! That way we can discuss the change, evaluate designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [ ] Appropriate docs were updated (if necessary)
- [ ] Make sure to add `!` if this involves a breaking change

Tested locally (screenshots omitted).
2025-09-15 11:35:06 +05:30
Sri Varshitha
3f21464d78 feat(tools/mysql-list-tables): Add new tool for mysql (#1287)
## Description
---
This pull request introduces a new custom tool kind `mysql-list-tables`
that allows users to list tables within a MySQL database.

### Example Configuration

```yaml
tools:
  mysql_list_tables:
    kind: mysql-list-tables
    source: mysql-source
    description: Use this tool to retrieve schema information for all or specified tables. Output format can be simple (only table names) or detailed.
```

### Example Request
``` 
curl -X POST http://127.0.0.1:5000/api/tool/mysql_list_tables/invoke \
-H "Content-Type: application/json" \
-d '{
    "table_names": "users",
    "output_format": "simple"
}'
```

## PR Checklist
---
> Thank you for opening a Pull Request! Before submitting your PR, there are a few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed [CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [ ] Make sure to open an issue as a [bug/issue](https://github.com/googleapis/langchain-google-alloydb-pg-python/issues/new/choose) before writing your code! That way we can discuss the change, evaluate designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [x] Make sure to add `!` if this involves a breaking change

---------

Co-authored-by: Averi Kitsch <akitsch@google.com>
2025-09-15 11:35:06 +05:30
Sri Varshitha
45821cf73e feat(tools/postgres-list-tables): Add new tool postgres source (#1284)
## Description
---
This pull request introduces a new custom tool kind
`postgres-list-tables` that allows users to list tables within a
Postgres database.

### Example Configuration

```yaml
tools:
  postgres_list_tables:
    kind: postgres-list-tables
    source: postgres-source
    description: Use this tool to retrieve schema information for all or specified tables. Output format can be simple (only table names) or detailed.
```

### Example Request
``` 
curl -X POST http://127.0.0.1:5000/api/tool/postgres_list_tables/invoke \
-H "Content-Type: application/json" \
-d '{
    "table_names": "users",
    "output_format": "simple"
}'
```

## PR Checklist
---
> Thank you for opening a Pull Request! Before submitting your PR, there are a few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed [CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [ ] Make sure to open an issue as a [bug/issue](https://github.com/googleapis/langchain-google-alloydb-pg-python/issues/new/choose) before writing your code! That way we can discuss the change, evaluate designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [x] Make sure to add `!` if this involves a breaking change

---------

Co-authored-by: Averi Kitsch <akitsch@google.com>
2025-09-15 11:35:06 +05:30
Sri Varshitha
a261188f4f feat: add alloydb admin source (#1369)
## Description

---
This pull request introduces a new source alloydb-admin which provides a
client for the Google Cloud AlloyDB API.

### Example Configuration

```yaml
sources:
  alloydb_admin_source:
    kind: alloydb-admin
```

## PR Checklist

---
> Thank you for opening a Pull Request! Before submitting your PR, there are a few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed [CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [ ] Make sure to open an issue as a [bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose) before writing your code! That way we can discuss the change, evaluate designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [x] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #<issue_number_goes_here>
2025-09-15 11:35:06 +05:30
prernakakkar-google
cd50e6b23b feat(source/cloudsqladmin): Add cloud sql admin source (#1408)
## Description

---
This PR introduces the `cloud-sql-admin` source, which provides a client
for the Google Cloud SQL Admin API.

```

sources:
  my-cloud-sql-admin:
    kind: cloud-sql-admin
```



## PR Checklist

---
> Thank you for opening a Pull Request! Before submitting your PR, there are a few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed [CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [x] Make sure to open an issue as a [bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose) before writing your code! That way we can discuss the change, evaluate designs, and agree on the general idea
- [ ] Ensure the tests and linter pass
- [ ] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [ ] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #<issue_number_goes_here>
2025-09-15 11:35:06 +05:30
Yuan Teoh
5bc42045bb docs: add tools naming convention (#1414)
Add guidelines for MCP Toolbox tools naming properties: **tool name**
and **tool kind**.
2025-09-15 11:35:06 +05:30
Mend Renovate
01f795b071 chore(deps): update module google.golang.org/api to v0.249.0 (#1374)
This PR contains the following updates:

| Package | Change | Age | Confidence |
|---|---|---|---|
| [google.golang.org/api](https://redirect.github.com/googleapis/google-api-go-client) | `v0.248.0` -> `v0.249.0` | [![age](https://developer.mend.io/api/mc/badges/age/go/google.golang.org%2fapi/v0.249.0?slim=true)](https://docs.renovatebot.com/merge-confidence/) | [![confidence](https://developer.mend.io/api/mc/badges/confidence/go/google.golang.org%2fapi/v0.248.0/v0.249.0?slim=true)](https://docs.renovatebot.com/merge-confidence/) |

---

### Release Notes

<details>
<summary>googleapis/google-api-go-client
(google.golang.org/api)</summary>

###
[`v0.249.0`](https://redirect.github.com/googleapis/google-api-go-client/releases/tag/v0.249.0)

[Compare
Source](https://redirect.github.com/googleapis/google-api-go-client/compare/v0.248.0...v0.249.0)

##### Features

- **all:** Auto-regenerate discovery clients
([#&#8203;3281](https://redirect.github.com/googleapis/google-api-go-client/issues/3281))
([c03d56b](c03d56bd80))
- **all:** Auto-regenerate discovery clients
([#&#8203;3283](https://redirect.github.com/googleapis/google-api-go-client/issues/3283))
([4d1d336](4d1d336897))
- **all:** Auto-regenerate discovery clients
([#&#8203;3284](https://redirect.github.com/googleapis/google-api-go-client/issues/3284))
([d2b41b6](d2b41b6ebb))
- **all:** Auto-regenerate discovery clients
([#&#8203;3285](https://redirect.github.com/googleapis/google-api-go-client/issues/3285))
([bcde943](bcde9430e3))
- **all:** Auto-regenerate discovery clients
([#&#8203;3286](https://redirect.github.com/googleapis/google-api-go-client/issues/3286))
([27530dd](27530ddc51))
- **all:** Auto-regenerate discovery clients
([#&#8203;3288](https://redirect.github.com/googleapis/google-api-go-client/issues/3288))
([240e54a](240e54a474))
- **all:** Auto-regenerate discovery clients
([#&#8203;3289](https://redirect.github.com/googleapis/google-api-go-client/issues/3289))
([2bd4a72](2bd4a72fae))
- **all:** Auto-regenerate discovery clients
([#&#8203;3291](https://redirect.github.com/googleapis/google-api-go-client/issues/3291))
([d09c7d3](d09c7d319a))
- **all:** Auto-regenerate discovery clients
([#&#8203;3292](https://redirect.github.com/googleapis/google-api-go-client/issues/3292))
([7b2235b](7b2235b382))
- **all:** Auto-regenerate discovery clients
([#&#8203;3293](https://redirect.github.com/googleapis/google-api-go-client/issues/3293))
([2e60cd5](2e60cd5bcc))
- **all:** Auto-regenerate discovery clients
([#&#8203;3294](https://redirect.github.com/googleapis/google-api-go-client/issues/3294))
([162aba4](162aba409d))
- **all:** Auto-regenerate discovery clients
([#&#8203;3295](https://redirect.github.com/googleapis/google-api-go-client/issues/3295))
([e297a42](e297a42860))
- **all:** Auto-regenerate discovery clients
([#&#8203;3296](https://redirect.github.com/googleapis/google-api-go-client/issues/3296))
([f98f835](f98f8357e2))
- **all:** Auto-regenerate discovery clients
([#&#8203;3297](https://redirect.github.com/googleapis/google-api-go-client/issues/3297))
([ababe60](ababe603c1))
- **all:** Auto-regenerate discovery clients
([#&#8203;3299](https://redirect.github.com/googleapis/google-api-go-client/issues/3299))
([ad70b6e](ad70b6eee9))
- **all:** Auto-regenerate discovery clients
([#&#8203;3300](https://redirect.github.com/googleapis/google-api-go-client/issues/3300))
([4fcd5bb](4fcd5bbd71))
- **all:** Auto-regenerate discovery clients
([#&#8203;3301](https://redirect.github.com/googleapis/google-api-go-client/issues/3301))
([6db0561](6db056121d))
- **all:** Auto-regenerate discovery clients
([#&#8203;3302](https://redirect.github.com/googleapis/google-api-go-client/issues/3302))
([79b251a](79b251ae0f))
- **all:** Auto-regenerate discovery clients
([#&#8203;3303](https://redirect.github.com/googleapis/google-api-go-client/issues/3303))
([e93c8a8](e93c8a870b))
- **all:** Auto-regenerate discovery clients
([#&#8203;3305](https://redirect.github.com/googleapis/google-api-go-client/issues/3305))
([1ca0330](1ca0330e52))

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] If you want to rebase/retry this PR, check this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).


Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2025-09-15 11:35:06 +05:30
Averi Kitsch
e9faa2ba57 chore: Add label for release candidate (#1413)
## Description

---
> Should include a concise description of the changes (bug or feature), its impact, along with a summary of the solution

## PR Checklist

---
> Thank you for opening a Pull Request! Before submitting your PR, there are a few things you can do to make sure it goes smoothly:

- [ ] Make sure you reviewed [CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [ ] Make sure to open an issue as a [bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose) before writing your code! That way we can discuss the change, evaluate designs, and agree on the general idea
- [ ] Ensure the tests and linter pass
- [ ] Code coverage does not decrease (if any source code was changed)
- [ ] Appropriate docs were updated (if necessary)
- [ ] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #<issue_number_goes_here>
2025-09-15 11:35:06 +05:30
Haoming Chen
77ba31846f feat(tools/bigquery-analyze-contribution): Add analyze contribution tool (#1223)
This tool creates a contribution analysis model and uses ml.get_insights to get the results.
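
A minimal, hypothetical configuration sketch for the new tool kind; the source name is illustrative and any model or analysis parameters are omitted.

```yaml
tools:
  analyze_contribution:
    kind: bigquery-analyze-contribution
    source: my-bigquery
    description: Create a contribution analysis model and return insights via ml.get_insights.
```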

---------

Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
Co-authored-by: Averi Kitsch <akitsch@google.com>
2025-09-15 11:35:06 +05:30
Niraj Nandre
bd28550fa3 feat(prebuilt/cloud-sql-mysql): Add env var support for IP Type (#1232) (#1347)
This PR fixes #1232 by allowing the `CLOUD_SQL_MYSQL_IP_TYPE`
environment variable to control the connection type for the
`cloud-sql-mysql` prebuilt tool, defaulting to "PUBLIC".

**Changes:**

* **`internal/prebuiltconfigs/tools/cloud-sql-mysql.yaml`:** Updated
`ipType` to use `${CLOUD_SQL_MYSQL_IP_TYPE:PUBLIC}`.
* **`internal/sources/cloudsqlmysql/cloud_sql_mysql.go`:** Removed
`validate:"required"` from the `IPType` field in the `Config` struct to
allow the default to be used when the env var is unset.
* **Documentation:** Updated relevant documentation to reflect the new
`CLOUD_SQL_MYSQL_IP_TYPE` environment variable usage.
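
For reference, the resulting prebuilt source entry should look roughly like the sketch below; only the `ipType` line reflects this PR, and the remaining connection fields are elided.

```yaml
sources:
  cloud-sql-mysql-source:
    kind: cloud-sql-mysql
    # ...project, region, instance, database, and credential fields unchanged...
    ipType: ${CLOUD_SQL_MYSQL_IP_TYPE:PUBLIC}   # falls back to PUBLIC when the env var is unset
```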

**Testing:**

Built a local Docker image and tested on a GCE VM:

*   Setting `CLOUD_SQL_MYSQL_IP_TYPE="private"` connects via Private IP.
*   Setting `CLOUD_SQL_MYSQL_IP_TYPE="PUBLIC"` connects via Public IP.
*   Leaving `CLOUD_SQL_MYSQL_IP_TYPE` unset defaults to Public IP.

All tests initialized the toolbox successfully.

Fixes #1232



## Description

---
> Should include a concise description of the changes (bug or feature), its impact, along with a summary of the solution

## PR Checklist

---
> Thank you for opening a Pull Request! Before submitting your PR, there are a few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed [CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [x] Make sure to open an issue as a [bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose) before writing your code! That way we can discuss the change, evaluate designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [ ] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #1232

---------

Co-authored-by: Averi Kitsch <akitsch@google.com>
2025-09-15 11:35:06 +05:30
Huan Chen
bf5ff627ca feat(source/bigquery): Add support for datasets selection (#1313)
## Description
---
- bigquery Source: The source configuration now supports a new
allowedDatasets field, which defines the list of datasets the tools are
allowed to access.
- bigquery-list-table-ids: Now verifies that the requested dataset is in
the allowed datasets list before listing its tables. An error is
returned if access is not permitted.
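
A minimal sketch of what the new source option could look like, assuming `allowedDatasets` is a plain YAML list; only the field name comes from this PR, and the exact shape is an assumption.

```yaml
sources:
  my-bigquery:
    kind: bigquery
    project: my-gcp-project   # illustrative
    allowedDatasets:
      - analytics
      - reporting
```
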
## PR Checklist
---
> Thank you for opening a Pull Request! Before submitting your PR, there are a few things you can do to make sure it goes smoothly:

- [ ] Make sure you reviewed [CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [ ] Make sure to open an issue as a [bug/issue](https://github.com/googleapis/langchain-google-alloydb-pg-python/issues/new/choose) before writing your code! That way we can discuss the change, evaluate designs, and agree on the general idea
- [ ] Ensure the tests and linter pass
- [ ] Code coverage does not decrease (if any source code was changed)
- [ ] Appropriate docs were updated (if necessary)
- [ ] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #<issue_number_goes_here>

---------

Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2025-09-15 11:35:06 +05:30
Mend Renovate
6f17188207 chore(deps): update module google.golang.org/genai to v1.23.0 (#1361)
This PR contains the following updates:

| Package | Change | Age | Confidence |
|---|---|---|---|
| [google.golang.org/genai](https://redirect.github.com/googleapis/go-genai) | `v1.21.0` -> `v1.23.0` | [![age](https://developer.mend.io/api/mc/badges/age/go/google.golang.org%2fgenai/v1.23.0?slim=true)](https://docs.renovatebot.com/merge-confidence/) | [![confidence](https://developer.mend.io/api/mc/badges/confidence/go/google.golang.org%2fgenai/v1.21.0/v1.23.0?slim=true)](https://docs.renovatebot.com/merge-confidence/) |

---

### Release Notes

<details>
<summary>googleapis/go-genai (google.golang.org/genai)</summary>

###
[`v1.23.0`](https://redirect.github.com/googleapis/go-genai/releases/tag/v1.23.0)

[Compare
Source](https://redirect.github.com/googleapis/go-genai/compare/v1.22.0...v1.23.0)

##### Features

- Add resolution field for Gemini Developer API Veo 3 generation
([b6a989c](b6a989cdca))

###
[`v1.22.0`](https://redirect.github.com/googleapis/go-genai/releases/tag/v1.22.0)

[Compare
Source](https://redirect.github.com/googleapis/go-genai/compare/v1.21.0...v1.22.0)

##### Features

- add `sdkHttpResponse.headers` to \*Delete responses.
([ac0e763](ac0e7632e5))
- Add add\_watermark field for recontext\_image (Virtual Try-On, Product
Recontext)
([07b6f57](07b6f573b2))
- Add output\_gcs\_uri to Imagen upscale\_image
([c08d9f3](c08d9f35c3))
- Add VALIDATED mode into FunctionCallingConfigMode
([c282e79](c282e79bed))
- Add VideoGenerationReferenceType enum for generate\_videos
([635b825](635b825bed))
- refactor Go SDK to use pointers for optional parameters
([3ff328a](3ff328ac19))
- support tunings.cancel in the genai SDK for Python, Java, JS, and Go
([8c46fd2](8c46fd26e1))

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] If you want to rebase/retry this PR, check this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).


Co-authored-by: Averi Kitsch <akitsch@google.com>
Co-authored-by: Anmol Shukla <shuklaanmol@google.com>
2025-09-15 11:35:06 +05:30
Harsh Jha
e4a0bb8220 Update docs/en/getting-started/quickstart/python/quickstart_test.py
Co-authored-by: Twisha Bansal <58483338+twishabansal@users.noreply.github.com>
2025-09-15 11:35:05 +05:30
Anmol Shukla
46d4a6e86d docs: updated google api key conf (#1398)
Co-authored-by: Twisha Bansal <58483338+twishabansal@users.noreply.github.com>
2025-09-15 11:35:05 +05:30
Mend Renovate
2e239424e0 chore(deps): update module github.com/googleapis/mcp-toolbox-sdk-go to v0.3.0 (#1360)
This PR contains the following updates:

| Package | Change | Age | Confidence |
|---|---|---|---|
| [github.com/googleapis/mcp-toolbox-sdk-go](https://redirect.github.com/googleapis/mcp-toolbox-sdk-go) | `v0.2.0` -> `v0.3.0` | [![age](https://developer.mend.io/api/mc/badges/age/go/github.com%2fgoogleapis%2fmcp-toolbox-sdk-go/v0.3.0?slim=true)](https://docs.renovatebot.com/merge-confidence/) | [![confidence](https://developer.mend.io/api/mc/badges/confidence/go/github.com%2fgoogleapis%2fmcp-toolbox-sdk-go/v0.2.0/v0.3.0?slim=true)](https://docs.renovatebot.com/merge-confidence/) |

---

### Release Notes

<details>
<summary>googleapis/mcp-toolbox-sdk-go
(github.com/googleapis/mcp-toolbox-sdk-go)</summary>

###
[`v0.3.0`](https://redirect.github.com/googleapis/mcp-toolbox-sdk-go/releases/tag/v0.3.0)

[Compare
Source](https://redirect.github.com/googleapis/mcp-toolbox-sdk-go/compare/v0.2.0...v0.3.0)

##### Features

- Add support for Map parameter type
([#&#8203;51](https://redirect.github.com/googleapis/mcp-toolbox-sdk-go/issues/51))
([c80d8d6](c80d8d6e2b))
- Add support for nested maps in generic map parameter
([#&#8203;61](https://redirect.github.com/googleapis/mcp-toolbox-sdk-go/issues/61))
([3b33c52](3b33c52150))

##### Miscellaneous Chores

- Release 0.3.0
([#&#8203;59](https://redirect.github.com/googleapis/mcp-toolbox-sdk-go/issues/59))
([a89adcf](a89adcf440))

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] If you want to rebase/retry this PR, check this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).


Co-authored-by: Twisha Bansal <58483338+twishabansal@users.noreply.github.com>
2025-09-15 11:35:05 +05:30
Yuan Teoh
d66c010b9d chore: fix lint in recent merged prs (#1396)
Run `golangci-lint` on recently merged PRs.
2025-09-15 11:35:05 +05:30
Wenxin Du
67e890edf8 fix(bigquery)!: Add Bearer parsing to auth token (#1386)
Previously we propagated tokens directly to the BQ API, but MCP Inspector
adds a "Bearer" prefix to all authorization headers. We need to parse the
token accordingly to make it work.
2025-09-15 11:35:05 +05:30
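
For illustration only (this is not the code from #1386), a minimal Go sketch of the kind of normalization described above would strip an optional `Bearer ` prefix before forwarding the credential:

```go
// token_sketch.go — hypothetical illustration, not the actual PR implementation.
package main

import (
	"fmt"
	"strings"
)

// normalizeAuthToken strips an optional "Bearer " prefix so the bare token
// can be passed to the BigQuery API.
func normalizeAuthToken(header string) string {
	if token, ok := strings.CutPrefix(header, "Bearer "); ok {
		return strings.TrimSpace(token)
	}
	return strings.TrimSpace(header)
}

func main() {
	fmt.Println(normalizeAuthToken("Bearer ya29.example-token")) // prefix stripped
	fmt.Println(normalizeAuthToken("ya29.example-token"))        // unchanged
}
```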
trehanshakuntG
70b849ec24 feat(tools/firestore): Add firestore-query tool (#1305)
## Description
---

This PR introduces a new tool kind `firestore-query` that enables
parameterized querying of Firestore collections with support for
Firestore native JSON value types, ensuring proper type handling for
complex queries.

### Feature

A new Firestore tool that allows:

- __Parameterized collection paths, filters, select, orderBy, limit and
analyzeQuery__ using Go template syntax
- __Native JSON value type support__ for proper type handling in queries
- __Complex filter structures__ with AND/OR logical operators
- __Dynamic query building__ with template parameter substitution

Example usage:
<img width="761" height="721" alt="Screenshot 2025-09-09 at 1 21 16 PM"
src="https://github.com/user-attachments/assets/bb359ea8-f750-492d-9f13-cef8f3b6bfd1"
/>
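
As a rough, hypothetical sketch of the template-substitution idea described above (the parameter names and filter shape below are illustrative, not taken from the tool's implementation), parameters can be rendered into a filter template and decoded into native JSON values:

```go
// filter_sketch.go — illustrative only; not code from the firestore-query tool.
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"text/template"
)

// A parameterized filter template using Go template syntax.
const filterTmpl = `{"and": [
  {"field": "price", "op": "<=", "value": {{.MaxPrice}}},
  {"field": "category", "op": "==", "value": {{printf "%q" .Category}}}
]}`

func main() {
	params := map[string]any{"MaxPrice": 250, "Category": "books"}

	// Substitute parameters into the filter template.
	var buf bytes.Buffer
	tmpl := template.Must(template.New("filter").Parse(filterTmpl))
	if err := tmpl.Execute(&buf, params); err != nil {
		panic(err)
	}

	// Decode into native JSON values so numbers, strings, and booleans
	// keep their types when the query is assembled.
	var filter map[string]any
	if err := json.Unmarshal(buf.Bytes(), &filter); err != nil {
		panic(err)
	}
	fmt.Printf("%#v\n", filter)
}
```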



## PR Checklist
---
> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:
- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [x] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/langchain-google-alloydb-pg-python/issues/new/choose)
before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [x] Make sure to add `!` if this involves a breaking change
2025-09-15 11:35:05 +05:30
Averi Kitsch
708c8f311f feat(prebuilt): Update default values for prebuilt tools (#1355)
## Description
Provide default values for prebuilt tools

## PR Checklist

---
> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:

- [ ] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [ ] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [ ] Ensure the tests and linter pass
- [ ] Code coverage does not decrease (if any source code was changed)
- [ ] Appropriate docs were updated (if necessary)
- [ ] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #<issue_number_goes_here>
2025-09-15 11:35:05 +05:30
Harsh Jha
b1e616f319 Merge branch 'main' into sts-python-test 2025-09-15 11:23:17 +05:30
Harsh Jha
81f0a15a4f Update docs/en/getting-started/quickstart/python/quickstart_test.py
Co-authored-by: Twisha Bansal <58483338+twishabansal@users.noreply.github.com>
2025-09-10 14:26:14 +05:30
Harsh Jha
4adc455cab Merge branch 'main' into sts-python-test 2025-09-10 12:27:47 +05:30
Harsh Jha
88a929d0b7 Merge branch 'sts-python-ci' of https://github.com/googleapis/genai-toolbox into sts-python-poc 2025-09-10 01:54:41 +05:30
Harsh Jha
a06a4226ae fix: psql error 2025-09-10 01:54:14 +05:30
Harsh Jha
356403674a Merge branch 'sts-python-ci' of https://github.com/googleapis/genai-toolbox into sts-python-poc 2025-09-10 01:40:14 +05:30
Harsh Jha
af48843fb4 chore: cleanup of script 2025-09-10 01:39:38 +05:30
Harsh Jha
f9856a5fc7 Merge branch 'sts-python-ci' of https://github.com/googleapis/genai-toolbox into sts-python-poc 2025-09-10 00:44:17 +05:30
Harsh Jha
1573e8ce9a chore: updated script for single quickstart_test.py 2025-09-10 00:42:20 +05:30
Harsh Jha
475b71051c chore: rolled back to previous script 2025-09-10 00:40:31 +05:30
Harsh Jha
41d447de9f chore: fix of installing requirements.txt 2025-09-10 00:33:31 +05:30
Harsh Jha
41c71c0098 fix: requirements.txt installation 2025-09-10 00:14:20 +05:30
Harsh Jha
cad3a8acf2 fix: requirements.txt installation 2025-09-10 00:05:19 +05:30
Harsh Jha
18adf98106 fix: requirements.txt installation 2025-09-10 00:02:27 +05:30
Harsh Jha
85417324d3 MERGE sts-python-ci into sts-python-poc 2025-09-09 23:54:32 +05:30
Harsh Jha
59ae6b5208 new: ci files for py quickstart sample testing 2025-09-09 23:49:13 +05:30
Harsh Jha
82c86cfeab Merge branch 'main' into sts-python-test 2025-09-09 23:16:27 +05:30
Harsh Jha
1a272a919f chore: removed redundant test files of py quickstart sample 2025-09-09 23:14:13 +05:30
Harsh Jha
85f7e86104 fix: await of adk quickstart 2025-09-09 20:54:46 +05:30
Harsh Jha
202a30df2b Merge branch 'sts-python-test' of https://github.com/googleapis/genai-toolbox into sts-python-poc 2025-09-09 20:53:32 +05:30
Harsh Jha
936f0d5d49 update: adk version in py quickstart 2025-09-09 20:52:50 +05:30
Harsh Jha
ef86e37207 Merge branch 'main' into sts-python-test 2025-09-09 20:32:07 +05:30
Harsh Jha
10517eaf2e fix: project id for llamaindex py quickstart sample 2025-09-09 20:31:10 +05:30
Harsh Jha
60e5bcf45d fix: core quickstart sample 2025-09-09 20:21:00 +05:30
Harsh Jha
a628bcc97a fix: os import in core py sample 2025-09-09 20:13:47 +05:30
Harsh Jha
05de1b6eb4 fix: project id for quickstart core 2025-09-09 20:09:21 +05:30
Harsh Jha
83f21fb563 fix: google api key 2025-09-09 20:02:55 +05:30
Harsh Jha
bbf74bf241 chore: await keyword 2025-09-09 19:57:58 +05:30
Harsh Jha
730ff27485 Merge branch 'sts-python-ci' of https://github.com/googleapis/genai-toolbox into sts-python-poc 2025-09-09 19:48:32 +05:30
Harsh Jha
c1be3a730f new: ci files for py quickstart samples 2025-09-09 19:19:35 +05:30
Harsh Jha
72d70180ae new: ci files for py quickstart samples 2025-09-09 19:11:54 +05:30
Harsh Jha
1de1d8a8c7 Merge branch 'main' into sts-python-test 2025-09-09 19:06:58 +05:30
Harsh Jha
9f411c0bd6 updated: version of python packages in sample quickstart 2025-09-08 12:52:13 +05:30
Harsh Jha
3c81ee2b87 chore: corrected the name of test class 2025-09-08 12:49:34 +05:30
Harsh Jha
8c1467d8a9 Merge branch 'main' into sts-python-test 2025-09-03 10:53:23 +05:30
Harsh Jha
623b91af84 Merge branch 'main' into sts-python-test 2025-09-02 15:28:47 +05:30
Harsh Jha
e1cc144e72 Remove unnecessary newline in run_application function 2025-09-02 15:26:56 +05:30
Harsh Jha
1a0386b360 chore(python/quickstart): address multiple review comments in sample code 2025-09-02 15:25:48 +05:30
Harsh Jha
7dfdede4be test: Remove redundant script output check from tests 2025-09-02 15:17:50 +05:30
Harsh Jha
25e116584a fix(python/quickstart): pin exact versions in requirements.txt 2025-09-02 15:17:50 +05:30
Harsh Jha
e0d6c1f6d8 fix(python/quickstart): revert code sample in all python quickstart examples 2025-09-02 15:17:50 +05:30
Harsh Jha
7ffb0e2d82 feat: Implement language-agnostic snippet shortcode 2025-09-02 15:17:50 +05:30
Harsh Jha
0e23d73ea2 refactor(docs): Fix Table of Contents and optimize region include shortcode 2025-09-02 15:17:50 +05:30
Harsh Jha
79adf9d70b refactor(docs): Replace duplicated content with a shared shortcode 2025-09-02 15:17:50 +05:30
Harsh Jha
c97b89555e feat: Add shortcode for including file regions 2025-09-02 15:17:50 +05:30
Harsh Jha
018edb5d61 Merge branch 'main' into sts-python-test 2025-09-01 10:43:51 +05:30
Harsh Jha
a5beea5756 chore(docs/python): remove compile-time test from quickstart samples 2025-09-01 10:42:40 +05:30
Harsh Jha
1130354ac5 refactor: replace duplicate snippet code with regionInclude 2025-08-29 10:47:03 +05:30
Harsh Jha
28021d760e Merge branch 'main' of https://github.com/googleapis/genai-toolbox into sts-python-test 2025-08-29 10:39:56 +05:30
Harsh Jha
ff7f34bba6 feat: remove duplicate snippet shortcode 2025-08-29 10:39:44 +05:30
Harsh Jha
a93ab0c25a Merge branch 'main' into sts-python-test 2025-08-28 17:17:01 +05:30
Harsh Jha
921db2a546 refactor(test): update quickstart tests to only check for error-free execution 2025-08-28 16:08:34 +05:30
Harsh Jha
5b7cc83472 feat: add pytest-based test infrastructure for Python quickstart frameworks 2025-08-21 14:27:04 +05:30
Harsh Jha
e5922a69ec Merge branch 'sample-testing-strategy-python' of https://github.com/googleapis/genai-toolbox into sts-python-test 2025-08-20 19:46:37 +05:30
Harsh Jha
faa79cd3c1 Update quickstart.py 2025-08-20 18:21:11 +05:30
Harsh Jha
959941d4ae feat(adk): add Python quickstart testing infrastructure with requirements.txt, golden.txt, and quickstart_test.py 2025-08-20 18:14:31 +05:30
Harsh Jha
7dc77fec19 fix(quickstart): remove angle brackets from ToolboxSyncClient URL in Python ADK 2025-08-20 18:09:48 +05:30
Harsh Jha
061f38382e feat(python): add quickstart samples for adk, core, langchain, and llamaindex 2025-08-19 17:14:44 +05:30
Harsh Jha
64e64a0d51 Update quickstart docs and shortcodes for improved region includes 2025-08-19 16:34:13 +05:30
Anmol Shukla
5e0a1b03ab Merge branch 'main' into docs/add-region-shortcode 2025-08-19 14:28:14 +05:30
Anmol Shukla
b7cf0562ed Merge branch 'main' into docs/add-region-shortcode 2025-08-18 14:16:04 +05:30
Anmol Shukla
8309816f44 Merge branch 'main' into docs/add-region-shortcode 2025-08-14 15:12:34 +05:30
Harsh Jha
114a0c91d8 feat: Implement language-agnostic snippet shortcode 2025-08-05 14:39:53 +05:30
Harsh Jha
5f1e4b940c refactor(docs): Fix Table of Contents and optimize region include shortcode 2025-08-01 19:56:13 +05:30
Harsh Jha
e0b6d2d26b refactor(docs): Replace duplicated content with a shared shortcode 2025-07-31 18:52:46 +05:30
Harsh Jha
590bfaf4d3 feat: Add shortcode for including file regions 2025-07-31 11:31:19 +05:30
275 changed files with 1600 additions and 10829 deletions

View File

@@ -194,26 +194,6 @@ steps:
dataplex \
dataplex
- id: "dataform"
name: golang:1
waitFor: ["compile-test-binary"]
entrypoint: /bin/bash
env:
- "GOPATH=/gopath"
secretEnv: ["CLIENT_ID"]
volumes:
- name: "go"
path: "/gopath"
args:
- -c
- |
apt-get update && apt-get install -y npm && \
npm install -g @dataform/cli && \
.ci/test_with_coverage.sh \
"Dataform" \
dataform \
dataform
- id: "postgres"
name: golang:1
waitFor: ["compile-test-binary"]
@@ -537,8 +517,6 @@ steps:
- "FIRESTORE_PROJECT=$PROJECT_ID"
- "SERVICE_ACCOUNT_EMAIL=$SERVICE_ACCOUNT_EMAIL"
- "LOOKER_VERIFY_SSL=$_LOOKER_VERIFY_SSL"
- "LOOKER_PROJECT=$_LOOKER_PROJECT"
- "LOOKER_LOCATION=$_LOOKER_LOCATION"
secretEnv:
[
"CLIENT_ID",
@@ -684,51 +662,6 @@ steps:
- |
./yugabytedb.test -test.v
- id: "cassandra"
name: golang:1
waitFor: ["compile-test-binary"]
entrypoint: /bin/bash
env:
- "GOPATH=/gopath"
- "SERVICE_ACCOUNT_EMAIL=$SERVICE_ACCOUNT_EMAIL"
secretEnv: ["CLIENT_ID", "CASSANDRA_USER", "CASSANDRA_PASS", "CASSANDRA_HOST"]
volumes:
- name: "go"
path: "/gopath"
args:
- -c
- |
.ci/test_with_coverage.sh \
"Cassandra" \
cassandra \
cassandra
- id: "oracle"
name: ghcr.io/oracle/oraclelinux8-instantclient:21
waitFor: ["install-dependencies"]
entrypoint: /bin/bash
env:
- "GOPATH=/gopath"
- "SERVICE_ACCOUNT_EMAIL=$SERVICE_ACCOUNT_EMAIL"
- "ORACLE_SERVER_NAME=$_ORACLE_SERVER_NAME"
secretEnv: ["CLIENT_ID", "ORACLE_USER", "ORACLE_PASS", "ORACLE_HOST"]
volumes:
- name: "go"
path: "/gopath"
args:
- -c
- |
# Install the C compiler and Oracle SDK headers needed for cgo
dnf install -y gcc oracle-instantclient-devel
# Install Go
curl -L -o go.tar.gz "https://go.dev/dl/go1.25.1.linux-amd64.tar.gz"
tar -C /usr/local -xzf go.tar.gz
export PATH="/usr/local/go/bin:$$PATH"
go test ./tests/oracle
availableSecrets:
secretManager:
- versionName: projects/$PROJECT_ID/secrets/cloud_sql_pg_user/versions/latest
@@ -813,18 +746,6 @@ availableSecrets:
env: YUGABYTEDB_USER
- versionName: projects/$PROJECT_ID/secrets/yugabytedb_pass/versions/latest
env: YUGABYTEDB_PASS
- versionName: projects/$PROJECT_ID/secrets/cassandra_user/versions/latest
env: CASSANDRA_USER
- versionName: projects/$PROJECT_ID/secrets/cassandra_pass/versions/latest
env: CASSANDRA_PASS
- versionName: projects/$PROJECT_ID/secrets/cassandra_host/versions/latest
env: CASSANDRA_HOST
- versionName: projects/$PROJECT_ID/secrets/oracle_user/versions/latest
env: ORACLE_USER
- versionName: projects/$PROJECT_ID/secrets/oracle_pass/versions/latest
env: ORACLE_PASS
- versionName: projects/$PROJECT_ID/secrets/oracle_host/versions/latest
env: ORACLE_HOST
options:
logging: CLOUD_LOGGING_ONLY
@@ -857,8 +778,6 @@ substitutions:
_DGRAPHURL: "https://play.dgraph.io"
_COUCHBASE_BUCKET: "couchbase-bucket"
_COUCHBASE_SCOPE: "couchbase-scope"
_LOOKER_LOCATION: "us"
_LOOKER_PROJECT: "149671255749"
_LOOKER_VERIFY_SSL: "true"
_TIDB_HOST: 127.0.0.1
_TIDB_PORT: "4000"
@@ -876,4 +795,3 @@ substitutions:
_YUGABYTEDB_DATABASE: "yugabyte"
_YUGABYTEDB_PORT: "5433"
_YUGABYTEDB_LOADBALANCE: "false"
_ORACLE_SERVER_NAME: "FREEPDB1"

View File

@@ -1,47 +0,0 @@
# Copyright 2025 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
steps:
- name: 'golang:1.25.1'
id: 'go-quickstart-test'
entrypoint: 'bash'
args:
# The '-c' flag tells bash to execute the following string as a command.
# The 'set -ex' enables debug output and exits on error for easier troubleshooting.
- -c
- |
set -ex
export VERSION=$(cat ./cmd/version.txt)
chmod +x .ci/quickstart_test/run_go_tests.sh
.ci/quickstart_test/run_go_tests.sh
env:
- 'CLOUD_SQL_INSTANCE=${_CLOUD_SQL_INSTANCE}'
- 'GCP_PROJECT=${_GCP_PROJECT}'
- 'DATABASE_NAME=${_DATABASE_NAME}'
- 'DB_USER=${_DB_USER}'
secretEnv: ['TOOLS_YAML_CONTENT', 'GOOGLE_API_KEY', 'DB_PASSWORD']
availableSecrets:
secretManager:
- versionName: projects/${_GCP_PROJECT}/secrets/${_TOOLS_YAML_SECRET}/versions/7
env: 'TOOLS_YAML_CONTENT'
- versionName: projects/${_GCP_PROJECT_NUMBER}/secrets/${_API_KEY_SECRET}/versions/latest
env: 'GOOGLE_API_KEY'
- versionName: projects/${_GCP_PROJECT}/secrets/${_DB_PASS_SECRET}/versions/latest
env: 'DB_PASSWORD'
timeout: 1000s
options:
logging: CLOUD_LOGGING_ONLY

View File

@@ -1,47 +0,0 @@
# Copyright 2025 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
steps:
- name: 'node:20'
id: 'js-quickstart-test'
entrypoint: 'bash'
args:
# The '-c' flag tells bash to execute the following string as a command.
# The 'set -ex' enables debug output and exits on error for easier troubleshooting.
- -c
- |
set -ex
export VERSION=$(cat ./cmd/version.txt)
chmod +x .ci/quickstart_test/run_js_tests.sh
.ci/quickstart_test/run_js_tests.sh
env:
- 'CLOUD_SQL_INSTANCE=${_CLOUD_SQL_INSTANCE}'
- 'GCP_PROJECT=${_GCP_PROJECT}'
- 'DATABASE_NAME=${_DATABASE_NAME}'
- 'DB_USER=${_DB_USER}'
secretEnv: ['TOOLS_YAML_CONTENT', 'GOOGLE_API_KEY', 'DB_PASSWORD']
availableSecrets:
secretManager:
- versionName: projects/${_GCP_PROJECT}/secrets/${_TOOLS_YAML_SECRET}/versions/6
env: 'TOOLS_YAML_CONTENT'
- versionName: projects/${_GCP_PROJECT_NUMBER}/secrets/${_API_KEY_SECRET}/versions/latest
env: 'GOOGLE_API_KEY'
- versionName: projects/${_GCP_PROJECT}/secrets/${_DB_PASS_SECRET}/versions/latest
env: 'DB_PASSWORD'
timeout: 1000s
options:
logging: CLOUD_LOGGING_ONLY

View File

@@ -1,125 +0,0 @@
# Copyright 2025 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#!/bin/bash
set -e
TABLE_NAME="hotels_go"
QUICKSTART_GO_DIR="docs/en/getting-started/quickstart/go"
SQL_FILE=".ci/quickstart_test/setup_hotels_sample.sql"
PROXY_PID=""
TOOLBOX_PID=""
install_system_packages() {
apt-get update && apt-get install -y \
postgresql-client \
wget \
gettext-base \
netcat-openbsd
}
start_cloud_sql_proxy() {
wget "https://storage.googleapis.com/cloud-sql-connectors/cloud-sql-proxy/v2.10.0/cloud-sql-proxy.linux.amd64" -O /usr/local/bin/cloud-sql-proxy
chmod +x /usr/local/bin/cloud-sql-proxy
cloud-sql-proxy "${CLOUD_SQL_INSTANCE}" &
PROXY_PID=$!
for i in {1..30}; do
if nc -z 127.0.0.1 5432; then
echo "Cloud SQL Proxy is up and running."
return
fi
sleep 1
done
echo "Cloud SQL Proxy failed to start within the timeout period."
exit 1
}
setup_toolbox() {
TOOLBOX_YAML="/tools.yaml"
echo "${TOOLS_YAML_CONTENT}" > "$TOOLBOX_YAML"
if [ ! -f "$TOOLBOX_YAML" ]; then echo "Failed to create tools.yaml"; exit 1; fi
wget "https://storage.googleapis.com/genai-toolbox/v${VERSION}/linux/amd64/toolbox" -O "/toolbox"
chmod +x "/toolbox"
/toolbox --tools-file "$TOOLBOX_YAML" &
TOOLBOX_PID=$!
sleep 2
}
setup_orch_table() {
export TABLE_NAME
envsubst < "$SQL_FILE" | psql -h "$PGHOST" -p "$PGPORT" -U "$DB_USER" -d "$DATABASE_NAME"
}
run_orch_test() {
local orch_dir="$1"
local orch_name
orch_name=$(basename "$orch_dir")
if [ "$orch_name" == "openAI" ]; then
echo -e "\nSkipping framework '${orch_name}': Temporarily excluded."
return
fi
(
set -e
setup_orch_table
echo "--- Preparing module for $orch_name ---"
cd "$orch_dir"
if [ -f "go.mod" ]; then
go mod tidy
fi
cd ..
export ORCH_NAME="$orch_name"
echo "--- Running tests for $orch_name ---"
go test -v ./...
)
}
cleanup_all() {
echo "--- Final cleanup: Shutting down processes and dropping table ---"
if [ -n "$TOOLBOX_PID" ]; then
kill $TOOLBOX_PID || true
fi
if [ -n "$PROXY_PID" ]; then
kill $PROXY_PID || true
fi
}
trap cleanup_all EXIT
# Main script execution
install_system_packages
start_cloud_sql_proxy
export PGHOST=127.0.0.1
export PGPORT=5432
export PGPASSWORD="$DB_PASSWORD"
export GOOGLE_API_KEY="$GOOGLE_API_KEY"
setup_toolbox
for ORCH_DIR in "$QUICKSTART_GO_DIR"/*/; do
if [ ! -d "$ORCH_DIR" ]; then
continue
fi
run_orch_test "$ORCH_DIR"
done

View File

@@ -1,123 +0,0 @@
# Copyright 2025 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#!/bin/bash
set -e
TABLE_NAME="hotels_js"
QUICKSTART_JS_DIR="docs/en/getting-started/quickstart/js"
SQL_FILE=".ci/quickstart_test/setup_hotels_sample.sql"
# Initialize process IDs to empty at the top of the script
PROXY_PID=""
TOOLBOX_PID=""
install_system_packages() {
apt-get update && apt-get install -y \
postgresql-client \
wget \
gettext-base \
netcat-openbsd
}
start_cloud_sql_proxy() {
wget "https://storage.googleapis.com/cloud-sql-connectors/cloud-sql-proxy/v2.10.0/cloud-sql-proxy.linux.amd64" -O /usr/local/bin/cloud-sql-proxy
chmod +x /usr/local/bin/cloud-sql-proxy
cloud-sql-proxy "${CLOUD_SQL_INSTANCE}" &
PROXY_PID=$!
for i in {1..30}; do
if nc -z 127.0.0.1 5432; then
echo "Cloud SQL Proxy is up and running."
return
fi
sleep 1
done
echo "Cloud SQL Proxy failed to start within the timeout period."
exit 1
}
setup_toolbox() {
TOOLBOX_YAML="/tools.yaml"
echo "${TOOLS_YAML_CONTENT}" > "$TOOLBOX_YAML"
if [ ! -f "$TOOLBOX_YAML" ]; then echo "Failed to create tools.yaml"; exit 1; fi
wget "https://storage.googleapis.com/genai-toolbox/v${VERSION}/linux/amd64/toolbox" -O "/toolbox"
chmod +x "/toolbox"
/toolbox --tools-file "$TOOLBOX_YAML" &
TOOLBOX_PID=$!
sleep 2
}
setup_orch_table() {
export TABLE_NAME
envsubst < "$SQL_FILE" | psql -h "$PGHOST" -p "$PGPORT" -U "$DB_USER" -d "$DATABASE_NAME"
}
run_orch_test() {
local orch_dir="$1"
local orch_name
orch_name=$(basename "$orch_dir")
(
set -e
echo "--- Preparing environment for $orch_name ---"
setup_orch_table
cd "$orch_dir"
if [ -f "package.json" ]; then
echo "Installing dependencies for $orch_name..."
npm install
fi
cd ..
echo "--- Running tests for $orch_name ---"
export ORCH_NAME="$orch_name"
node --test quickstart.test.js
echo "--- Cleaning environment for $orch_name ---"
rm -rf "${orch_name}/node_modules"
)
}
cleanup_all() {
echo "--- Final cleanup: Shutting down processes and dropping table ---"
if [ -n "$TOOLBOX_PID" ]; then
kill $TOOLBOX_PID || true
fi
if [ -n "$PROXY_PID" ]; then
kill $PROXY_PID || true
fi
}
trap cleanup_all EXIT
# Main script execution
install_system_packages
start_cloud_sql_proxy
export PGHOST=127.0.0.1
export PGPORT=5432
export PGPASSWORD="$DB_PASSWORD"
export GOOGLE_API_KEY="$GOOGLE_API_KEY"
setup_toolbox
for ORCH_DIR in "$QUICKSTART_JS_DIR"/*/; do
if [ ! -d "$ORCH_DIR" ]; then
continue
fi
run_orch_test "$ORCH_DIR"
done

View File

@@ -1,10 +1,12 @@
## Description
---
> Should include a concise description of the changes (bug or feature), it's
> impact, along with a summary of the solution
## PR Checklist
---
> Thank you for opening a Pull Request! Before submitting your PR, there are a
> few things you can do to make sure it goes smoothly:
@@ -12,7 +14,7 @@
[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [ ] Make sure to open an issue as a
[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
before writing your code! That way we can discuss the change, evaluate
before writing your code! That way we can discuss the change, evaluate
designs, and agree on the general idea
- [ ] Ensure the tests and linter pass
- [ ] Code coverage does not decrease (if any source code was changed)

View File

@@ -37,4 +37,4 @@ extraFiles: [
"docs/en/how-to/connect-ide/postgres_mcp.md",
"docs/en/how-to/connect-ide/neo4j_mcp.md",
"docs/en/how-to/connect-ide/sqlite_mcp.md",
]
]

View File

@@ -37,7 +37,7 @@ jobs:
runs-on: 'ubuntu-latest'
steps:
- uses: 'actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd' # v8
- uses: 'actions/github-script@f28e40c7f34bde8b3046d885e986cb6290c5673b' # v7
with:
script: |-
// parse test names

View File

@@ -1,84 +0,0 @@
# Copyright 2025 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
name: "Deploy In-development docs"
permissions:
contents: write
on:
push:
branches:
- main
paths:
- 'docs/**'
- 'github/workflows/docs**'
- '.hugo/**'
# Allow triggering manually.
workflow_dispatch:
jobs:
deploy:
runs-on: ubuntu-24.04
defaults:
run:
working-directory: .hugo
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
steps:
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5
with:
fetch-depth: 0 # Fetch all history for .GitInfo and .Lastmod
- name: Setup Hugo
uses: peaceiris/actions-hugo@75d2e84710de30f6ff7268e08f310b60ef14033f # v3
with:
hugo-version: "0.145.0"
extended: true
- name: Setup Node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4
with:
node-version: "22"
- name: Cache dependencies
uses: actions/cache@0400d5f644dc74513175e3cd8d07132dd4860809 # v4
with:
path: ~/.npm
key: ${{ runner.os }}-node-${{ hashFiles('**/package-lock.json') }}
restore-keys: |
${{ runner.os }}-node-
- run: npm ci
- run: hugo --minify
env:
HUGO_BASEURL: https://${{ github.repository_owner }}.github.io/${{ github.event.repository.name }}/dev
HUGO_RELATIVEURLS: false
- name: Create Staging Directory
run: |
mkdir staging
mv public staging/dev
mv staging/dev/releases.releases staging/releases.releases
- name: Deploy
uses: peaceiris/actions-gh-pages@4f9cc6602d3f66b9c108549d475ec49e8ef4d45e # v4
with:
github_token: ${{ secrets.GITHUB_TOKEN }}
publish_dir: ./.hugo/staging
publish_branch: versioned-gh-pages
keep_files: true
commit_message: "deploy: ${{ github.event.head_commit.message }}"

View File

@@ -1,104 +0,0 @@
# Copyright 2025 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
name: "Deploy Previous Version Docs"
on:
workflow_dispatch:
inputs:
version_tag:
description: 'The old version tag to build docs for (e.g., v0.15.0)'
required: true
type: string
jobs:
build_and_deploy:
runs-on: ubuntu-latest
permissions:
contents: write
steps:
- name: Checkout main branch (for latest templates and theme)
uses: actions/checkout@v4
with:
ref: 'main'
submodules: 'recursive'
fetch-depth: 0
- name: Checkout old content from tag into a temporary directory
uses: actions/checkout@v4
with:
ref: ${{ github.event.inputs.version_tag }}
path: 'old_version_source' # Checkout into a temp subdir
# Sparse checkout to only get the content directory
sparse-checkout: |
docs
- name: Replace content with old version
run: |
# Remove the current content directory from the main branch checkout
rm -rf docs/
# Move the old content directory into place
mv ./old_version_source/docs docs
- name: Setup Hugo and Node
uses: peaceiris/actions-hugo@v3
with:
hugo-version: "0.145.0"
extended: true
- uses: actions/setup-node@v4
with:
node-version: "22"
- name: Install Dependencies
run: npm ci
working-directory: .hugo
- name: Build Hugo Site for Archived Version
run: hugo --minify
working-directory: .hugo
env:
HUGO_BASEURL: https://${{ github.repository_owner }}.github.io/${{ github.event.repository.name }}/${{ github.event.inputs.version_tag }}/
HUGO_RELATIVEURLS: false
- name: Deploy to gh-pages
uses: peaceiris/actions-gh-pages@v4
with:
github_token: ${{ secrets.GITHUB_TOKEN }}
publish_dir: .hugo/public
publish_branch: versioned-gh-pages
destination_dir: ./${{ github.event.inputs.version_tag }}
keep_files: true
allow_empty_commit: true
commit_message: "docs(backport): deploy docs for ${{ github.event.inputs.version_tag }}"
- name: Clean Build Directory
run: rm -rf .hugo/public
- name: Build Hugo Site
run: hugo --minify
working-directory: .hugo
env:
HUGO_BASEURL: https://${{ github.repository_owner }}.github.io/${{ github.event.repository.name }}/
HUGO_RELATIVEURLS: false
- name: Deploy to root
uses: peaceiris/actions-gh-pages@v4
with:
github_token: ${{ secrets.GITHUB_TOKEN }}
publish_dir: .hugo/public
publish_branch: versioned-gh-pages
keep_files: true
allow_empty_commit: true
commit_message: "deploy: docs to root for ${{ github.event.inputs.version_tag }}"

View File

@@ -1,86 +0,0 @@
# Copyright 2025 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
name: "Deploy Versioned Docs"
permissions:
contents: write
on:
release:
types: [published]
jobs:
deploy:
runs-on: ubuntu-24.04
steps:
- name: Checkout Code at Tag
uses: actions/checkout@v4
with:
ref: ${{ github.event.release.tag_name }}
- name: Get Version from Release Tag
run: echo "VERSION=${{ github.event.release.tag_name }}" >> $GITHUB_ENV
- name: Setup Hugo
uses: peaceiris/actions-hugo@v3
with:
hugo-version: "0.145.0"
extended: true
- name: Setup Node
uses: actions/setup-node@v4
with:
node-version: "22"
- name: Install Dependencies
run: npm ci
working-directory: .hugo
- name: Build Hugo Site
run: hugo --minify
working-directory: .hugo
env:
HUGO_BASEURL: https://${{ github.repository_owner }}.github.io/${{ github.event.repository.name }}/${{ env.VERSION }}/
HUGO_RELATIVEURLS: false
- name: Deploy
uses: peaceiris/actions-gh-pages@v4
with:
github_token: ${{ secrets.GITHUB_TOKEN }}
publish_dir: .hugo/public
publish_branch: versioned-gh-pages
destination_dir: ./${{ env.VERSION }}
keep_files: true
commit_message: "deploy: docs for ${{ env.VERSION }}"
- name: Clean Build Directory
run: rm -rf .hugo/public
- name: Build Hugo Site
run: hugo --minify
working-directory: .hugo
env:
HUGO_BASEURL: https://${{ github.repository_owner }}.github.io/${{ github.event.repository.name }}/
HUGO_RELATIVEURLS: false
- name: Deploy to root
uses: peaceiris/actions-gh-pages@v4
with:
github_token: ${{ secrets.GITHUB_TOKEN }}
publish_dir: .hugo/public
publish_branch: versioned-gh-pages
keep_files: true
allow_empty_commit: true
commit_message: "deploy: docs to root for ${{ env.VERSION }}"

View File

@@ -50,7 +50,7 @@ jobs:
extended: true
- name: Setup Node
uses: actions/setup-node@a0853c24544627f65ddf259abe73b1d18a591444 # v5
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4
with:
node-version: "22"

View File

@@ -36,7 +36,7 @@ jobs:
steps:
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5
with:
ref: versioned-gh-pages
ref: gh-pages
- name: Remove Preview
run: |
@@ -48,7 +48,7 @@ jobs:
git push
- name: Comment
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8
uses: actions/github-script@f28e40c7f34bde8b3046d885e986cb6290c5673b # v7
with:
script: |
github.rest.issues.createComment({
@@ -56,4 +56,4 @@ jobs:
owner: context.repo.owner,
repo: context.repo.repo,
body: "🧨 Preview deployments removed."
})
})

View File

@@ -62,7 +62,7 @@ jobs:
extended: true
- name: Setup Node
uses: actions/setup-node@a0853c24544627f65ddf259abe73b1d18a591444 # v5
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4
with:
node-version: "22"
@@ -86,12 +86,11 @@ jobs:
with:
github_token: ${{ secrets.GITHUB_TOKEN }}
publish_dir: .hugo/public
publish_branch: versioned-gh-pages
destination_dir: ./previews/PR-${{ github.event.number }}
commit_message: "stage: PR-${{ github.event.number }}: ${{ github.event.head_commit.message }}"
- name: Comment
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8
uses: actions/github-script@f28e40c7f34bde8b3046d885e986cb6290c5673b # v7
with:
script: |
github.rest.issues.createComment({

View File

@@ -36,7 +36,7 @@ jobs:
steps:
- name: Remove PR Label
if: "${{ github.event.action == 'labeled' && github.event.label.name == 'tests: run' }}"
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
uses: actions/github-script@f28e40c7f34bde8b3046d885e986cb6290c5673b # v7.1.0
with:
github-token: ${{ secrets.GITHUB_TOKEN }}
script: |
@@ -51,9 +51,9 @@ jobs:
console.log('Failed to remove label. Another job may have already removed it!');
}
- name: Setup Go
uses: actions/setup-go@44694675825211faa026b3c33043df3e48a5fa00 # v6.0.0
uses: actions/setup-go@d35c59abb061a4a6fb18e82ac0862c26744d6ab5 # v5.5.0
with:
go-version: "1.25"
go-version: "1.22"
- name: Checkout code
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:

View File

@@ -41,7 +41,7 @@ jobs:
steps:
- name: Remove PR label
if: "${{ github.event.action == 'labeled' && github.event.label.name == 'tests: run' }}"
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
uses: actions/github-script@f28e40c7f34bde8b3046d885e986cb6290c5673b # v7.1.0
with:
github-token: ${{ secrets.GITHUB_TOKEN }}
script: |
@@ -57,9 +57,9 @@ jobs:
}
- name: Setup Go
uses: actions/setup-go@44694675825211faa026b3c33043df3e48a5fa00 # v6.0.0
uses: actions/setup-go@d35c59abb061a4a6fb18e82ac0862c26744d6ab5 # v5.5.0
with:
go-version: "1.24"
go-version: "1.22"
- name: Checkout code
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0

View File

@@ -1,5 +1,5 @@
title = 'MCP Toolbox for Databases'
relativeURLs = false
relativeURLs = true
languageCode = 'en-us'
defaultContentLanguage = "en"
@@ -36,7 +36,6 @@ ignoreFiles = ["quickstart/shared", "quickstart/python", "quickstart/js", "quick
github_project_repo = "https://github.com/googleapis/genai-toolbox"
github_subdir = "docs"
offlineSearch = true
version_menu = "Releases"
[params.ui]
ul_show = 100
showLightDarkModeMenu = true
@@ -44,50 +43,6 @@ ignoreFiles = ["quickstart/shared", "quickstart/python", "quickstart/js", "quick
sidebar_menu_foldable = true
sidebar_menu_compact = false
[[params.versions]]
version = "Dev"
url = "https://googleapis.github.io/genai-toolbox/dev/"
# Add a new version block here before every release
# The order of versions in this file is mirrored into the dropdown
[[params.versions]]
version = "v0.16.0"
url = "https://googleapis.github.io/genai-toolbox/v0.16.0/"
[[params.versions]]
version = "v0.15.0"
url = "https://googleapis.github.io/genai-toolbox/v0.15.0/"
[[params.versions]]
version = "v0.14.0"
url = "https://googleapis.github.io/genai-toolbox/v0.14.0/"
[[params.versions]]
version = "v0.13.0"
url = "https://googleapis.github.io/genai-toolbox/v0.13.0/"
[[params.versions]]
version = "v0.12.0"
url = "https://googleapis.github.io/genai-toolbox/v0.12.0/"
[[params.versions]]
version = "v0.11.0"
url = "https://googleapis.github.io/genai-toolbox/v0.11.0/"
[[params.versions]]
version = "v0.10.0"
url = "https://googleapis.github.io/genai-toolbox/v0.10.0/"
[[params.versions]]
version = "v0.9.0"
url = "https://googleapis.github.io/genai-toolbox/v0.9.0/"
[[params.versions]]
version = "v0.8.0"
url = "https://googleapis.github.io/genai-toolbox/v0.8.0/"
[[menu.main]]
name = "GitHub"
weight = 50
@@ -112,13 +67,6 @@ ignoreFiles = ["quickstart/shared", "quickstart/python", "quickstart/js", "quick
baseName = "llms-full"
isPlainText = true
root = true
[outputFormats.releases]
baseName = 'releases'
isPlainText = true
mediaType = 'text/releases'
[mediaTypes."text/releases"]
suffixes = ["releases"]
[outputs]
home = ["HTML", "RSS", "LLMS", "LLMS-FULL", "releases"]
home = ["HTML", "RSS", "LLMS", "LLMS-FULL"]

View File

@@ -1,9 +0,0 @@
{{ if .Site.Params.versions -}}
{{ $path := "" -}}
{{ if .Site.Params.version_menu_pagelinks -}}
{{ $path = .Page.RelPermalink -}}
{{ end -}}
{{ range .Site.Params.versions -}}
<a class="dropdown-item" href="{{ .url }}{{ $path }}">{{ .version }}</a>
{{ end -}}
{{ end -}}

View File

@@ -1 +0,0 @@
<script src='{{ .Site.BaseURL }}js/w3.js' type="application/x-javascript"></script>

View File

@@ -1,12 +0,0 @@
{{ if .Site.Params.versions -}}
<a class="nav-link dropdown-toggle" href="#" id="navbarDropdown" role="button" data-bs-toggle="dropdown" aria-expanded="false">
{{ .Site.Params.version_menu }}
</a>
<div class="dropdown-menu" aria-labelledby="navbarDropdown">
<div w3-include-html="/genai-toolbox/releases.releases" w3-include-html-default='<a class="dropdown-item" href="/genai-toolbox/dev/">Dev</a>'></div>
<script>
// This must run after the w3.js script has loaded.
w3.includeHTML();
</script>
</div>
{{ end -}}

View File

@@ -1,405 +0,0 @@
/* W3.JS 1.04 April 2019 by w3schools.com */
"use strict";
var w3 = {};
w3.hide = function (sel) {
w3.hideElements(w3.getElements(sel));
};
w3.hideElements = function (elements) {
var i, l = elements.length;
for (i = 0; i < l; i++) {
w3.hideElement(elements[i]);
}
};
w3.hideElement = function (element) {
w3.styleElement(element, "display", "none");
};
w3.show = function (sel, a) {
var elements = w3.getElements(sel);
if (a) {w3.hideElements(elements);}
w3.showElements(elements);
};
w3.showElements = function (elements) {
var i, l = elements.length;
for (i = 0; i < l; i++) {
w3.showElement(elements[i]);
}
};
w3.showElement = function (element) {
w3.styleElement(element, "display", "block");
};
w3.addStyle = function (sel, prop, val) {
w3.styleElements(w3.getElements(sel), prop, val);
};
w3.styleElements = function (elements, prop, val) {
var i, l = elements.length;
for (i = 0; i < l; i++) {
w3.styleElement(elements[i], prop, val);
}
};
w3.styleElement = function (element, prop, val) {
element.style.setProperty(prop, val);
};
w3.toggleShow = function (sel) {
var i, x = w3.getElements(sel), l = x.length;
for (i = 0; i < l; i++) {
if (x[i].style.display == "none") {
w3.styleElement(x[i], "display", "block");
} else {
w3.styleElement(x[i], "display", "none");
}
}
};
w3.addClass = function (sel, name) {
w3.addClassElements(w3.getElements(sel), name);
};
w3.addClassElements = function (elements, name) {
var i, l = elements.length;
for (i = 0; i < l; i++) {
w3.addClassElement(elements[i], name);
}
};
w3.addClassElement = function (element, name) {
var i, arr1, arr2;
arr1 = element.className.split(" ");
arr2 = name.split(" ");
for (i = 0; i < arr2.length; i++) {
if (arr1.indexOf(arr2[i]) == -1) {element.className += " " + arr2[i];}
}
};
w3.removeClass = function (sel, name) {
w3.removeClassElements(w3.getElements(sel), name);
};
w3.removeClassElements = function (elements, name) {
var i, l = elements.length, arr1, arr2, j;
for (i = 0; i < l; i++) {
w3.removeClassElement(elements[i], name);
}
};
w3.removeClassElement = function (element, name) {
var i, arr1, arr2;
arr1 = element.className.split(" ");
arr2 = name.split(" ");
for (i = 0; i < arr2.length; i++) {
while (arr1.indexOf(arr2[i]) > -1) {
arr1.splice(arr1.indexOf(arr2[i]), 1);
}
}
element.className = arr1.join(" ");
};
w3.toggleClass = function (sel, c1, c2) {
w3.toggleClassElements(w3.getElements(sel), c1, c2);
};
w3.toggleClassElements = function (elements, c1, c2) {
var i, l = elements.length;
for (i = 0; i < l; i++) {
w3.toggleClassElement(elements[i], c1, c2);
}
};
w3.toggleClassElement = function (element, c1, c2) {
var t1, t2, t1Arr, t2Arr, j, arr, allPresent;
t1 = (c1 || "");
t2 = (c2 || "");
t1Arr = t1.split(" ");
t2Arr = t2.split(" ");
arr = element.className.split(" ");
if (t2Arr.length == 0) {
allPresent = true;
for (j = 0; j < t1Arr.length; j++) {
if (arr.indexOf(t1Arr[j]) == -1) {allPresent = false;}
}
if (allPresent) {
w3.removeClassElement(element, t1);
} else {
w3.addClassElement(element, t1);
}
} else {
allPresent = true;
for (j = 0; j < t1Arr.length; j++) {
if (arr.indexOf(t1Arr[j]) == -1) {allPresent = false;}
}
if (allPresent) {
w3.removeClassElement(element, t1);
w3.addClassElement(element, t2);
} else {
w3.removeClassElement(element, t2);
w3.addClassElement(element, t1);
}
}
};
w3.getElements = function (id) {
if (typeof id == "object") {
return [id];
} else {
return document.querySelectorAll(id);
}
};
w3.filterHTML = function(id, sel, filter) {
var a, b, c, i, ii, iii, hit;
a = w3.getElements(id);
for (i = 0; i < a.length; i++) {
b = a[i].querySelectorAll(sel);
for (ii = 0; ii < b.length; ii++) {
hit = 0;
if (b[ii].innerText.toUpperCase().indexOf(filter.toUpperCase()) > -1) {
hit = 1;
}
c = b[ii].getElementsByTagName("*");
for (iii = 0; iii < c.length; iii++) {
if (c[iii].innerText.toUpperCase().indexOf(filter.toUpperCase()) > -1) {
hit = 1;
}
}
if (hit == 1) {
b[ii].style.display = "";
} else {
b[ii].style.display = "none";
}
}
}
};
w3.sortHTML = function(id, sel, sortvalue) {
var a, b, i, ii, y, bytt, v1, v2, cc, j;
a = w3.getElements(id);
for (i = 0; i < a.length; i++) {
for (j = 0; j < 2; j++) {
cc = 0;
y = 1;
while (y == 1) {
y = 0;
b = a[i].querySelectorAll(sel);
for (ii = 0; ii < (b.length - 1); ii++) {
bytt = 0;
if (sortvalue) {
v1 = b[ii].querySelector(sortvalue).innerText;
v2 = b[ii + 1].querySelector(sortvalue).innerText;
} else {
v1 = b[ii].innerText;
v2 = b[ii + 1].innerText;
}
v1 = v1.toLowerCase();
v2 = v2.toLowerCase();
if ((j == 0 && (v1 > v2)) || (j == 1 && (v1 < v2))) {
bytt = 1;
break;
}
}
if (bytt == 1) {
b[ii].parentNode.insertBefore(b[ii + 1], b[ii]);
y = 1;
cc++;
}
}
if (cc > 0) {break;}
}
}
};
w3.slideshow = function (sel, ms, func) {
var i, ss, x = w3.getElements(sel), l = x.length;
ss = {};
ss.current = 1;
ss.x = x;
ss.ondisplaychange = func;
if (!isNaN(ms) || ms == 0) {
ss.milliseconds = ms;
} else {
ss.milliseconds = 1000;
}
ss.start = function() {
ss.display(ss.current)
if (ss.ondisplaychange) {ss.ondisplaychange();}
if (ss.milliseconds > 0) {
window.clearTimeout(ss.timeout);
ss.timeout = window.setTimeout(ss.next, ss.milliseconds);
}
};
ss.next = function() {
ss.current += 1;
if (ss.current > ss.x.length) {ss.current = 1;}
ss.start();
};
ss.previous = function() {
ss.current -= 1;
if (ss.current < 1) {ss.current = ss.x.length;}
ss.start();
};
ss.display = function (n) {
w3.styleElements(ss.x, "display", "none");
w3.styleElement(ss.x[n - 1], "display", "block");
}
ss.start();
return ss;
};
w3.includeHTML = function(cb) {
var z, i, elmnt, file, xhttp;
z = document.getElementsByTagName("*");
for (i = 0; i < z.length; i++) {
elmnt = z[i];
file = elmnt.getAttribute("w3-include-html");
if (file) {
xhttp = new XMLHttpRequest();
xhttp.onreadystatechange = function() {
if (this.readyState == 4) {
if (this.status == 200) {elmnt.innerHTML = this.responseText;}
if (this.status == 404) {
if (elmnt.getAttribute("w3-include-html-default")) {
elmnt.innerHTML = elmnt.getAttribute("w3-include-html-default");
}
else { elmnt.innerHTML = "Page not found."; }
}
elmnt.removeAttribute("w3-include-html");
w3.includeHTML(cb);
}
}
xhttp.open("GET", file, true);
xhttp.send();
return;
}
}
if (cb) cb();
};
w3.getHttpData = function (file, func) {
w3.http(file, function () {
if (this.readyState == 4 && this.status == 200) {
func(this.responseText);
}
});
};
w3.getHttpObject = function (file, func) {
w3.http(file, function () {
if (this.readyState == 4 && this.status == 200) {
func(JSON.parse(this.responseText));
}
});
};
w3.displayHttp = function (id, file) {
w3.http(file, function () {
if (this.readyState == 4 && this.status == 200) {
w3.displayObject(id, JSON.parse(this.responseText));
}
});
};
w3.http = function (target, readyfunc, xml, method) {
var httpObj;
if (!method) {method = "GET"; }
if (window.XMLHttpRequest) {
httpObj = new XMLHttpRequest();
} else if (window.ActiveXObject) {
httpObj = new ActiveXObject("Microsoft.XMLHTTP");
}
if (httpObj) {
if (readyfunc) {httpObj.onreadystatechange = readyfunc;}
httpObj.open(method, target, true);
httpObj.send(xml);
}
};
w3.getElementsByAttribute = function (x, att) {
var arr = [], arrCount = -1, i, l, y = x.getElementsByTagName("*"), z = att.toUpperCase();
l = y.length;
for (i = -1; i < l; i += 1) {
if (i == -1) {y[i] = x;}
if (y[i].getAttribute(z) !== null) {arrCount += 1; arr[arrCount] = y[i];}
}
return arr;
};
w3.dataObject = {},
w3.displayObject = function (id, data) {
var htmlObj, htmlTemplate, html, arr = [], a, l, rowClone, x, j, i, ii, cc, repeat, repeatObj, repeatX = "";
htmlObj = document.getElementById(id);
htmlTemplate = init_template(id, htmlObj);
html = htmlTemplate.cloneNode(true);
arr = w3.getElementsByAttribute(html, "w3-repeat");
l = arr.length;
for (j = (l - 1); j >= 0; j -= 1) {
cc = arr[j].getAttribute("w3-repeat").split(" ");
if (cc.length == 1) {
repeat = cc[0];
} else {
repeatX = cc[0];
repeat = cc[2];
}
arr[j].removeAttribute("w3-repeat");
repeatObj = data[repeat];
if (repeatObj && typeof repeatObj == "object" && repeatObj.length != "undefined") {
i = 0;
for (x in repeatObj) {
i += 1;
rowClone = arr[j];
rowClone = w3_replace_curly(rowClone, "element", repeatX, repeatObj[x]);
a = rowClone.attributes;
for (ii = 0; ii < a.length; ii += 1) {
a[ii].value = w3_replace_curly(a[ii], "attribute", repeatX, repeatObj[x]).value;
}
(i === repeatObj.length) ? arr[j].parentNode.replaceChild(rowClone, arr[j]) : arr[j].parentNode.insertBefore(rowClone, arr[j]);
}
} else {
console.log("w3-repeat must be an array. " + repeat + " is not an array.");
continue;
}
}
html = w3_replace_curly(html, "element");
htmlObj.parentNode.replaceChild(html, htmlObj);
function init_template(id, obj) {
var template;
template = obj.cloneNode(true);
if (w3.dataObject.hasOwnProperty(id)) {return w3.dataObject[id];}
w3.dataObject[id] = template;
return template;
}
function w3_replace_curly(elmnt, typ, repeatX, x) {
var value, rowClone, pos1, pos2, originalHTML, lookFor, lookForARR = [], i, cc, r;
rowClone = elmnt.cloneNode(true);
pos1 = 0;
while (pos1 > -1) {
originalHTML = (typ == "attribute") ? rowClone.value : rowClone.innerHTML;
pos1 = originalHTML.indexOf("{{", pos1);
if (pos1 === -1) {break;}
pos2 = originalHTML.indexOf("}}", pos1 + 1);
lookFor = originalHTML.substring(pos1 + 2, pos2);
lookForARR = lookFor.split("||");
value = undefined;
for (i = 0; i < lookForARR.length; i += 1) {
lookForARR[i] = lookForARR[i].replace(/^\s+|\s+$/gm, ''); //trim
if (x) {value = x[lookForARR[i]];}
if (value == undefined && data) {value = data[lookForARR[i]];}
if (value == undefined) {
cc = lookForARR[i].split(".");
if (cc[0] == repeatX) {value = x[cc[1]]; }
}
if (value == undefined) {
if (lookForARR[i] == repeatX) {value = x;}
}
if (value == undefined) {
if (lookForARR[i].substr(0, 1) == '"') {
value = lookForARR[i].replace(/"/g, "");
} else if (lookForARR[i].substr(0,1) == "'") {
value = lookForARR[i].replace(/'/g, "");
}
}
if (value != undefined) {break;}
}
if (value != undefined) {
r = "{{" + lookFor + "}}";
if (typ == "attribute") {
rowClone.value = rowClone.value.replace(r, value);
} else {
w3_replace_html(rowClone, r, value);
}
}
pos1 = pos1 + 1;
}
return rowClone;
}
function w3_replace_html(a, r, result) {
var b, l, i, a, x, j;
if (a.hasAttributes()) {
b = a.attributes;
l = b.length;
for (i = 0; i < l; i += 1) {
if (b[i].value.indexOf(r) > -1) {b[i].value = b[i].value.replace(r, result);}
}
}
x = a.getElementsByTagName("*");
l = x.length;
a.innerHTML = a.innerHTML.replace(r, result);
}
};

View File

@@ -1,65 +1,5 @@
# Changelog
## [0.17.0](https://github.com/googleapis/genai-toolbox/compare/v0.16.0...v0.17.0) (2025-10-10)
### ⚠ BREAKING CHANGES
* **tools/bigquery-get-table-info:** add allowed dataset support ([#1093](https://github.com/googleapis/genai-toolbox/issues/1093))
* **tool/bigquery-list-dataset-ids:** add allowed datasets support ([#1573](https://github.com/googleapis/genai-toolbox/issues/1573))
### Features
* Add configs and workflows for docs versioning ([#1611](https://github.com/googleapis/genai-toolbox/issues/1611)) ([21ac98b](https://github.com/googleapis/genai-toolbox/commit/21ac98bc065e95bde911d66185c67d8380891bf8))
* Add metadata in MCP Manifest for Toolbox auth ([#1395](https://github.com/googleapis/genai-toolbox/issues/1395)) ([0b3dac4](https://github.com/googleapis/genai-toolbox/commit/0b3dac41322f7aaa5a19df571686fa8fd4338ca5))
* Add program name to MySQL connections ([#1617](https://github.com/googleapis/genai-toolbox/issues/1617)) ([c4a22b8](https://github.com/googleapis/genai-toolbox/commit/c4a22b8d3bd0307325215ebd2f30ba37927cd37e))
* **source/bigquery:** Add optional write mode config ([#1157](https://github.com/googleapis/genai-toolbox/issues/1157)) ([63adc78](https://github.com/googleapis/genai-toolbox/commit/63adc78beae949dfe5e300c50e5ceef073e9652c))
* **sources/mssql:** Add app name to MSSQL ([#1620](https://github.com/googleapis/genai-toolbox/issues/1620)) ([1536d1f](https://github.com/googleapis/genai-toolbox/commit/1536d1fdabb9d7f73dbdeebeb05a83d9a3b78e1c))
* **sources/oracle:** Add Oracle Source and Tool ([#1456](https://github.com/googleapis/genai-toolbox/issues/1456)) ([3a19a50](https://github.com/googleapis/genai-toolbox/commit/3a19a50ff211e33429de1d05338d353359a52987))
* **tools/bigquery-list-dataset-ids:** Add allowed datasets support ([#1573](https://github.com/googleapis/genai-toolbox/issues/1573)) ([1a44c67](https://github.com/googleapis/genai-toolbox/commit/1a44c671ec593e764a2d2f67f70a98ceec20a168))
* **tools/bigquery-get-table-info:** Add allowed dataset support ([#1093](https://github.com/googleapis/genai-toolbox/issues/1093)) ([acb205c](https://github.com/googleapis/genai-toolbox/commit/acb205ca4761d59ce97b804827230978c8c98ede))
* **tools/dataform:** Add dataform compile tool ([#1470](https://github.com/googleapis/genai-toolbox/issues/1470)) ([3be9b7b](https://github.com/googleapis/genai-toolbox/commit/3be9b7b3bdf112fe7303706e56e9f39935cde661))
* **tools/looker:** Add support for pulse, vacuum and analyze audit and performance functions on a Looker instance ([#1581](https://github.com/googleapis/genai-toolbox/issues/1581)) ([5aed4e1](https://github.com/googleapis/genai-toolbox/commit/5aed4e136d0091731d2ded10ec076ee789e1987c))
* **tools/looker:** Enable access to the Conversational Analytics API for Looker ([#1596](https://github.com/googleapis/genai-toolbox/issues/1596)) ([2d5a93e](https://github.com/googleapis/genai-toolbox/commit/2d5a93e312990c8a9f3170c7e9c655f97cf11712))
### Bug Fixes
* Added google_ml_integration extension to use alloydb ai-nl support api ([#1445](https://github.com/googleapis/genai-toolbox/issues/1445)) ([dbc477a](https://github.com/googleapis/genai-toolbox/commit/dbc477ab0f832495cf51f73ea16ae363472d6eed))
* Fix broken links ([#1625](https://github.com/googleapis/genai-toolbox/issues/1625)) ([36c6584](https://github.com/googleapis/genai-toolbox/commit/36c658472ccdeb6cddd8a4452a8b3438aeb0a744))
* Remove duplicated build type in Dockerfile ([#1598](https://github.com/googleapis/genai-toolbox/issues/1598)) ([b43c945](https://github.com/googleapis/genai-toolbox/commit/b43c94575d86aa65b0528d59f9b41d30b439fee5))
* **source/bigquery:** Allowed datasets project id issue with client oauth ([#1663](https://github.com/googleapis/genai-toolbox/issues/1663)) ([f4cf486](https://github.com/googleapis/genai-toolbox/commit/f4cf486fa929299fef076cf71689776e5dec19c1))
* **sources/looker:** Allow Looker to be configured without setting a Client Id or Secret ([#1496](https://github.com/googleapis/genai-toolbox/issues/1496)) ([67d8221](https://github.com/googleapis/genai-toolbox/commit/67d8221a2e780df54a81f0f7e8f7e41e4bf1a82e))
* **tools/looker:** Refactor run-inline-query logic to helper function ([#1497](https://github.com/googleapis/genai-toolbox/issues/1497)) ([62af39d](https://github.com/googleapis/genai-toolbox/commit/62af39d751443eb758586663969b162c868a233f))
* **tools/mysql-list-tables:** Update sql query to resolve subquery scope error ([#1629](https://github.com/googleapis/genai-toolbox/issues/1629)) ([94e19d8](https://github.com/googleapis/genai-toolbox/commit/94e19d87e54e831b80eb766572e48bc7370305d8))
### Miscellaneous Chores
* Release 0.17.0 ([#1676](https://github.com/googleapis/genai-toolbox/issues/1676)) ([7e22cb4](https://github.com/googleapis/genai-toolbox/commit/7e22cb455d2709559c198024f3e782ef1a4bbacb))
## [0.16.0](https://github.com/googleapis/genai-toolbox/compare/v0.15.0...v0.16.0) (2025-09-25)
### ⚠ BREAKING CHANGES
* **tool/bigquery-execute-sql:** add allowed datasets support ([#1443](https://github.com/googleapis/genai-toolbox/issues/1443))
* **tool/bigquery-forecast:** add allowed datasets support ([#1412](https://github.com/googleapis/genai-toolbox/issues/1412))
### Features
* **cassandra:** Add Cassandra Source and Tool ([#1012](https://github.com/googleapis/genai-toolbox/issues/1012)) ([6e42053](https://github.com/googleapis/genai-toolbox/commit/6e420534ee894da4a8d226acb6cdb63d0d5d9ce5))
* **sources/postgres:** Add application_name ([#1504](https://github.com/googleapis/genai-toolbox/issues/1504)) ([72a2366](https://github.com/googleapis/genai-toolbox/commit/72a2366b28870aa6f81c4f890f4770ec5ecffdba))
* **tools/bigquery-execute-sql:** Add allowed datasets support ([#1443](https://github.com/googleapis/genai-toolbox/issues/1443)) ([9501ebb](https://github.com/googleapis/genai-toolbox/commit/9501ebbdbcba871b98663185c690308dda1729b5))
* **tools/bigquery-forecast:** Add allowed datasets support ([#1412](https://github.com/googleapis/genai-toolbox/issues/1412)) ([88bac7e](https://github.com/googleapis/genai-toolbox/commit/88bac7e36f5ebb6ad18773bff30b85ef678431e7))
* **tools/clickhouse-list-tables:** Add list-tables tool ([#1446](https://github.com/googleapis/genai-toolbox/issues/1446)) ([69a3caf](https://github.com/googleapis/genai-toolbox/commit/69a3cafabec5a40e2776d71de3587c0d16c722a2))
### Bug Fixes
* **tool/mongodb-find:** Fix find tool `limit` field ([#1570](https://github.com/googleapis/genai-toolbox/issues/1570)) ([4166bf7](https://github.com/googleapis/genai-toolbox/commit/4166bf7ab85732f64b877d5f20235057df919049))
* **tools/mongodb:** Concat filter params only once in mongodb update tools ([#1545](https://github.com/googleapis/genai-toolbox/issues/1545)) ([295f9dc](https://github.com/googleapis/genai-toolbox/commit/295f9dc41a43f0a4bdbd99e465bf2be01249084e))
## [0.15.0](https://github.com/googleapis/genai-toolbox/compare/v0.14.0...v0.15.0) (2025-09-18)

View File

@@ -135,22 +135,6 @@ go test -race -v ./...
go test -race -v ./tests/alloydbpg
```
1. **Timeout:** The integration test should have a timeout on the server.
Look for code like this:
```go
ctx, cancel := context.WithTimeout(context.Background(), time.Minute)
defer cancel()
cmd, cleanup, err := tests.StartCmd(ctx, toolsFile, args...)
if err != nil {
t.Fatalf("command initialization returned an error: %s", err)
}
defer cleanup()
```
Be sure to set the timeout to a reasonable value for your tests.
#### Running on Pull Requests
* **Internal Contributors:** Testing workflows should trigger automatically.
@@ -245,24 +229,6 @@ Follow these steps to preview documentation changes locally using a Hugo server:
### Previewing Documentation on Pull Requests
### Document Versioning Setup
There are 3 GHA workflows we use to achieve document versioning:
1. **Deploy In-development docs:**
This workflow is run on every commit merged into the main branch. It deploys the built site to the `/dev/` subdirectory for the in-development documentation.
1. **Deploy Versioned Docs:**
When a new GitHub Release is published, it performs two deployments based on the new release tag.
One to the new version subdirectory and one to the root directory of the versioned-gh-pages branch.
**Note:** Before the release PR from release-please is merged, add the newest version into the hugo.toml file.
1. **Deploy Previous Version Docs:**
This is a manual workflow, started from the GitHub Actions UI.
To rebuild and redeploy documentation for a version that was released before this new system was in place. This workflow can be started from the Actions UI by providing the git tag of the version you want to build documentation for.
The specific versioned subdirectory and the root docs are updated on the versioned-gh-pages branch.
#### Contributors
Request a repo owner to run the preview deployment workflow on your PR. A

View File

@@ -25,7 +25,7 @@ ARG COMMIT_SHA=""
RUN go get ./...
RUN CGO_ENABLED=0 GOOS=${TARGETOS} GOARCH=${TARGETARCH} \
go build -ldflags "-X github.com/googleapis/genai-toolbox/cmd.buildType=${BUILD_TYPE} -X github.com/googleapis/genai-toolbox/cmd.commitSha=${COMMIT_SHA}"
go build -ldflags "-X github.com/googleapis/genai-toolbox/cmd.buildType=container.${BUILD_TYPE} -X github.com/googleapis/genai-toolbox/cmd.commitSha=${COMMIT_SHA}"
# Final Stage
FROM gcr.io/distroless/static:nonroot

View File

@@ -33,6 +33,7 @@ documentation](https://googleapis.github.io/genai-toolbox/).
- [Getting Started](#getting-started)
- [Installing the server](#installing-the-server)
- [Running the server](#running-the-server)
- [Homebrew Users](#homebrew-users)
- [Integrating your application](#integrating-your-application)
- [Configuration](#configuration)
- [Sources](#sources)
@@ -114,57 +115,13 @@ following instructions for your OS and CPU architecture.
To install Toolbox as a binary:
<!-- {x-release-please-start-version} -->
> <details>
> <summary>Linux (AMD64)</summary>
>
> To install Toolbox as a binary on Linux (AMD64):
>
> ```sh
> # see releases page for other versions
> export VERSION=0.17.0
> curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v$VERSION/linux/amd64/toolbox
> chmod +x toolbox
> ```
>
> </details>
> <details>
> <summary>macOS (Apple Silicon)</summary>
>
> To install Toolbox as a binary on macOS (Apple Silicon):
>
> ```sh
> # see releases page for other versions
> export VERSION=0.17.0
> curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v$VERSION/darwin/arm64/toolbox
> chmod +x toolbox
> ```
>
> </details>
> <details>
> <summary>macOS (Intel)</summary>
>
> To install Toolbox as a binary on macOS (Intel):
>
> ```sh
> # see releases page for other versions
> export VERSION=0.17.0
> curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v$VERSION/darwin/amd64/toolbox
> chmod +x toolbox
> ```
>
> </details>
> <details>
> <summary>Windows (AMD64)</summary>
>
> To install Toolbox as a binary on Windows (AMD64):
>
> ```powershell
> # see releases page for other versions
> $VERSION = "0.17.0"
> Invoke-WebRequest -Uri "https://storage.googleapis.com/genai-toolbox/v$VERSION/windows/amd64/toolbox.exe" -OutFile "toolbox.exe"
> ```
>
> </details>
```sh
# see releases page for other versions
export VERSION=0.15.0
curl -O https://storage.googleapis.com/genai-toolbox/v$VERSION/linux/amd64/toolbox
chmod +x toolbox
```
</details>
<details>
@@ -173,7 +130,7 @@ You can also install Toolbox as a container:
```sh
# see releases page for other versions
export VERSION=0.17.0
export VERSION=0.15.0
docker pull us-central1-docker.pkg.dev/database-toolbox/toolbox/toolbox:$VERSION
```
@@ -197,23 +154,12 @@ To install from source, ensure you have the latest version of
[Go installed](https://go.dev/doc/install), and then run the following command:
```sh
go install github.com/googleapis/genai-toolbox@v0.17.0
go install github.com/googleapis/genai-toolbox@v0.15.0
```
<!-- {x-release-please-end} -->
</details>
<details>
<summary>Gemini CLI Extensions</summary>
To install Gemini CLI Extensions for MCP Toolbox, run the following command:
```sh
gemini extensions install https://github.com/gemini-cli-extensions/mcp-toolbox
```
</details>
### Running the server
[Configure](#configuration) a `tools.yaml` to define your tools, and then
@@ -287,16 +233,6 @@ toolbox --tools-file "tools.yaml"
</details>
<details>
<summary>Gemini CLI</summary>
Interact with your custom tools using natural language. Check
[gemini-cli-extensions/mcp-toolbox](https://github.com/gemini-cli-extensions/mcp-toolbox)
for more information.
</details>
You can use `toolbox help` for a full list of flags! To stop the server, send a
terminate signal (`ctrl+c` on most platforms).

View File

@@ -64,7 +64,6 @@ import (
_ "github.com/googleapis/genai-toolbox/internal/tools/bigquery/bigquerysearchcatalog"
_ "github.com/googleapis/genai-toolbox/internal/tools/bigquery/bigquerysql"
_ "github.com/googleapis/genai-toolbox/internal/tools/bigtable"
_ "github.com/googleapis/genai-toolbox/internal/tools/cassandra/cassandracql"
_ "github.com/googleapis/genai-toolbox/internal/tools/clickhouse/clickhouseexecutesql"
_ "github.com/googleapis/genai-toolbox/internal/tools/clickhouse/clickhouselistdatabases"
_ "github.com/googleapis/genai-toolbox/internal/tools/clickhouse/clickhouselisttables"
@@ -80,7 +79,6 @@ import (
_ "github.com/googleapis/genai-toolbox/internal/tools/cloudsqlmysql/cloudsqlmysqlcreateinstance"
_ "github.com/googleapis/genai-toolbox/internal/tools/cloudsqlpg/cloudsqlpgcreateinstances"
_ "github.com/googleapis/genai-toolbox/internal/tools/couchbase"
_ "github.com/googleapis/genai-toolbox/internal/tools/dataform/dataformcompilelocal"
_ "github.com/googleapis/genai-toolbox/internal/tools/dataplex/dataplexlookupentry"
_ "github.com/googleapis/genai-toolbox/internal/tools/dataplex/dataplexsearchaspecttypes"
_ "github.com/googleapis/genai-toolbox/internal/tools/dataplex/dataplexsearchentries"
@@ -98,7 +96,6 @@ import (
_ "github.com/googleapis/genai-toolbox/internal/tools/firestore/firestorevalidaterules"
_ "github.com/googleapis/genai-toolbox/internal/tools/http"
_ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookeradddashboardelement"
_ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookerconversationalanalytics"
_ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookergetdashboards"
_ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookergetdimensions"
_ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookergetexplores"
@@ -107,9 +104,6 @@ import (
_ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookergetmeasures"
_ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookergetmodels"
_ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookergetparameters"
_ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookerhealthanalyze"
_ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookerhealthpulse"
_ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookerhealthvacuum"
_ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookermakedashboard"
_ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookermakelook"
_ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookerquery"
@@ -139,8 +133,6 @@ import (
_ "github.com/googleapis/genai-toolbox/internal/tools/neo4j/neo4jschema"
_ "github.com/googleapis/genai-toolbox/internal/tools/oceanbase/oceanbaseexecutesql"
_ "github.com/googleapis/genai-toolbox/internal/tools/oceanbase/oceanbasesql"
_ "github.com/googleapis/genai-toolbox/internal/tools/oracle/oracleexecutesql"
_ "github.com/googleapis/genai-toolbox/internal/tools/oracle/oraclesql"
_ "github.com/googleapis/genai-toolbox/internal/tools/postgres/postgresexecutesql"
_ "github.com/googleapis/genai-toolbox/internal/tools/postgres/postgreslistactivequeries"
_ "github.com/googleapis/genai-toolbox/internal/tools/postgres/postgreslistavailableextensions"
@@ -167,7 +159,6 @@ import (
_ "github.com/googleapis/genai-toolbox/internal/sources/alloydbpg"
_ "github.com/googleapis/genai-toolbox/internal/sources/bigquery"
_ "github.com/googleapis/genai-toolbox/internal/sources/bigtable"
_ "github.com/googleapis/genai-toolbox/internal/sources/cassandra"
_ "github.com/googleapis/genai-toolbox/internal/sources/clickhouse"
_ "github.com/googleapis/genai-toolbox/internal/sources/cloudmonitoring"
_ "github.com/googleapis/genai-toolbox/internal/sources/cloudsqladmin"
@@ -186,7 +177,6 @@ import (
_ "github.com/googleapis/genai-toolbox/internal/sources/mysql"
_ "github.com/googleapis/genai-toolbox/internal/sources/neo4j"
_ "github.com/googleapis/genai-toolbox/internal/sources/oceanbase"
_ "github.com/googleapis/genai-toolbox/internal/sources/oracle"
_ "github.com/googleapis/genai-toolbox/internal/sources/postgres"
_ "github.com/googleapis/genai-toolbox/internal/sources/redis"
_ "github.com/googleapis/genai-toolbox/internal/sources/spanner"
@@ -663,6 +653,20 @@ func watchChanges(ctx context.Context, watchDirs map[string]bool, watchedFiles m
}
}
// updateLogLevel checks if Toolbox has to update the existing log level set by users.
// stdio doesn't support "debug" and "info" logs.
func updateLogLevel(stdio bool, logLevel string) bool {
if stdio {
switch strings.ToUpper(logLevel) {
case log.Debug, log.Info:
return true
default:
return false
}
}
return false
}
func resolveWatcherInputs(toolsFile string, toolsFiles []string, toolsFolder string) (map[string]bool, map[string]bool) {
var relevantFiles []string
@@ -691,6 +695,10 @@ func resolveWatcherInputs(toolsFile string, toolsFiles []string, toolsFolder str
}
func run(cmd *Command) error {
if updateLogLevel(cmd.cfg.Stdio, cmd.cfg.LogLevel.String()) {
cmd.cfg.LogLevel = server.StringLevel(log.Warn)
}
ctx, cancel := context.WithCancel(cmd.Context())
defer cancel()
@@ -714,22 +722,16 @@ func run(cmd *Command) error {
cancel()
}(ctx)
// If stdio, set logger's out stream (usually DEBUG and INFO logs) to errStream
loggerOut := cmd.outStream
if cmd.cfg.Stdio {
loggerOut = cmd.errStream
}
// Handle logger separately from config
switch strings.ToLower(cmd.cfg.LoggingFormat.String()) {
case "json":
logger, err := log.NewStructuredLogger(loggerOut, cmd.errStream, cmd.cfg.LogLevel.String())
logger, err := log.NewStructuredLogger(cmd.outStream, cmd.errStream, cmd.cfg.LogLevel.String())
if err != nil {
return fmt.Errorf("unable to initialize logger: %w", err)
}
cmd.logger = logger
case "standard":
logger, err := log.NewStdLogger(loggerOut, cmd.errStream, cmd.cfg.LogLevel.String())
logger, err := log.NewStdLogger(cmd.outStream, cmd.errStream, cmd.cfg.LogLevel.String())
if err != nil {
return fmt.Errorf("unable to initialize logger: %w", err)
}

View File

@@ -1244,7 +1244,6 @@ func TestPrebuiltTools(t *testing.T) {
mysql_config, _ := prebuiltconfigs.Get("mysql")
mssql_config, _ := prebuiltconfigs.Get("mssql")
looker_config, _ := prebuiltconfigs.Get("looker")
lookerca_config, _ := prebuiltconfigs.Get("looker-conversational-analytics")
postgresconfig, _ := prebuiltconfigs.Get("postgres")
spanner_config, _ := prebuiltconfigs.Get("spanner")
spannerpg_config, _ := prebuiltconfigs.Get("spanner-postgres")
@@ -1328,9 +1327,6 @@ func TestPrebuiltTools(t *testing.T) {
t.Setenv("LOOKER_CLIENT_SECRET", "your_looker_client_secret")
t.Setenv("LOOKER_VERIFY_SSL", "true")
t.Setenv("LOOKER_PROJECT", "your_project_id")
t.Setenv("LOOKER_LOCATION", "us")
t.Setenv("SQLITE_DATABASE", "test.db")
t.Setenv("NEO4J_URI", "bolt://localhost:7687")
@@ -1493,17 +1489,7 @@ func TestPrebuiltTools(t *testing.T) {
wantToolset: server.ToolsetConfigs{
"looker_tools": tools.ToolsetConfig{
Name: "looker_tools",
ToolNames: []string{"get_models", "get_explores", "get_dimensions", "get_measures", "get_filters", "get_parameters", "query", "query_sql", "query_url", "get_looks", "run_look", "make_look", "get_dashboards", "make_dashboard", "add_dashboard_element", "health_pulse", "health_analyze", "health_vacuum"},
},
},
},
{
name: "looker-conversational-analytics prebuilt tools",
in: lookerca_config,
wantToolset: server.ToolsetConfigs{
"looker_conversational_analytics_tools": tools.ToolsetConfig{
Name: "looker_conversational_analytics_tools",
ToolNames: []string{"ask_data_insights", "get_models", "get_explores"},
ToolNames: []string{"get_models", "get_explores", "get_dimensions", "get_measures", "get_filters", "get_parameters", "query", "query_sql", "query_url", "get_looks", "run_look", "make_look", "get_dashboards", "make_dashboard", "add_dashboard_element"},
},
},
},
@@ -1611,3 +1597,51 @@ func TestPrebuiltTools(t *testing.T) {
})
}
}
func TestUpdateLogLevel(t *testing.T) {
tcs := []struct {
desc string
stdio bool
logLevel string
want bool
}{
{
desc: "no stdio",
stdio: false,
logLevel: "info",
want: false,
},
{
desc: "stdio with info log",
stdio: true,
logLevel: "info",
want: true,
},
{
desc: "stdio with debug log",
stdio: true,
logLevel: "debug",
want: true,
},
{
desc: "stdio with warn log",
stdio: true,
logLevel: "warn",
want: false,
},
{
desc: "stdio with error log",
stdio: true,
logLevel: "error",
want: false,
},
}
for _, tc := range tcs {
t.Run(tc.desc, func(t *testing.T) {
got := updateLogLevel(tc.stdio, tc.logLevel)
if got != tc.want {
t.Fatalf("incorrect indication to update log level: got %t, want %t", got, tc.want)
}
})
}
}

View File

@@ -1 +1 @@
0.17.0
0.15.0

View File

@@ -234,7 +234,7 @@
},
"outputs": [],
"source": [
"version = \"0.17.0\" # x-release-please-version\n",
"version = \"0.15.0\" # x-release-please-version\n",
"! curl -O https://storage.googleapis.com/genai-toolbox/v{version}/linux/amd64/toolbox\n",
"\n",
"# Make the binary executable\n",

View File

@@ -81,50 +81,23 @@ following instructions for your OS and CPU architecture.
<!-- {x-release-please-start-version} -->
{{< tabpane text=true >}}
{{% tab header="Binary" lang="en" %}}
{{< tabpane text=true >}}
{{% tab header="Linux (AMD64)" lang="en" %}}
To install Toolbox as a binary on Linux (AMD64):
To install Toolbox as a binary:
```sh
# see releases page for other versions
export VERSION=0.17.0
curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v$VERSION/linux/amd64/toolbox
export VERSION=0.15.0
curl -O https://storage.googleapis.com/genai-toolbox/v$VERSION/linux/amd64/toolbox
chmod +x toolbox
```
{{% /tab %}}
{{% tab header="macOS (Apple Silicon)" lang="en" %}}
To install Toolbox as a binary on macOS (Apple Silicon):
```sh
# see releases page for other versions
export VERSION=0.17.0
curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v$VERSION/darwin/arm64/toolbox
chmod +x toolbox
```
{{% /tab %}}
{{% tab header="macOS (Intel)" lang="en" %}}
To install Toolbox as a binary on macOS (Intel):
```sh
# see releases page for other versions
export VERSION=0.17.0
curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v$VERSION/darwin/amd64/toolbox
chmod +x toolbox
```
{{% /tab %}}
{{% tab header="Windows (AMD64)" lang="en" %}}
To install Toolbox as a binary on Windows (AMD64):
```powershell
# see releases page for other versions
$VERSION = "0.17.0"
Invoke-WebRequest -Uri "https://storage.googleapis.com/genai-toolbox/v$VERSION/windows/amd64/toolbox.exe" -OutFile "toolbox.exe"
```
{{% /tab %}}
{{< /tabpane >}}
{{% /tab %}}
{{% tab header="Container image" lang="en" %}}
You can also install Toolbox as a container:
```sh
# see releases page for other versions
export VERSION=0.17.0
export VERSION=0.15.0
docker pull us-central1-docker.pkg.dev/database-toolbox/toolbox/toolbox:$VERSION
```
@@ -143,7 +116,7 @@ To install from source, ensure you have the latest version of
[Go installed](https://go.dev/doc/install), and then run the following command:
```sh
go install github.com/googleapis/genai-toolbox@v0.17.0
go install github.com/googleapis/genai-toolbox@v0.15.0
```
{{% /tab %}}

View File

@@ -105,7 +105,7 @@ In this section, we will download Toolbox, configure our tools in a
<!-- {x-release-please-start-version} -->
```bash
export OS="linux/amd64" # one of linux/amd64, darwin/arm64, darwin/amd64, or windows/amd64
curl -O https://storage.googleapis.com/genai-toolbox/v0.17.0/$OS/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/$OS/toolbox
```
<!-- {x-release-please-end} -->

View File

@@ -4,7 +4,7 @@ go 1.24.6
require (
github.com/googleapis/mcp-toolbox-sdk-go v0.3.0
google.golang.org/genai v1.28.0
google.golang.org/genai v1.25.0
)
require (

View File

@@ -102,8 +102,8 @@ golang.org/x/time v0.12.0 h1:ScB/8o8olJvc+CQPWrK3fPZNfh7qgwCrY0zJmoEQLSE=
golang.org/x/time v0.12.0/go.mod h1:CDIdPxbZBQxdj6cxyCIdrNogrJKMJ7pr37NYpMcMDSg=
google.golang.org/api v0.248.0 h1:hUotakSkcwGdYUqzCRc5yGYsg4wXxpkKlW5ryVqvC1Y=
google.golang.org/api v0.248.0/go.mod h1:yAFUAF56Li7IuIQbTFoLwXTCI6XCFKueOlS7S9e4F9k=
google.golang.org/genai v1.28.0 h1:6qpUWFH3PkHPhxNnu3wjaCVJ6Jri1EIR7ks07f9IpIk=
google.golang.org/genai v1.28.0/go.mod h1:7pAilaICJlQBonjKKJNhftDFv3SREhZcTe9F6nRcjbg=
google.golang.org/genai v1.23.0 h1:0VkQPd1CVT5FbykwkWvnB7jq1d+PZFuVf0n57UyyOzs=
google.golang.org/genai v1.23.0/go.mod h1:QPj5NGJw+3wEOHg+PrsWwJKvG6UC84ex5FR7qAYsN/M=
google.golang.org/genproto v0.0.0-20250603155806-513f23925822 h1:rHWScKit0gvAPuOnu87KpaYtjK5zBMLcULh7gxkCXu4=
google.golang.org/genproto v0.0.0-20250603155806-513f23925822/go.mod h1:HubltRL7rMh0LfnQPkMH4NPDFEWp0jw3vixw7jEM53s=
google.golang.org/genproto/googleapis/api v0.0.0-20250818200422-3122310a409c h1:AtEkQdl5b6zsybXcbz00j1LwNodDuH6hVifIaNqk7NQ=

View File

@@ -48,7 +48,7 @@ func main() {
// Initialize Genkit
g, err := genkit.Init(ctx,
genkit.WithPlugins(&googlegenai.GoogleAI{}),
genkit.WithDefaultModel("googleai/gemini-2.0-flash"),
genkit.WithDefaultModel("googleai/gemini-1.5-flash"),
)
if err != nil {
log.Fatalf("Failed to init genkit: %v\n", err)

View File

@@ -59,7 +59,7 @@ func main() {
ctx := context.Background()
// Initialize the Google AI client (LLM).
llm, err := googleai.New(ctx, googleai.WithAPIKey(genaiKey), googleai.WithDefaultModel("gemini-2.0-flash"))
llm, err := googleai.New(ctx, googleai.WithAPIKey(genaiKey), googleai.WithDefaultModel("gemini-1.5-flash"))
if err != nil {
log.Fatalf("Failed to create Google AI client: %v", err)
}

View File

@@ -345,10 +345,9 @@
"integrity": "sha512-Oei9OH4tRh0YqU3GxhX79dM/mwVgvbZJaSNaRk+bshkj0S5cfHcgYakreBjrHwatXKbz+IoIdYLxrKim2MjW0Q=="
},
"node_modules/axios": {
"version": "1.12.2",
"resolved": "https://registry.npmjs.org/axios/-/axios-1.12.2.tgz",
"integrity": "sha512-vMJzPewAlRyOgxV2dU0Cuz2O8zzzx9VYtbJOaBgXFeLc4IV/Eg50n4LowmehOOR61S8ZMpc2K5Sa7g6A4jfkUw==",
"license": "MIT",
"version": "1.11.0",
"resolved": "https://registry.npmjs.org/axios/-/axios-1.11.0.tgz",
"integrity": "sha512-1Lx3WLFQWm3ooKDYZD1eXmoGO9fxYQjrycfHFC8P0sCfQVXyROp0p9PFWBehewBOdCwHc+f/b8I0fMto5eSfwA==",
"dependencies": {
"follow-redirects": "^1.15.6",
"form-data": "^4.0.4",

View File

@@ -11,7 +11,6 @@ import os
# TODO(developer): replace this with your Google API key
api_key = os.environ.get("GOOGLE_API_KEY") or "your-api-key" # Set your API key here
os.environ["GOOGLE_API_KEY"] = api_key
async def main():
with ToolboxSyncClient("http://127.0.0.1:5000") as toolbox_client:

View File

@@ -1,3 +1,3 @@
google-adk==1.15.0
toolbox-core==0.5.2
google-adk==1.14.1
toolbox-core==0.5.0
pytest==8.4.2

View File

@@ -113,4 +113,4 @@ async def main():
else:
print(response.text)
asyncio.run(main())
asyncio.run(main())

View File

@@ -1,3 +1,3 @@
google-genai==1.42.0
toolbox-core==0.5.2
google-genai==1.38.0
toolbox-core==0.5.0
pytest==8.4.2

View File

@@ -1,5 +1,5 @@
langchain==0.3.27
langchain-google-vertexai==2.1.2
langgraph==0.6.8
langgraph==0.6.7
toolbox-langchain==0.5.2
pytest==8.4.2

View File

@@ -1,4 +1,4 @@
llama-index==0.14.3
llama-index-llms-google-genai==0.6.0
llama-index==0.14.2
llama-index-llms-google-genai==0.5.0
toolbox-llamaindex==0.5.2
pytest==8.4.2

View File

@@ -13,7 +13,7 @@ In this section, we will download Toolbox, configure our tools in a
<!-- {x-release-please-start-version} -->
```bash
export OS="linux/amd64" # one of linux/amd64, darwin/arm64, darwin/amd64, or windows/amd64
curl -O https://storage.googleapis.com/genai-toolbox/v0.17.0/$OS/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/$OS/toolbox
```
<!-- {x-release-please-end} -->

View File

@@ -48,19 +48,19 @@ to expose your developer assistant tools to a Looker instance:
<!-- {x-release-please-start-version} -->
{{< tabpane persist=header >}}
{{< tab header="linux/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.17.0/linux/amd64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/linux/amd64/toolbox
{{< /tab >}}
{{< tab header="darwin/arm64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.17.0/darwin/arm64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/darwin/arm64/toolbox
{{< /tab >}}
{{< tab header="darwin/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.17.0/darwin/amd64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/darwin/amd64/toolbox
{{< /tab >}}
{{< tab header="windows/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.17.0/windows/amd64/toolbox.exe
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/windows/amd64/toolbox.exe
{{< /tab >}}
{{< /tabpane >}}
<!-- {x-release-please-end} -->

View File

@@ -45,19 +45,19 @@ instance:
<!-- {x-release-please-start-version} -->
{{< tabpane persist=header >}}
{{< tab header="linux/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.17.0/linux/amd64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/linux/amd64/toolbox
{{< /tab >}}
{{< tab header="darwin/arm64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.17.0/darwin/arm64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/darwin/arm64/toolbox
{{< /tab >}}
{{< tab header="darwin/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.17.0/darwin/amd64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/darwin/amd64/toolbox
{{< /tab >}}
{{< tab header="windows/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.17.0/windows/amd64/toolbox.exe
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/windows/amd64/toolbox.exe
{{< /tab >}}
{{< /tabpane >}}
<!-- {x-release-please-end} -->

View File

@@ -43,19 +43,19 @@ expose your developer assistant tools to a MySQL instance:
<!-- {x-release-please-start-version} -->
{{< tabpane persist=header >}}
{{< tab header="linux/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.17.0/linux/amd64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/linux/amd64/toolbox
{{< /tab >}}
{{< tab header="darwin/arm64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.17.0/darwin/arm64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/darwin/arm64/toolbox
{{< /tab >}}
{{< tab header="darwin/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.17.0/darwin/amd64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/darwin/amd64/toolbox
{{< /tab >}}
{{< tab header="windows/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.17.0/windows/amd64/toolbox.exe
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/windows/amd64/toolbox.exe
{{< /tab >}}
{{< /tabpane >}}
<!-- {x-release-please-end} -->

View File

@@ -44,19 +44,19 @@ expose your developer assistant tools to a Neo4j instance:
<!-- {x-release-please-start-version} -->
{{< tabpane persist=header >}}
{{< tab header="linux/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.17.0/linux/amd64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/linux/amd64/toolbox
{{< /tab >}}
{{< tab header="darwin/arm64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.17.0/darwin/arm64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/darwin/arm64/toolbox
{{< /tab >}}
{{< tab header="darwin/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.17.0/darwin/amd64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/darwin/amd64/toolbox
{{< /tab >}}
{{< tab header="windows/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.17.0/windows/amd64/toolbox.exe
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/windows/amd64/toolbox.exe
{{< /tab >}}
{{< /tabpane >}}
<!-- {x-release-please-end} -->

View File

@@ -56,19 +56,19 @@ Omni](https://cloud.google.com/alloydb/omni/current/docs/overview).
<!-- {x-release-please-start-version} -->
{{< tabpane persist=header >}}
{{< tab header="linux/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.17.0/linux/amd64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/linux/amd64/toolbox
{{< /tab >}}
{{< tab header="darwin/arm64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.17.0/darwin/arm64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/darwin/arm64/toolbox
{{< /tab >}}
{{< tab header="darwin/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.17.0/darwin/amd64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/darwin/amd64/toolbox
{{< /tab >}}
{{< tab header="windows/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.17.0/windows/amd64/toolbox.exe
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/windows/amd64/toolbox.exe
{{< /tab >}}
{{< /tabpane >}}
<!-- {x-release-please-end} -->

View File

@@ -43,19 +43,19 @@ to expose your developer assistant tools to a SQLite instance:
<!-- {x-release-please-start-version} -->
{{< tabpane persist=header >}}
{{< tab header="linux/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.17.0/linux/amd64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/linux/amd64/toolbox
{{< /tab >}}
{{< tab header="darwin/arm64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.17.0/darwin/arm64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/darwin/arm64/toolbox
{{< /tab >}}
{{< tab header="darwin/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.17.0/darwin/amd64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/darwin/amd64/toolbox
{{< /tab >}}
{{< tab header="windows/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.17.0/windows/amd64/toolbox.exe
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/windows/amd64/toolbox.exe
{{< /tab >}}
{{< /tabpane >}}
<!-- {x-release-please-end} -->

View File

@@ -1,38 +0,0 @@
---
title: Connect via Gemini CLI Extensions
type: docs
weight: 2
description: "Connect to Toolbox via Gemini CLI Extensions."
---
## Gemini CLI Extensions
[Gemini CLI][gemini-cli] is an open-source AI agent designed to assist with development workflows such as coding, debugging, data exploration, and content creation. Its mission is to provide an agentic interface for interacting with database and analytics services and popular open-source databases.
### How extensions work
Gemini CLI is highly extensible, allowing for the addition of new tools and capabilities through extensions. You can load the extensions from a GitHub URL, a local directory, or a configurable registry. They provide new tools, slash commands, and prompts to assist with your workflow.
Use the Gemini CLI Extensions to load prebuilt or custom tools to interact with your databases.
[gemini-cli]: https://google-gemini.github.io/gemini-cli/
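For example, the `mcp-toolbox` extension listed below can be installed directly from its GitHub URL (this mirrors the install command shown in the repository README; the other extensions presumably follow the same pattern):
```sh
gemini extensions install https://github.com/gemini-cli-extensions/mcp-toolbox
```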
Below is a list of Gemini CLI Extensions powered by MCP Toolbox:
* [alloydb](https://github.com/gemini-cli-extensions/alloydb)
* [alloydb-observability](https://github.com/gemini-cli-extensions/alloydb-observability)
* [bigquery-conversational-analytics](https://github.com/gemini-cli-extensions/bigquery-conversational-analytics)
* [bigquery-data-analytics](https://github.com/gemini-cli-extensions/bigquery-data-analytics)
* [cloud-sql-mysql](https://github.com/gemini-cli-extensions/cloud-sql-mysql)
* [cloud-sql-mysql-observability](https://github.com/gemini-cli-extensions/cloud-sql-mysql-observability)
* [cloud-sql-postgresql](https://github.com/gemini-cli-extensions/cloud-sql-postgresql)
* [cloud-sql-postgresql-observability](https://github.com/gemini-cli-extensions/cloud-sql-postgresql-observability)
* [cloud-sql-sqlserver](https://github.com/gemini-cli-extensions/cloud-sql-sqlserver)
* [cloud-sql-sqlserver-observability](https://github.com/gemini-cli-extensions/cloud-sql-sqlserver-observability)
* [dataplex](https://github.com/gemini-cli-extensions/dataplex)
* [firestore-native](https://github.com/gemini-cli-extensions/firestore-native)
* [looker](https://github.com/gemini-cli-extensions/looker)
* [mcp-toolbox](https://github.com/gemini-cli-extensions/mcp-toolbox)
* [mysql](https://github.com/gemini-cli-extensions/mysql)
* [postgres](https://github.com/gemini-cli-extensions/postgres)
* [spanner](https://github.com/gemini-cli-extensions/spanner)
* [sql-server](https://github.com/gemini-cli-extensions/sql-server)

View File

@@ -84,8 +84,6 @@ details on how to connect your AI tools (IDEs) to databases via Toolbox and MCP.
* **Environment Variables:**
* `BIGQUERY_PROJECT`: The GCP project ID.
* `BIGQUERY_LOCATION`: (Optional) The dataset location.
* `BIGQUERY_USE_CLIENT_OAUTH`: (Optional) If `true`, forwards the client's
OAuth access token for authentication. Defaults to `false`.
* **Permissions:**
* **BigQuery User** (`roles/bigquery.user`) to execute queries and view
metadata.
@@ -134,10 +132,6 @@ details on how to connect your AI tools (IDEs) to databases via Toolbox and MCP.
* `list_tables`: Lists tables in the database.
* `get_query_plan`: Provides information about how MySQL executes a SQL
statement.
* `list_active_queries`: Lists ongoing queries.
* `list_tables_missing_unique_indexes`: Looks for tables that do not have
a primary or unique key constraint.
* `list_table_fragmentation`: Displays table fragmentation in MySQL.
## Cloud SQL for MySQL Observability
@@ -360,10 +354,6 @@ details on how to connect your AI tools (IDEs) to databases via Toolbox and MCP.
* `LOOKER_CLIENT_ID`: The client ID for the Looker API.
* `LOOKER_CLIENT_SECRET`: The client secret for the Looker API.
* `LOOKER_VERIFY_SSL`: Whether to verify SSL certificates.
* `LOOKER_USE_CLIENT_OAUTH`: Whether to use OAuth for authentication.
* `LOOKER_SHOW_HIDDEN_MODELS`: Whether to show hidden models.
* `LOOKER_SHOW_HIDDEN_EXPLORES`: Whether to show hidden explores.
* `LOOKER_SHOW_HIDDEN_FIELDS`: Whether to show hidden fields.
* **Permissions:**
* A Looker account with permissions to access the desired models,
explores, and data is required.
@@ -383,35 +373,6 @@ details on how to connect your AI tools (IDEs) to databases via Toolbox and MCP.
* `get_dashboards`: Searches for saved dashboards.
* `make_dashboard`: Creates a new dashboard.
* `add_dashboard_element`: Adds a tile to a dashboard.
* `health_pulse`: Test the health of a Looker instance.
* `health_analyze`: Analyze the LookML usage of a Looker instance.
* `health_vacuum`: Suggest LookML elements that can be removed.
## Looker Conversational Analytics
* `--prebuilt` value: `looker-conversational-analytics`
* **Environment Variables:**
* `LOOKER_BASE_URL`: The URL of your Looker instance.
* `LOOKER_CLIENT_ID`: The client ID for the Looker API.
* `LOOKER_CLIENT_SECRET`: The client secret for the Looker API.
* `LOOKER_VERIFY_SSL`: Whether to verify SSL certificates.
* `LOOKER_USE_CLIENT_OAUTH`: Whether to use OAuth for authentication.
* `LOOKER_PROJECT`: The GCP Project to use for Conversational Analytics.
* `LOOKER_LOCATION`: The GCP Location to use for Conversational Analytics.
* **Permissions:**
* A Looker account with permissions to access the desired models,
explores, and data is required.
* **Looker Instance User** (`roles/looker.instanceUser`): IAM role to
access Looker.
* **Gemini for Google Cloud User** (`roles/cloudaicompanion.user`): IAM
role to access Conversational Analytics.
* **Gemini Data Analytics Stateless Chat User (Beta)**
(`roles/geminidataanalytics.dataAgentStatelessUser`): IAM role to
access Conversational Analytics.
* **Tools:**
* `ask_data_insights`: Ask a question of the data.
* `get_models`: Retrieves the list of LookML models.
* `get_explores`: Retrieves the list of explores in a model.
## Microsoft SQL Server
@@ -446,10 +407,6 @@ details on how to connect your AI tools (IDEs) to databases via Toolbox and MCP.
* `list_tables`: Lists tables in the database.
* `get_query_plan`: Provides information about how MySQL executes a SQL
statement.
* `list_active_queries`: Lists ongoing queries.
* `list_tables_missing_unique_indexes`: Looks for tables that do not have
a primary or unique key constraint.
* `list_table_fragmentation`: Displays table fragmentation in MySQL.
## OceanBase

View File

@@ -119,7 +119,6 @@ sources:
kind: "bigquery"
project: "my-project-id"
# location: "US" # Optional: Specifies the location for query jobs.
# writeMode: "allowed" # One of: allowed, blocked, protected. Defaults to "allowed".
# allowedDatasets: # Optional: Restricts tool access to a specific list of datasets.
# - "my_dataset_1"
# - "other_project.my_dataset_2"
@@ -134,7 +133,6 @@ sources:
project: "my-project-id"
useClientOAuth: true
# location: "US" # Optional: Specifies the location for query jobs.
# writeMode: "allowed" # One of: allowed, blocked, protected. Defaults to "allowed".
# allowedDatasets: # Optional: Restricts tool access to a specific list of datasets.
# - "my_dataset_1"
# - "other_project.my_dataset_2"
@@ -147,6 +145,5 @@ sources:
| kind | string | true | Must be "bigquery". |
| project | string | true | Id of the Google Cloud project to use for billing and as the default project for BigQuery resources. |
| location | string | false | Specifies the location (e.g., 'us', 'asia-northeast1') in which to run the query job. This location must match the location of any tables referenced in the query. Defaults to the table's location or 'US' if the location cannot be determined. [Learn More](https://cloud.google.com/bigquery/docs/locations) |
| writeMode | string | false | Controls the write behavior for tools. `allowed` (default): All queries are permitted. `blocked`: Only `SELECT` statements are allowed for the `bigquery-execute-sql` tool. `protected`: Enables session-based execution where all tools associated with this source instance share the same [BigQuery session](https://cloud.google.com/bigquery/docs/sessions-intro). This allows for stateful operations using temporary tables (e.g., `CREATE TEMP TABLE`). For `bigquery-execute-sql`, `SELECT` statements can be used on all tables, but write operations are restricted to the session's temporary dataset. For tools like `bigquery-sql`, `bigquery-forecast`, and `bigquery-analyze-contribution`, the `writeMode` restrictions do not apply, but they will operate within the shared session. **Note:** The `protected` mode cannot be used with `useClientOAuth: true`. It is also not recommended for multi-user server environments, as all users would share the same session. A session is terminated automatically after 24 hours of inactivity or after 7 days, whichever comes first. A new session is created on the next request, and any temporary data from the previous session will be lost. |
| allowedDatasets | []string | false | An optional list of dataset IDs that tools using this source are allowed to access. If provided, any tool operation attempting to access a dataset not in this list will be rejected. To enforce this, two types of operations are also disallowed: 1) Dataset-level operations (e.g., `CREATE SCHEMA`), and 2) operations where table access cannot be statically analyzed (e.g., `EXECUTE IMMEDIATE`, `CREATE PROCEDURE`). If a single dataset is provided, it will be treated as the default for prebuilt tools. |
| useClientOAuth | bool | false | If true, forwards the client's OAuth access token from the "Authorization" header to downstream queries. **Note:** This cannot be used with `writeMode: protected`. |
| useClientOAuth | bool | false | If true, forwards the client's OAuth access token from the "Authorization" header to downstream queries. |
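As a minimal sketch tying these fields together (the source name, project ID, and dataset names below are placeholders), a restricted source that forwards the client's OAuth token might look like:
```yaml
sources:
  my-restricted-bigquery-source:
    kind: "bigquery"
    project: "my-project-id"
    useClientOAuth: true
    # location: "US" # Optional: Specifies the location for query jobs.
    allowedDatasets: # Tools using this source may only access these datasets.
      - "my_dataset_1"
      - "other_project.my_dataset_2"
```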

View File

@@ -1,57 +0,0 @@
---
title: "Cassandra"
type: docs
weight: 1
description: >
Cassandra is a NoSQL distributed database known for its horizontal scalability, distributed architecture, and flexible schema definition.
---
## About
[Cassandra][cassandra-docs] is a NoSQL distributed database. By design, NoSQL databases are lightweight, open-source, non-relational, and largely distributed. Counted among their strengths are horizontal scalability, distributed architectures, and a flexible approach to schema definition.
[cassandra-docs]: https://cassandra.apache.org/
## Available Tools
- [`cassandra-cql`](../tools/cassandra/cassandra-cql.md)
Run parameterized CQL queries in Cassandra.
## Example
```yaml
sources:
my-cassandra-source:
kind: cassandra
hosts:
- 127.0.0.1
keyspace: my_keyspace
protoVersion: 4
username: ${USER_NAME}
password: ${PASSWORD}
caPath: /path/to/ca.crt # Optional: path to CA certificate
certPath: /path/to/client.crt # Optional: path to client certificate
keyPath: /path/to/client.key # Optional: path to client key
enableHostVerification: true # Optional: enable host verification
```
{{< notice tip >}}
Use environment variable replacement with the format ${ENV_NAME}
instead of hardcoding your secrets into the configuration file.
{{< /notice >}}
## Reference
| **field** | **type** | **required** | **description** |
|------------------------|:---------:|:------------:|-------------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "cassandra". |
| hosts | string[] | true | List of IP addresses to connect to (e.g., ["192.168.1.1:9042", "192.168.1.2:9042","192.168.1.3:9042"]). The default port is 9042 if not specified. |
| keyspace | string | true | Name of the Cassandra keyspace to connect to (e.g., "my_keyspace"). |
| protoVersion | integer | false | Protocol version for the Cassandra connection (e.g., 4). |
| username | string | false | Name of the Cassandra user to connect as (e.g., "my-cassandra-user"). |
| password | string | false | Password of the Cassandra user (e.g., "my-password"). |
| caPath | string | false | Path to the CA certificate for SSL/TLS (e.g., "/path/to/ca.crt"). |
| certPath | string | false | Path to the client certificate for SSL/TLS (e.g., "/path/to/client.crt"). |
| keyPath | string | false | Path to the client key for SSL/TLS (e.g., "/path/to/client.key"). |
| enableHostVerification | boolean | false | Enable host verification for SSL/TLS (e.g., true). By default, host verification is disabled. |

View File

@@ -11,7 +11,7 @@ aliases:
## About
The `cloud-sql-admin` source provides a client to interact with the [Google
Cloud SQL Admin API](https://cloud.google.com/sql/docs/mysql/admin-api). This
Cloud SQL Admin API](https://cloud.google.com/sql/docs/mysql/admin-api/v1). This
allows tools to perform administrative tasks on Cloud SQL instances, such as
creating users and databases.
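A minimal configuration sketch (assuming the source takes no fields beyond `kind` and relies on Application Default Credentials; the source name is a placeholder):
```yaml
sources:
  my-cloud-sql-admin-source:
    kind: cloud-sql-admin
```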

View File

@@ -31,7 +31,7 @@ maintenance.
This source uses standard authentication. You will need to [create a Firebird
user][fb-users] to login to the database with.
[fb-users]: https://www.firebirdsql.org/refdocs/langrefupd25-security-sql-user-mgmt.html#langrefupd25-security-create-user
[fb-users]: https://firebirdsql.org/refdocs/langrefupd25-sql-create-user.html
## Example

View File

@@ -16,7 +16,7 @@ in the cloud, on GCP, or on premises.
## Requirements
### Looker User
### Database User
This source only uses API authentication. You will need to
[create an API user][looker-user] to login to Looker.
@@ -24,35 +24,6 @@ This source only uses API authentication. You will need to
[looker-user]:
https://cloud.google.com/looker/docs/api-auth#authentication_with_an_sdk
{{< notice note >}}
To use the Conversational Analytics API, you will need the following Google
Cloud Project APIs enabled and IAM permissions granted.
{{< /notice >}}
### API Enablement in GCP
Enable the following APIs in your Google Cloud Project:
```
gcloud services enable geminidataanalytics.googleapis.com --project=$PROJECT_ID
gcloud services enable cloudaicompanion.googleapis.com --project=$PROJECT_ID
```
### IAM Permissions in GCP
In addition to [setting the ADC for your server][set-adc], you need to ensure
the IAM identity has been given the following IAM roles (or corresponding
permissions):
- `roles/looker.instanceUser`
- `roles/cloudaicompanion.user`
- `roles/geminidataanalytics.dataAgentStatelessUser`
To initialize the Application Default Credentials, run `gcloud auth login --update-adc`
in your environment before starting MCP Toolbox.
[set-adc]: https://cloud.google.com/docs/authentication/provide-credentials-adc
## Example
```yaml
@@ -62,8 +33,6 @@ sources:
base_url: http://looker.example.com
client_id: ${LOOKER_CLIENT_ID}
client_secret: ${LOOKER_CLIENT_SECRET}
project: ${LOOKER_PROJECT}
location: ${LOOKER_LOCATION}
verify_ssl: true
timeout: 600s
```
@@ -81,8 +50,6 @@ The client id and client secret are seemingly random character sequences
assigned by the Looker server. If you are using Looker OAuth, you don't need
these settings.
The `project` and `location` fields are utilized **only** when using the conversational analytics tool.
{{< notice tip >}}
Use environment variable replacement with the format ${ENV_NAME}
instead of hardcoding your secrets into the configuration file.
@@ -97,8 +64,6 @@ instead of hardcoding your secrets into the configuration file.
| client_id | string | false | The client id assigned by Looker. |
| client_secret | string | false | The client secret assigned by Looker. |
| verify_ssl | string | false | Whether to check the ssl certificate of the server. |
| project | string | false | The project id to use in Google Cloud. |
| location | string | false | The location to use in Google Cloud. (default: us) |
| timeout | string | false | Maximum time to wait for query execution (e.g. "30s", "2m"). By default, 120s is applied. |
| use_client_oauth | string | false | Use OAuth tokens instead of client_id and client_secret. (default: false) |
| show_hidden_models | string | false | Show or hide hidden models. (default: true) |

View File

@@ -1,104 +0,0 @@
---
title: "Oracle"
type: docs
weight: 1
description: >
Oracle Database is a widely-used relational database management system.
---
## About
[Oracle Database][oracle-docs] is a multi-model database management system produced and marketed by Oracle Corporation. It is commonly used for running online transaction processing (OLTP), data warehousing (DW), and mixed (OLTP & DW) database workloads.
[oracle-docs]: https://www.oracle.com/database/
## Available Tools
- [`oracle-sql`](../tools/oracle/oracle-sql.md)
Execute pre-defined prepared SQL queries in Oracle.
- [`oracle-execute-sql`](../tools/oracle/oracle-execute-sql.md)
Run parameterized SQL queries in Oracle.
## Requirements
### Database User
This source uses standard authentication. You will need to [create an Oracle user][oracle-users] to log in to the database with the necessary permissions.
[oracle-users]:
https://docs.oracle.com/en/database/oracle/oracle-database/21/sqlrf/CREATE-USER.html
### Oracle Instant Client (OIC)
The underlying database driver requires the [Oracle Instant Client][oracle-ic] libraries to connect to the database. These libraries must be installed on the machine where the application is running.
After installing the client, ensure the library path is correctly configured for your operating system (e.g., by setting the `LD_LIBRARY_PATH` environment variable on Linux or adding the directory to the `PATH` on Windows) so the application can find the necessary files at runtime.
[oracle-ic]: https://www.oracle.com/database/technologies/instant-client/downloads.html
## Connection Methods
You can configure the connection to your Oracle database using one of the following three methods. **You should only use one method** in your source configuration.
### Basic Connection (Host/Port/Service Name)
This is the most straightforward method, where you provide the connection details as separate fields:
- `host`: The IP address or hostname of the database server.
- `port`: The port number the Oracle listener is running on (typically 1521).
- `serviceName`: The service name for the database instance you wish to connect to.
### Connection String
As an alternative, you can provide all the connection details in a single `connectionString`. This is a convenient way to consolidate the connection information. The typical format is `hostname:port/servicename`.
### TNS Alias
For environments that use a `tnsnames.ora` configuration file, you can connect using a TNS (Transparent Network Substrate) alias.
- `tnsAlias`: Specify the alias name defined in your `tnsnames.ora` file.
- `tnsAdmin` (Optional): If your configuration file is not in a standard location, you can use this field to provide the path to the directory containing it. This setting will override the `TNS_ADMIN` environment variable.
## Example
```yaml
sources:
my-oracle-source:
kind: oracle
# --- Choose one connection method ---
# 1. Host, Port, and Service Name
host: 127.0.0.1
port: 1521
serviceName: XEPDB1
# 2. Direct Connection String
connectionString: "127.0.0.1:1521/XEPDB1"
# 3. TNS Alias (requires tnsnames.ora)
tnsAlias: "MY_DB_ALIAS"
tnsAdmin: "/opt/oracle/network/admin" # Optional: overrides TNS_ADMIN env var
user: ${USER_NAME}
password: ${PASSWORD}
```
{{< notice tip >}}
Use environment variable replacement with the format ${ENV_NAME}
instead of hardcoding your secrets into the configuration file.
{{< /notice >}}
## Reference
| **field** | **type** | **required** | **description** |
|------------------|:--------:|:------------:|-----------------------------------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "oracle". |
| user | string | true | Name of the Oracle user to connect as (e.g. "my-oracle-user"). |
| password | string | true | Password of the Oracle user (e.g. "my-password"). |
| host | string | false | IP address or hostname to connect to (e.g. "127.0.0.1"). Required if not using `connectionString` or `tnsAlias`. |
| port | integer | false | Port to connect to (e.g. "1521"). Required if not using `connectionString` or `tnsAlias`. |
| serviceName | string | false | The Oracle service name of the database to connect to. Required if not using `connectionString` or `tnsAlias`. |
| connectionString | string | false | A direct connection string (e.g. "hostname:port/servicename"). Use as an alternative to `host`, `port`, and `serviceName`. |
| tnsAlias | string | false | A TNS alias from a `tnsnames.ora` file. Use as an alternative to `host`/`port` or `connectionString`. |
| tnsAdmin | string | false | Path to the directory containing the `tnsnames.ora` file. This overrides the `TNS_ADMIN` environment variable if it is set. |

View File

@@ -16,7 +16,7 @@ lists, sets, sorted sets with range queries, bitmaps, hyperloglogs, and
geospatial indexes with radius queries.
If you are new to Redis, you can find installation and getting started guides on
the [official Redis website](https://redis.io/docs/).
the [official Redis website](https://redis.io/docs/getting-started/).
## Available Tools

View File

@@ -14,8 +14,9 @@ This tool provisions a cluster with a **private IP address** within the specifie
**Permissions & APIs Required:**
Before using, ensure the following on your GCP project:
1. The [AlloyDB API](https://console.cloud.google.com/apis/library/alloydb.googleapis.com) is enabled.
2. The user or service account executing the tool has one of the following IAM roles:
1. The [AlloyDB API](https://console.cloud.google.com/apis/library/alloydb.googleapis.com) is enabled.
2. The user or service account executing the tool has one of the following IAM roles:
- `roles/alloydb.admin` (the AlloyDB Admin predefined IAM role)
- `roles/owner` (the Owner basic IAM role)

View File

@@ -15,14 +15,14 @@ or `ALLOYDB_IAM_USER`) within a specified cluster. It is compatible with
**Permissions & APIs Required:**
Before using, ensure the following on your GCP project:
1. The [AlloyDB
1. The [AlloyDB
API](https://console.cloud.google.com/apis/library/alloydb.googleapis.com)
is enabled.
2. The user or service account executing the tool has one of the following IAM
2. The user or service account executing the tool has one of the following IAM
roles:
- `roles/alloydb.admin` (the AlloyDB Admin predefined IAM role)
- `roles/owner` (the Owner basic IAM role)
- `roles/editor` (the Editor basic IAM role)
- `roles/alloydb.admin` (the AlloyDB Admin predefined IAM role)
- `roles/owner` (the Owner basic IAM role)
- `roles/editor` (the Editor basic IAM role)
The tool takes the following input parameters:

View File

@@ -39,13 +39,6 @@ It's compatible with the following sources:
insights. Can be `'NO_PRUNING'` or `'PRUNE_REDUNDANT_INSIGHTS'`. Defaults to
`'PRUNE_REDUNDANT_INSIGHTS'`.
The behavior of this tool is influenced by the `writeMode` setting on its `bigquery` source:
- **`allowed` (default) and `blocked`:** These modes do not impose any special restrictions on the `bigquery-analyze-contribution` tool.
- **`protected`:** This mode enables session-based execution. The tool will operate within the same BigQuery session as other
tools using the same source. This allows the `input_data` parameter to be a query that references temporary resources (e.g.,
`TEMP` tables) created within that session.
## Example

View File

@@ -15,32 +15,9 @@ It's compatible with the following sources:
- [bigquery](../../sources/bigquery.md)
`bigquery-execute-sql` accepts the following parameters:
- **`sql`** (required): The GoogleSQL statement to execute.
- **`dry_run`** (optional): If set to `true`, the query is validated but not run,
returning information about the execution instead. Defaults to `false`.
The behavior of this tool is influenced by the `writeMode` setting on its `bigquery` source:
- **`allowed` (default):** All SQL statements are permitted.
- **`blocked`:** Only `SELECT` statements are allowed. Any other type of statement (e.g., `INSERT`, `UPDATE`, `CREATE`) will be rejected.
- **`protected`:** This mode enables session-based execution. `SELECT` statements can be used on all tables, while write operations are allowed only for the session's temporary dataset (e.g., `CREATE TEMP TABLE ...`). This prevents modifications to permanent datasets while allowing stateful, multi-step operations within a secure session.
The tool's behavior is influenced by the `allowedDatasets` restriction on the
`bigquery` source. Similar to `writeMode`, this setting provides an additional layer of security by controlling which datasets can be accessed:
- **Without `allowedDatasets` restriction:** The tool can execute any valid GoogleSQL
query.
- **With `allowedDatasets` restriction:** Before execution, the tool performs a dry run
to analyze the query.
It will reject the query if it attempts to access any table outside the
allowed `datasets` list. To enforce this restriction, the following operations
are also disallowed:
- **Dataset-level operations** (e.g., `CREATE SCHEMA`, `ALTER SCHEMA`).
- **Unanalyzable operations** where the accessed tables cannot be determined
statically (e.g., `EXECUTE IMMEDIATE`, `CREATE PROCEDURE`, `CALL`).
> **Note:** This tool is intended for developer assistant workflows with human-in-the-loop and shouldn't be used for production agents.
`bigquery-execute-sql` takes a required `sql` input parameter and runs the SQL
statement against the configured `source`. It also supports an optional `dry_run`
parameter to validate a query without executing it.
## Example

View File

@@ -33,20 +33,6 @@ query based on the provided parameters:
- **horizon** (integer, optional): The number of future time steps you want to
predict. It defaults to 10 if not specified.
The behavior of this tool is influenced by the `writeMode` setting on its `bigquery` source:
- **`allowed` (default) and `blocked`:** These modes do not impose any special restrictions on the `bigquery-forecast` tool.
- **`protected`:** This mode enables session-based execution. The tool will operate within the same BigQuery session as other
tools using the same source. This allows the `history_data` parameter to be a query that references temporary resources (e.g.,
`TEMP` tables) created within that session.
The tool's behavior is also influenced by the `allowedDatasets` restriction on the `bigquery` source:
- **Without `allowedDatasets` restriction:** The tool can use any table or query for the `history_data` parameter.
- **With `allowedDatasets` restriction:** The tool verifies that the `history_data` parameter only accesses tables within the allowed datasets.
- If `history_data` is a table ID, the tool checks if the table's dataset is in the allowed list.
- If `history_data` is a query, the tool performs a dry run to analyze the query and rejects it if it accesses any table outside the allowed list.
## Example
```yaml

View File

@@ -15,20 +15,10 @@ It's compatible with the following sources:
- [bigquery](../../sources/bigquery.md)
`bigquery-get-table-info` accepts the following parameters:
- **`table`** (required): The name of the table for which to retrieve metadata.
- **`dataset`** (required): The dataset containing the specified table.
- **`project`** (optional): The Google Cloud project ID. If not provided, the
tool defaults to the project from the source configuration.
The tool's behavior regarding these parameters is influenced by the
`allowedDatasets` restriction on the `bigquery` source:
- **Without `allowedDatasets` restriction:** The tool can retrieve metadata for
any table specified by the `table`, `dataset`, and `project` parameters.
- **With `allowedDatasets` restriction:** Before retrieving metadata, the tool
verifies that the requested dataset is in the allowed list. If it is not, the
request is denied. If only one dataset is specified in the `allowedDatasets`
list, it will be used as the default value for the `dataset` parameter.
`bigquery-get-table-info` takes `dataset` and `table` parameters to specify
the target table. It also optionally accepts a `project` parameter to define
the Google Cloud project ID. If the `project` parameter is not provided, the
tool defaults to using the project defined in the source configuration.
## Example

View File

@@ -15,17 +15,9 @@ It's compatible with the following sources:
- [bigquery](../../sources/bigquery.md)
`bigquery-list-dataset-ids` accepts the following parameter:
- **`project`** (optional): Defines the Google Cloud project ID. If not provided,
the tool defaults to the project from the source configuration.
The tool's behavior regarding this parameter is influenced by the
`allowedDatasets` restriction on the `bigquery` source:
- **Without `allowedDatasets` restriction:** The tool can list datasets from any
project specified by the `project` parameter.
- **With `allowedDatasets` restriction:** The tool directly returns the
pre-configured list of dataset IDs from the source, and the `project`
parameter is ignored.
`bigquery-list-dataset-ids` optionally accepts a `project` parameter to define
the Google Cloud project ID. If the `project` parameter is not provided, the
tool defaults to using the project defined in the source configuration.
## Example

View File

@@ -16,15 +16,13 @@ It's compatible with the following sources:
- [bigquery](../../sources/bigquery.md)
`bigquery-list-table-ids` accepts the following parameters:
- **`dataset`** (required): Specifies the dataset from which to list table IDs.
- **`project`** (optional): Defines the Google Cloud project ID. If not provided,
the tool defaults to the project from the source configuration.
The tool's behavior regarding these parameters is influenced by the
The tool's behavior regarding these parameters is influenced by the
`allowedDatasets` restriction on the `bigquery` source:
- **Without `allowedDatasets` restriction:** The tool can list tables from any
- **Without `allowedDatasets` restriction:** The tool can list tables from any
dataset specified by the `dataset` and `project` parameters.
- **With `allowedDatasets` restriction:** Before listing tables, the tool verifies
that the requested dataset is in the allowed list. If it is not, the request is

View File

@@ -61,4 +61,4 @@ tools:
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "bigquery-search-catalog". |
| source | string | true | Name of the source the tool should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -15,11 +15,6 @@ the following sources:
- [bigquery](../../sources/bigquery.md)
The behavior of this tool is influenced by the `writeMode` setting on its `bigquery` source:
- **`allowed` (default) and `blocked`:** These modes do not impose any restrictions on the `bigquery-sql` tool. The pre-defined SQL statement will be executed as-is.
- **`protected`:** This mode enables session-based execution. The tool will operate within the same BigQuery session as other tools using the same source, allowing it to interact with temporary resources like `TEMP` tables created within that session.
### GoogleSQL
BigQuery uses [GoogleSQL][bigquery-googlesql] for querying data. The integration

View File

@@ -1,7 +0,0 @@
---
title: "Cassandra"
type: docs
weight: 1
description: >
Tools that work with Cassandra Sources.
---

View File

@@ -1,96 +0,0 @@
---
title: "cassandra-cql"
type: docs
weight: 1
description: >
A "cassandra-cql" tool executes a pre-defined CQL statement against a Cassandra
database.
aliases:
- /resources/tools/cassandra-cql
---
## About
A `cassandra-cql` tool executes a pre-defined CQL statement against a Cassandra
database. It's compatible with any of the following sources:
- [cassandra](../sources/cassandra.md)
The specified CQL statement is executed as a [prepared statement][cassandra-prepare],
and expects parameters in the CQL query to be in the form of placeholders `?`.
[cassandra-prepare]: https://docs.datastax.com/en/developer/go-driver/4.8/cql-prepared-statements/
## Example
> **Note:** This tool uses parameterized queries to prevent CQL injections.
> Query parameters can be used as substitutes for arbitrary expressions.
> Parameters cannot be used as substitutes for keyspaces, table names, column names,
> or other parts of the query.
```yaml
tools:
  search_users_by_email:
    kind: cassandra-cql
    source: my-cassandra-cluster
    statement: |
      SELECT user_id, email, first_name, last_name, created_at
      FROM users
      WHERE email = ?
    description: |
      Use this tool to retrieve specific user information by their email address.
      Takes an email address and returns user details including user ID, email,
      first name, last name, and account creation timestamp.
      Do NOT use this tool with a user ID or other identifiers.
      Example:
      {{
          "email": "user@example.com",
      }}
    parameters:
      - name: email
        type: string
        description: User's email address
```
### Example with Template Parameters
> **Note:** This tool allows direct modifications to the CQL statement,
> including keyspaces, table names, and column names. **This makes it more
> vulnerable to CQL injections**. Using basic parameters only (see above) is
> recommended for performance and safety reasons. For more details, please check
> [templateParameters](../#template-parameters).
```yaml
tools:
  list_keyspace_table:
    kind: cassandra-cql
    source: my-cassandra-cluster
    statement: |
      SELECT * FROM {{.keyspace}}.{{.tableName}};
    description: |
      Use this tool to list all information from a specific table in a keyspace.
      Example:
      {{
          "keyspace": "my_keyspace",
          "tableName": "users",
      }}
    templateParameters:
      - name: keyspace
        type: string
        description: Keyspace containing the table
      - name: tableName
        type: string
        description: Table to select from
```
## Reference
| **field** | **type** | **required** | **description** |
|--------------------|:------------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "cassandra-cql". |
| source | string | true | Name of the source the CQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |
| statement | string | true | CQL statement to execute. |
| authRequired | []string | false | List of authentication requirements for the source. |
| parameters | [parameters](../#specifying-parameters) | false | List of [parameters](../#specifying-parameters) that will be inserted into the CQL statement. |
| templateParameters | [templateParameters](../#template-parameters) | false | List of [templateParameters](../#template-parameters) that will be inserted into the CQL statement before executing prepared statement. |

View File

@@ -14,8 +14,8 @@ A `clickhouse-list-databases` tool lists all available databases in a ClickHouse
instance. It's compatible with the [clickhouse](../../sources/clickhouse.md)
source.
This tool executes the `SHOW DATABASES` command and returns a list of all
databases accessible to the configured user, making it useful for database
discovery and exploration tasks.
## Example
@@ -31,11 +31,9 @@ tools:
## Return Value
The tool returns an array of objects, where each object contains:
- `name`: The name of the database
Example response:
```json
[
{"name": "default"},

View File

@@ -57,4 +57,4 @@ Example response:
| source | string | true | Name of the ClickHouse source to list tables from. |
| description | string | true | Description of the tool that is passed to the LLM. |
| authRequired | array of string | false | Authentication services required to use this tool. |
| parameters | array of Parameter | false | Parameters for the tool (see Parameters section above). |

View File

@@ -79,4 +79,4 @@ tools:
| description | string | true | Description of the tool that is passed to the LLM. |
| statement | string | true | The SQL statement template to execute. |
| parameters | array of Parameter | false | Parameters for prepared statement values. |
| templateParameters | array of Parameter | false | Parameters for SQL statement template customization. |

View File

@@ -1,7 +0,0 @@
---
title: "Dataform"
type: docs
weight: 1
description: >
Tools that work with Dataform.
---

View File

@@ -1,51 +0,0 @@
---
title: "dataform-compile-local"
type: docs
weight: 1
description: >
A "dataform-compile-local" tool runs the `dataform compile` CLI command on a local project directory.
aliases:
- /resources/tools/dataform-compile-local
---
## About
A `dataform-compile-local` tool runs the `dataform compile` command on a local Dataform project.
It is a standalone tool and **is not** compatible with any sources.
At invocation time, the tool executes `dataform compile --json` in the specified project directory and returns the resulting JSON object from the CLI.
`dataform-compile-local` takes the following parameter:
- `project_dir` (string): The absolute or relative path to the local Dataform project directory. The server process must have read access to this path.
## Requirements
### Dataform CLI
This tool executes the `dataform` command-line interface (CLI) via a system call. You must have the **`dataform` CLI** installed and available in the server's system `PATH`.
You can typically install the CLI via `npm`:
```bash
npm install -g @dataform/cli
```
See the [official Dataform documentation](https://cloud.google.com/dataform/docs/install-dataform-cli) for more details.
## Example
```yaml
tools:
  my_dataform_compiler:
    kind: dataform-compile-local
    description: Use this tool to compile a local Dataform project.
```
## Reference
| **field** | **type** | **required** | **description** |
| :---- | :---- | :---- | :---- |
| kind | string | true | Must be "dataform-compile-local". |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -21,15 +21,14 @@ form: projects/{project}/locations/{location} and also a required `entry`
parameter which is the resource name of the entry in the following form:
projects/{project}/locations/{location}/entryGroups/{entryGroup}/entries/{entry}.
It also optionally accepts the following parameters:
- `view` - View to control which parts of an entry the service should return.
  It takes integer values from 1-4 corresponding to the view type: BASIC,
  FULL, CUSTOM, or ALL.
- `aspectTypes` - Limits the aspects returned to the provided aspect types in
the format
`projects/{project}/locations/{location}/aspectTypes/{aspectType}`. It only
works for CUSTOM view.
- `paths` - Limits the aspects returned to those associated with the provided
paths within the Entry. It only works for CUSTOM view.
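For reference, a minimal configuration sketch might look like the following; the tool and source names are illustrative, and the required and optional parameters described above are supplied at invocation time rather than in this config:

```yaml
tools:
  lookup_entry:
    kind: dataplex-lookup-entry
    source: my-dataplex-source
    description: Use this tool to look up a Dataplex Universal Catalog entry by its resource name.
```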
## Requirements
@@ -37,13 +36,13 @@ It also optionally accepts following parameters:
### IAM Permissions
Dataplex uses [Identity and Access Management (IAM)][iam-overview] to control
user and group access to Dataplex resources. Toolbox will use your
[Application Default Credentials (ADC)][adc] to authorize and authenticate when
interacting with [Dataplex][dataplex-docs].
In addition to [setting the ADC for your server][set-adc], you need to ensure
the IAM identity has been given the correct IAM permissions for the tasks you
intend to perform. See [Dataplex Universal Catalog IAM permissions][iam-permissions]
and [Dataplex Universal Catalog IAM roles][iam-roles] for more information on
applying IAM permissions and roles to an identity.
@@ -69,4 +68,4 @@ tools:
|-------------|:--------:|:------------:|----------------------------------------------------|
| kind | string | true | Must be "dataplex-lookup-entry". |
| source | string | true | Name of the source the tool should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -62,4 +62,4 @@ tools:
|-------------|:--------:|:------------:|----------------------------------------------------|
| kind | string | true | Must be "dataplex-search-aspect-types". |
| source | string | true | Name of the source the tool should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -256,7 +256,6 @@ tools:
## Error Handling
Common errors include:
- Invalid collection path
- Missing or invalid document data
- Permission denied (if Firestore security rules block the operation)

View File

@@ -8,7 +8,7 @@ aliases:
- /resources/tools/firestore-query-collection
---
## About
# About
The `firestore-query-collection` tool allows you to query Firestore collections
with filters, ordering, and limit capabilities.
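A minimal configuration sketch, assuming a Firestore source named `my-firestore-source`; the collection path, filters, ordering, and limit are supplied at invocation time:

```yaml
tools:
  query_users_collection:
    kind: firestore-query-collection
    source: my-firestore-source
    description: Use this tool to query a Firestore collection with filters, ordering, and a limit.
```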

View File

@@ -146,7 +146,6 @@ templates throughout the configuration.
## Filter Format
### Simple Filter
```json
{
"field": "age",
@@ -156,7 +155,6 @@ templates throughout the configuration.
```
### AND Filter
```json
{
"and": [
@@ -167,7 +165,6 @@ templates throughout the configuration.
```
### OR Filter
```json
{
"or": [
@@ -178,7 +175,6 @@ templates throughout the configuration.
```
### Nested Filters
```json
{
"or": [
@@ -212,7 +208,6 @@ The tool supports all Firestore native JSON value types:
### Complex Type Examples
**GeoPoint:**
```json
{
"field": "location",
@@ -227,7 +222,6 @@ The tool supports all Firestore native JSON value types:
```
**Array:**
```json
{
"field": "tags",
@@ -372,7 +366,6 @@ curl -X POST http://localhost:5000/api/tool/your-tool-name/invoke \
### Response Format
**Without analyzeQuery:**
```json
[
{
@@ -391,7 +384,6 @@ curl -X POST http://localhost:5000/api/tool/your-tool-name/invoke \
```
**With analyzeQuery:**
```json
{
"documents": [...],

View File

@@ -41,7 +41,6 @@ The tool requires Firestore's native JSON format for document data. Each field
must be wrapped with its type indicator:
### Basic Types
- **String**: `{"stringValue": "your string"}`
- **Integer**: `{"integerValue": "123"}` or `{"integerValue": 123}`
- **Double**: `{"doubleValue": 123.45}`
@@ -51,7 +50,6 @@ must be wrapped with its type indicator:
- **Timestamp**: `{"timestampValue": "2025-01-07T10:00:00Z"}` (RFC3339 format)
### Complex Types
- **GeoPoint**: `{"geoPointValue": {"latitude": 34.052235, "longitude": -118.243683}}`
- **Array**: `{"arrayValue": {"values": [{"stringValue": "item1"}, {"integerValue": "2"}]}}`
- **Map**: `{"mapValue": {"fields": {"key1": {"stringValue": "value1"}, "key2": {"booleanValue": true}}}}`
@@ -80,6 +78,7 @@ deleted. To delete a field, include it in the `updateMask` but omit it from
| source | string | true | Name of the Firestore source to update documents in. |
| description | string | true | Description of the tool that is passed to the LLM. |
## Examples
### Basic Document Update (Full Merge)
@@ -156,7 +155,6 @@ To delete fields, include them in the `updateMask` but omit them from `documentD
```
In this example:
- `name` will be updated to "John Smith"
- `temporaryField` and `obsoleteData` will be deleted from the document (they are in the mask but not in the data)
@@ -304,11 +302,8 @@ tools:
- google-oauth
- api-key
```
## Error Handling
Common errors include:
- Document not found (when using update with a non-existent document)
- Invalid document path
- Missing or invalid document data
@@ -316,7 +311,6 @@ Common errors include:
- Invalid data type conversions
## Best Practices
1. **Use update masks for precision**: When you only need to update specific fields, use the `updateMask` parameter to avoid unintended changes
2. **Always use typed values**: Every field must be wrapped with its appropriate type indicator (e.g., `{"stringValue": "text"}`)
3. **Integer values can be strings**: The tool accepts integer values as strings (e.g., `{"integerValue": "1500"}`)

View File

@@ -3,7 +3,8 @@ title: "looker-add-dashboard-element"
type: docs
weight: 1
description: >
"looker-add-dashboard-element" creates a dashboard element in the given dashboard.
"looker-add-dashboard-element" generates a Looker look in the users personal folder in
Looker
aliases:
- /resources/tools/looker-add-dashboard-element
---

View File

@@ -1,45 +0,0 @@
---
title: "looker-conversational-analytics"
type: docs
weight: 1
description: >
The "looker-conversational-analytics" tool will use the Conversational
Analytics API to analyze data from Looker
aliases:
- /resources/tools/looker-conversational-analytics
---
## About
A `looker-conversational-analytics` tool allows you to ask questions about your Looker data.
It's compatible with the following sources:
- [looker](../../sources/looker.md)
`looker-conversational-analytics` accepts two parameters:
1. `user_query_with_context`: The question asked of the Conversational Analytics system.
2. `explore_references`: A list of one to five explores that can be queried to answer the
question. The form of the entry is `[{"model": "model name", "explore": "explore name"}, ...]`
## Example
```yaml
tools:
  ask_data_insights:
    kind: looker-conversational-analytics
    source: looker-source
    description: |
      Use this tool to perform data analysis, get insights,
      or answer complex questions about the contents of specific
      Looker explores.
```
## Reference
| **field** | **type** | **required** | **description** |
|-------------|:--------:|:------------:|----------------------------------------------------|
| kind | string | true | Must be "lookerca-conversational-analytics". |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -3,7 +3,8 @@ title: "looker-get-dashboards"
type: docs
weight: 1
description: >
"looker-get-dashboards" tool searches for a saved Dashboard by name or description.
"looker-get-dashboards" searches for saved Looks in a Looker
source.
aliases:
- /resources/tools/looker-get-dashboards
---

View File

@@ -1,63 +0,0 @@
---
title: "looker-health-analyze"
type: docs
weight: 1
description: >
"looker-health-analyze" provides a set of analytical commands for a Looker instance, allowing users to analyze projects, models, and explores.
aliases:
- /resources/tools/looker-health-analyze
---
## About
The `looker-health-analyze` tool performs various analysis tasks on a Looker instance. The `action` parameter selects the type of analysis to perform:
- `projects`: Analyzes all projects or a specified project, reporting on the number of models and view files, as well as Git connection and validation status.
- `models`: Analyzes all models or a specified model, providing a count of explores, unused explores, and total query counts.
- `explores`: Analyzes all explores or a specified explore, reporting on the number of joins, unused joins, fields, unused fields, and query counts. A field is classified as **Unused** if it has not been used as a field or filter in production within the past 90 days.
## Parameters
| **field** | **type** | **required** | **description** |
| :--- | :--- | :--- | :--- |
| kind | string | true | Must be "looker-health-analyze" |
| source | string | true | Looker source name |
| action | string | true | The analysis to perform: `projects`, `models`, or `explores`. |
| project | string | false | The name of the Looker project to analyze. |
| model | string | false | The name of the Looker model to analyze. Required for the `explores` action. |
| explore | string | false | The name of the Looker explore to analyze. Required for the `explores` action. |
| timeframe | int | false | The timeframe in days to analyze. Defaults to 90. |
| min_queries | int | false | The minimum number of queries for a model or explore to be considered used. Defaults to 1. |
## Example
Analyze all models in the `thelook` project.
```yaml
tools:
  analyze-tool:
    kind: looker-health-analyze
    source: looker-source
    description: |
      Analyzes Looker projects, models, and explores.
      Specify the `action` parameter to select the type of analysis.
    parameters:
      action: models
      project: "thelook"
```
Analyze all the explores in the `ecomm` model of the `thelook` project, looking specifically at usage within the past 20 days with a usage minimum of at least 10 queries.
```yaml
tools:
  analyze-tool:
    kind: looker-health-analyze
    source: looker-source
    description: |
      Analyzes Looker projects, models, and explores.
      Specify the `action` parameter to select the type of analysis.
    parameters:
      action: explores
      project: "thelook"
      model: "ecomm"
      timeframe: 20
      min_queries: 10
```

View File

@@ -1,55 +0,0 @@
---
title: "looker-health-pulse"
type: docs
weight: 1
description: >
"looker-health-pulse" performs health checks on a Looker instance, with multiple actions available (e.g., checking database connections, dashboard performance, etc).
aliases:
- /resources/tools/looker-health-pulse
---
## About
The `looker-health-pulse` tool performs health checks on a Looker instance. The `action` parameter selects the type of check to perform:
- `check_db_connections`: Checks all database connections, runs supported tests, and reports query counts.
- `check_dashboard_performance`: Finds dashboards with slow running queries in the last 7 days.
- `check_dashboard_errors`: Lists dashboards with erroring queries in the last 7 days.
- `check_explore_performance`: Lists the slowest explores in the last 7 days and reports average query runtime.
- `check_schedule_failures`: Lists schedules that have failed in the last 7 days.
- `check_legacy_features`: Lists enabled legacy features. (*Note: this check is not available in Looker Core; running it against a Looker Core instance returns an error.*)
## Parameters
| **field** | **type** | **required** | **description** |
|---------------|:--------:|:------------:|---------------------------------------------|
| kind | string | true | Must be "looker-health-pulse" |
| source | string | true | Looker source name |
| action | string | true | The health check to perform |
## Example
```yaml
tools:
  pulse:
    kind: looker-health-pulse
    source: looker-source
    description: |
      Pulse Tool
      Performs health checks on Looker instance.
      Specify the `action` parameter to select the check.
    parameters:
      action: check_dashboard_performance
```
## Reference
| **action** | **description** |
|---------------------------|--------------------------------------------------------------------------------|
| check_db_connections | Checks all database connections and reports query counts and errors |
| check_dashboard_performance | Finds dashboards with slow queries (>30s) in the last 7 days |
| check_dashboard_errors | Lists dashboards with erroring queries in the last 7 days |
| check_explore_performance | Lists slowest explores and average query runtime |
| check_schedule_failures | Lists failed schedules in the last 7 days |
| check_legacy_features | Lists enabled legacy features |

View File

@@ -1,63 +0,0 @@
---
title: "looker-health-vacuum"
type: docs
weight: 1
description: >
"looker-health-vacuum" provides a set of commands to audit and identify unused LookML objects in a Looker instance.
aliases:
- /resources/tools/looker-health-vacuum
---
## About
The `looker-health-vacuum` tool helps you identify unused LookML objects such as models, explores, joins, and fields. The `action` parameter selects the type of vacuum to perform:
- `models`: Identifies unused explores within a model.
- `explores`: Identifies unused joins and fields within an explore.
## Parameters
| **field** | **type** | **required** | **description** |
| :--- | :--- | :--- | :--- |
| kind | string | true | Must be "looker-health-vacuum" |
| source | string | true | Looker source name |
| action | string | true | The vacuum to perform: `models` or `explores`. |
| project | string | false | The name of the Looker project to vacuum. |
| model | string | false | The name of the Looker model to vacuum. |
| explore | string | false | The name of the Looker explore to vacuum. |
| timeframe | int | false | The timeframe in days to analyze for usage. Defaults to 90. |
| min_queries | int | false | The minimum number of queries for an object to be considered used. Defaults to 1. |
## Example
Identify unused fields and joins (*in this case, objects with fewer than 1 query in the last 20 days*) in the `order_items` explore of the `thelook` model.
```yaml
tools:
  vacuum-tool:
    kind: looker-health-vacuum
    source: looker-source
    description: |
      Vacuums the Looker instance by identifying unused explores, fields, and joins.
    parameters:
      action: explores
      project: "thelook_core"
      model: "thelook"
      explore: "order_items"
      timeframe: 20
      min_queries: 1
```
Identify unused explores across all models in the `thelook_core` project.
```yaml
tools:
  vacuum-tool:
    kind: looker-health-vacuum
    source: looker-source
    description: |
      Vacuums the Looker instance by identifying unused explores, fields, and joins.
    parameters:
      action: models
      project: "thelook_core"
```

View File

@@ -1,7 +0,0 @@
---
title: "Oracle"
type: docs
weight: 1
description: >
Tools that work with Oracle Sources.
---

View File

@@ -1,31 +0,0 @@
---
title: "oracle-execute-sql"
type: docs
weight: 1
description: >
An "oracle-execute-sql" tool executes a SQL statement against an Oracle database.
aliases:
- /resources/tools/oracle-execute-sql
---
## About
An `oracle-execute-sql` tool executes a SQL statement against an Oracle
database. It's compatible with the following source:
- [oracle](../../sources/oracle.md)
`oracle-execute-sql` takes one input parameter `sql` and runs the SQL
statement against the `source`.
> **Note:** This tool is intended for developer assistant workflows with
> human-in-the-loop and shouldn't be used for production agents.
## Example
```yaml
tools:
  execute_sql_tool:
    kind: oracle-execute-sql
    source: my-oracle-instance
    description: Use this tool to execute a SQL statement.
```

View File

@@ -1,57 +0,0 @@
---
title: "oracle-sql"
type: docs
weight: 1
description: >
An "oracle-sql" tool executes a pre-defined SQL statement against an Oracle database.
aliases:
- /resources/tools/oracle-sql
---
## About
An `oracle-sql` tool executes a pre-defined SQL statement against an
Oracle database. It's compatible with the following source:
- [oracle](../../sources/oracle.md)
The specified SQL statement is executed using [prepared statements][oracle-stmt]
for security and performance. It expects parameter placeholders in the SQL query
to be in the native Oracle format (e.g., `:1`, `:2`).
[oracle-stmt]: https://docs.oracle.com/javase/tutorial/jdbc/basics/prepared.html
## Example
> **Note:** This tool uses parameterized queries to prevent SQL injections.
> Query parameters can be used as substitutes for arbitrary expressions.
> Parameters cannot be used as substitutes for identifiers, column names, table
> names, or other parts of the query.
```yaml
tools:
  search_flights_by_number:
    kind: oracle-sql
    source: my-oracle-instance
    statement: |
      SELECT * FROM flights
      WHERE airline = :1
      AND flight_number = :2
      FETCH FIRST 10 ROWS ONLY
    description: |
      Use this tool to get information for a specific flight.
      Takes an airline code and flight number and returns info on the flight.
      Do NOT use this tool with a flight id. Do NOT guess an airline code or flight number.
      Example:
      {{
          "airline": "CY",
          "flight_number": "888",
      }}
    parameters:
      - name: airline
        type: string
        description: Airline unique 2 letter identifier
      - name: flight_number
        type: string
        description: 1 to 4 digit number
```

View File

@@ -3,8 +3,7 @@ title: "postgres-list-available-extensions"
type: docs
weight: 1
description: >
The "postgres-list-available-extensions" tool retrieves all PostgreSQL
extensions available for installation on a Postgres database.
The "postgres-list-available-extensions" tool retrieves all PostgreSQL extensions available for installation on a Postgres database.
aliases:
- /resources/tools/postgres-list-available-extensions
---

View File

@@ -3,8 +3,7 @@ title: "postgres-list-installed-extensions"
type: docs
weight: 1
description: >
The "postgres-list-installed-extensions" tool retrieves all PostgreSQL
extensions installed on a Postgres database.
The "postgres-list-installed-extensions" tool retrieves all PostgreSQL extensions installed on a Postgres database.
aliases:
- /resources/tools/postgres-list-installed-extensions
---

View File

@@ -3,8 +3,7 @@ title: "postgres-list-tables"
type: docs
weight: 1
description: >
The "postgres-list-tables" tool lists schema information for all or specified
tables in a Postgres database.
The "postgres-list-tables" tool lists schema information for all or specified tables in a Postgres database.
aliases:
- /resources/tools/postgres-list-tables
---
@@ -21,12 +20,12 @@ following sources:
`postgres-list-tables` lists detailed schema information (object type, columns,
constraints, indexes, triggers, owner, comment) as JSON for user-created tables
(ordinary or partitioned). The tool takes the following input parameters: *
`table_names` (optional): Filters by a comma-separated list of names. By
default, it lists all tables in user schemas.* `output_format` (optional):
Indicate the output format of table schema. `simple` will return only the
table names, `detailed` will return the full table information. Default:
`detailed`.
(ordinary or partitioned). The tool takes the following input parameters:
* `table_names` (optional): Filters by a comma-separated list of names. By
default, it lists all tables in user schemas.
* `output_format` (optional): Indicate the output format of table schema.
`simple` will return only the table names, `detailed` will return the full
table information. Default: `detailed`.
## Example
@@ -35,8 +34,7 @@ tools:
postgres_list_tables:
kind: postgres-list-tables
source: postgres-source
description: Use this tool to retrieve schema information for all or
specified tables. Output format can be simple (only table names) or detailed.
description: Use this tool to retrieve schema information for all or specified tables. Output format can be simple (only table names) or detailed.
```
## Reference

View File

@@ -34,10 +34,9 @@ tools:
## Reference
| **field** | **type** | **required** | **description** |
|-------------|:--------:|:------------:|------------------------------------------------------------------------------------------|
| kind | string | true | Must be "spanner-execute-sql". |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |
| readOnly | bool | false | When set to `true`, the `statement` is run as a read-only transaction. Default: `false`. |
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "spanner-execute-sql". |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |
| readOnly | bool | false | When set to `true`, the `statement` is run as a read-only transaction. Default: `false`. |
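Putting these fields together, a configuration sketch might look like the following; the source name is assumed, and `readOnly: true` illustrates the optional read-only transaction mode:

```yaml
tools:
  execute_sql_tool:
    kind: spanner-execute-sql
    source: my-spanner-source
    description: Use this tool to execute a SQL statement against Spanner.
    readOnly: true
```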

View File

@@ -297,8 +297,7 @@
"setup_queries = [\n",
" # Install required extension to use the AlloyDB AI natural language support API\n",
" \"\"\"CREATE EXTENSION IF NOT EXISTS alloydb_ai_nl cascade;\"\"\",\n",
" \"\"\"CREATE EXTENSION IF NOT EXISTS google_ml_integration; \"\"\",\n",
" \n",
"\n",
" # Create schema\n",
" \"\"\"CREATE SCHEMA IF NOT EXISTS nla_demo;\"\"\",\n",
"\n",
@@ -690,18 +689,15 @@
"source": [
"async def run_queries(pool):\n",
" async with pool.connect() as db_conn:\n",
"\n",
" # Verify the generated context for the tables\n",
" for query in verify_context_queries:\n",
" response = await db_conn.execute(sqlalchemy.text(query))\n",
" print(\"Verify the context:\", response.mappings().all())\n",
"\n",
" \n",
" # Update context that needs revision\n",
" for query in update_context_queries:\n",
" await db_conn.execute(sqlalchemy.text(query))\n",
"\n",
" \n",
" # The resulting context entries in the alloydb_ai_nl.generated_schema_context_view \n",
" # view are applied to the corresponding schema objects, and the comments are overwritten.\n",
" for query in apply_generated_context_queries:\n",
@@ -771,7 +767,7 @@
},
"outputs": [],
"source": [
"version = \"0.17.0\" # x-release-please-version\n",
"version = \"0.15.0\" # x-release-please-version\n",
"! curl -L -o /content/toolbox https://storage.googleapis.com/genai-toolbox/v{version}/linux/amd64/toolbox\n",
"\n",
"# Make the binary executable\n",

View File

@@ -6,4 +6,4 @@ description: >
An end to end tutorial for building an ADK agent using the AlloyDB AI NL tool.
---
{{< ipynb "alloydb_ai_nl.ipynb" >}}

View File

@@ -123,7 +123,7 @@ In this section, we will download and install the Toolbox binary.
<!-- {x-release-please-start-version} -->
```bash
export OS="linux/amd64" # one of linux/amd64, darwin/arm64, darwin/amd64, or windows/amd64
export VERSION="0.17.0"
export VERSION="0.15.0"
curl -O https://storage.googleapis.com/genai-toolbox/v$VERSION/$OS/toolbox
```
<!-- {x-release-please-end} -->
@@ -361,4 +361,4 @@ Run the Toolbox server, pointing to the `tools.yaml` file created earlier:
- Learn more about [MCP Inspector](../../how-to/connect_via_mcp.md).
- Learn more about [Toolbox Resources](../../resources/).
- Learn more about [Toolbox How-to guides](../../how-to/).

View File

@@ -220,7 +220,7 @@
},
"outputs": [],
"source": [
"version = \"0.17.0\" # x-release-please-version\n",
"version = \"0.15.0\" # x-release-please-version\n",
"! curl -O https://storage.googleapis.com/genai-toolbox/v{version}/linux/amd64/toolbox\n",
"\n",
"# Make the binary executable\n",

View File

@@ -179,7 +179,7 @@ to use BigQuery, and then run the Toolbox server.
<!-- {x-release-please-start-version} -->
```bash
export OS="linux/amd64" # one of linux/amd64, darwin/arm64, darwin/amd64, or windows/amd64
curl -O https://storage.googleapis.com/genai-toolbox/v0.17.0/$OS/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/$OS/toolbox
```
<!-- {x-release-please-end} -->

View File

@@ -98,7 +98,7 @@ In this section, we will download Toolbox, configure our tools in a
<!-- {x-release-please-start-version} -->
```bash
export OS="linux/amd64" # one of linux/amd64, darwin/arm64, darwin/amd64, or windows/amd64
curl -O https://storage.googleapis.com/genai-toolbox/v0.17.0/$OS/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/$OS/toolbox
```
<!-- {x-release-please-end} -->

Some files were not shown because too many files have changed in this diff.