Compare commits

...

42 Commits

Author SHA1 Message Date
duwenxin99
8c6f290481 remove manifest dups 2025-07-24 15:45:12 -04:00
duwenxin99
768cbdc07c refactor manifest 2025-07-24 15:45:12 -04:00
duwenxin99
6a7edab6e6 mcp manifest change 2025-07-24 15:45:12 -04:00
duwenxin99
71b751c400 update comments 2025-07-24 15:45:12 -04:00
duwenxin99
7815dce6a0 update docs table 2025-07-24 15:45:12 -04:00
duwenxin
6b4dd50a20 fix links 2025-07-24 15:45:12 -04:00
duwenxin
7e814804aa add docs 2025-07-24 15:45:12 -04:00
Dennis Geurts
296a7a7aa8 feat: Add MongoDB insert Tools 2025-07-24 15:45:12 -04:00
duwenxin
b37aedd880 move verify code 2025-07-24 15:38:36 -04:00
duwenxin
3a19e46d63 dedup with http 2025-07-24 15:35:34 -04:00
Dennis Geurts
2c6eb4f47f add commons 2025-07-24 15:34:25 -04:00
duwenxin
8a8649b789 remove test config 2025-07-24 15:33:18 -04:00
Venkatesh Shanbhag
9159550089 feat: Add MongoDB Source 2025-07-24 15:33:05 -04:00
Wenxin Du
78e9752f62 feat: Add MongoDB delete Tools (#974)
Add MongoDB `delete` Tools:

- mongodb-delete-one
- mongodb-delete-many

---------

Co-authored-by: Venkatesh Shanbhag <91714892+theshanbhag@users.noreply.github.com>
Co-authored-by: Dennis Geurts <dennisg@dennisg.nl>
2025-07-24 15:24:24 -04:00
Wenxin Du
dfde52ca9a feat: Add MongoDB update Tools (#972)
Add MongoDB Tools:
- mongodb-update-one
- mongodb-update-many

---------
Co-authored-by: Dennis Geurts <dennisg@dennisg.nl>
2025-07-24 15:08:27 -04:00
Wenxin Du
a7474752d8 feat: Add MongoDB find Tools (#970)
Add MongoDB Tools:
- mongodb-find
- mongodb-find-one

---------
Co-authored-by: Dennis Geurts <dennisg@dennisg.nl>
Co-authored-by: Kurtis Van Gent <31518063+kurtisvg@users.noreply.github.com>
2025-07-24 14:46:29 -04:00
prernakakkar-google
be65924aa6 docs: Add documentation for alloydb control plane tools (#988)
Co-authored-by: Averi Kitsch <akitsch@google.com>
2025-07-24 11:20:46 -07:00
bitsark
59e23e1725 fix: replace 'float' with 'number' in McpManifest (#985)
According to the JSON Schema spec, there are two numeric types: integer and
number. So 'float' is not compatible with mcpcurl and mcphost.
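As a rough sketch of the idea behind the fix (the `jsonSchemaType` helper below is hypothetical, not the toolbox's actual manifest code), a manifest generator can collapse an internal `float` parameter type into the JSON Schema `number` type:

```go
package main

import "fmt"

// jsonSchemaType maps an internal parameter type to a type name that JSON
// Schema actually defines, so "float" never leaks into the MCP manifest.
func jsonSchemaType(paramType string) string {
	switch paramType {
	case "float", "number":
		return "number"
	case "integer":
		return "integer"
	default:
		return paramType // string, boolean, array, ...
	}
}

func main() {
	fmt.Println(jsonSchemaType("float"))   // number
	fmt.Println(jsonSchemaType("integer")) // integer
}
```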

Fixes #984, #797

---------

Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2025-07-24 18:03:34 +00:00
Wenxin Du
74dbd6124d feat: Add MongoDB Source (#969)
Add MongoDB Source

---------

Co-authored-by: Venkatesh Shanbhag <91714892+theshanbhag@users.noreply.github.com>
Co-authored-by: Kurtis Van Gent <31518063+kurtisvg@users.noreply.github.com>
2025-07-24 13:48:58 -04:00
prernakakkar-google
c600c30374 feat(prebuilt/mysql,prebuilt/mssql): add generic mysql and mssql prebuilt tools (#983)
Co-authored-by: Averi Kitsch <akitsch@google.com>
Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2025-07-24 22:07:07 +05:30
Yuan Teoh
ccc3498cf0 fix(tools/mysqlsql): unmarshal json data from database during invoke (#979)
The MySQL driver returns []uint8 for "JSON" columns, so we need to cast it.

Toolbox will unmarshal JSON data retrieved from the database during
invocation. If results are converted directly into strings, they are
marshaled again when returned to the user (causing double marshaling).
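A minimal sketch of that pattern, not the toolbox's actual implementation: a hypothetical `normalizeValue` helper unmarshals the driver's `[]uint8` once, so the value is marshaled only when it is returned to the user.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// normalizeValue converts a raw scan result into a value that is marshaled
// exactly once when returned to the user.
func normalizeValue(raw any) (any, error) {
	if b, ok := raw.([]uint8); ok { // JSON columns arrive as []uint8
		var v any
		if err := json.Unmarshal(b, &v); err != nil {
			return nil, fmt.Errorf("unmarshal JSON column: %w", err)
		}
		return v, nil
	}
	return raw, nil
}

func main() {
	got, err := normalizeValue([]uint8(`{"a": 1}`))
	if err != nil {
		panic(err)
	}
	fmt.Printf("%#v\n", got) // map[string]interface {}{"a":1}
}
```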

Fixes #840
2025-07-24 16:25:53 +00:00
Dr. Strangelove
f55dd6fcd0 fix: add agent tag to Looker API calls. (#966) 2025-07-24 09:19:25 -06:00
trehanshakuntG
1526b8ab8c docs: update firestore documentation (#982)
Co-authored-by: Anubhav Dhawan <anubhavdhawan@google.com>
2025-07-24 08:56:33 +00:00
prernakakkar-google
2a1b1ff787 feat: Add indexing to docs for utility (#981) 2025-07-24 13:26:20 +05:30
Averi Kitsch
86ccff0b43 docs: fix wait tool notice tag (#968) 2025-07-23 14:20:28 -07:00
Dr. Strangelove
d61e552ead chore: refactor some attributes in Looker (#965) 2025-07-23 15:10:19 -04:00
Yuan Teoh
9c289da638 chore: update instructions for adding integration test coverage (#964) 2025-07-23 18:59:11 +00:00
prernakakkar-google
0b28b72aa0 feat: Add alloydb control plane as a prebuilt config (#937) 2025-07-24 00:14:43 +05:30
prernakakkar-google
3f6ec2944e feat: Add wait for operation tool with exponential backoff (#920)
Example:

```
  alloydb-operations-get:
    kind: wait-for-operation
    source: alloydb-api-source
    method: GET
    path: /v1/projects/{{.projectId}}/locations/{{.locationId}}/operations/{{.operationId}}
    description: "Makes API call to check whether operation is done or not. This tool is run first then wait for tool. if its still in create phase trigger it after 3 minutes.  Print a message saying still not done. We will notify once its done."
    pathParams:
      - name: projectId
        type: string
        description: The dynamic path parameter
      - name: locationId
        type: string
        description: The dynamic path parameter
        default: us-central1
      - name: operationId
        type: string
        description: Operation status check for previous task

```
2025-07-24 00:04:36 +05:30
Dr. Strangelove
c67e01bcf9 feat: Looker MCP Server (#923)
Add support for Looker with the following prebuilt tools:
* get_models
* get_explores
* get_dimensions
* get_measures
* get_filters
* get_parameters
* query
* query_sql
* get_looks
* run_look

---------

Co-authored-by: duwenxin <duwenxin@google.com>
2025-07-23 18:12:06 +00:00
nester-neo4j
81d05053b2 feat: Neo4j tools enhancements - neo4j-execute-cypher (#946)
This pull request introduces a new tool for executing arbitrary Cypher
queries against a Neo4j database (`neo4j-execute-cypher`) and implements
robust query classification functionality to distinguish between read
and write operations. The changes include updates to documentation, the
addition of a query classifier, and comprehensive test coverage for the
classifier.

### Addition of `neo4j-execute-cypher` tool:

- **Documentation**: Added a new markdown file `neo4j-execute-cypher.md`
that explains the tool's functionality, usage, and configuration
options, including the ability to enforce read-only mode for security.
- **Import statement**: Registered the new tool in the `cmd/root.go`
file to make it available in the toolbox.

### Query classification functionality:

- **Query classifier implementation**: Added `QueryClassifier` in
`classifier.go`, which classifies Cypher queries into read or write
operations based on keywords, procedures, and subquery analysis. It
supports handling edge cases like nested subqueries, multi-word
keywords, and invalid syntax.
- **Test coverage**: Created extensive tests in `classifier_test.go` to
validate the classifier's behavior across various query types, including
abuse cases, subqueries, and procedure calls. Tests ensure the
classifier is robust and does not panic on malformed queries.
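For illustration only (the PR's `QueryClassifier` also handles procedures, multi-word keywords, and nested subqueries, which this hypothetical sketch does not), a whole-token keyword scan conveys the basic read/write split:

```go
package main

import (
	"fmt"
	"strings"
)

// writeKeywords is a deliberately naive keyword set; the real classifier
// also inspects procedure calls and subquery structure.
var writeKeywords = map[string]bool{
	"CREATE": true, "MERGE": true, "DELETE": true,
	"SET": true, "REMOVE": true, "DROP": true,
}

// isWriteQuery reports whether the Cypher query appears to mutate the graph.
func isWriteQuery(query string) bool {
	for _, tok := range strings.Fields(strings.ToUpper(query)) {
		if writeKeywords[tok] {
			return true
		}
	}
	return false
}

func main() {
	fmt.Println(isWriteQuery("MATCH (n:Person) RETURN n"))      // false
	fmt.Println(isWriteQuery("MERGE (n:Person {name: 'Ada'})")) // true
}
```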

---------

Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2025-07-23 17:33:44 +00:00
trehanshakuntG
15417d4e0c fix: mark database field as required in the firestore prebuilt tools (#959)
Env variables cannot be used as optional parameters, hence marking the
database field as required.
2025-07-23 10:13:12 -07:00
dishaprakash
9aa6aa079d docs: Update sample for Genkit Go using tbgenkit package (#943)
Previously, code snippets were added to the README showing how to use the new
Toolbox Go Core SDK.
Recently we released a `tbgenkit` client-side package to make
integration with Genkit Go easier.
This PR updates the code snippet to use the newer package.

---------

Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2025-07-23 11:59:47 +05:30
Anubhav Dhawan
fa3e9ac04b docs: fix type guidance for map parameter (#951)
`number` is now changed to `integer` and `float`.
2025-07-22 22:19:02 +00:00
Yuan Teoh
4ee8cfa1f4 chore: fix labels description (#957)
The `sync label` job failed during PR merge due to a parsing failure.
2025-07-22 21:53:41 +00:00
Mend Renovate
552e86bc43 chore(deps): update module google.golang.org/api to v0.243.0 (#956)
This PR contains the following updates:

| Package | Change | Age | Confidence |
|---|---|---|---|
| [google.golang.org/api](https://redirect.github.com/googleapis/google-api-go-client) | `v0.242.0` -> `v0.243.0` | [![age](https://developer.mend.io/api/mc/badges/age/go/google.golang.org%2fapi/v0.243.0?slim=true)](https://docs.renovatebot.com/merge-confidence/) | [![confidence](https://developer.mend.io/api/mc/badges/confidence/go/google.golang.org%2fapi/v0.242.0/v0.243.0?slim=true)](https://docs.renovatebot.com/merge-confidence/) |

---

### Release Notes

<details>
<summary>googleapis/google-api-go-client
(google.golang.org/api)</summary>

###
[`v0.243.0`](https://redirect.github.com/googleapis/google-api-go-client/releases/tag/v0.243.0)

[Compare
Source](https://redirect.github.com/googleapis/google-api-go-client/compare/v0.242.0...v0.243.0)

##### Features

- **all:** Auto-regenerate discovery clients ([#3233](https://redirect.github.com/googleapis/google-api-go-client/issues/3233)) ([a269dca](a269dca39e))
- **all:** Auto-regenerate discovery clients ([#3235](https://redirect.github.com/googleapis/google-api-go-client/issues/3235)) ([b656000](b656000d19))
- **all:** Auto-regenerate discovery clients ([#3236](https://redirect.github.com/googleapis/google-api-go-client/issues/3236)) ([971135a](971135a022))
- **all:** Auto-regenerate discovery clients ([#3237](https://redirect.github.com/googleapis/google-api-go-client/issues/3237)) ([be7e601](be7e601ced))
- **all:** Auto-regenerate discovery clients ([#3239](https://redirect.github.com/googleapis/google-api-go-client/issues/3239)) ([b2202ca](b2202ca571))
- **all:** Auto-regenerate discovery clients ([#3240](https://redirect.github.com/googleapis/google-api-go-client/issues/3240)) ([ceceb79](ceceb79c86))

##### Bug Fixes

- **gensupport:** Update chunk upload logic for robust timeout handling. ([#3208](https://redirect.github.com/googleapis/google-api-go-client/issues/3208)) ([93865aa](93865aac32))

</details>

2025-07-22 21:21:50 +00:00
Kurtis Van Gent
1387f858b6 chore: update labels.yaml (#901) 2025-07-22 19:27:45 +00:00
Twisha Bansal
481cc608ba fix: fix document preview pipeline for forked PRs (#950)
The checkout action uses `github.sha` as the ref by default. For forked
repos, this refers to the last commit on the main branch
([ref](https://docs.github.com/en/actions/reference/events-that-trigger-workflows#fork)).

This means that the docs preview pipeline when run on forked PRs does
not allow us to preview changes made in the PR.

`github.event.pull_request.head.sha` allows us to check out the latest
commit on the forked PR.
2025-07-22 21:59:44 +05:30
Niraj Nandre
d16728e5c6 fix: correct source reference for execute_sql tool in internal/prebuiltconfigs/tools/cloud-sql-mssql.yaml (#938)
### What this PR does:

This Pull Request resolves a critical configuration error in the
`cloud-sql-mssql.yaml` prebuilt tool definition, which was preventing
the `execute_sql` tool from correctly initializing and connecting to a
Cloud SQL for SQL Server instance.

### Problem:

Previously, the `source` field in
`internal/prebuiltconfigs/tools/cloud-sql-mssql.yaml` was incorrectly
configured as `cloud-sql-mssql-source`. This led to the `genai-toolbox`
server failing to find the corresponding data source, resulting in
errors like:
2025-07-22 15:09:09 +00:00
Anushka Saxena
49b1562f73 docs: add llamaindex integration to js quickstart (#931)
### Description

Enhance the [JavaScript
Quickstart](https://googleapis.github.io/genai-toolbox/getting-started/local_quickstart_js/)
by adding a new tab dedicated to integrating the MCP Toolbox with
LlamaIndex. This provides users with a direct, runnable example of how
to utilize Toolbox's capabilities within a LlamaIndex-powered LLM agent
for hotel search and booking operations.

The changes include:
- Addition of a new `LlamaIndex` tab in `docs/quickstarts/js-local.md`.
- Corresponding `hotelAgent.js` code snippet for LlamaIndex integration.

### Related Issue

This PR addresses and resolves #930.

---------

Signed-off-by: Anushka Saxena <anushka.saxena-it-2k19-05@iips.edu.in>
Co-authored-by: Twisha Bansal <58483338+twishabansal@users.noreply.github.com>
2025-07-22 16:20:33 +05:30
Anushka Saxena
a1def43b35 docs: add available tools for each source (#914)
- This PR adds an `"Available Tools"` section under each source page in
the
[documentation](https://googleapis.github.io/genai-toolbox/resources/sources/).
- The purpose is to help users quickly identify relevant tools
compatible with each data source, improving discoverability and
developer experience.

---------

Signed-off-by: Anushka Saxena <anushkasaxenaa@google.com>
Co-authored-by: Twisha Bansal <58483338+twishabansal@users.noreply.github.com>
2025-07-22 15:53:39 +05:30
dishaprakash
480d76dfff docs: add Toolbox Go SDK quickstart (#941)
docs: Add Toolbox Go SDK Quickstart
2025-07-22 13:51:49 +05:30
132 changed files with 13053 additions and 155 deletions

View File

@@ -445,8 +445,49 @@ steps:
"Firestore" \
firestore \
firestore
- id: "looker"
name: golang:1
waitFor: ["compile-test-binary"]
entrypoint: /bin/bash
env:
- "GOPATH=/gopath"
- "FIRESTORE_PROJECT=$PROJECT_ID"
- "SERVICE_ACCOUNT_EMAIL=$SERVICE_ACCOUNT_EMAIL"
- "LOOKER_VERIFY_SSL=$_LOOKER_VERIFY_SSL"
secretEnv: ["CLIENT_ID", "LOOKER_BASE_URL", "LOOKER_CLIENT_ID", "LOOKER_CLIENT_SECRET"]
volumes:
- name: "go"
path: "/gopath"
args:
- -c
- |
.ci/test_with_coverage.sh \
"Looker" \
looker \
looker
- id: "alloydbwaitforoperation"
name: golang:1
waitFor: ["compile-test-binary"]
entrypoint: /bin/bash
env:
- "GOPATH=/gopath"
- "API_KEY=$(gcloud auth print-access-token)"
secretEnv: ["CLIENT_ID"]
volumes:
- name: "go"
path: "/gopath"
args:
- -c
- |
.ci/test_with_coverage.sh \
"Alloydb Wait for Operation" \
utility \
utility/alloydbwaitforoperation
availableSecrets:
secretManager:
- versionName: projects/$PROJECT_ID/secrets/cloud_sql_pg_user/versions/latest
@@ -499,6 +540,12 @@ availableSecrets:
env: REDIS_PASS
- versionName: projects/$PROJECT_ID/secrets/memorystore_valkey_address/versions/latest
env: VALKEY_ADDRESS
- versionName: projects/107716898620/secrets/looker_base_url/versions/latest
env: LOOKER_BASE_URL
- versionName: projects/107716898620/secrets/looker_client_id/versions/latest
env: LOOKER_CLIENT_ID
- versionName: projects/107716898620/secrets/looker_client_secret/versions/latest
env: LOOKER_CLIENT_SECRET
options:
@@ -530,4 +577,5 @@ substitutions:
_MSSQL_PORT: "1433"
_DGRAPHURL: "https://play.dgraph.io"
_COUCHBASE_BUCKET: "couchbase-bucket"
_COUCHBASE_SCOPE: "couchbase-scope"
_COUCHBASE_SCOPE: "couchbase-scope"
_LOOKER_VERIFY_SSL: "true"

View File

@@ -2,8 +2,10 @@
# Arguments:
# $1: Display name for logs (e.g., "Cloud SQL Postgres")
# $2: Source package name (e.g., cloudsqlpg)
# $3, $4, ...: Tool package names for grep (e.g., postgressql)
# $2: Integration test's package name (e.g., cloudsqlpg)
# $3, $4, ...: Tool package names for grep (e.g., postgressql), if the
# integration test specifically check a separate package inside a folder, please
# specify the full path instead (e.g., postgressql/postgresexecutesql)
DISPLAY_NAME="$1"
SOURCE_PACKAGE_NAME="$2"
@@ -57,4 +59,4 @@ if awk -v coverage="$coverage_numeric" 'BEGIN {exit !(coverage < 50)}'; then
exit 1
else
echo "Coverage for ${DISPLAY_NAME} is sufficient."
fi
fi

24
.github/labels.yaml vendored
View File

@@ -50,7 +50,7 @@
color: ffffc7
description: Desirable enhancement or fix. May not be included in next release.
- name: do not merge
- name: 'do not merge'
color: d93f0b
description: Indicates a pull request not ready for merge, due to either quality
or timing.
@@ -65,28 +65,26 @@
color: ededed
description: Release please has completed a release for this.
- name: 'blunderbuss: assign'
color: 3DED97
description: Have blunderbuss assign this to someone new.
- name: 'tests: run'
color: 3DED97
description: Label to trigger Github Action tests.
- name: 'docs: deploy-preview'
color: BFDADC
description: Label to trigger Github Action docs preview.
- name: 'status: contribution welcome'
- name: 'status: help wanted'
color: 8befd7
description: Status - Contributions welcome.
- name: 'status: awaiting response'
description: 'Status: Unplanned work open to contributions from the community.'
- name: 'status: feedback wanted'
color: 8befd7
description: Status - Awaiting response from author.
- name: 'status: awaiting codeowners'
color: 8befd7
description: Status - Awaiting response from code owners.
description: 'Status: waiting for feedback from community or issue author.'
# Product Labels
- name: 'product: bigquery'
color: 5065c7
description: Product - Assigned to the BigQuery team.
description: 'Product: Assigned to the BigQuery team.'

View File

@@ -21,6 +21,7 @@ extraFiles: [
"docs/en/getting-started/introduction/_index.md",
"docs/en/getting-started/local_quickstart.md",
"docs/en/getting-started/local_quickstart_js.md",
"docs/en/getting-started/local_quickstart_go.md",
"docs/en/getting-started/mcp_quickstart/_index.md",
"docs/en/samples/bigquery/local_quickstart.md",
"docs/en/samples/bigquery/mcp_quickstart/_index.md",

View File

@@ -51,6 +51,8 @@ jobs:
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
# Checkout the PR's HEAD commit (supports forks).
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0 # Fetch all history for .GitInfo and .Lastmod
- name: Setup Hugo

7
.gitignore vendored
View File

@@ -4,6 +4,9 @@
# vscode
.vscode/
# idea
.idea/
# npm
node_modules
@@ -14,3 +17,7 @@ node_modules
# coverage
.coverage
# executable
genai-toolbox
toolbox

View File

@@ -490,6 +490,7 @@ For more detailed instructions on using the Toolbox Core SDK, see the
"github.com/firebase/genkit/go/ai"
"github.com/firebase/genkit/go/genkit"
"github.com/googleapis/mcp-toolbox-sdk-go/core"
"github.com/googleapis/mcp-toolbox-sdk-go/tbgenkit"
"github.com/invopop/jsonschema"
)
@@ -505,30 +506,12 @@ For more detailed instructions on using the Toolbox Core SDK, see the
// Framework agnostic tool
tool, err := client.LoadTool("toolName", ctx)
// Fetch the tool's input schema
inputschema, err := tool.InputSchema()
var schema *jsonschema.Schema
_ = json.Unmarshal(inputschema, &schema)
executeFn := func(ctx *ai.ToolContext, input any) (string, error) {
result, err := tool.Invoke(ctx, input.(map[string]any))
if err != nil {
// Propagate errors from the tool invocation.
return "", err
}
return result.(string), nil
}
// Convert the tool using the tbgenkit package
// Use this tool with Genkit Go
genkitTool := genkit.DefineToolWithInputSchema(
g,
tool.Name(),
tool.Description(),
schema,
executeFn,
)
genkitTool, err := tbgenkit.ToGenkitTool(tool, g)
if err != nil {
log.Fatalf("Failed to convert tool: %v\n", err)
}
}
```

View File

@@ -59,17 +59,37 @@ import (
_ "github.com/googleapis/genai-toolbox/internal/tools/firestore/firestorequerycollection"
_ "github.com/googleapis/genai-toolbox/internal/tools/firestore/firestorevalidaterules"
_ "github.com/googleapis/genai-toolbox/internal/tools/http"
_ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookergetdimensions"
_ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookergetexplores"
_ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookergetfilters"
_ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookergetlooks"
_ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookergetmeasures"
_ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookergetmodels"
_ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookergetparameters"
_ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookerquery"
_ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookerquerysql"
_ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookerrunlook"
_ "github.com/googleapis/genai-toolbox/internal/tools/mongodb/mongodbdeletemany"
_ "github.com/googleapis/genai-toolbox/internal/tools/mongodb/mongodbdeleteone"
_ "github.com/googleapis/genai-toolbox/internal/tools/mongodb/mongodbfind"
_ "github.com/googleapis/genai-toolbox/internal/tools/mongodb/mongodbfindone"
_ "github.com/googleapis/genai-toolbox/internal/tools/mongodb/mongodbinsertmany"
_ "github.com/googleapis/genai-toolbox/internal/tools/mongodb/mongodbinsertone"
_ "github.com/googleapis/genai-toolbox/internal/tools/mongodb/mongodbupdatemany"
_ "github.com/googleapis/genai-toolbox/internal/tools/mongodb/mongodbupdateone"
_ "github.com/googleapis/genai-toolbox/internal/tools/mssql/mssqlexecutesql"
_ "github.com/googleapis/genai-toolbox/internal/tools/mssql/mssqlsql"
_ "github.com/googleapis/genai-toolbox/internal/tools/mysql/mysqlexecutesql"
_ "github.com/googleapis/genai-toolbox/internal/tools/mysql/mysqlsql"
_ "github.com/googleapis/genai-toolbox/internal/tools/neo4j/neo4jcypher"
_ "github.com/googleapis/genai-toolbox/internal/tools/neo4j/neo4jexecutecypher"
_ "github.com/googleapis/genai-toolbox/internal/tools/postgres/postgresexecutesql"
_ "github.com/googleapis/genai-toolbox/internal/tools/postgres/postgressql"
_ "github.com/googleapis/genai-toolbox/internal/tools/redis"
_ "github.com/googleapis/genai-toolbox/internal/tools/spanner/spannerexecutesql"
_ "github.com/googleapis/genai-toolbox/internal/tools/spanner/spannersql"
_ "github.com/googleapis/genai-toolbox/internal/tools/sqlitesql"
_ "github.com/googleapis/genai-toolbox/internal/tools/utility/alloydbwaitforoperation"
_ "github.com/googleapis/genai-toolbox/internal/tools/utility/wait"
_ "github.com/googleapis/genai-toolbox/internal/tools/valkey"
@@ -85,6 +105,8 @@ import (
_ "github.com/googleapis/genai-toolbox/internal/sources/dgraph"
_ "github.com/googleapis/genai-toolbox/internal/sources/firestore"
_ "github.com/googleapis/genai-toolbox/internal/sources/http"
_ "github.com/googleapis/genai-toolbox/internal/sources/looker"
_ "github.com/googleapis/genai-toolbox/internal/sources/mongodb"
_ "github.com/googleapis/genai-toolbox/internal/sources/mssql"
_ "github.com/googleapis/genai-toolbox/internal/sources/mysql"
_ "github.com/googleapis/genai-toolbox/internal/sources/neo4j"
@@ -190,7 +212,7 @@ func NewCommand(opts ...Option) *Command {
flags.BoolVar(&cmd.cfg.TelemetryGCP, "telemetry-gcp", false, "Enable exporting directly to Google Cloud Monitoring.")
flags.StringVar(&cmd.cfg.TelemetryOTLP, "telemetry-otlp", "", "Enable exporting using OpenTelemetry Protocol (OTLP) to the specified endpoint (e.g. 'http://127.0.0.1:4318')")
flags.StringVar(&cmd.cfg.TelemetryServiceName, "telemetry-service-name", "toolbox", "Sets the value of the service.name resource attribute for telemetry data.")
flags.StringVar(&cmd.prebuiltConfig, "prebuilt", "", "Use a prebuilt tool configuration by source type. Cannot be used with --tools-file. Allowed: 'alloydb-postgres', 'bigquery', 'cloud-sql-mysql', 'cloud-sql-postgres', 'cloud-sql-mssql', 'postgres', 'spanner', 'spanner-postgres'.")
flags.StringVar(&cmd.prebuiltConfig, "prebuilt", "", "Use a prebuilt tool configuration by source type. Cannot be used with --tools-file. Allowed: 'alloydb-postgres-admin', alloydb-postgres', 'bigquery', 'cloud-sql-mysql', 'cloud-sql-postgres', 'cloud-sql-mssql', 'firestore', 'mssql', 'mysql', 'postgres', 'spanner', 'spanner-postgres'.")
flags.BoolVar(&cmd.cfg.Stdio, "stdio", false, "Listens via MCP STDIO instead of acting as a remote HTTP server.")
flags.BoolVar(&cmd.cfg.DisableReload, "disable-reload", false, "Disables dynamic reloading of tools file.")

View File

@@ -1161,12 +1161,16 @@ func TestSingleEdit(t *testing.T) {
}
func TestPrebuiltTools(t *testing.T) {
alloydb_admin_config, _ := prebuiltconfigs.Get("alloydb-postgres-admin")
alloydb_config, _ := prebuiltconfigs.Get("alloydb-postgres")
bigquery_config, _ := prebuiltconfigs.Get("bigquery")
cloudsqlpg_config, _ := prebuiltconfigs.Get("cloud-sql-postgres")
cloudsqlmysql_config, _ := prebuiltconfigs.Get("cloud-sql-mysql")
cloudsqlmssql_config, _ := prebuiltconfigs.Get("cloud-sql-mssql")
firestoreconfig, _ := prebuiltconfigs.Get("firestore")
mysql_config, _ := prebuiltconfigs.Get("mysql")
mssql_config, _ := prebuiltconfigs.Get("mssql")
looker_config, _ := prebuiltconfigs.Get("looker")
postgresconfig, _ := prebuiltconfigs.Get("postgres")
spanner_config, _ := prebuiltconfigs.Get("spanner")
spannerpg_config, _ := prebuiltconfigs.Get("spanner-postgres")
@@ -1179,6 +1183,16 @@ func TestPrebuiltTools(t *testing.T) {
in []byte
wantToolset server.ToolsetConfigs
}{
{
name: "alloydb postgres admin prebuilt tools",
in: alloydb_admin_config,
wantToolset: server.ToolsetConfigs{
"alloydb-postgres-admin-tools": tools.ToolsetConfig{
Name: "alloydb-postgres-admin-tools",
ToolNames: []string{"alloydb-create-cluster", "alloydb-operations-get", "alloydb-create-instance"},
},
},
},
{
name: "alloydb prebuilt tools",
in: alloydb_config,
@@ -1239,6 +1253,36 @@ func TestPrebuiltTools(t *testing.T) {
},
},
},
{
name: "mysql prebuilt tools",
in: mysql_config,
wantToolset: server.ToolsetConfigs{
"mysql-database-tools": tools.ToolsetConfig{
Name: "mysql-database-tools",
ToolNames: []string{"execute_sql", "list_tables"},
},
},
},
{
name: "mssql prebuilt tools",
in: mssql_config,
wantToolset: server.ToolsetConfigs{
"mssql-database-tools": tools.ToolsetConfig{
Name: "mssql-database-tools",
ToolNames: []string{"execute_sql", "list_tables"},
},
},
},
{
name: "looker prebuilt tools",
in: looker_config,
wantToolset: server.ToolsetConfigs{
"looker-tools": tools.ToolsetConfig{
Name: "looker-tools",
ToolNames: []string{"get_models", "get_explores", "get_dimensions", "get_measures", "get_filters", "get_parameters", "query", "query_sql", "get_looks", "run_look"},
},
},
},
{
name: "postgres prebuilt tools",
in: postgresconfig,

View File

@@ -1,7 +1,7 @@
---
title: "Configuration"
type: docs
weight: 4
weight: 6
description: >
How to configure Toolbox's tools.yaml file.
---

View File

@@ -421,6 +421,7 @@ import (
"github.com/firebase/genkit/go/ai"
"github.com/firebase/genkit/go/genkit"
"github.com/googleapis/mcp-toolbox-sdk-go/core"
"github.com/googleapis/mcp-toolbox-sdk-go/tbgenkit"
"github.com/invopop/jsonschema"
)
@@ -442,33 +443,12 @@ func main() {
log.Fatalf("Failed to load tools: %v", err)
}
// Fetch the tool's input schema
inputschema, err := tool.InputSchema()
// Convert the tool using the tbgenkit package
// Use this tool with Genkit Go
genkitTool, err := tbgenkit.ToGenkitTool(tool, g)
if err != nil {
log.Fatalf("Failed to fetch inputSchema: %v", err)
log.Fatalf("Failed to convert tool: %v\n", err)
}
var schema *jsonschema.Schema
_ = json.Unmarshal(inputschema, &schema)
executeFn := func(ctx *ai.ToolContext, input any) (string, error) {
result, err := tool.Invoke(ctx, input.(map[string]any))
if err != nil {
// Propagate errors from the tool invocation.
return "", err
}
return result.(string), nil
}
// Use this tool with Genkit Go
genkitTool := genkit.DefineToolWithInputSchema(
g,
tool.Name(),
tool.Description(),
schema,
executeFn,
)
}
{{< /highlight >}}
@@ -585,4 +565,4 @@ func main() {
For more detailed instructions on using the Toolbox Go SDK, see the
[project's README](https://github.com/googleapis/mcp-toolbox-sdk-go/blob/main/core/README.md).
For end-to-end samples on using the Toolbox Go SDK with orchestration frameworks, see the [project's samples](https://github.com/googleapis/mcp-toolbox-sdk-go/tree/main/core/samples)
For end-to-end samples on using the Toolbox Go SDK with orchestration frameworks, see the [project's samples](https://github.com/googleapis/mcp-toolbox-sdk-go/tree/main/core/samples)

View File

@@ -0,0 +1,921 @@
---
title: "Go Quickstart (Local)"
type: docs
weight: 4
description: >
How to get started running Toolbox locally with [Go](https://github.com/googleapis/mcp-toolbox-sdk-go), PostgreSQL, and orchestration frameworks such as [LangChain Go](https://tmc.github.io/langchaingo/docs/), [GenkitGo](https://genkit.dev/go/docs/get-started-go/), [Go GenAI](https://github.com/googleapis/go-genai) and [OpenAI Go](https://github.com/openai/openai-go).
---
## Before you begin
This guide assumes you have already done the following:
1. Installed [Go (v1.24.2 or higher)].
1. Installed [PostgreSQL 16+ and the `psql` client][install-postgres].
### Cloud Setup (Optional)
If you plan to use **Google Cloud's Vertex AI** with your agent (e.g., using Gemini or PaLM models), follow these one-time setup steps:
1. [Install the Google Cloud CLI]
1. [Set up Application Default Credentials (ADC)]
1. Set your project and enable Vertex AI
```bash
gcloud config set project YOUR_PROJECT_ID
gcloud services enable aiplatform.googleapis.com
```
[Go (v1.24.2 or higher)]: https://go.dev/doc/install
[install-postgres]: https://www.postgresql.org/download/
[Install the Google Cloud CLI]: https://cloud.google.com/sdk/docs/install
[Set up Application Default Credentials (ADC)]: https://cloud.google.com/docs/authentication/set-up-adc-local-dev-environment
## Step 1: Set up your database
In this section, we will create a database, insert some data that needs to be
accessed by our agent, and create a database user for Toolbox to connect with.
1. Connect to postgres using the `psql` command:
```bash
psql -h 127.0.0.1 -U postgres
```
Here, `postgres` denotes the default postgres superuser.
{{< notice info >}}
#### **Having trouble connecting?**
* **Password Prompt:** If you are prompted for a password for the `postgres`
user and do not know it (or a blank password doesn't work), your PostgreSQL
installation might require a password or a different authentication method.
* **`FATAL: role "postgres" does not exist`:** This error means the default
`postgres` superuser role isn't available under that name on your system.
* **`Connection refused`:** Ensure your PostgreSQL server is actually running.
You can typically check with `sudo systemctl status postgresql` and start it
with `sudo systemctl start postgresql` on Linux systems.
<br/>
#### **Common Solution**
For password issues or if the `postgres` role seems inaccessible directly, try
switching to the `postgres` operating system user first. This user often has
permission to connect without a password for local connections (this is called
peer authentication).
```bash
sudo -i -u postgres
psql -h 127.0.0.1
```
Once you are in the `psql` shell using this method, you can proceed with the
database creation steps below. Afterwards, type `\q` to exit `psql`, and then
`exit` to return to your normal user shell.
If desired, once connected to `psql` as the `postgres` OS user, you can set a
password for the `postgres` *database* user using: `ALTER USER postgres WITH
PASSWORD 'your_chosen_password';`. This would allow direct connection with `-U
postgres` and a password next time.
{{< /notice >}}
1. Create a new database and a new user:
{{< notice tip >}}
For a real application, it's best to follow the principle of least permission
and only grant the privileges your application needs.
{{< /notice >}}
```sql
CREATE USER toolbox_user WITH PASSWORD 'my-password';
CREATE DATABASE toolbox_db;
GRANT ALL PRIVILEGES ON DATABASE toolbox_db TO toolbox_user;
ALTER DATABASE toolbox_db OWNER TO toolbox_user;
```
1. End the database session:
```bash
\q
```
(If you used `sudo -i -u postgres` and then `psql`, remember you might also
need to type `exit` after `\q` to leave the `postgres` user's shell
session.)
1. Connect to your database with your new user:
```bash
psql -h 127.0.0.1 -U toolbox_user -d toolbox_db
```
1. Create a table using the following command:
```sql
CREATE TABLE hotels(
id INTEGER NOT NULL PRIMARY KEY,
name VARCHAR NOT NULL,
location VARCHAR NOT NULL,
price_tier VARCHAR NOT NULL,
checkin_date DATE NOT NULL,
checkout_date DATE NOT NULL,
booked BIT NOT NULL
);
```
1. Insert data into the table.
```sql
INSERT INTO hotels(id, name, location, price_tier, checkin_date, checkout_date, booked)
VALUES
(1, 'Hilton Basel', 'Basel', 'Luxury', '2024-04-22', '2024-04-20', B'0'),
(2, 'Marriott Zurich', 'Zurich', 'Upscale', '2024-04-14', '2024-04-21', B'0'),
(3, 'Hyatt Regency Basel', 'Basel', 'Upper Upscale', '2024-04-02', '2024-04-20', B'0'),
(4, 'Radisson Blu Lucerne', 'Lucerne', 'Midscale', '2024-04-24', '2024-04-05', B'0'),
(5, 'Best Western Bern', 'Bern', 'Upper Midscale', '2024-04-23', '2024-04-01', B'0'),
(6, 'InterContinental Geneva', 'Geneva', 'Luxury', '2024-04-23', '2024-04-28', B'0'),
(7, 'Sheraton Zurich', 'Zurich', 'Upper Upscale', '2024-04-27', '2024-04-02', B'0'),
(8, 'Holiday Inn Basel', 'Basel', 'Upper Midscale', '2024-04-24', '2024-04-09', B'0'),
(9, 'Courtyard Zurich', 'Zurich', 'Upscale', '2024-04-03', '2024-04-13', B'0'),
(10, 'Comfort Inn Bern', 'Bern', 'Midscale', '2024-04-04', '2024-04-16', B'0');
```
1. End the database session:
```bash
\q
```
## Step 2: Install and configure Toolbox
In this section, we will download Toolbox, configure our tools in a
`tools.yaml`, and then run the Toolbox server.
1. Download the latest version of Toolbox as a binary:
{{< notice tip >}}
Select the
[correct binary](https://github.com/googleapis/genai-toolbox/releases)
corresponding to your OS and CPU architecture.
{{< /notice >}}
<!-- {x-release-please-start-version} -->
```bash
export OS="linux/amd64" # one of linux/amd64, darwin/arm64, darwin/amd64, or windows/amd64
curl -O https://storage.googleapis.com/genai-toolbox/v0.9.0/$OS/toolbox
```
<!-- {x-release-please-end} -->
1. Make the binary executable:
```bash
chmod +x toolbox
```
1. Write the following into a `tools.yaml` file. Be sure to update any fields
such as `user`, `password`, or `database` that you may have customized in the
previous step.
{{< notice tip >}}
In practice, use environment variable replacement with the format ${ENV_NAME}
instead of hardcoding your secrets into the configuration file.
{{< /notice >}}
```yaml
sources:
my-pg-source:
kind: postgres
host: 127.0.0.1
port: 5432
database: toolbox_db
user: ${USER_NAME}
password: ${PASSWORD}
tools:
search-hotels-by-name:
kind: postgres-sql
source: my-pg-source
description: Search for hotels based on name.
parameters:
- name: name
type: string
description: The name of the hotel.
statement: SELECT * FROM hotels WHERE name ILIKE '%' || $1 || '%';
search-hotels-by-location:
kind: postgres-sql
source: my-pg-source
description: Search for hotels based on location.
parameters:
- name: location
type: string
description: The location of the hotel.
statement: SELECT * FROM hotels WHERE location ILIKE '%' || $1 || '%';
book-hotel:
kind: postgres-sql
source: my-pg-source
description: >-
Book a hotel by its ID. If the hotel is successfully booked, returns a NULL, raises an error if not.
parameters:
- name: hotel_id
type: string
description: The ID of the hotel to book.
statement: UPDATE hotels SET booked = B'1' WHERE id = $1;
update-hotel:
kind: postgres-sql
source: my-pg-source
description: >-
Update a hotel's check-in and check-out dates by its ID. Returns a message
indicating whether the hotel was successfully updated or not.
parameters:
- name: hotel_id
type: string
description: The ID of the hotel to update.
- name: checkin_date
type: string
description: The new check-in date of the hotel.
- name: checkout_date
type: string
description: The new check-out date of the hotel.
statement: >-
UPDATE hotels SET checkin_date = CAST($2 as date), checkout_date = CAST($3
as date) WHERE id = $1;
cancel-hotel:
kind: postgres-sql
source: my-pg-source
description: Cancel a hotel by its ID.
parameters:
- name: hotel_id
type: string
description: The ID of the hotel to cancel.
statement: UPDATE hotels SET booked = B'0' WHERE id = $1;
toolsets:
my-toolset:
- search-hotels-by-name
- search-hotels-by-location
- book-hotel
- update-hotel
- cancel-hotel
```
For more info on tools, check out the `Resources` section of the docs.
1. Run the Toolbox server, pointing to the `tools.yaml` file created earlier:
```bash
./toolbox --tools-file "tools.yaml"
```
{{< notice note >}}
Toolbox enables dynamic reloading by default. To disable, use the `--disable-reload` flag.
{{< /notice >}}
## Step 3: Connect your agent to Toolbox
In this section, we will write and run an agent that will load the Tools
from Toolbox.
1. Initialize a go module:
```bash
go mod init main
```
1. In a new terminal, install the [SDK](https://pkg.go.dev/github.com/googleapis/mcp-toolbox-sdk-go).
```bash
go get github.com/googleapis/mcp-toolbox-sdk-go
```
1. Create a new file named `hotelagent.go` and copy the following code to create an agent:
{{< tabpane persist=header >}}
{{< tab header="LangChain Go" lang="go" >}}
package main
import (
"context"
"encoding/json"
"fmt"
"log"
"os"
"github.com/googleapis/mcp-toolbox-sdk-go/core"
"github.com/tmc/langchaingo/llms"
"github.com/tmc/langchaingo/llms/googleai"
)
// ConvertToLangchainTool converts a generic core.ToolboxTool into a LangChainGo llms.Tool.
func ConvertToLangchainTool(toolboxTool *core.ToolboxTool) llms.Tool {
// Fetch the tool's input schema
inputschema, err := toolboxTool.InputSchema()
if err != nil {
return llms.Tool{}
}
var paramsSchema map[string]any
_ = json.Unmarshal(inputschema, &paramsSchema)
// Convert into LangChain's llms.Tool
return llms.Tool{
Type: "function",
Function: &llms.FunctionDefinition{
Name: toolboxTool.Name(),
Description: toolboxTool.Description(),
Parameters: paramsSchema,
},
}
}
const systemPrompt = `
You're a helpful hotel assistant. You handle hotel searching, booking, and
cancellations. When the user searches for a hotel, mention its name, id,
location and price tier. Always mention hotel ids while performing any
searches. This is very important for any operations. For any bookings or
cancellations, please provide the appropriate confirmation. Be sure to
update checkin or checkout dates if mentioned by the user.
Don't ask for confirmations from the user.
`
var queries = []string{
"Find hotels in Basel with Basel in its name.",
"Can you book the hotel Hilton Basel for me?",
"Oh wait, this is too expensive. Please cancel it.",
"Please book the Hyatt Regency instead.",
"My check in dates would be from April 10, 2024 to April 19, 2024.",
}
func main() {
genaiKey := os.Getenv("GOOGLE_API_KEY")
toolboxURL := "http://localhost:5000"
ctx := context.Background()
// Initialize the Google AI client (LLM).
llm, err := googleai.New(ctx, googleai.WithAPIKey(genaiKey), googleai.WithDefaultModel("gemini-1.5-flash"))
if err != nil {
log.Fatalf("Failed to create Google AI client: %v", err)
}
// Initialize the MCP Toolbox client.
toolboxClient, err := core.NewToolboxClient(toolboxURL)
if err != nil {
log.Fatalf("Failed to create Toolbox client: %v", err)
}
// Load the tool using the MCP Toolbox SDK.
tools, err := toolboxClient.LoadToolset("my-toolset", ctx)
if err != nil {
log.Fatalf("Failed to load tools: %v\nMake sure your Toolbox server is running and the tool is configured.", err)
}
toolsMap := make(map[string]*core.ToolboxTool, len(tools))
langchainTools := make([]llms.Tool, len(tools))
// Convert the loaded ToolboxTools into the format LangChainGo requires.
for i, tool := range tools {
langchainTools[i] = ConvertToLangchainTool(tool)
toolsMap[tool.Name()] = tool
}
// Start the conversation history.
messageHistory := []llms.MessageContent{
llms.TextParts(llms.ChatMessageTypeSystem, systemPrompt),
}
for _, query := range queries {
messageHistory = append(messageHistory, llms.TextParts(llms.ChatMessageTypeHuman, query))
// Make the first call to the LLM, making it aware of the tool.
resp, err := llm.GenerateContent(ctx, messageHistory, llms.WithTools(langchainTools))
if err != nil {
log.Fatalf("LLM call failed: %v", err)
}
respChoice := resp.Choices[0]
assistantResponse := llms.TextParts(llms.ChatMessageTypeAI, respChoice.Content)
for _, tc := range respChoice.ToolCalls {
assistantResponse.Parts = append(assistantResponse.Parts, tc)
}
messageHistory = append(messageHistory, assistantResponse)
// Process each tool call requested by the model.
for _, tc := range respChoice.ToolCalls {
toolName := tc.FunctionCall.Name
tool := toolsMap[toolName]
var args map[string]any
if err := json.Unmarshal([]byte(tc.FunctionCall.Arguments), &args); err != nil {
log.Fatalf("Failed to unmarshal arguments for tool '%s': %v", toolName, err)
}
toolResult, err := tool.Invoke(ctx, args)
if err != nil {
log.Fatalf("Failed to execute tool '%s': %v", toolName, err)
}
if toolResult == "" || toolResult == nil {
toolResult = "Operation completed successfully with no specific return value."
}
// Create the tool call response message and add it to the history.
toolResponse := llms.MessageContent{
Role: llms.ChatMessageTypeTool,
Parts: []llms.ContentPart{
llms.ToolCallResponse{
Name: toolName,
Content: fmt.Sprintf("%v", toolResult),
},
},
}
messageHistory = append(messageHistory, toolResponse)
}
finalResp, err := llm.GenerateContent(ctx, messageHistory)
if err != nil {
log.Fatalf("Final LLM call failed after tool execution: %v", err)
}
// Add the final textual response from the LLM to the history
messageHistory = append(messageHistory, llms.TextParts(llms.ChatMessageTypeAI, finalResp.Choices[0].Content))
fmt.Println(finalResp.Choices[0].Content)
}
}
{{< /tab >}}
{{< tab header="Genkit Go" lang="go" >}}
package main
import (
"context"
"fmt"
"log"
"github.com/googleapis/mcp-toolbox-sdk-go/core"
"github.com/googleapis/mcp-toolbox-sdk-go/tbgenkit"
"github.com/firebase/genkit/go/ai"
"github.com/firebase/genkit/go/genkit"
"github.com/firebase/genkit/go/plugins/googlegenai"
)
const systemPrompt = `
You're a helpful hotel assistant. You handle hotel searching, booking, and
cancellations. When the user searches for a hotel, mention its name, id,
location and price tier. Always mention hotel ids while performing any
searches. This is very important for any operations. For any bookings or
cancellations, please provide the appropriate confirmation. Be sure to
update checkin or checkout dates if mentioned by the user.
Don't ask for confirmations from the user.
`
var queries = []string{
"Find hotels in Basel with Basel in its name.",
"Can you book the hotel Hilton Basel for me?",
"Oh wait, this is too expensive. Please cancel it and book the Hyatt Regency instead.",
"My check in dates would be from April 10, 2024 to April 19, 2024.",
}
func main() {
ctx := context.Background()
// Create Toolbox Client
toolboxClient, err := core.NewToolboxClient("http://127.0.0.1:5000")
if err != nil {
log.Fatalf("Failed to create Toolbox client: %v", err)
}
// Load the tools using the MCP Toolbox SDK.
tools, err := toolboxClient.LoadToolset("my-toolset", ctx)
if err != nil {
log.Fatalf("Failed to load tools: %v\nMake sure your Toolbox server is running and the tool is configured.", err)
}
// Initialize Genkit
g, err := genkit.Init(ctx,
genkit.WithPlugins(&googlegenai.GoogleAI{}),
genkit.WithDefaultModel("googleai/gemini-1.5-flash"),
)
if err != nil {
log.Fatalf("Failed to init genkit: %v\n", err)
}
// Create a conversation history
conversationHistory := []*ai.Message{
ai.NewSystemTextMessage(systemPrompt),
}
// Convert your tool to a Genkit tool.
genkitTools := make([]ai.Tool, len(tools))
for i, tool := range tools {
newTool, err := tbgenkit.ToGenkitTool(tool, g)
if err != nil {
log.Fatalf("Failed to convert tool: %v\n", err)
}
genkitTools[i] = newTool
}
toolRefs := make([]ai.ToolRef, len(genkitTools))
for i, tool := range genkitTools {
toolRefs[i] = tool
}
for _, query := range queries {
conversationHistory = append(conversationHistory, ai.NewUserTextMessage(query))
response, err := genkit.Generate(ctx, g,
ai.WithMessages(conversationHistory...),
ai.WithTools(toolRefs...),
ai.WithReturnToolRequests(true),
)
if err != nil {
log.Fatalf("%v\n", err)
}
conversationHistory = append(conversationHistory, response.Message)
parts := []*ai.Part{}
for _, req := range response.ToolRequests() {
tool := genkit.LookupTool(g, req.Name)
if tool == nil {
log.Fatalf("tool %q not found", req.Name)
}
output, err := tool.RunRaw(ctx, req.Input)
if err != nil {
log.Fatalf("tool %q execution failed: %v", tool.Name(), err)
}
parts = append(parts,
ai.NewToolResponsePart(&ai.ToolResponse{
Name: req.Name,
Ref: req.Ref,
Output: output,
}))
}
if len(parts) > 0 {
resp, err := genkit.Generate(ctx, g,
ai.WithMessages(append(response.History(), ai.NewMessage(ai.RoleTool, nil, parts...))...),
ai.WithTools(toolRefs...),
)
if err != nil {
log.Fatal(err)
}
fmt.Println("\n", resp.Text())
conversationHistory = append(conversationHistory, resp.Message)
} else {
fmt.Println("\n", response.Text())
}
}
}
{{< /tab >}}
{{< tab header="Go GenAI" lang="go" >}}
package main
import (
"context"
"encoding/json"
"fmt"
"log"
"os"
"github.com/googleapis/mcp-toolbox-sdk-go/core"
"google.golang.org/genai"
)
// ConvertToGenaiTool translates a ToolboxTool into the genai.FunctionDeclaration format.
func ConvertToGenaiTool(toolboxTool *core.ToolboxTool) *genai.Tool {
inputschema, err := toolboxTool.InputSchema()
if err != nil {
return &genai.Tool{}
}
var paramsSchema *genai.Schema
_ = json.Unmarshal(inputschema, &paramsSchema)
// First, create the function declaration.
funcDeclaration := &genai.FunctionDeclaration{
Name: toolboxTool.Name(),
Description: toolboxTool.Description(),
Parameters: paramsSchema,
}
// Then, wrap the function declaration in a genai.Tool struct.
return &genai.Tool{
FunctionDeclarations: []*genai.FunctionDeclaration{funcDeclaration},
}
}
func printResponse(resp *genai.GenerateContentResponse) {
for _, cand := range resp.Candidates {
if cand.Content != nil {
for _, part := range cand.Content.Parts {
fmt.Println(part.Text)
}
}
}
}
const systemPrompt = `
You're a helpful hotel assistant. You handle hotel searching, booking, and
cancellations. When the user searches for a hotel, mention its name, id,
location and price tier. Always mention hotel ids while performing any
searches. This is very important for any operations. For any bookings or
cancellations, please provide the appropriate confirmation. Be sure to
update checkin or checkout dates if mentioned by the user.
Don't ask for confirmations from the user.
`
var queries = []string{
"Find hotels in Basel with Basel in its name.",
"Can you book the hotel Hilton Basel for me?",
"Oh wait, this is too expensive. Please cancel it.",
"Please book the Hyatt Regency instead.",
"My check in dates would be from April 10, 2024 to April 19, 2024.",
}
func main() {
// Setup
ctx := context.Background()
apiKey := os.Getenv("GOOGLE_API_KEY")
toolboxURL := "http://localhost:5000"
// Initialize the Google GenAI client using the explicit ClientConfig.
client, err := genai.NewClient(ctx, &genai.ClientConfig{
APIKey: apiKey,
})
if err != nil {
log.Fatalf("Failed to create Google GenAI client: %v", err)
}
// Initialize the MCP Toolbox client.
toolboxClient, err := core.NewToolboxClient(toolboxURL)
if err != nil {
log.Fatalf("Failed to create Toolbox client: %v", err)
}
// Load the tool using the MCP Toolbox SDK.
tools, err := toolboxClient.LoadToolset("my-toolset", ctx)
if err != nil {
log.Fatalf("Failed to load tools: %v\nMake sure your Toolbox server is running and the tool is configured.", err)
}
genAITools := make([]*genai.Tool, len(tools))
toolsMap := make(map[string]*core.ToolboxTool, len(tools))
for i, tool := range tools {
genAITools[i] = ConvertToGenaiTool(tool)
toolsMap[tool.Name()] = tool
}
// Set up the generative model with the available tool.
modelName := "gemini-2.0-flash"
// Create the initial content prompt for the model.
messageHistory := []*genai.Content{
genai.NewContentFromText(systemPrompt, genai.RoleUser),
}
config := &genai.GenerateContentConfig{
Tools: genAITools,
ToolConfig: &genai.ToolConfig{
FunctionCallingConfig: &genai.FunctionCallingConfig{
Mode: genai.FunctionCallingConfigModeAny,
},
},
}
for _, query := range queries {
messageHistory = append(messageHistory, genai.NewContentFromText(query, genai.RoleUser))
genContentResp, err := client.Models.GenerateContent(ctx, modelName, messageHistory, config)
if err != nil {
log.Fatalf("LLM call failed for query '%s': %v", query, err)
}
if len(genContentResp.Candidates) > 0 && genContentResp.Candidates[0].Content != nil {
messageHistory = append(messageHistory, genContentResp.Candidates[0].Content)
}
functionCalls := genContentResp.FunctionCalls()
toolResponseParts := []*genai.Part{}
for _, fc := range functionCalls {
toolToInvoke, found := toolsMap[fc.Name]
if !found {
log.Fatalf("Tool '%s' not found in loaded tools map. Check toolset configuration.", fc.Name)
}
toolResult, invokeErr := toolToInvoke.Invoke(ctx, fc.Args)
if invokeErr != nil {
log.Fatalf("Failed to execute tool '%s': %v", fc.Name, invokeErr)
}
// Enhanced Tool Result Handling (retained to prevent nil issues)
toolResultString := ""
if toolResult != nil {
jsonBytes, marshalErr := json.Marshal(toolResult)
if marshalErr == nil {
toolResultString = string(jsonBytes)
} else {
toolResultString = fmt.Sprintf("%v", toolResult)
}
}
responseMap := map[string]any{"result": toolResultString}
toolResponseParts = append(toolResponseParts, genai.NewPartFromFunctionResponse(fc.Name, responseMap))
}
// Add all accumulated tool responses for this turn to the message history.
toolResponseContent := genai.NewContentFromParts(toolResponseParts, "function")
messageHistory = append(messageHistory, toolResponseContent)
finalResponse, err := client.Models.GenerateContent(ctx, modelName, messageHistory, &genai.GenerateContentConfig{})
if err != nil {
log.Fatalf("Error calling GenerateContent (with function result): %v", err)
}
printResponse(finalResponse)
// Add the final textual response from the LLM to the history
if len(finalResponse.Candidates) > 0 && finalResponse.Candidates[0].Content != nil {
messageHistory = append(messageHistory, finalResponse.Candidates[0].Content)
}
}
}
{{< /tab >}}
{{< tab header="OpenAI Go" lang="go" >}}
package main
import (
"context"
"encoding/json"
"log"
"github.com/googleapis/mcp-toolbox-sdk-go/core"
openai "github.com/openai/openai-go"
)
// ConvertToOpenAITool converts a ToolboxTool into the go-openai library's Tool format.
func ConvertToOpenAITool(toolboxTool *core.ToolboxTool) openai.ChatCompletionToolParam {
// Get the input schema
jsonSchemaBytes, err := toolboxTool.InputSchema()
if err != nil {
return openai.ChatCompletionToolParam{}
}
// Unmarshal the JSON bytes into FunctionParameters
var paramsSchema openai.FunctionParameters
if err := json.Unmarshal(jsonSchemaBytes, &paramsSchema); err != nil {
return openai.ChatCompletionToolParam{}
}
// Create and return the final tool parameter struct.
return openai.ChatCompletionToolParam{
Function: openai.FunctionDefinitionParam{
Name: toolboxTool.Name(),
Description: openai.String(toolboxTool.Description()),
Parameters: paramsSchema,
},
}
}
const systemPrompt = `
You're a helpful hotel assistant. You handle hotel searching, booking, and
cancellations. When the user searches for a hotel, mention its name, id,
location and price tier. Always mention hotel ids while performing any
searches. This is very important for any operations. For any bookings or
cancellations, please provide the appropriate confirmation. Be sure to
update checkin or checkout dates if mentioned by the user.
Don't ask for confirmations from the user.
`
var queries = []string{
"Find hotels in Basel with Basel in its name.",
"Can you book the hotel Hilton Basel for me?",
"Oh wait, this is too expensive. Please cancel it and book the Hyatt Regency instead.",
"My check in dates would be from April 10, 2024 to April 19, 2024.",
}
func main() {
// Setup
ctx := context.Background()
toolboxURL := "http://localhost:5000"
openAIClient := openai.NewClient()
// Initialize the MCP Toolbox client.
toolboxClient, err := core.NewToolboxClient(toolboxURL)
if err != nil {
log.Fatalf("Failed to create Toolbox client: %v", err)
}
// Load the tools using the MCP Toolbox SDK.
tools, err := toolboxClient.LoadToolset("my-toolset", ctx)
if err != nil {
log.Fatalf("Failed to load tool : %v\nMake sure your Toolbox server is running and the tool is configured.", err)
}
openAITools := make([]openai.ChatCompletionToolParam, len(tools))
toolsMap := make(map[string]*core.ToolboxTool, len(tools))
for i, tool := range tools {
// Convert the Toolbox tool into the openAI FunctionDeclaration format.
openAITools[i] = ConvertToOpenAITool(tool)
// Add tool to a map for lookup later
toolsMap[tool.Name()] = tool
}
params := openai.ChatCompletionNewParams{
Messages: []openai.ChatCompletionMessageParamUnion{
openai.SystemMessage(systemPrompt),
},
Tools: openAITools,
Seed: openai.Int(0),
Model: openai.ChatModelGPT4o,
}
for _, query := range queries {
params.Messages = append(params.Messages, openai.UserMessage(query))
// Make initial chat completion request
completion, err := openAIClient.Chat.Completions.New(ctx, params)
if err != nil {
panic(err)
}
toolCalls := completion.Choices[0].Message.ToolCalls
// Return early if there are no tool calls
if len(toolCalls) == 0 {
log.Println("No function call")
}
// If there was a function call, continue the conversation
params.Messages = append(params.Messages, completion.Choices[0].Message.ToParam())
for _, toolCall := range toolCalls {
toolName := toolCall.Function.Name
toolToInvoke := toolsMap[toolName]
var args map[string]any
err := json.Unmarshal([]byte(toolCall.Function.Arguments), &args)
if err != nil {
panic(err)
}
result, err := toolToInvoke.Invoke(ctx, args)
if err != nil {
log.Fatal("Could not invoke tool", err)
}
params.Messages = append(params.Messages, openai.ToolMessage(result.(string), toolCall.ID))
}
completion, err = openAIClient.Chat.Completions.New(ctx, params)
if err != nil {
panic(err)
}
params.Messages = append(params.Messages, openai.AssistantMessage(query))
println("\n", completion.Choices[0].Message.Content)
}
}
{{< /tab >}}
{{< /tabpane >}}
1. Ensure all dependencies are installed:
```sh
go mod tidy
```
1. Run your agent, and observe the results:
```sh
go run hotelagent.go
```
{{< notice info >}}
For more information, visit the [Go SDK repo](https://github.com/googleapis/mcp-toolbox-sdk-go).
{{</ notice >}}

View File

@@ -3,7 +3,7 @@ title: "JS Quickstart (Local)"
type: docs
weight: 3
description: >
How to get started running Toolbox locally with [JavaScript](https://github.com/googleapis/mcp-toolbox-sdk-python), PostgreSQL, and orchestration frameworks such as [LangChain](https://js.langchain.com/docs/introduction/) and [GenkitJS](https://genkit.dev/docs/get-started/).
How to get started running Toolbox locally with [JavaScript](https://github.com/googleapis/mcp-toolbox-sdk-js), PostgreSQL, and orchestration frameworks such as [LangChain](https://js.langchain.com/docs/introduction/), [GenkitJS](https://genkit.dev/docs/get-started/), and [LlamaIndex](https://ts.llamaindex.ai/).
---
## Before you begin
@@ -285,7 +285,7 @@ from Toolbox.
1. In a new terminal, install the [SDK](https://www.npmjs.com/package/@toolbox-sdk/core).
```bash
npm install langchain @toolbox-sdk/core
npm install @toolbox-sdk/core
```
1. Install other required dependencies
@@ -297,6 +297,9 @@ npm install langchain @langchain/google-vertexai
{{< tab header="GenkitJS" lang="bash" >}}
npm install genkit @genkit-ai/vertexai
{{< /tab >}}
{{< tab header="LlamaIndex" lang="bash" >}}
npm install llamaindex @llamaindex/google @llamaindex/workflow
{{< /tab >}}
{{< /tabpane >}}
1. Create a new file named `hotelAgent.js` and copy the following code to create an agent:
@@ -477,6 +480,91 @@ async function run() {
run();
{{< /tab >}}
{{< tab header="LlamaIndex" lang="js" >}}
import { gemini, GEMINI_MODEL } from "@llamaindex/google";
import { agent } from "@llamaindex/workflow";
import { createMemory, staticBlock, tool } from "llamaindex";
import { ToolboxClient } from "@toolbox-sdk/core";
const TOOLBOX_URL = "http://127.0.0.1:5000"; // Update if needed
process.env.GOOGLE_API_KEY = 'your-api-key'; // Replace it with your API key
const prompt = `
You're a helpful hotel assistant. You handle hotel searching, booking and cancellations.
When the user searches for a hotel, mention its name, id, location and price tier.
Always mention hotel ids while performing any searches — this is very important for operations.
For any bookings or cancellations, please provide the appropriate confirmation.
Update check-in or check-out dates if mentioned by the user.
Don't ask for confirmations from the user.
`;
const queries = [
"Find hotels in Basel with Basel in its name.",
"Can you book the Hilton Basel for me?",
"Oh wait, this is too expensive. Please cancel it and book the Hyatt Regency instead.",
"My check in dates would be from April 10, 2024 to April 19, 2024.",
];
async function main() {
// Connect to MCP Toolbox
const client = new ToolboxClient(TOOLBOX_URL);
const toolboxTools = await client.loadToolset("my-toolset");
const tools = toolboxTools.map((toolboxTool) => {
return tool({
name: toolboxTool.getName(),
description: toolboxTool.getDescription(),
parameters: toolboxTool.getParamSchema(),
execute: toolboxTool,
});
});
// Initialize LLM
const llm = gemini({
model: GEMINI_MODEL.GEMINI_2_0_FLASH,
apiKey: process.env.GOOGLE_API_KEY,
});
const memory = createMemory({
memoryBlocks: [
staticBlock({
content: prompt,
}),
],
});
// Create the Agent
const myAgent = agent({
tools: tools,
llm,
memory,
systemPrompt: prompt,
});
for (const query of queries) {
const result = await myAgent.run(query);
const output = result.data.result;
console.log(`\nUser: ${query}`);
if (typeof output === "string") {
console.log(output.trim());
} else if (typeof output === "object" && "text" in output) {
console.log(output.text.trim());
} else {
console.log(JSON.stringify(output));
}
}
// You may observe some extra logs during execution due to the run method provided by LlamaIndex.
console.log("Agent run finished.");
}
main();
{{< /tab >}}
{{< /tabpane >}}
1. Run your agent, and observe the results:

View File

@@ -1,9 +1,9 @@
---
title: "Quickstart (MCP)"
type: docs
weight: 3
weight: 5
description: >
How to get started running Toolbox locally with MCP Inspector.
How to get started running Toolbox locally with MCP Inspector.
---
## Overview

View File

@@ -0,0 +1,306 @@
---
title: "AlloyDB Admin API using MCP"
type: docs
weight: 2
description: >
Create your AlloyDB database with MCP Toolbox.
---
This guide covers how to use [MCP Toolbox for Databases][toolbox] to create AlloyDB clusters and instances directly from your IDE, supporting an end-to-end developer workflow. The following MCP clients are supported:
- [Cursor][cursor]
- [Windsurf][windsurf] (Codium)
- [Visual Studio Code][vscode] (Copilot)
- [Cline][cline] (VS Code extension)
- [Claude desktop][claudedesktop]
- [Claude code][claudecode]
- [Gemini CLI][geminicli]
- [Gemini Code Assist][geminicodeassist]
[toolbox]: https://github.com/googleapis/genai-toolbox
[cursor]: #configure-your-mcp-client
[windsurf]: #configure-your-mcp-client
[vscode]: #configure-your-mcp-client
[cline]: #configure-your-mcp-client
[claudedesktop]: #configure-your-mcp-client
[claudecode]: #configure-your-mcp-client
[geminicli]: #configure-your-mcp-client
[geminicodeassist]: #configure-your-mcp-client
## Before you begin
1. In the Google Cloud console, on the [project selector page](https://console.cloud.google.com/projectselector2/home/dashboard), select or create a Google Cloud project.
1. [Make sure that billing is enabled for your Google Cloud project](https://cloud.google.com/billing/docs/how-to/verify-billing-enabled#confirm_billing_is_enabled_on_a_project).
## Install MCP Toolbox
1. Download the latest version of Toolbox as a binary. Select the [correct binary](https://github.com/googleapis/genai-toolbox/releases) corresponding to your OS and CPU architecture. You are required to use Toolbox version v0.10.0+:
<!-- {x-release-please-start-version} -->
{{< tabpane persist=header >}}
{{< tab header="linux/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.10.0/linux/amd64/toolbox
{{< /tab >}}
{{< tab header="darwin/arm64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.10.0/darwin/arm64/toolbox
{{< /tab >}}
{{< tab header="darwin/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.10.0/darwin/amd64/toolbox
{{< /tab >}}
{{< tab header="windows/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.10.0/windows/amd64/toolbox
{{< /tab >}}
{{< /tabpane >}}
<!-- {x-release-please-end} -->
1. Make the binary executable:
```bash
chmod +x toolbox
```
1. Verify the installation:
```bash
./toolbox --version
```
## Configure your MCP Client
{{< tabpane text=true >}}
{{% tab header="Claude code" lang="en" %}}
1. Install [Claude Code](https://docs.anthropic.com/en/docs/agents-and-tools/claude-code/overview).
1. Create a `.mcp.json` file in your project root if it doesn't exist.
1. Generate an access token to use as the `API_KEY` by running `gcloud auth print-access-token`.
> **Note:** The token is valid for 1 hour.
1. Add the following configuration, replace the environment variables with your values, and save:
```json
{
"mcpServers": {
"alloydb-admin": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt", "alloydb-postgres-admin", "--stdio"],
"env": {
"API_KEY": "your-api-key"
}
}
}
}
```
1. Restart Claude code to apply the new configuration.
{{% /tab %}}
{{% tab header="Claude desktop" lang="en" %}}
1. Open [Claude desktop](https://claude.ai/download) and navigate to Settings.
1. Under the Developer tab, tap Edit Config to open the configuration file.
1. Generate an access token to use as the `API_KEY` by running `gcloud auth print-access-token`.
> **Note:** The token is valid for 1 hour.
1. Add the following configuration, replace the environment variables with your values, and save:
```json
{
"mcpServers": {
"alloydb-admin": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt", "alloydb-postgres-admin", "--stdio"],
"env": {
"API_KEY": "your-api-key"
}
}
}
}
```
1. Restart Claude desktop.
1. From the new chat screen, you should see a hammer (MCP) icon appear with the new MCP server available.
{{% /tab %}}
{{% tab header="Cline" lang="en" %}}
1. Open the [Cline](https://github.com/cline/cline) extension in VS Code and tap the **MCP Servers** icon.
1. Tap Configure MCP Servers to open the configuration file.
1. Generate an access token to use as the `API_KEY` by running `gcloud auth print-access-token`.
> **Note:** The token is valid for 1 hour.
1. Add the following configuration, replace the environment variables with your values, and save:
```json
{
"mcpServers": {
"alloydb-admin": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt", "alloydb-postgres-admin", "--stdio"],
"env": {
"API_KEY": "your-api-key"
}
}
}
}
```
1. You should see a green active status after the server is successfully connected.
{{% /tab %}}
{{% tab header="Cursor" lang="en" %}}
1. Create a `.cursor` directory in your project root if it doesn't exist.
1. Create a `.cursor/mcp.json` file if it doesn't exist and open it.
1. Generate an access token to use as the `API_KEY` by running `gcloud auth print-access-token`.
> **Note:** The token is valid for 1 hour.
1. Add the following configuration, replace the environment variables with your values, and save:
```json
{
"mcpServers": {
"alloydb-admin": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt", "alloydb-postgres-admin", "--stdio"],
"env": {
"API_KEY": "your-api-key"
}
}
}
}
```
1. Open [Cursor](https://www.cursor.com/) and navigate to **Settings > Cursor Settings > MCP**. You should see a green active status after the server is successfully connected.
{{% /tab %}}
{{% tab header="Visual Studio Code (Copilot)" lang="en" %}}
1. Open [VS Code](https://code.visualstudio.com/docs/copilot/overview) and create a `.vscode` directory in your project root if it doesn't exist.
1. Create a `.vscode/mcp.json` file if it doesn't exist and open it.
1. Generate an access token to use as the `API_KEY` by running `gcloud auth print-access-token`.
> **Note:** The token is valid for 1 hour.
1. Add the following configuration, replace the environment variables with your values, and save:
```json
{
"mcpServers": {
"alloydb-admin": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt", "alloydb-postgres-admin", "--stdio"],
"env": {
"API_KEY": "your-api-key"
}
}
}
}
```
{{% /tab %}}
{{% tab header="Windsurf" lang="en" %}}
1. Open [Windsurf](https://docs.codeium.com/windsurf) and navigate to the Cascade assistant.
1. Tap on the hammer (MCP) icon, then Configure to open the configuration file.
1. Generate an access token to use as the `API_KEY` by running `gcloud auth print-access-token`.
> **Note:** The token is valid for 1 hour.
1. Add the following configuration, replace the environment variables with your values, and save:
```json
{
"mcpServers": {
"alloydb-admin": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt", "alloydb-postgres-admin", "--stdio"],
"env": {
"API_KEY": "your-api-key"
}
}
}
}
```
{{% /tab %}}
{{% tab header="Gemini CLI" lang="en" %}}
1. Install the [Gemini CLI](https://github.com/google-gemini/gemini-cli?tab=readme-ov-file#quickstart).
1. In your working directory, create a folder named `.gemini`. Within it, create a `settings.json` file.
1. Generate an access token to use as the `API_KEY` by running `gcloud auth print-access-token`.
> **Note:** The token is valid for 1 hour.
1. Add the following configuration, replace the environment variables with your values, and save:
```json
{
"mcpServers": {
"alloydb-admin": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt", "alloydb-postgres-admin", "--stdio"],
"env": {
"API_KEY": "your-api-key"
}
}
}
}
```
{{% /tab %}}
{{% tab header="Gemini Code Assist" lang="en" %}}
1. Install the [Gemini Code Assist](https://marketplace.visualstudio.com/items?itemName=Google.geminicodeassist) extension in Visual Studio Code.
1. Enable Agent Mode in Gemini Code Assist chat.
1. In your working directory, create a folder named `.gemini`. Within it, create a `settings.json` file.
1. Generate an access token to use as the `API_KEY` by running `gcloud auth print-access-token`.
> **Note:** The token is valid for 1 hour.
1. Add the following configuration, replace the environment variables with your values, and save:
```json
{
"mcpServers": {
"alloydb-admin": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt", "alloydb-postgres-admin", "--stdio"],
"env": {
"API_KEY": "your-api-key"
}
}
}
}
```
{{% /tab %}}
{{< /tabpane >}}
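Before restarting your client, you can optionally confirm that the binary and the prebuilt `alloydb-postgres-admin` configuration start cleanly. This is a minimal smoke test run from the directory containing the `toolbox` binary; the server waits for MCP messages on stdin, so stop it with `Ctrl+C` once it starts without errors:
```bash
# Start the prebuilt AlloyDB admin MCP server manually (stop with Ctrl+C).
API_KEY=$(gcloud auth print-access-token) ./toolbox --prebuilt alloydb-postgres-admin --stdio
```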
## Use Tools
Your AI tool is now connected to AlloyDB using MCP. Try asking your AI assistant to create a cluster, an instance, or a database.
The following tools are available to the LLM:
1. **alloydb-create-cluster**: creates an AlloyDB cluster
1. **alloydb-create-instance**: creates an AlloyDB instance (PRIMARY, READ_POOL, or SECONDARY)
1. **alloydb-get-operation**: polls the operations API until the operation is done
{{< notice note >}}
Prebuilt tools are pre-1.0, so expect some tool changes between versions. LLMs will adapt to the tools available, so this shouldn't affect most users.
{{< /notice >}}
## Connect to your Data
After setting up an AlloyDB cluster and instance, you can [connect your IDE to the database](https://cloud.google.com/alloydb/docs/pre-built-tools-with-mcp-toolbox).

View File

@@ -0,0 +1,313 @@
---
title: "Firestore using MCP"
type: docs
weight: 2
description: >
Connect your IDE to Firestore using Toolbox.
---
[Model Context Protocol (MCP)](https://modelcontextprotocol.io/introduction) is
an open protocol for connecting Large Language Models (LLMs) to data sources
like Firestore. This guide covers how to use [MCP Toolbox for Databases][toolbox]
to expose your developer assistant tools to a Firestore instance:
* [Cursor][cursor]
* [Windsurf][windsurf] (Codium)
* [Visual Studio Code][vscode] (Copilot)
* [Cline][cline] (VS Code extension)
* [Claude desktop][claudedesktop]
* [Claude code][claudecode]
* [Gemini CLI][geminicli]
* [Gemini Code Assist][geminicodeassist]
[toolbox]: https://github.com/googleapis/genai-toolbox
[cursor]: #configure-your-mcp-client
[windsurf]: #configure-your-mcp-client
[vscode]: #configure-your-mcp-client
[cline]: #configure-your-mcp-client
[claudedesktop]: #configure-your-mcp-client
[claudecode]: #configure-your-mcp-client
[geminicli]: #configure-your-mcp-client
[geminicodeassist]: #configure-your-mcp-client
## Set up Firestore
1. Create or select a Google Cloud project.
* [Create a new project](https://cloud.google.com/resource-manager/docs/creating-managing-projects)
* [Select an existing project](https://cloud.google.com/resource-manager/docs/creating-managing-projects#identifying_projects)
1. [Enable the Firestore API](https://console.cloud.google.com/apis/library/firestore.googleapis.com) for your project.
1. [Create a Firestore database](https://cloud.google.com/firestore/docs/create-database-web-mobile-client-library) if you haven't already.
1. Set up authentication for your local environment.
* [Install gcloud CLI](https://cloud.google.com/sdk/docs/install)
* Run `gcloud auth application-default login` to authenticate
## Install MCP Toolbox
1. Download the latest version of Toolbox as a binary. Select the [correct
binary](https://github.com/googleapis/genai-toolbox/releases) corresponding
to your OS and CPU architecture. You are required to use Toolbox version
v0.10.0+:
<!-- {x-release-please-start-version} -->
{{< tabpane persist=header >}}
{{< tab header="linux/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.10.0/linux/amd64/toolbox
{{< /tab >}}
{{< tab header="darwin/arm64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.10.0/darwin/arm64/toolbox
{{< /tab >}}
{{< tab header="darwin/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.10.0/darwin/amd64/toolbox
{{< /tab >}}
{{< tab header="windows/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.10.0/windows/amd64/toolbox
{{< /tab >}}
{{< /tabpane >}}
<!-- {x-release-please-end} -->
1. Make the binary executable:
```bash
chmod +x toolbox
```
1. Verify the installation:
```bash
./toolbox --version
```
## Configure your MCP Client
{{< tabpane text=true >}}
{{% tab header="Claude code" lang="en" %}}
1. Install [Claude
Code](https://docs.anthropic.com/en/docs/agents-and-tools/claude-code/overview).
1. Create a `.mcp.json` file in your project root if it doesn't exist.
1. Add the following configuration, replace the environment variables with your
values, and save:
```json
{
"mcpServers": {
"firestore": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt","firestore","--stdio"],
"env": {
"FIRESTORE_PROJECT": "your-project-id",
"FIRESTORE_DATABASE": "(default)"
}
}
}
}
```
1. Restart Claude code to apply the new configuration.
{{% /tab %}}
{{% tab header="Claude desktop" lang="en" %}}
1. Open [Claude desktop](https://claude.ai/download) and navigate to Settings.
1. Under the Developer tab, tap Edit Config to open the configuration file.
1. Add the following configuration, replace the environment variables with your
values, and save:
```json
{
"mcpServers": {
"firestore": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt","firestore","--stdio"],
"env": {
"FIRESTORE_PROJECT": "your-project-id",
"FIRESTORE_DATABASE": "(default)"
}
}
}
}
```
1. Restart Claude desktop.
1. From the new chat screen, you should see a hammer (MCP) icon appear with the
new MCP server available.
{{% /tab %}}
{{% tab header="Cline" lang="en" %}}
1. Open the [Cline](https://github.com/cline/cline) extension in VS Code and tap
the **MCP Servers** icon.
1. Tap Configure MCP Servers to open the configuration file.
1. Add the following configuration, replace the environment variables with your
values, and save:
```json
{
"mcpServers": {
"firestore": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt","firestore","--stdio"],
"env": {
"FIRESTORE_PROJECT": "your-project-id",
"FIRESTORE_DATABASE": "(default)"
}
}
}
}
```
1. You should see a green active status after the server is successfully
connected.
{{% /tab %}}
{{% tab header="Cursor" lang="en" %}}
1. Create a `.cursor` directory in your project root if it doesn't exist.
1. Create a `.cursor/mcp.json` file if it doesn't exist and open it.
1. Add the following configuration, replace the environment variables with your
values, and save:
```json
{
"mcpServers": {
"firestore": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt","firestore","--stdio"],
"env": {
"FIRESTORE_PROJECT": "your-project-id",
"FIRESTORE_DATABASE": "(default)"
}
}
}
}
```
1. Open [Cursor](https://www.cursor.com/) and navigate to **Settings > Cursor
Settings > MCP**. You should see a green active status after the server is
successfully connected.
{{% /tab %}}
{{% tab header="Visual Studio Code (Copilot)" lang="en" %}}
1. Open [VS Code](https://code.visualstudio.com/docs/copilot/overview) and
create a `.vscode` directory in your project root if it doesn't exist.
1. Create a `.vscode/mcp.json` file if it doesn't exist and open it.
1. Add the following configuration, replace the environment variables with your
values, and save:
```json
{
"mcpServers": {
"firestore": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt","firestore","--stdio"],
"env": {
"FIRESTORE_PROJECT": "your-project-id",
"FIRESTORE_DATABASE": "(default)"
}
}
}
}
```
{{% /tab %}}
{{% tab header="Windsurf" lang="en" %}}
1. Open [Windsurf](https://docs.codeium.com/windsurf) and navigate to the
Cascade assistant.
1. Tap on the hammer (MCP) icon, then Configure to open the configuration file.
1. Add the following configuration, replace the environment variables with your
values, and save:
```json
{
"mcpServers": {
"firestore": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt","firestore","--stdio"],
"env": {
"FIRESTORE_PROJECT": "your-project-id",
"FIRESTORE_DATABASE": "(default)"
}
}
}
}
```
{{% /tab %}}
{{% tab header="Gemini CLI" lang="en" %}}
1. Install the [Gemini CLI](https://github.com/google-gemini/gemini-cli?tab=readme-ov-file#quickstart).
1. In your working directory, create a folder named `.gemini`. Within it, create a `settings.json` file.
1. Add the following configuration, replace the environment variables with your values, and then save:
```json
{
"mcpServers": {
"firestore": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt","firestore","--stdio"],
"env": {
"FIRESTORE_PROJECT": "your-project-id",
"FIRESTORE_DATABASE": "(default)"
}
}
}
}
```
{{% /tab %}}
{{% tab header="Gemini Code Assist" lang="en" %}}
1. Install the [Gemini Code Assist](https://marketplace.visualstudio.com/items?itemName=Google.geminicodeassist) extension in Visual Studio Code.
1. Enable Agent Mode in Gemini Code Assist chat.
1. In your working directory, create a folder named `.gemini`. Within it, create a `settings.json` file.
1. Add the following configuration, replace the environment variables with your values, and then save:
```json
{
"mcpServers": {
"firestore": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt","firestore","--stdio"],
"env": {
"FIRESTORE_PROJECT": "your-project-id",
"FIRESTORE_DATABASE": "(default)"
}
}
}
}
```
{{% /tab %}}
{{< /tabpane >}}
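Before restarting your client, you can optionally run a quick smoke test from the directory containing the `toolbox` binary to confirm the prebuilt Firestore configuration starts without errors (the server waits for MCP messages on stdin, so stop it with `Ctrl+C`):
```bash
# Start the prebuilt Firestore MCP server manually (stop with Ctrl+C).
FIRESTORE_PROJECT=your-project-id FIRESTORE_DATABASE="(default)" ./toolbox --prebuilt firestore --stdio
```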
## Use Tools
Your AI tool is now connected to Firestore using MCP. Try asking your AI
assistant to list collections, get documents, query collections, or manage
security rules.
The following tools are available to the LLM:
1. **firestore-get-documents**: Gets multiple documents from Firestore by their paths
1. **firestore-list-collections**: List Firestore collections for a given parent path
1. **firestore-delete-documents**: Delete multiple documents from Firestore
1. **firestore-query-collection**: Query documents from a collection with filtering, ordering, and limit options
1. **firestore-get-rules**: Retrieves the active Firestore security rules for the current project
1. **firestore-validate-rules**: Validates Firestore security rules for syntax and semantic errors
{{< notice note >}}
Prebuilt tools are pre-1.0, so expect some tool changes between versions. LLMs
will adapt to the tools available, so this shouldn't affect most users.
{{< /notice >}}

View File

@@ -0,0 +1,298 @@
---
title: "Looker using MCP"
type: docs
weight: 2
description: >
Connect your IDE to Looker using Toolbox.
---
[Model Context Protocol (MCP)](https://modelcontextprotocol.io/introduction) is
an open protocol for connecting Large Language Models (LLMs) to data sources
like Looker. This guide covers how to use [MCP Toolbox for Databases][toolbox]
to expose your developer assistant tools to a Looker instance:
* [Cursor][cursor]
* [Windsurf][windsurf] (Codium)
* [Visual Studio Code][vscode] (Copilot)
* [Cline][cline] (VS Code extension)
* [Claude desktop][claudedesktop]
* [Claude code][claudecode]
[toolbox]: https://github.com/googleapis/genai-toolbox
[cursor]: #configure-your-mcp-client
[windsurf]: #configure-your-mcp-client
[vscode]: #configure-your-mcp-client
[cline]: #configure-your-mcp-client
[claudedesktop]: #configure-your-mcp-client
[claudecode]: #configure-your-mcp-client
## Set up Looker
1. Get a Looker Client ID and Client Secret. Follow the directions
[here](https://cloud.google.com/looker/docs/api-auth#authentication_with_an_sdk).
1. Have the base URL of your Looker instance available. It is likely
something like `https://looker.example.com`. In some cases the API is
listening at a different port, and you will need to use
`https://looker.example.com:19999` instead.
## Install MCP Toolbox
1. Download the latest version of Toolbox as a binary. Select the [correct
binary](https://github.com/googleapis/genai-toolbox/releases) corresponding
to your OS and CPU architecture. You are required to use Toolbox version
v0.10.0+:
<!-- {x-release-please-start-version} -->
{{< tabpane persist=header >}}
{{< tab header="linux/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.10.0/linux/amd64/toolbox
{{< /tab >}}
{{< tab header="darwin/arm64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.10.0/darwin/arm64/toolbox
{{< /tab >}}
{{< tab header="darwin/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.10.0/darwin/amd64/toolbox
{{< /tab >}}
{{< tab header="windows/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.10.0/windows/amd64/toolbox.exe
{{< /tab >}}
{{< /tabpane >}}
<!-- {x-release-please-end} -->
1. Make the binary executable:
```bash
chmod +x toolbox
```
1. Verify the installation:
```bash
./toolbox --version
```
## Configure your MCP Client
{{< tabpane text=true >}}
{{% tab header="Claude code" lang="en" %}}
1. Install [Claude
Code](https://docs.anthropic.com/en/docs/agents-and-tools/claude-code/overview).
1. Create a `.mcp.json` file in your project root if it doesn't exist.
1. Add the following configuration, replace the environment variables with your
values, and save:
```json
{
"mcpServers": {
"looker-toolbox": {
"command": "/PATH/TO/toolbox",
"args": [
"--stdio",
"--prebuilt",
"looker"
],
"env": {
"LOOKER_BASE_URL": "https://looker.example.com",
"LOOKER_CLIENT_ID": "",
"LOOKER_CLIENT_SECRET": "",
"LOOKER_VERIFY_SSL": "true"
}
}
}
}
```
1. Restart Claude Code to apply the new configuration.
{{% /tab %}}
{{% tab header="Claude desktop" lang="en" %}}
1. Open [Claude desktop](https://claude.ai/download) and navigate to Settings.
1. Under the Developer tab, tap Edit Config to open the configuration file.
1. Add the following configuration, replace the environment variables with your
values, and save:
```json
{
"mcpServers": {
"looker-toolbox": {
"command": "/PATH/TO/toolbox",
"args": [
"--stdio",
"--prebuilt",
"looker"
],
"env": {
"LOOKER_BASE_URL": "https://looker.example.com",
"LOOKER_CLIENT_ID": "",
"LOOKER_CLIENT_SECRET": "",
"LOOKER_VERIFY_SSL": "true"
}
}
}
}
```
1. Restart Claude desktop.
1. From the new chat screen, you should see a hammer (MCP) icon appear with the
new MCP server available.
{{% /tab %}}
{{% tab header="Cline" lang="en" %}}
1. Open the [Cline](https://github.com/cline/cline) extension in VS Code and tap
the **MCP Servers** icon.
1. Tap Configure MCP Servers to open the configuration file.
1. Add the following configuration, replace the environment variables with your
values, and save:
```json
{
"mcpServers": {
"looker-toolbox": {
"command": "/PATH/TO/toolbox",
"args": [
"--stdio",
"--prebuilt",
"looker"
],
"env": {
"LOOKER_BASE_URL": "https://looker.example.com",
"LOOKER_CLIENT_ID": "",
"LOOKER_CLIENT_SECRET": "",
"LOOKER_VERIFY_SSL": "true"
}
}
}
}
```
1. You should see a green active status after the server is successfully
connected.
{{% /tab %}}
{{% tab header="Cursor" lang="en" %}}
1. Create a `.cursor` directory in your project root if it doesn't exist.
1. Create a `.cursor/mcp.json` file if it doesn't exist and open it.
1. Add the following configuration, replace the environment variables with your
values, and save:
```json
{
"mcpServers": {
"looker-toolbox": {
"command": "/PATH/TO/toolbox",
"args": [
"--stdio",
"--prebuilt",
"looker"
],
"env": {
"LOOKER_BASE_URL": "https://looker.example.com",
"LOOKER_CLIENT_ID": "",
"LOOKER_CLIENT_SECRET": "",
"LOOKER_VERIFY_SSL": "true"
}
}
}
}
```
1. Open [Cursor](https://www.cursor.com/) and navigate to **Settings > Cursor
Settings > MCP**. You should see a green active status after the server is
successfully connected.
{{% /tab %}}
{{% tab header="Visual Studio Code (Copilot)" lang="en" %}}
1. Open [VS Code](https://code.visualstudio.com/docs/copilot/overview) and
create a `.vscode` directory in your project root if it doesn't exist.
1. Create a `.vscode/mcp.json` file if it doesn't exist and open it.
1. Add the following configuration, replace the environment variables with your
values, and save:
```json
{
"mcpServers": {
"looker-toolbox": {
"command": "/PATH/TO/toolbox",
"args": [
"--stdio",
"--prebuilt",
"looker"
],
"env": {
"LOOKER_BASE_URL": "https://looker.example.com",
"LOOKER_CLIENT_ID": "",
"LOOKER_CLIENT_SECRET": "",
"LOOKER_VERIFY_SSL": "true"
}
}
}
}
```
{{% /tab %}}
{{% tab header="Windsurf" lang="en" %}}
1. Open [Windsurf](https://docs.codeium.com/windsurf) and navigate to the
Cascade assistant.
1. Tap on the hammer (MCP) icon, then Configure to open the configuration file.
1. Add the following configuration, replace the environment variables with your
values, and save:
```json
{
"mcpServers": {
"looker-toolbox": {
"command": "/PATH/TO/toolbox",
"args": [
"--stdio",
"--prebuilt",
"looker"
],
"env": {
"LOOKER_BASE_URL": "https://looker.example.com",
"LOOKER_CLIENT_ID": "",
"LOOKER_CLIENT_SECRET": "",
"LOOKER_VERIFY_SSL": "true"
}
}
}
}
```
{{% /tab %}}
{{< /tabpane >}}
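As a quick check before restarting your client, you can start the prebuilt Looker server manually from the directory containing the `toolbox` binary and confirm it launches without errors (it waits for MCP messages on stdin, so stop it with `Ctrl+C`):
```bash
# Start the prebuilt Looker MCP server manually (stop with Ctrl+C).
LOOKER_BASE_URL=https://looker.example.com \
LOOKER_CLIENT_ID=your-client-id \
LOOKER_CLIENT_SECRET=your-client-secret \
LOOKER_VERIFY_SSL=true \
./toolbox --prebuilt looker --stdio
```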
## Use Tools
Your AI tool is now connected to Looker using MCP. Try asking your AI
assistant to list models, explores, dimensions, and measures. Run a
query, retrieve the SQL for a query, and run a saved Look.
The following tools are available to the LLM:
1. **get_models**: list the LookML models in Looker
1. **get_explores**: list the explores in a given model
1. **get_dimensions**: list the dimensions in a given explore
1. **get_measures**: list the measures in a given explore
1. **get_filters**: list the filters in a given explore
1. **get_parameters**: list the parameters in a given explore
1. **query**: Run a query
1. **query_sql**: Return the SQL generated by Looker for a query
1. **get_looks**: Return the saved Looks that match a title or description
1. **run_look**: Run a saved Look and return the data
{{< notice note >}}
Prebuilt tools are pre-1.0, so expect some tool changes between versions. LLMs
will adapt to the tools available, so this shouldn't affect most users.
{{< /notice >}}

View File

@@ -22,6 +22,17 @@ cluster][alloydb-free-trial].
[alloydb-docs]: https://cloud.google.com/alloydb/docs
[alloydb-free-trial]: https://cloud.google.com/alloydb/docs/create-free-trial-cluster
## Available Tools
- [`alloydb-ai-nl`](../tools/alloydbainl/alloydb-ai-nl.md)
Use natural language queries on AlloyDB, powered by AlloyDB AI.
- [`postgres-sql`](../tools/postgres/postgres-sql.md)
Execute SQL queries as prepared statements in AlloyDB Postgres.
- [`postgres-execute-sql`](../tools/postgres/postgres-execute-sql.md)
Run parameterized SQL statements in AlloyDB Postgres.
## Requirements
### IAM Permissions

View File

@@ -34,6 +34,26 @@ avoiding full table scans or complex filters.
[bigquery-quickstart-cli]: https://cloud.google.com/bigquery/docs/quickstarts/quickstart-command-line
[bigquery-googlesql]: https://cloud.google.com/bigquery/docs/reference/standard-sql/
## Available Tools
- [`bigquery-sql`](../tools/bigquery/bigquery-sql.md)
Run SQL queries directly against BigQuery datasets.
- [`bigquery-execute-sql`](../tools/bigquery/bigquery-execute-sql.md)
Execute structured queries using parameters.
- [`bigquery-get-dataset-info`](../tools/bigquery/bigquery-get-dataset-info.md)
Retrieve metadata for a specific dataset.
- [`bigquery-get-table-info`](../tools/bigquery/bigquery-get-table-info.md)
Retrieve metadata for a specific table.
- [`bigquery-list-dataset-ids`](../tools/bigquery/bigquery-list-dataset-ids.md)
List available dataset IDs.
- [`bigquery-list-table-ids`](../tools/bigquery/bigquery-list-table-ids.md)
List tables in a given dataset.
## Requirements
### IAM Permissions

View File

@@ -32,6 +32,11 @@ such as avoiding full table scans or complex filters.
[bigtable-googlesql]:
https://cloud.google.com/bigtable/docs/googlesql-overview
## Available Tools
- [`bigtable-sql`](../tools/bigtable/bigtable-sql.md)
Run SQL-like queries over Bigtable rows.
## Requirements
### IAM Permissions

View File

@@ -19,6 +19,14 @@ to a database by following these instructions][csql-mssql-connect].
[csql-mssql-docs]: https://cloud.google.com/sql/docs/sqlserver
[csql-mssql-connect]: https://cloud.google.com/sql/docs/sqlserver/connect-overview
## Available Tools
- [`mssql-sql`](../tools/mssql/mssql-sql.md)
Execute pre-defined SQL Server queries with placeholder parameters.
- [`mssql-execute-sql`](../tools/mssql/mssql-execute-sql.md)
Run parameterized SQL Server queries in Cloud SQL for SQL Server.
## Requirements
### IAM Permissions

View File

@@ -20,6 +20,14 @@ to a database by following these instructions][csql-mysql-quickstart].
[csql-mysql-docs]: https://cloud.google.com/sql/docs/mysql
[csql-mysql-quickstart]: https://cloud.google.com/sql/docs/mysql/connect-instance-local-computer
## Available Tools
- [`mysql-sql`](../tools/mysql/mysql-sql.md)
Execute pre-defined prepared SQL queries in MySQL.
- [`mysql-execute-sql`](../tools/mysql/mysql-execute-sql.md)
Run parameterized SQL queries in Cloud SQL for MySQL.
## Requirements
### IAM Permissions

View File

@@ -20,6 +20,14 @@ to a database by following these instructions][csql-pg-quickstart].
[csql-pg-docs]: https://cloud.google.com/sql/docs/postgres
[csql-pg-quickstart]: https://cloud.google.com/sql/docs/postgres/connect-instance-local-computer
## Available Tools
- [`postgres-sql`](../tools/postgres/postgres-sql.md)
Execute SQL queries as prepared statements in PostgreSQL.
- [`postgres-execute-sql`](../tools/postgres/postgres-execute-sql.md)
Run parameterized SQL statements in PostgreSQL.
## Requirements
### IAM Permissions

View File

@@ -11,6 +11,11 @@ description: >
A `couchbase` source establishes a connection to a Couchbase database cluster,
allowing tools to execute SQL queries against it.
## Available Tools
- [`couchbase-sql`](../tools/couchbase/couchbase-sql.md)
Run SQL++ statements on Couchbase with parameterized input.
## Example
```yaml

View File

@@ -21,6 +21,11 @@ Dgraph Cloud. If you're new to Dgraph, the fastest way to get started is to
[dgraph-docs]: https://dgraph.io/docs
[dgraph-login]: https://cloud.dgraph.io/login
## Available Tools
- [`dgraph-dql`](../tools/dgraph/dgraph-dql.md)
Run DQL (Dgraph Query Language) queries.
## Requirements
### Database User

View File

@@ -35,6 +35,7 @@ the IAM identity has been given the correct IAM permissions for accessing
Firestore. Common roles include:
- `roles/datastore.user` - Read and write access to Firestore
- `roles/datastore.viewer` - Read-only access to Firestore
- `roles/firebaserules.admin` - Full management of Firebase Security Rules for Firestore. This role is required for operations that involve creating, updating, or managing Firestore security rules (see [Firebase Security Rules roles][firebaserules-roles])
See [Firestore access control][firestore-iam] for more information on
applying IAM permissions and roles to an identity.
@@ -43,6 +44,7 @@ applying IAM permissions and roles to an identity.
[adc]: https://cloud.google.com/docs/authentication#adc
[set-adc]: https://cloud.google.com/docs/authentication/provide-credentials-adc
[firestore-iam]: https://cloud.google.com/firestore/docs/security/iam
[firebaserules-roles]: https://cloud.google.com/iam/docs/roles-permissions/firebaserules
### Database Selection

View File

@@ -13,6 +13,11 @@ The HTTP Source allows Toolbox to retrieve data from arbitrary HTTP
endpoints. This enables Generative AI applications to access data from web APIs
and other HTTP-accessible resources.
## Available Tools
- [`http`](../tools/http/http.md)
Make HTTP requests to REST APIs or other web services.
## Example
```yaml

View File

@@ -0,0 +1,64 @@
---
title: "Looker"
type: docs
weight: 1
description: >
Looker is a business intelligence tool that also provides a semantic layer.
---
## About
[Looker][looker-docs] is a web-based business intelligence and data management
tool that provides a semantic layer to facilitate querying. It can be deployed
in the cloud, on GCP, or on premises.
[looker-docs]: https://cloud.google.com/looker/docs
## Requirements
### Database User
This source only uses API authentication. You will need to
[create an API user][looker-user] to log in to Looker.
[looker-user]: https://cloud.google.com/looker/docs/api-auth#authentication_with_an_sdk
## Example
```yaml
sources:
my-looker-source:
kind: looker
base_url: https://looker.example.com
client_id: ${LOOKER_CLIENT_ID}
client_secret: ${LOOKER_CLIENT_SECRET}
verify_ssl: true
timeout: 600s
```
The Looker base URL will look like "https://looker.example.com"; don't include a
trailing "/". In some cases, especially if your Looker instance is deployed
on-premises, you may need to add the API port number, like
"https://looker.example.com:19999".
The verify_ssl setting should almost always be "true" (all lower case) unless you
are using a self-signed SSL certificate for the Looker server. Anything other
than "true" will be interpreted as false.
The client ID and client secret are seemingly random character sequences
assigned by the Looker server.
{{< notice tip >}}
Use environment variable replacement with the format ${ENV_NAME}
instead of hardcoding your secrets into the configuration file.
{{< /notice >}}
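For example, with the configuration above you can export the credentials and then start Toolbox (a minimal sketch, assuming the configuration is saved as `tools.yaml` in the working directory):
```bash
export LOOKER_CLIENT_ID="your-client-id"
export LOOKER_CLIENT_SECRET="your-client-secret"
./toolbox --tools-file tools.yaml
```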
## Reference
| **field** | **type** | **required** | **description** |
| ------------- | :------: | :----------: | ----------------------------------------------------------------------------------------- |
| kind | string | true | Must be "looker". |
| base_url | string | true | The URL of your Looker server (no trailing /). |
| client_id | string | true | The client id assigned by Looker. |
| client_secret | string | true | The client secret assigned by Looker. |
| verify_ssl | string | true | Whether to check the ssl certificate of the server. |
| timeout | string | false | Maximum time to wait for query execution (e.g. "30s", "2m"). By default, 120s is applied. |

View File

@@ -0,0 +1,33 @@
---
title: "MongoDB"
type: docs
weight: 1
description: >
MongoDB is a NoSQL data platform that serves general-purpose data needs and also supports vector search, where operational data and the embeddings used for search can live in the same document.
---
## About
[MongoDB][mongodb-docs] is a popular NoSQL database that stores data in flexible, JSON-like documents, making it easy to develop and scale applications.
[mongodb-docs]: https://www.mongodb.com/docs/atlas/getting-started/
## Example
```yaml
sources:
my-mongodb:
kind: mongodb
uri: "mongodb+srv://username:password@host.mongodb.net"
database: sample_mflix
```
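Following the environment-variable convention used by other sources, you can keep credentials out of the configuration file (a sketch, assuming a `MONGODB_URI` variable is set in your environment):
```yaml
sources:
  my-mongodb:
    kind: mongodb
    uri: ${MONGODB_URI}
    database: sample_mflix
```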
## Reference
| **field** | **type** | **required** | **description** |
|-----------|:--------:|:------------:|-------------------------------------------------------------------|
| kind | string | true | Must be "mongodb". |
| uri | string | true | The connection string used to connect to MongoDB. |
| database | string | true | Name of the mongodb database to connect to (e.g. "sample_mflix"). |

View File

@@ -15,6 +15,14 @@ amount of data through a structured format.
[mssql-docs]: https://www.microsoft.com/en-us/sql-server
## Available Tools
- [`mssql-sql`](../tools/mssql/mssql-sql.md)
Execute pre-defined SQL Server queries with placeholder parameters.
- [`mssql-execute-sql`](../tools/mssql/mssql-execute-sql.md)
Run parameterized SQL Server queries in SQL Server.
## Requirements
### Database User

View File

@@ -14,6 +14,14 @@ reliability, performance, and ease of use.
[mysql-docs]: https://www.mysql.com/
## Available Tools
- [`mysql-sql`](../tools/mysql/mysql-sql.md)
Execute pre-defined prepared SQL queries in MySQL.
- [`mysql-execute-sql`](../tools/mysql/mysql-execute-sql.md)
Run parameterized SQL queries in MySQL.
## Requirements
### Database User

View File

@@ -7,12 +7,19 @@ description: >
---
## About
[Neo4j][neo4j-docs] is a powerful, open source graph database system with over
15 years of active development that has earned it a strong reputation for
reliability, feature robustness, and performance.
[neo4j-docs]: https://neo4j.com/docs
## Available Tools
- [`neo4j-cypher`](../tools/neo4j/neo4j-cypher.md)
Run Cypher queries against your Neo4j graph database.
## Requirements
### Database User

View File

@@ -15,6 +15,14 @@ reputation for reliability, feature robustness, and performance.
[pg-docs]: https://www.postgresql.org/
## Available Tools
- [`postgres-sql`](../tools/postgres/postgres-sql.md)
Execute SQL queries as prepared statements in PostgreSQL.
- [`postgres-execute-sql`](../tools/postgres/postgres-execute-sql.md)
Run parameterized SQL statements in PostgreSQL.
## Requirements
### Database User

View File

@@ -18,6 +18,11 @@ geospatial indexes with radius queries.
If you are new to Redis, you can find installation and getting started guides on
the [official Redis website](https://redis.io/docs/getting-started/).
## Available Tools
- [`redis`](../tools/redis/redis.md)
Run Redis commands and interact with key-value pairs.
## Requirements
### Redis

View File

@@ -23,6 +23,14 @@ the Google Cloud console][spanner-quickstart].
[spanner-quickstart]:
https://cloud.google.com/spanner/docs/create-query-database-console
## Available Tools
- [`spanner-sql`](../tools/spanner/spanner-sql.md)
Execute SQL on Google Cloud Spanner.
- [`spanner-execute-sql`](../tools/spanner/spanner-execute-sql.md)
Run structured and parameterized queries on Spanner.
## Requirements
### IAM Permissions

View File

@@ -22,6 +22,11 @@ SQLite has the following notable characteristics:
- Zero-configuration - no setup or administration needed
- Transactional with ACID properties
## Available Tools
- [`sqlite-sql`](../tools/sqlite/sqlite-sql.md)
Run SQL queries against a local SQLite database.
## Requirements
### Database File

View File

@@ -19,6 +19,11 @@ indexes with radius queries.
If you're new to Valkey, you can find installation and getting started guides on
the [official Valkey website](https://valkey.io/topics/quickstart/).
## Available Tools
- [`valkey`](../tools/valkey/valkey.md)
Issue Valkey (Redis-compatible) commands.
## Example
```yaml

View File

@@ -121,7 +121,7 @@ Items in array should not have a `default` or `required` value. If provided, it
The map type is a collection of key-value pairs. It can be configured in two ways:
- Generic Map: By default, it accepts values of any primitive type (string, number, boolean), allowing for mixed data.
- Generic Map: By default, it accepts values of any primitive type (string, integer, float, boolean), allowing for mixed data.
- Typed Map: By setting the valueType field, you can enforce that all values
within the map must be of the same specified type.
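For illustration, here is a minimal sketch of a typed map parameter, assuming the `valueType` field described above plus the usual `name`, `type`, and `description` parameter fields:
```yaml
parameters:
  - name: user_labels
    type: map
    description: Labels to apply; every value must be a string.
    valueType: string
```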

View File

@@ -1,7 +1,11 @@
---
title: firestore-validate-rules
weight: 6
date: 2025-01-07
title: "firestore-validate-rules"
type: docs
weight: 1
description: >
A "firestore-validate-rules" tool validates Firestore security rules syntax and semantic correctness without deploying them. It provides detailed error reporting with source positions and code snippets.
aliases:
- /resources/tools/firestore-validate-rules
---
## Overview
@@ -34,20 +38,20 @@ The tool returns a `ValidationResult` object containing:
```json
{
"valid": boolean, // Whether the rules are valid
"issueCount": number, // Number of issues found
"formattedIssues": string, // Human-readable formatted issues
"rawIssues": [ // Array of raw issue objects
"valid": "boolean",
"issueCount": "number",
"formattedIssues": "string",
"rawIssues": [
{
"sourcePosition": {
"fileName": string,
"line": number,
"column": number,
"currentOffset": number,
"endOffset": number
"fileName": "string",
"line": "number",
"column": "number",
"currentOffset": "number",
"endOffset": "number"
},
"description": string,
"severity": string // e.g., "ERROR", "WARNING"
"description": "string",
"severity": "string"
}
]
}

View File

@@ -0,0 +1,7 @@
---
title: "Looker"
type: docs
weight: 1
description: >
Tools that work with Looker Sources.
---

View File

@@ -0,0 +1,44 @@
---
title: "looker-get-dimensions"
type: docs
weight: 1
description: >
A "looker-get-dimensions" tool returns all the dimensions from a given explore
in a given model in the source.
aliases:
- /resources/tools/looker-get-dimensions
---
## About
A `looker-get-dimensions` tool returns all the dimensions from a given explore
in a given model in the source.
It's compatible with the following sources:
- [looker](../sources/looker.md)
`looker-get-dimensions` accepts two parameters, the `model` and the `explore`.
## Example
```yaml
tools:
get_dimensions:
kind: looker-get-dimensions
source: looker-source
description: |
The get_dimensions tool retrieves the list of dimensions defined in
an explore.
It takes two parameters, the model_name looked up from get_models and the
explore_name looked up from get_explores.
```
## Reference
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "looker-get-dimensions". |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -0,0 +1,44 @@
---
title: "looker-get-explores"
type: docs
weight: 1
description: >
A "looker-get-explores" tool returns all explores
for the given model from the source.
aliases:
- /resources/tools/looker-get-explores
---
## About
A `looker-get-explores` tool returns all explores
for a given model from the source.
It's compatible with the following sources:
- [looker](../sources/looker.md)
`looker-get-explores` accepts one parameter, the
`model` id.
## Example
```yaml
tools:
get_explores:
kind: looker-get-explores
source: looker-source
description: |
The get_explores tool retrieves the list of explores defined in a LookML model
in the Looker system.
It takes one parameter, the model_name looked up from get_models.
```
## Reference
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "looker-get-explores". |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -0,0 +1,44 @@
---
title: "looker-get-filters"
type: docs
weight: 1
description: >
A "looker-get-filters" tool returns all the filters from a given explore
in a given model in the source.
aliases:
- /resources/tools/looker-get-filters
---
## About
A `looker-get-filters` tool returns all the filters from a given explore
in a given model in the source.
It's compatible with the following sources:
- [looker](../sources/looker.md)
`looker-get-filters` accepts two parameters, the `model` and the `explore`.
## Example
```yaml
tools:
get_filters:
kind: looker-get-filters
source: looker-source
description: |
The get_filters tool retrieves the list of filters defined in
an explore.
It takes two parameters, the model_name looked up from get_models and the
explore_name looked up from get_explores.
```
## Reference
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "looker-get-filters". |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -0,0 +1,60 @@
---
title: "looker-get-looks"
type: docs
weight: 1
description: >
"looker-get-looks" searches for saved Looks in a Looker
source.
aliases:
- /resources/tools/looker-get-looks
---
## About
The `looker-get-looks` tool searches for a saved Look by
name or description.
It's compatible with the following sources:
- [looker](../sources/looker.md)
`looker-get-looks` takes four parameters, the `title`, `desc`, `limit`
and `offset`.
Title and description use SQL style wildcards and are case insensitive.
Limit and offset are used to page through a larger set of matches and
default to 100 and 0.
## Example
```yaml
tools:
get_looks:
kind: looker-get-looks
source: looker-source
description: |
get_looks Tool
This tool is used to search for saved looks in a Looker instance.
String search params use case-insensitive matching. String search
params can contain % and '_' as SQL LIKE pattern match wildcard
expressions. example="dan%" will match "danger" and "Danzig" but
not "David" example="D_m%" will match "Damage" and "dump".
Most search params can accept "IS NULL" and "NOT NULL" as special
expressions to match or exclude (respectively) rows where the
column is null.
The limit and offset are used to paginate the results.
The result of the get_looks tool is a list of json objects.
```
## Reference
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "looker-get-looks" |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -0,0 +1,44 @@
---
title: "looker-get-measures"
type: docs
weight: 1
description: >
A "looker-get-measures" tool returns all the measures from a given explore
in a given model in the source.
aliases:
- /resources/tools/looker-get-measures
---
## About
A `looker-get-measures` tool returns all the measures from a given explore
in a given model in the source.
It's compatible with the following sources:
- [looker](../sources/looker.md)
`looker-get-measures` accepts two parameters, the `model` and the `explore`.
## Example
```yaml
tools:
get_measures:
kind: looker-get-measures
source: looker-source
description: |
The get_measures tool retrieves the list of measures defined in
an explore.
It takes two parameters, the model_name looked up from get_models and the
explore_name looked up from get_explores.
```
## Reference
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "looker-get-measures". |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -0,0 +1,40 @@
---
title: "looker-get-models"
type: docs
weight: 1
description: >
A "looker-get-models" tool returns all the models in the source.
aliases:
- /resources/tools/looker-get-models
---
## About
A `looker-get-models` tool returns all the models in the source.
It's compatible with the following sources:
- [looker](../sources/looker.md)
`looker-get-models` accepts no parameters.
## Example
```yaml
tools:
get_models:
kind: looker-get-models
source: looker-source
description: |
The get_models tool retrieves the list of LookML models in the Looker system.
It takes no parameters.
```
## Reference
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "looker-get-models". |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -0,0 +1,44 @@
---
title: "looker-get-parameters"
type: docs
weight: 1
description: >
A "looker-get-parameters" tool returns all the parameters from a given explore
in a given model in the source.
aliases:
- /resources/tools/looker-get-parameters
---
## About
A `looker-get-parameters` tool returns all the parameters from a given explore
in a given model in the source.
It's compatible with the following sources:
- [looker](../sources/looker.md)
`looker-get-parameters` accepts two parameters, the `model` and the `explore`.
## Example
```yaml
tools:
get_parameters:
kind: looker-get-parameters
source: looker-source
description: |
The get_parameters tool retrieves the list of parameters defined in
an explore.
It takes two parameters, the model_name looked up from get_models and the
explore_name looked up from get_explores.
```
## Reference
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "looker-get-parameters". |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -0,0 +1,79 @@
---
title: "looker-query"
type: docs
weight: 1
description: >
"looker-query" runs an inline query using the Looker
semantic model.
aliases:
- /resources/tools/looker-query
---
## About
The `looker-query` tool runs a query using the Looker
semantic model.
It's compatible with the following sources:
- [looker](../sources/looker.md)
`looker-query` takes eight parameters:
1. the `model`
2. the `explore`
3. the `fields` list
4. an optional set of `filters`
5. an optional set of `pivots`
6. an optional set of `sorts`
7. an optional `limit`
8. an optional `tz`
## Example
```yaml
tools:
query:
kind: looker-query
source: looker-source
description: |
Query Tool
This tool is used to run a query against the LookML model. The
model, explore, and fields list must be specified. Pivots,
filters and sorts are optional.
The model can be found from the get_models tool. The explore
can be found from the get_explores tool passing in the model.
The fields can be found from the get_dimensions, get_measures,
get_filters, and get_parameters tools, passing in the model
and the explore.
Provide a model_id and explore_name, then a list
of fields. Optionally a list of pivots can be provided.
The pivots must also be included in the fields list.
Filters are provided as a map of {"field.id": "condition",
"field.id2": "condition2", ...}. Do not put the field.id in
quotes. Filter expressions can be found at
https://cloud.google.com/looker/docs/filter-expressions.
Sorts can be specified like [ "field.id desc 0" ].
An optional row limit can be added. If not provided the limit
will default to 500. "-1" can be specified for unlimited.
An optional query timezone can be added. The query_timezone will
default to that of the workstation where this MCP server is
running, or Etc/UTC if that can't be determined. Not all
models support custom timezones.
The result of the query tool is JSON.
```
## Reference
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "looker-query" |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -0,0 +1,79 @@
---
title: "looker-query-sql"
type: docs
weight: 1
description: >
"looker-query-sql" generates a sql query using the Looker
semantic model.
aliases:
- /resources/tools/looker-query-sql
---
## About
The `looker-query-sql` tool generates a SQL query using the Looker
semantic model.
It's compatible with the following sources:
- [looker](../sources/looker.md)
`looker-query-sql` takes eight parameters:
1. the `model`
2. the `explore`
3. the `fields` list
4. an optional set of `filters`
5. an optional set of `pivots`
6. an optional set of `sorts`
7. an optional `limit`
8. an optional `tz`
## Example
```yaml
tools:
query_sql:
kind: looker-query-sql
source: looker-source
description: |
Query SQL Tool
This tool is used to generate a sql query against the LookML model. The
model, explore, and fields list must be specified. Pivots,
filters and sorts are optional.
The model can be found from the get_models tool. The explore
can be found from the get_explores tool passing in the model.
The fields can be found from the get_dimensions, get_measures,
get_filters, and get_parameters tools, passing in the model
and the explore.
Provide a model_id and explore_name, then a list
of fields. Optionally a list of pivots can be provided.
The pivots must also be included in the fields list.
Filters are provided as a map of {"field.id": "condition",
"field.id2": "condition2", ...}. Do not put the field.id in
quotes. Filter expressions can be found at
https://cloud.google.com/looker/docs/filter-expressions.
Sorts can be specified like [ "field.id desc 0" ].
An optional row limit can be added. If not provided the limit
will default to 500. "-1" can be specified for unlimited.
An optional query timezone can be added. The query_timezone will
default to that of the workstation where this MCP server is
running, or Etc/UTC if that can't be determined. Not all
models support custom timezones.
The result of the query_sql tool is the SQL string.
```
## Reference
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "looker-query-sql" |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -0,0 +1,43 @@
---
title: "looker-run-look"
type: docs
weight: 1
description: >
"looker-run-look" runs the query associated with a saved Look.
aliases:
- /resources/tools/looker-run-look
---
## About
The `looker-run-look` tool runs the query associated with a
saved Look.
It's compatible with the following sources:
- [looker](../sources/looker.md)
`looker-run-look` takes one parameter, the `look_id`.
## Example
```yaml
tools:
run_look:
kind: looker-run-look
source: looker-source
description: |
run_look Tool
This tool runs the query associated with a look and returns
the data in a JSON structure. It accepts the look_id as the
parameter.
```
## Reference
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "looker-run-look" |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -0,0 +1,7 @@
---
title: "MongoDB"
type: docs
weight: 1
description: >
Tools that work with the MongoDB Source.
---

View File

@@ -0,0 +1,53 @@
---
title: "mongodb-delete-many"
type: docs
weight: 1
description: >
A "mongodb-delete-many" tool deletes all documents from a MongoDB collection that match a filter.
aliases:
- /resources/tools/mongodb-delete-many
---
## About
The `mongodb-delete-many` tool performs a **bulk destructive operation**, deleting **ALL** documents from a collection that match a specified filter.
The tool returns the total count of documents that were deleted. If the filter does not match any documents (i.e., the deleted count is 0), the tool will return an error.
This tool is compatible with the following source kind:
* [`mongodb`](../../sources/mongodb.md)
---
## Example
Here is an example that performs a cleanup task by deleting all products from the `inventory` collection that belong to a discontinued brand.
```yaml
tools:
retire_brand_products:
kind: mongodb-delete-many
source: my-mongo-source
description: Deletes all products from a specified discontinued brand.
database: ecommerce
collection: inventory
filterPayload: |
{ "brand_name": {{json .brand_to_delete}} }
filterParams:
- name: brand_to_delete
type: string
description: The name of the discontinued brand whose products should be deleted.
```
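For illustration, if `brand_to_delete` were supplied as `"Acme Legacy"` (a hypothetical value), the `filterPayload` template above would render to the following filter document:
```json
{ "brand_name": "Acme Legacy" }
```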
## Reference
| **field** | **type** | **required** | **description** |
|:--------------|:---------|:-------------|:--------------------------------------------------------------------------------------------------------------------|
| kind | string | true | Must be `mongodb-delete-many`. |
| source | string | true | The name of the `mongodb` source to use. |
| description | string | true | A description of the tool that is passed to the LLM. |
| database | string | true | The name of the MongoDB database containing the collection. |
| collection | string | true | The name of the MongoDB collection from which to delete documents. |
| filterPayload | string | true | The MongoDB query filter document to select the documents for deletion. Uses `{{json .param_name}}` for templating. |
| filterParams | list | true | A list of parameter objects that define the variables used in the `filterPayload`. |

View File

@@ -0,0 +1,55 @@
---
title: "mongodb-delete-one"
type: docs
weight: 1
description: >
A "mongodb-delete-one" tool deletes a single document from a MongoDB collection.
aliases:
- /resources/tools/mongodb-delete-one
---
## About
The `mongodb-delete-one` tool performs a destructive operation, deleting the **first single document** that matches a specified filter from a MongoDB collection.
If the filter matches multiple documents, only the first one found by the database will be deleted. This tool is useful for removing specific entries, such as a user account or a single item from an inventory based on a unique ID.
The tool returns the number of documents deleted, which will be either `1` if a document was found and deleted, or `0` if no matching document was found.
This tool is compatible with the following source kind:
* [`mongodb`](../../sources/mongodb.md)
---
## Example
Here is an example that deletes a specific user account from the `users` collection by matching their unique email address. This is a permanent action.
```yaml
tools:
delete_user_account:
kind: mongodb-delete-one
source: my-mongo-source
description: Permanently deletes a user account by their email address.
database: user_data
collection: users
filterPayload: |
{ "email": {{json .email_address}} }
filterParams:
- name: email_address
type: string
description: The email of the user account to delete.
```
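For illustration, with `email_address` supplied as `"jane.doe@example.com"` (a hypothetical value), the rendered filter document would be:
```json
{ "email": "jane.doe@example.com" }
```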
## Reference
| **field** | **type** | **required** | **description** |
|:--------------|:---------|:-------------|:-------------------------------------------------------------------------------------------------------------------|
| kind | string | true | Must be `mongodb-delete-one`. |
| source | string | true | The name of the `mongodb` source to use. |
| description | string | true | A description of the tool that is passed to the LLM. |
| database | string | true | The name of the MongoDB database containing the collection. |
| collection | string | true | The name of the MongoDB collection from which to delete a document. |
| filterPayload | string | true | The MongoDB query filter document to select the document for deletion. Uses `{{json .param_name}}` for templating. |
| filterParams | list | true | A list of parameter objects that define the variables used in the `filterPayload`. |

View File

@@ -0,0 +1,62 @@
---
title: "mongodb-find-one"
type: docs
weight: 1
description: >
A "mongodb-find-one" tool finds and retrieves a single document from a MongoDB collection.
aliases:
- /resources/tools/mongodb-find-one
---
## About
A `mongodb-find-one` tool is used to retrieve the **first single document** that matches a specified filter from a MongoDB collection. If multiple documents match the filter, you can use `sort` options to control which document is returned. Otherwise, the selection is not guaranteed.
The tool returns a single JSON object representing the document, wrapped in a JSON array.
This tool is compatible with the following source kind:
* [`mongodb`](../../sources/mongodb.md)
---
## Example
Here's a common use case: finding a specific user by their unique email address and returning their profile information, while excluding sensitive fields like the password hash.
```yaml
tools:
get_user_profile:
kind: mongodb-find-one
source: my-mongo-source
description: Retrieves a user's profile by their email address.
database: user_data
collection: profiles
filterPayload: |
{ "email": {{json .email}} }
filterParams:
- name: email
type: string
description: The email address of the user to find.
projectPayload: |
{
"password_hash": 0,
"login_history": 0
}
```
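Assuming a matching profile exists, the result is a one-element JSON array containing the document with `password_hash` and `login_history` stripped out; any other stored fields, including `_id`, still come back as stored. A hypothetical result (field names and values invented for illustration) might look like:
```json
[
  {
    "email": "jane.doe@example.com",
    "display_name": "Jane Doe",
    "created_at": "2024-01-15T09:30:00Z"
  }
]
```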
## Reference
| **field** | **type** | **required** | **description** |
|:---------------|:---------|:-------------|:---------------------------------------------------------------------------------------------------------------------------------------------|
| kind | string | true | Must be `mongodb-find-one`. |
| source | string | true | The name of the `mongodb` source to use. |
| description | string | true | A description of the tool that is passed to the LLM. |
| database | string | true | The name of the MongoDB database to query. |
| collection | string | true | The name of the MongoDB collection to query. |
| filterPayload | string | true | The MongoDB query filter document to select the document. Uses `{{json .param_name}}` for templating. |
| filterParams | list | true | A list of parameter objects that define the variables used in the `filterPayload`. |
| projectPayload | string | false | An optional MongoDB projection document to specify which fields to include (1) or exclude (0) in the result. |
| projectParams | list | false | A list of parameter objects for the `projectPayload`. |
| sortPayload | string | false | An optional MongoDB sort document. Useful for selecting which document to return if the filter matches multiple (e.g., get the most recent). |
| sortParams | list | false | A list of parameter objects for the `sortPayload`. |

View File

@@ -0,0 +1,70 @@
---
title: "mongodb-find"
type: docs
weight: 1
description: >
A "mongodb-find" tool finds and retrieves documents from a MongoDB collection.
aliases:
- /resources/tools/mongodb-find
---
## About
A `mongodb-find` tool is used to query a MongoDB collection and retrieve documents that match a specified filter. It's a flexible tool that allows you to shape the output by selecting specific fields (**projection**), ordering the results (**sorting**), and restricting the number of documents returned (**limiting**).
The tool returns a JSON array of the documents found.
This tool is compatible with the following source kind:
* [`mongodb`](../../sources/mongodb.md)
## Example
Here's an example that finds up to 10 users from the `customers` collection who live in a specific city. The results are sorted by their last name, and only their first name, last name, and email are returned.
```yaml
tools:
find_local_customers:
kind: mongodb-find
source: my-mongo-source
description: Finds customers by city, sorted by last name.
database: crm
collection: customers
limit: 10
filterPayload: |
{ "address.city": {{json .city}} }
filterParams:
- name: city
type: string
description: The city to search for customers in.
projectPayload: |
{
"first_name": 1,
"last_name": 1,
"email": 1,
"_id": 0
}
sortPayload: |
{ "last_name": {{json .sort_order}} }
sortParams:
- name: sort_order
type: integer
description: The sort order (1 for ascending, -1 for descending).
```
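For example, with `city` set to `"Springfield"` and `sort_order` set to `-1` (hypothetical values), the rendered filter and sort documents would be `{ "address.city": "Springfield" }` and `{ "last_name": -1 }`, and the projected results would be shaped like:
```json
[
  { "first_name": "Maude", "last_name": "Simmons", "email": "maude@example.com" },
  { "first_name": "Ned", "last_name": "Flanders", "email": "ned@example.com" }
]
```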
## Reference
| **field** | **type** | **required** | **description** |
|:---------------|:---------|:-------------|:----------------------------------------------------------------------------------------------------------------------------|
| kind | string | true | Must be `mongodb-find`. |
| source | string | true | The name of the `mongodb` source to use. |
| description | string | true | A description of the tool that is passed to the LLM. |
| database | string | true | The name of the MongoDB database to query. |
| collection | string | true | The name of the MongoDB collection to query. |
| filterPayload | string | true | The MongoDB query filter document to select which documents to return. Uses `{{json .param_name}}` for templating. |
| filterParams | list | true | A list of parameter objects that define the variables used in the `filterPayload`. |
| projectPayload | string | false | An optional MongoDB projection document to specify which fields to include (1) or exclude (0) in the results. |
| projectParams | list | false | A list of parameter objects for the `projectPayload`. |
| sortPayload | string | false | An optional MongoDB sort document to define the order of the returned documents. Use 1 for ascending and -1 for descending. |
| sortParams | list | false | A list of parameter objects for the `sortPayload`. |
| limit | integer | false | An optional integer specifying the maximum number of documents to return. |

View File

@@ -0,0 +1,52 @@
---
title: "mongodb-insert-many"
type: docs
weight: 1
description: >
A "mongodb-insert-many" tool inserts multiple new documents into a MongoDB collection.
aliases:
- /resources/tools/mongodb-insert-many
---
## About
The `mongodb-insert-many` tool inserts **multiple new documents** into a specified MongoDB collection in a single bulk operation. This is highly efficient for adding large amounts of data at once.
This tool takes one required parameter named `data`. This `data` parameter must be a string containing a **JSON array of document objects**. Upon successful insertion, the tool returns a JSON array containing the unique `_id` of **each** new document that was created.
This tool is compatible with the following source kind:
* [`mongodb`](../../sources/mongodb.md)
---
## Example
Here is an example configuration for a tool that logs multiple events at once.
```yaml
tools:
log_batch_events:
kind: mongodb-insert-many
source: my-mongo-source
description: Inserts a batch of event logs into the database.
database: logging
collection: events
canonical: true
```
An LLM would call this tool by providing an array of documents as a JSON string in the `data` parameter, like this:
`tool_code: log_batch_events(data='[{"event": "login", "user": "user1"}, {"event": "click", "user": "user2"}, {"event": "logout", "user": "user1"}]')`
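Because `canonical: true` is set in this example, the `data` string is parsed as MongoDB's Canonical Extended JSON, which spells out types explicitly. A numeric field (hypothetical here) would be written as shown below, whereas the Relaxed form would simply be `"retries": 3`:
```json
[{ "event": "login", "user": "user1", "retries": { "$numberInt": "3" } }]
```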
---
## Reference
| **field** | **type** | **required** | **description** |
|:------------|:---------|:-------------|:---------------------------------------------------------------------------------------------------|
| kind | string | true | Must be `mongodb-insert-many`. |
| source | string | true | The name of the `mongodb` source to use. |
| description | string | true | A description of the tool that is passed to the LLM. |
| database | string | true | The name of the MongoDB database containing the collection. |
| collection | string | true | The name of the MongoDB collection into which the documents will be inserted. |
| canonical | bool | true | Determines if the data string is parsed using MongoDB's Canonical or Relaxed Extended JSON format. |

View File

@@ -0,0 +1,48 @@
---
title: "mongodb-insert-one"
type: docs
weight: 1
description: >
A "mongodb-insert-one" tool inserts a single new document into a MongoDB collection.
aliases:
- /resources/tools/mongodb-insert-one
---
## About
The `mongodb-insert-one` tool inserts a **single new document** into a specified MongoDB collection.
This tool takes one required parameter named `data`, which must be a string containing the JSON object you want to insert. Upon successful insertion, the tool returns the unique `_id` of the newly created document.
This tool is compatible with the following source kind:
* [`mongodb`](../../sources/mongodb.md)
## Example
Here is an example configuration for a tool that adds a new user to a `users` collection.
```yaml
tools:
create_new_user:
kind: mongodb-insert-one
source: my-mongo-source
description: Creates a new user record in the database.
database: user_data
collection: users
canonical: false
```
An LLM would call this tool by providing the document as a JSON string in the `data` parameter, like this:
`tool_code: create_new_user(data='{"email": "new.user@example.com", "name": "Jane Doe", "status": "active"}')`
## Reference
| **field** | **type** | **required** | **description** |
|:------------|:---------|:-------------|:---------------------------------------------------------------------------------------------------|
| kind | string | true | Must be `mongodb-insert-one`. |
| source | string | true | The name of the `mongodb` source to use. |
| description | string | true | A description of the tool that is passed to the LLM. |
| database | string | true | The name of the MongoDB database containing the collection. |
| collection | string | true | The name of the MongoDB collection into which the document will be inserted. |
| canonical | bool | true | Determines if the data string is parsed using MongoDB's Canonical or Relaxed Extended JSON format. |

View File

@@ -0,0 +1,68 @@
---
title: "mongodb-update-many"
type: docs
weight: 1
description: >
A "mongodb-update-many" tool updates all documents in a MongoDB collection that match a filter.
aliases:
- /resources/tools/mongodb-update-many
---
## About
A `mongodb-update-many` tool updates **all** documents within a specified MongoDB collection that match a given filter. It locates the documents using a `filterPayload` and applies the modifications defined in an `updatePayload`.
The tool returns an array of three integers: `[ModifiedCount, UpsertedCount, MatchedCount]`.
This tool is compatible with the following source kind:
* [`mongodb`](../../sources/mongodb.md)
---
## Example
Here's an example configuration. This tool applies a discount to all items within a specific category and also marks them as being on sale.
```yaml
tools:
apply_category_discount:
kind: mongodb-update-many
source: my-mongo-source
description: Use this tool to apply a discount to all items in a given category.
database: products
collection: inventory
filterPayload: |
{ "category": {{json .category_name}} }
filterParams:
- name: category_name
type: string
description: The category of items to update.
updatePayload: |
{
"$mul": { "price": {{json .discount_multiplier}} },
"$set": { "on_sale": true }
}
updateParams:
- name: discount_multiplier
type: number
description: The multiplier to apply to the price (e.g., 0.8 for a 20% discount).
canonical: false
upsert: false
```
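For illustration, with `category_name` set to `"outerwear"` and `discount_multiplier` set to `0.8` (hypothetical values), the rendered filter would be `{ "category": "outerwear" }` and the rendered update document would be:
```json
{ "$mul": { "price": 0.8 }, "$set": { "on_sale": true } }
```
If, say, 12 documents matched and were modified with no upsert, the tool would return `[12, 0, 12]`.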
## Reference
| **field** | **type** | **required** | **description** |
|:--------------|:---------|:-------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| kind | string | true | Must be `mongodb-update-many`. |
| source | string | true | The name of the `mongodb` source to use. |
| description | string | true | A description of the tool that is passed to the LLM. |
| database | string | true | The name of the MongoDB database containing the collection. |
| collection | string | true | The name of the MongoDB collection in which to update documents. |
| filterPayload | string | true | The MongoDB query filter document to select the documents for updating. It's written as a Go template, using `{{json .param_name}}` to insert parameters. |
| filterParams | list | true | A list of parameter objects that define the variables used in the `filterPayload`. |
| updatePayload | string   | true         | The MongoDB update document. It's written as a Go template, using `{{json .param_name}}` to insert parameters.                                                                                                                 |
| updateParams | list | true | A list of parameter objects that define the variables used in the `updatePayload`. |
| canonical | bool | true | Determines if the `filterPayload` and `updatePayload` strings are parsed using MongoDB's Canonical or Relaxed Extended JSON format. **Canonical** is stricter about type representation, while **Relaxed** is more lenient. |
| upsert | bool | false | If `true`, a new document is created if no document matches the `filterPayload`. Defaults to `false`. |

View File

@@ -0,0 +1,66 @@
---
title: "mongodb-update-one"
type: docs
weight: 1
description: >
A "mongodb-update-one" tool updates a single document in a MongoDB collection.
aliases:
- /resources/tools/mongodb-update-one
---
## About
A `mongodb-update-one` tool updates a single document within a specified MongoDB collection. It locates the document to be updated using a `filterPayload` and applies modifications defined in an `updatePayload`. If the filter matches multiple documents, only the first one found will be updated.
This tool is compatible with the following source kind:
* [`mongodb`](../../sources/mongodb.md)
---
## Example
Here's an example of a `mongodb-update-one` tool configuration. This tool updates the `stock` and `status` fields of a document in the `inventory` collection where the `item` field matches a provided value. If no matching document is found, the `upsert: true` option will create a new one.
```yaml
tools:
update_inventory_item:
kind: mongodb-update-one
source: my-mongo-source
description: Use this tool to update an item's stock and status in the inventory.
database: products
collection: inventory
filterPayload: |
{ "item": {{json .item_name}} }
filterParams:
- name: item_name
type: string
description: The name of the item to update.
updatePayload: |
{ "$set": { "stock": {{json .new_stock}}, "status": {{json .new_status}} } }
updateParams:
- name: new_stock
type: integer
description: The new stock quantity.
- name: new_status
type: string
description: The new status of the item (e.g., "In Stock", "Backordered").
canonical: false
upsert: true
```
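Because `upsert: true` is set here, if no document in `inventory` matches the given `item`, a new document is created from the equality fields of the filter plus the `$set` fields. For example (values are hypothetical), a miss on `"widget"` with `new_stock: 25` and `new_status: "In Stock"` would insert roughly:
```json
{ "item": "widget", "stock": 25, "status": "In Stock" }
```
along with an automatically generated `_id`.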
## Reference
| **field** | **type** | **required** | **description** |
|:--------------|:---------|:-------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| kind | string | true | Must be `mongodb-update-one`. |
| source | string | true | The name of the `mongodb` source to use. |
| description | string | true | A description of the tool that is passed to the LLM. |
| database | string | true | The name of the MongoDB database containing the collection. |
| collection | string | true | The name of the MongoDB collection to update a document in. |
| filterPayload | string | true | The MongoDB query filter document to select the document for updating. It's written as a Go template, using `{{json .param_name}}` to insert parameters. |
| filterParams | list | true | A list of parameter objects that define the variables used in the `filterPayload`. |
| updatePayload | string | true | The MongoDB update document, which specifies the modifications. This often uses update operators like `$set`. It's written as a Go template, using `{{json .param_name}}` to insert parameters. |
| updateParams | list | true | A list of parameter objects that define the variables used in the `updatePayload`. |
| canonical | bool | true | Determines if the `updatePayload` string is parsed using MongoDB's Canonical or Relaxed Extended JSON format. **Canonical** is stricter about type representation (e.g., `{"$numberInt": "42"}`), while **Relaxed** is more lenient (e.g., `42`). |
| upsert | bool | false | If `true`, a new document is created if no document matches the `filterPayload`. Defaults to `false`. |

View File

@@ -0,0 +1,53 @@
---
title: "neo4j-execute-cypher"
type: docs
weight: 1
description: >
A "neo4j-execute-cypher" tool executes any arbitrary Cypher statement against a Neo4j
database.
aliases:
- /resources/tools/neo4j-execute-cypher
---
## About
A `neo4j-execute-cypher` tool executes an arbitrary Cypher query provided as a string parameter against a Neo4j database. It's designed to be a flexible tool for interacting with the database when a pre-defined query is not sufficient. This tool is compatible with any of the following sources:
- [neo4j](../sources/neo4j.md)
For security, the tool can be configured to be read-only. If the `readOnly` flag is set to `true`, the tool will analyze the incoming Cypher query and reject any write operations (like `CREATE`, `MERGE`, `DELETE`, etc.) before execution.
The Cypher query uses standard [Neo4j Cypher](https://neo4j.com/docs/cypher-manual/current/queries/) syntax and supports all Cypher features, including pattern matching, filtering, and aggregation.
`neo4j-execute-cypher` takes one input parameter, `cypher`, and runs the Cypher query against the `source`.
> **Note:** This tool is intended for developer assistant workflows with
> human-in-the-loop and shouldn't be used for production agents.
## Example
```yaml
tools:
query_neo4j:
kind: neo4j-execute-cypher
source: my-neo4j-prod-db
readOnly: true
description: |
Use this tool to execute a Cypher query against the production database.
Only read-only queries are allowed.
Takes a single 'cypher' parameter containing the full query string.
Example:
{{
"cypher": "MATCH (m:Movie {title: 'The Matrix'}) RETURN m.released"
}}
```
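With `readOnly: true` as configured above, a statement such as `CREATE (p:Person {name: 'Alice'})` would be rejected before execution, while the read-only `MATCH ... RETURN` example in the description would run normally.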
## Reference
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|-------------------------------------------------------------------------------------------------|
| kind        | string                                     | true         | Must be "neo4j-execute-cypher".                                                                   |
| source | string | true | Name of the source the Cypher query should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |
| readOnly | boolean | false | If set to `true`, the tool will reject any write operations in the Cypher query. Default is `false`. |

View File

@@ -0,0 +1,7 @@
---
title: "Utility tools"
type: docs
weight: 1
description: >
Tools that provide utility.
---

View File

@@ -0,0 +1,47 @@
---
title: "alloydb-wait-for-operation"
type: docs
weight: 10
description: >
Wait for a long-running AlloyDB operation to complete.
---
The `alloydb-wait-for-operation` tool is a utility tool that waits for a long-running AlloyDB operation to complete. It does this by polling the AlloyDB Admin API operation status endpoint until the operation is finished, using exponential backoff.
{{< notice info >}}
This tool is intended for developer assistant workflows with human-in-the-loop and shouldn't be used for production agents.
{{< /notice >}}
## Example
```yaml
sources:
alloydb-api-source:
kind: http
baseUrl: https://alloydb.googleapis.com
headers:
Authorization: Bearer ${API_KEY}
Content-Type: application/json
tools:
alloydb-operations-get:
kind: alloydb-wait-for-operation
source: alloydb-api-source
description: "This will poll on operations API until the operation is done. For checking operation status we need projectId, locationID and operationId. Once instance is created give follow up steps on how to use the variables to bring data plane MCP server up in local and remote setup."
delay: 1s
maxDelay: 4m
multiplier: 2
maxRetries: 10
```
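Under the example settings above (initial delay of 1s, multiplier of 2, cap of 4m, at most 10 attempts), the successive polling delays would grow roughly as sketched below before the tool gives up; exact timing depends on when the operation completes:
```json
["1s", "2s", "4s", "8s", "16s", "32s", "1m4s", "2m8s", "4m", "4m"]
```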
## Reference
| **field** | **type** | **required** | **description** |
| ----------- | :------: | :----------: | ---------------------------------------------------------------------------------------------------------------- |
| kind | string | true | Must be "alloydb-wait-for-operation". |
| source | string | true | Name of the source the HTTP request should be sent to. |
| description | string | true | A description of the tool. |
| delay | duration | false | The initial delay between polling requests (e.g., `3s`). Defaults to 3 seconds. |
| maxDelay | duration | false | The maximum delay between polling requests (e.g., `4m`). Defaults to 4 minutes. |
| multiplier | float | false | The multiplier for the polling delay. The delay is multiplied by this value after each request. Defaults to 2.0. |
| maxRetries | int | false | The maximum number of polling attempts before giving up. Defaults to 10. |

View File

@@ -14,9 +14,9 @@ A `wait` tool pauses execution for a specified duration. This can be useful in w
`wait` takes one input parameter `duration` which is a string representing the time to wait (e.g., "10s", "2m", "1h").
{{% notice info %}}
{{< notice info >}}
This tool is intended for developer assistant workflows with human-in-the-loop and shouldn't be used for production agents.
{{% /notice %}}
{{< /notice >}}
## Example

View File

@@ -0,0 +1,7 @@
---
title: "Looker"
type: docs
weight: 1
description: >
How to get started with Toolbox using Looker.
---

Binary file not shown (new image, 22 KiB)

Binary file not shown (new image, 24 KiB)

View File

@@ -0,0 +1,108 @@
---
title: "Quickstart (MCP with Looker and Gemini-CLI)"
type: docs
weight: 2
description: >
How to get started running Toolbox with Gemini-CLI and Looker as the source.
---
## Overview
[Model Context Protocol](https://modelcontextprotocol.io) is an open protocol
that standardizes how applications provide context to LLMs. Check out this page
on how to [connect to Toolbox via MCP](../../how-to/connect_via_mcp.md).
## Step 1: Get a Looker Client ID and Client Secret
The Looker Client ID and Client Secret can be obtained from the Users page of
your Looker instance. Refer to the documentation
[here](https://cloud.google.com/looker/docs/api-auth#authentication_with_an_sdk).
You may need to ask an administrator to get the Client ID and Client Secret
for you.
## Step 2: Install and configure Toolbox
In this section, we will download Toolbox and run the Toolbox server.
1. Download the latest version of Toolbox as a binary:
{{< notice tip >}}
Select the
[correct binary](https://github.com/googleapis/genai-toolbox/releases)
corresponding to your OS and CPU architecture.
{{< /notice >}}
<!-- {x-release-please-start-version} -->
```bash
export OS="linux/amd64" # one of linux/amd64, darwin/arm64, darwin/amd64, or windows/amd64
curl -O https://storage.googleapis.com/genai-toolbox/v0.10.0/$OS/toolbox
```
<!-- {x-release-please-end} -->
1. Make the binary executable:
```bash
chmod +x toolbox
```
1. Edit the file `~/.gemini/settings.json` and add the following
to the list of mcpServers. Use the Client ID and Client Secret
you obtained earlier.
```json
"mcpServers": {
"looker-toolbox": {
"command": "/path/to/toolbox",
"args": [
"--stdio",
"--prebuilt",
"looker"
],
"env": {
"LOOKER_BASE_URL": "https://looker.example.com",
"LOOKER_CLIENT_ID": "",
"LOOKER_CLIENT_SECRET": "",
"LOOKER_VERIFY_SSL": "true"
}
}
}
```
In some instances you may need to append `:19999` to
the LOOKER_BASE_URL.
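For example, the entry would then read `"LOOKER_BASE_URL": "https://looker.example.com:19999"`.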
## Step 3: Start Gemini-CLI
1. Run Gemini-CLI:
```bash
npx https://github.com/google-gemini/gemini-cli
```
1. Type `y` when it asks to download.
1. Log into Gemini-CLI.
1. Enter the command `/mcp` and you should see a list of
available tools like:
```
Configured MCP servers:
🟢 looker-toolbox - Ready (10 tools)
- looker-toolbox__get_models
- looker-toolbox__query
- looker-toolbox__get_looks
- looker-toolbox__get_measures
- looker-toolbox__get_filters
- looker-toolbox__get_parameters
- looker-toolbox__get_explores
- looker-toolbox__query_sql
- looker-toolbox__get_dimensions
- looker-toolbox__run_look
```
1. Start exploring your Looker instance with commands like
`Find an explore to see orders` or `show me my current
inventory broken down by item category`.
1. Gemini will prompt you for your approval before using
a tool. You can approve all the tools.

View File

@@ -0,0 +1,103 @@
---
title: "Quickstart (MCP with Looker)"
type: docs
weight: 2
description: >
How to get started running Toolbox with MCP Inspector and Looker as the source.
---
## Overview
[Model Context Protocol](https://modelcontextprotocol.io) is an open protocol
that standardizes how applications provide context to LLMs. Check out this page
on how to [connect to Toolbox via MCP](../../how-to/connect_via_mcp.md).
## Step 1: Get a Looker Client ID and Client Secret
The Looker Client ID and Client Secret can be obtained from the Users page of
your Looker instance. Refer to the documentation
[here](https://cloud.google.com/looker/docs/api-auth#authentication_with_an_sdk).
You may need to ask an administrator to get the Client ID and Client Secret
for you.
## Step 2: Install and configure Toolbox
In this section, we will download Toolbox and run the Toolbox server.
1. Download the latest version of Toolbox as a binary:
{{< notice tip >}}
Select the
[correct binary](https://github.com/googleapis/genai-toolbox/releases)
corresponding to your OS and CPU architecture.
{{< /notice >}}
<!-- {x-release-please-start-version} -->
```bash
export OS="linux/amd64" # one of linux/amd64, darwin/arm64, darwin/amd64, or windows/amd64
curl -O https://storage.googleapis.com/genai-toolbox/v0.10.0/$OS/toolbox
```
<!-- {x-release-please-end} -->
1. Make the binary executable:
```bash
chmod +x toolbox
```
1. Create a file `looker_env` with the settings for your
Looker instance. Use the Client ID and Client Secret
you obtained earlier.
```bash
export LOOKER_BASE_URL=https://looker.example.com
export LOOKER_VERIFY_SSL=true
export LOOKER_CLIENT_ID=Q7ynZkRkvj9S9FHPm4Wj
export LOOKER_CLIENT_SECRET=P5JvZstFnhpkhCYy2yNSfJ6x
```
In some instances you may need to append `:19999` to
the LOOKER_BASE_URL.
1. Load the looker_env file into your environment.
```bash
source looker_env
```
1. Run the Toolbox server using the prebuilt Looker tools.
```bash
./toolbox --prebuilt looker
```
## Step 3: Connect to MCP Inspector
1. Run the MCP Inspector:
```bash
npx @modelcontextprotocol/inspector
```
1. Type `y` when it asks to install the inspector package.
1. It should show the following when the MCP Inspector is up and running:
```bash
🔍 MCP Inspector is up and running at http://127.0.0.1:5173 🚀
```
1. Open the above link in your browser.
1. For `Transport Type`, select `SSE`.
1. For `URL`, type in `http://127.0.0.1:5000/mcp/sse`.
1. Click Connect.
![inspector](./inspector.png)
1. Select `List Tools`, and you will see a list of tools.
![inspector_tools](./inspector_tools.png)
1. Test out your tools here!

18
go.mod
View File

@@ -26,11 +26,14 @@ require (
github.com/google/uuid v1.6.0
github.com/jackc/pgx/v5 v5.7.5
github.com/json-iterator/go v1.1.12
github.com/looker-open-source/sdk-codegen/go v0.25.10
github.com/microsoft/go-mssqldb v1.9.2
github.com/neo4j/neo4j-go-driver/v5 v5.28.1
github.com/redis/go-redis/v9 v9.11.0
github.com/spf13/cobra v1.9.1
github.com/thlib/go-timezone-local v0.0.7
github.com/valkey-io/valkey-go v1.0.63
go.mongodb.org/mongo-driver v1.17.4
go.opentelemetry.io/contrib/propagators/autoprop v0.62.0
go.opentelemetry.io/otel v1.37.0
go.opentelemetry.io/otel/exporters/otlp/otlpmetric/otlpmetrichttp v1.37.0
@@ -40,7 +43,7 @@ require (
go.opentelemetry.io/otel/sdk/metric v1.37.0
go.opentelemetry.io/otel/trace v1.37.0
golang.org/x/oauth2 v0.30.0
google.golang.org/api v0.242.0
google.golang.org/api v0.243.0
modernc.org/sqlite v1.38.0
)
@@ -50,7 +53,7 @@ require (
cel.dev/expr v0.23.0 // indirect
cloud.google.com/go v0.121.2 // indirect
cloud.google.com/go/alloydb v1.18.0 // indirect
cloud.google.com/go/auth v0.16.2 // indirect
cloud.google.com/go/auth v0.16.3 // indirect
cloud.google.com/go/auth/oauth2adapt v0.2.8 // indirect
cloud.google.com/go/compute/metadata v0.7.0 // indirect
cloud.google.com/go/iam v1.5.2 // indirect
@@ -90,7 +93,7 @@ require (
github.com/google/flatbuffers v23.5.26+incompatible // indirect
github.com/google/s2a-go v0.1.9 // indirect
github.com/googleapis/enterprise-certificate-proxy v0.3.6 // indirect
github.com/googleapis/gax-go/v2 v2.14.2 // indirect
github.com/googleapis/gax-go/v2 v2.15.0 // indirect
github.com/gorilla/websocket v1.5.3 // indirect
github.com/grpc-ecosystem/go-grpc-middleware v1.4.0 // indirect
github.com/grpc-ecosystem/grpc-gateway/v2 v2.27.1 // indirect
@@ -104,12 +107,16 @@ require (
github.com/mattn/go-isatty v0.0.20 // indirect
github.com/modern-go/concurrent v0.0.0-20180306012644-bacd9c7ef1dd // indirect
github.com/modern-go/reflect2 v1.0.2 // indirect
github.com/montanaflynn/stats v0.7.1 // indirect
github.com/ncruces/go-strftime v0.1.9 // indirect
github.com/pierrec/lz4/v4 v4.1.18 // indirect
github.com/planetscale/vtprotobuf v0.6.1-0.20240319094008-0393e58bdf10 // indirect
github.com/remyoudompheng/bigfft v0.0.0-20230129092748-24d4a6f8daec // indirect
github.com/spf13/pflag v1.0.6 // indirect
github.com/spiffe/go-spiffe/v2 v2.5.0 // indirect
github.com/xdg-go/pbkdf2 v1.0.0 // indirect
github.com/xdg-go/scram v1.1.2 // indirect
github.com/xdg-go/stringprep v1.0.4 // indirect
github.com/youmark/pkcs8 v0.0.0-20240726163527-a2c0da244d78 // indirect
github.com/zeebo/errs v1.4.0 // indirect
github.com/zeebo/xxh3 v1.0.2 // indirect
@@ -135,11 +142,12 @@ require (
golang.org/x/time v0.12.0 // indirect
golang.org/x/tools v0.34.0 // indirect
golang.org/x/xerrors v0.0.0-20240903120638-7835f813f4da // indirect
google.golang.org/genproto v0.0.0-20250505200425-f936aa4a68b2 // indirect
google.golang.org/genproto v0.0.0-20250603155806-513f23925822 // indirect
google.golang.org/genproto/googleapis/api v0.0.0-20250603155806-513f23925822 // indirect
google.golang.org/genproto/googleapis/rpc v0.0.0-20250707201910-8d1bb00bc6a7 // indirect
google.golang.org/genproto/googleapis/rpc v0.0.0-20250715232539-7130f93afb79 // indirect
google.golang.org/grpc v1.73.0 // indirect
google.golang.org/protobuf v1.36.6 // indirect
gopkg.in/ini.v1 v1.67.0 // indirect
modernc.org/libc v1.65.10 // indirect
modernc.org/mathutil v1.7.1 // indirect
modernc.org/memory v1.11.0 // indirect

38
go.sum
View File

@@ -105,8 +105,8 @@ cloud.google.com/go/assuredworkloads v1.7.0/go.mod h1:z/736/oNmtGAyU47reJgGN+KVo
cloud.google.com/go/assuredworkloads v1.8.0/go.mod h1:AsX2cqyNCOvEQC8RMPnoc0yEarXQk6WEKkxYfL6kGIo=
cloud.google.com/go/assuredworkloads v1.9.0/go.mod h1:kFuI1P78bplYtT77Tb1hi0FMxM0vVpRC7VVoJC3ZoT0=
cloud.google.com/go/assuredworkloads v1.10.0/go.mod h1:kwdUQuXcedVdsIaKgKTp9t0UJkE5+PAVNhdQm4ZVq2E=
cloud.google.com/go/auth v0.16.2 h1:QvBAGFPLrDeoiNjyfVunhQ10HKNYuOwZ5noee0M5df4=
cloud.google.com/go/auth v0.16.2/go.mod h1:sRBas2Y1fB1vZTdurouM0AzuYQBMZinrUYL8EufhtEA=
cloud.google.com/go/auth v0.16.3 h1:kabzoQ9/bobUmnseYnBO6qQG7q4a/CffFRlJSxv2wCc=
cloud.google.com/go/auth v0.16.3/go.mod h1:NucRGjaXfzP1ltpcQ7On/VTZ0H4kWB5Jy+Y9Dnm76fA=
cloud.google.com/go/auth/oauth2adapt v0.2.8 h1:keo8NaayQZ6wimpNSmW5OPc283g65QNIiLpZnkHRbnc=
cloud.google.com/go/auth/oauth2adapt v0.2.8/go.mod h1:XQ9y31RkqZCcwJWNSx2Xvric3RrU88hAYYbjDWYDL+c=
cloud.google.com/go/automl v1.5.0/go.mod h1:34EjfoFGMZ5sgJ9EoLsRtdPSNZLcfflJR39VbVNS2M0=
@@ -940,8 +940,8 @@ github.com/googleapis/gax-go/v2 v2.5.1/go.mod h1:h6B0KMMFNtI2ddbGJn3T3ZbwkeT6yqE
github.com/googleapis/gax-go/v2 v2.6.0/go.mod h1:1mjbznJAPHFpesgE5ucqfYEscaz5kMdcIDwU/6+DDoY=
github.com/googleapis/gax-go/v2 v2.7.0/go.mod h1:TEop28CZZQ2y+c0VxMUmu1lV+fQx57QpBWsYpwqHJx8=
github.com/googleapis/gax-go/v2 v2.7.1/go.mod h1:4orTrqY6hXxxaUL4LHIPl6lGo8vAE38/qKbhSAKP6QI=
github.com/googleapis/gax-go/v2 v2.14.2 h1:eBLnkZ9635krYIPD+ag1USrOAI0Nr0QYF3+/3GqO0k0=
github.com/googleapis/gax-go/v2 v2.14.2/go.mod h1:ON64QhlJkhVtSqp4v1uaK92VyZ2gmvDQsweuyLV+8+w=
github.com/googleapis/gax-go/v2 v2.15.0 h1:SyjDc1mGgZU5LncH8gimWo9lW1DtIfPibOG81vgd/bo=
github.com/googleapis/gax-go/v2 v2.15.0/go.mod h1:zVVkkxAQHa1RQpg9z2AUCMnKhi0Qld9rcmyfL1OZhoc=
github.com/googleapis/go-type-adapters v1.0.0/go.mod h1:zHW75FOG2aur7gAO2B+MLby+cLsWGBF62rFAi7WjWO4=
github.com/googleapis/google-cloud-go-testing v0.0.0-20200911160855-bcd43fbb19e8/go.mod h1:dvDLG8qkwmyD9a/MJJN3XJcT3xFxOKAvTZGvuZmac9g=
github.com/gorilla/websocket v1.5.3 h1:saDtZ6Pbx/0u+bgYQ3q96pZgCzfhKXGPqt7kZ72aNNg=
@@ -1010,6 +1010,8 @@ github.com/kylelemons/godebug v1.1.0 h1:RPNrshWIDI6G2gRW9EHilWtl7Z6Sb1BR0xunSBf0
github.com/kylelemons/godebug v1.1.0/go.mod h1:9/0rRGxNHcop5bhtWyNeEfOS8JIWk580+fNqagV/RAw=
github.com/leodido/go-urn v1.4.0 h1:WT9HwE9SGECu3lg4d/dIA+jxlljEa1/ffXKmRjqdmIQ=
github.com/leodido/go-urn v1.4.0/go.mod h1:bvxc+MVxLKB4z00jd1z+Dvzr47oO32F/QSNjSBOlFxI=
github.com/looker-open-source/sdk-codegen/go v0.25.10 h1:ltBbwkwZrQEHEIKrE5QbF+EtBlweKN0RZpQR0w2GIqo=
github.com/looker-open-source/sdk-codegen/go v0.25.10/go.mod h1:YM/IYSsTPk7I54j4l6PduNJYgXyOShuaMi7mD6xic8E=
github.com/lyft/protoc-gen-star v0.6.0/go.mod h1:TGAoBVkt8w7MPG72TrKIu85MIdXwDuzJYeZuUPFPNwA=
github.com/lyft/protoc-gen-star v0.6.1/go.mod h1:TGAoBVkt8w7MPG72TrKIu85MIdXwDuzJYeZuUPFPNwA=
github.com/lyft/protoc-gen-star/v2 v2.0.1/go.mod h1:RcCdONR2ScXaYnQC5tUzxzlpA3WVYF7/opLeUgcQs/o=
@@ -1027,6 +1029,8 @@ github.com/modern-go/concurrent v0.0.0-20180306012644-bacd9c7ef1dd h1:TRLaZ9cD/w
github.com/modern-go/concurrent v0.0.0-20180306012644-bacd9c7ef1dd/go.mod h1:6dJC0mAP4ikYIbvyc7fijjWJddQyLn8Ig3JB5CqoB9Q=
github.com/modern-go/reflect2 v1.0.2 h1:xBagoLtFs94CBntxluKeaWgTMpvLxC4ur3nMaC9Gz0M=
github.com/modern-go/reflect2 v1.0.2/go.mod h1:yWuevngMOJpCy52FWWMvUC8ws7m/LJsjYzDa0/r8luk=
github.com/montanaflynn/stats v0.7.1 h1:etflOAAHORrCC44V+aR6Ftzort912ZU+YLiSTuV8eaE=
github.com/montanaflynn/stats v0.7.1/go.mod h1:etXPPgVO6n31NxCd9KQUMvCM+ve0ruNzt6R8Bnaayow=
github.com/ncruces/go-strftime v0.1.9 h1:bY0MQC28UADQmHmaF5dgpLmImcShSi2kHU9XLdhx/f4=
github.com/ncruces/go-strftime v0.1.9/go.mod h1:Fwc5htZGVVkseilnfgOVb9mKy6w1naJmn9CehxcKcls=
github.com/neo4j/neo4j-go-driver/v5 v5.28.1 h1:RKWQW7wTgYAY2fU9S+9LaJ9OwRPbRc0I17tlT7nDmAY=
@@ -1096,8 +1100,16 @@ github.com/stretchr/testify v1.8.1/go.mod h1:w2LPCIKwWwSfY2zedu0+kehJoqGctiVI29o
github.com/stretchr/testify v1.8.3/go.mod h1:sz/lmYIOXD/1dqDmKjjqLyZ2RngseejIcXlSw2iwfAo=
github.com/stretchr/testify v1.10.0 h1:Xv5erBjTwe/5IxqUQTdXv5kgmIvbHo3QQyRwhJsOfJA=
github.com/stretchr/testify v1.10.0/go.mod h1:r2ic/lqez/lEtzL7wO/rwa5dbSLXVDPFyf8C91i36aY=
github.com/thlib/go-timezone-local v0.0.7 h1:fX8zd3aJydqLlTs/TrROrIIdztzsdFV23OzOQx31jII=
github.com/thlib/go-timezone-local v0.0.7/go.mod h1:/Tnicc6m/lsJE0irFMA0LfIwTBo4QP7A8IfyIv4zZKI=
github.com/valkey-io/valkey-go v1.0.63 h1:LNlDTcUxy9jxrmGHSvd0s/NsgEmQbvREYvvBAHCIir0=
github.com/valkey-io/valkey-go v1.0.63/go.mod h1:bHmwjIEOrGq/ubOJfh5uMRs7Xj6mV3mQ/ZXUbmqpjqY=
github.com/xdg-go/pbkdf2 v1.0.0 h1:Su7DPu48wXMwC3bs7MCNG+z4FhcyEuz5dlvchbq0B0c=
github.com/xdg-go/pbkdf2 v1.0.0/go.mod h1:jrpuAogTd400dnrH08LKmI/xc1MbPOebTwRqcT5RDeI=
github.com/xdg-go/scram v1.1.2 h1:FHX5I5B4i4hKRVRBCFRxq1iQRej7WO3hhBuJf+UUySY=
github.com/xdg-go/scram v1.1.2/go.mod h1:RT/sEzTbU5y00aCK8UOx6R7YryM0iF1N2MOmC3kKLN4=
github.com/xdg-go/stringprep v1.0.4 h1:XLI/Ng3O1Atzq0oBs3TWm+5ZVgkq2aqdlvP9JtoZ6c8=
github.com/xdg-go/stringprep v1.0.4/go.mod h1:mPGuuIYwz7CmR2bT9j4GbQqutWS1zV24gijq1dTyGkM=
github.com/youmark/pkcs8 v0.0.0-20240726163527-a2c0da244d78 h1:ilQV1hzziu+LLM3zUTJ0trRztfwgjqKnBWNtSRkbmwM=
github.com/youmark/pkcs8 v0.0.0-20240726163527-a2c0da244d78/go.mod h1:aL8wCCfTfSfmXjznFBSZNN13rSJjlIOI1fUNAtF7rmI=
github.com/yuin/goldmark v1.1.25/go.mod h1:3hX8gzYuyVAZsxl0MRgGTJEmQBFcNTphYh9decYSb74=
@@ -1113,6 +1125,8 @@ github.com/zeebo/errs v1.4.0 h1:XNdoD/RRMKP7HD0UhJnIzUy74ISdGGxURlYG8HSWSfM=
github.com/zeebo/errs v1.4.0/go.mod h1:sgbWHsvVuTPHcqJJGQ1WhI5KbWlHYz+2+2C/LSEtCw4=
github.com/zeebo/xxh3 v1.0.2 h1:xZmwmqxHZA8AI603jOQ0tMqmBr9lPeFwGg6d+xy9DC0=
github.com/zeebo/xxh3 v1.0.2/go.mod h1:5NWz9Sef7zIDm2JHfFlcQvNekmcEl9ekUZQQKCYaDcA=
go.mongodb.org/mongo-driver v1.17.4 h1:jUorfmVzljjr0FLzYQsGP8cgN/qzzxlY9Vh0C9KFXVw=
go.mongodb.org/mongo-driver v1.17.4/go.mod h1:Hy04i7O2kC4RS06ZrhPRqj/u4DTYkFDAAccj+rVKqgQ=
go.opencensus.io v0.21.0/go.mod h1:mSImk1erAIZhrmZN+AvHh14ztQfjbGwt4TtuofqLduU=
go.opencensus.io v0.22.0/go.mod h1:+kGneAE2xo2IficOXnaByMWTGM9T73dGwxeWcUqIpI8=
go.opencensus.io v0.22.2/go.mod h1:yxeiOL68Rb0Xd1ddK5vPZ/oVn4vY4Ynel7k9FzqtOIw=
@@ -1402,6 +1416,7 @@ golang.org/x/sys v0.0.0-20210630005230-0f9fa26af87c/go.mod h1:oPkhp1MJrh7nUepCBc
golang.org/x/sys v0.0.0-20210806184541-e5e7981a1069/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.0.0-20210816183151-1e6c022a8912/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.0.0-20210823070655-63515b42dcdf/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.0.0-20210831042530-f4d43177bf5e/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.0.0-20210908233432-aa78b53d3365/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.0.0-20211007075335-d3039528d8ac/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.0.0-20211019181941-9d821ace8654/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
@@ -1609,8 +1624,8 @@ google.golang.org/api v0.108.0/go.mod h1:2Ts0XTHNVWxypznxWOYUeI4g3WdP9Pk2Qk58+a/
google.golang.org/api v0.110.0/go.mod h1:7FC4Vvx1Mooxh8C5HWjzZHcavuS2f6pmJpZx60ca7iI=
google.golang.org/api v0.111.0/go.mod h1:qtFHvU9mhgTJegR31csQ+rwxyUTHOKFqCKWp1J0fdw0=
google.golang.org/api v0.114.0/go.mod h1:ifYI2ZsFK6/uGddGfAD5BMxlnkBqCmqHSDUVi45N5Yg=
google.golang.org/api v0.242.0 h1:7Lnb1nfnpvbkCiZek6IXKdJ0MFuAZNAJKQfA1ws62xg=
google.golang.org/api v0.242.0/go.mod h1:cOVEm2TpdAGHL2z+UwyS+kmlGr3bVWQQ6sYEqkKje50=
google.golang.org/api v0.243.0 h1:sw+ESIJ4BVnlJcWu9S+p2Z6Qq1PjG77T8IJ1xtp4jZQ=
google.golang.org/api v0.243.0/go.mod h1:GE4QtYfaybx1KmeHMdBnNnyLzBZCVihGBXAmJu/uUr8=
google.golang.org/appengine v1.1.0/go.mod h1:EbEs0AVv82hx2wNQdGPgUI5lhzA/G0D9YwlJXL52JkM=
google.golang.org/appengine v1.4.0/go.mod h1:xpcJRLb0r/rnEns0DIKYYv+WjYCduHsrkT7/EB5XEv4=
google.golang.org/appengine v1.5.0/go.mod h1:xpcJRLb0r/rnEns0DIKYYv+WjYCduHsrkT7/EB5XEv4=
@@ -1751,12 +1766,12 @@ google.golang.org/genproto v0.0.0-20230323212658-478b75c54725/go.mod h1:UUQDJDOl
google.golang.org/genproto v0.0.0-20230330154414-c0448cd141ea/go.mod h1:UUQDJDOlWu4KYeJZffbWgBkS1YFobzKbLVfK69pe0Ak=
google.golang.org/genproto v0.0.0-20230331144136-dcfb400f0633/go.mod h1:UUQDJDOlWu4KYeJZffbWgBkS1YFobzKbLVfK69pe0Ak=
google.golang.org/genproto v0.0.0-20230410155749-daa745c078e1/go.mod h1:nKE/iIaLqn2bQwXBg8f1g2Ylh6r5MN5CmZvuzZCgsCU=
google.golang.org/genproto v0.0.0-20250505200425-f936aa4a68b2 h1:1tXaIXCracvtsRxSBsYDiSBN0cuJvM7QYW+MrpIRY78=
google.golang.org/genproto v0.0.0-20250505200425-f936aa4a68b2/go.mod h1:49MsLSx0oWMOZqcpB3uL8ZOkAh1+TndpJ8ONoCBWiZk=
google.golang.org/genproto v0.0.0-20250603155806-513f23925822 h1:rHWScKit0gvAPuOnu87KpaYtjK5zBMLcULh7gxkCXu4=
google.golang.org/genproto v0.0.0-20250603155806-513f23925822/go.mod h1:HubltRL7rMh0LfnQPkMH4NPDFEWp0jw3vixw7jEM53s=
google.golang.org/genproto/googleapis/api v0.0.0-20250603155806-513f23925822 h1:oWVWY3NzT7KJppx2UKhKmzPq4SRe0LdCijVRwvGeikY=
google.golang.org/genproto/googleapis/api v0.0.0-20250603155806-513f23925822/go.mod h1:h3c4v36UTKzUiuaOKQ6gr3S+0hovBtUrXzTG/i3+XEc=
google.golang.org/genproto/googleapis/rpc v0.0.0-20250707201910-8d1bb00bc6a7 h1:pFyd6EwwL2TqFf8emdthzeX+gZE1ElRq3iM8pui4KBY=
google.golang.org/genproto/googleapis/rpc v0.0.0-20250707201910-8d1bb00bc6a7/go.mod h1:qQ0YXyHHx3XkvlzUtpXDkS29lDSafHMZBAZDc03LQ3A=
google.golang.org/genproto/googleapis/rpc v0.0.0-20250715232539-7130f93afb79 h1:1ZwqphdOdWYXsUHgMpU/101nCtf/kSp9hOrcvFsnl10=
google.golang.org/genproto/googleapis/rpc v0.0.0-20250715232539-7130f93afb79/go.mod h1:qQ0YXyHHx3XkvlzUtpXDkS29lDSafHMZBAZDc03LQ3A=
google.golang.org/grpc v1.19.0/go.mod h1:mqu4LbDTu4XGKhr4mRzUsmM4RtVoemTSY81AxZiDr8c=
google.golang.org/grpc v1.20.1/go.mod h1:10oTOabMzJvdu6/UiuZezV6QK5dSlG84ov/aaiqXj38=
google.golang.org/grpc v1.21.1/go.mod h1:oYelfM1adQP15Ek0mdvEgi9Df8B9CZIaU1084ijfRaM=
@@ -1824,6 +1839,9 @@ gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405/go.mod h1:Co6ibVJAznAaIkqp8
gopkg.in/check.v1 v1.0.0-20180628173108-788fd7840127/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
gopkg.in/check.v1 v1.0.0-20201130134442-10cb98267c6c/go.mod h1:JHkPIbrfpd72SG/EVd6muEfDQjcINNoR0C8j2r3qZ4Q=
gopkg.in/errgo.v2 v2.1.0/go.mod h1:hNsd1EY+bozCKY1Ytp96fpM3vjJbqLJn88ws8XvfDNI=
gopkg.in/ini.v1 v1.61.0/go.mod h1:pNLf8WUiyNEtQjuu5G5vTm06TEv9tsIgeAvK8hOrP4k=
gopkg.in/ini.v1 v1.67.0 h1:Dgnx+6+nfE+IfzjUEISNeydPJh9AXNNsWbGP9KzCsOA=
gopkg.in/ini.v1 v1.67.0/go.mod h1:pNLf8WUiyNEtQjuu5G5vTm06TEv9tsIgeAvK8hOrP4k=
gopkg.in/yaml.v2 v2.2.2/go.mod h1:hI93XBmqTisBFMUTm0b8Fm+jr3Dg1NNxqwp+5A1VGuI=
gopkg.in/yaml.v2 v2.2.3/go.mod h1:hI93XBmqTisBFMUTm0b8Fm+jr3Dg1NNxqwp+5A1VGuI=
gopkg.in/yaml.v2 v2.2.8/go.mod h1:hI93XBmqTisBFMUTm0b8Fm+jr3Dg1NNxqwp+5A1VGuI=

View File

@@ -23,12 +23,16 @@ import (
func TestLoadPrebuiltToolYAMLs(t *testing.T) {
test_name := "test load prebuilt configs"
expectedKeys := []string{
"alloydb-postgres-admin",
"alloydb-postgres",
"bigquery",
"cloud-sql-mssql",
"cloud-sql-mysql",
"cloud-sql-postgres",
"firestore",
"looker",
"mssql",
"mysql",
"postgres",
"spanner-postgres",
"spanner",
@@ -64,15 +68,21 @@ func TestLoadPrebuiltToolYAMLs(t *testing.T) {
}
func TestGetPrebuiltTool(t *testing.T) {
alloydb_admin_config, _ := Get("alloydb-postgres-admin")
alloydb_config, _ := Get("alloydb-postgres")
bigquery_config, _ := Get("bigquery")
cloudsqlpg_config, _ := Get("cloud-sql-postgres")
cloudsqlmysql_config, _ := Get("cloud-sql-mysql")
cloudsqlmssql_config, _ := Get("cloud-sql-mssql")
firestoreconfig, _ := Get("firestore")
mysql_config, _ := Get("mysql")
mssql_config, _ := Get("mssql")
postgresconfig, _ := Get("postgres")
spanner_config, _ := Get("spanner")
spannerpg_config, _ := Get("spanner-postgres")
if len(alloydb_admin_config) <= 0 {
t.Fatalf("unexpected error: could not fetch alloydb prebuilt tools yaml")
}
if len(alloydb_config) <= 0 {
t.Fatalf("unexpected error: could not fetch alloydb prebuilt tools yaml")
}
@@ -91,6 +101,12 @@ func TestGetPrebuiltTool(t *testing.T) {
if len(firestoreconfig) <= 0 {
t.Fatalf("unexpected error: could not fetch firestore prebuilt tools yaml")
}
if len(mysql_config) <= 0 {
t.Fatalf("unexpected error: could not fetch mysql prebuilt tools yaml")
}
if len(mssql_config) <= 0 {
t.Fatalf("unexpected error: could not fetch mssql prebuilt tools yaml")
}
if len(postgresconfig) <= 0 {
t.Fatalf("unexpected error: could not fetch postgres prebuilt tools yaml")
}

View File

@@ -0,0 +1,123 @@
sources:
alloydb-api-source:
kind: http
baseUrl: https://alloydb.googleapis.com
headers:
Authorization: Bearer ${API_KEY}
Content-Type: application/json
tools:
alloydb-create-cluster:
kind: http
source: alloydb-api-source
method: POST
path: /v1/projects/{{.projectId}}/locations/{{.locationId}}/clusters
description: "Create a new AlloyDB cluster. This is a long-running operation, but the API call returns quickly. This will return operation id to be used by get operations tool. Take all parameters from user in one go."
pathParams:
- name: projectId
type: string
description: "The dynamic path parameter for project id provided by user."
- name: locationId
type: string
description: "The dynamic path parameter for location. The default value is us-central1. If quota is exhausted then use other regions."
default: us-central1
queryParams:
- name: clusterId
type: string
description: "A unique ID for the AlloyDB cluster."
requestBody: |
{
"networkConfig": {
"network": "projects/{{.project}}/global/networks/{{.network}}"
},
"initialUser": {
"password": "{{.password}}",
"user": "{{.user}}"
}
}
bodyParams:
- name: project
type: string
description: "The dynamic path parameter for project id."
- name: network
type: string
description: "The name of the VPC network to connect the cluster to (e.g., 'default')."
default: default
- name: password
type: string
description: "A secure password for the initial 'postgres' user or the custom user provided."
- name: user
type: string
description: "The name for the initial superuser. If not provided, it defaults to 'postgres'. The initial database will always be named 'postgres'."
alloydb-operations-get:
kind: alloydb-wait-for-operation
source: alloydb-api-source
description: "This will poll on operations API until the operation is done. For checking operation status we need projectId, locationID and operationId. Once instance is created give follow up steps on how to use the variables to bring data plane MCP server up in local and remote setup."
delay: 1s
maxDelay: 4m
multiplier: 2
maxRetries: 10
alloydb-create-instance:
kind: http
source: alloydb-api-source
method: POST
path: /v1/projects/{{.projectId}}/locations/{{.locationId}}/clusters/{{.clusterId}}/instances
description: "Creates a new AlloyDB instance (PRIMARY, READ_POOL, or SECONDARY) within a cluster. This is a long-running operation. Take all parameters from user in one go. This will return operation id to be used by get operations tool."
pathParams:
- name: projectId
type: string
description: "The GCP project ID."
- name: locationId
type: string
description: "The location of the cluster (e.g., 'us-central1')."
default: us-central1
- name: clusterId
type: string
description: "The ID of the cluster to create the instance in."
queryParams:
- name: instanceId
type: string
description: "A unique ID for the new AlloyDB instance."
requestBody: |
{
"instanceType": "{{.instanceType}}",
{{- if .displayName }}
"displayName": "{{.displayName}}",
{{- end }}
{{- if eq .instanceType "READ_POOL" }}
"readPoolConfig": {
"nodeCount": {{.nodeCount}}
},
{{- end }}
{{- if eq .instanceType "SECONDARY" }}
"secondaryConfig": {
"primaryClusterName": "{{.primaryClusterName}}"
},
{{- end }}
"networkConfig": {
"enablePublicIp": true
},
"databaseFlags": {
"password.enforce_complexity": "on"
}
}
bodyParams:
- name: instanceType
type: string
description: "The type of instance to create. Required. Valid values are: PRIMARY, READ_POOL, SECONDARY."
- name: displayName
type: string
description: "An optional, user-friendly name for the instance."
- name: nodeCount
type: integer
description: "The number of nodes in the read pool. Required only if instanceType is READ_POOL. Default is 1."
default: 1
- name: primaryClusterName
type: string
description: "The full resource name of the primary cluster for a SECONDARY instance. Required only if instanceType is SECONDARY. Otherwise don't ask"
default: ""
toolsets:
alloydb-postgres-admin-tools:
- alloydb-create-cluster
- alloydb-operations-get
- alloydb-create-instance

View File

@@ -11,7 +11,7 @@ sources:
tools:
execute_sql:
kind: mssql-execute-sql
source: cloud-sql-mssql-source
source: cloudsql-mssql-source
description: Use this tool to execute SQL.
list_tables:

View File

@@ -2,7 +2,7 @@ sources:
firestore-source:
kind: firestore
project: ${FIRESTORE_PROJECT}
database: ${FIRESTORE_DATABASE} # Optional, defaults to "(default)" if not specified
database: ${FIRESTORE_DATABASE}
tools:
firestore-get-documents:

View File

@@ -0,0 +1,158 @@
sources:
looker-source:
kind: looker
base_url: ${LOOKER_BASE_URL}
client_id: ${LOOKER_CLIENT_ID}
client_secret: ${LOOKER_CLIENT_SECRET}
verify_ssl: ${LOOKER_VERIFY_SSL}
timeout: 600s
tools:
get_models:
kind: looker-get-models
source: looker-source
description: |
The get_models tool retrieves the list of LookML models in the Looker system.
It takes no parameters.
get_explores:
kind: looker-get-explores
source: looker-source
description: |
The get_explores tool retrieves the list of explores defined in a LookML model
in the Looker system.
It takes one parameter, the model_name looked up from get_models.
get_dimensions:
kind: looker-get-dimensions
source: looker-source
description: |
The get_dimensions tool retrieves the list of dimensions defined in
an explore.
It takes two parameters, the model_name looked up from get_models and the
explore_name looked up from get_explores.
get_measures:
kind: looker-get-measures
source: looker-source
description: |
The get_measures tool retrieves the list of measures defined in
an explore.
It takes two parameters, the model_name looked up from get_models and the
explore_name looked up from get_explores.
get_filters:
kind: looker-get-filters
source: looker-source
description: |
The get_filters tool retrieves the list of filters defined in
an explore.
It takes two parameters, the model_name looked up from get_models and the
explore_name looked up from get_explores.
get_parameters:
kind: looker-get-parameters
source: looker-source
description: |
The get_parameters tool retrieves the list of parameters defined in
an explore.
It takes two parameters, the model_name looked up from get_models and the
explore_name looked up from get_explores.
query:
kind: looker-query
source: looker-source
description: |
Query Tool
This tool is used to run a query against the LookML model. The
model, explore, and fields list must be specified. Pivots,
filters and sorts are optional.
The model can be found from the get_models tool. The explore
can be found from the get_explores tool passing in the model.
The fields can be found from the get_dimensions, get_measures,
get_filters, and get_parameters tools, passing in the model
and the explore.
Provide a model_id and explore_name, then a list
of fields. Optionally a list of pivots can be provided.
The pivots must also be included in the fields list.
Filters are provided as a map of {"field.id": "condition",
"field.id2": "condition2", ...}. Do not put the field.id in
quotes. Filter expressions can be found at
https://cloud.google.com/looker/docs/filter-expressions.
Sorts can be specified like [ "field.id desc 0" ].
        An optional row limit can be added. If not provided, the limit
        will default to 500. "-1" can be specified for unlimited.
        An optional query timezone can be added. The query_timezone
        will default to that of the workstation where this MCP server
is running, or Etc/UTC if that can't be determined. Not all
models support custom timezones.
        The result of the query tool is JSON.
query_sql:
kind: looker-query-sql
source: looker-source
description: |
Query SQL Tool
This tool is used to generate the SQL that Looker would
run against the underlying database. The parameters are
the same as the query tool.
The result of the query sql tool is SQL text.
get_looks:
kind: looker-get-looks
source: looker-source
description: |
get_looks Tool
This tool is used to search for saved looks in a Looker instance.
      String search params use case-insensitive matching. String search
      params can contain % and '_' as SQL LIKE pattern match wildcard
      expressions. For example, "dan%" will match "danger" and "Danzig"
      but not "David", and "D_m%" will match "Damage" and "dump".
Most search params can accept "IS NULL" and "NOT NULL" as special
expressions to match or exclude (respectively) rows where the
column is null.
The limit and offset are used to paginate the results.
The result of the get_looks tool is a list of json objects.
run_look:
kind: looker-run-look
source: looker-source
description: |
run_look Tool
This tool runs the query associated with a look and returns
the data in a JSON structure. It accepts the look_id as the
parameter.
toolsets:
looker-tools:
- get_models
- get_explores
- get_dimensions
- get_measures
- get_filters
- get_parameters
- query
- query_sql
- get_looks
- run_look

View File

@@ -0,0 +1,269 @@
sources:
mssql-source:
kind: mssql
host: ${MSSQL_HOST}
port: ${MSSQL_PORT}
database: ${MSSQL_DATABASE}
user: ${MSSQL_USER}
password: ${MSSQL_PASSWORD}
tools:
execute_sql:
kind: mssql-execute-sql
source: mssql-source
description: Use this tool to execute SQL.
list_tables:
kind: mssql-sql
source: mssql-source
description: "Lists detailed schema information (object type, columns, constraints, indexes, triggers, comment) as JSON for user-created tables (ordinary or partitioned). Filters by a comma-separated list of names. If names are omitted, lists all tables in user schemas."
statement: |
WITH table_info AS (
SELECT
t.object_id AS table_oid,
s.name AS schema_name,
t.name AS table_name,
dp.name AS table_owner, -- Schema's owner principal name
CAST(ep.value AS NVARCHAR(MAX)) AS table_comment, -- Cast for JSON compatibility
CASE
WHEN EXISTS ( -- Check if the table has more than one partition for any of its indexes or heap
SELECT 1 FROM sys.partitions p
WHERE p.object_id = t.object_id AND p.partition_number > 1
) THEN 'PARTITIONED TABLE'
ELSE 'TABLE'
END AS object_type_detail
FROM
sys.tables t
INNER JOIN
sys.schemas s ON t.schema_id = s.schema_id
LEFT JOIN
sys.database_principals dp ON s.principal_id = dp.principal_id
LEFT JOIN
sys.extended_properties ep ON ep.major_id = t.object_id AND ep.minor_id = 0 AND ep.class = 1 AND ep.name = 'MS_Description'
WHERE
t.type = 'U' -- User tables
AND s.name NOT IN ('sys', 'INFORMATION_SCHEMA', 'guest', 'db_owner', 'db_accessadmin', 'db_backupoperator', 'db_datareader', 'db_datawriter', 'db_ddladmin', 'db_denydatareader', 'db_denydatawriter', 'db_securityadmin')
AND (@table_names IS NULL OR LTRIM(RTRIM(@table_names)) = '' OR t.name IN (SELECT LTRIM(RTRIM(value)) FROM STRING_SPLIT(@table_names, ',')))
),
columns_info AS (
SELECT
c.object_id AS table_oid,
c.name AS column_name,
CONCAT(
UPPER(TY.name), -- Base type name
CASE
WHEN TY.name IN ('char', 'varchar', 'nchar', 'nvarchar', 'binary', 'varbinary') THEN
CONCAT('(', IIF(c.max_length = -1, 'MAX', CAST(c.max_length / CASE WHEN TY.name IN ('nchar', 'nvarchar') THEN 2 ELSE 1 END AS VARCHAR(10))), ')')
WHEN TY.name IN ('decimal', 'numeric') THEN
CONCAT('(', c.precision, ',', c.scale, ')')
WHEN TY.name IN ('datetime2', 'datetimeoffset', 'time') THEN
CONCAT('(', c.scale, ')')
ELSE ''
END
) AS data_type,
c.column_id AS column_ordinal_position,
IIF(c.is_nullable = 0, CAST(1 AS BIT), CAST(0 AS BIT)) AS is_not_nullable,
dc.definition AS column_default,
CAST(epc.value AS NVARCHAR(MAX)) AS column_comment
FROM
sys.columns c
JOIN
table_info ti ON c.object_id = ti.table_oid
JOIN
sys.types TY ON c.user_type_id = TY.user_type_id AND TY.is_user_defined = 0 -- Ensure we get base types
LEFT JOIN
sys.default_constraints dc ON c.object_id = dc.parent_object_id AND c.column_id = dc.parent_column_id
LEFT JOIN
sys.extended_properties epc ON epc.major_id = c.object_id AND epc.minor_id = c.column_id AND epc.class = 1 AND epc.name = 'MS_Description'
),
constraints_info AS (
-- Primary Keys & Unique Constraints
SELECT
kc.parent_object_id AS table_oid,
kc.name AS constraint_name,
REPLACE(kc.type_desc, '_CONSTRAINT', '') AS constraint_type, -- 'PRIMARY_KEY', 'UNIQUE'
STUFF((SELECT ', ' + col.name
FROM sys.index_columns ic
JOIN sys.columns col ON ic.object_id = col.object_id AND ic.column_id = col.column_id
WHERE ic.object_id = kc.parent_object_id AND ic.index_id = kc.unique_index_id
ORDER BY ic.key_ordinal
FOR XML PATH(''), TYPE).value('.', 'NVARCHAR(MAX)'), 1, 2, '') AS constraint_columns,
NULL AS foreign_key_referenced_table,
NULL AS foreign_key_referenced_columns,
CASE kc.type
WHEN 'PK' THEN 'PRIMARY KEY (' + STUFF((SELECT ', ' + col.name FROM sys.index_columns ic JOIN sys.columns col ON ic.object_id = col.object_id AND ic.column_id = col.column_id WHERE ic.object_id = kc.parent_object_id AND ic.index_id = kc.unique_index_id ORDER BY ic.key_ordinal FOR XML PATH(''), TYPE).value('.', 'NVARCHAR(MAX)'), 1, 2, '') + ')'
WHEN 'UQ' THEN 'UNIQUE (' + STUFF((SELECT ', ' + col.name FROM sys.index_columns ic JOIN sys.columns col ON ic.object_id = col.object_id AND ic.column_id = col.column_id WHERE ic.object_id = kc.parent_object_id AND ic.index_id = kc.unique_index_id ORDER BY ic.key_ordinal FOR XML PATH(''), TYPE).value('.', 'NVARCHAR(MAX)'), 1, 2, '') + ')'
END AS constraint_definition
FROM sys.key_constraints kc
JOIN table_info ti ON kc.parent_object_id = ti.table_oid
UNION ALL
-- Foreign Keys
SELECT
fk.parent_object_id AS table_oid,
fk.name AS constraint_name,
'FOREIGN KEY' AS constraint_type,
STUFF((SELECT ', ' + pc.name
FROM sys.foreign_key_columns fkc
JOIN sys.columns pc ON fkc.parent_object_id = pc.object_id AND fkc.parent_column_id = pc.column_id
WHERE fkc.constraint_object_id = fk.object_id
ORDER BY fkc.constraint_column_id
FOR XML PATH(''), TYPE).value('.', 'NVARCHAR(MAX)'), 1, 2, '') AS constraint_columns,
SCHEMA_NAME(rt.schema_id) + '.' + OBJECT_NAME(fk.referenced_object_id) AS foreign_key_referenced_table,
STUFF((SELECT ', ' + rc.name
FROM sys.foreign_key_columns fkc
JOIN sys.columns rc ON fkc.referenced_object_id = rc.object_id AND fkc.referenced_column_id = rc.column_id
WHERE fkc.constraint_object_id = fk.object_id
ORDER BY fkc.constraint_column_id
FOR XML PATH(''), TYPE).value('.', 'NVARCHAR(MAX)'), 1, 2, '') AS foreign_key_referenced_columns,
OBJECT_DEFINITION(fk.object_id) AS constraint_definition
FROM sys.foreign_keys fk
JOIN sys.tables rt ON fk.referenced_object_id = rt.object_id
JOIN table_info ti ON fk.parent_object_id = ti.table_oid
UNION ALL
-- Check Constraints
SELECT
cc.parent_object_id AS table_oid,
cc.name AS constraint_name,
'CHECK' AS constraint_type,
NULL AS constraint_columns, -- Definition includes column context
NULL AS foreign_key_referenced_table,
NULL AS foreign_key_referenced_columns,
cc.definition AS constraint_definition
FROM sys.check_constraints cc
JOIN table_info ti ON cc.parent_object_id = ti.table_oid
),
indexes_info AS (
SELECT
i.object_id AS table_oid,
i.name AS index_name,
i.type_desc AS index_method, -- CLUSTERED, NONCLUSTERED, XML, etc.
i.is_unique,
i.is_primary_key AS is_primary,
STUFF((SELECT ', ' + c.name
FROM sys.index_columns ic
JOIN sys.columns c ON i.object_id = c.object_id AND ic.column_id = c.column_id
WHERE ic.object_id = i.object_id AND ic.index_id = i.index_id AND ic.is_included_column = 0
ORDER BY ic.key_ordinal
FOR XML PATH(''), TYPE).value('.', 'NVARCHAR(MAX)'), 1, 2, '') AS index_columns,
(
'COLUMNS: (' + ISNULL(STUFF((SELECT ', ' + c.name + CASE WHEN ic.is_descending_key = 1 THEN ' DESC' ELSE '' END
FROM sys.index_columns ic
JOIN sys.columns c ON i.object_id = c.object_id AND ic.column_id = c.column_id
WHERE ic.object_id = i.object_id AND ic.index_id = i.index_id AND ic.is_included_column = 0
ORDER BY ic.key_ordinal FOR XML PATH(''), TYPE).value('.', 'NVARCHAR(MAX)'), 1, 2, ''), 'N/A') + ')' +
ISNULL(CHAR(13)+CHAR(10) + 'INCLUDE: (' + STUFF((SELECT ', ' + c.name
FROM sys.index_columns ic
JOIN sys.columns c ON i.object_id = c.object_id AND ic.column_id = c.column_id
WHERE ic.object_id = i.object_id AND ic.index_id = i.index_id AND ic.is_included_column = 1
ORDER BY ic.index_column_id FOR XML PATH(''), TYPE).value('.', 'NVARCHAR(MAX)'), 1, 2, '') + ')', '') +
ISNULL(CHAR(13)+CHAR(10) + 'FILTER: (' + i.filter_definition + ')', '')
) AS index_definition_details
FROM
sys.indexes i
JOIN
table_info ti ON i.object_id = ti.table_oid
WHERE i.type <> 0 -- Exclude Heaps
AND i.name IS NOT NULL -- Exclude unnamed heap indexes; named indexes (PKs are often named) are preferred.
),
triggers_info AS (
SELECT
tr.parent_id AS table_oid,
tr.name AS trigger_name,
OBJECT_DEFINITION(tr.object_id) AS trigger_definition,
CASE tr.is_disabled WHEN 0 THEN 'ENABLED' ELSE 'DISABLED' END AS trigger_enabled_state
FROM
sys.triggers tr
JOIN
table_info ti ON tr.parent_id = ti.table_oid
WHERE
tr.is_ms_shipped = 0
AND tr.parent_class_desc = 'OBJECT_OR_COLUMN' -- DML Triggers on tables/views
)
SELECT
ti.schema_name,
ti.table_name AS object_name,
(
SELECT
ti.schema_name AS schema_name,
ti.table_name AS object_name,
ti.object_type_detail AS object_type,
ti.table_owner AS owner,
ti.table_comment AS comment,
JSON_QUERY(ISNULL((
SELECT
ci.column_name,
ci.data_type,
ci.column_ordinal_position,
ci.is_not_nullable,
ci.column_default,
ci.column_comment
FROM columns_info ci
WHERE ci.table_oid = ti.table_oid
ORDER BY ci.column_ordinal_position
FOR JSON PATH
), '[]')) AS columns,
JSON_QUERY(ISNULL((
SELECT
cons.constraint_name,
cons.constraint_type,
cons.constraint_definition,
JSON_QUERY(
CASE
WHEN cons.constraint_columns IS NOT NULL AND LTRIM(RTRIM(cons.constraint_columns)) <> ''
THEN '[' + (SELECT STRING_AGG('"' + LTRIM(RTRIM(value)) + '"', ',') FROM STRING_SPLIT(cons.constraint_columns, ',')) + ']'
ELSE '[]'
END
) AS constraint_columns,
cons.foreign_key_referenced_table,
JSON_QUERY(
CASE
WHEN cons.foreign_key_referenced_columns IS NOT NULL AND LTRIM(RTRIM(cons.foreign_key_referenced_columns)) <> ''
THEN '[' + (SELECT STRING_AGG('"' + LTRIM(RTRIM(value)) + '"', ',') FROM STRING_SPLIT(cons.foreign_key_referenced_columns, ',')) + ']'
ELSE '[]'
END
) AS foreign_key_referenced_columns
FROM constraints_info cons
WHERE cons.table_oid = ti.table_oid
FOR JSON PATH
), '[]')) AS constraints,
JSON_QUERY(ISNULL((
SELECT
ii.index_name,
ii.index_definition_details AS index_definition,
ii.is_unique,
ii.is_primary,
ii.index_method,
JSON_QUERY(
CASE
WHEN ii.index_columns IS NOT NULL AND LTRIM(RTRIM(ii.index_columns)) <> ''
THEN '[' + (SELECT STRING_AGG('"' + LTRIM(RTRIM(value)) + '"', ',') FROM STRING_SPLIT(ii.index_columns, ',')) + ']'
ELSE '[]'
END
) AS index_columns
FROM indexes_info ii
WHERE ii.table_oid = ti.table_oid
FOR JSON PATH
), '[]')) AS indexes,
JSON_QUERY(ISNULL((
SELECT
tri.trigger_name,
tri.trigger_definition,
tri.trigger_enabled_state
FROM triggers_info tri
WHERE tri.table_oid = ti.table_oid
FOR JSON PATH
), '[]')) AS triggers
FOR JSON PATH, WITHOUT_ARRAY_WRAPPER -- Creates a single JSON object for this table's details
) AS object_details
FROM
table_info ti
ORDER BY
ti.schema_name, ti.table_name;
parameters:
- name: table_names
type: string
description: "Optional: A comma-separated list of table names. If empty, details for all tables in user-accessible schemas will be listed."
toolsets:
mssql-database-tools:
- execute_sql
- list_tables
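
The @table_names handling in the statement above is easy to miss: a NULL, empty, or whitespace-only value lists every user table, while a non-empty value is split on commas and each name is trimmed before matching. A minimal Go sketch of that same predicate (not part of the config, just an illustration of the T-SQL's LTRIM/RTRIM plus STRING_SPLIT logic):

package main

import "strings"

// matchesTableFilter mirrors the filter used by the list_tables statement:
// an empty filter matches every table; otherwise only the listed names match.
func matchesTableFilter(tableNames, table string) bool {
    if strings.TrimSpace(tableNames) == "" {
        return true
    }
    for _, name := range strings.Split(tableNames, ",") {
        if strings.TrimSpace(name) == table {
            return true
        }
    }
    return false
}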

View File

@@ -0,0 +1,172 @@
sources:
mysql-source:
kind: mysql
host: ${MYSQL_HOST}
port: ${MYSQL_PORT}
database: ${MYSQL_DATABASE}
user: ${MYSQL_USER}
password: ${MYSQL_PASSWORD}
queryTimeout: 30s # Optional
tools:
execute_sql:
kind: mysql-execute-sql
source: mysql-source
description: Use this tool to execute SQL.
list_tables:
kind: mysql-sql
source: mysql-source
description: "Lists detailed schema information (object type, columns, constraints, indexes, triggers, comment) as JSON for user-created tables (ordinary or partitioned). Filters by a comma-separated list of names. If names are omitted, lists all tables in user schemas."
statement: |
SELECT
T.TABLE_SCHEMA AS schema_name,
T.TABLE_NAME AS object_name,
CONVERT( JSON_OBJECT(
'schema_name', T.TABLE_SCHEMA,
'object_name', T.TABLE_NAME,
'object_type', 'TABLE',
'owner', (
SELECT
IFNULL(U.GRANTEE, 'N/A')
FROM
INFORMATION_SCHEMA.SCHEMA_PRIVILEGES U
WHERE
U.TABLE_SCHEMA = T.TABLE_SCHEMA
LIMIT 1
),
'comment', IFNULL(T.TABLE_COMMENT, ''),
'columns', (
SELECT
IFNULL(
JSON_ARRAYAGG(
JSON_OBJECT(
'column_name', C.COLUMN_NAME,
'data_type', C.COLUMN_TYPE,
'ordinal_position', C.ORDINAL_POSITION,
'is_not_nullable', IF(C.IS_NULLABLE = 'NO', TRUE, FALSE),
'column_default', C.COLUMN_DEFAULT,
'column_comment', IFNULL(C.COLUMN_COMMENT, '')
)
),
JSON_ARRAY()
)
FROM
INFORMATION_SCHEMA.COLUMNS C
WHERE
C.TABLE_SCHEMA = T.TABLE_SCHEMA AND C.TABLE_NAME = T.TABLE_NAME
ORDER BY C.ORDINAL_POSITION
),
'constraints', (
SELECT
IFNULL(
JSON_ARRAYAGG(
JSON_OBJECT(
'constraint_name', TC.CONSTRAINT_NAME,
'constraint_type',
CASE TC.CONSTRAINT_TYPE
WHEN 'PRIMARY KEY' THEN 'PRIMARY KEY'
WHEN 'FOREIGN KEY' THEN 'FOREIGN KEY'
WHEN 'UNIQUE' THEN 'UNIQUE'
ELSE TC.CONSTRAINT_TYPE
END,
'constraint_definition', '',
'constraint_columns', (
SELECT
IFNULL(JSON_ARRAYAGG(KCU.COLUMN_NAME), JSON_ARRAY())
FROM
INFORMATION_SCHEMA.KEY_COLUMN_USAGE KCU
WHERE
KCU.CONSTRAINT_SCHEMA = TC.CONSTRAINT_SCHEMA
AND KCU.CONSTRAINT_NAME = TC.CONSTRAINT_NAME
AND KCU.TABLE_NAME = TC.TABLE_NAME
ORDER BY KCU.ORDINAL_POSITION
),
'foreign_key_referenced_table', IF(TC.CONSTRAINT_TYPE = 'FOREIGN KEY', RC.REFERENCED_TABLE_NAME, NULL),
'foreign_key_referenced_columns', IF(TC.CONSTRAINT_TYPE = 'FOREIGN KEY',
(SELECT IFNULL(JSON_ARRAYAGG(FKCU.REFERENCED_COLUMN_NAME), JSON_ARRAY())
FROM INFORMATION_SCHEMA.KEY_COLUMN_USAGE FKCU
WHERE FKCU.CONSTRAINT_SCHEMA = TC.CONSTRAINT_SCHEMA
AND FKCU.CONSTRAINT_NAME = TC.CONSTRAINT_NAME
AND FKCU.TABLE_NAME = TC.TABLE_NAME
AND FKCU.REFERENCED_TABLE_NAME IS NOT NULL
ORDER BY FKCU.ORDINAL_POSITION),
NULL
)
)
),
JSON_ARRAY()
)
FROM
INFORMATION_SCHEMA.TABLE_CONSTRAINTS TC
LEFT JOIN
INFORMATION_SCHEMA.REFERENTIAL_CONSTRAINTS RC
ON TC.CONSTRAINT_SCHEMA = RC.CONSTRAINT_SCHEMA
AND TC.CONSTRAINT_NAME = RC.CONSTRAINT_NAME
AND TC.TABLE_NAME = RC.TABLE_NAME
WHERE
TC.TABLE_SCHEMA = T.TABLE_SCHEMA AND TC.TABLE_NAME = T.TABLE_NAME
),
'indexes', (
SELECT
IFNULL(
JSON_ARRAYAGG(
JSON_OBJECT(
'index_name', IndexData.INDEX_NAME,
'is_unique', IF(IndexData.NON_UNIQUE = 0, TRUE, FALSE),
'is_primary', IF(IndexData.INDEX_NAME = 'PRIMARY', TRUE, FALSE),
'index_columns', IFNULL(IndexData.INDEX_COLUMNS_ARRAY, JSON_ARRAY())
)
),
JSON_ARRAY()
)
FROM (
SELECT
S.TABLE_SCHEMA,
S.TABLE_NAME,
S.INDEX_NAME,
MIN(S.NON_UNIQUE) AS NON_UNIQUE, -- Aggregate NON_UNIQUE here to get unique status for the index
JSON_ARRAYAGG(S.COLUMN_NAME) AS INDEX_COLUMNS_ARRAY -- Aggregate columns into an array for this index
FROM
INFORMATION_SCHEMA.STATISTICS S
WHERE
S.TABLE_SCHEMA = T.TABLE_SCHEMA AND S.TABLE_NAME = T.TABLE_NAME
GROUP BY
S.TABLE_SCHEMA, S.TABLE_NAME, S.INDEX_NAME
) AS IndexData
ORDER BY IndexData.INDEX_NAME
),
'triggers', (
SELECT
IFNULL(
JSON_ARRAYAGG(
JSON_OBJECT(
'trigger_name', TR.TRIGGER_NAME,
'trigger_definition', TR.ACTION_STATEMENT
)
),
JSON_ARRAY()
)
FROM
INFORMATION_SCHEMA.TRIGGERS TR
WHERE
TR.EVENT_OBJECT_SCHEMA = T.TABLE_SCHEMA AND TR.EVENT_OBJECT_TABLE = T.TABLE_NAME
ORDER BY TR.TRIGGER_NAME
)
) USING utf8mb4) AS object_details
FROM
INFORMATION_SCHEMA.TABLES T
CROSS JOIN (SELECT @table_names := ?) AS variables
WHERE
T.TABLE_SCHEMA NOT IN ('mysql', 'information_schema', 'performance_schema', 'sys')
AND (NULLIF(TRIM(@table_names), '') IS NULL OR FIND_IN_SET(T.TABLE_NAME, @table_names))
AND T.TABLE_TYPE = 'BASE TABLE'
ORDER BY
T.TABLE_SCHEMA, T.TABLE_NAME;
parameters:
- name: table_names
type: string
description: "Optional: A comma-separated list of table names. If empty, details for all tables in user-accessible schemas will be listed."
default: ""
toolsets:
mysql-database-tools:
- execute_sql
- list_tables
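
Unlike the MSSQL variant, this statement receives its filter through a single positional placeholder: CROSS JOIN (SELECT @table_names := ?) binds the value to a session variable once so the later NULLIF/FIND_IN_SET check can reuse it without repeating the placeholder. A hedged sketch of supplying that one argument from Go with database/sql (the driver import and DSN are placeholders, not taken from this PR):

package main

import (
    "database/sql"
    "log"

    _ "github.com/go-sql-driver/mysql" // assumed driver; any MySQL driver works
)

func main() {
    db, err := sql.Open("mysql", "user:password@tcp(localhost:3306)/mydb")
    if err != nil {
        log.Fatal(err)
    }
    defer db.Close()

    const listTables = `/* the list_tables statement above, verbatim */`
    // The single "?" in the statement is bound to @table_names; an empty string
    // means "list all tables", matching the default declared for table_names.
    rows, err := db.Query(listTables, "orders,customers")
    if err != nil {
        log.Fatal(err)
    }
    defer rows.Close()
}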

View File

@@ -0,0 +1,123 @@
// Copyright 2025 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package looker
import (
"context"
"fmt"
"time"
"github.com/goccy/go-yaml"
"github.com/googleapis/genai-toolbox/internal/sources"
"github.com/googleapis/genai-toolbox/internal/util"
"go.opentelemetry.io/otel/trace"
"github.com/looker-open-source/sdk-codegen/go/rtl"
v4 "github.com/looker-open-source/sdk-codegen/go/sdk/v4"
)
const SourceKind string = "looker"
// validate interface
var _ sources.SourceConfig = Config{}
func init() {
if !sources.Register(SourceKind, newConfig) {
panic(fmt.Sprintf("source kind %q already registered", SourceKind))
}
}
func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (sources.SourceConfig, error) {
actual := Config{Name: name, SslVerification: "true", Timeout: "600s"} // default SSL verification and timeout
if err := decoder.DecodeContext(ctx, &actual); err != nil {
return nil, err
}
return actual, nil
}
type Config struct {
Name string `yaml:"name" validate:"required"`
Kind string `yaml:"kind" validate:"required"`
BaseURL string `yaml:"base_url" validate:"required"`
ClientId string `yaml:"client_id" validate:"required"`
ClientSecret string `yaml:"client_secret" validate:"required"`
SslVerification string `yaml:"verify_ssl"`
Timeout string `yaml:"timeout"`
}
func (r Config) SourceConfigKind() string {
return SourceKind
}
// Initialize initializes a Looker Source instance.
func (r Config) Initialize(ctx context.Context, tracer trace.Tracer) (sources.Source, error) {
logger, err := util.LoggerFromContext(ctx)
if err != nil {
return nil, fmt.Errorf("unable to get logger from ctx: %s", err)
}
userAgent, err := util.UserAgentFromContext(ctx)
if err != nil {
return nil, err
}
duration, err := time.ParseDuration(r.Timeout)
if err != nil {
return nil, fmt.Errorf("unable to parse Timeout string as time.Duration: %s", err)
}
if r.SslVerification != "true" {
logger.WarnContext(ctx, fmt.Sprintf("TLS certificate verification is disabled for Looker source %s.", r.Name))
}
cfg := rtl.ApiSettings{
AgentTag: userAgent,
BaseUrl: r.BaseURL,
ApiVersion: "4.0",
VerifySsl: (r.SslVerification == "true"),
Timeout: int32(duration.Seconds()),
ClientId: r.ClientId,
ClientSecret: r.ClientSecret,
}
sdk := v4.NewLookerSDK(rtl.NewAuthSession(cfg))
me, err := sdk.Me("", &cfg)
if err != nil {
return nil, fmt.Errorf("unable to log in: %s", err)
}
logger.DebugContext(ctx, fmt.Sprintf("logged in as user %v %v.\n", *me.FirstName, *me.LastName))
s := &Source{
Name: r.Name,
Kind: SourceKind,
Timeout: r.Timeout,
Client: sdk,
ApiSettings: &cfg,
}
return s, nil
}
var _ sources.Source = &Source{}
type Source struct {
Name string `yaml:"name"`
Kind string `yaml:"kind"`
Timeout string `yaml:"timeout"`
Client *v4.LookerSDK
ApiSettings *rtl.ApiSettings
}
func (s *Source) SourceKind() string {
return SourceKind
}

View File

@@ -0,0 +1,121 @@
// Copyright 2025 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package looker_test
import (
"testing"
yaml "github.com/goccy/go-yaml"
"github.com/google/go-cmp/cmp"
"github.com/googleapis/genai-toolbox/internal/server"
"github.com/googleapis/genai-toolbox/internal/sources"
"github.com/googleapis/genai-toolbox/internal/sources/looker"
"github.com/googleapis/genai-toolbox/internal/testutils"
)
func TestParseFromYamlLooker(t *testing.T) {
tcs := []struct {
desc string
in string
want server.SourceConfigs
}{
{
desc: "basic example",
in: `
sources:
my-looker-instance:
kind: looker
base_url: http://example.looker.com/
client_id: jasdl;k;tjl
client_secret: sdakl;jgflkasdfkfg
`,
want: map[string]sources.SourceConfig{
"my-looker-instance": looker.Config{
Name: "my-looker-instance",
Kind: looker.SourceKind,
BaseURL: "http://example.looker.com/",
ClientId: "jasdl;k;tjl",
ClientSecret: "sdakl;jgflkasdfkfg",
Timeout: "600s",
SslVerification: "true",
},
},
},
}
for _, tc := range tcs {
t.Run(tc.desc, func(t *testing.T) {
got := struct {
Sources server.SourceConfigs `yaml:"sources"`
}{}
// Parse contents
err := yaml.Unmarshal(testutils.FormatYaml(tc.in), &got)
if err != nil {
t.Fatalf("unable to unmarshal: %s", err)
}
if !cmp.Equal(tc.want, got.Sources) {
t.Fatalf("incorrect parse: want %v, got %v", tc.want, got.Sources)
}
})
}
}
func TestFailParseFromYamlLooker(t *testing.T) {
tcs := []struct {
desc string
in string
err string
}{
{
desc: "extra field",
in: `
sources:
my-looker-instance:
kind: looker
base_url: http://example.looker.com/
client_id: jasdl;k;tjl
client_secret: sdakl;jgflkasdfkfg
project: test-project
`,
err: "unable to parse source \"my-looker-instance\" as \"looker\": [5:1] unknown field \"project\"\n 2 | client_id: jasdl;k;tjl\n 3 | client_secret: sdakl;jgflkasdfkfg\n 4 | kind: looker\n> 5 | project: test-project\n ^\n",
},
{
desc: "missing required field",
in: `
sources:
my-looker-instance:
kind: looker
base_url: http://example.looker.com/
client_id: jasdl;k;tjl
`,
err: "unable to parse source \"my-looker-instance\" as \"looker\": Key: 'Config.ClientSecret' Error:Field validation for 'ClientSecret' failed on the 'required' tag",
},
}
for _, tc := range tcs {
t.Run(tc.desc, func(t *testing.T) {
got := struct {
Sources server.SourceConfigs `yaml:"sources"`
}{}
// Parse contents
err := yaml.Unmarshal(testutils.FormatYaml(tc.in), &got)
if err == nil {
t.Fatalf("expect parsing to fail")
}
errStr := err.Error()
if errStr != tc.err {
t.Fatalf("unexpected error: got %q, want %q", errStr, tc.err)
}
})
}
}

View File

@@ -0,0 +1,112 @@
// Copyright 2025 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package mongodb
import (
"context"
"fmt"
"github.com/goccy/go-yaml"
"github.com/googleapis/genai-toolbox/internal/sources"
"github.com/googleapis/genai-toolbox/internal/util"
"go.mongodb.org/mongo-driver/mongo"
"go.mongodb.org/mongo-driver/mongo/options"
"go.opentelemetry.io/otel/trace"
)
const SourceKind string = "mongodb"
// validate interface
var _ sources.SourceConfig = Config{}
func init() {
if !sources.Register(SourceKind, newConfig) {
panic(fmt.Sprintf("source kind %q already registered", SourceKind))
}
}
func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (sources.SourceConfig, error) {
actual := Config{Name: name}
if err := decoder.DecodeContext(ctx, &actual); err != nil {
return nil, err
}
return actual, nil
}
type Config struct {
Name string `yaml:"name" validate:"required"`
Kind string `yaml:"kind" validate:"required"`
Uri string `yaml:"uri" validate:"required"` // MongoDB Atlas connection URI
}
func (r Config) SourceConfigKind() string {
return SourceKind
}
func (r Config) Initialize(ctx context.Context, tracer trace.Tracer) (sources.Source, error) {
client, err := initMongoDBClient(ctx, tracer, r.Name, r.Uri)
if err != nil {
return nil, fmt.Errorf("unable to create MongoDB client: %w", err)
}
// Verify the connection
err = client.Ping(ctx, nil)
if err != nil {
return nil, fmt.Errorf("unable to connect successfully: %w", err)
}
s := &Source{
Name: r.Name,
Kind: SourceKind,
Client: client,
}
return s, nil
}
var _ sources.Source = &Source{}
type Source struct {
Name string `yaml:"name"`
Kind string `yaml:"kind"`
Client *mongo.Client
}
func (s *Source) SourceKind() string {
return SourceKind
}
func (s *Source) MongoClient() *mongo.Client {
return s.Client
}
func initMongoDBClient(ctx context.Context, tracer trace.Tracer, name, uri string) (*mongo.Client, error) {
// Start a tracing span
ctx, span := sources.InitConnectionSpan(ctx, tracer, SourceKind, name)
defer span.End()
userAgent, err := util.UserAgentFromContext(ctx)
if err != nil {
return nil, err
}
// Create a new MongoDB client
clientOpts := options.Client().ApplyURI(uri).SetAppName(userAgent)
client, err := mongo.Connect(ctx, clientOpts)
if err != nil {
return nil, fmt.Errorf("unable to create MongoDB client: %w", err)
}
return client, nil
}
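
For orientation, here is a hypothetical sketch (not part of this PR) of how a tool built on this source might use the MongoClient() accessor to run a simple find; the database and collection names are placeholders.

package example

import (
    "context"

    "go.mongodb.org/mongo-driver/bson"

    "github.com/googleapis/genai-toolbox/internal/sources/mongodb"
)

// findActive fetches documents through the client exposed by the source.
func findActive(ctx context.Context, s *mongodb.Source) ([]bson.M, error) {
    coll := s.MongoClient().Database("sample_db").Collection("sample_coll")
    cur, err := coll.Find(ctx, bson.D{{Key: "status", Value: "active"}})
    if err != nil {
        return nil, err
    }
    defer cur.Close(ctx)

    var out []bson.M
    if err := cur.All(ctx, &out); err != nil {
        return nil, err
    }
    return out, nil
}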

View File

@@ -0,0 +1,111 @@
// Copyright 2025 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package mongodb_test
import (
"testing"
yaml "github.com/goccy/go-yaml"
"github.com/google/go-cmp/cmp"
"github.com/googleapis/genai-toolbox/internal/server"
"github.com/googleapis/genai-toolbox/internal/sources/mongodb"
"github.com/googleapis/genai-toolbox/internal/testutils"
)
func TestParseFromYamlMongoDB(t *testing.T) {
tcs := []struct {
desc string
in string
want server.SourceConfigs
}{
{
desc: "basic example",
in: `
sources:
mongo-db:
kind: "mongodb"
uri: "mongodb+srv://username:password@host/dbname"
`,
want: server.SourceConfigs{
"mongo-db": mongodb.Config{
Name: "mongo-db",
Kind: mongodb.SourceKind,
Uri: "mongodb+srv://username:password@host/dbname",
},
},
},
}
for _, tc := range tcs {
t.Run(tc.desc, func(t *testing.T) {
got := struct {
Sources server.SourceConfigs `yaml:"sources"`
}{}
// Parse contents
err := yaml.Unmarshal(testutils.FormatYaml(tc.in), &got)
if err != nil {
t.Fatalf("unable to unmarshal: %s", err)
}
if !cmp.Equal(tc.want, got.Sources) {
t.Fatalf("incorrect parse: want %v, got %v", tc.want, got.Sources)
}
})
}
}
func TestFailParseFromYaml(t *testing.T) {
tcs := []struct {
desc string
in string
err string
}{
{
desc: "extra field",
in: `
sources:
mongo-db:
kind: mongodb
uri: "mongodb+srv://username:password@host/dbname"
foo: bar
`,
err: "unable to parse source \"mongo-db\" as \"mongodb\": [1:1] unknown field \"foo\"\n> 1 | foo: bar\n ^\n 2 | kind: mongodb\n 3 | uri: mongodb+srv://username:password@host/dbname",
},
{
desc: "missing required field",
in: `
sources:
mongo-db:
kind: mongodb
`,
err: "unable to parse source \"mongo-db\" as \"mongodb\": Key: 'Config.Uri' Error:Field validation for 'Uri' failed on the 'required' tag",
},
}
for _, tc := range tcs {
t.Run(tc.desc, func(t *testing.T) {
got := struct {
Sources server.SourceConfigs `yaml:"sources"`
}{}
// Parse contents
err := yaml.Unmarshal(testutils.FormatYaml(tc.in), &got)
if err == nil {
t.Fatalf("expect parsing to fail")
}
errStr := err.Error()
if errStr != tc.err {
t.Fatalf("unexpected error: got \n%q, want \n%q", errStr, tc.err)
}
})
}
}

View File

@@ -15,8 +15,11 @@
package tools
import (
"bytes"
"encoding/json"
"fmt"
"regexp"
"text/template"
)
var validName = regexp.MustCompile(`^[a-zA-Z0-9_-]*$`)
@@ -25,6 +28,7 @@ func IsValidName(s string) bool {
return validName.MatchString(s)
}
// ConvertAnySliceToTyped converts a []any to a typed slice ([]string, []int, []float, etc.)
func ConvertAnySliceToTyped(s []any, itemType string) (any, error) {
var typedSlice any
switch itemType {
@@ -71,3 +75,44 @@ func ConvertAnySliceToTyped(s []any, itemType string) (any, error) {
}
return typedSlice, nil
}
// convertParamToJSON is a Go template helper function that converts a parameter to a JSON-formatted string.
func convertParamToJSON(param any) (string, error) {
jsonData, err := json.Marshal(param)
if err != nil {
return "", fmt.Errorf("failed to marshal param to JSON: %w", err)
}
return string(jsonData), nil
}
// PopulateTemplateWithJSON populates a Go template with a custom `json` array formatter
func PopulateTemplateWithJSON(templateName, templateString string, data map[string]any) (string, error) {
funcMap := template.FuncMap{
"json": convertParamToJSON,
}
tmpl, err := template.New(templateName).Funcs(funcMap).Parse(templateString)
if err != nil {
return "", fmt.Errorf("error parsing template '%s': %w", templateName, err)
}
var result bytes.Buffer
err = tmpl.Execute(&result, data)
if err != nil {
return "", fmt.Errorf("error executing template '%s': %w", templateName, err)
}
return result.String(), nil
}
// CheckDuplicateParameters verifies that there are no duplicate parameter names
func CheckDuplicateParameters(ps Parameters) error {
seenNames := make(map[string]bool)
for _, p := range ps {
pName := p.GetName()
if _, exists := seenNames[pName]; exists {
return fmt.Errorf("parameter name must be unique across all parameter fields. Duplicate parameter: %s", pName)
}
seenNames[pName] = true
}
return nil
}
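
A short usage sketch of the helpers added above (the call site is hypothetical; the MongoDB and HTTP tools in this PR are the real consumers). The custom `json` template function is what lets array parameters be embedded into a JSON payload without manual quoting, and CheckDuplicateParameters fails fast when two parameter lists declare the same name.

package example

import (
    "fmt"

    "github.com/googleapis/genai-toolbox/internal/tools"
)

func demo() error {
    payload := `{"filter": {"_id": {"$in": {{json .ids}}}}, "limit": {{json .limit}}}`
    body, err := tools.PopulateTemplateWithJSON("demoBody", payload, map[string]any{
        "ids":   []string{"a1", "b2"},
        "limit": 10,
    })
    if err != nil {
        return err
    }
    fmt.Println(body) // {"filter": {"_id": {"$in": ["a1","b2"]}}, "limit": 10}
    return nil
}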

View File

@@ -31,6 +31,7 @@ import (
"github.com/googleapis/genai-toolbox/internal/sources"
httpsrc "github.com/googleapis/genai-toolbox/internal/sources/http"
"github.com/googleapis/genai-toolbox/internal/tools"
"github.com/googleapis/genai-toolbox/internal/util"
)
const kind string = "http"
@@ -104,6 +105,16 @@ func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error)
if paramManifest == nil {
paramManifest = make([]tools.ParameterManifest, 0)
}
// Verify there are no duplicate parameter names
seenNames := make(map[string]bool)
for _, param := range paramManifest {
if _, exists := seenNames[param.Name]; exists {
return nil, fmt.Errorf("parameter name must be unique across queryParams, bodyParams, and headerParams. Duplicate parameter: %s", param.Name)
}
seenNames[param.Name] = true
}
pathMcpManifest := cfg.PathParams.McpManifest()
queryMcpManifest := cfg.QueryParams.McpManifest()
bodyMcpManifest := cfg.BodyParams.McpManifest()
@@ -142,15 +153,6 @@ func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error)
Required: concatRequiredManifest,
}
// Verify there are no duplicate parameter names
seenNames := make(map[string]bool)
for _, param := range paramManifest {
if _, exists := seenNames[param.Name]; exists {
return nil, fmt.Errorf("parameter name must be unique across queryParams, bodyParams, and headerParams. Duplicate parameter: %s", param.Name)
}
seenNames[param.Name] = true
}
mcpManifest := tools.McpManifest{
Name: cfg.Name,
Description: cfg.Description,
@@ -206,15 +208,6 @@ type Tool struct {
mcpManifest tools.McpManifest
}
// helper function to convert a parameter to JSON formatted string.
func convertParamToJSON(param any) (string, error) {
jsonData, err := json.Marshal(param)
if err != nil {
return "", fmt.Errorf("failed to marshal param to JSON: %w", err)
}
return string(jsonData), nil
}
// Helper function to generate the HTTP request body upon Tool invocation.
func getRequestBody(bodyParams tools.Parameters, requestBodyPayload string, paramsMap map[string]any) (string, error) {
bodyParamValues, err := tools.GetParams(bodyParams, paramsMap)
@@ -223,20 +216,11 @@ func getRequestBody(bodyParams tools.Parameters, requestBodyPayload string, para
}
bodyParamsMap := bodyParamValues.AsMap()
// Create a FuncMap to format array parameters
funcMap := template.FuncMap{
"json": convertParamToJSON,
}
templ, err := template.New("body").Funcs(funcMap).Parse(requestBodyPayload)
requestBodyStr, err := tools.PopulateTemplateWithJSON("HTTPToolRequestBody", requestBodyPayload, bodyParamsMap)
if err != nil {
return "", fmt.Errorf("error parsing request body: %s", err)
return "", err
}
var result bytes.Buffer
err = templ.Execute(&result, bodyParamsMap)
if err != nil {
return "", fmt.Errorf("error replacing body payload: %s", err)
}
return result.String(), nil
return requestBodyStr, nil
}
// Helper function to generate the HTTP request URL upon Tool invocation.
@@ -330,6 +314,10 @@ func (t Tool) Invoke(ctx context.Context, params tools.ParamValues) (any, error)
req.Header.Set(k, v)
}
if ua, err := util.UserAgentFromContext(ctx); err == nil {
req.Header.Set("User-Agent", ua)
}
// Make request and fetch response
resp, err := t.Client.Do(req)
if err != nil {

View File

@@ -0,0 +1,179 @@
// Copyright 2025 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package lookergetdimensions
import (
"context"
"fmt"
yaml "github.com/goccy/go-yaml"
"github.com/googleapis/genai-toolbox/internal/sources"
lookersrc "github.com/googleapis/genai-toolbox/internal/sources/looker"
"github.com/googleapis/genai-toolbox/internal/tools"
"github.com/googleapis/genai-toolbox/internal/util"
"github.com/looker-open-source/sdk-codegen/go/rtl"
v4 "github.com/looker-open-source/sdk-codegen/go/sdk/v4"
)
const kind string = "looker-get-dimensions"
func init() {
if !tools.Register(kind, newConfig) {
panic(fmt.Sprintf("tool kind %q already registered", kind))
}
}
func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.ToolConfig, error) {
actual := Config{Name: name}
if err := decoder.DecodeContext(ctx, &actual); err != nil {
return nil, err
}
return actual, nil
}
type Config struct {
Name string `yaml:"name" validate:"required"`
Kind string `yaml:"kind" validate:"required"`
Source string `yaml:"source" validate:"required"`
Description string `yaml:"description" validate:"required"`
AuthRequired []string `yaml:"authRequired"`
}
// validate interface
var _ tools.ToolConfig = Config{}
func (cfg Config) ToolConfigKind() string {
return kind
}
func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error) {
// verify source exists
rawS, ok := srcs[cfg.Source]
if !ok {
return nil, fmt.Errorf("no source named %q configured", cfg.Source)
}
// verify the source is compatible
s, ok := rawS.(*lookersrc.Source)
if !ok {
return nil, fmt.Errorf("invalid source for %q tool: source kind must be `looker`", kind)
}
modelParameter := tools.NewStringParameter("model", "The model containing the explore.")
exploreParameter := tools.NewStringParameter("explore", "The explore containing the dimensions.")
parameters := tools.Parameters{modelParameter, exploreParameter}
mcpManifest := tools.McpManifest{
Name: cfg.Name,
Description: cfg.Description,
InputSchema: parameters.McpManifest(),
}
// finish tool setup
return Tool{
Name: cfg.Name,
Kind: kind,
Parameters: parameters,
AuthRequired: cfg.AuthRequired,
Client: s.Client,
ApiSettings: s.ApiSettings,
manifest: tools.Manifest{
Description: cfg.Description,
Parameters: parameters.Manifest(),
AuthRequired: cfg.AuthRequired,
},
mcpManifest: mcpManifest,
}, nil
}
// validate interface
var _ tools.Tool = Tool{}
type Tool struct {
Name string `yaml:"name"`
Kind string `yaml:"kind"`
Client *v4.LookerSDK
ApiSettings *rtl.ApiSettings
AuthRequired []string `yaml:"authRequired"`
Parameters tools.Parameters `yaml:"parameters"`
manifest tools.Manifest
mcpManifest tools.McpManifest
}
func (t Tool) Invoke(ctx context.Context, params tools.ParamValues) (any, error) {
logger, err := util.LoggerFromContext(ctx)
if err != nil {
return nil, fmt.Errorf("unable to get logger from ctx: %s", err)
}
mapParams := params.AsMap()
model, ok := mapParams["model"].(string)
if !ok {
return nil, fmt.Errorf("'model' must be a string, got %T", mapParams["model"])
}
explore, ok := mapParams["explore"].(string)
if !ok {
return nil, fmt.Errorf("'explore' must be a string, got %T", mapParams["explore"])
}
fields := "fields(dimensions(name,type,label,label_short))"
req := v4.RequestLookmlModelExplore{
LookmlModelName: model,
ExploreName: explore,
Fields: &fields,
}
resp, err := t.Client.LookmlModelExplore(req, t.ApiSettings)
if err != nil {
return nil, fmt.Errorf("error making get_dimensions request: %s", err)
}
var data []any
for _, v := range *resp.Fields.Dimensions {
logger.DebugContext(ctx, "Got response element of %v\n", v)
vMap := make(map[string]any)
if v.Name != nil {
vMap["name"] = *v.Name
}
if v.Type != nil {
vMap["type"] = *v.Type
}
if v.Label != nil {
vMap["label"] = *v.Label
}
if v.LabelShort != nil {
vMap["label_short"] = *v.LabelShort
}
logger.DebugContext(ctx, "Converted to %v\n", vMap)
data = append(data, vMap)
}
logger.DebugContext(ctx, "data = ", data)
return data, nil
}
func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (tools.ParamValues, error) {
return tools.ParseParams(t.Parameters, data, claims)
}
func (t Tool) Manifest() tools.Manifest {
return t.manifest
}
func (t Tool) McpManifest() tools.McpManifest {
return t.mcpManifest
}
func (t Tool) Authorized(verifiedAuthServices []string) bool {
return true
}

View File

@@ -0,0 +1,116 @@
// Copyright 2025 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package lookergetdimensions_test
import (
"strings"
"testing"
yaml "github.com/goccy/go-yaml"
"github.com/google/go-cmp/cmp"
"github.com/googleapis/genai-toolbox/internal/server"
"github.com/googleapis/genai-toolbox/internal/testutils"
lkr "github.com/googleapis/genai-toolbox/internal/tools/looker/lookergetdimensions"
)
func TestParseFromYamlLookerGetDimensions(t *testing.T) {
ctx, err := testutils.ContextWithNewLogger()
if err != nil {
t.Fatalf("unexpected error: %s", err)
}
tcs := []struct {
desc string
in string
want server.ToolConfigs
}{
{
desc: "basic example",
in: `
tools:
example_tool:
kind: looker-get-dimensions
source: my-instance
description: some description
`,
want: server.ToolConfigs{
"example_tool": lkr.Config{
Name: "example_tool",
Kind: "looker-get-dimensions",
Source: "my-instance",
Description: "some description",
AuthRequired: []string{},
},
},
},
}
for _, tc := range tcs {
t.Run(tc.desc, func(t *testing.T) {
got := struct {
Tools server.ToolConfigs `yaml:"tools"`
}{}
// Parse contents
err := yaml.UnmarshalContext(ctx, testutils.FormatYaml(tc.in), &got)
if err != nil {
t.Fatalf("unable to unmarshal: %s", err)
}
if diff := cmp.Diff(tc.want, got.Tools); diff != "" {
t.Fatalf("incorrect parse: diff %v", diff)
}
})
}
}
func TestFailParseFromYamlLookerGetDimensions(t *testing.T) {
ctx, err := testutils.ContextWithNewLogger()
if err != nil {
t.Fatalf("unexpected error: %s", err)
}
tcs := []struct {
desc string
in string
err string
}{
{
desc: "Invalid method",
in: `
tools:
example_tool:
kind: looker-get-dimensions
source: my-instance
method: GOT
description: some description
`,
err: "unable to parse tool \"example_tool\" as kind \"looker-get-dimensions\": [4:1] unknown field \"method\"\n 1 | authRequired: []\n 2 | description: some description\n 3 | kind: looker-get-dimensions\n> 4 | method: GOT\n ^\n 5 | source: my-instance",
},
}
for _, tc := range tcs {
t.Run(tc.desc, func(t *testing.T) {
got := struct {
Tools server.ToolConfigs `yaml:"tools"`
}{}
// Parse contents
err := yaml.UnmarshalContext(ctx, testutils.FormatYaml(tc.in), &got)
if err == nil {
t.Fatalf("expect parsing to fail")
}
errStr := err.Error()
if !strings.Contains(errStr, tc.err) {
t.Fatalf("unexpected error string: got %q, want substring %q", errStr, tc.err)
}
})
}
}

View File

@@ -0,0 +1,165 @@
// Copyright 2025 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package lookergetexplores
import (
"context"
"fmt"
yaml "github.com/goccy/go-yaml"
"github.com/googleapis/genai-toolbox/internal/sources"
lookersrc "github.com/googleapis/genai-toolbox/internal/sources/looker"
"github.com/googleapis/genai-toolbox/internal/tools"
"github.com/googleapis/genai-toolbox/internal/util"
"github.com/looker-open-source/sdk-codegen/go/rtl"
v4 "github.com/looker-open-source/sdk-codegen/go/sdk/v4"
)
const kind string = "looker-get-explores"
func init() {
if !tools.Register(kind, newConfig) {
panic(fmt.Sprintf("tool kind %q already registered", kind))
}
}
func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.ToolConfig, error) {
actual := Config{Name: name}
if err := decoder.DecodeContext(ctx, &actual); err != nil {
return nil, err
}
return actual, nil
}
type Config struct {
Name string `yaml:"name" validate:"required"`
Kind string `yaml:"kind" validate:"required"`
Source string `yaml:"source" validate:"required"`
Description string `yaml:"description" validate:"required"`
AuthRequired []string `yaml:"authRequired"`
}
// validate interface
var _ tools.ToolConfig = Config{}
func (cfg Config) ToolConfigKind() string {
return kind
}
func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error) {
// verify source exists
rawS, ok := srcs[cfg.Source]
if !ok {
return nil, fmt.Errorf("no source named %q configured", cfg.Source)
}
// verify the source is compatible
s, ok := rawS.(*lookersrc.Source)
if !ok {
return nil, fmt.Errorf("invalid source for %q tool: source kind must be `looker`", kind)
}
modelParameter := tools.NewStringParameter("model", "The model containing the explores.")
parameters := tools.Parameters{modelParameter}
mcpManifest := tools.McpManifest{
Name: cfg.Name,
Description: cfg.Description,
InputSchema: parameters.McpManifest(),
}
// finish tool setup
return Tool{
Name: cfg.Name,
Kind: kind,
Parameters: parameters,
AuthRequired: cfg.AuthRequired,
Client: s.Client,
ApiSettings: s.ApiSettings,
manifest: tools.Manifest{
Description: cfg.Description,
Parameters: parameters.Manifest(),
AuthRequired: cfg.AuthRequired,
},
mcpManifest: mcpManifest,
}, nil
}
// validate interface
var _ tools.Tool = Tool{}
type Tool struct {
Name string `yaml:"name"`
Kind string `yaml:"kind"`
Client *v4.LookerSDK
ApiSettings *rtl.ApiSettings
AuthRequired []string `yaml:"authRequired"`
Parameters tools.Parameters `yaml:"parameters"`
manifest tools.Manifest
mcpManifest tools.McpManifest
}
func (t Tool) Invoke(ctx context.Context, params tools.ParamValues) (any, error) {
logger, err := util.LoggerFromContext(ctx)
if err != nil {
return nil, fmt.Errorf("unable to get logger from ctx: %s", err)
}
mapParams := params.AsMap()
model, ok := mapParams["model"].(string)
if !ok {
return nil, fmt.Errorf("'model' must be a string, got %T", mapParams["model"])
}
resp, err := t.Client.LookmlModel(model, "explores(name,label,group_label)", t.ApiSettings)
if err != nil {
return nil, fmt.Errorf("error making get_explores request: %s", err)
}
var data []any
for _, v := range *resp.Explores {
logger.DebugContext(ctx, "Got response element of %v\n", v)
vMap := make(map[string]any)
if v.Name != nil {
vMap["name"] = *v.Name
}
if v.Label != nil {
vMap["label"] = *v.Label
}
if v.GroupLabel != nil {
vMap["group_label"] = *v.GroupLabel
}
logger.DebugContext(ctx, "Converted to %v\n", vMap)
data = append(data, vMap)
}
logger.DebugContext(ctx, "data = ", data)
return data, nil
}
func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (tools.ParamValues, error) {
return tools.ParseParams(t.Parameters, data, claims)
}
func (t Tool) Manifest() tools.Manifest {
return t.manifest
}
func (t Tool) McpManifest() tools.McpManifest {
return t.mcpManifest
}
func (t Tool) Authorized(verifiedAuthServices []string) bool {
return true
}

View File

@@ -0,0 +1,116 @@
// Copyright 2025 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package lookergetexplores_test
import (
"strings"
"testing"
yaml "github.com/goccy/go-yaml"
"github.com/google/go-cmp/cmp"
"github.com/googleapis/genai-toolbox/internal/server"
"github.com/googleapis/genai-toolbox/internal/testutils"
lkr "github.com/googleapis/genai-toolbox/internal/tools/looker/lookergetexplores"
)
func TestParseFromYamlLookerGetExplores(t *testing.T) {
ctx, err := testutils.ContextWithNewLogger()
if err != nil {
t.Fatalf("unexpected error: %s", err)
}
tcs := []struct {
desc string
in string
want server.ToolConfigs
}{
{
desc: "basic example",
in: `
tools:
example_tool:
kind: looker-get-explores
source: my-instance
description: some description
`,
want: server.ToolConfigs{
"example_tool": lkr.Config{
Name: "example_tool",
Kind: "looker-get-explores",
Source: "my-instance",
Description: "some description",
AuthRequired: []string{},
},
},
},
}
for _, tc := range tcs {
t.Run(tc.desc, func(t *testing.T) {
got := struct {
Tools server.ToolConfigs `yaml:"tools"`
}{}
// Parse contents
err := yaml.UnmarshalContext(ctx, testutils.FormatYaml(tc.in), &got)
if err != nil {
t.Fatalf("unable to unmarshal: %s", err)
}
if diff := cmp.Diff(tc.want, got.Tools); diff != "" {
t.Fatalf("incorrect parse: diff %v", diff)
}
})
}
}
func TestFailParseFromYamlLookerGetExplores(t *testing.T) {
ctx, err := testutils.ContextWithNewLogger()
if err != nil {
t.Fatalf("unexpected error: %s", err)
}
tcs := []struct {
desc string
in string
err string
}{
{
desc: "Invalid method",
in: `
tools:
example_tool:
kind: looker-get-explores
source: my-instance
method: GOT
description: some description
`,
err: "unable to parse tool \"example_tool\" as kind \"looker-get-explores\": [4:1] unknown field \"method\"\n 1 | authRequired: []\n 2 | description: some description\n 3 | kind: looker-get-explores\n> 4 | method: GOT\n ^\n 5 | source: my-instance",
},
}
for _, tc := range tcs {
t.Run(tc.desc, func(t *testing.T) {
got := struct {
Tools server.ToolConfigs `yaml:"tools"`
}{}
// Parse contents
err := yaml.UnmarshalContext(ctx, testutils.FormatYaml(tc.in), &got)
if err == nil {
t.Fatalf("expect parsing to fail")
}
errStr := err.Error()
if !strings.Contains(errStr, tc.err) {
t.Fatalf("unexpected error string: got %q, want substring %q", errStr, tc.err)
}
})
}
}

View File

@@ -0,0 +1,179 @@
// Copyright 2025 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package lookergetfilters
import (
"context"
"fmt"
yaml "github.com/goccy/go-yaml"
"github.com/googleapis/genai-toolbox/internal/sources"
lookersrc "github.com/googleapis/genai-toolbox/internal/sources/looker"
"github.com/googleapis/genai-toolbox/internal/tools"
"github.com/googleapis/genai-toolbox/internal/util"
"github.com/looker-open-source/sdk-codegen/go/rtl"
v4 "github.com/looker-open-source/sdk-codegen/go/sdk/v4"
)
const kind string = "looker-get-filters"
func init() {
if !tools.Register(kind, newConfig) {
panic(fmt.Sprintf("tool kind %q already registered", kind))
}
}
func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.ToolConfig, error) {
actual := Config{Name: name}
if err := decoder.DecodeContext(ctx, &actual); err != nil {
return nil, err
}
return actual, nil
}
type Config struct {
Name string `yaml:"name" validate:"required"`
Kind string `yaml:"kind" validate:"required"`
Source string `yaml:"source" validate:"required"`
Description string `yaml:"description" validate:"required"`
AuthRequired []string `yaml:"authRequired"`
}
// validate interface
var _ tools.ToolConfig = Config{}
func (cfg Config) ToolConfigKind() string {
return kind
}
func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error) {
// verify source exists
rawS, ok := srcs[cfg.Source]
if !ok {
return nil, fmt.Errorf("no source named %q configured", cfg.Source)
}
// verify the source is compatible
s, ok := rawS.(*lookersrc.Source)
if !ok {
return nil, fmt.Errorf("invalid source for %q tool: source kind must be `looker`", kind)
}
modelParameter := tools.NewStringParameter("model", "The model containing the explore.")
exploreParameter := tools.NewStringParameter("explore", "The explore containing the filters.")
parameters := tools.Parameters{modelParameter, exploreParameter}
mcpManifest := tools.McpManifest{
Name: cfg.Name,
Description: cfg.Description,
InputSchema: parameters.McpManifest(),
}
// finish tool setup
return Tool{
Name: cfg.Name,
Kind: kind,
Parameters: parameters,
AuthRequired: cfg.AuthRequired,
Client: s.Client,
ApiSettings: s.ApiSettings,
manifest: tools.Manifest{
Description: cfg.Description,
Parameters: parameters.Manifest(),
AuthRequired: cfg.AuthRequired,
},
mcpManifest: mcpManifest,
}, nil
}
// validate interface
var _ tools.Tool = Tool{}
type Tool struct {
Name string `yaml:"name"`
Kind string `yaml:"kind"`
Client *v4.LookerSDK
ApiSettings *rtl.ApiSettings
AuthRequired []string `yaml:"authRequired"`
Parameters tools.Parameters `yaml:"parameters"`
manifest tools.Manifest
mcpManifest tools.McpManifest
}
func (t Tool) Invoke(ctx context.Context, params tools.ParamValues) (any, error) {
logger, err := util.LoggerFromContext(ctx)
if err != nil {
return nil, fmt.Errorf("unable to get logger from ctx: %s", err)
}
mapParams := params.AsMap()
model, ok := mapParams["model"].(string)
if !ok {
return nil, fmt.Errorf("'model' must be a string, got %T", mapParams["model"])
}
explore, ok := mapParams["explore"].(string)
if !ok {
return nil, fmt.Errorf("'explore' must be a string, got %T", mapParams["explore"])
}
fields := "fields(filters(name,type,label,label_short))"
req := v4.RequestLookmlModelExplore{
LookmlModelName: model,
ExploreName: explore,
Fields: &fields,
}
resp, err := t.Client.LookmlModelExplore(req, t.ApiSettings)
if err != nil {
return nil, fmt.Errorf("error making get_filters request: %s", err)
}
var data []any
for _, v := range *resp.Fields.Filters {
logger.DebugContext(ctx, "Got response element of %v\n", v)
vMap := make(map[string]any)
if v.Name != nil {
vMap["name"] = *v.Name
}
if v.Type != nil {
vMap["type"] = *v.Type
}
if v.Label != nil {
vMap["label"] = *v.Label
}
if v.LabelShort != nil {
vMap["label_short"] = *v.LabelShort
}
logger.DebugContext(ctx, "Converted to %v\n", vMap)
data = append(data, vMap)
}
logger.DebugContext(ctx, "data = ", data)
return data, nil
}
func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (tools.ParamValues, error) {
return tools.ParseParams(t.Parameters, data, claims)
}
func (t Tool) Manifest() tools.Manifest {
return t.manifest
}
func (t Tool) McpManifest() tools.McpManifest {
return t.mcpManifest
}
func (t Tool) Authorized(verifiedAuthServices []string) bool {
return true
}

View File

@@ -0,0 +1,116 @@
// Copyright 2025 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package lookergetfilters_test
import (
"strings"
"testing"
yaml "github.com/goccy/go-yaml"
"github.com/google/go-cmp/cmp"
"github.com/googleapis/genai-toolbox/internal/server"
"github.com/googleapis/genai-toolbox/internal/testutils"
lkr "github.com/googleapis/genai-toolbox/internal/tools/looker/lookergetfilters"
)
func TestParseFromYamlLookerGetFilters(t *testing.T) {
ctx, err := testutils.ContextWithNewLogger()
if err != nil {
t.Fatalf("unexpected error: %s", err)
}
tcs := []struct {
desc string
in string
want server.ToolConfigs
}{
{
desc: "basic example",
in: `
tools:
example_tool:
kind: looker-get-filters
source: my-instance
description: some description
`,
want: server.ToolConfigs{
"example_tool": lkr.Config{
Name: "example_tool",
Kind: "looker-get-filters",
Source: "my-instance",
Description: "some description",
AuthRequired: []string{},
},
},
},
}
for _, tc := range tcs {
t.Run(tc.desc, func(t *testing.T) {
got := struct {
Tools server.ToolConfigs `yaml:"tools"`
}{}
// Parse contents
err := yaml.UnmarshalContext(ctx, testutils.FormatYaml(tc.in), &got)
if err != nil {
t.Fatalf("unable to unmarshal: %s", err)
}
if diff := cmp.Diff(tc.want, got.Tools); diff != "" {
t.Fatalf("incorrect parse: diff %v", diff)
}
})
}
}
func TestFailParseFromYamlLookerGetFilters(t *testing.T) {
ctx, err := testutils.ContextWithNewLogger()
if err != nil {
t.Fatalf("unexpected error: %s", err)
}
tcs := []struct {
desc string
in string
err string
}{
{
desc: "Invalid method",
in: `
tools:
example_tool:
kind: looker-get-filters
source: my-instance
method: GOT
description: some description
`,
err: "unable to parse tool \"example_tool\" as kind \"looker-get-filters\": [4:1] unknown field \"method\"\n 1 | authRequired: []\n 2 | description: some description\n 3 | kind: looker-get-filters\n> 4 | method: GOT\n ^\n 5 | source: my-instance",
},
}
for _, tc := range tcs {
t.Run(tc.desc, func(t *testing.T) {
got := struct {
Tools server.ToolConfigs `yaml:"tools"`
}{}
// Parse contents
err := yaml.UnmarshalContext(ctx, testutils.FormatYaml(tc.in), &got)
if err == nil {
t.Fatalf("expect parsing to fail")
}
errStr := err.Error()
if !strings.Contains(errStr, tc.err) {
t.Fatalf("unexpected error string: got %q, want substring %q", errStr, tc.err)
}
})
}
}

View File

@@ -0,0 +1,188 @@
// Copyright 2025 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package lookergetlooks
import (
"context"
"fmt"
yaml "github.com/goccy/go-yaml"
"github.com/googleapis/genai-toolbox/internal/sources"
lookersrc "github.com/googleapis/genai-toolbox/internal/sources/looker"
"github.com/googleapis/genai-toolbox/internal/tools"
"github.com/googleapis/genai-toolbox/internal/util"
"github.com/looker-open-source/sdk-codegen/go/rtl"
v4 "github.com/looker-open-source/sdk-codegen/go/sdk/v4"
)
const kind string = "looker-get-looks"
func init() {
if !tools.Register(kind, newConfig) {
panic(fmt.Sprintf("tool kind %q already registered", kind))
}
}
func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.ToolConfig, error) {
actual := Config{Name: name}
if err := decoder.DecodeContext(ctx, &actual); err != nil {
return nil, err
}
return actual, nil
}
type Config struct {
Name string `yaml:"name" validate:"required"`
Kind string `yaml:"kind" validate:"required"`
Source string `yaml:"source" validate:"required"`
Description string `yaml:"description" validate:"required"`
AuthRequired []string `yaml:"authRequired"`
}
// validate interface
var _ tools.ToolConfig = Config{}
func (cfg Config) ToolConfigKind() string {
return kind
}
func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error) {
// verify source exists
rawS, ok := srcs[cfg.Source]
if !ok {
return nil, fmt.Errorf("no source named %q configured", cfg.Source)
}
// verify the source is compatible
s, ok := rawS.(*lookersrc.Source)
if !ok {
return nil, fmt.Errorf("invalid source for %q tool: source kind must be `looker`", kind)
}
titleParameter := tools.NewStringParameterWithDefault("title", "", "The title of the look.")
descParameter := tools.NewStringParameterWithDefault("desc", "", "The description of the look.")
limitParameter := tools.NewIntParameterWithDefault("limit", 100, "The number of looks to fetch. Default 100")
offsetParameter := tools.NewIntParameterWithDefault("offset", 0, "The number of looks to skip before fetching. Default 0")
parameters := tools.Parameters{
titleParameter,
descParameter,
limitParameter,
offsetParameter,
}
mcpManifest := tools.McpManifest{
Name: cfg.Name,
Description: cfg.Description,
InputSchema: parameters.McpManifest(),
}
// finish tool setup
return Tool{
Name: cfg.Name,
Kind: kind,
Parameters: parameters,
AuthRequired: cfg.AuthRequired,
Client: s.Client,
ApiSettings: s.ApiSettings,
manifest: tools.Manifest{
Description: cfg.Description,
Parameters: parameters.Manifest(),
AuthRequired: cfg.AuthRequired,
},
mcpManifest: mcpManifest,
}, nil
}
// validate interface
var _ tools.Tool = Tool{}
type Tool struct {
Name string `yaml:"name"`
Kind string `yaml:"kind"`
Client *v4.LookerSDK
ApiSettings *rtl.ApiSettings
AuthRequired []string `yaml:"authRequired"`
Parameters tools.Parameters `yaml:"parameters"`
manifest tools.Manifest
mcpManifest tools.McpManifest
}
func (t Tool) Invoke(ctx context.Context, params tools.ParamValues) (any, error) {
logger, err := util.LoggerFromContext(ctx)
if err != nil {
return nil, fmt.Errorf("unable to get logger from ctx: %s", err)
}
paramsMap := params.AsMap()
title := paramsMap["title"].(string)
titlePtr := &title
if *titlePtr == "" {
titlePtr = nil
}
desc := paramsMap["desc"].(string)
descPtr := &desc
if *descPtr == "" {
descPtr = nil
}
limit := int64(paramsMap["limit"].(int))
offset := int64(paramsMap["offset"].(int))
req := v4.RequestSearchLooks{
Title: titlePtr,
Description: descPtr,
Limit: &limit,
Offset: &offset,
}
resp, err := t.Client.SearchLooks(req, t.ApiSettings)
if err != nil {
return nil, fmt.Errorf("error making get_looks request: %s", err)
}
var data []any
for _, v := range resp {
logger.DebugContext(ctx, "Got response element of %v\n", v)
vMap := make(map[string]any)
if v.Id != nil {
vMap["id"] = *v.Id
}
if v.Title != nil {
vMap["title"] = *v.Title
}
if v.Description != nil {
vMap["description"] = *v.Description
}
vMap["model_id"] = *v.Model.Id
logger.DebugContext(ctx, "Converted to %v\n", vMap)
data = append(data, vMap)
}
logger.DebugContext(ctx, "data = ", data)
return data, nil
}
func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (tools.ParamValues, error) {
return tools.ParseParams(t.Parameters, data, claims)
}
func (t Tool) Manifest() tools.Manifest {
return t.manifest
}
func (t Tool) McpManifest() tools.McpManifest {
return t.mcpManifest
}
func (t Tool) Authorized(verifiedAuthServices []string) bool {
return true
}
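For context on the title/desc handling in Invoke above: the empty-string defaults are collapsed into nil pointers so that unset filters are simply omitted from the SearchLooks request. A minimal, standalone sketch of that pattern (the helper name strPtrOrNil is ours, for illustration only, not part of the tool):

package main

import "fmt"

// strPtrOrNil returns nil for the empty string so an unset filter is
// omitted from the request, mirroring the title/desc handling above.
func strPtrOrNil(s string) *string {
	if s == "" {
		return nil
	}
	return &s
}

func main() {
	title := strPtrOrNil("")       // nil: no title filter is sent
	desc := strPtrOrNil("revenue") // non-nil: filter on the description
	fmt.Println(title == nil, *desc) // true revenue
}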

View File

@@ -0,0 +1,116 @@
// Copyright 2025 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package lookergetlooks_test
import (
"strings"
"testing"
yaml "github.com/goccy/go-yaml"
"github.com/google/go-cmp/cmp"
"github.com/googleapis/genai-toolbox/internal/server"
"github.com/googleapis/genai-toolbox/internal/testutils"
lkr "github.com/googleapis/genai-toolbox/internal/tools/looker/lookergetlooks"
)
func TestParseFromYamlLookerGetLooks(t *testing.T) {
ctx, err := testutils.ContextWithNewLogger()
if err != nil {
t.Fatalf("unexpected error: %s", err)
}
tcs := []struct {
desc string
in string
want server.ToolConfigs
}{
{
desc: "basic example",
in: `
tools:
example_tool:
kind: looker-get-looks
source: my-instance
description: some description
`,
want: server.ToolConfigs{
"example_tool": lkr.Config{
Name: "example_tool",
Kind: "looker-get-looks",
Source: "my-instance",
Description: "some description",
AuthRequired: []string{},
},
},
},
}
for _, tc := range tcs {
t.Run(tc.desc, func(t *testing.T) {
got := struct {
Tools server.ToolConfigs `yaml:"tools"`
}{}
// Parse contents
err := yaml.UnmarshalContext(ctx, testutils.FormatYaml(tc.in), &got)
if err != nil {
t.Fatalf("unable to unmarshal: %s", err)
}
if diff := cmp.Diff(tc.want, got.Tools); diff != "" {
t.Fatalf("incorrect parse: diff %v", diff)
}
})
}
}
func TestFailParseFromYamlLookerGetLooks(t *testing.T) {
ctx, err := testutils.ContextWithNewLogger()
if err != nil {
t.Fatalf("unexpected error: %s", err)
}
tcs := []struct {
desc string
in string
err string
}{
{
desc: "Invalid method",
in: `
tools:
example_tool:
kind: looker-get-looks
source: my-instance
method: GOT
description: some description
`,
err: "unable to parse tool \"example_tool\" as kind \"looker-get-looks\": [4:1] unknown field \"method\"\n 1 | authRequired: []\n 2 | description: some description\n 3 | kind: looker-get-looks\n> 4 | method: GOT\n ^\n 5 | source: my-instance",
},
}
for _, tc := range tcs {
t.Run(tc.desc, func(t *testing.T) {
got := struct {
Tools server.ToolConfigs `yaml:"tools"`
}{}
// Parse contents
err := yaml.UnmarshalContext(ctx, testutils.FormatYaml(tc.in), &got)
if err == nil {
t.Fatalf("expect parsing to fail")
}
errStr := err.Error()
if !strings.Contains(errStr, tc.err) {
t.Fatalf("unexpected error string: got %q, want substring %q", errStr, tc.err)
}
})
}
}
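As a rough illustration of what this table-driven test exercises, the same YAML shape can be decoded directly with goccy/go-yaml. The toolEntry struct below is a simplified stand-in for server.ToolConfigs, which additionally dispatches on the kind field; it is not part of the codebase:

package main

import (
	"fmt"

	yaml "github.com/goccy/go-yaml"
)

// toolEntry is an illustrative stand-in for a single tool config entry.
type toolEntry struct {
	Kind        string `yaml:"kind"`
	Source      string `yaml:"source"`
	Description string `yaml:"description"`
}

func main() {
	in := []byte(`
tools:
  example_tool:
    kind: looker-get-looks
    source: my-instance
    description: some description
`)
	var doc struct {
		Tools map[string]toolEntry `yaml:"tools"`
	}
	if err := yaml.Unmarshal(in, &doc); err != nil {
		panic(err)
	}
	fmt.Printf("%+v\n", doc.Tools["example_tool"])
}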

View File

@@ -0,0 +1,179 @@
// Copyright 2025 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package lookergetmeasures
import (
"context"
"fmt"
yaml "github.com/goccy/go-yaml"
"github.com/googleapis/genai-toolbox/internal/sources"
lookersrc "github.com/googleapis/genai-toolbox/internal/sources/looker"
"github.com/googleapis/genai-toolbox/internal/tools"
"github.com/googleapis/genai-toolbox/internal/util"
"github.com/looker-open-source/sdk-codegen/go/rtl"
v4 "github.com/looker-open-source/sdk-codegen/go/sdk/v4"
)
const kind string = "looker-get-measures"
func init() {
if !tools.Register(kind, newConfig) {
panic(fmt.Sprintf("tool kind %q already registered", kind))
}
}
func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.ToolConfig, error) {
actual := Config{Name: name}
if err := decoder.DecodeContext(ctx, &actual); err != nil {
return nil, err
}
return actual, nil
}
type Config struct {
Name string `yaml:"name" validate:"required"`
Kind string `yaml:"kind" validate:"required"`
Source string `yaml:"source" validate:"required"`
Description string `yaml:"description" validate:"required"`
AuthRequired []string `yaml:"authRequired"`
}
// validate interface
var _ tools.ToolConfig = Config{}
func (cfg Config) ToolConfigKind() string {
return kind
}
func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error) {
// verify source exists
rawS, ok := srcs[cfg.Source]
if !ok {
return nil, fmt.Errorf("no source named %q configured", cfg.Source)
}
// verify the source is compatible
s, ok := rawS.(*lookersrc.Source)
if !ok {
return nil, fmt.Errorf("invalid source for %q tool: source kind must be `looker`", kind)
}
modelParameter := tools.NewStringParameter("model", "The model containing the explore.")
exploreParameter := tools.NewStringParameter("explore", "The explore containing the measures.")
parameters := tools.Parameters{modelParameter, exploreParameter}
mcpManifest := tools.McpManifest{
Name: cfg.Name,
Description: cfg.Description,
InputSchema: parameters.McpManifest(),
}
// finish tool setup
return Tool{
Name: cfg.Name,
Kind: kind,
Parameters: parameters,
AuthRequired: cfg.AuthRequired,
Client: s.Client,
ApiSettings: s.ApiSettings,
manifest: tools.Manifest{
Description: cfg.Description,
Parameters: parameters.Manifest(),
AuthRequired: cfg.AuthRequired,
},
mcpManifest: mcpManifest,
}, nil
}
// validate interface
var _ tools.Tool = Tool{}
type Tool struct {
Name string `yaml:"name"`
Kind string `yaml:"kind"`
Client *v4.LookerSDK
ApiSettings *rtl.ApiSettings
AuthRequired []string `yaml:"authRequired"`
Parameters tools.Parameters `yaml:"parameters"`
manifest tools.Manifest
mcpManifest tools.McpManifest
}
func (t Tool) Invoke(ctx context.Context, params tools.ParamValues) (any, error) {
logger, err := util.LoggerFromContext(ctx)
if err != nil {
return nil, fmt.Errorf("unable to get logger from ctx: %s", err)
}
mapParams := params.AsMap()
model, ok := mapParams["model"].(string)
if !ok {
return nil, fmt.Errorf("'model' must be a string, got %T", mapParams["model"])
}
explore, ok := mapParams["explore"].(string)
if !ok {
return nil, fmt.Errorf("'explore' must be a string, got %T", mapParams["explore"])
}
fields := "fields(measures(name,type,label,label_short))"
req := v4.RequestLookmlModelExplore{
LookmlModelName: model,
ExploreName: explore,
Fields: &fields,
}
resp, err := t.Client.LookmlModelExplore(req, t.ApiSettings)
if err != nil {
return nil, fmt.Errorf("error making get_measures request: %s", err)
}
var data []any
if resp.Fields == nil || resp.Fields.Measures == nil {
return data, nil
}
for _, v := range *resp.Fields.Measures {
logger.DebugContext(ctx, "Got response element of %v\n", v)
vMap := make(map[string]any)
if v.Name != nil {
vMap["name"] = *v.Name
}
if v.Type != nil {
vMap["type"] = *v.Type
}
if v.Label != nil {
vMap["label"] = *v.Label
}
if v.LabelShort != nil {
vMap["label_short"] = *v.LabelShort
}
logger.DebugContext(ctx, "Converted to %v\n", vMap)
data = append(data, vMap)
}
logger.DebugContext(ctx, "data = ", data)
return data, nil
}
func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (tools.ParamValues, error) {
return tools.ParseParams(t.Parameters, data, claims)
}
func (t Tool) Manifest() tools.Manifest {
return t.manifest
}
func (t Tool) McpManifest() tools.McpManifest {
return t.mcpManifest
}
func (t Tool) Authorized(verifiedAuthServices []string) bool {
return true
}
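The response loop in Invoke above copies only the measure fields that are actually set, since every field on the SDK struct is a pointer that may be nil. A self-contained sketch of that copy step, using a local stand-in struct rather than the real v4 type:

package main

import "fmt"

// measure stands in for the SDK field type, whose members are pointers
// because the API may omit any of them.
type measure struct {
	Name, Type, Label, LabelShort *string
}

// toMap copies only the populated fields, matching the loop in Invoke above.
func toMap(m measure) map[string]any {
	out := make(map[string]any)
	if m.Name != nil {
		out["name"] = *m.Name
	}
	if m.Type != nil {
		out["type"] = *m.Type
	}
	if m.Label != nil {
		out["label"] = *m.Label
	}
	if m.LabelShort != nil {
		out["label_short"] = *m.LabelShort
	}
	return out
}

func main() {
	name, typ := "orders.count", "count"
	fmt.Println(toMap(measure{Name: &name, Type: &typ})) // map[name:orders.count type:count]
}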

View File

@@ -0,0 +1,116 @@
// Copyright 2025 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package lookergetmeasures_test
import (
"strings"
"testing"
yaml "github.com/goccy/go-yaml"
"github.com/google/go-cmp/cmp"
"github.com/googleapis/genai-toolbox/internal/server"
"github.com/googleapis/genai-toolbox/internal/testutils"
lkr "github.com/googleapis/genai-toolbox/internal/tools/looker/lookergetmeasures"
)
func TestParseFromYamlLookerGetMeasures(t *testing.T) {
ctx, err := testutils.ContextWithNewLogger()
if err != nil {
t.Fatalf("unexpected error: %s", err)
}
tcs := []struct {
desc string
in string
want server.ToolConfigs
}{
{
desc: "basic example",
in: `
tools:
example_tool:
kind: looker-get-measures
source: my-instance
description: some description
`,
want: server.ToolConfigs{
"example_tool": lkr.Config{
Name: "example_tool",
Kind: "looker-get-measures",
Source: "my-instance",
Description: "some description",
AuthRequired: []string{},
},
},
},
}
for _, tc := range tcs {
t.Run(tc.desc, func(t *testing.T) {
got := struct {
Tools server.ToolConfigs `yaml:"tools"`
}{}
// Parse contents
err := yaml.UnmarshalContext(ctx, testutils.FormatYaml(tc.in), &got)
if err != nil {
t.Fatalf("unable to unmarshal: %s", err)
}
if diff := cmp.Diff(tc.want, got.Tools); diff != "" {
t.Fatalf("incorrect parse: diff %v", diff)
}
})
}
}
func TestFailParseFromYamlLookerGetMeasures(t *testing.T) {
ctx, err := testutils.ContextWithNewLogger()
if err != nil {
t.Fatalf("unexpected error: %s", err)
}
tcs := []struct {
desc string
in string
err string
}{
{
desc: "Invalid method",
in: `
tools:
example_tool:
kind: looker-get-measures
source: my-instance
method: GOT
description: some description
`,
err: "unable to parse tool \"example_tool\" as kind \"looker-get-measures\": [4:1] unknown field \"method\"\n 1 | authRequired: []\n 2 | description: some description\n 3 | kind: looker-get-measures\n> 4 | method: GOT\n ^\n 5 | source: my-instance",
},
}
for _, tc := range tcs {
t.Run(tc.desc, func(t *testing.T) {
got := struct {
Tools server.ToolConfigs `yaml:"tools"`
}{}
// Parse contents
err := yaml.UnmarshalContext(ctx, testutils.FormatYaml(tc.in), &got)
if err == nil {
t.Fatalf("expect parsing to fail")
}
errStr := err.Error()
if !strings.Contains(errStr, tc.err) {
t.Fatalf("unexpected error string: got %q, want substring %q", errStr, tc.err)
}
})
}
}

View File

@@ -0,0 +1,162 @@
// Copyright 2025 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package lookergetmodels
import (
"context"
"fmt"
yaml "github.com/goccy/go-yaml"
"github.com/googleapis/genai-toolbox/internal/sources"
lookersrc "github.com/googleapis/genai-toolbox/internal/sources/looker"
"github.com/googleapis/genai-toolbox/internal/tools"
"github.com/googleapis/genai-toolbox/internal/util"
"github.com/looker-open-source/sdk-codegen/go/rtl"
v4 "github.com/looker-open-source/sdk-codegen/go/sdk/v4"
)
const kind string = "looker-get-models"
func init() {
if !tools.Register(kind, newConfig) {
panic(fmt.Sprintf("tool kind %q already registered", kind))
}
}
func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.ToolConfig, error) {
actual := Config{Name: name}
if err := decoder.DecodeContext(ctx, &actual); err != nil {
return nil, err
}
return actual, nil
}
type Config struct {
Name string `yaml:"name" validate:"required"`
Kind string `yaml:"kind" validate:"required"`
Source string `yaml:"source" validate:"required"`
Description string `yaml:"description" validate:"required"`
AuthRequired []string `yaml:"authRequired"`
}
// validate interface
var _ tools.ToolConfig = Config{}
func (cfg Config) ToolConfigKind() string {
return kind
}
func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error) {
// verify source exists
rawS, ok := srcs[cfg.Source]
if !ok {
return nil, fmt.Errorf("no source named %q configured", cfg.Source)
}
// verify the source is compatible
s, ok := rawS.(*lookersrc.Source)
if !ok {
return nil, fmt.Errorf("invalid source for %q tool: source kind must be `looker`", kind)
}
parameters := tools.Parameters{}
mcpManifest := tools.McpManifest{
Name: cfg.Name,
Description: cfg.Description,
InputSchema: parameters.McpManifest(),
}
// finish tool setup
return Tool{
Name: cfg.Name,
Kind: kind,
Parameters: parameters,
AuthRequired: cfg.AuthRequired,
Client: s.Client,
ApiSettings: s.ApiSettings,
manifest: tools.Manifest{
Description: cfg.Description,
Parameters: parameters.Manifest(),
AuthRequired: cfg.AuthRequired,
},
mcpManifest: mcpManifest,
}, nil
}
// validate interface
var _ tools.Tool = Tool{}
type Tool struct {
Name string `yaml:"name"`
Kind string `yaml:"kind"`
Client *v4.LookerSDK
ApiSettings *rtl.ApiSettings
AuthRequired []string `yaml:"authRequired"`
Parameters tools.Parameters `yaml:"parameters"`
manifest tools.Manifest
mcpManifest tools.McpManifest
}
func (t Tool) Invoke(ctx context.Context, params tools.ParamValues) (any, error) {
logger, err := util.LoggerFromContext(ctx)
if err != nil {
return nil, fmt.Errorf("unable to get logger from ctx: %s", err)
}
excludeEmpty := false
excludeHidden := false
includeInternal := true
req := v4.RequestAllLookmlModels{
ExcludeEmpty: &excludeEmpty,
ExcludeHidden: &excludeHidden,
IncludeInternal: &includeInternal,
}
resp, err := t.Client.AllLookmlModels(req, t.ApiSettings)
if err != nil {
return nil, fmt.Errorf("error making get_models request: %s", err)
}
var data []any
for _, v := range resp {
logger.DebugContext(ctx, "Got response element of %v\n", v)
vMap := make(map[string]any)
vMap["label"] = *v.Label
vMap["name"] = *v.Name
vMap["project_name"] = *v.ProjectName
logger.DebugContext(ctx, "Converted to %v\n", vMap)
data = append(data, vMap)
}
logger.DebugContext(ctx, "data = ", data)
return data, nil
}
func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (tools.ParamValues, error) {
return tools.ParamValues{}, nil
}
func (t Tool) Manifest() tools.Manifest {
return t.manifest
}
func (t Tool) McpManifest() tools.McpManifest {
return t.mcpManifest
}
func (t Tool) Authorized(verifiedAuthServices []string) bool {
return true
}
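Note that the request above hard-codes its flags: empty and hidden models are kept and internal models are included. The SDK request struct takes *bool, so the flags travel as pointers; a small sketch of that pattern using a stand-in struct (allModelsFlags and boolPtr are illustrative names only):

package main

import "fmt"

// allModelsFlags is a stand-in for v4.RequestAllLookmlModels; pointer fields
// let the SDK distinguish "unset" from an explicit false.
type allModelsFlags struct {
	ExcludeEmpty, ExcludeHidden, IncludeInternal *bool
}

func boolPtr(b bool) *bool { return &b }

func main() {
	req := allModelsFlags{
		ExcludeEmpty:    boolPtr(false),
		ExcludeHidden:   boolPtr(false),
		IncludeInternal: boolPtr(true),
	}
	fmt.Println(*req.ExcludeEmpty, *req.ExcludeHidden, *req.IncludeInternal) // false false true
}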

View File

@@ -0,0 +1,116 @@
// Copyright 2025 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package lookergetmodels_test
import (
"strings"
"testing"
yaml "github.com/goccy/go-yaml"
"github.com/google/go-cmp/cmp"
"github.com/googleapis/genai-toolbox/internal/server"
"github.com/googleapis/genai-toolbox/internal/testutils"
lkr "github.com/googleapis/genai-toolbox/internal/tools/looker/lookergetmodels"
)
func TestParseFromYamlLookerGetModels(t *testing.T) {
ctx, err := testutils.ContextWithNewLogger()
if err != nil {
t.Fatalf("unexpected error: %s", err)
}
tcs := []struct {
desc string
in string
want server.ToolConfigs
}{
{
desc: "basic example",
in: `
tools:
example_tool:
kind: looker-get-models
source: my-instance
description: some description
`,
want: server.ToolConfigs{
"example_tool": lkr.Config{
Name: "example_tool",
Kind: "looker-get-models",
Source: "my-instance",
Description: "some description",
AuthRequired: []string{},
},
},
},
}
for _, tc := range tcs {
t.Run(tc.desc, func(t *testing.T) {
got := struct {
Tools server.ToolConfigs `yaml:"tools"`
}{}
// Parse contents
err := yaml.UnmarshalContext(ctx, testutils.FormatYaml(tc.in), &got)
if err != nil {
t.Fatalf("unable to unmarshal: %s", err)
}
if diff := cmp.Diff(tc.want, got.Tools); diff != "" {
t.Fatalf("incorrect parse: diff %v", diff)
}
})
}
}
func TestFailParseFromYamlLookerGetModels(t *testing.T) {
ctx, err := testutils.ContextWithNewLogger()
if err != nil {
t.Fatalf("unexpected error: %s", err)
}
tcs := []struct {
desc string
in string
err string
}{
{
desc: "Invalid method",
in: `
tools:
example_tool:
kind: looker-get-models
source: my-instance
method: GOT
description: some description
`,
err: "unable to parse tool \"example_tool\" as kind \"looker-get-models\": [4:1] unknown field \"method\"\n 1 | authRequired: []\n 2 | description: some description\n 3 | kind: looker-get-models\n> 4 | method: GOT\n ^\n 5 | source: my-instance",
},
}
for _, tc := range tcs {
t.Run(tc.desc, func(t *testing.T) {
got := struct {
Tools server.ToolConfigs `yaml:"tools"`
}{}
// Parse contents
err := yaml.UnmarshalContext(ctx, testutils.FormatYaml(tc.in), &got)
if err == nil {
t.Fatalf("expect parsing to fail")
}
errStr := err.Error()
if !strings.Contains(errStr, tc.err) {
t.Fatalf("unexpected error string: got %q, want substring %q", errStr, tc.err)
}
})
}
}

View File

@@ -0,0 +1,179 @@
// Copyright 2025 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package lookergetparameters
import (
"context"
"fmt"
yaml "github.com/goccy/go-yaml"
"github.com/googleapis/genai-toolbox/internal/sources"
lookersrc "github.com/googleapis/genai-toolbox/internal/sources/looker"
"github.com/googleapis/genai-toolbox/internal/tools"
"github.com/googleapis/genai-toolbox/internal/util"
"github.com/looker-open-source/sdk-codegen/go/rtl"
v4 "github.com/looker-open-source/sdk-codegen/go/sdk/v4"
)
const kind string = "looker-get-parameters"
func init() {
if !tools.Register(kind, newConfig) {
panic(fmt.Sprintf("tool kind %q already registered", kind))
}
}
func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.ToolConfig, error) {
actual := Config{Name: name}
if err := decoder.DecodeContext(ctx, &actual); err != nil {
return nil, err
}
return actual, nil
}
type Config struct {
Name string `yaml:"name" validate:"required"`
Kind string `yaml:"kind" validate:"required"`
Source string `yaml:"source" validate:"required"`
Description string `yaml:"description" validate:"required"`
AuthRequired []string `yaml:"authRequired"`
}
// validate interface
var _ tools.ToolConfig = Config{}
func (cfg Config) ToolConfigKind() string {
return kind
}
func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error) {
// verify source exists
rawS, ok := srcs[cfg.Source]
if !ok {
return nil, fmt.Errorf("no source named %q configured", cfg.Source)
}
// verify the source is compatible
s, ok := rawS.(*lookersrc.Source)
if !ok {
return nil, fmt.Errorf("invalid source for %q tool: source kind must be `looker`", kind)
}
modelParameter := tools.NewStringParameter("model", "The model containing the explore.")
exploreParameter := tools.NewStringParameter("explore", "The explore containing the parameters.")
parameters := tools.Parameters{modelParameter, exploreParameter}
mcpManifest := tools.McpManifest{
Name: cfg.Name,
Description: cfg.Description,
InputSchema: parameters.McpManifest(),
}
// finish tool setup
return Tool{
Name: cfg.Name,
Kind: kind,
Parameters: parameters,
AuthRequired: cfg.AuthRequired,
Client: s.Client,
ApiSettings: s.ApiSettings,
manifest: tools.Manifest{
Description: cfg.Description,
Parameters: parameters.Manifest(),
AuthRequired: cfg.AuthRequired,
},
mcpManifest: mcpManifest,
}, nil
}
// validate interface
var _ tools.Tool = Tool{}
type Tool struct {
Name string `yaml:"name"`
Kind string `yaml:"kind"`
Client *v4.LookerSDK
ApiSettings *rtl.ApiSettings
AuthRequired []string `yaml:"authRequired"`
Parameters tools.Parameters `yaml:"parameters"`
manifest tools.Manifest
mcpManifest tools.McpManifest
}
func (t Tool) Invoke(ctx context.Context, params tools.ParamValues) (any, error) {
logger, err := util.LoggerFromContext(ctx)
if err != nil {
return nil, fmt.Errorf("unable to get logger from ctx: %s", err)
}
mapParams := params.AsMap()
model, ok := mapParams["model"].(string)
if !ok {
return nil, fmt.Errorf("'model' must be a string, got %T", mapParams["model"])
}
explore, ok := mapParams["explore"].(string)
if !ok {
return nil, fmt.Errorf("'explore' must be a string, got %T", mapParams["explore"])
}
fields := "fields(parameters(name,type,label,label_short))"
req := v4.RequestLookmlModelExplore{
LookmlModelName: model,
ExploreName: explore,
Fields: &fields,
}
resp, err := t.Client.LookmlModelExplore(req, t.ApiSettings)
if err != nil {
return nil, fmt.Errorf("error making get_parameters request: %s", err)
}
var data []any
if resp.Fields == nil || resp.Fields.Parameters == nil {
return data, nil
}
for _, v := range *resp.Fields.Parameters {
logger.DebugContext(ctx, "Got response element of %v\n", v)
vMap := make(map[string]any)
if v.Name != nil {
vMap["name"] = *v.Name
}
if v.Type != nil {
vMap["type"] = *v.Type
}
if v.Label != nil {
vMap["label"] = *v.Label
}
if v.LabelShort != nil {
vMap["label_short"] = *v.LabelShort
}
logger.DebugContext(ctx, "Converted to %v\n", vMap)
data = append(data, vMap)
}
logger.DebugContext(ctx, "data = ", data)
return data, nil
}
func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (tools.ParamValues, error) {
return tools.ParseParams(t.Parameters, data, claims)
}
func (t Tool) Manifest() tools.Manifest {
return t.manifest
}
func (t Tool) McpManifest() tools.McpManifest {
return t.mcpManifest
}
func (t Tool) Authorized(verifiedAuthServices []string) bool {
return true
}
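The fields selector above, "fields(parameters(name,type,label,label_short))", trims the explore payload down to just the parameter metadata this tool returns. A hypothetical helper that assembles such a selector, shown only to make the string's structure explicit (fieldsSelector is not part of the tool):

package main

import (
	"fmt"
	"strings"
)

// fieldsSelector builds a Looker "fields()" selector like the one used above.
func fieldsSelector(section string, cols ...string) string {
	return fmt.Sprintf("fields(%s(%s))", section, strings.Join(cols, ","))
}

func main() {
	fmt.Println(fieldsSelector("parameters", "name", "type", "label", "label_short"))
	// fields(parameters(name,type,label,label_short))
}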

View File

@@ -0,0 +1,116 @@
// Copyright 2025 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package lookergetparameters_test
import (
"strings"
"testing"
yaml "github.com/goccy/go-yaml"
"github.com/google/go-cmp/cmp"
"github.com/googleapis/genai-toolbox/internal/server"
"github.com/googleapis/genai-toolbox/internal/testutils"
lkr "github.com/googleapis/genai-toolbox/internal/tools/looker/lookergetparameters"
)
func TestParseFromYamlLookerGetParameters(t *testing.T) {
ctx, err := testutils.ContextWithNewLogger()
if err != nil {
t.Fatalf("unexpected error: %s", err)
}
tcs := []struct {
desc string
in string
want server.ToolConfigs
}{
{
desc: "basic example",
in: `
tools:
example_tool:
kind: looker-get-parameters
source: my-instance
description: some description
`,
want: server.ToolConfigs{
"example_tool": lkr.Config{
Name: "example_tool",
Kind: "looker-get-parameters",
Source: "my-instance",
Description: "some description",
AuthRequired: []string{},
},
},
},
}
for _, tc := range tcs {
t.Run(tc.desc, func(t *testing.T) {
got := struct {
Tools server.ToolConfigs `yaml:"tools"`
}{}
// Parse contents
err := yaml.UnmarshalContext(ctx, testutils.FormatYaml(tc.in), &got)
if err != nil {
t.Fatalf("unable to unmarshal: %s", err)
}
if diff := cmp.Diff(tc.want, got.Tools); diff != "" {
t.Fatalf("incorrect parse: diff %v", diff)
}
})
}
}
func TestFailParseFromYamlLookerGetParameters(t *testing.T) {
ctx, err := testutils.ContextWithNewLogger()
if err != nil {
t.Fatalf("unexpected error: %s", err)
}
tcs := []struct {
desc string
in string
err string
}{
{
desc: "Invalid method",
in: `
tools:
example_tool:
kind: looker-get-parameters
source: my-instance
method: GOT
description: some description
`,
err: "unable to parse tool \"example_tool\" as kind \"looker-get-parameters\": [4:1] unknown field \"method\"\n 1 | authRequired: []\n 2 | description: some description\n 3 | kind: looker-get-parameters\n> 4 | method: GOT\n ^\n 5 | source: my-instance",
},
}
for _, tc := range tcs {
t.Run(tc.desc, func(t *testing.T) {
got := struct {
Tools server.ToolConfigs `yaml:"tools"`
}{}
// Parse contents
err := yaml.UnmarshalContext(ctx, testutils.FormatYaml(tc.in), &got)
if err == nil {
t.Fatalf("expect parsing to fail")
}
errStr := err.Error()
if !strings.Contains(errStr, tc.err) {
t.Fatalf("unexpected error string: got %q, want substring %q", errStr, tc.err)
}
})
}
}

View File

@@ -0,0 +1,239 @@
// Copyright 2025 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package lookerquery
import (
"context"
"encoding/json"
"fmt"
yaml "github.com/goccy/go-yaml"
"github.com/googleapis/genai-toolbox/internal/sources"
lookersrc "github.com/googleapis/genai-toolbox/internal/sources/looker"
"github.com/googleapis/genai-toolbox/internal/tools"
"github.com/googleapis/genai-toolbox/internal/util"
"github.com/looker-open-source/sdk-codegen/go/rtl"
v4 "github.com/looker-open-source/sdk-codegen/go/sdk/v4"
"github.com/thlib/go-timezone-local/tzlocal"
)
const kind string = "looker-query"
func init() {
if !tools.Register(kind, newConfig) {
panic(fmt.Sprintf("tool kind %q already registered", kind))
}
}
func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.ToolConfig, error) {
actual := Config{Name: name}
if err := decoder.DecodeContext(ctx, &actual); err != nil {
return nil, err
}
return actual, nil
}
type Config struct {
Name string `yaml:"name" validate:"required"`
Kind string `yaml:"kind" validate:"required"`
Source string `yaml:"source" validate:"required"`
Description string `yaml:"description" validate:"required"`
AuthRequired []string `yaml:"authRequired"`
}
// validate interface
var _ tools.ToolConfig = Config{}
func (cfg Config) ToolConfigKind() string {
return kind
}
func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error) {
// verify source exists
rawS, ok := srcs[cfg.Source]
if !ok {
return nil, fmt.Errorf("no source named %q configured", cfg.Source)
}
// verify the source is compatible
s, ok := rawS.(*lookersrc.Source)
if !ok {
return nil, fmt.Errorf("invalid source for %q tool: source kind must be `looker`", kind)
}
modelParameter := tools.NewStringParameter("model", "The model containing the explore.")
exploreParameter := tools.NewStringParameter("explore", "The explore to be queried.")
fieldsParameter := tools.NewArrayParameter("fields",
"The fields to be retrieved.",
tools.NewStringParameter("field", "A field to be returned in the query"),
)
filtersParameter := tools.NewMapParameterWithDefault("filters",
map[string]any{},
"The filters for the query",
"",
)
pivotsParameter := tools.NewArrayParameterWithDefault("pivots",
[]any{},
"The query pivots (must be included in fields as well).",
tools.NewStringParameter("pivot_field", "A field to be used as a pivot in the query"),
)
sortsParameter := tools.NewArrayParameterWithDefault("sorts",
[]any{},
"The sorts like \"field.id desc 0\".",
tools.NewStringParameter("sort_field", "A field to be used as a sort in the query"),
)
limitParameter := tools.NewIntParameterWithDefault("limit", 500, "The row limit.")
tzParameter := tools.NewStringParameterWithRequired("tz", "The query timezone.", false)
parameters := tools.Parameters{
modelParameter,
exploreParameter,
fieldsParameter,
filtersParameter,
pivotsParameter,
sortsParameter,
limitParameter,
tzParameter,
}
mcpManifest := tools.McpManifest{
Name: cfg.Name,
Description: cfg.Description,
InputSchema: parameters.McpManifest(),
}
// finish tool setup
return Tool{
Name: cfg.Name,
Kind: kind,
Parameters: parameters,
AuthRequired: cfg.AuthRequired,
Client: s.Client,
ApiSettings: s.ApiSettings,
manifest: tools.Manifest{
Description: cfg.Description,
Parameters: parameters.Manifest(),
AuthRequired: cfg.AuthRequired,
},
mcpManifest: mcpManifest,
}, nil
}
// validate interface
var _ tools.Tool = Tool{}
type Tool struct {
Name string `yaml:"name"`
Kind string `yaml:"kind"`
Client *v4.LookerSDK
ApiSettings *rtl.ApiSettings
AuthRequired []string `yaml:"authRequired"`
Parameters tools.Parameters `yaml:"parameters"`
manifest tools.Manifest
mcpManifest tools.McpManifest
}
func (t Tool) Invoke(ctx context.Context, params tools.ParamValues) (any, error) {
logger, err := util.LoggerFromContext(ctx)
if err != nil {
return nil, fmt.Errorf("unable to get logger from ctx: %s", err)
}
logger.DebugContext(ctx, "params = ", params)
paramsMap := params.AsMap()
f, err := tools.ConvertAnySliceToTyped(paramsMap["fields"].([]any), "string")
if err != nil {
return nil, fmt.Errorf("can't convert fields to array of strings: %s", err)
}
fields := f.([]string)
filters := paramsMap["filters"].(map[string]any)
// Filter keys sometimes arrive single-quoted, e.g. "'field.id'": "expression",
// so strip the surrounding quotes before sending them to Looker.
for k, v := range filters {
if len(k) > 1 && k[0] == '\'' && k[len(k)-1] == '\'' {
delete(filters, k)
filters[k[1:len(k)-1]] = v
}
}
p, err := tools.ConvertAnySliceToTyped(paramsMap["pivots"].([]any), "string")
if err != nil {
return nil, fmt.Errorf("can't convert pivots to array of strings: %s", err)
}
pivots := p.([]string)
s, err := tools.ConvertAnySliceToTyped(paramsMap["sorts"].([]any), "string")
if err != nil {
return nil, fmt.Errorf("can't convert sorts to array of strings: %s", err)
}
sorts := s.([]string)
limit := int64(paramsMap["limit"].(int))
var tz string
if paramsMap["tz"] != nil {
tz = paramsMap["tz"].(string)
} else {
tzname, err := tzlocal.RuntimeTZ()
if err != nil {
logger.ErrorContext(ctx, fmt.Sprintf("Error getting local timezone: %s", err))
tzname = "Etc/UTC"
}
tz = tzname
}
wq := v4.WriteQuery{
Model: paramsMap["model"].(string),
View: paramsMap["explore"].(string),
Fields: &fields,
Pivots: &pivots,
Filters: &filters,
Sorts: &sorts,
QueryTimezone: &tz,
}
req := v4.RequestRunInlineQuery{
ResultFormat: "json",
Limit: &limit,
Body: wq,
}
resp, err := t.Client.RunInlineQuery(req, t.ApiSettings)
if err != nil {
return nil, fmt.Errorf("error making query request: %s", err)
}
logger.DebugContext(ctx, "resp = ", resp)
var data []any
e := json.Unmarshal([]byte(resp), &data)
if e != nil {
return nil, fmt.Errorf("error unmarshaling query response: %s", e)
}
logger.DebugContext(ctx, "data = ", data)
return data, nil
}
func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (tools.ParamValues, error) {
return tools.ParseParams(t.Parameters, data, claims)
}
func (t Tool) Manifest() tools.Manifest {
return t.manifest
}
func (t Tool) McpManifest() tools.McpManifest {
return t.mcpManifest
}
func (t Tool) Authorized(verifiedAuthServices []string) bool {
return true
}
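The filter normalization above exists because model-generated filter maps sometimes quote their keys. A standalone sketch of that normalization (stripQuotedKeys is an illustrative name, not part of the tool):

package main

import "fmt"

// stripQuotedKeys removes surrounding single quotes from filter keys,
// e.g. "'orders.status'" becomes "orders.status", as done in Invoke above.
func stripQuotedKeys(filters map[string]any) {
	for k, v := range filters {
		if len(k) > 1 && k[0] == '\'' && k[len(k)-1] == '\'' {
			delete(filters, k)
			filters[k[1:len(k)-1]] = v
		}
	}
}

func main() {
	f := map[string]any{"'orders.status'": "complete", "orders.total": ">100"}
	stripQuotedKeys(f)
	fmt.Println(f) // map[orders.status:complete orders.total:>100]
}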

View File

@@ -0,0 +1,116 @@
// Copyright 2025 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package lookerquery_test
import (
"strings"
"testing"
yaml "github.com/goccy/go-yaml"
"github.com/google/go-cmp/cmp"
"github.com/googleapis/genai-toolbox/internal/server"
"github.com/googleapis/genai-toolbox/internal/testutils"
lkr "github.com/googleapis/genai-toolbox/internal/tools/looker/lookerquery"
)
func TestParseFromYamlLookerQuery(t *testing.T) {
ctx, err := testutils.ContextWithNewLogger()
if err != nil {
t.Fatalf("unexpected error: %s", err)
}
tcs := []struct {
desc string
in string
want server.ToolConfigs
}{
{
desc: "basic example",
in: `
tools:
example_tool:
kind: looker-query
source: my-instance
description: some description
`,
want: server.ToolConfigs{
"example_tool": lkr.Config{
Name: "example_tool",
Kind: "looker-query",
Source: "my-instance",
Description: "some description",
AuthRequired: []string{},
},
},
},
}
for _, tc := range tcs {
t.Run(tc.desc, func(t *testing.T) {
got := struct {
Tools server.ToolConfigs `yaml:"tools"`
}{}
// Parse contents
err := yaml.UnmarshalContext(ctx, testutils.FormatYaml(tc.in), &got)
if err != nil {
t.Fatalf("unable to unmarshal: %s", err)
}
if diff := cmp.Diff(tc.want, got.Tools); diff != "" {
t.Fatalf("incorrect parse: diff %v", diff)
}
})
}
}
func TestFailParseFromYamlLookerQuery(t *testing.T) {
ctx, err := testutils.ContextWithNewLogger()
if err != nil {
t.Fatalf("unexpected error: %s", err)
}
tcs := []struct {
desc string
in string
err string
}{
{
desc: "Invalid method",
in: `
tools:
example_tool:
kind: looker-query
source: my-instance
method: GOT
description: some description
`,
err: "unable to parse tool \"example_tool\" as kind \"looker-query\": [4:1] unknown field \"method\"\n 1 | authRequired: []\n 2 | description: some description\n 3 | kind: looker-query\n> 4 | method: GOT\n ^\n 5 | source: my-instance",
},
}
for _, tc := range tcs {
t.Run(tc.desc, func(t *testing.T) {
got := struct {
Tools server.ToolConfigs `yaml:"tools"`
}{}
// Parse contents
err := yaml.UnmarshalContext(ctx, testutils.FormatYaml(tc.in), &got)
if err == nil {
t.Fatalf("expect parsing to fail")
}
errStr := err.Error()
if !strings.Contains(errStr, tc.err) {
t.Fatalf("unexpected error string: got %q, want substring %q", errStr, tc.err)
}
})
}
}

View File

@@ -0,0 +1,232 @@
// Copyright 2025 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package lookerquerysql
import (
"context"
"fmt"
yaml "github.com/goccy/go-yaml"
"github.com/googleapis/genai-toolbox/internal/sources"
lookersrc "github.com/googleapis/genai-toolbox/internal/sources/looker"
"github.com/googleapis/genai-toolbox/internal/tools"
"github.com/googleapis/genai-toolbox/internal/util"
"github.com/looker-open-source/sdk-codegen/go/rtl"
v4 "github.com/looker-open-source/sdk-codegen/go/sdk/v4"
"github.com/thlib/go-timezone-local/tzlocal"
)
const kind string = "looker-query-sql"
func init() {
if !tools.Register(kind, newConfig) {
panic(fmt.Sprintf("tool kind %q already registered", kind))
}
}
func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.ToolConfig, error) {
actual := Config{Name: name}
if err := decoder.DecodeContext(ctx, &actual); err != nil {
return nil, err
}
return actual, nil
}
type Config struct {
Name string `yaml:"name" validate:"required"`
Kind string `yaml:"kind" validate:"required"`
Source string `yaml:"source" validate:"required"`
Description string `yaml:"description" validate:"required"`
AuthRequired []string `yaml:"authRequired"`
}
// validate interface
var _ tools.ToolConfig = Config{}
func (cfg Config) ToolConfigKind() string {
return kind
}
func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error) {
// verify source exists
rawS, ok := srcs[cfg.Source]
if !ok {
return nil, fmt.Errorf("no source named %q configured", cfg.Source)
}
// verify the source is compatible
s, ok := rawS.(*lookersrc.Source)
if !ok {
return nil, fmt.Errorf("invalid source for %q tool: source kind must be `looker`", kind)
}
modelParameter := tools.NewStringParameter("model", "The model containing the explore.")
exploreParameter := tools.NewStringParameter("explore", "The explore to be queried.")
fieldsParameter := tools.NewArrayParameterWithDefault("fields",
[]any{},
"The fields to be retrieved.",
tools.NewStringParameter("field", "A field to be returned in the query"),
)
filtersParameter := tools.NewMapParameterWithDefault("filters",
map[string]any{},
"The filters for the query",
"",
)
pivotsParameter := tools.NewArrayParameterWithDefault("pivots",
[]any{},
"The query pivots (must be included in fields as well).",
tools.NewStringParameter("pivot_field", "A field to be used as a pivot in the query"),
)
sortsParameter := tools.NewArrayParameterWithDefault("sorts",
[]any{},
"The sorts like \"field.id desc 0\".",
tools.NewStringParameter("sort_field", "A field to be used as a sort in the query"),
)
limitParameter := tools.NewIntParameterWithDefault("limit", 500, "The row limit.")
tzParameter := tools.NewStringParameterWithRequired("tz", "The query timezone.", false)
parameters := tools.Parameters{
modelParameter,
exploreParameter,
fieldsParameter,
filtersParameter,
pivotsParameter,
sortsParameter,
limitParameter,
tzParameter,
}
mcpManifest := tools.McpManifest{
Name: cfg.Name,
Description: cfg.Description,
InputSchema: parameters.McpManifest(),
}
// finish tool setup
return Tool{
Name: cfg.Name,
Kind: kind,
Parameters: parameters,
AuthRequired: cfg.AuthRequired,
Client: s.Client,
ApiSettings: s.ApiSettings,
manifest: tools.Manifest{
Description: cfg.Description,
Parameters: parameters.Manifest(),
AuthRequired: cfg.AuthRequired,
},
mcpManifest: mcpManifest,
}, nil
}
// validate interface
var _ tools.Tool = Tool{}
type Tool struct {
Name string `yaml:"name"`
Kind string `yaml:"kind"`
Client *v4.LookerSDK
ApiSettings *rtl.ApiSettings
AuthRequired []string `yaml:"authRequired"`
Parameters tools.Parameters `yaml:"parameters"`
manifest tools.Manifest
mcpManifest tools.McpManifest
}
func (t Tool) Invoke(ctx context.Context, params tools.ParamValues) (any, error) {
logger, err := util.LoggerFromContext(ctx)
if err != nil {
return nil, fmt.Errorf("unable to get logger from ctx: %s", err)
}
logger.DebugContext(ctx, "params = ", params)
paramsMap := params.AsMap()
f, err := tools.ConvertAnySliceToTyped(paramsMap["fields"].([]any), "string")
if err != nil {
return nil, fmt.Errorf("can't convert fields to array of strings: %s", err)
}
fields := f.([]string)
filters := paramsMap["filters"].(map[string]any)
// Filter keys sometimes arrive single-quoted, e.g. "'field.id'": "expression",
// so strip the surrounding quotes before sending them to Looker.
for k, v := range filters {
if len(k) > 1 && k[0] == '\'' && k[len(k)-1] == '\'' {
delete(filters, k)
filters[k[1:len(k)-1]] = v
}
}
p, err := tools.ConvertAnySliceToTyped(paramsMap["pivots"].([]any), "string")
if err != nil {
return nil, fmt.Errorf("can't convert pivots to array of strings: %s", err)
}
pivots := p.([]string)
s, err := tools.ConvertAnySliceToTyped(paramsMap["sorts"].([]any), "string")
if err != nil {
return nil, fmt.Errorf("can't convert sorts to array of strings: %s", err)
}
sorts := s.([]string)
limit := int64(paramsMap["limit"].(int))
var tz string
if paramsMap["tz"] != nil {
tz = paramsMap["tz"].(string)
} else {
tzname, err := tzlocal.RuntimeTZ()
if err != nil {
logger.ErrorContext(ctx, fmt.Sprintf("Error getting local timezone: %s", err))
tzname = "Etc/UTC"
}
tz = tzname
}
wq := v4.WriteQuery{
Model: paramsMap["model"].(string),
View: paramsMap["explore"].(string),
Fields: &fields,
Pivots: &pivots,
Filters: &filters,
Sorts: &sorts,
QueryTimezone: &tz,
}
req := v4.RequestRunInlineQuery{
ResultFormat: "sql",
Limit: &limit,
Body: wq,
}
resp, err := t.Client.RunInlineQuery(req, t.ApiSettings)
if err != nil {
return nil, fmt.Errorf("error making query_sql request: %s", err)
}
logger.DebugContext(ctx, "resp = ", resp)
return resp, nil
}
func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (tools.ParamValues, error) {
return tools.ParseParams(t.Parameters, data, claims)
}
func (t Tool) Manifest() tools.Manifest {
return t.manifest
}
func (t Tool) McpManifest() tools.McpManifest {
return t.mcpManifest
}
func (t Tool) Authorized(verifiedAuthServices []string) bool {
return true
}
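Both query tools resolve the timezone the same way: use the caller-supplied tz when present, otherwise the runtime's local zone, and finally fall back to Etc/UTC. A compact sketch of that fallback (resolveTZ is our name; it treats an empty string as unset, whereas the tools check for a missing parameter):

package main

import (
	"fmt"

	"github.com/thlib/go-timezone-local/tzlocal"
)

// resolveTZ prefers an explicit timezone, falls back to the runtime's local
// zone, and finally to Etc/UTC, mirroring the logic in Invoke above.
func resolveTZ(requested string) string {
	if requested != "" {
		return requested
	}
	if tz, err := tzlocal.RuntimeTZ(); err == nil {
		return tz
	}
	return "Etc/UTC"
}

func main() {
	fmt.Println(resolveTZ(""))
	fmt.Println(resolveTZ("America/New_York"))
}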

Some files were not shown because too many files have changed in this diff.