Mirror of https://github.com/googleapis/genai-toolbox.git (synced 2026-01-12 08:58:28 -05:00)

Compare commits: ci-tests...fix-python (23 commits)
Commits:

27caa1f421
2b80be1c6a
ac7967e9eb
bfa1fff494
98e3f6abe4
3cff915e22
5a6db196c2
038585216a
54799f30b5
d7d1b03f3b
047e60e144
898428759c
e09d182f88
38d535de34
ef28e39e90
a2006ad577
fae133930f
4620666b2e
937608db3e
69138dc801
0ed998bfed
57b49358b9
a8e98dc99d
DEVELOPER.md (13 changed lines)

@@ -156,6 +156,8 @@ go test -race -v ./...
 * **Internal Contributors:** Testing workflows should trigger automatically.
 * **External Contributors:** Request Toolbox maintainers to trigger the testing
   workflows on your PR.
+  * Maintainers can comment `/gcbrun` to execute the integration tests.
+  * Maintainers can add the label `tests:run` to execute the unit tests.

 #### Test Resources

@@ -183,6 +185,9 @@ variables for each source.
 * Couchbase - setup in the test project via the Marketplace
 * DGraph - using the public dgraph interface <https://play.dgraph.io> for
   testing
+* Looker
+  * The Cloud Build service account is a user for conversational analytics
+  * The Looker instance runs under google.com:looker-sandbox.
 * Memorystore Redis - setup in the test project using a Memorystore for Redis
   standalone instance
 * Memorystore Redis Cluster, Memorystore Valkey standalone, and Memorystore

@@ -330,10 +335,16 @@ for instructions on developing Toolbox SDKs.

 ### Team

-Team, `@googleapis/senseai-eco`, has been set as
+Team `@googleapis/senseai-eco` has been set as
 [CODEOWNERS](.github/CODEOWNERS). The GitHub TeamSync tool is used to create
 this team from MDB Group, `senseai-eco`.
+
+Team `@googleapis/toolbox-contributors` has write access to this repo. They
+can create branches and approve test runs. But they do not have the ability
+to approve PRs for main. TeamSync is used to create this team from the MDB
+Group `toolbox-contributors`. Googlers who are developing for MCP-Toolbox
+but aren't part of the core team should join this group.

 ### Releasing

 Toolbox has two types of releases: versioned and continuous. It uses Google
README.md (46 changed lines)

@@ -34,6 +34,7 @@ documentation](https://googleapis.github.io/genai-toolbox/).
 - [Installing the server](#installing-the-server)
 - [Running the server](#running-the-server)
 - [Integrating your application](#integrating-your-application)
+- [Using Toolbox with Gemini CLI Extensions](#using-toolbox-with-gemini-cli-extensions)
 - [Configuration](#configuration)
 - [Sources](#sources)
 - [Tools](#tools)

@@ -764,6 +765,51 @@ For more detailed instructions on using the Toolbox Core SDK, see the
 </blockquote>
 </details>

+### Using Toolbox with Gemini CLI Extensions
+
+[Gemini CLI extensions][gemini-cli-extensions] provide tools to interact
+directly with your data sources from the command line. Below is a list of Gemini CLI
+extensions that are built on top of **Toolbox**. They allow you to interact with
+your data sources through pre-defined or custom tools with natural language.
+Click each link for detailed instructions on its usage.
+
+To use **custom** tools with Gemini CLI:
+
+- [MCP Toolbox](https://github.com/gemini-cli-extensions/mcp-toolbox)
+
+To use [prebuilt tools][prebuilt] with Gemini CLI:
+
+- [AlloyDB for PostgreSQL](https://github.com/gemini-cli-extensions/alloydb)
+- [AlloyDB for PostgreSQL Observability](https://github.com/gemini-cli-extensions/alloydb-observability)
+- [BigQuery Data Analytics](https://github.com/gemini-cli-extensions/bigquery-data-analytics)
+- [BigQuery Conversational Analytics](https://github.com/gemini-cli-extensions/bigquery-conversational-analytics)
+- [Cloud SQL for MySQL](https://github.com/gemini-cli-extensions/cloud-sql-mysql)
+- [Cloud SQL for MySQL Observability](https://github.com/gemini-cli-extensions/cloud-sql-mysql-observability)
+- [Cloud SQL for PostgreSQL](https://github.com/gemini-cli-extensions/cloud-sql-postgresql)
+- [Cloud SQL for PostgreSQL Observability](https://github.com/gemini-cli-extensions/cloud-sql-postgresql-observability)
+- [Cloud SQL for SQL Server](https://github.com/gemini-cli-extensions/cloud-sql-sqlserver)
+- [Cloud SQL for SQL Server Observability](https://github.com/gemini-cli-extensions/cloud-sql-sqlserver-observability)
+- [Looker](https://github.com/gemini-cli-extensions/looker)
+- [Dataplex](https://github.com/gemini-cli-extensions/dataplex)
+- [MySQL](https://github.com/gemini-cli-extensions/mysql)
+- [PostgreSQL](https://github.com/gemini-cli-extensions/postgres)
+- [Spanner](https://github.com/gemini-cli-extensions/spanner)
+- [Firestore](https://github.com/gemini-cli-extensions/firestore-native)
+- [SQL Server](https://github.com/gemini-cli-extensions/sql-server)
+
+[prebuilt]: https://googleapis.github.io/genai-toolbox/reference/prebuilt-tools/
+[gemini-cli-extensions]: https://github.com/google-gemini/gemini-cli/blob/main/docs/extensions/index.md
+
 ## Configuration

 The primary way to configure Toolbox is through the `tools.yaml` file. If you
@@ -102,6 +102,11 @@ import (
     _ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookerconversationalanalytics"
     _ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookerdeleteprojectfile"
     _ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookerdevmode"
+    _ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookergetconnectiondatabases"
+    _ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookergetconnections"
+    _ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookergetconnectionschemas"
+    _ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookergetconnectiontables"
+    _ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookergetconnectiontablecolumns"
     _ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookergetdashboards"
     _ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookergetdimensions"
     _ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookergetexplores"

@@ -1507,7 +1507,7 @@ func TestPrebuiltTools(t *testing.T) {
 wantToolset: server.ToolsetConfigs{
     "looker_tools": tools.ToolsetConfig{
         Name: "looker_tools",
-        ToolNames: []string{"get_models", "get_explores", "get_dimensions", "get_measures", "get_filters", "get_parameters", "query", "query_sql", "query_url", "get_looks", "run_look", "make_look", "get_dashboards", "make_dashboard", "add_dashboard_element", "health_pulse", "health_analyze", "health_vacuum", "dev_mode", "get_projects", "get_project_files", "get_project_file", "create_project_file", "update_project_file", "delete_project_file"},
+        ToolNames: []string{"get_models", "get_explores", "get_dimensions", "get_measures", "get_filters", "get_parameters", "query", "query_sql", "query_url", "get_looks", "run_look", "make_look", "get_dashboards", "make_dashboard", "add_dashboard_element", "health_pulse", "health_analyze", "health_vacuum", "dev_mode", "get_projects", "get_project_files", "get_project_file", "create_project_file", "update_project_file", "delete_project_file", "get_connections", "get_connection_schemas", "get_connection_databases", "get_connection_tables", "get_connection_table_columns"},
     },
 },
 },
@@ -4,7 +4,7 @@ go 1.24.6

 require (
     github.com/googleapis/mcp-toolbox-sdk-go v0.3.0
-    google.golang.org/genai v1.31.0
+    google.golang.org/genai v1.33.0
 )

 require (

@@ -102,8 +102,8 @@ golang.org/x/time v0.12.0 h1:ScB/8o8olJvc+CQPWrK3fPZNfh7qgwCrY0zJmoEQLSE=
 golang.org/x/time v0.12.0/go.mod h1:CDIdPxbZBQxdj6cxyCIdrNogrJKMJ7pr37NYpMcMDSg=
 google.golang.org/api v0.248.0 h1:hUotakSkcwGdYUqzCRc5yGYsg4wXxpkKlW5ryVqvC1Y=
 google.golang.org/api v0.248.0/go.mod h1:yAFUAF56Li7IuIQbTFoLwXTCI6XCFKueOlS7S9e4F9k=
-google.golang.org/genai v1.31.0 h1:R7xDt/Dosz11vcXbZ4IgisGnzUGGau2PZOIOAnXsYjw=
-google.golang.org/genai v1.31.0/go.mod h1:7pAilaICJlQBonjKKJNhftDFv3SREhZcTe9F6nRcjbg=
+google.golang.org/genai v1.33.0 h1:DExzJZbSbxSRmwX2gCsZ+V9vb6rjdmsOAy47ASBgKvg=
+google.golang.org/genai v1.33.0/go.mod h1:7pAilaICJlQBonjKKJNhftDFv3SREhZcTe9F6nRcjbg=
 google.golang.org/genproto v0.0.0-20250603155806-513f23925822 h1:rHWScKit0gvAPuOnu87KpaYtjK5zBMLcULh7gxkCXu4=
 google.golang.org/genproto v0.0.0-20250603155806-513f23925822/go.mod h1:HubltRL7rMh0LfnQPkMH4NPDFEWp0jw3vixw7jEM53s=
 google.golang.org/genproto/googleapis/api v0.0.0-20250818200422-3122310a409c h1:AtEkQdl5b6zsybXcbz00j1LwNodDuH6hVifIaNqk7NQ=
@@ -4,7 +4,7 @@ go 1.24.6

 require (
     github.com/firebase/genkit/go v1.1.0
-    github.com/googleapis/mcp-toolbox-sdk-go v0.3.1-0.20251021051241-eb73e0c6c414
+    github.com/googleapis/mcp-toolbox-sdk-go v0.3.1-0.20251031124047-f1f6a9faa2a1
 )

 require (

@@ -39,15 +39,15 @@ require (
     go.opentelemetry.io/otel/metric v1.37.0 // indirect
     go.opentelemetry.io/otel/sdk v1.37.0 // indirect
     go.opentelemetry.io/otel/trace v1.37.0 // indirect
-    golang.org/x/crypto v0.42.0 // indirect
-    golang.org/x/net v0.44.0 // indirect
+    golang.org/x/crypto v0.43.0 // indirect
+    golang.org/x/net v0.46.0 // indirect
     golang.org/x/oauth2 v0.32.0 // indirect
-    golang.org/x/sys v0.36.0 // indirect
-    golang.org/x/text v0.29.0 // indirect
-    google.golang.org/api v0.252.0 // indirect
+    golang.org/x/sys v0.37.0 // indirect
+    golang.org/x/text v0.30.0 // indirect
+    google.golang.org/api v0.254.0 // indirect
     google.golang.org/genai v1.30.0 // indirect
-    google.golang.org/genproto/googleapis/rpc v0.0.0-20251002232023-7c0ddcbb5797 // indirect
-    google.golang.org/grpc v1.75.1 // indirect
+    google.golang.org/genproto/googleapis/rpc v0.0.0-20251022142026-3a174f9686a8 // indirect
+    google.golang.org/grpc v1.76.0 // indirect
     google.golang.org/protobuf v1.36.10 // indirect
     gopkg.in/yaml.v3 v3.0.1 // indirect
 )
@@ -14,8 +14,8 @@ cloud.google.com/go/monitoring v1.24.2 h1:5OTsoJ1dXYIiMiuL+sYscLc9BumrL3CarVLL7d
 cloud.google.com/go/monitoring v1.24.2/go.mod h1:x7yzPWcgDRnPEv3sI+jJGBkwl5qINf+6qY4eq0I9B4U=
 cloud.google.com/go/secretmanager v1.16.0 h1:19QT7ZsLJ8FSP1k+4esQvuCD7npMJml6hYzilxVyT+k=
 cloud.google.com/go/secretmanager v1.16.0/go.mod h1://C/e4I8D26SDTz1f3TQcddhcmiC3rMEl0S1Cakvs3Q=
-cloud.google.com/go/storage v1.57.0 h1:4g7NB7Ta7KetVbOMpCqy89C+Vg5VE8scqlSHUPm7Rds=
-cloud.google.com/go/storage v1.57.0/go.mod h1:329cwlpzALLgJuu8beyJ/uvQznDHpa2U5lGjWednkzg=
+cloud.google.com/go/storage v1.57.1 h1:gzao6odNJ7dR3XXYvAgPK+Iw4fVPPznEPPyNjbaVkq8=
+cloud.google.com/go/storage v1.57.1/go.mod h1:329cwlpzALLgJuu8beyJ/uvQznDHpa2U5lGjWednkzg=
 github.com/GoogleCloudPlatform/opentelemetry-operations-go/detectors/gcp v1.29.0 h1:UQUsRi8WTzhZntp5313l+CHIAT95ojUI2lpP/ExlZa4=
 github.com/GoogleCloudPlatform/opentelemetry-operations-go/detectors/gcp v1.29.0/go.mod h1:Cz6ft6Dkn3Et6l2v2a9/RpN7epQ1GtDlO6lj8bEcOvw=
 github.com/GoogleCloudPlatform/opentelemetry-operations-go/exporter/metric v0.53.0 h1:owcC2UnmsZycprQ5RfRgjydWhuoxg71LUfyiQdijZuM=

@@ -42,8 +42,8 @@ github.com/felixge/httpsnoop v1.0.4 h1:NFTV2Zj1bL4mc9sqWACXbQFVBBg2W3GPvqp8/ESS2
 github.com/felixge/httpsnoop v1.0.4/go.mod h1:m8KPJKqk1gH5J9DgRY2ASl2lWCfGKXixSwevea8zH2U=
 github.com/firebase/genkit/go v1.1.0 h1:SQqzQt19gEubvUUCFV98TARFAzD30zT3QhseF3oTKqo=
 github.com/firebase/genkit/go v1.1.0/go.mod h1:ru1cIuxG1s3HeUjhnadVveDJ1yhinj+j+uUh0f0pyxE=
-github.com/go-jose/go-jose/v4 v4.1.1 h1:JYhSgy4mXXzAdF3nUx3ygx347LRXJRrpgyU3adRmkAI=
-github.com/go-jose/go-jose/v4 v4.1.1/go.mod h1:BdsZGqgdO3b6tTc6LSE56wcDbMMLuPsw5d4ZD5f94kA=
+github.com/go-jose/go-jose/v4 v4.1.2 h1:TK/7NqRQZfgAh+Td8AlsrvtPoUyiHh0LqVvokh+1vHI=
+github.com/go-jose/go-jose/v4 v4.1.2/go.mod h1:22cg9HWM1pOlnRiY+9cQYJ9XHmya1bYW8OeDM6Ku6Oo=
 github.com/go-logr/logr v1.2.2/go.mod h1:jdQByPbusPIv2/zmleS9BjJVeZ6kBagPoEUsqbVz/1A=
 github.com/go-logr/logr v1.4.3 h1:CjnDlHq8ikf6E492q6eKboGOC0T8CDaOvkHCIg8idEI=
 github.com/go-logr/logr v1.4.3/go.mod h1:9T104GzyrTigFIr8wt5mBrctHMim0Nb2HLGrmQ40KvY=

@@ -65,8 +65,8 @@ github.com/googleapis/enterprise-certificate-proxy v0.3.6 h1:GW/XbdyBFQ8Qe+YAmFU
 github.com/googleapis/enterprise-certificate-proxy v0.3.6/go.mod h1:MkHOF77EYAE7qfSuSS9PU6g4Nt4e11cnsDUowfwewLA=
 github.com/googleapis/gax-go/v2 v2.15.0 h1:SyjDc1mGgZU5LncH8gimWo9lW1DtIfPibOG81vgd/bo=
 github.com/googleapis/gax-go/v2 v2.15.0/go.mod h1:zVVkkxAQHa1RQpg9z2AUCMnKhi0Qld9rcmyfL1OZhoc=
-github.com/googleapis/mcp-toolbox-sdk-go v0.3.1-0.20251021051241-eb73e0c6c414 h1:7eZIPHungb5tmLtlsuEdIfdi8ltZPIwHFmnZn6hhcwI=
-github.com/googleapis/mcp-toolbox-sdk-go v0.3.1-0.20251021051241-eb73e0c6c414/go.mod h1:kdrSVCmbJc0JcLw5MnEKR8WQdT8pTT3duj4B0lktghA=
+github.com/googleapis/mcp-toolbox-sdk-go v0.3.1-0.20251031124047-f1f6a9faa2a1 h1:2tNgqscblFtzD0VB1VKqV4vVMveISvepf53GsVkOaaQ=
+github.com/googleapis/mcp-toolbox-sdk-go v0.3.1-0.20251031124047-f1f6a9faa2a1/go.mod h1:ywzclXOfzsAb2Z3TJqNOi9dZej+VrF9QfuvFOOHjpX8=
 github.com/gorilla/websocket v1.5.3 h1:saDtZ6Pbx/0u+bgYQ3q96pZgCzfhKXGPqt7kZ72aNNg=
 github.com/gorilla/websocket v1.5.3/go.mod h1:YR8l580nyteQvAITg2hZ9XVh4b55+EU/adAjf1fMHhE=
 github.com/invopop/jsonschema v0.13.0 h1:KvpoAJWEjR3uD9Kbm2HWJmqsEaHt8lBUpd0qHcIi21E=

@@ -125,34 +125,34 @@ go.opentelemetry.io/otel/trace v1.37.0 h1:HLdcFNbRQBE2imdSEgm/kwqmQj1Or1l/7bW6mx
 go.opentelemetry.io/otel/trace v1.37.0/go.mod h1:TlgrlQ+PtQO5XFerSPUYG0JSgGyryXewPGyayAWSBS0=
 go.uber.org/goleak v1.3.0 h1:2K3zAYmnTNqV73imy9J1T3WC+gmCePx2hEGkimedGto=
 go.uber.org/goleak v1.3.0/go.mod h1:CoHD4mav9JJNrW/WLlf7HGZPjdw8EucARQHekz1X6bE=
-golang.org/x/crypto v0.42.0 h1:chiH31gIWm57EkTXpwnqf8qeuMUi0yekh6mT2AvFlqI=
-golang.org/x/crypto v0.42.0/go.mod h1:4+rDnOTJhQCx2q7/j6rAN5XDw8kPjeaXEUR2eL94ix8=
-golang.org/x/net v0.44.0 h1:evd8IRDyfNBMBTTY5XRF1vaZlD+EmWx6x8PkhR04H/I=
-golang.org/x/net v0.44.0/go.mod h1:ECOoLqd5U3Lhyeyo/QDCEVQ4sNgYsqvCZ722XogGieY=
+golang.org/x/crypto v0.43.0 h1:dduJYIi3A3KOfdGOHX8AVZ/jGiyPa3IbBozJ5kNuE04=
+golang.org/x/crypto v0.43.0/go.mod h1:BFbav4mRNlXJL4wNeejLpWxB7wMbc79PdRGhWKncxR0=
+golang.org/x/net v0.46.0 h1:giFlY12I07fugqwPuWJi68oOnpfqFnJIJzaIIm2JVV4=
+golang.org/x/net v0.46.0/go.mod h1:Q9BGdFy1y4nkUwiLvT5qtyhAnEHgnQ/zd8PfU6nc210=
 golang.org/x/oauth2 v0.32.0 h1:jsCblLleRMDrxMN29H3z/k1KliIvpLgCkE6R8FXXNgY=
 golang.org/x/oauth2 v0.32.0/go.mod h1:lzm5WQJQwKZ3nwavOZ3IS5Aulzxi68dUSgRHujetwEA=
 golang.org/x/sync v0.17.0 h1:l60nONMj9l5drqw6jlhIELNv9I0A4OFgRsG9k2oT9Ug=
 golang.org/x/sync v0.17.0/go.mod h1:9KTHXmSnoGruLpwFjVSX0lNNA75CykiMECbovNTZqGI=
-golang.org/x/sys v0.36.0 h1:KVRy2GtZBrk1cBYA7MKu5bEZFxQk4NIDV6RLVcC8o0k=
-golang.org/x/sys v0.36.0/go.mod h1:OgkHotnGiDImocRcuBABYBEXf8A9a87e/uXjp9XT3ks=
-golang.org/x/text v0.29.0 h1:1neNs90w9YzJ9BocxfsQNHKuAT4pkghyXc4nhZ6sJvk=
-golang.org/x/text v0.29.0/go.mod h1:7MhJOA9CD2qZyOKYazxdYMF85OwPdEr9jTtBpO7ydH4=
-golang.org/x/time v0.13.0 h1:eUlYslOIt32DgYD6utsuUeHs4d7AsEYLuIAdg7FlYgI=
-golang.org/x/time v0.13.0/go.mod h1:eL/Oa2bBBK0TkX57Fyni+NgnyQQN4LitPmob2Hjnqw4=
+golang.org/x/sys v0.37.0 h1:fdNQudmxPjkdUTPnLn5mdQv7Zwvbvpaxqs831goi9kQ=
+golang.org/x/sys v0.37.0/go.mod h1:OgkHotnGiDImocRcuBABYBEXf8A9a87e/uXjp9XT3ks=
+golang.org/x/text v0.30.0 h1:yznKA/E9zq54KzlzBEAWn1NXSQ8DIp/NYMy88xJjl4k=
+golang.org/x/text v0.30.0/go.mod h1:yDdHFIX9t+tORqspjENWgzaCVXgk0yYnYuSZ8UzzBVM=
+golang.org/x/time v0.14.0 h1:MRx4UaLrDotUKUdCIqzPC48t1Y9hANFKIRpNx+Te8PI=
+golang.org/x/time v0.14.0/go.mod h1:eL/Oa2bBBK0TkX57Fyni+NgnyQQN4LitPmob2Hjnqw4=
 gonum.org/v1/gonum v0.16.0 h1:5+ul4Swaf3ESvrOnidPp4GZbzf0mxVQpDCYUQE7OJfk=
 gonum.org/v1/gonum v0.16.0/go.mod h1:fef3am4MQ93R2HHpKnLk4/Tbh/s0+wqD5nfa6Pnwy4E=
-google.golang.org/api v0.252.0 h1:xfKJeAJaMwb8OC9fesr369rjciQ704AjU/psjkKURSI=
-google.golang.org/api v0.252.0/go.mod h1:dnHOv81x5RAmumZ7BWLShB/u7JZNeyalImxHmtTHxqw=
+google.golang.org/api v0.254.0 h1:jl3XrGj7lRjnlUvZAbAdhINTLbsg5dbjmR90+pTQvt4=
+google.golang.org/api v0.254.0/go.mod h1:5BkSURm3D9kAqjGvBNgf0EcbX6Rnrf6UArKkwBzAyqQ=
 google.golang.org/genai v1.30.0 h1:7021aneIvl24nEBLbtQFEWleHsMbjzpcQvkT4WcJ1dc=
 google.golang.org/genai v1.30.0/go.mod h1:7pAilaICJlQBonjKKJNhftDFv3SREhZcTe9F6nRcjbg=
 google.golang.org/genproto v0.0.0-20250603155806-513f23925822 h1:rHWScKit0gvAPuOnu87KpaYtjK5zBMLcULh7gxkCXu4=
 google.golang.org/genproto v0.0.0-20250603155806-513f23925822/go.mod h1:HubltRL7rMh0LfnQPkMH4NPDFEWp0jw3vixw7jEM53s=
 google.golang.org/genproto/googleapis/api v0.0.0-20250818200422-3122310a409c h1:AtEkQdl5b6zsybXcbz00j1LwNodDuH6hVifIaNqk7NQ=
 google.golang.org/genproto/googleapis/api v0.0.0-20250818200422-3122310a409c/go.mod h1:ea2MjsO70ssTfCjiwHgI0ZFqcw45Ksuk2ckf9G468GA=
-google.golang.org/genproto/googleapis/rpc v0.0.0-20251002232023-7c0ddcbb5797 h1:CirRxTOwnRWVLKzDNrs0CXAaVozJoR4G9xvdRecrdpk=
-google.golang.org/genproto/googleapis/rpc v0.0.0-20251002232023-7c0ddcbb5797/go.mod h1:HSkG/KdJWusxU1F6CNrwNDjBMgisKxGnc5dAZfT0mjQ=
-google.golang.org/grpc v1.75.1 h1:/ODCNEuf9VghjgO3rqLcfg8fiOP0nSluljWFlDxELLI=
-google.golang.org/grpc v1.75.1/go.mod h1:JtPAzKiq4v1xcAB2hydNlWI2RnF85XXcV0mhKXr2ecQ=
+google.golang.org/genproto/googleapis/rpc v0.0.0-20251022142026-3a174f9686a8 h1:M1rk8KBnUsBDg1oPGHNCxG4vc1f49epmTO7xscSajMk=
+google.golang.org/genproto/googleapis/rpc v0.0.0-20251022142026-3a174f9686a8/go.mod h1:7i2o+ce6H/6BluujYR+kqX3GKH+dChPTQU19wjRPiGk=
+google.golang.org/grpc v1.76.0 h1:UnVkv1+uMLYXoIz6o7chp59WfQUYA2ex/BXQ9rHZu7A=
+google.golang.org/grpc v1.76.0/go.mod h1:Ju12QI8M6iQJtbcsV+awF5a4hfJMLi4X0JLo94ULZ6c=
 google.golang.org/protobuf v1.36.10 h1:AYd7cD/uASjIL6Q9LiTjz8JLcrh/88q5UObnmY3aOOE=
 google.golang.org/protobuf v1.36.10/go.mod h1:HTf+CrKn2C3g5S8VImy6tdcUvCska2kB7j23XfzDpco=
 gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
@@ -4,9 +4,10 @@ go 1.24.6

 require (
     github.com/googleapis/mcp-toolbox-sdk-go v0.3.0
-    github.com/openai/openai-go v1.12.0
+    github.com/openai/openai-go/v3 v3.3.0
 )

 require (
     cloud.google.com/go/auth v0.16.5 // indirect
     cloud.google.com/go/auth/oauth2adapt v0.2.8 // indirect

@@ -1,3 +1,3 @@
-google-adk==1.15.0
+google-adk==1.15.1
 toolbox-core==0.5.2
 pytest==8.4.2

@@ -1,3 +1,3 @@
-google-genai==1.46.0
+google-genai==1.47.0
 toolbox-core==0.5.2
 pytest==8.4.2

@@ -1,4 +1,4 @@
 llama-index==0.14.6
-llama-index-llms-google-genai==0.6.2
+llama-index-llms-google-genai==0.7.1
 toolbox-llamaindex==0.5.2
 pytest==8.4.2
@@ -287,8 +287,15 @@ Your AI tool is now connected to Looker using MCP. Try asking your AI
 assistant to list models, explores, dimensions, and measures. Run a
 query, retrieve the SQL for a query, and run a saved Look.

+The full tool list is available in the [Prebuilt Tools Reference](../../reference/prebuilt-tools/#looker).
+
 The following tools are available to the LLM:

+### Looker Model and Query Tools
+
+These tools are used to get information about a Looker model
+and execute queries against that model.
+
 1. **get_models**: list the LookML models in Looker
 1. **get_explores**: list the explores in a given model
 1. **get_dimensions**: list the dimensions in a given explore

@@ -298,6 +305,12 @@ The following tools are available to the LLM:
 1. **query**: Run a query and return the data
 1. **query_sql**: Return the SQL generated by Looker for a query
 1. **query_url**: Return a link to the query in Looker for further exploration
+
+### Looker Content Tools
+
+These tools get saved content (Looks and Dashboards) from a Looker
+instance and create new saved content.
+
 1. **get_looks**: Return the saved Looks that match a title or description
 1. **run_look**: Run a saved Look and return the data
 1. **make_look**: Create a saved Look in Looker and return the URL

@@ -305,6 +318,33 @@ The following tools are available to the LLM:
 1. **make_dashboard**: Create a saved dashboard in Looker and return the URL
 1. **add_dashboard_element**: Add a tile to a dashboard

+### Looker Instance Health Tools
+
+These tools offer the same health check algorithms that the popular
+CLI [Henry](https://github.com/looker-open-source/henry) offers.
+
+1. **health_pulse**: Check the health of a Looker instance
+1. **health_analyze**: Analyze the usage of a Looker object
+1. **health_vacuum**: Find LookML elements that might be unused
+
+### LookML Authoring Tools
+
+These tools enable the caller to write and modify LookML files
+as well as get the database schema needed to write LookML effectively.
+
+1. **dev_mode**: Activate dev mode.
+1. **get_projects**: Get the list of LookML projects
+1. **get_project_files**: Get the list of files in a project
+1. **get_project_file**: Get the contents of a file in a project
+1. **create_project_file**: Create a file in a project
+1. **update_project_file**: Update the contents of a file in a project
+1. **delete_project_file**: Delete a file in a project
+1. **get_connections**: Get the list of connections
+1. **get_connection_schemas**: Get the list of schemas for a connection
+1. **get_connection_databases**: Get the list of databases for a connection
+1. **get_connection_tables**: Get the list of tables for a connection
+1. **get_connection_table_columns**: Get the list of columns for a table in a connection

 {{< notice note >}}
 Prebuilt tools are pre-1.0, so expect some tool changes between versions. LLMs
 will adapt to the tools available, so this shouldn't affect most users.
@@ -264,7 +264,6 @@ details on how to connect your AI tools (IDEs) to databases via Toolbox and MCP.
 * `CLOUD_SQL_MSSQL_REGION`: The region of your Cloud SQL instance.
 * `CLOUD_SQL_MSSQL_INSTANCE`: The ID of your Cloud SQL instance.
 * `CLOUD_SQL_MSSQL_DATABASE`: The name of the database to connect to.
-* `CLOUD_SQL_MSSQL_IP_ADDRESS`: The IP address of the Cloud SQL instance.
 * `CLOUD_SQL_MSSQL_USER`: The database username.
 * `CLOUD_SQL_MSSQL_PASSWORD`: The password for the database user.
 * `CLOUD_SQL_MSSQL_IP_TYPE`: (Optional) The IP type i.e. "Public" or
@@ -123,6 +123,7 @@ sources:
 # allowedDatasets: # Optional: Restricts tool access to a specific list of datasets.
 #  - "my_dataset_1"
 #  - "other_project.my_dataset_2"
+# impersonateServiceAccount: "service-account@project-id.iam.gserviceaccount.com" # Optional: Service account to impersonate
 ```

 Initialize a BigQuery source that uses the client's access token:

@@ -138,15 +139,17 @@ sources:
 # allowedDatasets: # Optional: Restricts tool access to a specific list of datasets.
 #  - "my_dataset_1"
 #  - "other_project.my_dataset_2"
+# impersonateServiceAccount: "service-account@project-id.iam.gserviceaccount.com" # Optional: Service account to impersonate
 ```

 ## Reference

 | **field** | **type** | **required** | **description** |
 |---------------------------|:--------:|:------------:|-----------------|
 | kind | string | true | Must be "bigquery". |
 | project | string | true | Id of the Google Cloud project to use for billing and as the default project for BigQuery resources. |
 | location | string | false | Specifies the location (e.g., 'us', 'asia-northeast1') in which to run the query job. This location must match the location of any tables referenced in the query. Defaults to the table's location or 'US' if the location cannot be determined. [Learn More](https://cloud.google.com/bigquery/docs/locations) |
 | writeMode | string | false | Controls the write behavior for tools. `allowed` (default): All queries are permitted. `blocked`: Only `SELECT` statements are allowed for the `bigquery-execute-sql` tool. `protected`: Enables session-based execution where all tools associated with this source instance share the same [BigQuery session](https://cloud.google.com/bigquery/docs/sessions-intro). This allows for stateful operations using temporary tables (e.g., `CREATE TEMP TABLE`). For `bigquery-execute-sql`, `SELECT` statements can be used on all tables, but write operations are restricted to the session's temporary dataset. For tools like `bigquery-sql`, `bigquery-forecast`, and `bigquery-analyze-contribution`, the `writeMode` restrictions do not apply, but they will operate within the shared session. **Note:** The `protected` mode cannot be used with `useClientOAuth: true`. It is also not recommended for multi-user server environments, as all users would share the same session. A session is terminated automatically after 24 hours of inactivity or after 7 days, whichever comes first. A new session is created on the next request, and any temporary data from the previous session will be lost. |
 | allowedDatasets | []string | false | An optional list of dataset IDs that tools using this source are allowed to access. If provided, any tool operation attempting to access a dataset not in this list will be rejected. To enforce this, two types of operations are also disallowed: 1) Dataset-level operations (e.g., `CREATE SCHEMA`), and 2) operations where table access cannot be statically analyzed (e.g., `EXECUTE IMMEDIATE`, `CREATE PROCEDURE`). If a single dataset is provided, it will be treated as the default for prebuilt tools. |
 | useClientOAuth | bool | false | If true, forwards the client's OAuth access token from the "Authorization" header to downstream queries. **Note:** This cannot be used with `writeMode: protected`. |
+| impersonateServiceAccount | string | false | Service account email to impersonate when making BigQuery and Dataplex API calls. The authenticated principal must have the `roles/iam.serviceAccountTokenCreator` role on the target service account. [Learn More](https://cloud.google.com/iam/docs/service-account-impersonation) |
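For orientation, here is a minimal `tools.yaml` sketch combining the fields documented in the reference table above, including the new `impersonateServiceAccount` option. The source name, project ID, and service-account address are placeholders, and only `kind` and `project` are required:

```yaml
sources:
  my-bigquery-source:               # hypothetical source name
    kind: bigquery
    project: my-project-id          # placeholder billing/default project
    location: us
    writeMode: blocked              # only SELECT allowed for bigquery-execute-sql
    allowedDatasets:
      - my_dataset_1
    # Calls are made as this account; the authenticated principal needs
    # roles/iam.serviceAccountTokenCreator on it.
    impersonateServiceAccount: service-account@project-id.iam.gserviceaccount.com
```

The `writeMode` and `allowedDatasets` values here are illustrative; they can be omitted entirely, in which case the defaults from the table apply.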
@@ -94,7 +94,6 @@ sources:
 region: my-region
 instance: my-instance
 database: my_db
-ipAddress: localhost
 user: ${USER_NAME}
 password: ${PASSWORD}
 # ipType: private

@@ -114,7 +113,6 @@ instead of hardcoding your secrets into the configuration file.
 | region | string | true | Name of the GCP region that the cluster was created in (e.g. "us-central1"). |
 | instance | string | true | Name of the Cloud SQL instance within the cluster (e.g. "my-instance"). |
 | database | string | true | Name of the Cloud SQL database to connect to (e.g. "my_db"). |
-| ipAddress | string | true | IP address of the Cloud SQL instance to connect to. |
 | user | string | true | Name of the SQL Server user to connect as (e.g. "my-pg-user"). |
 | password | string | true | Password of the SQL Server user (e.g. "my-password"). |
 | ipType | string | false | IP Type of the Cloud SQL instance, must be either `public`, `private`, or `psc`. Default: `public`. |
@@ -31,6 +31,9 @@ the Google Cloud console][spanner-quickstart].
 - [`spanner-execute-sql`](../tools/spanner/spanner-execute-sql.md)
   Run structured and parameterized queries on Spanner.

+- [`spanner-list-tables`](../tools/spanner/spanner-list-tables.md)
+  Retrieve schema information about tables in a Spanner database.
+
 ### Pre-built Configurations

 - [Spanner using MCP](https://googleapis.github.io/genai-toolbox/how-to/connect-ide/spanner_mcp/)
@@ -77,17 +77,18 @@ the parameter.
 description: Airline unique 2 letter identifier
 ```

 | **field**      | **type**       | **required** | **description** |
 |----------------|:--------------:|:------------:|-----------------|
 | name           | string         | true         | Name of the parameter. |
 | type           | string         | true         | Must be one of "string", "integer", "float", "boolean", "array" |
 | description    | string         | true         | Natural language description of the parameter to describe it to the agent. |
 | default        | parameter type | false        | Default value of the parameter. If provided, `required` will be `false`. |
 | required       | bool           | false        | Indicate if the parameter is required. Default to `true`. |
 | allowedValues  | []string       | false        | Input value will be checked against this field. Regex is also supported. |
+| excludedValues | []string       | false        | Input value will be checked against this field. Regex is also supported. |
 | escape         | string         | false        | Only available for type `string`. Indicate the escaping delimiters used for the parameter. This field is intended to be used with templateParameters. Must be one of "single-quotes", "double-quotes", "backticks", "square-brackets". |
 | minValue       | int or float   | false        | Only available for type `integer` and `float`. Indicate the minimum value allowed. |
 | maxValue       | int or float   | false        | Only available for type `integer` and `float`. Indicate the maximum value allowed. |

 ### Array Parameters

@@ -108,15 +109,16 @@ in the list using the items field:
 SELECT * FROM airlines WHERE preferred_airlines = ANY($1);
 ```

 | **field**      | **type**         | **required** | **description** |
 |----------------|:----------------:|:------------:|-----------------|
 | name           | string           | true         | Name of the parameter. |
 | type           | string           | true         | Must be "array" |
 | description    | string           | true         | Natural language description of the parameter to describe it to the agent. |
 | default        | parameter type   | false        | Default value of the parameter. If provided, `required` will be `false`. |
 | required       | bool             | false        | Indicate if the parameter is required. Default to `true`. |
 | allowedValues  | []string         | false        | Input value will be checked against this field. Regex is also supported. |
+| excludedValues | []string         | false        | Input value will be checked against this field. Regex is also supported. |
 | items          | parameter object | true         | Specify a Parameter object for the type of the values in the array. |

 {{< notice note >}}
 Items in array should not have a `default` or `required` value. If provided, it
@@ -247,15 +249,16 @@ tools:
 escape: double-quotes # with this, the statement will resolve to `SELECT "id", "name" FROM flights`
 ```

 | **field**      | **type**         | **required**    | **description** |
 |----------------|:----------------:|:---------------:|-----------------|
 | name           | string           | true            | Name of the template parameter. |
 | type           | string           | true            | Must be one of "string", "integer", "float", "boolean", "array" |
 | description    | string           | true            | Natural language description of the template parameter to describe it to the agent. |
 | default        | parameter type   | false           | Default value of the parameter. If provided, `required` will be `false`. |
 | required       | bool             | false           | Indicate if the parameter is required. Default to `true`. |
 | allowedValues  | []string         | false           | Input value will be checked against this field. Regex is also supported. |
+| excludedValues | []string         | false           | Input value will be checked against this field. Regex is also supported. |
 | items          | parameter object | true (if array) | Specify a Parameter object for the type of the values in the array (string only). |

 ## Authorized Invocations

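To put the new `excludedValues` field in context, here is a minimal, hypothetical `tools.yaml` sketch of a parameter that uses it alongside `allowedValues`; the tool name, source name, and values are placeholders:

```yaml
tools:
  search_flights:                   # hypothetical tool name
    kind: postgres-sql
    source: my-pg-source            # hypothetical source name
    description: Search flights operated by a given airline.
    statement: SELECT * FROM flights WHERE airline = $1;
    parameters:
      - name: airline
        type: string
        description: Airline unique 2 letter identifier
        allowedValues:              # input must match one of these (regex supported)
          - "^[A-Z]{2}$"
        excludedValues:             # and must not match any of these
          - "XX"
```

Both fields validate the incoming value before the statement runs; `excludedValues` presumably rejects values that match, mirroring how `allowedValues` only accepts them.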
@@ -46,6 +46,13 @@ The behavior of this tool is influenced by the `writeMode` setting on its `bigqu
 tools using the same source. This allows the `input_data` parameter to be a query that references temporary resources (e.g.,
 `TEMP` tables) created within that session.

+The tool's behavior is also influenced by the `allowedDatasets` restriction on the `bigquery` source:
+
+- **Without `allowedDatasets` restriction:** The tool can use any table or query for the `input_data` parameter.
+- **With `allowedDatasets` restriction:** The tool verifies that the `input_data` parameter only accesses tables within the allowed datasets.
+  - If `input_data` is a table ID, the tool checks if the table's dataset is in the allowed list.
+  - If `input_data` is a query, the tool performs a dry run to analyze the query and rejects it if it accesses any table outside the allowed list.
+
 ## Example

@@ -15,10 +15,19 @@ It's compatible with the following sources:

 - [bigquery](../../sources/bigquery.md)

-`bigquery-get-dataset-info` takes a `dataset` parameter to specify the dataset
-on the given source. It also optionally accepts a `project` parameter to
-define the Google Cloud project ID. If the `project` parameter is not provided,
-the tool defaults to using the project defined in the source configuration.
+`bigquery-get-dataset-info` accepts the following parameters:
+- **`dataset`** (required): Specifies the dataset for which to retrieve metadata.
+- **`project`** (optional): Defines the Google Cloud project ID. If not provided,
+  the tool defaults to the project from the source configuration.
+
+The tool's behavior regarding these parameters is influenced by the
+`allowedDatasets` restriction on the `bigquery` source:
+- **Without `allowedDatasets` restriction:** The tool can retrieve metadata for
+  any dataset specified by the `dataset` and `project` parameters.
+- **With `allowedDatasets` restriction:** Before retrieving metadata, the tool
+  verifies that the requested dataset is in the allowed list. If it is not, the
+  request is denied. If only one dataset is specified in the `allowedDatasets`
+  list, it will be used as the default value for the `dataset` parameter.

 ## Example

@@ -55,8 +55,8 @@ tools:

 ## Reference

 | **field**   | **type** | **required** | **description**                                    |
 |-------------|:--------:|:------------:|----------------------------------------------------|
 | kind        | string   | true         | Must be "looker-add-dashboard-element"             |
 | source      | string   | true         | Name of the source the SQL should execute on.      |
 | description | string   | true         | Description of the tool that is passed to the LLM. |
@@ -38,7 +38,7 @@ tools:

 ## Reference

 | **field**   | **type** | **required** | **description**                                    |
 |-------------|:--------:|:------------:|----------------------------------------------------|
 | kind        | string   | true         | Must be "lookerca-conversational-analytics".       |
 | source      | string   | true         | Name of the source the SQL should execute on.      |
@@ -38,8 +38,8 @@ tools:

 ## Reference

 | **field**   | **type** | **required** | **description**                                    |
 |-------------|:--------:|:------------:|----------------------------------------------------|
 | kind        | string   | true         | Must be "looker-create-project-file".              |
 | source      | string   | true         | Name of the source Looker instance.                |
 | description | string   | true         | Description of the tool that is passed to the LLM. |
@@ -37,8 +37,8 @@ tools:

 ## Reference

 | **field**   | **type** | **required** | **description**                                    |
 |-------------|:--------:|:------------:|----------------------------------------------------|
 | kind        | string   | true         | Must be "looker-delete-project-file".              |
 | source      | string   | true         | Name of the source Looker instance.                |
 | description | string   | true         | Description of the tool that is passed to the LLM. |
@@ -35,8 +35,8 @@ tools:

 ## Reference

 | **field**   | **type** | **required** | **description**                                    |
 |-------------|:--------:|:------------:|----------------------------------------------------|
 | kind        | string   | true         | Must be "looker-dev-mode".                         |
 | source      | string   | true         | Name of the source Looker instance.                |
 | description | string   | true         | Description of the tool that is passed to the LLM. |
@@ -0,0 +1,41 @@
---
title: "looker-get-connection-databases"
type: docs
weight: 1
description: >
  A "looker-get-connection-databases" tool returns all the databases in a connection.
aliases:
- /resources/tools/looker-get-connection-databases
---

## About

A `looker-get-connection-databases` tool returns all the databases in a connection.

It's compatible with the following sources:

- [looker](../../sources/looker.md)

`looker-get-connection-databases` accepts a `conn` parameter.

## Example

```yaml
tools:
  get_connection_databases:
    kind: looker-get-connection-databases
    source: looker-source
    description: |
      get_connection_databases Tool

      This tool will list the databases available from a connection if the connection
      supports multiple databases.
```

## Reference

| **field**   | **type** | **required** | **description**                                    |
|-------------|:--------:|:------------:|----------------------------------------------------|
| kind        | string   | true         | Must be "looker-get-connection-databases".         |
| source      | string   | true         | Name of the source Looker instance.                |
| description | string   | true         | Description of the tool that is passed to the LLM. |
@@ -0,0 +1,41 @@
---
title: "looker-get-connection-schemas"
type: docs
weight: 1
description: >
  A "looker-get-connection-schemas" tool returns all the schemas in a connection.
aliases:
- /resources/tools/looker-get-connection-schemas
---

## About

A `looker-get-connection-schemas` tool returns all the schemas in a connection.

It's compatible with the following sources:

- [looker](../../sources/looker.md)

`looker-get-connection-schemas` accepts a `conn` parameter and an optional `db` parameter.

## Example

```yaml
tools:
  get_connection_schemas:
    kind: looker-get-connection-schemas
    source: looker-source
    description: |
      get_connection_schemas Tool

      This tool will list the schemas available from a connection, filtered by
      an optional database name.
```

## Reference

| **field**   | **type** | **required** | **description**                                    |
|-------------|:--------:|:------------:|----------------------------------------------------|
| kind        | string   | true         | Must be "looker-get-connection-schemas".           |
| source      | string   | true         | Name of the source Looker instance.                |
| description | string   | true         | Description of the tool that is passed to the LLM. |
@@ -0,0 +1,43 @@
---
title: "looker-get-connection-table-columns"
type: docs
weight: 1
description: >
  A "looker-get-connection-table-columns" tool returns all the columns for each table specified.
aliases:
- /resources/tools/looker-get-connection-table-columns
---

## About

A `looker-get-connection-table-columns` tool returns all the columns for each table specified.

It's compatible with the following sources:

- [looker](../../sources/looker.md)

`looker-get-connection-table-columns` accepts a `conn` parameter, a `schema` parameter, a `tables` parameter with a comma-separated list of tables, and an optional `db` parameter.

## Example

```yaml
tools:
  get_connection_table_columns:
    kind: looker-get-connection-table-columns
    source: looker-source
    description: |
      get_connection_table_columns Tool

      This tool will list the columns available from a connection, for all the tables
      given in a comma-separated list of table names, filtered by the
      schema name and optional database name.
```

## Reference

| **field**   | **type** | **required** | **description**                                    |
|-------------|:--------:|:------------:|----------------------------------------------------|
| kind        | string   | true         | Must be "looker-get-connection-table-columns".     |
| source      | string   | true         | Name of the source Looker instance.                |
| description | string   | true         | Description of the tool that is passed to the LLM. |
@@ -0,0 +1,41 @@
---
title: "looker-get-connection-tables"
type: docs
weight: 1
description: >
  A "looker-get-connection-tables" tool returns all the tables in a connection.
aliases:
- /resources/tools/looker-get-connection-tables
---

## About

A `looker-get-connection-tables` tool returns all the tables in a connection.

It's compatible with the following sources:

- [looker](../../sources/looker.md)

`looker-get-connection-tables` accepts a `conn` parameter, a `schema` parameter, and an optional `db` parameter.

## Example

```yaml
tools:
  get_connection_tables:
    kind: looker-get-connection-tables
    source: looker-source
    description: |
      get_connection_tables Tool

      This tool will list the tables available from a connection, filtered by the
      schema name and optional database name.
```

## Reference

| **field**   | **type** | **required** | **description**                                    |
|-------------|:--------:|:------------:|----------------------------------------------------|
| kind        | string   | true         | Must be "looker-get-connection-tables".            |
| source      | string   | true         | Name of the source Looker instance.                |
| description | string   | true         | Description of the tool that is passed to the LLM. |
docs/en/resources/tools/looker/looker-get-connections.md (new file, 42 lines)

@@ -0,0 +1,42 @@
---
title: "looker-get-connections"
type: docs
weight: 1
description: >
  A "looker-get-connections" tool returns all the connections in the source.
aliases:
- /resources/tools/looker-get-connections
---

## About

A `looker-get-connections` tool returns all the connections in the source.

It's compatible with the following sources:

- [looker](../../sources/looker.md)

`looker-get-connections` accepts no parameters.

## Example

```yaml
tools:
  get_connections:
    kind: looker-get-connections
    source: looker-source
    description: |
      get_connections Tool

      This tool will list all the connections available in the Looker system, as
      well as the dialect name, the default schema, the database if applicable,
      and whether the connection supports multiple databases.
```

## Reference

| **field**   | **type** | **required** | **description**                                    |
|-------------|:--------:|:------------:|----------------------------------------------------|
| kind        | string   | true         | Must be "looker-get-connections".                  |
| source      | string   | true         | Name of the source Looker instance.                |
| description | string   | true         | Description of the tool that is passed to the LLM. |
@@ -52,8 +52,8 @@ tools:

 ## Reference

 | **field**   | **type** | **required** | **description**                                    |
 |-------------|:--------:|:------------:|----------------------------------------------------|
 | kind        | string   | true         | Must be "looker-get-dashboards"                    |
 | source      | string   | true         | Name of the source the SQL should execute on.      |
 | description | string   | true         | Description of the tool that is passed to the LLM. |
@@ -60,8 +60,8 @@ The response is a json array with the following elements:

 ## Reference

 | **field**   | **type** | **required** | **description**                                    |
 |-------------|:--------:|:------------:|----------------------------------------------------|
 | kind        | string   | true         | Must be "looker-get-dimensions".                   |
 | source      | string   | true         | Name of the source the SQL should execute on.      |
 | description | string   | true         | Description of the tool that is passed to the LLM. |
@@ -53,8 +53,8 @@ tools:
|
||||
|
||||
## Reference
|
||||
|
||||
| **field** | **type** | **required** | **description** |
|
||||
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
|
||||
| kind | string | true | Must be "looker-get-looks" |
|
||||
| source | string | true | Name of the source the SQL should execute on. |
|
||||
| description | string | true | Description of the tool that is passed to the LLM. |
|
||||
| **field** | **type** | **required** | **description** |
|
||||
|-------------|:--------:|:------------:|----------------------------------------------------|
|
||||
| kind | string | true | Must be "looker-get-looks" |
|
||||
| source | string | true | Name of the source the SQL should execute on. |
|
||||
| description | string | true | Description of the tool that is passed to the LLM. |
|
||||
|
||||
@@ -33,8 +33,8 @@ tools:
|
||||
|
||||
## Reference
|
||||
|
||||
| **field** | **type** | **required** | **description** |
|
||||
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
|
||||
| kind | string | true | Must be "looker-get-models". |
|
||||
| source | string | true | Name of the source the SQL should execute on. |
|
||||
| description | string | true | Description of the tool that is passed to the LLM. |
|
||||
| **field** | **type** | **required** | **description** |
|
||||
|-------------|:--------:|:------------:|----------------------------------------------------|
|
||||
| kind | string | true | Must be "looker-get-models". |
|
||||
| source | string | true | Name of the source the SQL should execute on. |
|
||||
| description | string | true | Description of the tool that is passed to the LLM. |
|
||||
|
||||
@@ -34,8 +34,8 @@ tools:
|
||||
|
||||
## Reference
|
||||
|
||||
| **field** | **type** | **required** | **description** |
|
||||
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
|
||||
| kind | string | true | Must be "looker-get-project-file". |
|
||||
| source | string | true | Name of the source Looker instance. |
|
||||
| description | string | true | Description of the tool that is passed to the LLM. |
|
||||
| **field** | **type** | **required** | **description** |
|
||||
|-------------|:--------:|:------------:|----------------------------------------------------|
|
||||
| kind | string | true | Must be "looker-get-project-file". |
|
||||
| source | string | true | Name of the source Looker instance. |
|
||||
| description | string | true | Description of the tool that is passed to the LLM. |
|
||||
|
||||
@@ -34,8 +34,8 @@ tools:
|
||||
|
||||
## Reference
|
||||
|
||||
| **field** | **type** | **required** | **description** |
|
||||
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
|
||||
| kind | string | true | Must be "looker-get-project-files". |
|
||||
| source | string | true | Name of the source Looker instance. |
|
||||
| description | string | true | Description of the tool that is passed to the LLM. |
|
||||
| **field** | **type** | **required** | **description** |
|
||||
|-------------|:--------:|:------------:|----------------------------------------------------|
|
||||
| kind | string | true | Must be "looker-get-project-files". |
|
||||
| source | string | true | Name of the source Looker instance. |
|
||||
| description | string | true | Description of the tool that is passed to the LLM. |
|
||||
@@ -34,8 +34,8 @@ tools:
|
||||
|
||||
## Reference
|
||||
|
||||
| **field** | **type** | **required** | **description** |
|
||||
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
|
||||
| kind | string | true | Must be "looker-get-projects". |
|
||||
| source | string | true | Name of the source Looker instance. |
|
||||
| description | string | true | Description of the tool that is passed to the LLM. |
|
||||
| **field** | **type** | **required** | **description** |
|
||||
|-------------|:--------:|:------------:|----------------------------------------------------|
|
||||
| kind | string | true | Must be "looker-get-projects". |
|
||||
| source | string | true | Name of the source Looker instance. |
|
||||
| description | string | true | Description of the tool that is passed to the LLM. |
|
||||
@@ -18,46 +18,40 @@ The `looker-health-analyze` tool performs various analysis tasks on a Looker ins
|
||||
|
||||
## Parameters
|
||||
|
||||
| **field** | **type** | **required** | **description** |
|
||||
| :--- | :--- | :--- | :--- |
|
||||
| kind | string | true | Must be "looker-health-analyze" |
|
||||
| source | string | true | Looker source name |
|
||||
| action | string | true | The analysis to perform: `projects`, `models`, or `explores`. |
|
||||
| project | string | false | The name of the Looker project to analyze. |
|
||||
| model | string | false | The name of the Looker model to analyze. Required for `explores` actions. |
|
||||
| explore | string | false | The name of the Looker explore to analyze. Required for the `explores` action. |
|
||||
| timeframe | int | false | The timeframe in days to analyze. Defaults to 90. |
|
||||
| min_queries | int | false | The minimum number of queries for a model or explore to be considered used. Defaults to 1. |
|
||||
| **field** | **type** | **required** | **description** |
|
||||
|:------------|:---------|:-------------|:-------------------------------------------------------------------------------------------|
|
||||
| action | string | true | The analysis to perform: `projects`, `models`, or `explores`. |
|
||||
| project | string | false | The name of the Looker project to analyze. |
|
||||
| model | string | false | The name of the Looker model to analyze. Required for `explores` actions. |
|
||||
| explore | string | false | The name of the Looker explore to analyze. Required for the `explores` action. |
|
||||
| timeframe | int | false | The timeframe in days to analyze. Defaults to 90. |
|
||||
| min_queries | int | false | The minimum number of queries for a model or explore to be considered used. Defaults to 1. |
|
||||
|
||||
## Example
|
||||
|
||||
Analyze all models in the `thelook` project.
|
||||
|
||||
```yaml
|
||||
tools:
|
||||
analyze-tool:
|
||||
health_analyze:
|
||||
kind: looker-health-analyze
|
||||
source: looker-source
|
||||
description: |
|
||||
Analyzes Looker projects, models, and explores.
|
||||
Specify the `action` parameter to select the type of analysis.
|
||||
parameters:
|
||||
action: models
|
||||
project: "thelook"
|
||||
health-analyze Tool
|
||||
|
||||
Analyze all the explores in the `ecomm` model of `thelook` project. Specifically look at usage within the past 20 days. Usage minimum should be at least 10 queries.
|
||||
This tool calculates the usage of projects, models and explores.
|
||||
|
||||
```yaml
|
||||
tools:
|
||||
analyze-tool:
|
||||
kind: looker-health-analyze
|
||||
source: looker-source
|
||||
description: |
|
||||
Analyzes Looker projects, models, and explores.
|
||||
Specify the `action` parameter to select the type of analysis.
|
||||
parameters:
|
||||
action: explores
|
||||
project: "thelook"
|
||||
model: "ecomm"
|
||||
timeframe: 20
|
||||
min_queries: 10
|
||||
It accepts 6 parameters:
|
||||
1. `action`: can be "projects", "models", or "explores"
|
||||
2. `project`: the project to analyze (optional)
|
||||
3. `model`: the model to analyze (optional)
|
||||
4. `explore`: the explore to analyze (optional)
|
||||
5. `timeframe`: the lookback period in days, default is 90
|
||||
6. `min_queries`: the minimum number of queries to consider a resource as active, default is 1
|
||||
```
|
||||
|
||||
## Reference
|
||||
|
||||
| **field** | **type** | **required** | **description** |
|
||||
|-------------|:--------:|:------------:|----------------------------------------------------|
|
||||
| kind | string | true | Must be "looker-health-analyze" |
|
||||
| source | string | true | Looker source name |
|
||||
| description | string | true | Description of the tool that is passed to the LLM. |
|
||||
@@ -17,33 +17,15 @@ The `looker-health-pulse` tool performs health checks on a Looker instance. The
|
||||
- `check_dashboard_errors`: Lists dashboards with erroring queries in the last 7 days.
|
||||
- `check_explore_performance`: Lists the slowest explores in the last 7 days and reports average query runtime.
|
||||
- `check_schedule_failures`: Lists schedules that have failed in the last 7 days.
|
||||
- `check_legacy_features`: Lists enabled legacy features. (*To note, this function is not available in Looker Core. You will get an error running this command with a Core instance configured.*)
|
||||
- `check_legacy_features`: Lists enabled legacy features. (*To note, this function is not
|
||||
available in Looker Core.*)
|
||||
|
||||
## Parameters
|
||||
|
||||
| **field** | **type** | **required** | **description** |
|
||||
|---------------|:--------:|:------------:|---------------------------------------------|
|
||||
| kind | string | true | Must be "looker-health-pulse" |
|
||||
| source | string | true | Looker source name |
|
||||
| action | string | true | The health check to perform |
|
||||
|
||||
## Example
|
||||
|
||||
```yaml
|
||||
tools:
|
||||
pulse:
|
||||
kind: looker-health-pulse
|
||||
source: looker-source
|
||||
description: |
|
||||
Pulse Tool
|
||||
|
||||
Performs health checks on Looker instance.
|
||||
Specify the `action` parameter to select the check.
|
||||
parameters:
|
||||
action: check_dashboard_performance
|
||||
```
|
||||
|
||||
## Reference
|
||||
|
||||
| **action** | **description** |
|
||||
|---------------------------|--------------------------------------------------------------------------------|
|
||||
@@ -52,4 +34,36 @@ tools:
|
||||
| check_dashboard_errors | Lists dashboards with erroring queries in the last 7 days |
|
||||
| check_explore_performance | Lists slowest explores and average query runtime |
|
||||
| check_schedule_failures | Lists failed schedules in the last 7 days |
|
||||
| check_legacy_features | Lists enabled legacy features |
|
||||
| check_legacy_features | Lists enabled legacy features |
|
||||
|
||||
## Example
|
||||
|
||||
```yaml
|
||||
tools:
|
||||
health_pulse:
|
||||
kind: looker-health-pulse
|
||||
source: looker-source
|
||||
description: |
|
||||
health-pulse Tool
|
||||
|
||||
This tool takes the pulse of a Looker instance by taking
|
||||
one of the following actions:
|
||||
1. `check_db_connections`,
|
||||
2. `check_dashboard_performance`,
|
||||
3. `check_dashboard_errors`,
|
||||
4. `check_explore_performance`,
|
||||
5. `check_schedule_failures`, or
|
||||
6. `check_legacy_features`
|
||||
|
||||
The `check_legacy_features` action is not available in Looker Core. If
|
||||
it is called on a Looker Core instance, you will get a notice. That notice
|
||||
should not be reported as an error.
|
||||
```
|
||||
|
||||
## Reference
|
||||
|
||||
| **field** | **type** | **required** | **description** |
|
||||
|-------------|:--------:|:------------:|----------------------------------------------------|
|
||||
| kind | string | true | Must be "looker-health-pulse" |
|
||||
| source | string | true | Looker source name |
|
||||
| description | string | true | Description of the tool that is passed to the LLM. |
|
||||
@@ -17,16 +17,14 @@ The `looker-health-vacuum` tool helps you identify unused LookML objects such as
|
||||
|
||||
## Parameters
|
||||
|
||||
| **field** | **type** | **required** | **description** |
|
||||
| :--- | :--- | :--- | :--- |
|
||||
| kind | string | true | Must be "looker-health-vacuum" |
|
||||
| source | string | true | Looker source name |
|
||||
| action | string | true | The vacuum to perform: `models`, or `explores`. |
|
||||
| project | string | false | The name of the Looker project to vacuum. |
|
||||
| model | string | false | The name of the Looker model to vacuum. |
|
||||
| explore | string | false | The name of the Looker explore to vacuum. |
|
||||
| timeframe | int | false | The timeframe in days to analyze for usage. Defaults to 90. |
|
||||
| min_queries | int | false | The minimum number of queries for an object to be considered used. Defaults to 1. |
|
||||
| **field** | **type** | **required** | **description** |
|
||||
|:------------|:---------|:-------------|:----------------------------------------------------------------------------------|
|
||||
| action | string | true | The vacuum to perform: `models`, or `explores`. |
|
||||
| project | string | false | The name of the Looker project to vacuum. |
|
||||
| model | string | false | The name of the Looker model to vacuum. |
|
||||
| explore | string | false | The name of the Looker explore to vacuum. |
|
||||
| timeframe | int | false | The timeframe in days to analyze for usage. Defaults to 90. |
|
||||
| min_queries | int | false | The minimum number of queries for an object to be considered used. Defaults to 1. |
|
||||
|
||||
## Example
|
||||
|
||||
@@ -34,30 +32,29 @@ Identify unused fields (*in this case, fewer than 1 query in the last 20 days*)
|
||||
|
||||
```yaml
|
||||
tools:
|
||||
vacuum-tool:
|
||||
health_vacuum:
|
||||
kind: looker-health-vacuum
|
||||
source: looker-source
|
||||
description: |
|
||||
Vacuums the Looker instance by identifying unused explores, fields, and joins.
|
||||
parameters:
|
||||
action: explores
|
||||
project: "thelook_core"
|
||||
model: "thelook"
|
||||
explore: "order_items"
|
||||
timeframe: 20
|
||||
min_queries: 1
|
||||
health-vacuum Tool
|
||||
|
||||
This tool suggests models or explores that can be removed
|
||||
because they are unused.
|
||||
|
||||
It accepts 6 parameters:
|
||||
1. `action`: can be "models" or "explores"
|
||||
2. `project`: the project to vacuum (optional)
|
||||
3. `model`: the model to vacuum (optional)
|
||||
4. `explore`: the explore to vacuum (optional)
|
||||
5. `timeframe`: the lookback period in days, default is 90
|
||||
6. `min_queries`: the minimum number of queries to consider a resource as active, default is 1
|
||||
|
||||
The result is a list of objects that are candidates for deletion.
|
||||
```
|
||||
|
||||
Identify unused explores across all models in the `thelook_core` project.
|
||||
|
||||
```yaml
|
||||
tools:
|
||||
vacuum-tool:
|
||||
kind: looker-health-vacuum
|
||||
source: looker-source
|
||||
description: |
|
||||
Vacuums the Looker instance by identifying unused explores, fields, and joins.
|
||||
parameters:
|
||||
action: models
|
||||
project: "thelook_core"
|
||||
|
||||
| **field** | **type** | **required** | **description** |
|
||||
|-------------|:--------:|:------------:|----------------------------------------------------|
|
||||
| kind | string | true | Must be "looker-health-vacuum" |
|
||||
| source | string | true | Looker source name |
|
||||
| description | string | true | Description of the tool that is passed to the LLM. |
|
||||
@@ -46,8 +46,8 @@ tools:
|
||||
|
||||
## Reference
|
||||
|
||||
| **field** | **type** | **required** | **description** |
|
||||
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
|
||||
| kind | string | true | Must be "looker-make-dashboard" |
|
||||
| source | string | true | Name of the source the SQL should execute on. |
|
||||
| description | string | true | Description of the tool that is passed to the LLM. |
|
||||
| **field** | **type** | **required** | **description** |
|
||||
|-------------|:--------:|:------------:|----------------------------------------------------|
|
||||
| kind | string | true | Must be "looker-make-dashboard" |
|
||||
| source | string | true | Name of the source the SQL should execute on. |
|
||||
| description | string | true | Description of the tool that is passed to the LLM. |
|
||||
|
||||
@@ -58,8 +58,8 @@ tools:
|
||||
|
||||
## Reference
|
||||
|
||||
| **field** | **type** | **required** | **description** |
|
||||
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
|
||||
| kind | string | true | Must be "looker-make-look" |
|
||||
| source | string | true | Name of the source the SQL should execute on. |
|
||||
| description | string | true | Description of the tool that is passed to the LLM. |
|
||||
| **field** | **type** | **required** | **description** |
|
||||
|-------------|:--------:|:------------:|----------------------------------------------------|
|
||||
| kind | string | true | Must be "looker-make-look" |
|
||||
| source | string | true | Name of the source the SQL should execute on. |
|
||||
| description | string | true | Description of the tool that is passed to the LLM. |
|
||||
|
||||
@@ -77,8 +77,8 @@ tools:
|
||||
|
||||
## Reference
|
||||
|
||||
| **field** | **type** | **required** | **description** |
|
||||
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
|
||||
| kind | string | true | Must be "looker-query-sql" |
|
||||
| source | string | true | Name of the source the SQL should execute on. |
|
||||
| description | string | true | Description of the tool that is passed to the LLM. |
|
||||
| **field** | **type** | **required** | **description** |
|
||||
|-------------|:--------:|:------------:|----------------------------------------------------|
|
||||
| kind | string | true | Must be "looker-query-sql" |
|
||||
| source | string | true | Name of the source the SQL should execute on. |
|
||||
| description | string | true | Description of the tool that is passed to the LLM. |
|
||||
|
||||
@@ -485,8 +485,8 @@ tools:
|
||||
|
||||
## Reference
|
||||
|
||||
| **field** | **type** | **required** | **description** |
|
||||
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
|
||||
| kind | string | true | Must be "looker-query-url" |
|
||||
| source | string | true | Name of the source the SQL should execute on. |
|
||||
| description | string | true | Description of the tool that is passed to the LLM. |
|
||||
| **field** | **type** | **required** | **description** |
|
||||
|-------------|:--------:|:------------:|----------------------------------------------------|
|
||||
| kind | string | true | Must be "looker-query-url" |
|
||||
| source | string | true | Name of the source the SQL should execute on. |
|
||||
| description | string | true | Description of the tool that is passed to the LLM. |
|
||||
|
||||
@@ -79,8 +79,8 @@ tools:
|
||||
|
||||
## Reference
|
||||
|
||||
| **field** | **type** | **required** | **description** |
|
||||
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
|
||||
| kind | string | true | Must be "looker-query" |
|
||||
| source | string | true | Name of the source the SQL should execute on. |
|
||||
| description | string | true | Description of the tool that is passed to the LLM. |
|
||||
| **field** | **type** | **required** | **description** |
|
||||
|-------------|:--------:|:------------:|----------------------------------------------------|
|
||||
| kind | string | true | Must be "looker-query" |
|
||||
| source | string | true | Name of the source the SQL should execute on. |
|
||||
| description | string | true | Description of the tool that is passed to the LLM. |
|
||||
|
||||
@@ -36,8 +36,8 @@ tools:
|
||||
|
||||
## Reference
|
||||
|
||||
| **field** | **type** | **required** | **description** |
|
||||
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
|
||||
| kind | string | true | Must be "looker-run-look" |
|
||||
| source | string | true | Name of the source the SQL should execute on. |
|
||||
| description | string | true | Description of the tool that is passed to the LLM. |
|
||||
| **field** | **type** | **required** | **description** |
|
||||
|-------------|:--------:|:------------:|----------------------------------------------------|
|
||||
| kind | string | true | Must be "looker-run-look" |
|
||||
| source | string | true | Name of the source the SQL should execute on. |
|
||||
| description | string | true | Description of the tool that is passed to the LLM. |
|
||||
|
||||
@@ -38,8 +38,8 @@ tools:
|
||||
|
||||
## Reference
|
||||
|
||||
| **field** | **type** | **required** | **description** |
|
||||
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
|
||||
| kind | string | true | Must be "looker-update-project-file". |
|
||||
| source | string | true | Name of the source Looker instance. |
|
||||
| description | string | true | Description of the tool that is passed to the LLM. |
|
||||
| **field** | **type** | **required** | **description** |
|
||||
|-------------|:--------:|:------------:|----------------------------------------------------|
|
||||
| kind | string | true | Must be "looker-update-project-file". |
|
||||
| source | string | true | Name of the source Looker instance. |
|
||||
| description | string | true | Description of the tool that is passed to the LLM. |
|
||||
20
go.mod
@@ -6,11 +6,11 @@ toolchain go1.25.2
|
||||
|
||||
require (
|
||||
cloud.google.com/go/alloydbconn v1.15.5
|
||||
cloud.google.com/go/bigquery v1.71.0
|
||||
cloud.google.com/go/bigquery v1.72.0
|
||||
cloud.google.com/go/bigtable v1.40.1
|
||||
cloud.google.com/go/cloudsqlconn v1.18.1
|
||||
cloud.google.com/go/dataplex v1.27.1
|
||||
cloud.google.com/go/dataproc/v2 v2.14.1
|
||||
cloud.google.com/go/dataproc/v2 v2.15.0
|
||||
cloud.google.com/go/firestore v1.20.0
|
||||
cloud.google.com/go/geminidataanalytics v0.2.1
|
||||
cloud.google.com/go/spanner v1.86.1
|
||||
@@ -55,9 +55,9 @@ require (
|
||||
go.opentelemetry.io/otel/trace v1.38.0
|
||||
golang.org/x/oauth2 v0.32.0
|
||||
google.golang.org/api v0.251.0
|
||||
google.golang.org/genproto v0.0.0-20251007200510-49b9836ed3ff
|
||||
google.golang.org/genproto v0.0.0-20251022142026-3a174f9686a8
|
||||
google.golang.org/protobuf v1.36.10
|
||||
modernc.org/sqlite v1.39.1
|
||||
modernc.org/sqlite v1.40.0
|
||||
)
|
||||
|
||||
require (
|
||||
@@ -79,10 +79,10 @@ require (
|
||||
cloud.google.com/go/auth v0.16.5 // indirect
|
||||
cloud.google.com/go/auth/oauth2adapt v0.2.8 // indirect
|
||||
cloud.google.com/go/compute/metadata v0.9.0 // indirect
|
||||
cloud.google.com/go/iam v1.5.2 // indirect
|
||||
cloud.google.com/go/longrunning v0.6.7 // indirect
|
||||
cloud.google.com/go/monitoring v1.24.2 // indirect
|
||||
cloud.google.com/go/trace v1.11.6 // indirect
|
||||
cloud.google.com/go/iam v1.5.3 // indirect
|
||||
cloud.google.com/go/longrunning v0.7.0 // indirect
|
||||
cloud.google.com/go/monitoring v1.24.3 // indirect
|
||||
cloud.google.com/go/trace v1.11.7 // indirect
|
||||
filippo.io/edwards25519 v1.1.0 // indirect
|
||||
github.com/GoogleCloudPlatform/grpc-gcp-go/grpcgcp v1.5.3 // indirect
|
||||
github.com/GoogleCloudPlatform/opentelemetry-operations-go/detectors/gcp v1.29.0 // indirect
|
||||
@@ -178,8 +178,8 @@ require (
|
||||
golang.org/x/time v0.13.0 // indirect
|
||||
golang.org/x/tools v0.36.0 // indirect
|
||||
golang.org/x/xerrors v0.0.0-20240903120638-7835f813f4da // indirect
|
||||
google.golang.org/genproto/googleapis/api v0.0.0-20251002232023-7c0ddcbb5797 // indirect
|
||||
google.golang.org/genproto/googleapis/rpc v0.0.0-20251002232023-7c0ddcbb5797 // indirect
|
||||
google.golang.org/genproto/googleapis/api v0.0.0-20251014184007-4626949a642f // indirect
|
||||
google.golang.org/genproto/googleapis/rpc v0.0.0-20251014184007-4626949a642f // indirect
|
||||
google.golang.org/grpc v1.75.1 // indirect
|
||||
gopkg.in/inf.v0 v0.9.1 // indirect
|
||||
gopkg.in/ini.v1 v1.67.0 // indirect
|
||||
|
||||
40
go.sum
@@ -137,8 +137,8 @@ cloud.google.com/go/bigquery v1.47.0/go.mod h1:sA9XOgy0A8vQK9+MWhEQTY6Tix87M/Zur
|
||||
cloud.google.com/go/bigquery v1.48.0/go.mod h1:QAwSz+ipNgfL5jxiaK7weyOhzdoAy1zFm0Nf1fysJac=
|
||||
cloud.google.com/go/bigquery v1.49.0/go.mod h1:Sv8hMmTFFYBlt/ftw2uN6dFdQPzBlREY9yBh7Oy7/4Q=
|
||||
cloud.google.com/go/bigquery v1.50.0/go.mod h1:YrleYEh2pSEbgTBZYMJ5SuSr0ML3ypjRB1zgf7pvQLU=
|
||||
cloud.google.com/go/bigquery v1.71.0 h1:NvSZvXU1Hyb+YiRVKQPuQXGeZaw/0NP6M/WOrBqSx3g=
|
||||
cloud.google.com/go/bigquery v1.71.0/go.mod h1:GUbRtmeCckOE85endLherHD9RsujY+gS7i++c1CqssQ=
|
||||
cloud.google.com/go/bigquery v1.72.0 h1:D/yLju+3Ens2IXx7ou1DJ62juBm+/coBInn4VVOg5Cw=
|
||||
cloud.google.com/go/bigquery v1.72.0/go.mod h1:GUbRtmeCckOE85endLherHD9RsujY+gS7i++c1CqssQ=
|
||||
cloud.google.com/go/bigtable v1.40.1 h1:k8HfpUOvn7sQwc6oNKqjvD/yjkwynf4qBuyKwh5cU08=
|
||||
cloud.google.com/go/bigtable v1.40.1/go.mod h1:LtPzCcrAFaGRZ82Hs8xMueUeYW9Jw12AmNdUTMfDnh4=
|
||||
cloud.google.com/go/billing v1.4.0/go.mod h1:g9IdKBEFlItS8bTtlrZdVLWSSdSyFUZKXNS02zKMOZY=
|
||||
@@ -241,8 +241,8 @@ cloud.google.com/go/dataplex v1.27.1/go.mod h1:VB+xlYJiJ5kreonXsa2cHPj0A3CfPh/mg
|
||||
cloud.google.com/go/dataproc v1.7.0/go.mod h1:CKAlMjII9H90RXaMpSxQ8EU6dQx6iAYNPcYPOkSbi8s=
|
||||
cloud.google.com/go/dataproc v1.8.0/go.mod h1:5OW+zNAH0pMpw14JVrPONsxMQYMBqJuzORhIBfBn9uI=
|
||||
cloud.google.com/go/dataproc v1.12.0/go.mod h1:zrF3aX0uV3ikkMz6z4uBbIKyhRITnxvr4i3IjKsKrw4=
|
||||
cloud.google.com/go/dataproc/v2 v2.14.1 h1:Kxq0iomU0H4MlVP4HYeYPNJnV+YxNctf/hFrprmGy5Y=
|
||||
cloud.google.com/go/dataproc/v2 v2.14.1/go.mod h1:tSdkodShfzrrUNPDVEL6MdH9/mIEvp/Z9s9PBdbsZg8=
|
||||
cloud.google.com/go/dataproc/v2 v2.15.0 h1:I/Yux/d8uaxf3W+d59kolGTOc52+VZaL6RzJw7oDOeg=
|
||||
cloud.google.com/go/dataproc/v2 v2.15.0/go.mod h1:tSdkodShfzrrUNPDVEL6MdH9/mIEvp/Z9s9PBdbsZg8=
|
||||
cloud.google.com/go/dataqna v0.5.0/go.mod h1:90Hyk596ft3zUQ8NkFfvICSIfHFh1Bc7C4cK3vbhkeo=
|
||||
cloud.google.com/go/dataqna v0.6.0/go.mod h1:1lqNpM7rqNLVgWBJyk5NF6Uen2PHym0jtVJonplVsDA=
|
||||
cloud.google.com/go/dataqna v0.7.0/go.mod h1:Lx9OcIIeqCrw1a6KdO3/5KMP1wAmTc0slZWwP12Qq3c=
|
||||
@@ -339,8 +339,8 @@ cloud.google.com/go/iam v0.8.0/go.mod h1:lga0/y3iH6CX7sYqypWJ33hf7kkfXJag67naqGE
|
||||
cloud.google.com/go/iam v0.11.0/go.mod h1:9PiLDanza5D+oWFZiH1uG+RnRCfEGKoyl6yo4cgWZGY=
|
||||
cloud.google.com/go/iam v0.12.0/go.mod h1:knyHGviacl11zrtZUoDuYpDgLjvr28sLQaG0YB2GYAY=
|
||||
cloud.google.com/go/iam v0.13.0/go.mod h1:ljOg+rcNfzZ5d6f1nAUJ8ZIxOaZUVoS14bKCtaLZ/D0=
|
||||
cloud.google.com/go/iam v1.5.2 h1:qgFRAGEmd8z6dJ/qyEchAuL9jpswyODjA2lS+w234g8=
|
||||
cloud.google.com/go/iam v1.5.2/go.mod h1:SE1vg0N81zQqLzQEwxL2WI6yhetBdbNQuTvIKCSkUHE=
|
||||
cloud.google.com/go/iam v1.5.3 h1:+vMINPiDF2ognBJ97ABAYYwRgsaqxPbQDlMnbHMjolc=
|
||||
cloud.google.com/go/iam v1.5.3/go.mod h1:MR3v9oLkZCTlaqljW6Eb2d3HGDGK5/bDv93jhfISFvU=
|
||||
cloud.google.com/go/iap v1.4.0/go.mod h1:RGFwRJdihTINIe4wZ2iCP0zF/qu18ZwyKxrhMhygBEc=
|
||||
cloud.google.com/go/iap v1.5.0/go.mod h1:UH/CGgKd4KyohZL5Pt0jSKE4m3FR51qg6FKQ/z/Ix9A=
|
||||
cloud.google.com/go/iap v1.6.0/go.mod h1:NSuvI9C/j7UdjGjIde7t7HBz+QTwBcapPE07+sSRcLk=
|
||||
@@ -375,8 +375,8 @@ cloud.google.com/go/logging v1.13.0/go.mod h1:36CoKh6KA/M0PbhPKMq6/qety2DCAErbhX
|
||||
cloud.google.com/go/longrunning v0.1.1/go.mod h1:UUFxuDWkv22EuY93jjmDMFT5GPQKeFVJBIF6QlTqdsE=
|
||||
cloud.google.com/go/longrunning v0.3.0/go.mod h1:qth9Y41RRSUE69rDcOn6DdK3HfQfsUI0YSmW3iIlLJc=
|
||||
cloud.google.com/go/longrunning v0.4.1/go.mod h1:4iWDqhBZ70CvZ6BfETbvam3T8FMvLK+eFj0E6AaRQTo=
|
||||
cloud.google.com/go/longrunning v0.6.7 h1:IGtfDWHhQCgCjwQjV9iiLnUta9LBCo8R9QmAFsS/PrE=
|
||||
cloud.google.com/go/longrunning v0.6.7/go.mod h1:EAFV3IZAKmM56TyiE6VAP3VoTzhZzySwI/YI1s/nRsY=
|
||||
cloud.google.com/go/longrunning v0.7.0 h1:FV0+SYF1RIj59gyoWDRi45GiYUMM3K1qO51qoboQT1E=
|
||||
cloud.google.com/go/longrunning v0.7.0/go.mod h1:ySn2yXmjbK9Ba0zsQqunhDkYi0+9rlXIwnoAf+h+TPY=
|
||||
cloud.google.com/go/managedidentities v1.3.0/go.mod h1:UzlW3cBOiPrzucO5qWkNkh0w33KFtBJU281hacNvsdE=
|
||||
cloud.google.com/go/managedidentities v1.4.0/go.mod h1:NWSBYbEMgqmbZsLIyKvxrYbtqOsxY1ZrGM+9RgDqInM=
|
||||
cloud.google.com/go/managedidentities v1.5.0/go.mod h1:+dWcZ0JlUmpuxpIDfyP5pP5y0bLdRwOS4Lp7gMni/LA=
|
||||
@@ -400,8 +400,8 @@ cloud.google.com/go/monitoring v1.7.0/go.mod h1:HpYse6kkGo//7p6sT0wsIC6IBDET0RhI
|
||||
cloud.google.com/go/monitoring v1.8.0/go.mod h1:E7PtoMJ1kQXWxPjB6mv2fhC5/15jInuulFdYYtlcvT4=
|
||||
cloud.google.com/go/monitoring v1.12.0/go.mod h1:yx8Jj2fZNEkL/GYZyTLS4ZtZEZN8WtDEiEqG4kLK50w=
|
||||
cloud.google.com/go/monitoring v1.13.0/go.mod h1:k2yMBAB1H9JT/QETjNkgdCGD9bPF712XiLTVr+cBrpw=
|
||||
cloud.google.com/go/monitoring v1.24.2 h1:5OTsoJ1dXYIiMiuL+sYscLc9BumrL3CarVLL7dd7lHM=
|
||||
cloud.google.com/go/monitoring v1.24.2/go.mod h1:x7yzPWcgDRnPEv3sI+jJGBkwl5qINf+6qY4eq0I9B4U=
|
||||
cloud.google.com/go/monitoring v1.24.3 h1:dde+gMNc0UhPZD1Azu6at2e79bfdztVDS5lvhOdsgaE=
|
||||
cloud.google.com/go/monitoring v1.24.3/go.mod h1:nYP6W0tm3N9H/bOw8am7t62YTzZY+zUeQ+Bi6+2eonI=
|
||||
cloud.google.com/go/networkconnectivity v1.4.0/go.mod h1:nOl7YL8odKyAOtzNX73/M5/mGZgqqMeryi6UPZTk/rA=
|
||||
cloud.google.com/go/networkconnectivity v1.5.0/go.mod h1:3GzqJx7uhtlM3kln0+x5wyFvuVH1pIBJjhCpjzSt75o=
|
||||
cloud.google.com/go/networkconnectivity v1.6.0/go.mod h1:OJOoEXW+0LAxHh89nXd64uGG+FbQoeH8DtxCHVOMlaM=
|
||||
@@ -588,8 +588,8 @@ cloud.google.com/go/trace v1.3.0/go.mod h1:FFUE83d9Ca57C+K8rDl/Ih8LwOzWIV1krKgxg
|
||||
cloud.google.com/go/trace v1.4.0/go.mod h1:UG0v8UBqzusp+z63o7FK74SdFE+AXpCLdFb1rshXG+Y=
|
||||
cloud.google.com/go/trace v1.8.0/go.mod h1:zH7vcsbAhklH8hWFig58HvxcxyQbaIqMarMg9hn5ECA=
|
||||
cloud.google.com/go/trace v1.9.0/go.mod h1:lOQqpE5IaWY0Ixg7/r2SjixMuc6lfTFeO4QGM4dQWOk=
|
||||
cloud.google.com/go/trace v1.11.6 h1:2O2zjPzqPYAHrn3OKl029qlqG6W8ZdYaOWRyr8NgMT4=
|
||||
cloud.google.com/go/trace v1.11.6/go.mod h1:GA855OeDEBiBMzcckLPE2kDunIpC72N+Pq8WFieFjnI=
|
||||
cloud.google.com/go/trace v1.11.7 h1:kDNDX8JkaAG3R2nq1lIdkb7FCSi1rCmsEtKVsty7p+U=
|
||||
cloud.google.com/go/trace v1.11.7/go.mod h1:TNn9d5V3fQVf6s4SCveVMIBS2LJUqo73GACmq/Tky0s=
|
||||
cloud.google.com/go/translate v1.3.0/go.mod h1:gzMUwRjvOqj5i69y/LYLd8RrNQk+hOmIXTi9+nb3Djs=
|
||||
cloud.google.com/go/translate v1.4.0/go.mod h1:06Dn/ppvLD6WvA5Rhdp029IX2Mi3Mn7fpMRLPvXT5Wg=
|
||||
cloud.google.com/go/translate v1.5.0/go.mod h1:29YDSYveqqpA1CQFD7NQuP49xymq17RXNaUDdc0mNu0=
|
||||
@@ -1982,12 +1982,12 @@ google.golang.org/genproto v0.0.0-20230323212658-478b75c54725/go.mod h1:UUQDJDOl
|
||||
google.golang.org/genproto v0.0.0-20230330154414-c0448cd141ea/go.mod h1:UUQDJDOlWu4KYeJZffbWgBkS1YFobzKbLVfK69pe0Ak=
|
||||
google.golang.org/genproto v0.0.0-20230331144136-dcfb400f0633/go.mod h1:UUQDJDOlWu4KYeJZffbWgBkS1YFobzKbLVfK69pe0Ak=
|
||||
google.golang.org/genproto v0.0.0-20230410155749-daa745c078e1/go.mod h1:nKE/iIaLqn2bQwXBg8f1g2Ylh6r5MN5CmZvuzZCgsCU=
|
||||
google.golang.org/genproto v0.0.0-20251007200510-49b9836ed3ff h1:3jGSSqkLOAYU1gI52uHoj51zxEsGMEYatnBFU0m6pB8=
|
||||
google.golang.org/genproto v0.0.0-20251007200510-49b9836ed3ff/go.mod h1:45Y7O/+fGjlhL8+FRpuLqM9YKvn+AU5dolRkE3DOaX8=
|
||||
google.golang.org/genproto/googleapis/api v0.0.0-20251002232023-7c0ddcbb5797 h1:D/zZ8knc/wLq9imidPFpHsGuRUYTCWWCwemZ2dxACGs=
|
||||
google.golang.org/genproto/googleapis/api v0.0.0-20251002232023-7c0ddcbb5797/go.mod h1:NnuHhy+bxcg30o7FnVAZbXsPHUDQ9qKWAQKCD7VxFtk=
|
||||
google.golang.org/genproto/googleapis/rpc v0.0.0-20251002232023-7c0ddcbb5797 h1:CirRxTOwnRWVLKzDNrs0CXAaVozJoR4G9xvdRecrdpk=
|
||||
google.golang.org/genproto/googleapis/rpc v0.0.0-20251002232023-7c0ddcbb5797/go.mod h1:HSkG/KdJWusxU1F6CNrwNDjBMgisKxGnc5dAZfT0mjQ=
|
||||
google.golang.org/genproto v0.0.0-20251022142026-3a174f9686a8 h1:a12a2/BiVRxRWIqBbfqoSK6tgq8cyUgMnEI81QlPge0=
|
||||
google.golang.org/genproto v0.0.0-20251022142026-3a174f9686a8/go.mod h1:1Ic78BnpzY8OaTCmzxJDP4qC9INZPbGZl+54RKjtyeI=
|
||||
google.golang.org/genproto/googleapis/api v0.0.0-20251014184007-4626949a642f h1:OiFuztEyBivVKDvguQJYWq1yDcfAHIID/FVrPR4oiI0=
|
||||
google.golang.org/genproto/googleapis/api v0.0.0-20251014184007-4626949a642f/go.mod h1:kprOiu9Tr0JYyD6DORrc4Hfyk3RFXqkQ3ctHEum3ZbM=
|
||||
google.golang.org/genproto/googleapis/rpc v0.0.0-20251014184007-4626949a642f h1:1FTH6cpXFsENbPR5Bu8NQddPSaUUE6NA2XdZdDSAJK4=
|
||||
google.golang.org/genproto/googleapis/rpc v0.0.0-20251014184007-4626949a642f/go.mod h1:7i2o+ce6H/6BluujYR+kqX3GKH+dChPTQU19wjRPiGk=
|
||||
google.golang.org/grpc v1.19.0/go.mod h1:mqu4LbDTu4XGKhr4mRzUsmM4RtVoemTSY81AxZiDr8c=
|
||||
google.golang.org/grpc v1.20.1/go.mod h1:10oTOabMzJvdu6/UiuZezV6QK5dSlG84ov/aaiqXj38=
|
||||
google.golang.org/grpc v1.21.1/go.mod h1:oYelfM1adQP15Ek0mdvEgi9Df8B9CZIaU1084ijfRaM=
|
||||
@@ -2124,8 +2124,8 @@ modernc.org/opt v0.1.4/go.mod h1:03fq9lsNfvkYSfxrfUhZCWPk1lm4cq4N+Bh//bEtgns=
|
||||
modernc.org/sortutil v1.2.1 h1:+xyoGf15mM3NMlPDnFqrteY07klSFxLElE2PVuWIJ7w=
|
||||
modernc.org/sortutil v1.2.1/go.mod h1:7ZI3a3REbai7gzCLcotuw9AC4VZVpYMjDzETGsSMqJE=
|
||||
modernc.org/sqlite v1.18.1/go.mod h1:6ho+Gow7oX5V+OiOQ6Tr4xeqbx13UZ6t+Fw9IRUG4d4=
|
||||
modernc.org/sqlite v1.39.1 h1:H+/wGFzuSCIEVCvXYVHX5RQglwhMOvtHSv+VtidL2r4=
|
||||
modernc.org/sqlite v1.39.1/go.mod h1:9fjQZ0mB1LLP0GYrp39oOJXx/I2sxEnZtzCmEQIKvGE=
|
||||
modernc.org/sqlite v1.40.0 h1:bNWEDlYhNPAUdUdBzjAvn8icAs/2gaKlj4vM+tQ6KdQ=
|
||||
modernc.org/sqlite v1.40.0/go.mod h1:9fjQZ0mB1LLP0GYrp39oOJXx/I2sxEnZtzCmEQIKvGE=
|
||||
modernc.org/strutil v1.1.1/go.mod h1:DE+MQQ/hjKBZS2zNInV5hhcipt5rLPWkmpbGeW5mmdw=
|
||||
modernc.org/strutil v1.1.3/go.mod h1:MEHNA7PdEnEwLvspRMtWTNnp2nnyvMfkimT1NKNAGbw=
|
||||
modernc.org/strutil v1.2.1 h1:UneZBkQA+DX2Rp35KcM69cSsNES9ly8mQWD71HKlOA0=
|
||||
|
||||
@@ -73,4 +73,4 @@ tools:
|
||||
|
||||
toolsets:
|
||||
cloud_sql_mssql_cloud_monitoring_tools:
|
||||
- get_system_metrics
|
||||
- get_system_metrics
|
||||
|
||||
@@ -19,7 +19,6 @@ sources:
|
||||
region: ${CLOUD_SQL_MSSQL_REGION}
|
||||
instance: ${CLOUD_SQL_MSSQL_INSTANCE}
|
||||
database: ${CLOUD_SQL_MSSQL_DATABASE}
|
||||
ipAddress: ${CLOUD_SQL_MSSQL_IP_ADDRESS}
|
||||
user: ${CLOUD_SQL_MSSQL_USER}
|
||||
password: ${CLOUD_SQL_MSSQL_PASSWORD}
|
||||
ipType: ${CLOUD_SQL_MSSQL_IP_TYPE:public}
|
||||
|
||||
@@ -711,6 +711,10 @@ tools:
|
||||
4. `check_explore_performance`,
|
||||
5. `check_schedule_failures`, or
|
||||
6. `check_legacy_features`
|
||||
|
||||
The `check_legacy_features` action is not available in Looker Core. If
|
||||
it is called on a Looker Core instance, you will get a notice. That notice
|
||||
should not be reported as an error.
|
||||
|
||||
health_analyze:
|
||||
kind: looker-health-analyze
|
||||
@@ -819,6 +823,54 @@ tools:
|
||||
This tool must be called after the dev_mode tool has changed the session to
|
||||
dev mode.
|
||||
|
||||
get_connections:
|
||||
kind: looker-get-connections
|
||||
source: looker-source
|
||||
description: |
|
||||
get_connections Tool
|
||||
|
||||
This tool will list all the connections available in the Looker system, as
|
||||
well as the dialect name, the default schema, the database if applicable,
|
||||
and whether the connection supports multiple databases.
|
||||
|
||||
get_connection_schemas:
|
||||
kind: looker-get-connection-schemas
|
||||
source: looker-source
|
||||
description: |
|
||||
get_connection_schemas Tool
|
||||
|
||||
This tool will list the schemas available from a connection, filtered by
|
||||
an optional database name.
|
||||
|
||||
get_connection_databases:
|
||||
kind: looker-get-connection-databases
|
||||
source: looker-source
|
||||
description: |
|
||||
get_connection_databases Tool
|
||||
|
||||
This tool will list the databases available from a connection if the connection
|
||||
supports multiple databases.
|
||||
|
||||
get_connection_tables:
|
||||
kind: looker-get-connection-tables
|
||||
source: looker-source
|
||||
description: |
|
||||
get_connection_tables Tool
|
||||
|
||||
This tool will list the tables available from a connection, filtered by the
|
||||
schema name and optional database name.
|
||||
|
||||
get_connection_table_columns:
|
||||
kind: looker-get-connection-table-columns
|
||||
source: looker-source
|
||||
description: |
|
||||
get_connection_table_columns Tool
|
||||
|
||||
This tool will list the columns available from a connection, for all the tables
|
||||
given in a comma-separated list of table names, filtered by the
|
||||
schema name and optional database name.
|
||||
|
||||
|
||||
toolsets:
|
||||
looker_tools:
|
||||
- get_models
|
||||
@@ -846,3 +898,8 @@ toolsets:
|
||||
- create_project_file
|
||||
- update_project_file
|
||||
- delete_project_file
|
||||
- get_connections
|
||||
- get_connection_schemas
|
||||
- get_connection_databases
|
||||
- get_connection_tables
|
||||
- get_connection_table_columns
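The five connection-exploration tools above are appended to the existing `looker_tools` toolset. If a deployment only needs schema browsing, the same definitions could also be grouped into a narrower toolset; the sketch below is illustrative only, and the name `looker_connection_tools` is not part of this change:

```yaml
toolsets:
    looker_connection_tools:
        - get_connections
        - get_connection_schemas
        - get_connection_databases
        - get_connection_tables
        - get_connection_table_columns
```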
|
||||
|
||||
@@ -32,6 +32,7 @@ import (
|
||||
"golang.org/x/oauth2/google"
|
||||
bigqueryrestapi "google.golang.org/api/bigquery/v2"
|
||||
"google.golang.org/api/googleapi"
|
||||
"google.golang.org/api/impersonate"
|
||||
"google.golang.org/api/option"
|
||||
)
|
||||
|
||||
@@ -78,6 +79,7 @@ type Config struct {
|
||||
WriteMode string `yaml:"writeMode"`
|
||||
AllowedDatasets []string `yaml:"allowedDatasets"`
|
||||
UseClientOAuth bool `yaml:"useClientOAuth"`
|
||||
ImpersonateServiceAccount string `yaml:"impersonateServiceAccount"`
|
||||
}
|
||||
|
||||
func (r Config) SourceConfigKind() string {
|
||||
@@ -94,6 +96,10 @@ func (r Config) Initialize(ctx context.Context, tracer trace.Tracer) (sources.So
|
||||
return nil, fmt.Errorf("writeMode 'protected' cannot be used with useClientOAuth 'true'")
|
||||
}
|
||||
|
||||
if r.UseClientOAuth && r.ImpersonateServiceAccount != "" {
|
||||
return nil, fmt.Errorf("useClientOAuth cannot be used with impersonateServiceAccount")
|
||||
}
|
||||
|
||||
var client *bigqueryapi.Client
|
||||
var restService *bigqueryrestapi.Service
|
||||
var tokenSource oauth2.TokenSource
|
||||
@@ -107,7 +113,7 @@ func (r Config) Initialize(ctx context.Context, tracer trace.Tracer) (sources.So
|
||||
}
|
||||
} else {
|
||||
// Initializes a BigQuery Google SQL source
|
||||
client, restService, tokenSource, err = initBigQueryConnection(ctx, tracer, r.Name, r.Project, r.Location)
|
||||
client, restService, tokenSource, err = initBigQueryConnection(ctx, tracer, r.Name, r.Project, r.Location, r.ImpersonateServiceAccount)
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("error creating client from ADC: %w", err)
|
||||
}
|
||||
@@ -147,18 +153,19 @@ func (r Config) Initialize(ctx context.Context, tracer trace.Tracer) (sources.So
|
||||
}
|
||||
|
||||
s := &Source{
|
||||
Name: r.Name,
|
||||
Kind: SourceKind,
|
||||
Project: r.Project,
|
||||
Location: r.Location,
|
||||
Client: client,
|
||||
RestService: restService,
|
||||
TokenSource: tokenSource,
|
||||
MaxQueryResultRows: 50,
|
||||
ClientCreator: clientCreator,
|
||||
WriteMode: r.WriteMode,
|
||||
AllowedDatasets: allowedDatasets,
|
||||
UseClientOAuth: r.UseClientOAuth,
|
||||
Name: r.Name,
|
||||
Kind: SourceKind,
|
||||
Project: r.Project,
|
||||
Location: r.Location,
|
||||
Client: client,
|
||||
RestService: restService,
|
||||
TokenSource: tokenSource,
|
||||
MaxQueryResultRows: 50,
|
||||
WriteMode: r.WriteMode,
|
||||
AllowedDatasets: allowedDatasets,
|
||||
UseClientOAuth: r.UseClientOAuth,
|
||||
ClientCreator: clientCreator,
|
||||
ImpersonateServiceAccount: r.ImpersonateServiceAccount,
|
||||
}
|
||||
s.SessionProvider = s.newBigQuerySessionProvider()
|
||||
|
||||
@@ -167,7 +174,6 @@ func (r Config) Initialize(ctx context.Context, tracer trace.Tracer) (sources.So
|
||||
}
|
||||
s.makeDataplexCatalogClient = s.lazyInitDataplexClient(ctx, tracer)
|
||||
return s, nil
|
||||
|
||||
}
|
||||
|
||||
var _ sources.Source = &Source{}
|
||||
@@ -185,6 +191,7 @@ type Source struct {
|
||||
ClientCreator BigqueryClientCreator
|
||||
AllowedDatasets map[string]struct{}
|
||||
UseClientOAuth bool
|
||||
ImpersonateServiceAccount string
|
||||
WriteMode string
|
||||
sessionMutex sync.Mutex
|
||||
makeDataplexCatalogClient func() (*dataplexapi.CatalogClient, DataplexClientCreator, error)
|
||||
@@ -327,6 +334,17 @@ func (s *Source) BigQueryTokenSource() oauth2.TokenSource {
|
||||
}
|
||||
|
||||
func (s *Source) BigQueryTokenSourceWithScope(ctx context.Context, scope string) (oauth2.TokenSource, error) {
|
||||
if s.ImpersonateServiceAccount != "" {
|
||||
// Create impersonated credentials token source with the requested scope
|
||||
ts, err := impersonate.CredentialsTokenSource(ctx, impersonate.CredentialsConfig{
|
||||
TargetPrincipal: s.ImpersonateServiceAccount,
|
||||
Scopes: []string{scope},
|
||||
})
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("failed to create impersonated credentials for %q with scope %q: %w", s.ImpersonateServiceAccount, scope, err)
|
||||
}
|
||||
return ts, nil
|
||||
}
|
||||
return google.DefaultTokenSource(ctx, scope)
|
||||
}
|
||||
|
||||
@@ -373,7 +391,7 @@ func (s *Source) lazyInitDataplexClient(ctx context.Context, tracer trace.Tracer
|
||||
|
||||
return func() (*dataplexapi.CatalogClient, DataplexClientCreator, error) {
|
||||
once.Do(func() {
|
||||
c, cc, e := initDataplexConnection(ctx, tracer, s.Name, s.Project, s.UseClientOAuth)
|
||||
c, cc, e := initDataplexConnection(ctx, tracer, s.Name, s.Project, s.UseClientOAuth, s.ImpersonateServiceAccount)
|
||||
if e != nil {
|
||||
err = fmt.Errorf("failed to initialize dataplex client: %w", e)
|
||||
return
|
||||
@@ -391,34 +409,61 @@ func initBigQueryConnection(
|
||||
name string,
|
||||
project string,
|
||||
location string,
|
||||
impersonateServiceAccount string,
|
||||
) (*bigqueryapi.Client, *bigqueryrestapi.Service, oauth2.TokenSource, error) {
|
||||
ctx, span := sources.InitConnectionSpan(ctx, tracer, SourceKind, name)
|
||||
defer span.End()
|
||||
|
||||
cred, err := google.FindDefaultCredentials(ctx, "https://www.googleapis.com/auth/cloud-platform")
|
||||
if err != nil {
|
||||
return nil, nil, nil, fmt.Errorf("failed to find default Google Cloud credentials with scope %q: %w", bigqueryapi.Scope, err)
|
||||
}
|
||||
|
||||
userAgent, err := util.UserAgentFromContext(ctx)
|
||||
if err != nil {
|
||||
return nil, nil, nil, err
|
||||
}
|
||||
|
||||
var tokenSource oauth2.TokenSource
|
||||
var opts []option.ClientOption
|
||||
|
||||
if impersonateServiceAccount != "" {
|
||||
// Create impersonated credentials token source with cloud-platform scope
|
||||
// This broader scope is needed for tools like conversational analytics
|
||||
cloudPlatformTokenSource, err := impersonate.CredentialsTokenSource(ctx, impersonate.CredentialsConfig{
|
||||
TargetPrincipal: impersonateServiceAccount,
|
||||
Scopes: []string{"https://www.googleapis.com/auth/cloud-platform"},
|
||||
})
|
||||
if err != nil {
|
||||
return nil, nil, nil, fmt.Errorf("failed to create impersonated credentials for %q: %w", impersonateServiceAccount, err)
|
||||
}
|
||||
tokenSource = cloudPlatformTokenSource
|
||||
opts = []option.ClientOption{
|
||||
option.WithUserAgent(userAgent),
|
||||
option.WithTokenSource(cloudPlatformTokenSource),
|
||||
}
|
||||
} else {
|
||||
// Use default credentials
|
||||
cred, err := google.FindDefaultCredentials(ctx, bigqueryapi.Scope)
|
||||
if err != nil {
|
||||
return nil, nil, nil, fmt.Errorf("failed to find default Google Cloud credentials with scope %q: %w", bigqueryapi.Scope, err)
|
||||
}
|
||||
tokenSource = cred.TokenSource
|
||||
opts = []option.ClientOption{
|
||||
option.WithUserAgent(userAgent),
|
||||
option.WithCredentials(cred),
|
||||
}
|
||||
}
|
||||
|
||||
// Initialize the high-level BigQuery client
|
||||
client, err := bigqueryapi.NewClient(ctx, project, option.WithUserAgent(userAgent), option.WithCredentials(cred))
|
||||
client, err := bigqueryapi.NewClient(ctx, project, opts...)
|
||||
if err != nil {
|
||||
return nil, nil, nil, fmt.Errorf("failed to create BigQuery client for project %q: %w", project, err)
|
||||
}
|
||||
client.Location = location
|
||||
|
||||
// Initialize the low-level BigQuery REST service using the same credentials
|
||||
restService, err := bigqueryrestapi.NewService(ctx, option.WithUserAgent(userAgent), option.WithCredentials(cred))
|
||||
restService, err := bigqueryrestapi.NewService(ctx, opts...)
|
||||
if err != nil {
|
||||
return nil, nil, nil, fmt.Errorf("failed to create BigQuery v2 service: %w", err)
|
||||
}
|
||||
|
||||
return client, restService, cred.TokenSource, nil
|
||||
return client, restService, tokenSource, nil
|
||||
}
|
||||
|
||||
// initBigQueryConnectionWithOAuthToken initialize a BigQuery client with an
|
||||
@@ -486,6 +531,7 @@ func initDataplexConnection(
|
||||
name string,
|
||||
project string,
|
||||
useClientOAuth bool,
|
||||
impersonateServiceAccount string,
|
||||
) (*dataplexapi.CatalogClient, DataplexClientCreator, error) {
|
||||
var client *dataplexapi.CatalogClient
|
||||
var clientCreator DataplexClientCreator
|
||||
@@ -494,11 +540,6 @@ func initDataplexConnection(
|
||||
ctx, span := sources.InitConnectionSpan(ctx, tracer, SourceKind, name)
|
||||
defer span.End()
|
||||
|
||||
cred, err := google.FindDefaultCredentials(ctx)
|
||||
if err != nil {
|
||||
return nil, nil, fmt.Errorf("failed to find default Google Cloud credentials: %w", err)
|
||||
}
|
||||
|
||||
userAgent, err := util.UserAgentFromContext(ctx)
|
||||
if err != nil {
|
||||
return nil, nil, err
|
||||
@@ -507,7 +548,34 @@ func initDataplexConnection(
|
||||
if useClientOAuth {
|
||||
clientCreator = newDataplexClientCreator(ctx, project, userAgent)
|
||||
} else {
|
||||
client, err = dataplexapi.NewCatalogClient(ctx, option.WithUserAgent(userAgent), option.WithCredentials(cred))
|
||||
var opts []option.ClientOption
|
||||
|
||||
if impersonateServiceAccount != "" {
|
||||
// Create impersonated credentials token source
|
||||
ts, err := impersonate.CredentialsTokenSource(ctx, impersonate.CredentialsConfig{
|
||||
TargetPrincipal: impersonateServiceAccount,
|
||||
Scopes: []string{"https://www.googleapis.com/auth/cloud-platform"},
|
||||
})
|
||||
if err != nil {
|
||||
return nil, nil, fmt.Errorf("failed to create impersonated credentials for %q: %w", impersonateServiceAccount, err)
|
||||
}
|
||||
opts = []option.ClientOption{
|
||||
option.WithUserAgent(userAgent),
|
||||
option.WithTokenSource(ts),
|
||||
}
|
||||
} else {
|
||||
// Use default credentials
|
||||
cred, err := google.FindDefaultCredentials(ctx)
|
||||
if err != nil {
|
||||
return nil, nil, fmt.Errorf("failed to find default Google Cloud credentials: %w", err)
|
||||
}
|
||||
opts = []option.ClientOption{
|
||||
option.WithUserAgent(userAgent),
|
||||
option.WithCredentials(cred),
|
||||
}
|
||||
}
|
||||
|
||||
client, err = dataplexapi.NewCatalogClient(ctx, opts...)
|
||||
if err != nil {
|
||||
return nil, nil, fmt.Errorf("failed to create Dataplex client for project %q: %w", project, err)
|
||||
}
|
||||
|
||||
@@ -110,6 +110,26 @@ func TestParseFromYamlBigQuery(t *testing.T) {
|
||||
},
|
||||
},
|
||||
},
|
||||
{
|
||||
desc: "with service account impersonation example",
|
||||
in: `
|
||||
sources:
|
||||
my-instance:
|
||||
kind: bigquery
|
||||
project: my-project
|
||||
location: us
|
||||
impersonateServiceAccount: service-account@my-project.iam.gserviceaccount.com
|
||||
`,
|
||||
want: server.SourceConfigs{
|
||||
"my-instance": bigquery.Config{
|
||||
Name: "my-instance",
|
||||
Kind: bigquery.SourceKind,
|
||||
Project: "my-project",
|
||||
Location: "us",
|
||||
ImpersonateServiceAccount: "service-account@my-project.iam.gserviceaccount.com",
|
||||
},
|
||||
},
|
||||
},
|
||||
}
|
||||
for _, tc := range tcs {
|
||||
t.Run(tc.desc, func(t *testing.T) {
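Taken together with the source changes above, service account impersonation becomes a pure configuration choice. A minimal sketch of a BigQuery source entry, mirroring the parse test and using a placeholder service account address, might look like this; note that `Initialize` rejects combining it with `useClientOAuth`:

```yaml
sources:
    my-bigquery-source:
        kind: bigquery
        project: my-project
        location: us
        # Placeholder address; the Toolbox identity typically needs the
        # Service Account Token Creator role on this account.
        impersonateServiceAccount: toolbox-runner@my-project.iam.gserviceaccount.com
```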
|
||||
|
||||
@@ -54,7 +54,7 @@ type Config struct {
|
||||
Project string `yaml:"project" validate:"required"`
|
||||
Region string `yaml:"region" validate:"required"`
|
||||
Instance string `yaml:"instance" validate:"required"`
|
||||
IPAddress string `yaml:"ipAddress" validate:"required"`
|
||||
IPAddress string `yaml:"ipAddress"` // Deprecated: kept for backwards compatibility
|
||||
IPType sources.IPType `yaml:"ipType" validate:"required"`
|
||||
User string `yaml:"user" validate:"required"`
|
||||
Password string `yaml:"password" validate:"required"`
|
||||
@@ -68,7 +68,7 @@ func (r Config) SourceConfigKind() string {
|
||||
|
||||
func (r Config) Initialize(ctx context.Context, tracer trace.Tracer) (sources.Source, error) {
|
||||
// Initializes a Cloud SQL MSSQL source
|
||||
db, err := initCloudSQLMssqlConnection(ctx, tracer, r.Name, r.Project, r.Region, r.Instance, r.IPAddress, r.IPType.String(), r.User, r.Password, r.Database)
|
||||
db, err := initCloudSQLMssqlConnection(ctx, tracer, r.Name, r.Project, r.Region, r.Instance, r.IPType.String(), r.User, r.Password, r.Database)
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("unable to create db connection: %w", err)
|
||||
}
|
||||
@@ -106,7 +106,7 @@ func (s *Source) MSSQLDB() *sql.DB {
|
||||
return s.Db
|
||||
}
|
||||
|
||||
func initCloudSQLMssqlConnection(ctx context.Context, tracer trace.Tracer, name, project, region, instance, ipAddress, ipType, user, pass, dbname string) (*sql.DB, error) {
|
||||
func initCloudSQLMssqlConnection(ctx context.Context, tracer trace.Tracer, name, project, region, instance, ipType, user, pass, dbname string) (*sql.DB, error) {
|
||||
//nolint:all // Reassigned ctx
|
||||
ctx, span := sources.InitConnectionSpan(ctx, tracer, SourceKind, name)
|
||||
defer span.End()
|
||||
@@ -125,7 +125,6 @@ func initCloudSQLMssqlConnection(ctx context.Context, tracer trace.Tracer, name,
|
||||
url := &url.URL{
|
||||
Scheme: "sqlserver",
|
||||
User: url.UserPassword(user, pass),
|
||||
Host: ipAddress,
|
||||
RawQuery: query.Encode(),
|
||||
}
|
||||
|
||||
|
||||
@@ -40,22 +40,20 @@ func TestParseFromYamlCloudSQLMssql(t *testing.T) {
|
||||
region: my-region
|
||||
instance: my-instance
|
||||
database: my_db
|
||||
ipAddress: localhost
|
||||
user: my_user
|
||||
password: my_pass
|
||||
`,
|
||||
want: server.SourceConfigs{
|
||||
"my-instance": cloudsqlmssql.Config{
|
||||
Name: "my-instance",
|
||||
Kind: cloudsqlmssql.SourceKind,
|
||||
Project: "my-project",
|
||||
Region: "my-region",
|
||||
Instance: "my-instance",
|
||||
IPAddress: "localhost",
|
||||
IPType: "public",
|
||||
Database: "my_db",
|
||||
User: "my_user",
|
||||
Password: "my_pass",
|
||||
Name: "my-instance",
|
||||
Kind: cloudsqlmssql.SourceKind,
|
||||
Project: "my-project",
|
||||
Region: "my-region",
|
||||
Instance: "my-instance",
|
||||
IPType: "public",
|
||||
Database: "my_db",
|
||||
User: "my_user",
|
||||
Password: "my_pass",
|
||||
},
|
||||
},
|
||||
},
|
||||
@@ -69,11 +67,38 @@ func TestParseFromYamlCloudSQLMssql(t *testing.T) {
|
||||
region: my-region
|
||||
instance: my-instance
|
||||
database: my_db
|
||||
ipAddress: localhost
|
||||
user: my_user
|
||||
password: my_pass
|
||||
ipType: psc
|
||||
`,
|
||||
want: server.SourceConfigs{
|
||||
"my-instance": cloudsqlmssql.Config{
|
||||
Name: "my-instance",
|
||||
Kind: cloudsqlmssql.SourceKind,
|
||||
Project: "my-project",
|
||||
Region: "my-region",
|
||||
Instance: "my-instance",
|
||||
IPType: "psc",
|
||||
Database: "my_db",
|
||||
User: "my_user",
|
||||
Password: "my_pass",
|
||||
},
|
||||
},
|
||||
},
|
||||
{
|
||||
desc: "with deprecated ipAddress",
|
||||
in: `
|
||||
sources:
|
||||
my-instance:
|
||||
kind: cloud-sql-mssql
|
||||
project: my-project
|
||||
region: my-region
|
||||
instance: my-instance
|
||||
ipAddress: random
|
||||
database: my_db
|
||||
user: my_user
|
||||
password: my_pass
|
||||
`,
|
||||
want: server.SourceConfigs{
|
||||
"my-instance": cloudsqlmssql.Config{
|
||||
Name: "my-instance",
|
||||
@@ -81,8 +106,8 @@ func TestParseFromYamlCloudSQLMssql(t *testing.T) {
|
||||
Project: "my-project",
|
||||
Region: "my-region",
|
||||
Instance: "my-instance",
|
||||
IPAddress: "localhost",
|
||||
IPType: "psc",
|
||||
IPAddress: "random",
|
||||
IPType: "public",
|
||||
Database: "my_db",
|
||||
User: "my_user",
|
||||
Password: "my_pass",
|
||||
@@ -125,7 +150,6 @@ func TestFailParseFromYaml(t *testing.T) {
|
||||
instance: my-instance
|
||||
ipType: fail
|
||||
database: my_db
|
||||
ipAddress: localhost
|
||||
user: my_user
|
||||
password: my_pass
|
||||
`,
|
||||
@@ -141,12 +165,11 @@ func TestFailParseFromYaml(t *testing.T) {
|
||||
region: my-region
|
||||
instance: my-instance
|
||||
database: my_db
|
||||
ipAddress: localhost
|
||||
user: my_user
|
||||
password: my_pass
|
||||
foo: bar
|
||||
`,
|
||||
err: "unable to parse source \"my-instance\" as \"cloud-sql-mssql\": [2:1] unknown field \"foo\"\n 1 | database: my_db\n> 2 | foo: bar\n ^\n 3 | instance: my-instance\n 4 | ipAddress: localhost\n 5 | kind: cloud-sql-mssql\n 6 | ",
|
||||
err: "unable to parse source \"my-instance\" as \"cloud-sql-mssql\": [2:1] unknown field \"foo\"\n 1 | database: my_db\n> 2 | foo: bar\n ^\n 3 | instance: my-instance\n 4 | kind: cloud-sql-mssql\n 5 | password: my_pass\n 6 | ",
|
||||
},
|
||||
{
|
||||
desc: "missing required field",
|
||||
@@ -157,7 +180,6 @@ func TestFailParseFromYaml(t *testing.T) {
|
||||
region: my-region
|
||||
instance: my-instance
|
||||
database: my_db
|
||||
ipAddress: localhost
|
||||
user: my_user
|
||||
password: my_pass
|
||||
`,
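With `ipAddress` now deprecated and no longer used to build the connection URL, the Cloud SQL Go connector dials the instance from its project, region, and instance name, so new configurations can omit the field. A minimal sketch with placeholder values:

```yaml
sources:
    my-mssql-instance:
        kind: cloud-sql-mssql
        project: my-project
        region: my-region
        instance: my-instance
        database: my_db
        user: my_user
        password: my_pass
        # ipAddress is deprecated and may be omitted; it is kept only for
        # backwards compatibility with existing configs.
        ipType: public
```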
|
||||
|
||||
@@ -25,6 +25,7 @@ import (
|
||||
"github.com/googleapis/genai-toolbox/internal/sources"
|
||||
bigqueryds "github.com/googleapis/genai-toolbox/internal/sources/bigquery"
|
||||
"github.com/googleapis/genai-toolbox/internal/tools"
|
||||
bqutil "github.com/googleapis/genai-toolbox/internal/tools/bigquery/bigquerycommon"
|
||||
bigqueryrestapi "google.golang.org/api/bigquery/v2"
|
||||
"google.golang.org/api/iterator"
|
||||
)
|
||||
@@ -50,6 +51,8 @@ type compatibleSource interface {
|
||||
BigQueryRestService() *bigqueryrestapi.Service
|
||||
BigQueryClientCreator() bigqueryds.BigqueryClientCreator
|
||||
UseClientAuthorization() bool
|
||||
IsDatasetAllowed(projectID, datasetID string) bool
|
||||
BigQueryAllowedDatasets() []string
|
||||
BigQuerySession() bigqueryds.BigQuerySessionProvider
|
||||
}
|
||||
|
||||
@@ -86,8 +89,17 @@ func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error)
|
||||
return nil, fmt.Errorf("invalid source for %q tool: source kind must be one of %q", kind, compatibleSources)
|
||||
}
|
||||
|
||||
inputDataParameter := tools.NewStringParameter("input_data",
|
||||
"The data that contain the test and control data to analyze. Can be a fully qualified BigQuery table ID or a SQL query.")
|
||||
allowedDatasets := s.BigQueryAllowedDatasets()
|
||||
inputDataDescription := "The data that contain the test and control data to analyze. Can be a fully qualified BigQuery table ID or a SQL query."
|
||||
if len(allowedDatasets) > 0 {
|
||||
datasetIDs := []string{}
|
||||
for _, ds := range allowedDatasets {
|
||||
datasetIDs = append(datasetIDs, fmt.Sprintf("`%s`", ds))
|
||||
}
|
||||
inputDataDescription += fmt.Sprintf(" The query or table must only access datasets from the following list: %s.", strings.Join(datasetIDs, ", "))
|
||||
}
|
||||
|
||||
inputDataParameter := tools.NewStringParameter("input_data", inputDataDescription)
|
||||
contributionMetricParameter := tools.NewStringParameter("contribution_metric",
|
||||
`The name of the column that contains the metric to analyze.
|
||||
Provides the expression to use to calculate the metric you are analyzing.
|
||||
@@ -123,17 +135,19 @@ func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error)
|
||||
|
||||
// finish tool setup
|
||||
t := Tool{
|
||||
Name: cfg.Name,
|
||||
Kind: kind,
|
||||
Parameters: parameters,
|
||||
AuthRequired: cfg.AuthRequired,
|
||||
UseClientOAuth: s.UseClientAuthorization(),
|
||||
ClientCreator: s.BigQueryClientCreator(),
|
||||
Client: s.BigQueryClient(),
|
||||
RestService: s.BigQueryRestService(),
|
||||
SessionProvider: s.BigQuerySession(),
|
||||
manifest: tools.Manifest{Description: cfg.Description, Parameters: parameters.Manifest(), AuthRequired: cfg.AuthRequired},
|
||||
mcpManifest: mcpManifest,
|
||||
Name: cfg.Name,
|
||||
Kind: kind,
|
||||
Parameters: parameters,
|
||||
AuthRequired: cfg.AuthRequired,
|
||||
UseClientOAuth: s.UseClientAuthorization(),
|
||||
ClientCreator: s.BigQueryClientCreator(),
|
||||
Client: s.BigQueryClient(),
|
||||
RestService: s.BigQueryRestService(),
|
||||
IsDatasetAllowed: s.IsDatasetAllowed,
|
||||
AllowedDatasets: allowedDatasets,
|
||||
SessionProvider: s.BigQuerySession(),
|
||||
manifest: tools.Manifest{Description: cfg.Description, Parameters: parameters.Manifest(), AuthRequired: cfg.AuthRequired},
|
||||
mcpManifest: mcpManifest,
|
||||
}
|
||||
return t, nil
|
||||
}
|
||||
@@ -148,12 +162,14 @@ type Tool struct {
|
||||
UseClientOAuth bool `yaml:"useClientOAuth"`
|
||||
Parameters tools.Parameters `yaml:"parameters"`
|
||||
|
||||
Client *bigqueryapi.Client
|
||||
RestService *bigqueryrestapi.Service
|
||||
ClientCreator bigqueryds.BigqueryClientCreator
|
||||
SessionProvider bigqueryds.BigQuerySessionProvider
|
||||
manifest tools.Manifest
|
||||
mcpManifest tools.McpManifest
|
||||
Client *bigqueryapi.Client
|
||||
RestService *bigqueryrestapi.Service
|
||||
ClientCreator bigqueryds.BigqueryClientCreator
|
||||
IsDatasetAllowed func(projectID, datasetID string) bool
|
||||
AllowedDatasets []string
|
||||
SessionProvider bigqueryds.BigQuerySessionProvider
|
||||
manifest tools.Manifest
|
||||
mcpManifest tools.McpManifest
|
||||
}
|
||||
|
||||
// Invoke runs the contribution analysis.
|
||||
@@ -164,6 +180,22 @@ func (t Tool) Invoke(ctx context.Context, params tools.ParamValues, accessToken
|
||||
return nil, fmt.Errorf("unable to cast input_data parameter %s", paramsMap["input_data"])
|
||||
}
|
||||
|
||||
bqClient := t.Client
|
||||
restService := t.RestService
|
||||
var err error
|
||||
|
||||
// Initialize new client if using user OAuth token
|
||||
if t.UseClientOAuth {
|
||||
tokenStr, err := accessToken.ParseBearerToken()
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("error parsing access token: %w", err)
|
||||
}
|
||||
bqClient, restService, err = t.ClientCreator(tokenStr, true)
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("error creating client from OAuth access token: %w", err)
|
||||
}
|
||||
}
|
||||
|
||||
modelID := fmt.Sprintf("contribution_analysis_model_%s", strings.ReplaceAll(uuid.New().String(), "-", ""))
|
||||
|
||||
var options []string
|
||||
@@ -196,8 +228,54 @@ func (t Tool) Invoke(ctx context.Context, params tools.ParamValues, accessToken
|
||||
var inputDataSource string
|
||||
trimmedUpperInputData := strings.TrimSpace(strings.ToUpper(inputData))
|
||||
if strings.HasPrefix(trimmedUpperInputData, "SELECT") || strings.HasPrefix(trimmedUpperInputData, "WITH") {
|
||||
if len(t.AllowedDatasets) > 0 {
|
||||
var connProps []*bigqueryapi.ConnectionProperty
|
||||
session, err := t.SessionProvider(ctx)
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("failed to get BigQuery session: %w", err)
|
||||
}
|
||||
if session != nil {
|
||||
connProps = []*bigqueryapi.ConnectionProperty{
|
||||
{Key: "session_id", Value: session.ID},
|
||||
}
|
||||
}
|
||||
dryRunJob, err := bqutil.DryRunQuery(ctx, restService, t.Client.Project(), t.Client.Location, inputData, nil, connProps)
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("query validation failed: %w", err)
|
||||
}
|
||||
statementType := dryRunJob.Statistics.Query.StatementType
|
||||
if statementType != "SELECT" {
|
||||
return nil, fmt.Errorf("the 'input_data' parameter only supports a table ID or a SELECT query. The provided query has statement type '%s'", statementType)
|
||||
}
|
||||
|
||||
queryStats := dryRunJob.Statistics.Query
|
||||
if queryStats != nil {
|
||||
for _, tableRef := range queryStats.ReferencedTables {
|
||||
if !t.IsDatasetAllowed(tableRef.ProjectId, tableRef.DatasetId) {
|
||||
return nil, fmt.Errorf("query in input_data accesses dataset '%s.%s', which is not in the allowed list", tableRef.ProjectId, tableRef.DatasetId)
|
||||
}
|
||||
}
|
||||
} else {
|
||||
return nil, fmt.Errorf("could not analyze query in input_data to validate against allowed datasets")
|
||||
}
|
||||
}
|
||||
inputDataSource = fmt.Sprintf("(%s)", inputData)
|
||||
} else {
|
||||
if len(t.AllowedDatasets) > 0 {
|
||||
parts := strings.Split(inputData, ".")
|
||||
var projectID, datasetID string
|
||||
switch len(parts) {
|
||||
case 3: // project.dataset.table
|
||||
projectID, datasetID = parts[0], parts[1]
|
||||
case 2: // dataset.table
|
||||
projectID, datasetID = t.Client.Project(), parts[0]
|
||||
default:
|
||||
return nil, fmt.Errorf("invalid table ID format for 'input_data': %q. Expected 'dataset.table' or 'project.dataset.table'", inputData)
|
||||
}
|
||||
if !t.IsDatasetAllowed(projectID, datasetID) {
|
||||
return nil, fmt.Errorf("access to dataset '%s.%s' (from table '%s') is not allowed", projectID, datasetID, inputData)
|
||||
}
|
||||
}
|
||||
inputDataSource = fmt.Sprintf("SELECT * FROM `%s`", inputData)
|
||||
}
|
||||
|
||||
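For illustration only, here is a minimal, self-contained sketch of the same table-ID gate that the hunk above adds inline. The standalone helper, its name, the map-based allowlist, and the sample values are assumptions made for the example, not part of the change itself.

package main

import (
    "fmt"
    "strings"
)

// isTableAllowed mirrors the inline check above: a bare table ID must look like
// "dataset.table" or "project.dataset.table", and its dataset must be on the allowlist.
func isTableAllowed(tableID, defaultProject string, allowed map[string]bool) error {
    parts := strings.Split(tableID, ".")
    var projectID, datasetID string
    switch len(parts) {
    case 3: // project.dataset.table
        projectID, datasetID = parts[0], parts[1]
    case 2: // dataset.table
        projectID, datasetID = defaultProject, parts[0]
    default:
        return fmt.Errorf("invalid table ID format: %q", tableID)
    }
    if !allowed[projectID+"."+datasetID] {
        return fmt.Errorf("access to dataset '%s.%s' is not allowed", projectID, datasetID)
    }
    return nil
}

func main() {
    allowed := map[string]bool{"my-project.sales": true}
    fmt.Println(isTableAllowed("sales.orders", "my-project", allowed))            // <nil>
    fmt.Println(isTableAllowed("other-project.hr.people", "my-project", allowed)) // not allowed
}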
@@ -209,21 +287,6 @@ func (t Tool) Invoke(ctx context.Context, params tools.ParamValues, accessToken
|
||||
inputDataSource,
|
||||
)
|
||||
|
||||
bqClient := t.Client
|
||||
var err error
|
||||
|
||||
// Initialize new client if using user OAuth token
|
||||
if t.UseClientOAuth {
|
||||
tokenStr, err := accessToken.ParseBearerToken()
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("error parsing access token: %w", err)
|
||||
}
|
||||
bqClient, _, err = t.ClientCreator(tokenStr, false)
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("error creating client from OAuth access token: %w", err)
|
||||
}
|
||||
}
|
||||
|
||||
createModelQuery := bqClient.Query(createModelSQL)
|
||||
|
||||
// Get session from provider if in protected mode.
|
||||
|
||||
@@ -309,7 +309,7 @@ func (t Tool) Invoke(ctx context.Context, params tools.ParamValues, accessToken
if err != nil {
return nil, fmt.Errorf("error getting logger: %s", err)
}
logger.DebugContext(ctx, "executing `%s` tool query: %s", kind, sql)
logger.DebugContext(ctx, fmt.Sprintf("executing `%s` tool query: %s", kind, sql))

// This block handles SELECT statements, which return a row set.
// We iterate through the results, convert each row into a map of
|
||||
|
||||
@@ -294,7 +294,7 @@ func (t Tool) Invoke(ctx context.Context, params tools.ParamValues, accessToken
if err != nil {
return nil, fmt.Errorf("error getting logger: %s", err)
}
logger.DebugContext(ctx, "executing `%s` tool query: %s", kind, sql)
logger.DebugContext(ctx, fmt.Sprintf("executing `%s` tool query: %s", kind, sql))

// This block handles SELECT statements, which return a row set.
// We iterate through the results, convert each row into a map of
|
||||
|
||||
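The logging hunks above all make the same change: the project logger's DebugContext takes an already-formatted message rather than printf-style arguments, so the message is built with fmt.Sprintf first. A minimal sketch of the call shape, using the standard library slog logger as a stand-in for the project logger (an assumption for the example):

package main

import (
    "context"
    "fmt"
    "log/slog"
)

func main() {
    ctx := context.Background()
    logger := slog.Default()
    kind, sql := "bigquery-sql", "SELECT 1"
    // Format the message before handing it to the logger, matching the
    // fmt.Sprintf pattern introduced in the hunks above.
    logger.DebugContext(ctx, fmt.Sprintf("executing `%s` tool query: %s", kind, sql))
}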
@@ -23,6 +23,7 @@ import (
|
||||
"github.com/googleapis/genai-toolbox/internal/sources"
|
||||
bigqueryds "github.com/googleapis/genai-toolbox/internal/sources/bigquery"
|
||||
"github.com/googleapis/genai-toolbox/internal/tools"
|
||||
bqutil "github.com/googleapis/genai-toolbox/internal/tools/bigquery/bigquerycommon"
|
||||
)
|
||||
|
||||
const kind string = "bigquery-get-dataset-info"
|
||||
@@ -48,6 +49,8 @@ type compatibleSource interface {
|
||||
BigQueryClient() *bigqueryapi.Client
|
||||
BigQueryClientCreator() bigqueryds.BigqueryClientCreator
|
||||
UseClientAuthorization() bool
|
||||
IsDatasetAllowed(projectID, datasetID string) bool
|
||||
BigQueryAllowedDatasets() []string
|
||||
}
|
||||
|
||||
// validate compatible sources are still compatible
|
||||
@@ -83,23 +86,33 @@ func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error)
|
||||
return nil, fmt.Errorf("invalid source for %q tool: source kind must be one of %q", kind, compatibleSources)
|
||||
}
|
||||
|
||||
projectParameter := tools.NewStringParameterWithDefault(projectKey, s.BigQueryProject(), "The Google Cloud project ID containing the dataset.")
|
||||
datasetParameter := tools.NewStringParameter(datasetKey, "The dataset to get metadata information.")
|
||||
defaultProjectID := s.BigQueryProject()
|
||||
projectDescription := "The Google Cloud project ID containing the dataset."
|
||||
datasetDescription := "The dataset to get metadata information. Can be in `project.dataset` format."
|
||||
var datasetParameter tools.Parameter
|
||||
var projectParameter tools.Parameter
|
||||
|
||||
projectParameter, datasetParameter = bqutil.InitializeDatasetParameters(
|
||||
s.BigQueryAllowedDatasets(),
|
||||
defaultProjectID,
|
||||
projectKey, datasetKey,
|
||||
projectDescription, datasetDescription)
|
||||
parameters := tools.Parameters{projectParameter, datasetParameter}
|
||||
|
||||
mcpManifest := tools.GetMcpManifest(cfg.Name, cfg.Description, cfg.AuthRequired, parameters)
|
||||
|
||||
// finish tool setup
|
||||
t := Tool{
|
||||
Name: cfg.Name,
|
||||
Kind: kind,
|
||||
Parameters: parameters,
|
||||
AuthRequired: cfg.AuthRequired,
|
||||
UseClientOAuth: s.UseClientAuthorization(),
|
||||
ClientCreator: s.BigQueryClientCreator(),
|
||||
Client: s.BigQueryClient(),
|
||||
manifest: tools.Manifest{Description: cfg.Description, Parameters: parameters.Manifest(), AuthRequired: cfg.AuthRequired},
|
||||
mcpManifest: mcpManifest,
|
||||
Name: cfg.Name,
|
||||
Kind: kind,
|
||||
Parameters: parameters,
|
||||
AuthRequired: cfg.AuthRequired,
|
||||
UseClientOAuth: s.UseClientAuthorization(),
|
||||
ClientCreator: s.BigQueryClientCreator(),
|
||||
Client: s.BigQueryClient(),
|
||||
IsDatasetAllowed: s.IsDatasetAllowed,
|
||||
manifest: tools.Manifest{Description: cfg.Description, Parameters: parameters.Manifest(), AuthRequired: cfg.AuthRequired},
|
||||
mcpManifest: mcpManifest,
|
||||
}
|
||||
return t, nil
|
||||
}
|
||||
@@ -114,11 +127,12 @@ type Tool struct {
|
||||
UseClientOAuth bool `yaml:"useClientOAuth"`
|
||||
Parameters tools.Parameters `yaml:"parameters"`
|
||||
|
||||
Client *bigqueryapi.Client
|
||||
ClientCreator bigqueryds.BigqueryClientCreator
|
||||
Statement string
|
||||
manifest tools.Manifest
|
||||
mcpManifest tools.McpManifest
|
||||
Client *bigqueryapi.Client
|
||||
ClientCreator bigqueryds.BigqueryClientCreator
|
||||
Statement string
|
||||
IsDatasetAllowed func(projectID, datasetID string) bool
|
||||
manifest tools.Manifest
|
||||
mcpManifest tools.McpManifest
|
||||
}
|
||||
|
||||
func (t Tool) Invoke(ctx context.Context, params tools.ParamValues, accessToken tools.AccessToken) (any, error) {
|
||||
@@ -147,11 +161,16 @@ func (t Tool) Invoke(ctx context.Context, params tools.ParamValues, accessToken
return nil, fmt.Errorf("error creating client from OAuth access token: %w", err)
}
}

if !t.IsDatasetAllowed(projectId, datasetId) {
return nil, fmt.Errorf("access denied to dataset '%s' because it is not in the configured list of allowed datasets for project '%s'", datasetId, projectId)
}

dsHandle := bqClient.DatasetInProject(projectId, datasetId)

metadata, err := dsHandle.Metadata(ctx)
if err != nil {
return nil, fmt.Errorf("failed to get metadata for dataset %s (in project %s): %w", datasetId, bqClient.Project(), err)
return nil, fmt.Errorf("failed to get metadata for dataset %s (in project %s): %w", datasetId, projectId, err)
}

return metadata, nil
|
||||
|
||||
@@ -116,7 +116,7 @@ func (t *Tool) Invoke(ctx context.Context, params tools.ParamValues, accessToken
if err != nil {
return nil, fmt.Errorf("error getting logger: %s", err)
}
logger.DebugContext(ctx, "executing `%s` tool query: %s", kind, sql)
logger.DebugContext(ctx, fmt.Sprintf("executing `%s` tool query: %s", kind, sql))

rows, err := t.Db.QueryContext(ctx, sql)
if err != nil {
|
||||
|
||||
@@ -0,0 +1,150 @@
|
||||
// Copyright 2025 Google LLC
|
||||
//
|
||||
// Licensed under the Apache License, Version 2.0 (the "License");
|
||||
// you may not use this file except in compliance with the License.
|
||||
// You may obtain a copy of the License at
|
||||
//
|
||||
// http://www.apache.org/licenses/LICENSE-2.0
|
||||
//
|
||||
// Unless required by applicable law or agreed to in writing, software
|
||||
// distributed under the License is distributed on an "AS IS" BASIS,
|
||||
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
// See the License for the specific language governing permissions and
|
||||
// limitations under the License.
|
||||
package lookergetconnectiondatabases
|
||||
|
||||
import (
|
||||
"context"
|
||||
"fmt"
|
||||
|
||||
yaml "github.com/goccy/go-yaml"
|
||||
"github.com/googleapis/genai-toolbox/internal/sources"
|
||||
lookersrc "github.com/googleapis/genai-toolbox/internal/sources/looker"
|
||||
"github.com/googleapis/genai-toolbox/internal/tools"
|
||||
"github.com/googleapis/genai-toolbox/internal/tools/looker/lookercommon"
|
||||
|
||||
"github.com/looker-open-source/sdk-codegen/go/rtl"
|
||||
v4 "github.com/looker-open-source/sdk-codegen/go/sdk/v4"
|
||||
)
|
||||
|
||||
const kind string = "looker-get-connection-databases"
|
||||
|
||||
func init() {
|
||||
if !tools.Register(kind, newConfig) {
|
||||
panic(fmt.Sprintf("tool kind %q already registered", kind))
|
||||
}
|
||||
}
|
||||
|
||||
func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.ToolConfig, error) {
|
||||
actual := Config{Name: name}
|
||||
if err := decoder.DecodeContext(ctx, &actual); err != nil {
|
||||
return nil, err
|
||||
}
|
||||
return actual, nil
|
||||
}
|
||||
|
||||
type Config struct {
|
||||
Name string `yaml:"name" validate:"required"`
|
||||
Kind string `yaml:"kind" validate:"required"`
|
||||
Source string `yaml:"source" validate:"required"`
|
||||
Description string `yaml:"description" validate:"required"`
|
||||
AuthRequired []string `yaml:"authRequired"`
|
||||
}
|
||||
|
||||
// validate interface
|
||||
var _ tools.ToolConfig = Config{}
|
||||
|
||||
func (cfg Config) ToolConfigKind() string {
|
||||
return kind
|
||||
}
|
||||
|
||||
func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error) {
|
||||
// verify source exists
|
||||
rawS, ok := srcs[cfg.Source]
|
||||
if !ok {
|
||||
return nil, fmt.Errorf("no source named %q configured", cfg.Source)
|
||||
}
|
||||
|
||||
// verify the source is compatible
|
||||
s, ok := rawS.(*lookersrc.Source)
|
||||
if !ok {
|
||||
return nil, fmt.Errorf("invalid source for %q tool: source kind must be `looker`", kind)
|
||||
}
|
||||
|
||||
connParameter := tools.NewStringParameter("conn", "The connection containing the databases.")
|
||||
parameters := tools.Parameters{connParameter}
|
||||
|
||||
mcpManifest := tools.GetMcpManifest(cfg.Name, cfg.Description, cfg.AuthRequired, parameters)
|
||||
|
||||
// finish tool setup
|
||||
return Tool{
|
||||
Name: cfg.Name,
|
||||
Kind: kind,
|
||||
Parameters: parameters,
|
||||
AuthRequired: cfg.AuthRequired,
|
||||
UseClientOAuth: s.UseClientOAuth,
|
||||
Client: s.Client,
|
||||
ApiSettings: s.ApiSettings,
|
||||
manifest: tools.Manifest{
|
||||
Description: cfg.Description,
|
||||
Parameters: parameters.Manifest(),
|
||||
AuthRequired: cfg.AuthRequired,
|
||||
},
|
||||
mcpManifest: mcpManifest,
|
||||
}, nil
|
||||
}
|
||||
|
||||
// validate interface
|
||||
var _ tools.Tool = Tool{}
|
||||
|
||||
type Tool struct {
|
||||
Name string `yaml:"name"`
|
||||
Kind string `yaml:"kind"`
|
||||
UseClientOAuth bool
|
||||
Client *v4.LookerSDK
|
||||
ApiSettings *rtl.ApiSettings
|
||||
AuthRequired []string `yaml:"authRequired"`
|
||||
Parameters tools.Parameters `yaml:"parameters"`
|
||||
manifest tools.Manifest
|
||||
mcpManifest tools.McpManifest
|
||||
}
|
||||
|
||||
func (t Tool) Invoke(ctx context.Context, params tools.ParamValues, accessToken tools.AccessToken) (any, error) {
|
||||
mapParams := params.AsMap()
|
||||
conn, ok := mapParams["conn"].(string)
|
||||
if !ok {
|
||||
return nil, fmt.Errorf("'conn' must be a string, got %T", mapParams["conn"])
|
||||
}
|
||||
|
||||
sdk, err := lookercommon.GetLookerSDK(t.UseClientOAuth, t.ApiSettings, t.Client, accessToken)
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("error getting sdk: %w", err)
|
||||
}
|
||||
resp, err := sdk.ConnectionDatabases(conn, t.ApiSettings)
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("error making get_connection_databases request: %s", err)
|
||||
}
|
||||
//logger.DebugContext(ctx, "Got response of %v\n", resp)
|
||||
|
||||
return resp, nil
|
||||
}
|
||||
|
||||
func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (tools.ParamValues, error) {
|
||||
return tools.ParseParams(t.Parameters, data, claims)
|
||||
}
|
||||
|
||||
func (t Tool) Manifest() tools.Manifest {
|
||||
return t.manifest
|
||||
}
|
||||
|
||||
func (t Tool) McpManifest() tools.McpManifest {
|
||||
return t.mcpManifest
|
||||
}
|
||||
|
||||
func (t Tool) Authorized(verifiedAuthServices []string) bool {
|
||||
return tools.IsAuthorized(t.AuthRequired, verifiedAuthServices)
|
||||
}
|
||||
|
||||
func (t Tool) RequiresClientAuthorization() bool {
|
||||
return t.UseClientOAuth
|
||||
}
|
||||
@@ -0,0 +1,116 @@
|
||||
// Copyright 2025 Google LLC
|
||||
//
|
||||
// Licensed under the Apache License, Version 2.0 (the "License");
|
||||
// you may not use this file except in compliance with the License.
|
||||
// You may obtain a copy of the License at
|
||||
//
|
||||
// http://www.apache.org/licenses/LICENSE-2.0
|
||||
//
|
||||
// Unless required by applicable law or agreed to in writing, software
|
||||
// distributed under the License is distributed on an "AS IS" BASIS,
|
||||
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
// See the License for the specific language governing permissions and
|
||||
// limitations under the License.
|
||||
|
||||
package lookergetconnectiondatabases_test
|
||||
|
||||
import (
|
||||
"strings"
|
||||
"testing"
|
||||
|
||||
yaml "github.com/goccy/go-yaml"
|
||||
"github.com/google/go-cmp/cmp"
|
||||
"github.com/googleapis/genai-toolbox/internal/server"
|
||||
"github.com/googleapis/genai-toolbox/internal/testutils"
|
||||
lkr "github.com/googleapis/genai-toolbox/internal/tools/looker/lookergetconnectiondatabases"
|
||||
)
|
||||
|
||||
func TestParseFromYamlLookerGetConnectionDatabases(t *testing.T) {
|
||||
ctx, err := testutils.ContextWithNewLogger()
|
||||
if err != nil {
|
||||
t.Fatalf("unexpected error: %s", err)
|
||||
}
|
||||
tcs := []struct {
|
||||
desc string
|
||||
in string
|
||||
want server.ToolConfigs
|
||||
}{
|
||||
{
|
||||
desc: "basic example",
|
||||
in: `
|
||||
tools:
|
||||
example_tool:
|
||||
kind: looker-get-connection-databases
|
||||
source: my-instance
|
||||
description: some description
|
||||
`,
|
||||
want: server.ToolConfigs{
|
||||
"example_tool": lkr.Config{
|
||||
Name: "example_tool",
|
||||
Kind: "looker-get-connection-databases",
|
||||
Source: "my-instance",
|
||||
Description: "some description",
|
||||
AuthRequired: []string{},
|
||||
},
|
||||
},
|
||||
},
|
||||
}
|
||||
for _, tc := range tcs {
|
||||
t.Run(tc.desc, func(t *testing.T) {
|
||||
got := struct {
|
||||
Tools server.ToolConfigs `yaml:"tools"`
|
||||
}{}
|
||||
// Parse contents
|
||||
err := yaml.UnmarshalContext(ctx, testutils.FormatYaml(tc.in), &got)
|
||||
if err != nil {
|
||||
t.Fatalf("unable to unmarshal: %s", err)
|
||||
}
|
||||
if diff := cmp.Diff(tc.want, got.Tools); diff != "" {
|
||||
t.Fatalf("incorrect parse: diff %v", diff)
|
||||
}
|
||||
})
|
||||
}
|
||||
|
||||
}
|
||||
|
||||
func TestFailParseFromYamlLookerGetConnectionDatabases(t *testing.T) {
|
||||
ctx, err := testutils.ContextWithNewLogger()
|
||||
if err != nil {
|
||||
t.Fatalf("unexpected error: %s", err)
|
||||
}
|
||||
tcs := []struct {
|
||||
desc string
|
||||
in string
|
||||
err string
|
||||
}{
|
||||
{
|
||||
desc: "Invalid method",
|
||||
in: `
|
||||
tools:
|
||||
example_tool:
|
||||
kind: looker-get-connection-databases
|
||||
source: my-instance
|
||||
method: GOT
|
||||
description: some description
|
||||
`,
|
||||
err: "unable to parse tool \"example_tool\" as kind \"looker-get-connection-databases\": [4:1] unknown field \"method\"\n 1 | authRequired: []\n 2 | description: some description\n 3 | kind: looker-get-connection-databases\n> 4 | method: GOT\n ^\n 5 | source: my-instance",
|
||||
},
|
||||
}
|
||||
for _, tc := range tcs {
|
||||
t.Run(tc.desc, func(t *testing.T) {
|
||||
got := struct {
|
||||
Tools server.ToolConfigs `yaml:"tools"`
|
||||
}{}
|
||||
// Parse contents
|
||||
err := yaml.UnmarshalContext(ctx, testutils.FormatYaml(tc.in), &got)
|
||||
if err == nil {
|
||||
t.Fatalf("expect parsing to fail")
|
||||
}
|
||||
errStr := err.Error()
|
||||
if !strings.Contains(errStr, tc.err) {
|
||||
t.Fatalf("unexpected error string: got %q, want substring %q", errStr, tc.err)
|
||||
}
|
||||
})
|
||||
}
|
||||
|
||||
}
|
||||
@@ -0,0 +1,169 @@
|
||||
// Copyright 2025 Google LLC
|
||||
//
|
||||
// Licensed under the Apache License, Version 2.0 (the "License");
|
||||
// you may not use this file except in compliance with the License.
|
||||
// You may obtain a copy of the License at
|
||||
//
|
||||
// http://www.apache.org/licenses/LICENSE-2.0
|
||||
//
|
||||
// Unless required by applicable law or agreed to in writing, software
|
||||
// distributed under the License is distributed on an "AS IS" BASIS,
|
||||
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
// See the License for the specific language governing permissions and
|
||||
// limitations under the License.
|
||||
package lookergetconnections
|
||||
|
||||
import (
|
||||
"context"
|
||||
"fmt"
|
||||
|
||||
yaml "github.com/goccy/go-yaml"
|
||||
"github.com/googleapis/genai-toolbox/internal/sources"
|
||||
lookersrc "github.com/googleapis/genai-toolbox/internal/sources/looker"
|
||||
"github.com/googleapis/genai-toolbox/internal/tools"
|
||||
"github.com/googleapis/genai-toolbox/internal/tools/looker/lookercommon"
|
||||
"github.com/googleapis/genai-toolbox/internal/util"
|
||||
|
||||
"github.com/looker-open-source/sdk-codegen/go/rtl"
|
||||
v4 "github.com/looker-open-source/sdk-codegen/go/sdk/v4"
|
||||
)
|
||||
|
||||
const kind string = "looker-get-connections"
|
||||
|
||||
func init() {
|
||||
if !tools.Register(kind, newConfig) {
|
||||
panic(fmt.Sprintf("tool kind %q already registered", kind))
|
||||
}
|
||||
}
|
||||
|
||||
func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.ToolConfig, error) {
|
||||
actual := Config{Name: name}
|
||||
if err := decoder.DecodeContext(ctx, &actual); err != nil {
|
||||
return nil, err
|
||||
}
|
||||
return actual, nil
|
||||
}
|
||||
|
||||
type Config struct {
|
||||
Name string `yaml:"name" validate:"required"`
|
||||
Kind string `yaml:"kind" validate:"required"`
|
||||
Source string `yaml:"source" validate:"required"`
|
||||
Description string `yaml:"description" validate:"required"`
|
||||
AuthRequired []string `yaml:"authRequired"`
|
||||
}
|
||||
|
||||
// validate interface
|
||||
var _ tools.ToolConfig = Config{}
|
||||
|
||||
func (cfg Config) ToolConfigKind() string {
|
||||
return kind
|
||||
}
|
||||
|
||||
func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error) {
|
||||
// verify source exists
|
||||
rawS, ok := srcs[cfg.Source]
|
||||
if !ok {
|
||||
return nil, fmt.Errorf("no source named %q configured", cfg.Source)
|
||||
}
|
||||
|
||||
// verify the source is compatible
|
||||
s, ok := rawS.(*lookersrc.Source)
|
||||
if !ok {
|
||||
return nil, fmt.Errorf("invalid source for %q tool: source kind must be `looker`", kind)
|
||||
}
|
||||
|
||||
parameters := tools.Parameters{}
|
||||
|
||||
mcpManifest := tools.GetMcpManifest(cfg.Name, cfg.Description, cfg.AuthRequired, parameters)
|
||||
|
||||
// finish tool setup
|
||||
return Tool{
|
||||
Name: cfg.Name,
|
||||
Kind: kind,
|
||||
Parameters: parameters,
|
||||
AuthRequired: cfg.AuthRequired,
|
||||
UseClientOAuth: s.UseClientOAuth,
|
||||
Client: s.Client,
|
||||
ApiSettings: s.ApiSettings,
|
||||
manifest: tools.Manifest{
|
||||
Description: cfg.Description,
|
||||
Parameters: parameters.Manifest(),
|
||||
AuthRequired: cfg.AuthRequired,
|
||||
},
|
||||
mcpManifest: mcpManifest,
|
||||
}, nil
|
||||
}
|
||||
|
||||
// validate interface
|
||||
var _ tools.Tool = Tool{}
|
||||
|
||||
type Tool struct {
|
||||
Name string `yaml:"name"`
|
||||
Kind string `yaml:"kind"`
|
||||
UseClientOAuth bool
|
||||
Client *v4.LookerSDK
|
||||
ApiSettings *rtl.ApiSettings
|
||||
AuthRequired []string `yaml:"authRequired"`
|
||||
Parameters tools.Parameters `yaml:"parameters"`
|
||||
manifest tools.Manifest
|
||||
mcpManifest tools.McpManifest
|
||||
}
|
||||
|
||||
func (t Tool) Invoke(ctx context.Context, params tools.ParamValues, accessToken tools.AccessToken) (any, error) {
|
||||
logger, err := util.LoggerFromContext(ctx)
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("unable to get logger from ctx: %s", err)
|
||||
}
|
||||
|
||||
sdk, err := lookercommon.GetLookerSDK(t.UseClientOAuth, t.ApiSettings, t.Client, accessToken)
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("error getting sdk: %w", err)
|
||||
}
|
||||
resp, err := sdk.AllConnections("name, dialect(name), database, schema", t.ApiSettings)
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("error making get_connections request: %s", err)
|
||||
}
|
||||
|
||||
var data []any
|
||||
for _, v := range resp {
|
||||
vMap := make(map[string]any)
|
||||
vMap["name"] = *v.Name
|
||||
vMap["dialect_name"] = *v.Dialect.Name
|
||||
if v.Database != nil {
|
||||
vMap["database"] = *v.Database
|
||||
}
|
||||
if v.Schema != nil {
|
||||
vMap["schema"] = *v.Schema
|
||||
}
|
||||
conn, err := sdk.ConnectionFeatures(*v.Name, "multiple_databases", t.ApiSettings)
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("error making get_connection_features request: %s", err)
|
||||
}
|
||||
vMap["supports_multiple_databases"] = *conn.MultipleDatabases
|
||||
logger.DebugContext(ctx, "Converted to %v\n", vMap)
|
||||
data = append(data, vMap)
|
||||
}
|
||||
logger.DebugContext(ctx, "data = ", data)
|
||||
|
||||
return data, nil
|
||||
}
|
||||
|
||||
func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (tools.ParamValues, error) {
|
||||
return tools.ParamValues{}, nil
|
||||
}
|
||||
|
||||
func (t Tool) Manifest() tools.Manifest {
|
||||
return t.manifest
|
||||
}
|
||||
|
||||
func (t Tool) McpManifest() tools.McpManifest {
|
||||
return t.mcpManifest
|
||||
}
|
||||
|
||||
func (t Tool) Authorized(verifiedAuthServices []string) bool {
|
||||
return tools.IsAuthorized(t.AuthRequired, verifiedAuthServices)
|
||||
}
|
||||
|
||||
func (t Tool) RequiresClientAuthorization() bool {
|
||||
return t.UseClientOAuth
|
||||
}
|
||||
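For reference, one element of the slice assembled by the loop above has the shape sketched below; the keys come from the implementation, while the values here are made up for illustration.

package main

import "fmt"

func main() {
    // Illustrative entry from looker-get-connections; only the keys are taken
    // from the code above, the values are invented.
    row := map[string]any{
        "name":                        "thelook",
        "dialect_name":                "bigquery_standard_sql",
        "database":                    "looker-private-demo",
        "schema":                      "ecomm",
        "supports_multiple_databases": false,
    }
    fmt.Println(row)
}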
@@ -0,0 +1,116 @@
|
||||
// Copyright 2025 Google LLC
|
||||
//
|
||||
// Licensed under the Apache License, Version 2.0 (the "License");
|
||||
// you may not use this file except in compliance with the License.
|
||||
// You may obtain a copy of the License at
|
||||
//
|
||||
// http://www.apache.org/licenses/LICENSE-2.0
|
||||
//
|
||||
// Unless required by applicable law or agreed to in writing, software
|
||||
// distributed under the License is distributed on an "AS IS" BASIS,
|
||||
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
// See the License for the specific language governing permissions and
|
||||
// limitations under the License.
|
||||
|
||||
package lookergetconnections_test
|
||||
|
||||
import (
|
||||
"strings"
|
||||
"testing"
|
||||
|
||||
yaml "github.com/goccy/go-yaml"
|
||||
"github.com/google/go-cmp/cmp"
|
||||
"github.com/googleapis/genai-toolbox/internal/server"
|
||||
"github.com/googleapis/genai-toolbox/internal/testutils"
|
||||
lkr "github.com/googleapis/genai-toolbox/internal/tools/looker/lookergetconnections"
|
||||
)
|
||||
|
||||
func TestParseFromYamlLookerGetConnections(t *testing.T) {
|
||||
ctx, err := testutils.ContextWithNewLogger()
|
||||
if err != nil {
|
||||
t.Fatalf("unexpected error: %s", err)
|
||||
}
|
||||
tcs := []struct {
|
||||
desc string
|
||||
in string
|
||||
want server.ToolConfigs
|
||||
}{
|
||||
{
|
||||
desc: "basic example",
|
||||
in: `
|
||||
tools:
|
||||
example_tool:
|
||||
kind: looker-get-connections
|
||||
source: my-instance
|
||||
description: some description
|
||||
`,
|
||||
want: server.ToolConfigs{
|
||||
"example_tool": lkr.Config{
|
||||
Name: "example_tool",
|
||||
Kind: "looker-get-connections",
|
||||
Source: "my-instance",
|
||||
Description: "some description",
|
||||
AuthRequired: []string{},
|
||||
},
|
||||
},
|
||||
},
|
||||
}
|
||||
for _, tc := range tcs {
|
||||
t.Run(tc.desc, func(t *testing.T) {
|
||||
got := struct {
|
||||
Tools server.ToolConfigs `yaml:"tools"`
|
||||
}{}
|
||||
// Parse contents
|
||||
err := yaml.UnmarshalContext(ctx, testutils.FormatYaml(tc.in), &got)
|
||||
if err != nil {
|
||||
t.Fatalf("unable to unmarshal: %s", err)
|
||||
}
|
||||
if diff := cmp.Diff(tc.want, got.Tools); diff != "" {
|
||||
t.Fatalf("incorrect parse: diff %v", diff)
|
||||
}
|
||||
})
|
||||
}
|
||||
|
||||
}
|
||||
|
||||
func TestFailParseFromYamlLookerGetConnections(t *testing.T) {
|
||||
ctx, err := testutils.ContextWithNewLogger()
|
||||
if err != nil {
|
||||
t.Fatalf("unexpected error: %s", err)
|
||||
}
|
||||
tcs := []struct {
|
||||
desc string
|
||||
in string
|
||||
err string
|
||||
}{
|
||||
{
|
||||
desc: "Invalid method",
|
||||
in: `
|
||||
tools:
|
||||
example_tool:
|
||||
kind: looker-get-connections
|
||||
source: my-instance
|
||||
method: GOT
|
||||
description: some description
|
||||
`,
|
||||
err: "unable to parse tool \"example_tool\" as kind \"looker-get-connections\": [4:1] unknown field \"method\"\n 1 | authRequired: []\n 2 | description: some description\n 3 | kind: looker-get-connections\n> 4 | method: GOT\n ^\n 5 | source: my-instance",
|
||||
},
|
||||
}
|
||||
for _, tc := range tcs {
|
||||
t.Run(tc.desc, func(t *testing.T) {
|
||||
got := struct {
|
||||
Tools server.ToolConfigs `yaml:"tools"`
|
||||
}{}
|
||||
// Parse contents
|
||||
err := yaml.UnmarshalContext(ctx, testutils.FormatYaml(tc.in), &got)
|
||||
if err == nil {
|
||||
t.Fatalf("expect parsing to fail")
|
||||
}
|
||||
errStr := err.Error()
|
||||
if !strings.Contains(errStr, tc.err) {
|
||||
t.Fatalf("unexpected error string: got %q, want substring %q", errStr, tc.err)
|
||||
}
|
||||
})
|
||||
}
|
||||
|
||||
}
|
||||
@@ -0,0 +1,156 @@
|
||||
// Copyright 2025 Google LLC
|
||||
//
|
||||
// Licensed under the Apache License, Version 2.0 (the "License");
|
||||
// you may not use this file except in compliance with the License.
|
||||
// You may obtain a copy of the License at
|
||||
//
|
||||
// http://www.apache.org/licenses/LICENSE-2.0
|
||||
//
|
||||
// Unless required by applicable law or agreed to in writing, software
|
||||
// distributed under the License is distributed on an "AS IS" BASIS,
|
||||
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
// See the License for the specific language governing permissions and
|
||||
// limitations under the License.
|
||||
package lookergetconnectionschemas
|
||||
|
||||
import (
|
||||
"context"
|
||||
"fmt"
|
||||
|
||||
yaml "github.com/goccy/go-yaml"
|
||||
"github.com/googleapis/genai-toolbox/internal/sources"
|
||||
lookersrc "github.com/googleapis/genai-toolbox/internal/sources/looker"
|
||||
"github.com/googleapis/genai-toolbox/internal/tools"
|
||||
"github.com/googleapis/genai-toolbox/internal/tools/looker/lookercommon"
|
||||
|
||||
"github.com/looker-open-source/sdk-codegen/go/rtl"
|
||||
v4 "github.com/looker-open-source/sdk-codegen/go/sdk/v4"
|
||||
)
|
||||
|
||||
const kind string = "looker-get-connection-schemas"
|
||||
|
||||
func init() {
|
||||
if !tools.Register(kind, newConfig) {
|
||||
panic(fmt.Sprintf("tool kind %q already registered", kind))
|
||||
}
|
||||
}
|
||||
|
||||
func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.ToolConfig, error) {
|
||||
actual := Config{Name: name}
|
||||
if err := decoder.DecodeContext(ctx, &actual); err != nil {
|
||||
return nil, err
|
||||
}
|
||||
return actual, nil
|
||||
}
|
||||
|
||||
type Config struct {
|
||||
Name string `yaml:"name" validate:"required"`
|
||||
Kind string `yaml:"kind" validate:"required"`
|
||||
Source string `yaml:"source" validate:"required"`
|
||||
Description string `yaml:"description" validate:"required"`
|
||||
AuthRequired []string `yaml:"authRequired"`
|
||||
}
|
||||
|
||||
// validate interface
|
||||
var _ tools.ToolConfig = Config{}
|
||||
|
||||
func (cfg Config) ToolConfigKind() string {
|
||||
return kind
|
||||
}
|
||||
|
||||
func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error) {
|
||||
// verify source exists
|
||||
rawS, ok := srcs[cfg.Source]
|
||||
if !ok {
|
||||
return nil, fmt.Errorf("no source named %q configured", cfg.Source)
|
||||
}
|
||||
|
||||
// verify the source is compatible
|
||||
s, ok := rawS.(*lookersrc.Source)
|
||||
if !ok {
|
||||
return nil, fmt.Errorf("invalid source for %q tool: source kind must be `looker`", kind)
|
||||
}
|
||||
|
||||
connParameter := tools.NewStringParameter("conn", "The connection containing the schemas.")
|
||||
dbParameter := tools.NewStringParameterWithRequired("db", "The optional database to search", false)
|
||||
parameters := tools.Parameters{connParameter, dbParameter}
|
||||
|
||||
mcpManifest := tools.GetMcpManifest(cfg.Name, cfg.Description, cfg.AuthRequired, parameters)
|
||||
|
||||
// finish tool setup
|
||||
return Tool{
|
||||
Name: cfg.Name,
|
||||
Kind: kind,
|
||||
Parameters: parameters,
|
||||
AuthRequired: cfg.AuthRequired,
|
||||
UseClientOAuth: s.UseClientOAuth,
|
||||
Client: s.Client,
|
||||
ApiSettings: s.ApiSettings,
|
||||
manifest: tools.Manifest{
|
||||
Description: cfg.Description,
|
||||
Parameters: parameters.Manifest(),
|
||||
AuthRequired: cfg.AuthRequired,
|
||||
},
|
||||
mcpManifest: mcpManifest,
|
||||
}, nil
|
||||
}
|
||||
|
||||
// validate interface
|
||||
var _ tools.Tool = Tool{}
|
||||
|
||||
type Tool struct {
|
||||
Name string `yaml:"name"`
|
||||
Kind string `yaml:"kind"`
|
||||
UseClientOAuth bool
|
||||
Client *v4.LookerSDK
|
||||
ApiSettings *rtl.ApiSettings
|
||||
AuthRequired []string `yaml:"authRequired"`
|
||||
Parameters tools.Parameters `yaml:"parameters"`
|
||||
manifest tools.Manifest
|
||||
mcpManifest tools.McpManifest
|
||||
}
|
||||
|
||||
func (t Tool) Invoke(ctx context.Context, params tools.ParamValues, accessToken tools.AccessToken) (any, error) {
|
||||
mapParams := params.AsMap()
|
||||
conn, ok := mapParams["conn"].(string)
|
||||
if !ok {
|
||||
return nil, fmt.Errorf("'conn' must be a string, got %T", mapParams["conn"])
|
||||
}
|
||||
db, _ := mapParams["db"].(string)
|
||||
|
||||
sdk, err := lookercommon.GetLookerSDK(t.UseClientOAuth, t.ApiSettings, t.Client, accessToken)
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("error getting sdk: %w", err)
|
||||
}
|
||||
req := v4.RequestConnectionSchemas{
|
||||
ConnectionName: conn,
|
||||
}
|
||||
if db != "" {
|
||||
req.Database = &db
|
||||
}
|
||||
resp, err := sdk.ConnectionSchemas(req, t.ApiSettings)
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("error making get_connection_schemas request: %s", err)
|
||||
}
|
||||
return resp, nil
|
||||
}
|
||||
|
||||
func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (tools.ParamValues, error) {
|
||||
return tools.ParseParams(t.Parameters, data, claims)
|
||||
}
|
||||
|
||||
func (t Tool) Manifest() tools.Manifest {
|
||||
return t.manifest
|
||||
}
|
||||
|
||||
func (t Tool) McpManifest() tools.McpManifest {
|
||||
return t.mcpManifest
|
||||
}
|
||||
|
||||
func (t Tool) Authorized(verifiedAuthServices []string) bool {
|
||||
return tools.IsAuthorized(t.AuthRequired, verifiedAuthServices)
|
||||
}
|
||||
|
||||
func (t Tool) RequiresClientAuthorization() bool {
|
||||
return t.UseClientOAuth
|
||||
}
|
||||
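As a small usage sketch of the optional db handling above: the database field is only attached to the request when a value was supplied. The local struct below stands in for the SDK's RequestConnectionSchemas type and is an assumption made so the example is self-contained.

package main

import "fmt"

// requestConnectionSchemas stands in for the Looker SDK request type used above.
type requestConnectionSchemas struct {
    ConnectionName string
    Database       *string
}

func buildRequest(conn, db string) requestConnectionSchemas {
    req := requestConnectionSchemas{ConnectionName: conn}
    if db != "" {
        req.Database = &db // only set when the optional "db" parameter was given
    }
    return req
}

func main() {
    fmt.Printf("%+v\n", buildRequest("thelook", ""))        // Database stays nil
    fmt.Printf("%+v\n", buildRequest("thelook", "demo_db")) // Database is set
}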
@@ -0,0 +1,116 @@
|
||||
// Copyright 2025 Google LLC
|
||||
//
|
||||
// Licensed under the Apache License, Version 2.0 (the "License");
|
||||
// you may not use this file except in compliance with the License.
|
||||
// You may obtain a copy of the License at
|
||||
//
|
||||
// http://www.apache.org/licenses/LICENSE-2.0
|
||||
//
|
||||
// Unless required by applicable law or agreed to in writing, software
|
||||
// distributed under the License is distributed on an "AS IS" BASIS,
|
||||
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
// See the License for the specific language governing permissions and
|
||||
// limitations under the License.
|
||||
|
||||
package lookergetconnectionschemas_test
|
||||
|
||||
import (
|
||||
"strings"
|
||||
"testing"
|
||||
|
||||
yaml "github.com/goccy/go-yaml"
|
||||
"github.com/google/go-cmp/cmp"
|
||||
"github.com/googleapis/genai-toolbox/internal/server"
|
||||
"github.com/googleapis/genai-toolbox/internal/testutils"
|
||||
lkr "github.com/googleapis/genai-toolbox/internal/tools/looker/lookergetconnectionschemas"
|
||||
)
|
||||
|
||||
func TestParseFromYamlLookerGetConnectionSchemas(t *testing.T) {
|
||||
ctx, err := testutils.ContextWithNewLogger()
|
||||
if err != nil {
|
||||
t.Fatalf("unexpected error: %s", err)
|
||||
}
|
||||
tcs := []struct {
|
||||
desc string
|
||||
in string
|
||||
want server.ToolConfigs
|
||||
}{
|
||||
{
|
||||
desc: "basic example",
|
||||
in: `
|
||||
tools:
|
||||
example_tool:
|
||||
kind: looker-get-connection-schemas
|
||||
source: my-instance
|
||||
description: some description
|
||||
`,
|
||||
want: server.ToolConfigs{
|
||||
"example_tool": lkr.Config{
|
||||
Name: "example_tool",
|
||||
Kind: "looker-get-connection-schemas",
|
||||
Source: "my-instance",
|
||||
Description: "some description",
|
||||
AuthRequired: []string{},
|
||||
},
|
||||
},
|
||||
},
|
||||
}
|
||||
for _, tc := range tcs {
|
||||
t.Run(tc.desc, func(t *testing.T) {
|
||||
got := struct {
|
||||
Tools server.ToolConfigs `yaml:"tools"`
|
||||
}{}
|
||||
// Parse contents
|
||||
err := yaml.UnmarshalContext(ctx, testutils.FormatYaml(tc.in), &got)
|
||||
if err != nil {
|
||||
t.Fatalf("unable to unmarshal: %s", err)
|
||||
}
|
||||
if diff := cmp.Diff(tc.want, got.Tools); diff != "" {
|
||||
t.Fatalf("incorrect parse: diff %v", diff)
|
||||
}
|
||||
})
|
||||
}
|
||||
|
||||
}
|
||||
|
||||
func TestFailParseFromYamlLookerGetConnectionSchemas(t *testing.T) {
|
||||
ctx, err := testutils.ContextWithNewLogger()
|
||||
if err != nil {
|
||||
t.Fatalf("unexpected error: %s", err)
|
||||
}
|
||||
tcs := []struct {
|
||||
desc string
|
||||
in string
|
||||
err string
|
||||
}{
|
||||
{
|
||||
desc: "Invalid method",
|
||||
in: `
|
||||
tools:
|
||||
example_tool:
|
||||
kind: looker-get-connection-schemas
|
||||
source: my-instance
|
||||
method: GOT
|
||||
description: some description
|
||||
`,
|
||||
err: "unable to parse tool \"example_tool\" as kind \"looker-get-connection-schemas\": [4:1] unknown field \"method\"\n 1 | authRequired: []\n 2 | description: some description\n 3 | kind: looker-get-connection-schemas\n> 4 | method: GOT\n ^\n 5 | source: my-instance",
|
||||
},
|
||||
}
|
||||
for _, tc := range tcs {
|
||||
t.Run(tc.desc, func(t *testing.T) {
|
||||
got := struct {
|
||||
Tools server.ToolConfigs `yaml:"tools"`
|
||||
}{}
|
||||
// Parse contents
|
||||
err := yaml.UnmarshalContext(ctx, testutils.FormatYaml(tc.in), &got)
|
||||
if err == nil {
|
||||
t.Fatalf("expect parsing to fail")
|
||||
}
|
||||
errStr := err.Error()
|
||||
if !strings.Contains(errStr, tc.err) {
|
||||
t.Fatalf("unexpected error string: got %q, want substring %q", errStr, tc.err)
|
||||
}
|
||||
})
|
||||
}
|
||||
|
||||
}
|
||||
@@ -0,0 +1,193 @@
|
||||
// Copyright 2025 Google LLC
|
||||
//
|
||||
// Licensed under the Apache License, Version 2.0 (the "License");
|
||||
// you may not use this file except in compliance with the License.
|
||||
// You may obtain a copy of the License at
|
||||
//
|
||||
// http://www.apache.org/licenses/LICENSE-2.0
|
||||
//
|
||||
// Unless required by applicable law or agreed to in writing, software
|
||||
// distributed under the License is distributed on an "AS IS" BASIS,
|
||||
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
// See the License for the specific language governing permissions and
|
||||
// limitations under the License.
|
||||
package lookergetconnectiontablecolumns
|
||||
|
||||
import (
|
||||
"context"
|
||||
"fmt"
|
||||
|
||||
yaml "github.com/goccy/go-yaml"
|
||||
"github.com/googleapis/genai-toolbox/internal/sources"
|
||||
lookersrc "github.com/googleapis/genai-toolbox/internal/sources/looker"
|
||||
"github.com/googleapis/genai-toolbox/internal/tools"
|
||||
"github.com/googleapis/genai-toolbox/internal/tools/looker/lookercommon"
|
||||
"github.com/googleapis/genai-toolbox/internal/util"
|
||||
|
||||
"github.com/looker-open-source/sdk-codegen/go/rtl"
|
||||
v4 "github.com/looker-open-source/sdk-codegen/go/sdk/v4"
|
||||
)
|
||||
|
||||
const kind string = "looker-get-connection-table-columns"
|
||||
|
||||
func init() {
|
||||
if !tools.Register(kind, newConfig) {
|
||||
panic(fmt.Sprintf("tool kind %q already registered", kind))
|
||||
}
|
||||
}
|
||||
|
||||
func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.ToolConfig, error) {
|
||||
actual := Config{Name: name}
|
||||
if err := decoder.DecodeContext(ctx, &actual); err != nil {
|
||||
return nil, err
|
||||
}
|
||||
return actual, nil
|
||||
}
|
||||
|
||||
type Config struct {
|
||||
Name string `yaml:"name" validate:"required"`
|
||||
Kind string `yaml:"kind" validate:"required"`
|
||||
Source string `yaml:"source" validate:"required"`
|
||||
Description string `yaml:"description" validate:"required"`
|
||||
AuthRequired []string `yaml:"authRequired"`
|
||||
}
|
||||
|
||||
// validate interface
|
||||
var _ tools.ToolConfig = Config{}
|
||||
|
||||
func (cfg Config) ToolConfigKind() string {
|
||||
return kind
|
||||
}
|
||||
|
||||
func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error) {
|
||||
// verify source exists
|
||||
rawS, ok := srcs[cfg.Source]
|
||||
if !ok {
|
||||
return nil, fmt.Errorf("no source named %q configured", cfg.Source)
|
||||
}
|
||||
|
||||
// verify the source is compatible
|
||||
s, ok := rawS.(*lookersrc.Source)
|
||||
if !ok {
|
||||
return nil, fmt.Errorf("invalid source for %q tool: source kind must be `looker`", kind)
|
||||
}
|
||||
|
||||
connParameter := tools.NewStringParameter("conn", "The connection containing the tables.")
|
||||
dbParameter := tools.NewStringParameterWithRequired("db", "The optional database to search", false)
|
||||
schemaParameter := tools.NewStringParameter("schema", "The schema containing the tables.")
|
||||
tablesParameter := tools.NewStringParameter("tables", "A comma separated list of tables containing the columns.")
|
||||
parameters := tools.Parameters{connParameter, dbParameter, schemaParameter, tablesParameter}
|
||||
|
||||
mcpManifest := tools.GetMcpManifest(cfg.Name, cfg.Description, cfg.AuthRequired, parameters)
|
||||
|
||||
// finish tool setup
|
||||
return Tool{
|
||||
Name: cfg.Name,
|
||||
Kind: kind,
|
||||
Parameters: parameters,
|
||||
AuthRequired: cfg.AuthRequired,
|
||||
UseClientOAuth: s.UseClientOAuth,
|
||||
Client: s.Client,
|
||||
ApiSettings: s.ApiSettings,
|
||||
manifest: tools.Manifest{
|
||||
Description: cfg.Description,
|
||||
Parameters: parameters.Manifest(),
|
||||
AuthRequired: cfg.AuthRequired,
|
||||
},
|
||||
mcpManifest: mcpManifest,
|
||||
}, nil
|
||||
}
|
||||
|
||||
// validate interface
|
||||
var _ tools.Tool = Tool{}
|
||||
|
||||
type Tool struct {
|
||||
Name string `yaml:"name"`
|
||||
Kind string `yaml:"kind"`
|
||||
UseClientOAuth bool
|
||||
Client *v4.LookerSDK
|
||||
ApiSettings *rtl.ApiSettings
|
||||
AuthRequired []string `yaml:"authRequired"`
|
||||
Parameters tools.Parameters `yaml:"parameters"`
|
||||
manifest tools.Manifest
|
||||
mcpManifest tools.McpManifest
|
||||
}
|
||||
|
||||
func (t Tool) Invoke(ctx context.Context, params tools.ParamValues, accessToken tools.AccessToken) (any, error) {
|
||||
logger, err := util.LoggerFromContext(ctx)
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("unable to get logger from ctx: %s", err)
|
||||
}
|
||||
mapParams := params.AsMap()
|
||||
conn, ok := mapParams["conn"].(string)
|
||||
if !ok {
|
||||
return nil, fmt.Errorf("'conn' must be a string, got %T", mapParams["conn"])
|
||||
}
|
||||
db, _ := mapParams["db"].(string)
|
||||
schema, ok := mapParams["schema"].(string)
|
||||
if !ok {
|
||||
return nil, fmt.Errorf("'schema' must be a string, got %T", mapParams["schema"])
|
||||
}
|
||||
tables, ok := mapParams["tables"].(string)
|
||||
if !ok {
|
||||
return nil, fmt.Errorf("'tables' must be a string, got %T", mapParams["tables"])
|
||||
}
|
||||
|
||||
sdk, err := lookercommon.GetLookerSDK(t.UseClientOAuth, t.ApiSettings, t.Client, accessToken)
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("error getting sdk: %w", err)
|
||||
}
|
||||
req := v4.RequestConnectionColumns{
|
||||
ConnectionName: conn,
|
||||
SchemaName: &schema,
|
||||
TableNames: &tables,
|
||||
}
|
||||
if db != "" {
|
||||
req.Database = &db
|
||||
}
|
||||
resp, err := sdk.ConnectionColumns(req, t.ApiSettings)
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("error making get_connection_table_columns request: %s", err)
|
||||
}
|
||||
var data []any
|
||||
for _, t := range resp {
|
||||
vMap := make(map[string]any)
|
||||
vMap["table_name"] = *t.Name
|
||||
vMap["sql_escaped_table_name"] = *t.SqlEscapedName
|
||||
vMap["schema_name"] = *t.SchemaName
|
||||
var columnData []any
|
||||
for _, c := range *t.Columns {
|
||||
vMap2 := make(map[string]any)
|
||||
vMap2["column_name"] = *c.Name
|
||||
vMap2["sql_escaped_column_name"] = *c.SqlEscapedName
|
||||
vMap2["data_type_database"] = *c.DataTypeDatabase
|
||||
vMap2["data_type_looker"] = *c.DataTypeLooker
|
||||
columnData = append(columnData, vMap2)
|
||||
}
|
||||
vMap["columns"] = columnData
|
||||
data = append(data, vMap)
|
||||
}
|
||||
logger.DebugContext(ctx, "data = ", data)
|
||||
|
||||
return data, nil
|
||||
}
|
||||
|
||||
func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (tools.ParamValues, error) {
|
||||
return tools.ParseParams(t.Parameters, data, claims)
|
||||
}
|
||||
|
||||
func (t Tool) Manifest() tools.Manifest {
|
||||
return t.manifest
|
||||
}
|
||||
|
||||
func (t Tool) McpManifest() tools.McpManifest {
|
||||
return t.mcpManifest
|
||||
}
|
||||
|
||||
func (t Tool) Authorized(verifiedAuthServices []string) bool {
|
||||
return tools.IsAuthorized(t.AuthRequired, verifiedAuthServices)
|
||||
}
|
||||
|
||||
func (t Tool) RequiresClientAuthorization() bool {
|
||||
return t.UseClientOAuth
|
||||
}
|
||||
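For reference, one entry of the result built above nests the column maps under the table map; the keys come from the loops above, and the values below are illustrative only.

package main

import "fmt"

func main() {
    // Illustrative entry from looker-get-connection-table-columns; only the
    // keys are taken from the implementation above, the values are invented.
    table := map[string]any{
        "table_name":             "orders",
        "sql_escaped_table_name": "`orders`",
        "schema_name":            "ecomm",
        "columns": []map[string]any{
            {
                "column_name":             "id",
                "sql_escaped_column_name": "`id`",
                "data_type_database":      "INT64",
                "data_type_looker":        "number",
            },
        },
    }
    fmt.Println(table)
}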
@@ -0,0 +1,116 @@
|
||||
// Copyright 2025 Google LLC
|
||||
//
|
||||
// Licensed under the Apache License, Version 2.0 (the "License");
|
||||
// you may not use this file except in compliance with the License.
|
||||
// You may obtain a copy of the License at
|
||||
//
|
||||
// http://www.apache.org/licenses/LICENSE-2.0
|
||||
//
|
||||
// Unless required by applicable law or agreed to in writing, software
|
||||
// distributed under the License is distributed on an "AS IS" BASIS,
|
||||
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
// See the License for the specific language governing permissions and
|
||||
// limitations under the License.
|
||||
|
||||
package lookergetconnectiontablecolumns_test
|
||||
|
||||
import (
|
||||
"strings"
|
||||
"testing"
|
||||
|
||||
yaml "github.com/goccy/go-yaml"
|
||||
"github.com/google/go-cmp/cmp"
|
||||
"github.com/googleapis/genai-toolbox/internal/server"
|
||||
"github.com/googleapis/genai-toolbox/internal/testutils"
|
||||
lkr "github.com/googleapis/genai-toolbox/internal/tools/looker/lookergetconnectiontablecolumns"
|
||||
)
|
||||
|
||||
func TestParseFromYamlLookerGetConnectionTableColumns(t *testing.T) {
|
||||
ctx, err := testutils.ContextWithNewLogger()
|
||||
if err != nil {
|
||||
t.Fatalf("unexpected error: %s", err)
|
||||
}
|
||||
tcs := []struct {
|
||||
desc string
|
||||
in string
|
||||
want server.ToolConfigs
|
||||
}{
|
||||
{
|
||||
desc: "basic example",
|
||||
in: `
|
||||
tools:
|
||||
example_tool:
|
||||
kind: looker-get-connection-table-columns
|
||||
source: my-instance
|
||||
description: some description
|
||||
`,
|
||||
want: server.ToolConfigs{
|
||||
"example_tool": lkr.Config{
|
||||
Name: "example_tool",
|
||||
Kind: "looker-get-connection-table-columns",
|
||||
Source: "my-instance",
|
||||
Description: "some description",
|
||||
AuthRequired: []string{},
|
||||
},
|
||||
},
|
||||
},
|
||||
}
|
||||
for _, tc := range tcs {
|
||||
t.Run(tc.desc, func(t *testing.T) {
|
||||
got := struct {
|
||||
Tools server.ToolConfigs `yaml:"tools"`
|
||||
}{}
|
||||
// Parse contents
|
||||
err := yaml.UnmarshalContext(ctx, testutils.FormatYaml(tc.in), &got)
|
||||
if err != nil {
|
||||
t.Fatalf("unable to unmarshal: %s", err)
|
||||
}
|
||||
if diff := cmp.Diff(tc.want, got.Tools); diff != "" {
|
||||
t.Fatalf("incorrect parse: diff %v", diff)
|
||||
}
|
||||
})
|
||||
}
|
||||
|
||||
}
|
||||
|
||||
func TestFailParseFromYamlLookerGetConnectionTableColumns(t *testing.T) {
|
||||
ctx, err := testutils.ContextWithNewLogger()
|
||||
if err != nil {
|
||||
t.Fatalf("unexpected error: %s", err)
|
||||
}
|
||||
tcs := []struct {
|
||||
desc string
|
||||
in string
|
||||
err string
|
||||
}{
|
||||
{
|
||||
desc: "Invalid method",
|
||||
in: `
|
||||
tools:
|
||||
example_tool:
|
||||
kind: looker-get-connection-table-columns
|
||||
source: my-instance
|
||||
method: GOT
|
||||
description: some description
|
||||
`,
|
||||
err: "unable to parse tool \"example_tool\" as kind \"looker-get-connection-table-columns\": [4:1] unknown field \"method\"\n 1 | authRequired: []\n 2 | description: some description\n 3 | kind: looker-get-connection-table-columns\n> 4 | method: GOT\n ^\n 5 | source: my-instance",
|
||||
},
|
||||
}
|
||||
for _, tc := range tcs {
|
||||
t.Run(tc.desc, func(t *testing.T) {
|
||||
got := struct {
|
||||
Tools server.ToolConfigs `yaml:"tools"`
|
||||
}{}
|
||||
// Parse contents
|
||||
err := yaml.UnmarshalContext(ctx, testutils.FormatYaml(tc.in), &got)
|
||||
if err == nil {
|
||||
t.Fatalf("expect parsing to fail")
|
||||
}
|
||||
errStr := err.Error()
|
||||
if !strings.Contains(errStr, tc.err) {
|
||||
t.Fatalf("unexpected error string: got %q, want substring %q", errStr, tc.err)
|
||||
}
|
||||
})
|
||||
}
|
||||
|
||||
}
|
||||
@@ -0,0 +1,184 @@
|
||||
// Copyright 2025 Google LLC
|
||||
//
|
||||
// Licensed under the Apache License, Version 2.0 (the "License");
|
||||
// you may not use this file except in compliance with the License.
|
||||
// You may obtain a copy of the License at
|
||||
//
|
||||
// http://www.apache.org/licenses/LICENSE-2.0
|
||||
//
|
||||
// Unless required by applicable law or agreed to in writing, software
|
||||
// distributed under the License is distributed on an "AS IS" BASIS,
|
||||
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
// See the License for the specific language governing permissions and
|
||||
// limitations under the License.
|
||||
package lookergetconnectiontables
|
||||
|
||||
import (
|
||||
"context"
|
||||
"fmt"
|
||||
|
||||
yaml "github.com/goccy/go-yaml"
|
||||
"github.com/googleapis/genai-toolbox/internal/sources"
|
||||
lookersrc "github.com/googleapis/genai-toolbox/internal/sources/looker"
|
||||
"github.com/googleapis/genai-toolbox/internal/tools"
|
||||
"github.com/googleapis/genai-toolbox/internal/tools/looker/lookercommon"
|
||||
"github.com/googleapis/genai-toolbox/internal/util"
|
||||
|
||||
"github.com/looker-open-source/sdk-codegen/go/rtl"
|
||||
v4 "github.com/looker-open-source/sdk-codegen/go/sdk/v4"
|
||||
)
|
||||
|
||||
const kind string = "looker-get-connection-tables"
|
||||
|
||||
func init() {
|
||||
if !tools.Register(kind, newConfig) {
|
||||
panic(fmt.Sprintf("tool kind %q already registered", kind))
|
||||
}
|
||||
}
|
||||
|
||||
func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.ToolConfig, error) {
|
||||
actual := Config{Name: name}
|
||||
if err := decoder.DecodeContext(ctx, &actual); err != nil {
|
||||
return nil, err
|
||||
}
|
||||
return actual, nil
|
||||
}
|
||||
|
||||
type Config struct {
|
||||
Name string `yaml:"name" validate:"required"`
|
||||
Kind string `yaml:"kind" validate:"required"`
|
||||
Source string `yaml:"source" validate:"required"`
|
||||
	Description  string   `yaml:"description" validate:"required"`
	AuthRequired []string `yaml:"authRequired"`
}

// validate interface
var _ tools.ToolConfig = Config{}

func (cfg Config) ToolConfigKind() string {
	return kind
}

func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error) {
	// verify source exists
	rawS, ok := srcs[cfg.Source]
	if !ok {
		return nil, fmt.Errorf("no source named %q configured", cfg.Source)
	}

	// verify the source is compatible
	s, ok := rawS.(*lookersrc.Source)
	if !ok {
		return nil, fmt.Errorf("invalid source for %q tool: source kind must be `looker`", kind)
	}

	connParameter := tools.NewStringParameter("conn", "The connection containing the tables.")
	dbParameter := tools.NewStringParameterWithRequired("db", "The optional database to search", false)
	schemaParameter := tools.NewStringParameter("schema", "The schema containing the tables.")
	parameters := tools.Parameters{connParameter, dbParameter, schemaParameter}

	mcpManifest := tools.GetMcpManifest(cfg.Name, cfg.Description, cfg.AuthRequired, parameters)

	// finish tool setup
	return Tool{
		Name:           cfg.Name,
		Kind:           kind,
		Parameters:     parameters,
		AuthRequired:   cfg.AuthRequired,
		UseClientOAuth: s.UseClientOAuth,
		Client:         s.Client,
		ApiSettings:    s.ApiSettings,
		manifest: tools.Manifest{
			Description:  cfg.Description,
			Parameters:   parameters.Manifest(),
			AuthRequired: cfg.AuthRequired,
		},
		mcpManifest: mcpManifest,
	}, nil
}

// validate interface
var _ tools.Tool = Tool{}

type Tool struct {
	Name           string `yaml:"name"`
	Kind           string `yaml:"kind"`
	UseClientOAuth bool
	Client         *v4.LookerSDK
	ApiSettings    *rtl.ApiSettings
	AuthRequired   []string         `yaml:"authRequired"`
	Parameters     tools.Parameters `yaml:"parameters"`
	manifest       tools.Manifest
	mcpManifest    tools.McpManifest
}

func (t Tool) Invoke(ctx context.Context, params tools.ParamValues, accessToken tools.AccessToken) (any, error) {
	logger, err := util.LoggerFromContext(ctx)
	if err != nil {
		return nil, fmt.Errorf("unable to get logger from ctx: %s", err)
	}
	mapParams := params.AsMap()
	conn, ok := mapParams["conn"].(string)
	if !ok {
		return nil, fmt.Errorf("'conn' must be a string, got %T", mapParams["conn"])
	}
	db, _ := mapParams["db"].(string)
	schema, ok := mapParams["schema"].(string)
	if !ok {
		return nil, fmt.Errorf("'schema' must be a string, got %T", mapParams["schema"])
	}

	sdk, err := lookercommon.GetLookerSDK(t.UseClientOAuth, t.ApiSettings, t.Client, accessToken)
	if err != nil {
		return nil, fmt.Errorf("error getting sdk: %w", err)
	}
	req := v4.RequestConnectionTables{
		ConnectionName: conn,
		SchemaName:     &schema,
	}
	if db != "" {
		req.Database = &db
	}
	resp, err := sdk.ConnectionTables(req, t.ApiSettings)
	if err != nil {
		return nil, fmt.Errorf("error making get_connection_tables request: %s", err)
	}
	var data []any
	for _, s := range resp {
		vMap := make(map[string]any)
		vMap["schema_name"] = *s.Name
		vMap["is_default"] = *s.IsDefault
		var tableData []any
		for _, t := range *s.Tables {
			vMap2 := make(map[string]any)
			vMap2["table_name"] = *t.Name
			vMap2["sql_escaped_name"] = *t.SqlEscapedName
			tableData = append(tableData, vMap2)
		}
		vMap["tables"] = tableData
		data = append(data, vMap)
	}
	logger.DebugContext(ctx, "data = ", data)

	return data, nil
}

func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (tools.ParamValues, error) {
	return tools.ParseParams(t.Parameters, data, claims)
}

func (t Tool) Manifest() tools.Manifest {
	return t.manifest
}

func (t Tool) McpManifest() tools.McpManifest {
	return t.mcpManifest
}

func (t Tool) Authorized(verifiedAuthServices []string) bool {
	return tools.IsAuthorized(t.AuthRequired, verifiedAuthServices)
}

func (t Tool) RequiresClientAuthorization() bool {
	return t.UseClientOAuth
}
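A minimal sketch of the shape Invoke returns, assuming a connection with a single schema and one table; the sample values ("demo_db", "Employees") are borrowed from the Looker integration test expectations later in this change, not from this file:

package main

import "fmt"

func main() {
	// One map per schema, each carrying its tables, mirroring the loop in Invoke.
	data := []any{
		map[string]any{
			"schema_name": "demo_db", // assumed sample value
			"is_default":  true,
			"tables": []any{
				map[string]any{
					"table_name":       "Employees", // assumed sample value
					"sql_escaped_name": "Employees",
				},
			},
		},
	}
	fmt.Println(data)
}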
@@ -0,0 +1,116 @@
|
||||
// Copyright 2025 Google LLC
|
||||
//
|
||||
// Licensed under the Apache License, Version 2.0 (the "License");
|
||||
// you may not use this file except in compliance with the License.
|
||||
// You may obtain a copy of the License at
|
||||
//
|
||||
// http://www.apache.org/licenses/LICENSE-2.0
|
||||
//
|
||||
// Unless required by applicable law or agreed to in writing, software
|
||||
// distributed under the License is distributed on an "AS IS" BASIS,
|
||||
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
// See the License for the specific language governing permissions and
|
||||
// limitations under the License.
|
||||
|
||||
package lookergetconnectiontables_test
|
||||
|
||||
import (
|
||||
"strings"
|
||||
"testing"
|
||||
|
||||
yaml "github.com/goccy/go-yaml"
|
||||
"github.com/google/go-cmp/cmp"
|
||||
"github.com/googleapis/genai-toolbox/internal/server"
|
||||
"github.com/googleapis/genai-toolbox/internal/testutils"
|
||||
lkr "github.com/googleapis/genai-toolbox/internal/tools/looker/lookergetconnectiontables"
|
||||
)
|
||||
|
||||
func TestParseFromYamlLookerGetConnectionTables(t *testing.T) {
|
||||
ctx, err := testutils.ContextWithNewLogger()
|
||||
if err != nil {
|
||||
t.Fatalf("unexpected error: %s", err)
|
||||
}
|
||||
tcs := []struct {
|
||||
desc string
|
||||
in string
|
||||
want server.ToolConfigs
|
||||
}{
|
||||
{
|
||||
desc: "basic example",
|
||||
in: `
|
||||
tools:
|
||||
example_tool:
|
||||
kind: looker-get-connection-tables
|
||||
source: my-instance
|
||||
description: some description
|
||||
`,
|
||||
want: server.ToolConfigs{
|
||||
"example_tool": lkr.Config{
|
||||
Name: "example_tool",
|
||||
Kind: "looker-get-connection-tables",
|
||||
Source: "my-instance",
|
||||
Description: "some description",
|
||||
AuthRequired: []string{},
|
||||
},
|
||||
},
|
||||
},
|
||||
}
|
||||
for _, tc := range tcs {
|
||||
t.Run(tc.desc, func(t *testing.T) {
|
||||
got := struct {
|
||||
Tools server.ToolConfigs `yaml:"tools"`
|
||||
}{}
|
||||
// Parse contents
|
||||
err := yaml.UnmarshalContext(ctx, testutils.FormatYaml(tc.in), &got)
|
||||
if err != nil {
|
||||
t.Fatalf("unable to unmarshal: %s", err)
|
||||
}
|
||||
if diff := cmp.Diff(tc.want, got.Tools); diff != "" {
|
||||
t.Fatalf("incorrect parse: diff %v", diff)
|
||||
}
|
||||
})
|
||||
}
|
||||
|
||||
}
|
||||
|
||||
func TestFailParseFromYamlLookerGetConnectionTables(t *testing.T) {
|
||||
ctx, err := testutils.ContextWithNewLogger()
|
||||
if err != nil {
|
||||
t.Fatalf("unexpected error: %s", err)
|
||||
}
|
||||
tcs := []struct {
|
||||
desc string
|
||||
in string
|
||||
err string
|
||||
}{
|
||||
{
|
||||
desc: "Invalid method",
|
||||
in: `
|
||||
tools:
|
||||
example_tool:
|
||||
kind: looker-get-connection-tables
|
||||
source: my-instance
|
||||
method: GOT
|
||||
description: some description
|
||||
`,
|
||||
err: "unable to parse tool \"example_tool\" as kind \"looker-get-connection-tables\": [4:1] unknown field \"method\"\n 1 | authRequired: []\n 2 | description: some description\n 3 | kind: looker-get-connection-tables\n> 4 | method: GOT\n ^\n 5 | source: my-instance",
|
||||
},
|
||||
}
|
||||
for _, tc := range tcs {
|
||||
t.Run(tc.desc, func(t *testing.T) {
|
||||
got := struct {
|
||||
Tools server.ToolConfigs `yaml:"tools"`
|
||||
}{}
|
||||
// Parse contents
|
||||
err := yaml.UnmarshalContext(ctx, testutils.FormatYaml(tc.in), &got)
|
||||
if err == nil {
|
||||
t.Fatalf("expect parsing to fail")
|
||||
}
|
||||
errStr := err.Error()
|
||||
if !strings.Contains(errStr, tc.err) {
|
||||
t.Fatalf("unexpected error string: got %q, want substring %q", errStr, tc.err)
|
||||
}
|
||||
})
|
||||
}
|
||||
|
||||
}
|
||||
@@ -142,6 +142,7 @@ func (t Tool) Invoke(ctx context.Context, params tools.ParamValues, accessToken
|
||||
vMap["label"] = *v.Label
|
||||
vMap["name"] = *v.Name
|
||||
vMap["project_name"] = *v.ProjectName
|
||||
vMap["connections"] = *v.AllowedDbConnectionNames
|
||||
logger.DebugContext(ctx, "Converted to %v\n", vMap)
|
||||
data = append(data, vMap)
|
||||
}
|
||||
|
||||
@@ -178,12 +178,12 @@ func getOptions(ctx context.Context, sortParameters tools.Parameters, projectPay
|
||||
}
|
||||
|
||||
opts = opts.SetProjection(projection)
|
||||
logger.DebugContext(ctx, "Projection is set to %v", projection)
|
||||
logger.DebugContext(ctx, fmt.Sprintf("Projection is set to %v", projection))
|
||||
}
|
||||
|
||||
if limit > 0 {
|
||||
opts = opts.SetLimit(limit)
|
||||
logger.DebugContext(ctx, "Limit is being set to %d", limit)
|
||||
logger.DebugContext(ctx, fmt.Sprintf("Limit is being set to %d", limit))
|
||||
}
|
||||
return opts, nil
|
||||
}
|
||||
|
||||
@@ -125,7 +125,7 @@ func (t Tool) Invoke(ctx context.Context, params tools.ParamValues, accessToken
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("error getting logger: %s", err)
|
||||
}
|
||||
logger.DebugContext(ctx, "executing `%s` tool query: %s", kind, sql)
|
||||
logger.DebugContext(ctx, fmt.Sprintf("executing `%s` tool query: %s", kind, sql))
|
||||
|
||||
results, err := t.Pool.QueryContext(ctx, sql)
|
||||
if err != nil {
|
||||
|
||||
@@ -126,7 +126,7 @@ func (t Tool) Invoke(ctx context.Context, params tools.ParamValues, accessToken
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("error getting logger: %s", err)
|
||||
}
|
||||
logger.DebugContext(ctx, "executing `%s` tool query: %s", kind, sql)
|
||||
logger.DebugContext(ctx, fmt.Sprintf("executing `%s` tool query: %s", kind, sql))
|
||||
|
||||
results, err := t.Pool.QueryContext(ctx, sql)
|
||||
if err != nil {
|
||||
|
||||
@@ -206,7 +206,7 @@ func (t Tool) Invoke(ctx context.Context, params tools.ParamValues, accessToken
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("error getting logger: %s", err)
|
||||
}
|
||||
logger.DebugContext(ctx, "executing `%s` tool query: %s", kind, t.statement)
|
||||
logger.DebugContext(ctx, fmt.Sprintf("executing `%s` tool query: %s", kind, t.statement))
|
||||
|
||||
results, err := t.Pool.QueryContext(ctx, t.statement, duration, duration, limit)
|
||||
if err != nil {
|
||||
|
||||
@@ -163,7 +163,7 @@ func (t Tool) Invoke(ctx context.Context, params tools.ParamValues, accessToken
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("error getting logger: %s", err)
|
||||
}
|
||||
logger.DebugContext(ctx, "executing `%s` tool query: %s", kind, listTableFragmentationStatement)
|
||||
logger.DebugContext(ctx, fmt.Sprintf("executing `%s` tool query: %s", kind, listTableFragmentationStatement))
|
||||
|
||||
results, err := t.Pool.QueryContext(ctx, listTableFragmentationStatement, table_schema, table_schema, table_name, table_name, data_free_threshold_bytes, limit)
|
||||
if err != nil {
|
||||
|
||||
@@ -154,7 +154,7 @@ func (t Tool) Invoke(ctx context.Context, params tools.ParamValues, accessToken
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("error getting logger: %s", err)
|
||||
}
|
||||
logger.DebugContext(ctx, "executing `%s` tool query: %s", kind, listTablesMissingUniqueIndexesStatement)
|
||||
logger.DebugContext(ctx, fmt.Sprintf("executing `%s` tool query: %s", kind, listTablesMissingUniqueIndexesStatement))
|
||||
|
||||
results, err := t.Pool.QueryContext(ctx, listTablesMissingUniqueIndexesStatement, table_schema, table_schema, limit)
|
||||
if err != nil {
|
||||
|
||||
@@ -113,7 +113,7 @@ func (t Tool) Invoke(ctx context.Context, params tools.ParamValues, accessToken
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("error getting logger: %s", err)
|
||||
}
|
||||
logger.DebugContext(ctx, "executing `%s` tool query: %s", kind, sqlParam)
|
||||
logger.DebugContext(ctx, fmt.Sprintf("executing `%s` tool query: %s", kind, sqlParam))
|
||||
|
||||
results, err := t.Pool.QueryContext(ctx, sqlParam)
|
||||
if err != nil {
|
||||
|
||||
@@ -423,13 +423,14 @@ type ParameterMcpManifest struct {
|
||||
|
||||
// CommonParameter are default fields that are embedded in most Parameter implementations. Embedding this struct will give the object Name() and Type() functions.
|
||||
type CommonParameter struct {
|
||||
Name string `yaml:"name" validate:"required"`
|
||||
Type string `yaml:"type" validate:"required"`
|
||||
Desc string `yaml:"description" validate:"required"`
|
||||
Required *bool `yaml:"required"`
|
||||
AllowedValues []any `yaml:"allowedValues"`
|
||||
AuthServices []ParamAuthService `yaml:"authServices"`
|
||||
AuthSources []ParamAuthService `yaml:"authSources"` // Deprecated: Kept for compatibility.
|
||||
Name string `yaml:"name" validate:"required"`
|
||||
Type string `yaml:"type" validate:"required"`
|
||||
Desc string `yaml:"description" validate:"required"`
|
||||
Required *bool `yaml:"required"`
|
||||
AllowedValues []any `yaml:"allowedValues"`
|
||||
ExcludedValues []any `yaml:"excludedValues"`
|
||||
AuthServices []ParamAuthService `yaml:"authServices"`
|
||||
AuthSources []ParamAuthService `yaml:"authSources"` // Deprecated: Kept for compatibility.
|
||||
}
|
||||
|
||||
// GetName returns the name specified for the Parameter.
|
||||
@@ -469,6 +470,24 @@ func (p *CommonParameter) IsAllowedValues(v any) bool {
|
||||
return false
|
||||
}
|
||||
|
||||
// GetExcludedValues returns the excluded values for the Parameter.
|
||||
func (p *CommonParameter) GetExcludedValues() []any {
|
||||
return p.ExcludedValues
|
||||
}
|
||||
|
||||
// IsExcludedValues checks if the value is excluded.
|
||||
func (p *CommonParameter) IsExcludedValues(v any) bool {
|
||||
if len(p.ExcludedValues) == 0 {
|
||||
return false
|
||||
}
|
||||
for _, av := range p.ExcludedValues {
|
||||
if MatchStringOrRegex(v, av) {
|
||||
return true
|
||||
}
|
||||
}
|
||||
return false
|
||||
}
|
||||
|
||||
// MatchStringOrRegex checks if the input matches the target
|
||||
func MatchStringOrRegex(input, target any) bool {
|
||||
targetS, ok := target.(string)
|
||||
@@ -480,7 +499,7 @@ func MatchStringOrRegex(input, target any) bool {
|
||||
// if target is not regex, run direct comparison
|
||||
return input == target
|
||||
}
|
||||
inputS := fmt.Sprintf("%s", input)
|
||||
inputS := fmt.Sprintf("%v", input)
|
||||
return re.MatchString(inputS)
|
||||
}
|
||||
|
||||
@@ -594,6 +613,19 @@ func NewStringParameterWithAllowedValues(name string, desc string, allowedValues
|
||||
}
|
||||
}
|
||||
|
||||
// NewStringParameterWithExcludedValues is a convenience function for initializing a StringParameter with a list of excludedValues
|
||||
func NewStringParameterWithExcludedValues(name string, desc string, excludedValues []any) *StringParameter {
|
||||
return &StringParameter{
|
||||
CommonParameter: CommonParameter{
|
||||
Name: name,
|
||||
Type: typeString,
|
||||
Desc: desc,
|
||||
ExcludedValues: excludedValues,
|
||||
AuthServices: nil,
|
||||
},
|
||||
}
|
||||
}
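As a rough usage sketch of the excludedValues behaviour added in this change (assuming the constructor and Parse semantics above, and that the internal tools package is importable from the caller's module), a string parameter with an excluded regex rejects matching inputs and accepts everything else:

package main

import (
	"fmt"

	"github.com/googleapis/genai-toolbox/internal/tools"
)

func main() {
	// "^f.*" is treated as a regex by MatchStringOrRegex, so "foo" is rejected
	// while "bar" parses normally; this mirrors the test cases added below.
	p := tools.NewStringParameterWithExcludedValues("my_string", "this param is a string", []any{"^f.*"})

	if _, err := p.Parse("foo"); err != nil {
		fmt.Println("rejected:", err) // "foo is an excluded value"
	}
	if v, err := p.Parse("bar"); err == nil {
		fmt.Println("accepted:", v)
	}
}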
|
||||
|
||||
var _ Parameter = &StringParameter{}
|
||||
|
||||
// StringParameter is a parameter representing the "string" type.
|
||||
@@ -612,6 +644,9 @@ func (p *StringParameter) Parse(v any) (any, error) {
|
||||
if !p.IsAllowedValues(newV) {
|
||||
return nil, fmt.Errorf("%s is not an allowed value", newV)
|
||||
}
|
||||
if p.IsExcludedValues(newV) {
|
||||
return nil, fmt.Errorf("%s is an excluded value", newV)
|
||||
}
|
||||
if p.Escape != nil {
|
||||
return applyEscape(*p.Escape, newV)
|
||||
}
|
||||
@@ -735,6 +770,19 @@ func NewIntParameterWithAllowedValues(name string, desc string, allowedValues []
|
||||
}
|
||||
}
|
||||
|
||||
// NewIntParameterWithExcludedValues is a convenience function for initializing a IntParameter with a list of excludedValues
|
||||
func NewIntParameterWithExcludedValues(name string, desc string, excludedValues []any) *IntParameter {
|
||||
return &IntParameter{
|
||||
CommonParameter: CommonParameter{
|
||||
Name: name,
|
||||
Type: typeInt,
|
||||
Desc: desc,
|
||||
ExcludedValues: excludedValues,
|
||||
AuthServices: nil,
|
||||
},
|
||||
}
|
||||
}
|
||||
|
||||
var _ Parameter = &IntParameter{}
|
||||
|
||||
// IntParameter is a parameter representing the "int" type.
|
||||
@@ -766,6 +814,9 @@ func (p *IntParameter) Parse(v any) (any, error) {
|
||||
if !p.IsAllowedValues(out) {
|
||||
return nil, fmt.Errorf("%d is not an allowed value", out)
|
||||
}
|
||||
if p.IsExcludedValues(out) {
|
||||
return nil, fmt.Errorf("%d is an excluded value", out)
|
||||
}
|
||||
if p.MinValue != nil && out < *p.MinValue {
|
||||
return nil, fmt.Errorf("%d is under the minimum value", out)
|
||||
}
|
||||
@@ -877,6 +928,19 @@ func NewFloatParameterWithAllowedValues(name string, desc string, allowedValues
|
||||
}
|
||||
}
|
||||
|
||||
// NewFloatParameterWithExcludedValues is a convenience function for initializing a FloatParameter with a list of excluded values.
|
||||
func NewFloatParameterWithExcludedValues(name string, desc string, excludedValues []any) *FloatParameter {
|
||||
return &FloatParameter{
|
||||
CommonParameter: CommonParameter{
|
||||
Name: name,
|
||||
Type: typeFloat,
|
||||
Desc: desc,
|
||||
ExcludedValues: excludedValues,
|
||||
AuthServices: nil,
|
||||
},
|
||||
}
|
||||
}
|
||||
|
||||
var _ Parameter = &FloatParameter{}
|
||||
|
||||
// FloatParameter is a parameter representing the "float" type.
|
||||
@@ -906,6 +970,9 @@ func (p *FloatParameter) Parse(v any) (any, error) {
|
||||
if !p.IsAllowedValues(out) {
|
||||
return nil, fmt.Errorf("%g is not an allowed value", out)
|
||||
}
|
||||
if p.IsExcludedValues(out) {
|
||||
return nil, fmt.Errorf("%g is an excluded value", out)
|
||||
}
|
||||
if p.MinValue != nil && out < *p.MinValue {
|
||||
return nil, fmt.Errorf("%g is under the minimum value", out)
|
||||
}
|
||||
@@ -1013,6 +1080,19 @@ func NewBooleanParameterWithAllowedValues(name string, desc string, allowedValue
|
||||
}
|
||||
}
|
||||
|
||||
// NewBooleanParameterWithExcludedValues is a convenience function for initializing a BooleanParameter with a list of excluded values.
|
||||
func NewBooleanParameterWithExcludedValues(name string, desc string, excludedValues []any) *BooleanParameter {
|
||||
return &BooleanParameter{
|
||||
CommonParameter: CommonParameter{
|
||||
Name: name,
|
||||
Type: typeBool,
|
||||
Desc: desc,
|
||||
ExcludedValues: excludedValues,
|
||||
AuthServices: nil,
|
||||
},
|
||||
}
|
||||
}
|
||||
|
||||
var _ Parameter = &BooleanParameter{}
|
||||
|
||||
// BooleanParameter is a parameter representing the "boolean" type.
|
||||
@@ -1029,6 +1109,9 @@ func (p *BooleanParameter) Parse(v any) (any, error) {
|
||||
if !p.IsAllowedValues(newV) {
|
||||
return nil, fmt.Errorf("%t is not an allowed value", newV)
|
||||
}
|
||||
if p.IsExcludedValues(newV) {
|
||||
return nil, fmt.Errorf("%t is an excluded value", newV)
|
||||
}
|
||||
return newV, nil
|
||||
}
|
||||
|
||||
@@ -1125,6 +1208,20 @@ func NewArrayParameterWithAllowedValues(name string, desc string, allowedValues
|
||||
}
|
||||
}
|
||||
|
||||
// NewArrayParameterWithExcludedValues is a convenience function for initializing a ArrayParameter with a list of excluded values.
|
||||
func NewArrayParameterWithExcludedValues(name string, desc string, excludedValues []any, items Parameter) *ArrayParameter {
|
||||
return &ArrayParameter{
|
||||
CommonParameter: CommonParameter{
|
||||
Name: name,
|
||||
Type: typeArray,
|
||||
Desc: desc,
|
||||
ExcludedValues: excludedValues,
|
||||
AuthServices: nil,
|
||||
},
|
||||
Items: items,
|
||||
}
|
||||
}
|
||||
|
||||
var _ Parameter = &ArrayParameter{}
|
||||
|
||||
// ArrayParameter is a parameter representing the "array" type.
|
||||
@@ -1170,6 +1267,19 @@ func (p *ArrayParameter) IsAllowedValues(v []any) bool {
|
||||
return false
|
||||
}
|
||||
|
||||
func (p *ArrayParameter) IsExcludedValues(v []any) bool {
|
||||
a := p.GetExcludedValues()
|
||||
if len(a) == 0 {
|
||||
return false
|
||||
}
|
||||
for _, av := range a {
|
||||
if reflect.DeepEqual(v, av) {
|
||||
return true
|
||||
}
|
||||
}
|
||||
return false
|
||||
}
|
||||
|
||||
func (p *ArrayParameter) Parse(v any) (any, error) {
|
||||
arrVal, ok := v.([]any)
|
||||
if !ok {
|
||||
@@ -1178,6 +1288,9 @@ func (p *ArrayParameter) Parse(v any) (any, error) {
|
||||
if !p.IsAllowedValues(arrVal) {
|
||||
return nil, fmt.Errorf("%s is not an allowed value", arrVal)
|
||||
}
|
||||
if p.IsExcludedValues(arrVal) {
|
||||
return nil, fmt.Errorf("%s is an excluded value", arrVal)
|
||||
}
|
||||
rtn := make([]any, 0, len(arrVal))
|
||||
for idx, val := range arrVal {
|
||||
val, err := p.Items.Parse(val)
|
||||
@@ -1310,6 +1423,19 @@ func NewMapParameterWithAllowedValues(name string, desc string, allowedValues []
|
||||
}
|
||||
}
|
||||
|
||||
// NewMapParameterWithExcludedValues is a convenience function for initializing a MapParameter with a list of excluded values.
|
||||
func NewMapParameterWithExcludedValues(name string, desc string, excludedValues []any, valueType string) *MapParameter {
|
||||
return &MapParameter{
|
||||
CommonParameter: CommonParameter{
|
||||
Name: name,
|
||||
Type: "map",
|
||||
Desc: desc,
|
||||
ExcludedValues: excludedValues,
|
||||
},
|
||||
ValueType: valueType,
|
||||
}
|
||||
}
|
||||
|
||||
// UnmarshalYAML handles parsing the MapParameter from YAML input.
|
||||
func (p *MapParameter) UnmarshalYAML(ctx context.Context, unmarshal func(interface{}) error) error {
|
||||
var rawItem struct {
|
||||
@@ -1364,6 +1490,19 @@ func (p *MapParameter) IsAllowedValues(v map[string]any) bool {
|
||||
return false
|
||||
}
|
||||
|
||||
func (p *MapParameter) IsExcludedValues(v map[string]any) bool {
|
||||
a := p.GetExcludedValues()
|
||||
if len(a) == 0 {
|
||||
return false
|
||||
}
|
||||
for _, av := range a {
|
||||
if reflect.DeepEqual(v, av) {
|
||||
return true
|
||||
}
|
||||
}
|
||||
return false
|
||||
}
|
||||
|
||||
// Parse validates and parses an incoming value for the map parameter.
|
||||
func (p *MapParameter) Parse(v any) (any, error) {
|
||||
m, ok := v.(map[string]any)
|
||||
@@ -1373,6 +1512,9 @@ func (p *MapParameter) Parse(v any) (any, error) {
|
||||
if !p.IsAllowedValues(m) {
|
||||
return nil, fmt.Errorf("%s is not an allowed value", m)
|
||||
}
|
||||
if p.IsExcludedValues(m) {
|
||||
return nil, fmt.Errorf("%s is an excluded value", m)
|
||||
}
|
||||
// for generic maps, convert json.Numbers to their corresponding types
|
||||
if p.ValueType == "" {
|
||||
convertedData, err := util.ConvertNumbers(m)
|
||||
|
||||
@@ -750,6 +750,43 @@ func TestParametersParse(t *testing.T) {
|
||||
"my_string": "bar",
|
||||
},
|
||||
},
|
||||
{
|
||||
name: "string not allowed regex",
|
||||
params: tools.Parameters{
|
||||
tools.NewStringParameterWithAllowedValues("my_string", "this param is a string", []any{"^f.*"}),
|
||||
},
|
||||
in: map[string]any{
|
||||
"my_string": "bar",
|
||||
},
|
||||
},
|
||||
{
|
||||
name: "string excluded",
|
||||
params: tools.Parameters{
|
||||
tools.NewStringParameterWithExcludedValues("my_string", "this param is a string", []any{"foo"}),
|
||||
},
|
||||
in: map[string]any{
|
||||
"my_string": "foo",
|
||||
},
|
||||
},
|
||||
{
|
||||
name: "string excluded regex",
|
||||
params: tools.Parameters{
|
||||
tools.NewStringParameterWithExcludedValues("my_string", "this param is a string", []any{"^f.*"}),
|
||||
},
|
||||
in: map[string]any{
|
||||
"my_string": "foo",
|
||||
},
|
||||
},
|
||||
{
|
||||
name: "string not excluded",
|
||||
params: tools.Parameters{
|
||||
tools.NewStringParameterWithExcludedValues("my_string", "this param is a string", []any{"foo"}),
|
||||
},
|
||||
in: map[string]any{
|
||||
"my_string": "bar",
|
||||
},
|
||||
want: tools.ParamValues{tools.ParamValue{Name: "my_string", Value: "bar"}},
|
||||
},
|
||||
{
|
||||
name: "string with escape backticks",
|
||||
params: tools.Parameters{
|
||||
@@ -829,6 +866,16 @@ func TestParametersParse(t *testing.T) {
|
||||
},
|
||||
want: tools.ParamValues{tools.ParamValue{Name: "my_int", Value: 1}},
|
||||
},
|
||||
{
|
||||
name: "int allowed regex",
|
||||
params: tools.Parameters{
|
||||
tools.NewIntParameterWithAllowedValues("my_int", "this param is an int", []any{"^\\d{2}$"}),
|
||||
},
|
||||
in: map[string]any{
|
||||
"my_int": 10,
|
||||
},
|
||||
want: tools.ParamValues{tools.ParamValue{Name: "my_int", Value: 10}},
|
||||
},
|
||||
{
|
||||
name: "int not allowed",
|
||||
params: tools.Parameters{
|
||||
@@ -838,6 +885,53 @@ func TestParametersParse(t *testing.T) {
|
||||
"my_int": 2,
|
||||
},
|
||||
},
|
||||
{
|
||||
name: "int not allowed regex",
|
||||
params: tools.Parameters{
|
||||
tools.NewIntParameterWithAllowedValues("my_int", "this param is an int", []any{"^\\d{2}$"}),
|
||||
},
|
||||
in: map[string]any{
|
||||
"my_int": 100,
|
||||
},
|
||||
},
|
||||
{
|
||||
name: "int excluded",
|
||||
params: tools.Parameters{
|
||||
tools.NewIntParameterWithExcludedValues("my_int", "this param is an int", []any{1}),
|
||||
},
|
||||
in: map[string]any{
|
||||
"my_int": 1,
|
||||
},
|
||||
},
|
||||
{
|
||||
name: "int excluded regex",
|
||||
params: tools.Parameters{
|
||||
tools.NewIntParameterWithExcludedValues("my_int", "this param is an int", []any{"^\\d{2}$"}),
|
||||
},
|
||||
in: map[string]any{
|
||||
"my_int": 10,
|
||||
},
|
||||
},
|
||||
{
|
||||
name: "int not excluded",
|
||||
params: tools.Parameters{
|
||||
tools.NewIntParameterWithExcludedValues("my_int", "this param is an int", []any{1}),
|
||||
},
|
||||
in: map[string]any{
|
||||
"my_int": 2,
|
||||
},
|
||||
want: tools.ParamValues{tools.ParamValue{Name: "my_int", Value: 2}},
|
||||
},
|
||||
{
|
||||
name: "int not excluded regex",
|
||||
params: tools.Parameters{
|
||||
tools.NewIntParameterWithExcludedValues("my_int", "this param is an int", []any{"^\\d{2}$"}),
|
||||
},
|
||||
in: map[string]any{
|
||||
"my_int": 2,
|
||||
},
|
||||
want: tools.ParamValues{tools.ParamValue{Name: "my_int", Value: 2}},
|
||||
},
|
||||
{
|
||||
name: "int minValue",
|
||||
params: tools.Parameters{
|
||||
@@ -905,6 +999,16 @@ func TestParametersParse(t *testing.T) {
|
||||
},
|
||||
want: tools.ParamValues{tools.ParamValue{Name: "my_float", Value: 1.1}},
|
||||
},
|
||||
{
|
||||
name: "float allowed regex",
|
||||
params: tools.Parameters{
|
||||
tools.NewFloatParameterWithAllowedValues("my_float", "this param is a float", []any{"^0\\.\\d+$"}),
|
||||
},
|
||||
in: map[string]any{
|
||||
"my_float": 0.99,
|
||||
},
|
||||
want: tools.ParamValues{tools.ParamValue{Name: "my_float", Value: 0.99}},
|
||||
},
|
||||
{
|
||||
name: "float not allowed",
|
||||
params: tools.Parameters{
|
||||
@@ -914,6 +1018,54 @@ func TestParametersParse(t *testing.T) {
|
||||
"my_float": 1.2,
|
||||
},
|
||||
},
|
||||
{
|
||||
name: "float not allowed regex",
|
||||
params: tools.Parameters{
|
||||
tools.NewFloatParameterWithAllowedValues("my_float", "this param is a float", []any{"^0\\.\\d+$"}),
|
||||
},
|
||||
in: map[string]any{
|
||||
"my_float": 1.99,
|
||||
},
|
||||
},
|
||||
{
|
||||
name: "float excluded",
|
||||
params: tools.Parameters{
|
||||
tools.NewFloatParameterWithExcludedValues("my_float", "this param is a float", []any{1.1}),
|
||||
},
|
||||
in: map[string]any{
|
||||
"my_float": 1.1,
|
||||
},
|
||||
},
|
||||
{
|
||||
name: "float excluded regex",
|
||||
params: tools.Parameters{
|
||||
tools.NewFloatParameterWithExcludedValues("my_float", "this param is a float", []any{"^0\\.\\d+$"}),
|
||||
},
|
||||
in: map[string]any{
|
||||
"my_float": 0.99,
|
||||
},
|
||||
},
|
||||
{
|
||||
name: "float not excluded",
|
||||
params: tools.Parameters{
|
||||
tools.NewFloatParameterWithExcludedValues("my_float", "this param is a float", []any{1.1}),
|
||||
},
|
||||
in: map[string]any{
|
||||
"my_float": 1.2,
|
||||
},
|
||||
want: tools.ParamValues{tools.ParamValue{Name: "my_float", Value: 1.2}},
|
||||
},
|
||||
{
|
||||
name: "float not excluded regex",
|
||||
params: tools.Parameters{
|
||||
tools.NewFloatParameterWithExcludedValues("my_float", "this param is a float", []any{"^0\\.\\d+$"}),
|
||||
},
|
||||
in: map[string]any{
|
||||
"my_float": 1.99,
|
||||
},
|
||||
want: tools.ParamValues{tools.ParamValue{Name: "my_float", Value: 1.99}},
|
||||
},
|
||||
|
||||
{
|
||||
name: "float minValue",
|
||||
params: tools.Parameters{
|
||||
@@ -990,6 +1142,25 @@ func TestParametersParse(t *testing.T) {
|
||||
"my_bool": true,
|
||||
},
|
||||
},
|
||||
{
|
||||
name: "bool excluded",
|
||||
params: tools.Parameters{
|
||||
tools.NewBooleanParameterWithExcludedValues("my_bool", "this param is a bool", []any{true}),
|
||||
},
|
||||
in: map[string]any{
|
||||
"my_bool": true,
|
||||
},
|
||||
},
|
||||
{
|
||||
name: "bool not excluded",
|
||||
params: tools.Parameters{
|
||||
tools.NewBooleanParameterWithExcludedValues("my_bool", "this param is a bool", []any{false}),
|
||||
},
|
||||
in: map[string]any{
|
||||
"my_bool": true,
|
||||
},
|
||||
want: tools.ParamValues{tools.ParamValue{Name: "my_bool", Value: true}},
|
||||
},
|
||||
{
|
||||
name: "string default",
|
||||
params: tools.Parameters{
|
||||
@@ -1136,6 +1307,25 @@ func TestParametersParse(t *testing.T) {
|
||||
"my_map": map[string]any{"key1": "val2"},
|
||||
},
|
||||
},
|
||||
{
|
||||
name: "map excluded",
|
||||
params: tools.Parameters{
|
||||
tools.NewMapParameterWithExcludedValues("my_map", "a map", []any{map[string]any{"key1": "val1"}}, "string"),
|
||||
},
|
||||
in: map[string]any{
|
||||
"my_map": map[string]any{"key1": "val1"},
|
||||
},
|
||||
},
|
||||
{
|
||||
name: "map not excluded",
|
||||
params: tools.Parameters{
|
||||
tools.NewMapParameterWithExcludedValues("my_map", "a map", []any{map[string]any{"key1": "val1"}}, "string"),
|
||||
},
|
||||
in: map[string]any{
|
||||
"my_map": map[string]any{"key1": "val2"},
|
||||
},
|
||||
want: tools.ParamValues{tools.ParamValue{Name: "my_map", Value: map[string]any{"key1": "val2"}}},
|
||||
},
|
||||
}
|
||||
for _, tc := range tcs {
|
||||
t.Run(tc.name, func(t *testing.T) {
|
||||
|
||||
@@ -126,12 +126,13 @@ func (t Tool) Invoke(ctx context.Context, params tools.ParamValues, accessToken
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("error getting logger: %s", err)
|
||||
}
|
||||
logger.DebugContext(ctx, "executing `%s` tool query: %s", kind, sql)
|
||||
logger.DebugContext(ctx, fmt.Sprintf("executing `%s` tool query: %s", kind, sql))
|
||||
|
||||
results, err := t.Pool.Query(ctx, sql)
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("unable to execute query: %w", err)
|
||||
}
|
||||
defer results.Close()
|
||||
|
||||
fields := results.FieldDescriptions()
|
||||
|
||||
@@ -148,6 +149,10 @@ func (t Tool) Invoke(ctx context.Context, params tools.ParamValues, accessToken
|
||||
out = append(out, vMap)
|
||||
}
|
||||
|
||||
if err := results.Err(); err != nil {
|
||||
return err.Error(), fmt.Errorf("unable to execute query: %w", err)
|
||||
}
|
||||
|
||||
return out, nil
|
||||
}
|
||||
|
||||
|
||||
@@ -153,7 +153,7 @@ func (t Tool) Invoke(ctx context.Context, params tools.ParamValues, accessToken
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("error getting logger: %s", err)
|
||||
}
|
||||
logger.DebugContext(ctx, "executing `%s` tool query: %s", kind, sql)
|
||||
logger.DebugContext(ctx, fmt.Sprintf("executing `%s` tool query: %s", kind, sql))
|
||||
|
||||
var results []any
|
||||
var opErr error
|
||||
|
||||
@@ -125,7 +125,7 @@ func (t Tool) Invoke(ctx context.Context, params tools.ParamValues, accessToken
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("error getting logger: %s", err)
|
||||
}
|
||||
logger.DebugContext(ctx, "executing `%s` tool query: %s", kind, sql)
|
||||
logger.DebugContext(ctx, fmt.Sprintf("executing `%s` tool query: %s", kind, sql))
|
||||
|
||||
results, err := t.DB.QueryContext(ctx, sql)
|
||||
if err != nil {
|
||||
|
||||
@@ -123,7 +123,7 @@ func (t Tool) Invoke(ctx context.Context, params tools.ParamValues, accessToken
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("error getting logger: %s", err)
|
||||
}
|
||||
logger.DebugContext(ctx, "executing `%s` tool query: %s", kind, sql)
|
||||
logger.DebugContext(ctx, fmt.Sprintf("executing `%s` tool query: %s", kind, sql))
|
||||
|
||||
results, err := t.Pool.QueryContext(ctx, sql)
|
||||
if err != nil {
|
||||
|
||||
@@ -205,7 +205,7 @@ func TestBigQueryToolEndpoints(t *testing.T) {
|
||||
}
|
||||
|
||||
func TestBigQueryToolWithDatasetRestriction(t *testing.T) {
|
||||
ctx, cancel := context.WithTimeout(context.Background(), 3*time.Minute)
|
||||
ctx, cancel := context.WithTimeout(context.Background(), 4*time.Minute)
|
||||
defer cancel()
|
||||
|
||||
client, err := initBigQueryConnection(BigqueryProject)
|
||||
@@ -225,6 +225,9 @@ func TestBigQueryToolWithDatasetRestriction(t *testing.T) {
|
||||
allowedForecastTableName2 := "allowed_forecast_table_2"
|
||||
disallowedForecastTableName := "disallowed_forecast_table"
|
||||
|
||||
allowedAnalyzeContributionTableName1 := "allowed_analyze_contribution_table_1"
|
||||
allowedAnalyzeContributionTableName2 := "allowed_analyze_contribution_table_2"
|
||||
disallowedAnalyzeContributionTableName := "disallowed_analyze_contribution_table"
|
||||
// Setup allowed table
|
||||
allowedTableNameParam1 := fmt.Sprintf("`%s.%s.%s`", BigqueryProject, allowedDatasetName1, allowedTableName1)
|
||||
createAllowedTableStmt1 := fmt.Sprintf("CREATE TABLE %s (id INT64)", allowedTableNameParam1)
|
||||
@@ -259,6 +262,23 @@ func TestBigQueryToolWithDatasetRestriction(t *testing.T) {
|
||||
teardownDisallowedForecast := setupBigQueryTable(t, ctx, client, createDisallowedForecastStmt, insertDisallowedForecastStmt, disallowedDatasetName, disallowedForecastTableFullName, disallowedForecastParams)
|
||||
defer teardownDisallowedForecast(t)
|
||||
|
||||
// Setup allowed analyze contribution table
|
||||
allowedAnalyzeContributionTableFullName1 := fmt.Sprintf("`%s.%s.%s`", BigqueryProject, allowedDatasetName1, allowedAnalyzeContributionTableName1)
|
||||
createAnalyzeContributionStmt1, insertAnalyzeContributionStmt1, analyzeContributionParams1 := getBigQueryAnalyzeContributionToolInfo(allowedAnalyzeContributionTableFullName1)
|
||||
teardownAllowedAnalyzeContribution1 := setupBigQueryTable(t, ctx, client, createAnalyzeContributionStmt1, insertAnalyzeContributionStmt1, allowedDatasetName1, allowedAnalyzeContributionTableFullName1, analyzeContributionParams1)
|
||||
defer teardownAllowedAnalyzeContribution1(t)
|
||||
|
||||
allowedAnalyzeContributionTableFullName2 := fmt.Sprintf("`%s.%s.%s`", BigqueryProject, allowedDatasetName2, allowedAnalyzeContributionTableName2)
|
||||
createAnalyzeContributionStmt2, insertAnalyzeContributionStmt2, analyzeContributionParams2 := getBigQueryAnalyzeContributionToolInfo(allowedAnalyzeContributionTableFullName2)
|
||||
teardownAllowedAnalyzeContribution2 := setupBigQueryTable(t, ctx, client, createAnalyzeContributionStmt2, insertAnalyzeContributionStmt2, allowedDatasetName2, allowedAnalyzeContributionTableFullName2, analyzeContributionParams2)
|
||||
defer teardownAllowedAnalyzeContribution2(t)
|
||||
|
||||
// Setup disallowed analyze contribution table
|
||||
disallowedAnalyzeContributionTableFullName := fmt.Sprintf("`%s.%s.%s`", BigqueryProject, disallowedDatasetName, disallowedAnalyzeContributionTableName)
|
||||
createDisallowedAnalyzeContributionStmt, insertDisallowedAnalyzeContributionStmt, disallowedAnalyzeContributionParams := getBigQueryAnalyzeContributionToolInfo(disallowedAnalyzeContributionTableFullName)
|
||||
teardownDisallowedAnalyzeContribution := setupBigQueryTable(t, ctx, client, createDisallowedAnalyzeContributionStmt, insertDisallowedAnalyzeContributionStmt, disallowedDatasetName, disallowedAnalyzeContributionTableFullName, disallowedAnalyzeContributionParams)
|
||||
defer teardownDisallowedAnalyzeContribution(t)
|
||||
|
||||
// Configure source with dataset restriction.
|
||||
sourceConfig := getBigQueryVars(t)
|
||||
sourceConfig["allowedDatasets"] = []string{allowedDatasetName1, allowedDatasetName2}
|
||||
@@ -275,6 +295,11 @@ func TestBigQueryToolWithDatasetRestriction(t *testing.T) {
|
||||
"source": "my-instance",
|
||||
"description": "Tool to list table within a dataset",
|
||||
},
|
||||
"get-dataset-info-restricted": map[string]any{
|
||||
"kind": "bigquery-get-dataset-info",
|
||||
"source": "my-instance",
|
||||
"description": "Tool to get dataset info",
|
||||
},
|
||||
"get-table-info-restricted": map[string]any{
|
||||
"kind": "bigquery-get-table-info",
|
||||
"source": "my-instance",
|
||||
@@ -295,6 +320,11 @@ func TestBigQueryToolWithDatasetRestriction(t *testing.T) {
|
||||
"source": "my-instance",
|
||||
"description": "Tool to forecast",
|
||||
},
|
||||
"analyze-contribution-restricted": map[string]any{
|
||||
"kind": "bigquery-analyze-contribution",
|
||||
"source": "my-instance",
|
||||
"description": "Tool to analyze contribution",
|
||||
},
|
||||
}
|
||||
|
||||
// Create config file
|
||||
@@ -322,8 +352,10 @@ func TestBigQueryToolWithDatasetRestriction(t *testing.T) {
|
||||
|
||||
// Run tests
|
||||
runListDatasetIdsWithRestriction(t, allowedDatasetName1, allowedDatasetName2)
|
||||
runListTableIdsWithRestriction(t, allowedDatasetName1, disallowedDatasetName, allowedTableName1, allowedForecastTableName1)
|
||||
runListTableIdsWithRestriction(t, allowedDatasetName2, disallowedDatasetName, allowedTableName2, allowedForecastTableName2)
|
||||
runListTableIdsWithRestriction(t, allowedDatasetName1, disallowedDatasetName, allowedTableName1, allowedForecastTableName1, allowedAnalyzeContributionTableName1)
|
||||
runListTableIdsWithRestriction(t, allowedDatasetName2, disallowedDatasetName, allowedTableName2, allowedForecastTableName2, allowedAnalyzeContributionTableName2)
|
||||
runGetDatasetInfoWithRestriction(t, allowedDatasetName1, disallowedDatasetName)
|
||||
runGetDatasetInfoWithRestriction(t, allowedDatasetName2, disallowedDatasetName)
|
||||
runGetTableInfoWithRestriction(t, allowedDatasetName1, disallowedDatasetName, allowedTableName1, disallowedTableName)
|
||||
runGetTableInfoWithRestriction(t, allowedDatasetName2, disallowedDatasetName, allowedTableName2, disallowedTableName)
|
||||
runExecuteSqlWithRestriction(t, allowedTableNameParam1, disallowedTableNameParam)
|
||||
@@ -332,6 +364,8 @@ func TestBigQueryToolWithDatasetRestriction(t *testing.T) {
|
||||
runConversationalAnalyticsWithRestriction(t, allowedDatasetName2, disallowedDatasetName, allowedTableName2, disallowedTableName)
|
||||
runForecastWithRestriction(t, allowedForecastTableFullName1, disallowedForecastTableFullName)
|
||||
runForecastWithRestriction(t, allowedForecastTableFullName2, disallowedForecastTableFullName)
|
||||
runAnalyzeContributionWithRestriction(t, allowedAnalyzeContributionTableFullName1, disallowedAnalyzeContributionTableFullName)
|
||||
runAnalyzeContributionWithRestriction(t, allowedAnalyzeContributionTableFullName2, disallowedAnalyzeContributionTableFullName)
|
||||
}
|
||||
|
||||
func TestBigQueryWriteModeAllowed(t *testing.T) {
|
||||
@@ -2474,7 +2508,7 @@ func runListDatasetIdsWithRestriction(t *testing.T, allowedDatasetName1, allowed
|
||||
testCases := []struct {
|
||||
name string
|
||||
wantStatusCode int
|
||||
wantElements []string
|
||||
wantElements []string
|
||||
}{
|
||||
{
|
||||
name: "invoke list-dataset-ids with restriction",
|
||||
@@ -2499,7 +2533,7 @@ func runListDatasetIdsWithRestriction(t *testing.T, allowedDatasetName1, allowed
|
||||
if err := json.Unmarshal(bodyBytes, &respBody); err != nil {
|
||||
t.Fatalf("error parsing response body: %v", err)
|
||||
}
|
||||
|
||||
|
||||
gotJSON, ok := respBody["result"].(string)
|
||||
if !ok {
|
||||
t.Fatalf("unable to find 'result' as a string in response body: %s", string(bodyBytes))
|
||||
@@ -2603,6 +2637,55 @@ func runListTableIdsWithRestriction(t *testing.T, allowedDatasetName, disallowed
|
||||
}
|
||||
}
|
||||
|
||||
func runGetDatasetInfoWithRestriction(t *testing.T, allowedDatasetName, disallowedDatasetName string) {
|
||||
testCases := []struct {
|
||||
name string
|
||||
dataset string
|
||||
wantStatusCode int
|
||||
wantInError string
|
||||
}{
|
||||
{
|
||||
name: "invoke on allowed dataset",
|
||||
dataset: allowedDatasetName,
|
||||
wantStatusCode: http.StatusOK,
|
||||
},
|
||||
{
|
||||
name: "invoke on disallowed dataset",
|
||||
dataset: disallowedDatasetName,
|
||||
wantStatusCode: http.StatusBadRequest,
|
||||
wantInError: fmt.Sprintf("access denied to dataset '%s'", disallowedDatasetName),
|
||||
},
|
||||
}
|
||||
|
||||
for _, tc := range testCases {
|
||||
t.Run(tc.name, func(t *testing.T) {
|
||||
body := bytes.NewBuffer([]byte(fmt.Sprintf(`{"dataset":"%s"}`, tc.dataset)))
|
||||
req, err := http.NewRequest(http.MethodPost, "http://127.0.0.1:5000/api/tool/get-dataset-info-restricted/invoke", body)
|
||||
if err != nil {
|
||||
t.Fatalf("unable to create request: %s", err)
|
||||
}
|
||||
req.Header.Add("Content-type", "application/json")
|
||||
resp, err := http.DefaultClient.Do(req)
|
||||
if err != nil {
|
||||
t.Fatalf("unable to send request: %s", err)
|
||||
}
|
||||
defer resp.Body.Close()
|
||||
|
||||
if resp.StatusCode != tc.wantStatusCode {
|
||||
bodyBytes, _ := io.ReadAll(resp.Body)
|
||||
t.Fatalf("unexpected status code: got %d, want %d. Body: %s", resp.StatusCode, tc.wantStatusCode, string(bodyBytes))
|
||||
}
|
||||
|
||||
if tc.wantInError != "" {
|
||||
bodyBytes, _ := io.ReadAll(resp.Body)
|
||||
if !strings.Contains(string(bodyBytes), tc.wantInError) {
|
||||
t.Errorf("unexpected error message: got %q, want to contain %q", string(bodyBytes), tc.wantInError)
|
||||
}
|
||||
}
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
func runGetTableInfoWithRestriction(t *testing.T, allowedDatasetName, disallowedDatasetName, allowedTableName, disallowedTableName string) {
|
||||
testCases := []struct {
|
||||
name string
|
||||
@@ -3069,3 +3152,86 @@ func runForecastWithRestriction(t *testing.T, allowedTableFullName, disallowedTa
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
func runAnalyzeContributionWithRestriction(t *testing.T, allowedTableFullName, disallowedTableFullName string) {
|
||||
allowedTableUnquoted := strings.ReplaceAll(allowedTableFullName, "`", "")
|
||||
disallowedTableUnquoted := strings.ReplaceAll(disallowedTableFullName, "`", "")
|
||||
disallowedDatasetFQN := strings.Join(strings.Split(disallowedTableUnquoted, ".")[0:2], ".")
|
||||
|
||||
testCases := []struct {
|
||||
name string
|
||||
inputData string
|
||||
wantStatusCode int
|
||||
wantInResult string
|
||||
wantInError string
|
||||
}{
|
||||
{
|
||||
name: "invoke with allowed table name",
|
||||
inputData: allowedTableUnquoted,
|
||||
wantStatusCode: http.StatusOK,
|
||||
wantInResult: `"relative_difference"`,
|
||||
},
|
||||
{
|
||||
name: "invoke with disallowed table name",
|
||||
inputData: disallowedTableUnquoted,
|
||||
wantStatusCode: http.StatusBadRequest,
|
||||
wantInError: fmt.Sprintf("access to dataset '%s' (from table '%s') is not allowed", disallowedDatasetFQN, disallowedTableUnquoted),
|
||||
},
|
||||
{
|
||||
name: "invoke with query on allowed table",
|
||||
inputData: fmt.Sprintf("SELECT * FROM %s", allowedTableFullName),
|
||||
wantStatusCode: http.StatusOK,
|
||||
wantInResult: `"relative_difference"`,
|
||||
},
|
||||
{
|
||||
name: "invoke with query on disallowed table",
|
||||
inputData: fmt.Sprintf("SELECT * FROM %s", disallowedTableFullName),
|
||||
wantStatusCode: http.StatusBadRequest,
|
||||
wantInError: fmt.Sprintf("query in input_data accesses dataset '%s', which is not in the allowed list", disallowedDatasetFQN),
|
||||
},
|
||||
}
|
||||
|
||||
for _, tc := range testCases {
|
||||
t.Run(tc.name, func(t *testing.T) {
|
||||
requestBodyMap := map[string]any{
|
||||
"input_data": tc.inputData,
|
||||
"contribution_metric": "SUM(metric)",
|
||||
"is_test_col": "is_test",
|
||||
"dimension_id_cols": []string{"dim1", "dim2"},
|
||||
}
|
||||
bodyBytes, err := json.Marshal(requestBodyMap)
|
||||
if err != nil {
|
||||
t.Fatalf("failed to marshal request body: %v", err)
|
||||
}
|
||||
body := bytes.NewBuffer(bodyBytes)
|
||||
|
||||
resp, bodyBytes := tests.RunRequest(t, http.MethodPost, "http://127.0.0.1:5000/api/tool/analyze-contribution-restricted/invoke", body, nil)
|
||||
|
||||
if resp.StatusCode != tc.wantStatusCode {
|
||||
t.Fatalf("unexpected status code: got %d, want %d. Body: %s", resp.StatusCode, tc.wantStatusCode, string(bodyBytes))
|
||||
}
|
||||
|
||||
var respBody map[string]interface{}
|
||||
if err := json.Unmarshal(bodyBytes, &respBody); err != nil {
|
||||
t.Fatalf("error parsing response body: %v", err)
|
||||
}
|
||||
|
||||
if tc.wantInResult != "" {
|
||||
got, ok := respBody["result"].(string)
|
||||
if !ok {
|
||||
t.Fatalf("unable to find result in response body")
|
||||
}
|
||||
|
||||
if !strings.Contains(got, tc.wantInResult) {
|
||||
t.Errorf("unexpected result: got %q, want to contain %q", string(bodyBytes), tc.wantInResult)
|
||||
}
|
||||
}
|
||||
|
||||
if tc.wantInError != "" {
|
||||
if !strings.Contains(string(bodyBytes), tc.wantInError) {
|
||||
t.Errorf("unexpected error message: got %q, want to contain %q", string(bodyBytes), tc.wantInError)
|
||||
}
|
||||
}
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
@@ -40,7 +40,6 @@ var (
|
||||
CloudSQLMSSQLRegion = os.Getenv("CLOUD_SQL_MSSQL_REGION")
|
||||
CloudSQLMSSQLInstance = os.Getenv("CLOUD_SQL_MSSQL_INSTANCE")
|
||||
CloudSQLMSSQLDatabase = os.Getenv("CLOUD_SQL_MSSQL_DATABASE")
|
||||
CloudSQLMSSQLIp = os.Getenv("CLOUD_SQL_MSSQL_IP")
|
||||
CloudSQLMSSQLUser = os.Getenv("CLOUD_SQL_MSSQL_USER")
|
||||
CloudSQLMSSQLPass = os.Getenv("CLOUD_SQL_MSSQL_PASS")
|
||||
)
|
||||
@@ -53,8 +52,6 @@ func getCloudSQLMSSQLVars(t *testing.T) map[string]any {
|
||||
t.Fatal("'CLOUD_SQL_MSSQL_REGION' not set")
|
||||
case CloudSQLMSSQLInstance:
|
||||
t.Fatal("'CLOUD_SQL_MSSQL_INSTANCE' not set")
|
||||
case CloudSQLMSSQLIp:
|
||||
t.Fatal("'CLOUD_SQL_MSSQL_IP' not set")
|
||||
case CloudSQLMSSQLDatabase:
|
||||
t.Fatal("'CLOUD_SQL_MSSQL_DATABASE' not set")
|
||||
case CloudSQLMSSQLUser:
|
||||
@@ -64,25 +61,23 @@ func getCloudSQLMSSQLVars(t *testing.T) map[string]any {
|
||||
}
|
||||
|
||||
return map[string]any{
|
||||
"kind": CloudSQLMSSQLSourceKind,
|
||||
"project": CloudSQLMSSQLProject,
|
||||
"instance": CloudSQLMSSQLInstance,
|
||||
"ipAddress": CloudSQLMSSQLIp,
|
||||
"region": CloudSQLMSSQLRegion,
|
||||
"database": CloudSQLMSSQLDatabase,
|
||||
"user": CloudSQLMSSQLUser,
|
||||
"password": CloudSQLMSSQLPass,
|
||||
"kind": CloudSQLMSSQLSourceKind,
|
||||
"project": CloudSQLMSSQLProject,
|
||||
"instance": CloudSQLMSSQLInstance,
|
||||
"region": CloudSQLMSSQLRegion,
|
||||
"database": CloudSQLMSSQLDatabase,
|
||||
"user": CloudSQLMSSQLUser,
|
||||
"password": CloudSQLMSSQLPass,
|
||||
}
|
||||
}
|
||||
|
||||
// Copied over from cloud_sql_mssql.go
|
||||
func initCloudSQLMSSQLConnection(project, region, instance, ipAddress, ipType, user, pass, dbname string) (*sql.DB, error) {
|
||||
func initCloudSQLMSSQLConnection(project, region, instance, ipType, user, pass, dbname string) (*sql.DB, error) {
|
||||
// Create dsn
|
||||
query := fmt.Sprintf("database=%s&cloudsql=%s:%s:%s", dbname, project, region, instance)
|
||||
url := &url.URL{
|
||||
Scheme: "sqlserver",
|
||||
User: url.UserPassword(user, pass),
|
||||
Host: ipAddress,
|
||||
RawQuery: query,
|
||||
}
|
||||
|
||||
@@ -118,7 +113,7 @@ func TestCloudSQLMSSQLToolEndpoints(t *testing.T) {
|
||||
|
||||
var args []string
|
||||
|
||||
db, err := initCloudSQLMSSQLConnection(CloudSQLMSSQLProject, CloudSQLMSSQLRegion, CloudSQLMSSQLInstance, CloudSQLMSSQLIp, "public", CloudSQLMSSQLUser, CloudSQLMSSQLPass, CloudSQLMSSQLDatabase)
|
||||
db, err := initCloudSQLMSSQLConnection(CloudSQLMSSQLProject, CloudSQLMSSQLRegion, CloudSQLMSSQLInstance, "public", CloudSQLMSSQLUser, CloudSQLMSSQLPass, CloudSQLMSSQLDatabase)
|
||||
if err != nil {
|
||||
t.Fatalf("unable to create Cloud SQL connection pool: %s", err)
|
||||
}
|
||||
|
||||
@@ -199,6 +199,31 @@ func TestLooker(t *testing.T) {
|
||||
"source": "my-instance",
|
||||
"description": "Simple tool to test end to end functionality.",
|
||||
},
|
||||
"get_connections": map[string]any{
|
||||
"kind": "looker-get-connections",
|
||||
"source": "my-instance",
|
||||
"description": "Simple tool to test end to end functionality.",
|
||||
},
|
||||
"get_connection_schemas": map[string]any{
|
||||
"kind": "looker-get-connection-schemas",
|
||||
"source": "my-instance",
|
||||
"description": "Simple tool to test end to end functionality.",
|
||||
},
|
||||
"get_connection_databases": map[string]any{
|
||||
"kind": "looker-get-connection-databases",
|
||||
"source": "my-instance",
|
||||
"description": "Simple tool to test end to end functionality.",
|
||||
},
|
||||
"get_connection_tables": map[string]any{
|
||||
"kind": "looker-get-connection-tables",
|
||||
"source": "my-instance",
|
||||
"description": "Simple tool to test end to end functionality.",
|
||||
},
|
||||
"get_connection_table_columns": map[string]any{
|
||||
"kind": "looker-get-connection-table-columns",
|
||||
"source": "my-instance",
|
||||
"description": "Simple tool to test end to end functionality.",
|
||||
},
|
||||
},
|
||||
}
|
||||
|
||||
@@ -992,8 +1017,127 @@ func TestLooker(t *testing.T) {
|
||||
},
|
||||
},
|
||||
)
|
||||
tests.RunToolGetTestByName(t, "get_connections",
|
||||
map[string]any{
|
||||
"get_connections": map[string]any{
|
||||
"description": "Simple tool to test end to end functionality.",
|
||||
"authRequired": []any{},
|
||||
"parameters": []any{},
|
||||
},
|
||||
},
|
||||
)
|
||||
tests.RunToolGetTestByName(t, "get_connection_schemas",
|
||||
map[string]any{
|
||||
"get_connection_schemas": map[string]any{
|
||||
"description": "Simple tool to test end to end functionality.",
|
||||
"authRequired": []any{},
|
||||
"parameters": []any{
|
||||
map[string]any{
|
||||
"authSources": []any{},
|
||||
"description": "The connection containing the schemas.",
|
||||
"name": "conn",
|
||||
"required": true,
|
||||
"type": "string",
|
||||
},
|
||||
map[string]any{
|
||||
"authSources": []any{},
|
||||
"description": "The optional database to search",
|
||||
"name": "db",
|
||||
"required": false,
|
||||
"type": "string",
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
)
|
||||
tests.RunToolGetTestByName(t, "get_connection_databases",
|
||||
map[string]any{
|
||||
"get_connection_databases": map[string]any{
|
||||
"description": "Simple tool to test end to end functionality.",
|
||||
"authRequired": []any{},
|
||||
"parameters": []any{
|
||||
map[string]any{
|
||||
"authSources": []any{},
|
||||
"description": "The connection containing the databases.",
|
||||
"name": "conn",
|
||||
"required": true,
|
||||
"type": "string",
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
)
|
||||
tests.RunToolGetTestByName(t, "get_connection_tables",
|
||||
map[string]any{
|
||||
"get_connection_tables": map[string]any{
|
||||
"description": "Simple tool to test end to end functionality.",
|
||||
"authRequired": []any{},
|
||||
"parameters": []any{
|
||||
map[string]any{
|
||||
"authSources": []any{},
|
||||
"description": "The connection containing the tables.",
|
||||
"name": "conn",
|
||||
"required": true,
|
||||
"type": "string",
|
||||
},
|
||||
map[string]any{
|
||||
"authSources": []any{},
|
||||
"description": "The optional database to search",
|
||||
"name": "db",
|
||||
"required": false,
|
||||
"type": "string",
|
||||
},
|
||||
map[string]any{
|
||||
"authSources": []any{},
|
||||
"description": "The schema containing the tables.",
|
||||
"name": "schema",
|
||||
"required": true,
|
||||
"type": "string",
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
)
|
||||
tests.RunToolGetTestByName(t, "get_connection_table_columns",
|
||||
map[string]any{
|
||||
"get_connection_table_columns": map[string]any{
|
||||
"description": "Simple tool to test end to end functionality.",
|
||||
"authRequired": []any{},
|
||||
"parameters": []any{
|
||||
map[string]any{
|
||||
"authSources": []any{},
|
||||
"description": "The connection containing the tables.",
|
||||
"name": "conn",
|
||||
"required": true,
|
||||
"type": "string",
|
||||
},
|
||||
map[string]any{
|
||||
"authSources": []any{},
|
||||
"description": "The optional database to search",
|
||||
"name": "db",
|
||||
"required": false,
|
||||
"type": "string",
|
||||
},
|
||||
map[string]any{
|
||||
"authSources": []any{},
|
||||
"description": "The schema containing the tables.",
|
||||
"name": "schema",
|
||||
"required": true,
|
||||
"type": "string",
|
||||
},
|
||||
map[string]any{
|
||||
"authSources": []any{},
|
||||
"description": "A comma separated list of tables containing the columns.",
|
||||
"name": "tables",
|
||||
"required": true,
|
||||
"type": "string",
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
)
|
||||
|
||||
wantResult := "{\"label\":\"System Activity\",\"name\":\"system__activity\",\"project_name\":\"system__activity\"}"
|
||||
wantResult := "{\"connections\":[],\"label\":\"System Activity\",\"name\":\"system__activity\",\"project_name\":\"system__activity\"}"
|
||||
tests.RunToolInvokeSimpleTest(t, "get_models", wantResult)
|
||||
|
||||
wantResult = "{\"description\":\"Data about Look and dashboard usage, including frequency of views, favoriting, scheduling, embedding, and access via the API. Also includes details about individual Looks and dashboards.\",\"group_label\":\"System Activity\",\"label\":\"Content Usage\",\"name\":\"content_usage\"}"
|
||||
@@ -1070,6 +1214,21 @@ func TestLooker(t *testing.T) {
|
||||
|
||||
wantResult = "production"
|
||||
tests.RunToolInvokeParametersTest(t, "dev_mode", []byte(`{"devMode": false}`), wantResult)
|
||||
|
||||
wantResult = "thelook"
|
||||
tests.RunToolInvokeSimpleTest(t, "get_connections", wantResult)
|
||||
|
||||
wantResult = "{\"name\":\"demo_db\",\"is_default\":true}"
|
||||
tests.RunToolInvokeParametersTest(t, "get_connection_schemas", []byte(`{"conn": "thelook"}`), wantResult)
|
||||
|
||||
wantResult = "[]"
|
||||
tests.RunToolInvokeParametersTest(t, "get_connection_databases", []byte(`{"conn": "thelook"}`), wantResult)
|
||||
|
||||
wantResult = "Employees"
|
||||
tests.RunToolInvokeParametersTest(t, "get_connection_tables", []byte(`{"conn": "thelook", "schema": "demo_db"}`), wantResult)
|
||||
|
||||
wantResult = "{\"column_name\":\"EmpID\",\"data_type_database\":\"int\",\"data_type_looker\":\"number\",\"sql_escaped_column_name\":\"EmpID\"}"
|
||||
tests.RunToolInvokeParametersTest(t, "get_connection_table_columns", []byte(`{"conn": "thelook", "schema": "demo_db", "tables": "Employees"}`), wantResult)
|
||||
}
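For reference, the invoke calls exercised above reduce to plain HTTP POSTs against the locally started Toolbox server. A hypothetical standalone sketch follows; the port, tool name, and parameter values mirror the test setup ("thelook", "demo_db") and are assumptions, not a fixed contract:

package main

import (
	"bytes"
	"fmt"
	"io"
	"net/http"
)

func main() {
	// Invoke the get_connection_tables tool the same way the test harness does.
	body := bytes.NewBufferString(`{"conn": "thelook", "schema": "demo_db"}`)
	resp, err := http.Post("http://127.0.0.1:5000/api/tool/get_connection_tables/invoke", "application/json", body)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()
	out, _ := io.ReadAll(resp.Body)
	fmt.Println(resp.StatusCode, string(out))
}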
|
||||
|
||||
func runConversationalAnalytics(t *testing.T, modelName, exploreName string) {
|
||||
|
||||
@@ -746,6 +746,20 @@ func RunExecuteSqlToolInvokeTest(t *testing.T, createTableStatement, select1Want
|
||||
requestBody: bytes.NewBuffer([]byte(fmt.Sprintf(`{"sql": %s}`, configs.select1Statement))),
|
||||
isErr: true,
|
||||
},
|
||||
{
|
||||
name: "invoke my-exec-sql-tool with invalid SELECT SQL",
|
||||
api: "http://127.0.0.1:5000/api/tool/my-exec-sql-tool/invoke",
|
||||
requestHeader: map[string]string{},
|
||||
requestBody: bytes.NewBuffer([]byte(`{"sql":"SELECT * FROM non_existent_table"}`)),
|
||||
isErr: true,
|
||||
},
|
||||
{
|
||||
name: "invoke my-exec-sql-tool with invalid ALTER SQL",
|
||||
api: "http://127.0.0.1:5000/api/tool/my-exec-sql-tool/invoke",
|
||||
requestHeader: map[string]string{},
|
||||
requestBody: bytes.NewBuffer([]byte(`{"sql":"ALTER TALE t ALTER COLUMN id DROP NOT NULL"}`)),
|
||||
isErr: true,
|
||||
},
|
||||
}
|
||||
for _, tc := range invokeTcs {
|
||||
t.Run(tc.name, func(t *testing.T) {
|
||||
|
||||