mirror of
https://github.com/googleapis/genai-toolbox.git
synced 2026-01-07 22:54:06 -05:00
docs: update long lines and tables (#1952)
Update long lines and tables formatting in markdown doc files.
@@ -93,7 +93,8 @@ implementation](https://github.com/googleapis/genai-toolbox/blob/main/internal/s

### Adding a New Tool

> [!NOTE]
> Please follow the tool naming convention detailed
> [here](./DEVELOPER.md#tool-naming-conventions).

We recommend looking at an [example tool
implementation](https://github.com/googleapis/genai-toolbox/tree/main/internal/tools/postgres/postgressql).

@@ -129,10 +130,10 @@ tools.

* **Add a test file** under a new directory `tests/newdb`.
* **Add pre-defined integration test suites** in the
  `/tests/newdb/newdb_integration_test.go` that are **required** to be run as
  long as your code contains related features. Please check each test suite for
  the config defaults; if your source requires test suite config updates, please
  refer to [config option](./tests/option.go):

1. [RunToolGetTest][tool-get]: tests for the `GET` endpoint that returns the
   tool's manifest.

26 DEVELOPER.md
@@ -255,18 +255,25 @@ Follow these steps to preview documentation changes locally using a Hugo server:

There are 3 GHA workflows we use to achieve document versioning:

1. **Deploy In-development docs:**
   This workflow is run on every commit merged into the main branch. It deploys
   the built site to the `/dev/` subdirectory for the in-development
   documentation.

1. **Deploy Versioned Docs:**
   When a new GitHub Release is published, it performs two deployments based on
   the new release tag: one to the new version subdirectory and one to the root
   directory of the versioned-gh-pages branch.

   **Note:** Before the release PR from release-please is merged, add the
   newest version into the hugo.toml file.

1. **Deploy Previous Version Docs:**
   This is a manual workflow, started from the GitHub Actions UI. It rebuilds
   and redeploys documentation for versions that were released before this new
   system was in place. Start the workflow from the UI by providing the git
   version tag you want to create the documentation for. The specific versioned
   subdirectory and the root docs are updated on the versioned-gh-pages branch.

#### Contributors

@@ -337,7 +344,9 @@ for instructions on developing Toolbox SDKs.

Team `@googleapis/senseai-eco` has been set as
[CODEOWNERS](.github/CODEOWNERS). The GitHub TeamSync tool is used to create
this team from MDB Group, `senseai-eco`. Additionally, database-specific GitHub
teams (e.g., `@googleapis/toolbox-alloydb`) have been created from MDB groups to
manage code ownership and review for individual database products.

Team `@googleapis/toolbox-contributors` has write access to this repo. They
can create branches and approve test runs. But they do not have the ability

@@ -441,7 +450,8 @@ Trigger pull request tests for external contributors by:

## Repo Setup & Automation

* .github/blunderbuss.yml - Auto-assign issues and PRs from GitHub teams. Use a
  product label to assign to a product-specific team member.
* .github/renovate.json5 - Tooling for dependency updates. Dependabot is built
  into the GitHub repo for GitHub security warnings.
* go/github-issue-mirror - GitHub issues are automatically mirrored into
  Buganizer.

@@ -1,10 +1,14 @@

This document helps you find and install the right Gemini CLI extension to
interact with your databases.

## How to Install an Extension

To install any of the extensions listed below, use the `gemini extensions
install` command followed by the extension's GitHub repository URL.

For complete instructions on finding, installing, and managing extensions,
please see the [official Gemini CLI extensions
documentation](https://github.com/google-gemini/gemini-cli/blob/main/docs/extensions/index.md).

**Example Installation Command:**

```bash
gemini extensions install https://github.com/gemini-cli-extensions/EXTENSION_NAME
```


Make sure the user knows:

* These commands are not supported from within the CLI
* These commands will only be reflected in active CLI sessions on restart
* Extensions require Application Default Credentials in your environment. See
  [Set up ADC for a local development
  environment](https://cloud.google.com/docs/authentication/set-up-adc-local-dev-environment)
  to learn how you can provide either your user credentials or service account
  credentials to ADC in a local development environment.
* Most extensions require you to set environment variables to connect to a
  database. If there is a link provided for the configuration, fetch the web
  page and return the configuration.

-----

## Find Your Database Extension

Find your database or service in the list below to get the correct installation
command.

**Note on Observability:** Extensions with `-observability` in their name are
designed to help you understand the health and performance of your database
instances, often by analyzing metrics and logs.

### Google Cloud Managed Databases

#### BigQuery

* For data analytics and querying:

  ```bash
  gemini extensions install https://github.com/gemini-cli-extensions/bigquery-data-analytics
  ```

  Configuration:
  https://github.com/gemini-cli-extensions/bigquery-data-analytics/tree/main?tab=readme-ov-file#configuration

* For conversational analytics (using natural language):

  ```bash
  gemini extensions install https://github.com/gemini-cli-extensions/bigquery-conversational-analytics
  ```

  Configuration:
  https://github.com/gemini-cli-extensions/bigquery-conversational-analytics/tree/main?tab=readme-ov-file#configuration

#### Cloud SQL for MySQL

* Main Extension:

  ```bash
  gemini extensions install https://github.com/gemini-cli-extensions/cloud-sql-mysql
  ```

  Configuration:
  https://github.com/gemini-cli-extensions/cloud-sql-mysql/tree/main?tab=readme-ov-file#configuration

* Observability:

  ```bash
  gemini extensions install https://github.com/gemini-cli-extensions/cloud-sql-mysql-observability
  ```

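Every extension in this guide installs the same way, so installing several at once is just a loop over repository names. A sketch (the loop only prints each command so you can review it first; drop the `echo` to actually run them):

```sh
# Print (or, without echo, run) the install command for several extensions.
# Each name is a repository under github.com/gemini-cli-extensions.
for ext in cloud-sql-mysql cloud-sql-mysql-observability; do
  echo "gemini extensions install https://github.com/gemini-cli-extensions/$ext"
done
```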
@@ -61,129 +82,166 @@ Find your database or service in the list below to get the correct installation

#### Cloud SQL for PostgreSQL

* Main Extension:

  ```bash
  gemini extensions install https://github.com/gemini-cli-extensions/cloud-sql-postgresql
  ```

  Configuration:
  https://github.com/gemini-cli-extensions/cloud-sql-postgresql/tree/main?tab=readme-ov-file#configuration

* Observability:

  ```bash
  gemini extensions install https://github.com/gemini-cli-extensions/cloud-sql-postgresql-observability
  ```

  If you are looking for other PostgreSQL options, consider the `postgres`
  extension for self-hosted instances, or the `alloydb` extension for AlloyDB
  for PostgreSQL.

#### Cloud SQL for SQL Server

* Main Extension:

  ```bash
  gemini extensions install https://github.com/gemini-cli-extensions/cloud-sql-sqlserver
  ```

  Configuration:
  https://github.com/gemini-cli-extensions/cloud-sql-sqlserver/tree/main?tab=readme-ov-file#configuration

* Observability:

  ```bash
  gemini extensions install https://github.com/gemini-cli-extensions/cloud-sql-sqlserver-observability
  ```

  If you are looking for self-hosted SQL Server, consider the `sql-server`
  extension.

#### AlloyDB for PostgreSQL

* Main Extension:

  ```bash
  gemini extensions install https://github.com/gemini-cli-extensions/alloydb
  ```

  Configuration:
  https://github.com/gemini-cli-extensions/alloydb/tree/main?tab=readme-ov-file#configuration

* Observability:

  ```bash
  gemini extensions install https://github.com/gemini-cli-extensions/alloydb-observability
  ```

  If you are looking for other PostgreSQL options, consider the `postgres`
  extension for self-hosted instances, or the `cloud-sql-postgresql` extension
  for Cloud SQL for PostgreSQL.

#### Spanner

* For querying Spanner databases:

  ```bash
  gemini extensions install https://github.com/gemini-cli-extensions/spanner
  ```

  Configuration:
  https://github.com/gemini-cli-extensions/spanner/tree/main?tab=readme-ov-file#configuration

#### Firestore

* For querying Firestore in Native Mode:

  ```bash
  gemini extensions install https://github.com/gemini-cli-extensions/firestore-native
  ```

  Configuration:
  https://github.com/gemini-cli-extensions/firestore-native/tree/main?tab=readme-ov-file#configuration

### Other Google Cloud Data Services

#### Dataplex

* For interacting with Dataplex data lakes and assets:

  ```bash
  gemini extensions install https://github.com/gemini-cli-extensions/dataplex
  ```

  Configuration:
  https://github.com/gemini-cli-extensions/dataplex/tree/main?tab=readme-ov-file#configuration

#### Looker

* For querying Looker instances:

  ```bash
  gemini extensions install https://github.com/gemini-cli-extensions/looker
  ```

  Configuration:
  https://github.com/gemini-cli-extensions/looker/tree/main?tab=readme-ov-file#configuration

### Other Database Engines

These extensions are for connecting to database instances not managed by Cloud
SQL (e.g., self-hosted on-prem, on a VM, or in another cloud).

* MySQL:

  ```bash
  gemini extensions install https://github.com/gemini-cli-extensions/mysql
  ```

  Configuration:
  https://github.com/gemini-cli-extensions/mysql/tree/main?tab=readme-ov-file#configuration

  If you are looking for Google Cloud managed MySQL, consider the
  `cloud-sql-mysql` extension.

* PostgreSQL:

  ```bash
  gemini extensions install https://github.com/gemini-cli-extensions/postgres
  ```

  Configuration:
  https://github.com/gemini-cli-extensions/postgres/tree/main?tab=readme-ov-file#configuration

  If you are looking for Google Cloud managed PostgreSQL, consider the
  `cloud-sql-postgresql` or `alloydb` extensions.

* SQL Server:

  ```bash
  gemini extensions install https://github.com/gemini-cli-extensions/sql-server
  ```

  Configuration:
  https://github.com/gemini-cli-extensions/sql-server/tree/main?tab=readme-ov-file#configuration

  If you are looking for Google Cloud managed SQL Server, consider the
  `cloud-sql-sqlserver` extension.

### Custom Tools

#### MCP Toolbox

* For connecting to MCP Toolbox servers:

  This extension can be used with any Google Cloud database to build custom
  tools. For more information, see the [MCP Toolbox
  documentation](https://googleapis.github.io/genai-toolbox/getting-started/introduction/).

  ```bash
  gemini extensions install https://github.com/gemini-cli-extensions/mcp-toolbox
  ```

  Configuration:
  https://github.com/gemini-cli-extensions/mcp-toolbox/tree/main?tab=readme-ov-file#configuration

@@ -804,8 +804,6 @@ For more detailed instructions on using the Toolbox Core SDK, see the

For more detailed instructions on using the Toolbox Go SDK, see the
[project's README][toolbox-core-go-readme].

[toolbox-go]: https://pkg.go.dev/github.com/googleapis/mcp-toolbox-sdk-go/core
[toolbox-core-go-readme]: https://github.com/googleapis/mcp-toolbox-sdk-go/blob/main/core/README.md

</details>
</details>

@@ -183,11 +183,11 @@ Protocol (OTLP). If you would like to use a collector, please refer to this

The following flags are used to determine Toolbox's telemetry configuration:

| **flag**                   | **type** | **description**                                                                                                  |
|----------------------------|----------|------------------------------------------------------------------------------------------------------------------|
| `--telemetry-gcp`          | bool     | Enable exporting directly to Google Cloud Monitoring. Default is `false`.                                        |
| `--telemetry-otlp`         | string   | Enable exporting using OpenTelemetry Protocol (OTLP) to the specified endpoint (e.g. "<http://127.0.0.1:4318>"). |
| `--telemetry-service-name` | string   | Sets the value of the `service.name` resource attribute. Default is `toolbox`.                                   |

In addition to the flags noted above, you can also make additional configuration
for OpenTelemetry via the [General SDK Configuration][sdk-configuration] through

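As a sketch, the flags in the table combine on a single command line. The endpoint and service name below are example values, not defaults, and the `echo` only prints the resulting command rather than starting the server:

```sh
# Assemble a Toolbox invocation with telemetry enabled (example values only).
OTLP_ENDPOINT="http://127.0.0.1:4318"
SERVICE_NAME="my-toolbox"
echo "toolbox --telemetry-otlp $OTLP_ENDPOINT --telemetry-service-name $SERVICE_NAME"
```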
@@ -99,7 +99,8 @@ my_second_toolset = client.load_toolset("my_second_toolset")

### Prompts

The `prompts` section of your `tools.yaml` defines the templates containing
structured messages and instructions for interacting with language models.

```yaml
prompts:
```

@@ -84,38 +84,46 @@ following instructions for your OS and CPU architecture.

{{< tabpane text=true >}}
{{% tab header="Linux (AMD64)" lang="en" %}}
To install Toolbox as a binary on Linux (AMD64):

```sh
# see releases page for other versions
export VERSION=0.20.0
curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v$VERSION/linux/amd64/toolbox
chmod +x toolbox
```

{{% /tab %}}
{{% tab header="macOS (Apple Silicon)" lang="en" %}}
To install Toolbox as a binary on macOS (Apple Silicon):

```sh
# see releases page for other versions
export VERSION=0.20.0
curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v$VERSION/darwin/arm64/toolbox
chmod +x toolbox
```

{{% /tab %}}
{{% tab header="macOS (Intel)" lang="en" %}}
To install Toolbox as a binary on macOS (Intel):

```sh
# see releases page for other versions
export VERSION=0.20.0
curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v$VERSION/darwin/amd64/toolbox
chmod +x toolbox
```

{{% /tab %}}
{{% tab header="Windows (AMD64)" lang="en" %}}
To install Toolbox as a binary on Windows (AMD64):

```powershell
# see releases page for other versions
$VERSION = "0.20.0"
Invoke-WebRequest -Uri "https://storage.googleapis.com/genai-toolbox/v$VERSION/windows/amd64/toolbox.exe" -OutFile "toolbox.exe"
```

{{% /tab %}}
{{< /tabpane >}}
{{% /tab %}}

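The platform tabs above differ only in the OS and architecture segments of the download URL. A small sketch that assembles the URL, with the scheme taken from the commands above (note that on Windows the binary name carries an `.exe` suffix):

```sh
# Build the Toolbox download URL from version, OS, and CPU architecture.
VERSION=0.20.0
OS=linux    # linux, darwin, or windows
ARCH=amd64  # amd64 or arm64
echo "https://storage.googleapis.com/genai-toolbox/v$VERSION/$OS/$ARCH/toolbox"
```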
@@ -25,12 +25,15 @@ This guide assumes you have already done the following:

[install-postgres]: https://www.postgresql.org/download/

### Cloud Setup (Optional)

{{< regionInclude "quickstart/shared/cloud_setup.md" "cloud_setup" >}}

## Step 1: Set up your database

{{< regionInclude "quickstart/shared/database_setup.md" "database_setup" >}}

## Step 2: Install and configure Toolbox

{{< regionInclude "quickstart/shared/configure_toolbox.md" "configure_toolbox" >}}

## Step 3: Connect your agent to Toolbox

@@ -59,7 +59,7 @@ npm install genkit @genkit-ai/googleai

npm install llamaindex @llamaindex/google @llamaindex/workflow
{{< /tab >}}
{{< tab header="GoogleGenAI" lang="bash" >}}
npm install @google/genai
{{< /tab >}}
{{< /tabpane >}}

@@ -254,6 +254,7 @@ curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/windows/amd64/toolb

}
}
```

{{% /tab %}}

{{% tab header="Gemini Code Assist" lang="en" %}}

@@ -278,6 +279,7 @@ curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/windows/amd64/toolb

}
}
```

{{% /tab %}}
{{< /tabpane >}}

@@ -287,6 +289,7 @@ Your AI tool is now connected to Cloud SQL for SQL Server using MCP.

The `cloud-sql-mssql-admin` server provides tools for managing your Cloud SQL
instances and interacting with your database:

* **create_instance**: Creates a new Cloud SQL for SQL Server instance.
* **get_instance**: Gets information about a Cloud SQL instance.
* **list_instances**: Lists Cloud SQL instances in a project.

@@ -254,6 +254,7 @@ curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/windows/amd64/toolb
}
}
```

{{% /tab %}}

{{% tab header="Gemini Code Assist" lang="en" %}}

@@ -278,6 +279,7 @@ curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/windows/amd64/toolb
}
}
```

{{% /tab %}}
{{< /tabpane >}}

@@ -287,6 +289,7 @@ Your AI tool is now connected to Cloud SQL for MySQL using MCP.

The `cloud-sql-mysql-admin` server provides tools for managing your Cloud SQL
instances and interacting with your database:

* **create_instance**: Creates a new Cloud SQL for MySQL instance.
* **get_instance**: Gets information about a Cloud SQL instance.
* **list_instances**: Lists Cloud SQL instances in a project.

@@ -254,6 +254,7 @@ curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/windows/amd64/toolb
}
}
```

{{% /tab %}}

{{% tab header="Gemini Code Assist" lang="en" %}}

@@ -278,6 +279,7 @@ curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/windows/amd64/toolb
}
}
```

{{% /tab %}}
{{< /tabpane >}}

@@ -287,6 +289,7 @@ Your AI tool is now connected to Cloud SQL for PostgreSQL using MCP.

The `cloud-sql-postgres-admin` server provides tools for managing your Cloud SQL
instances and interacting with your database:

* **create_instance**: Creates a new Cloud SQL for PostgreSQL instance.
* **get_instance**: Gets information about a Cloud SQL instance.
* **list_instances**: Lists Cloud SQL instances in a project.

@@ -46,6 +46,7 @@ to expose your developer assistant tools to a Looker instance:
v0.10.0+:

<!-- {x-release-please-start-version} -->

{{< tabpane persist=header >}}
{{< tab header="linux/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.20.0/linux/amd64/toolbox
@@ -82,7 +83,8 @@ curl -O https://storage.googleapis.com/genai-toolbox/v0.20.0/windows/amd64/toolb
{{< tabpane text=true >}}
{{% tab header="Gemini-CLI" lang="en" %}}

1. Install
   [Gemini-CLI](https://github.com/google-gemini/gemini-cli#install-globally-with-npm).
1. Create a directory `.gemini` in your home directory if it doesn't exist.
1. Create the file `.gemini/settings.json` if it doesn't exist.
1. Add the following configuration, or add the mcpServers stanza if you already
@@ -287,7 +289,8 @@ Your AI tool is now connected to Looker using MCP. Try asking your AI
assistant to list models, explores, dimensions, and measures. Run a
query, retrieve the SQL for a query, and run a saved Look.

The full tool list is available in the [Prebuilt Tools
Reference](../../reference/prebuilt-tools/#looker).

The following tools are available to the LLM:

@@ -314,8 +317,10 @@ instance and create new saved content.
1. **get_looks**: Return the saved Looks that match a title or description
1. **run_look**: Run a saved Look and return the data
1. **make_look**: Create a saved Look in Looker and return the URL
1. **get_dashboards**: Return the saved dashboards that match a title or
   description
1. **run_dashbaord**: Run the queries associated with a dashboard and return the
   data
1. **make_dashboard**: Create a saved dashboard in Looker and return the URL
1. **add_dashboard_element**: Add a tile to a dashboard

@@ -344,7 +349,8 @@ as well as get the database schema needed to write LookML effectively.
1. **get_connection_schemas**: Get the list of schemas for a connection
1. **get_connection_databases**: Get the list of databases for a connection
1. **get_connection_tables**: Get the list of tables for a connection
1. **get_connection_table_columns**: Get the list of columns for a table in a
   connection

{{< notice note >}}
Prebuilt tools are pre-1.0, so expect some tool changes between versions. LLMs
@@ -217,6 +217,7 @@ curl -O https://storage.googleapis.com/genai-toolbox/v0.20.0/windows/amd64/toolb
}
}
```

{{% /tab %}}
{{% tab header="Windsurf" lang="en" %}}

@@ -243,6 +244,7 @@ curl -O https://storage.googleapis.com/genai-toolbox/v0.20.0/windows/amd64/toolb
}
}
```

{{% /tab %}}
{{% tab header="Gemini CLI" lang="en" %}}

@@ -270,6 +272,7 @@ curl -O https://storage.googleapis.com/genai-toolbox/v0.20.0/windows/amd64/toolb
}
}
```

{{% /tab %}}
{{% tab header="Gemini Code Assist" lang="en" %}}

@@ -299,6 +302,7 @@ curl -O https://storage.googleapis.com/genai-toolbox/v0.20.0/windows/amd64/toolb
}
}
```

{{% /tab %}}
{{< /tabpane >}}

@@ -215,6 +215,7 @@ curl -O https://storage.googleapis.com/genai-toolbox/v0.20.0/windows/amd64/toolb
}
}
```

{{% /tab %}}
{{% tab header="Windsurf" lang="en" %}}

@@ -241,6 +242,7 @@ curl -O https://storage.googleapis.com/genai-toolbox/v0.20.0/windows/amd64/toolb
}
}
```

{{% /tab %}}
{{% tab header="Gemini CLI" lang="en" %}}

@@ -268,6 +270,7 @@ curl -O https://storage.googleapis.com/genai-toolbox/v0.20.0/windows/amd64/toolb
}
}
```

{{% /tab %}}
{{% tab header="Gemini Code Assist" lang="en" %}}

@@ -297,6 +300,7 @@ curl -O https://storage.googleapis.com/genai-toolbox/v0.20.0/windows/amd64/toolb
}
}
```

{{% /tab %}}
{{< /tabpane >}}

@@ -79,10 +79,10 @@ curl -O https://storage.googleapis.com/genai-toolbox/v0.20.0/windows/amd64/toolb
{{% tab header="Claude code" lang="en" %}}

1. Install [Claude
   Code](https://docs.anthropic.com/en/docs/agents-and-tools/claude-code/overview).
1. Create a `.mcp.json` file in your project root if it doesn't exist.
1. Add the following configuration, replace the environment variables with your
   values, and save:

   ```json
   {
@@ -108,7 +108,7 @@ curl -O https://storage.googleapis.com/genai-toolbox/v0.20.0/windows/amd64/toolb
1. Open [Claude desktop](https://claude.ai/download) and navigate to Settings.
1. Under the Developer tab, tap Edit Config to open the configuration file.
1. Add the following configuration, replace the environment variables with your
   values, and save:

   ```json
   {
@@ -129,15 +129,15 @@ curl -O https://storage.googleapis.com/genai-toolbox/v0.20.0/windows/amd64/toolb

1. Restart Claude desktop.
1. From the new chat screen, you should see a hammer (MCP) icon appear with the
   new MCP server available.
{{% /tab %}}
{{% tab header="Cline" lang="en" %}}

1. Open the [Cline](https://github.com/cline/cline) extension in VS Code and
   tap the **MCP Servers** icon.
1. Tap Configure MCP Servers to open the configuration file.
1. Add the following configuration, replace the environment variables with your
   values, and save:

   ```json
   {
@@ -156,13 +156,15 @@ curl -O https://storage.googleapis.com/genai-toolbox/v0.20.0/windows/amd64/toolb
   }
   ```

1. You should see a green active status after the server is successfully
   connected.
{{% /tab %}}
{{% tab header="Cursor" lang="en" %}}

1. Create a `.cursor` directory in your project root if it doesn't exist.
1. Create a `.cursor/mcp.json` file if it doesn't exist and open it.
1. Add the following configuration, replace the environment variables with your
   values, and save:

   ```json
   {
@@ -211,6 +213,7 @@ curl -O https://storage.googleapis.com/genai-toolbox/v0.20.0/windows/amd64/toolb
}
}
```

{{% /tab %}}
{{% tab header="Windsurf" lang="en" %}}

@@ -236,6 +239,7 @@ curl -O https://storage.googleapis.com/genai-toolbox/v0.20.0/windows/amd64/toolb
}
}
```

{{% /tab %}}
{{% tab header="Gemini CLI" lang="en" %}}

@@ -262,6 +266,7 @@ curl -O https://storage.googleapis.com/genai-toolbox/v0.20.0/windows/amd64/toolb
}
}
```

{{% /tab %}}
{{% tab header="Gemini Code Assist" lang="en" %}}

@@ -290,6 +295,7 @@ curl -O https://storage.googleapis.com/genai-toolbox/v0.20.0/windows/amd64/toolb
}
}
```

{{% /tab %}}
{{< /tabpane >}}

@@ -287,6 +287,7 @@ curl -O https://storage.googleapis.com/genai-toolbox/v0.20.0/windows/amd64/toolb
}
}
```

{{% /tab %}}

{{% tab header="Gemini Code Assist" lang="en" %}}

@@ -313,6 +314,7 @@ curl -O https://storage.googleapis.com/genai-toolbox/v0.20.0/windows/amd64/toolb
}
}
```

{{% /tab %}}
{{< /tabpane >}}

@@ -195,6 +195,7 @@ curl -O https://storage.googleapis.com/genai-toolbox/v0.20.0/windows/amd64/toolb
}
}
```

{{% /tab %}}
{{% tab header="Windsurf" lang="en" %}}

@@ -217,6 +218,7 @@ curl -O https://storage.googleapis.com/genai-toolbox/v0.20.0/windows/amd64/toolb
}
}
```

{{% /tab %}}
{{% tab header="Gemini CLI" lang="en" %}}

@@ -240,6 +242,7 @@ curl -O https://storage.googleapis.com/genai-toolbox/v0.20.0/windows/amd64/toolb
}
}
```

{{% /tab %}}
{{% tab header="Gemini Code Assist" lang="en" %}}

@@ -265,6 +268,7 @@ curl -O https://storage.googleapis.com/genai-toolbox/v0.20.0/windows/amd64/toolb
}
}
```

{{% /tab %}}
{{< /tabpane >}}

@@ -7,12 +7,20 @@ description: "Connect to Toolbox via Gemini CLI Extensions."

## Gemini CLI Extensions

[Gemini CLI][gemini-cli] is an open-source AI agent designed to assist with
development workflows by assisting with coding, debugging, data exploration, and
content creation. Its mission is to provide an agentic interface for interacting
with database and analytics services and popular open-source databases.

### How extensions work

Gemini CLI is highly extensible, allowing for the addition of new tools and
capabilities through extensions. You can load the extensions from a GitHub URL,
a local directory, or a configurable registry. They provide new tools, slash
commands, and prompts to assist with your workflow.

Use the Gemini CLI Extensions to load prebuilt or custom tools to interact with
your databases.

[gemini-cli]: https://google-gemini.github.io/gemini-cli/

@@ -35,4 +43,4 @@ Below are a list of Gemini CLI Extensions powered by MCP Toolbox:
* [mysql](https://github.com/gemini-cli-extensions/mysql)
* [postgres](https://github.com/gemini-cli-extensions/postgres)
* [spanner](https://github.com/gemini-cli-extensions/spanner)
* [sql-server](https://github.com/gemini-cli-extensions/sql-server)

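As an illustrative sketch of loading one of the extensions listed above from its GitHub URL (assuming Gemini CLI's `extensions install` command is available in your installed version):

```bash
# Install the postgres extension directly from its repository
gemini extensions install https://github.com/gemini-cli-extensions/postgres
```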
@@ -169,10 +169,10 @@ testing and debugging Toolbox server.

### Tested Clients

| Client             | SSE Works | MCP Config Docs                                                                 |
|--------------------|-----------|---------------------------------------------------------------------------------|
| Claude Desktop     | ✅        | <https://modelcontextprotocol.io/quickstart/user#1-download-claude-for-desktop> |
| MCP Inspector      | ✅        | <https://github.com/modelcontextprotocol/inspector>                             |
| Cursor             | ✅        | <https://docs.cursor.com/context/model-context-protocol>                        |
| Windsurf           | ✅        | <https://docs.windsurf.com/windsurf/mcp>                                        |
| VS Code (Insiders) | ✅        | <https://code.visualstudio.com/docs/copilot/chat/mcp-servers>                   |
@@ -164,7 +164,8 @@ You can connect to Toolbox Cloud Run instances directly through the SDK.
{{< tab header="Python" lang="python" >}}
from toolbox_core import ToolboxClient, auth_methods

# Replace with the Cloud Run service URL generated in the previous step

URL = "https://cloud-run-url.app"

auth_token_provider = auth_methods.aget_google_id_token(URL) # can also use sync method
@@ -204,7 +205,6 @@ func main() {
{{< /tab >}}
{{< /tabpane >}}

Now, you can use this client to connect to the deployed Cloud Run instance!

## Troubleshooting
@@ -215,21 +215,21 @@ for your service in the Google Cloud Console's Cloud Run section. They often
contain the specific error message needed to diagnose the problem.
{{< /notice >}}

- **Deployment Fails with "Container failed to start":** This is almost always
  caused by a port mismatch. Ensure your container's `--port` argument is set to
  `8080` to match the `$PORT` environment variable provided by Cloud Run.

- **Client Receives Permission Denied Error (401 or 403):** If your client
  application (e.g., your local SDK) gets a `401 Unauthorized` or `403
  Forbidden` error when trying to call your Cloud Run service, it means the
  client is not properly authenticated as an invoker.
  - Ensure the user or service account calling the service has the **Cloud Run
    Invoker** (`roles/run.invoker`) IAM role.
  - If running locally, make sure your Application Default Credentials are set
    up correctly by running `gcloud auth application-default login`.

- **Service Fails to Access Secrets (in logs):** If your application starts but
  the logs show errors like "permission denied" when trying to access Secret
  Manager, it means the Toolbox service account is missing permissions.
  - Ensure the `toolbox-identity` service account has the **Secret Manager
    Secret Accessor** (`roles/secretmanager.secretAccessor`) IAM role.

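The two IAM fixes above can be applied with `gcloud`; a sketch, assuming the service is named `toolbox` and substituting your own project, region, and member values:

```bash
# Grant the caller permission to invoke the Cloud Run service
gcloud run services add-iam-policy-binding toolbox \
  --region=us-central1 \
  --member="user:you@example.com" \
  --role="roles/run.invoker"

# Let the toolbox-identity service account read secrets
gcloud projects add-iam-policy-binding YOUR_PROJECT_ID \
  --member="serviceAccount:toolbox-identity@YOUR_PROJECT_ID.iam.gserviceaccount.com" \
  --role="roles/secretmanager.secretAccessor"
```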
@@ -69,7 +69,7 @@ response field (e.g. empty string).

To edit headers, press the "Edit Headers" button to display the header modal.
Within this modal, users can make direct edits by typing into the header's text
area.

Toolbox UI validates that the headers are in correct JSON format. Other
header-related errors (e.g., incorrect header names or values required by the
@@ -32,10 +32,12 @@ description: >
### Transport Configuration

**Server Settings:**

- `--address`, `-a`: Server listening address (default: "127.0.0.1")
- `--port`, `-p`: Server listening port (default: 5000)

**STDIO:**

- `--stdio`: Run in MCP STDIO mode instead of HTTP server

#### Usage Examples
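As a sketch of how the transport flags above combine (the address and port values here are illustrative):

```bash
# Bind the HTTP server to all interfaces on a custom port
./toolbox --address 0.0.0.0 --port 5001

# Or run in MCP STDIO mode instead of starting an HTTP server
./toolbox --stdio
```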
@@ -50,15 +52,19 @@ description: >
The CLI supports multiple mutually exclusive ways to specify tool configurations:

**Single File:** (default)

- `--tools-file`: Path to a single YAML configuration file (default: `tools.yaml`)

**Multiple Files:**

- `--tools-files`: Comma-separated list of YAML files to merge

**Directory:**

- `--tools-folder`: Directory containing YAML files to load and merge

**Prebuilt Configurations:**

- `--prebuilt`: Use predefined configurations for specific database types (e.g.,
  'bigquery', 'postgres', 'spanner'). See [Prebuilt Tools
  Reference](prebuilt-tools.md) for allowed values.
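Since these options are mutually exclusive, each invocation picks exactly one configuration source; a sketch with illustrative file and folder names:

```bash
./toolbox --tools-file my_tools.yaml          # a single file
./toolbox --tools-files users.yaml,db.yaml    # merge multiple files
./toolbox --tools-folder ./tool-configs      # merge a directory of files
./toolbox --prebuilt postgres                # a predefined configuration
```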
@@ -79,4 +85,4 @@ Toolbox enables dynamic reloading by default. To disable, use the

To launch Toolbox's interactive UI, use the `--ui` flag. This allows you to test
tools and toolsets with features such as authorized parameters. To learn more,
visit [Toolbox UI](../how-to/toolbox-ui/index.md).

@@ -9,7 +9,11 @@ description: >
A `prompt` represents a reusable prompt template that can be retrieved and used
by MCP clients.

A Prompt is essentially a template for a message or a series of messages that
can be sent to a Large Language Model (LLM). The Toolbox server implements the
`prompts/list` and `prompts/get` methods from the [Model Context Protocol
(MCP)](https://modelcontextprotocol.io/docs/getting-started/intro)
specification, allowing clients to discover and retrieve these prompts.

```yaml
prompts:
@@ -24,19 +28,19 @@ prompts:

## Prompt Schema

| **field**   | **type**                       | **required** | **description**                                                          |
|-------------|--------------------------------|--------------|--------------------------------------------------------------------------|
| description | string                         | No           | A brief explanation of what the prompt does.                             |
| kind        | string                         | No           | The kind of prompt. Defaults to `"custom"`.                              |
| messages    | [][Message](#message-schema)   | Yes          | A list of one or more message objects that make up the prompt's content. |
| arguments   | [][Argument](#argument-schema) | No           | A list of arguments that can be interpolated into the prompt's content.  |

## Message Schema

| **field** | **type** | **required** | **description**                                                                                        |
|-----------|----------|--------------|--------------------------------------------------------------------------------------------------------|
| role      | string   | No           | The role of the sender. Can be `"user"` or `"assistant"`. Defaults to `"user"`.                        |
| content   | string   | Yes          | The text of the message. You can include placeholders for arguments using `{{.argument_name}}` syntax. |

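Putting the two schemas together, a minimal prompt with one interpolated argument might look like this (the prompt and argument names are illustrative):

```yaml
prompts:
  code_review:
    description: "Ask the model to review a code snippet."
    messages:
      - role: user
        content: "Please review the following code: {{.code}}"
    arguments:
      - name: code
        description: "The code to review."
```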
## Argument Schema

@@ -45,11 +49,17 @@ type. If the `type` field is not specified, it will default to `string`.

## Usage with Gemini CLI

Prompts defined in your `tools.yaml` can be seamlessly integrated with the
Gemini CLI to create [custom slash
commands](https://github.com/google-gemini/gemini-cli/blob/main/docs/tools/mcp-server.md#mcp-prompts-as-slash-commands).
The workflow is as follows:

1. **Discovery:** When the Gemini CLI connects to your Toolbox server, it
   automatically calls `prompts/list` to discover all available prompts.

2. **Conversion:** Each discovered prompt is converted into a corresponding
   slash command. For example, a prompt named `code_review` becomes the
   `/code_review` command in the CLI.

3. **Execution:** You can execute the command as follows:

@@ -65,6 +75,7 @@ Prompts defined in your `tools.yaml` can be seamlessly integrated with the Gemin
 
 Please review the following code for quality, correctness, and potential improvements: \ndef hello():\n print('world')
 ```
 
-5. **Response:** This completed prompt is then sent to the Gemini model, and the model's response is displayed back to you in the CLI.
+5. **Response:** This completed prompt is then sent to the Gemini model, and the
+model's response is displayed back to you in the CLI.
 
 ## Kinds of prompts
 
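To make the prompt schema described above concrete, a minimal definition might look like the following sketch (the prompt name, message text, and argument are illustrative only, not taken from this commit):

```yaml
prompts:
  code_review:
    kind: custom
    description: Review a piece of code for quality and correctness.
    messages:
      - role: user
        content: "Please review the following code: {{.code}}"
    arguments:
      - name: code
        description: The code snippet to review.
```

With a definition like this, the Gemini CLI would surface a `/code_review` slash command whose argument is interpolated into the message content.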
@@ -52,12 +52,12 @@ prompts:
 
 ### Prompt Schema
 
 | **field** | **type** | **required** | **description** |
-| --- | --- | --- | --- |
+|-------------|--------------------------------|--------------|--------------------------------------------------------------------------|
 | kind | string | No | The kind of prompt. Must be `"custom"`. |
 | description | string | No | A brief explanation of what the prompt does. |
 | messages | [][Message](#message-schema) | Yes | A list of one or more message objects that make up the prompt's content. |
-| arguments | [][Argument](#argument-schema) | No | A list of arguments that can be interpolated into the prompt's content.|
+| arguments | [][Argument](#argument-schema) | No | A list of arguments that can be interpolated into the prompt's content. |
 
 ### Message Schema
 
@@ -66,4 +66,3 @@ Refer to the default prompt [Message Schema](../_index.md#message-schema).
 ### Argument Schema
 
 Refer to the default prompt [Argument Schema](../_index.md#argument-schema).
-
@@ -52,7 +52,7 @@ cluster][alloydb-free-trial].
 List schemas in an AlloyDB for PostgreSQL database.
 
 - [`postgres-database-overview`](../tools/postgres/postgres-database-overview.md)
 Fetches the current state of the PostgreSQL server.
 
 - [`postgres-list-triggers`](../tools/postgres/postgres-list-triggers.md)
 List triggers in an AlloyDB for PostgreSQL database.
@@ -8,7 +8,10 @@ description: >
 
 ## About
 
-[Apache Cassandra][cassandra-docs] is a NoSQL distributed database. By design, NoSQL databases are lightweight, open-source, non-relational, and largely distributed. Counted among their strengths are horizontal scalability, distributed architectures, and a flexible approach to schema definition.
+[Apache Cassandra][cassandra-docs] is a NoSQL distributed database. By design,
+NoSQL databases are lightweight, open-source, non-relational, and largely
+distributed. Counted among their strengths are horizontal scalability,
+distributed architectures, and a flexible approach to schema definition.
 
 [cassandra-docs]: https://cassandra.apache.org/
 
@@ -17,7 +20,6 @@ description: >
 - [`cassandra-cql`](../tools/cassandra/cassandra-cql.md)
 Run parameterized CQL queries in Cassandra.
 
-
 ## Example
 
 ```yaml
@@ -43,15 +45,15 @@ instead of hardcoding your secrets into the configuration file.
 
 ## Reference
 
 | **field** | **type** | **required** | **description** |
-|------------------------|:---------:|:------------:|-------------------------------------------------------------------------------------------------------|
+|------------------------|:--------:|:------------:|----------------------------------------------------------------------------------------------------------------------------------------------------|
 | kind | string | true | Must be "cassandra". |
 | hosts | string[] | true | List of IP addresses to connect to (e.g., ["192.168.1.1:9042", "192.168.1.2:9042","192.168.1.3:9042"]). The default port is 9042 if not specified. |
 | keyspace | string | true | Name of the Cassandra keyspace to connect to (e.g., "my_keyspace"). |
 | protoVersion | integer | false | Protocol version for the Cassandra connection (e.g., 4). |
 | username | string | false | Name of the Cassandra user to connect as (e.g., "my-cassandra-user"). |
 | password | string | false | Password of the Cassandra user (e.g., "my-password"). |
 | caPath | string | false | Path to the CA certificate for SSL/TLS (e.g., "/path/to/ca.crt"). |
 | certPath | string | false | Path to the client certificate for SSL/TLS (e.g., "/path/to/client.crt"). |
 | keyPath | string | false | Path to the client key for SSL/TLS (e.g., "/path/to/client.key"). |
 | enableHostVerification | boolean | false | Enable host verification for SSL/TLS (e.g., true). By default, host verification is disabled. |
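As an illustrative sketch of a source entry built from the fields in the reference table above (the source name, host address, keyspace, and environment variable names are hypothetical):

```yaml
sources:
  my-cassandra-source:
    kind: cassandra
    hosts:
      - 192.168.1.1:9042  # port defaults to 9042 if omitted
    keyspace: my_keyspace
    username: ${CASSANDRA_USER}  # avoid hardcoding secrets
    password: ${CASSANDRA_PASS}
```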
@@ -21,7 +21,6 @@ description: >
 - [`clickhouse-sql`](../tools/clickhouse/clickhouse-sql.md)
 Execute SQL queries as prepared statements in ClickHouse.
 
-
 ## Requirements
 
 ### Database User
@@ -23,9 +23,10 @@ A dataset is a container in your Google Cloud project that holds modality-specif
 healthcare data. Datasets contain other data stores, such as FHIR stores and DICOM
 stores, which in turn hold their own types of healthcare data.
 
-A single dataset can contain one or many data stores, and those stores can all service
-the same modality or different modalities as application needs dictate. Using multiple
-stores in the same dataset might be appropriate in various situations.
+A single dataset can contain one or many data stores, and those stores can all
+service the same modality or different modalities as application needs dictate.
+Using multiple stores in the same dataset might be appropriate in various
+situations.
 
 If you are new to the Cloud Healthcare API, you can try to
 [create and view datasets and stores using curl][healthcare-quickstart-curl].
@@ -85,8 +86,9 @@ If you are new to the Cloud Healthcare API, you can try to
 
 ### IAM Permissions
 
-The Cloud Healthcare API uses [Identity and Access Management (IAM)][iam-overview] to control
-user and group access to Cloud Healthcare resources like projects, datasets, and stores.
+The Cloud Healthcare API uses [Identity and Access Management
+(IAM)][iam-overview] to control user and group access to Cloud Healthcare
+resources like projects, datasets, and stores.
 
 ### Authentication via Application Default Credentials (ADC)
 
@@ -96,9 +98,9 @@ By **default**, Toolbox will use your [Application Default Credentials
 
 When using this method, you need to ensure the IAM identity associated with your
 ADC (such as a service account) has the correct permissions for the queries you
-intend to run. Common roles include `roles/healthcare.fhirResourceReader` (which includes
-permissions to read and search for FHIR resources) or `roles/healthcare.dicomViewer` (for
-retrieving DICOM images).
+intend to run. Common roles include `roles/healthcare.fhirResourceReader` (which
+includes permissions to read and search for FHIR resources) or
+`roles/healthcare.dicomViewer` (for retrieving DICOM images).
 Follow this [guide][set-adc] to set up your ADC.
 
 ### Authentication via User's OAuth Access Token
@@ -106,8 +108,8 @@ Follow this [guide][set-adc] to set up your ADC.
 If the `useClientOAuth` parameter is set to `true`, Toolbox will instead use the
 OAuth access token for authentication. This token is parsed from the
 `Authorization` header passed in with the tool invocation request. This method
-allows Toolbox to make queries to the [Cloud Healthcare API][healthcare-docs] on behalf of the
-client or the end-user.
+allows Toolbox to make queries to the [Cloud Healthcare API][healthcare-docs] on
+behalf of the client or the end-user.
 
 When using this on-behalf-of authentication, you must ensure that the
 identity used has been granted the correct IAM permissions.
@@ -15,6 +15,7 @@ Cloud Monitoring API](https://cloud.google.com/monitoring/api). This allows
 tools to access cloud monitoring metrics explorer and run promql queries.
 
 Authentication can be handled in two ways:
 
 1. **Application Default Credentials (ADC):** By default, the source uses ADC
 to authenticate with the API.
 2. **Client-side OAuth:** If `useClientOAuth` is set to `true`, the source will
@@ -48,7 +48,7 @@ to a database by following these instructions][csql-pg-quickstart].
 List schemas in a PostgreSQL database.
 
 - [`postgres-database-overview`](../tools/postgres/postgres-database-overview.md)
 Fetches the current state of the PostgreSQL server.
 
 - [`postgres-list-triggers`](../tools/postgres/postgres-list-triggers.md)
 List triggers in a PostgreSQL database.
@@ -65,7 +65,6 @@ to a database by following these instructions][csql-pg-quickstart].
 MCP](https://googleapis.github.io/genai-toolbox/how-to/connect-ide/cloud_sql_pg_mcp/)
 Connect your IDE to Cloud SQL for Postgres using Toolbox.
 
-
 ## Requirements
 
 ### IAM Permissions
@@ -30,24 +30,25 @@ sources:
 ```
 
 {{< notice note >}}
-For more details about alternate addresses and custom ports refer to [Managing Connections](https://docs.couchbase.com/java-sdk/current/howtos/managing-connections.html).
+For more details about alternate addresses and custom ports refer to [Managing
+Connections](https://docs.couchbase.com/java-sdk/current/howtos/managing-connections.html).
 {{< /notice >}}
 
 ## Reference
 
 | **field** | **type** | **required** | **description** |
-|----------------------|:--------:|:------------:|---------------------------------------------------------|
+|----------------------|:--------:|:------------:|------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
 | kind | string | true | Must be "couchbase". |
 | connectionString | string | true | Connection string for the Couchbase cluster. |
 | bucket | string | true | Name of the bucket to connect to. |
 | scope | string | true | Name of the scope within the bucket. |
 | username | string | false | Username for authentication. |
 | password | string | false | Password for authentication. |
 | clientCert | string | false | Path to client certificate file for TLS authentication. |
 | clientCertPassword | string | false | Password for the client certificate. |
 | clientKey | string | false | Path to client key file for TLS authentication. |
 | clientKeyPassword | string | false | Password for the client key. |
 | caCert | string | false | Path to CA certificate file. |
 | noSslVerify | boolean | false | If true, skip server certificate verification. **Warning:** This option should only be used in development or testing environments. Disabling SSL verification poses significant security risks in production as it makes your connection vulnerable to man-in-the-middle attacks. |
 | profile | string | false | Name of the connection profile to apply. |
 | queryScanConsistency | integer | false | Query scan consistency. Controls the consistency guarantee for index scanning. Values: 1 for "not_bounded" (fastest option, but results may not include the most recent operations), 2 for "request_plus" (highest consistency level, includes all operations up until the query started, but incurs a performance penalty). If not specified, defaults to the Couchbase Go SDK default. |
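For illustration, a source entry exercising a few of these fields might look like this sketch (the source name, connection string, bucket, scope, and environment variable names are hypothetical):

```yaml
sources:
  my-couchbase-source:
    kind: couchbase
    connectionString: couchbase://localhost
    bucket: travel-sample
    scope: inventory
    username: ${COUCHBASE_USER}
    password: ${COUCHBASE_PASS}
    queryScanConsistency: 2  # request_plus: include all mutations up to query start
```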
@@ -321,4 +321,4 @@ Logical operators are case-sensitive. `OR` and `AND` are acceptable whereas `or`
 | **field** | **type** | **required** | **description** |
 |-----------|:--------:|:------------:|----------------------------------------------------------------------------------|
 | kind | string | true | Must be "dataplex". |
 | project | string | true | ID of the GCP project used for quota and billing purposes (e.g. "my-project-id").|
@@ -10,22 +10,27 @@ description: >
 
 # Elasticsearch Source
 
-[Elasticsearch][elasticsearch-docs] is a distributed, free and open search and analytics engine
-for all types of data, including textual, numerical, geospatial, structured,
-and unstructured.
+[Elasticsearch][elasticsearch-docs] is a distributed, free and open search and
+analytics engine for all types of data, including textual, numerical,
+geospatial, structured, and unstructured.
 
 If you are new to Elasticsearch, you can learn how to
 [set up a cluster and start indexing data][elasticsearch-quickstart].
 
 Elasticsearch uses [ES|QL][elasticsearch-esql] for querying data. ES|QL
 is a powerful query language that allows you to search and aggregate data in
 Elasticsearch.
 
-See the [official documentation](https://www.elastic.co/guide/en/elasticsearch/reference/current/index.html) for more information.
+See the [official
+documentation](https://www.elastic.co/guide/en/elasticsearch/reference/current/index.html)
+for more information.
 
-[elasticsearch-docs]: https://www.elastic.co/guide/en/elasticsearch/reference/current/index.html
-[elasticsearch-quickstart]: https://www.elastic.co/guide/en/elasticsearch/reference/current/getting-started.html
-[elasticsearch-esql]: https://www.elastic.co/guide/en/elasticsearch/reference/current/esql.html
+[elasticsearch-docs]:
+https://www.elastic.co/guide/en/elasticsearch/reference/current/index.html
+[elasticsearch-quickstart]:
+https://www.elastic.co/guide/en/elasticsearch/reference/current/getting-started.html
+[elasticsearch-esql]:
+https://www.elastic.co/guide/en/elasticsearch/reference/current/esql.html
 
 ## Available Tools
 
@@ -44,9 +49,12 @@ ensure the API key has the correct permissions for the queries you intend to
 run. See [API key management][api-key-management] for more information on
 applying permissions to an API key.
 
-[api-key]: https://www.elastic.co/guide/en/elasticsearch/reference/current/security-api-create-api-key.html
-[set-api-key]: https://www.elastic.co/guide/en/elasticsearch/reference/current/security-api-create-api-key.html
-[api-key-management]: https://www.elastic.co/guide/en/elasticsearch/reference/current/security-api-get-api-key.html
+[api-key]:
+https://www.elastic.co/guide/en/elasticsearch/reference/current/security-api-create-api-key.html
+[set-api-key]:
+https://www.elastic.co/guide/en/elasticsearch/reference/current/security-api-create-api-key.html
+[api-key-management]:
+https://www.elastic.co/guide/en/elasticsearch/reference/current/security-api-get-api-key.html
 
 ## Example
 
@@ -61,8 +69,8 @@ sources:
 
 ## Reference
 
 | **field** | **type** | **required** | **description** |
-|-----------|:--------:|:------------:|-------------------------------------------------------------------------------|
+|-----------|:--------:|:------------:|--------------------------------------------|
 | kind | string | true | Must be "elasticsearch". |
 | addresses | []string | true | List of Elasticsearch hosts to connect to. |
 | apikey | string | true | The API key to use for authentication. |
@@ -60,4 +60,4 @@ instead of hardcoding your secrets into the configuration file.
 | port | string | true | Port to connect to (e.g. "3050") |
 | database | string | true | Path to the Firebird database file (e.g. "/var/lib/firebird/data/test.fdb"). |
 | user | string | true | Name of the Firebird user to connect as (e.g. "SYSDBA"). |
 | password | string | true | Password of the Firebird user (e.g. "masterkey"). |
@@ -48,8 +48,8 @@ permissions):
 - `roles/cloudaicompanion.user`
 - `roles/geminidataanalytics.dataAgentStatelessUser`
 
-To initialize the application default credential run `gcloud auth login --update-adc`
-in your environment before starting MCP Toolbox.
+To initialize the application default credential run `gcloud auth login
+--update-adc` in your environment before starting MCP Toolbox.
 
 [set-adc]: https://cloud.google.com/docs/authentication/provide-credentials-adc
 
@@ -81,7 +81,8 @@ The client id and client secret are seemingly random character sequences
 assigned by the looker server. If you are using Looker OAuth you don't need
 these settings
 
-The `project` and `location` fields are utilized **only** when using the conversational analytics tool.
+The `project` and `location` fields are utilized **only** when using the
+conversational analytics tool.
 
 {{< notice tip >}}
 Use environment variable replacement with the format ${ENV_NAME}
@@ -8,17 +8,32 @@ description: >
 
 ## About
 
-[MindsDB][mindsdb-docs] is an AI federated database in the world. It allows you to combine information from hundreds of datasources as if they were SQL, supporting joins across datasources and enabling you to query all unstructured data as if it were structured.
+[MindsDB][mindsdb-docs] is an AI federated database in the world. It allows you
+to combine information from hundreds of datasources as if they were SQL,
+supporting joins across datasources and enabling you to query all unstructured
+data as if it were structured.
 
-MindsDB translates MySQL queries into whatever API is needed - whether it's REST APIs, GraphQL, or native database protocols. This means you can write standard SQL queries and MindsDB automatically handles the translation to APIs like Salesforce, Jira, GitHub, email systems, MongoDB, and hundreds of other datasources.
+MindsDB translates MySQL queries into whatever API is needed - whether it's REST
+APIs, GraphQL, or native database protocols. This means you can write standard
+SQL queries and MindsDB automatically handles the translation to APIs like
+Salesforce, Jira, GitHub, email systems, MongoDB, and hundreds of other
+datasources.
 
-MindsDB also enables you to use ML frameworks to train and use models as virtual tables from the data in those datasources. With MindsDB, the GenAI Toolbox can now expand to hundreds of datasources and leverage all of MindsDB's capabilities on ML and unstructured data.
+MindsDB also enables you to use ML frameworks to train and use models as virtual
+tables from the data in those datasources. With MindsDB, the GenAI Toolbox can
+now expand to hundreds of datasources and leverage all of MindsDB's capabilities
+on ML and unstructured data.
 
 **Key Features:**
-- **Federated Database**: Connect and query hundreds of datasources through a single SQL interface
-- **Cross-Datasource Joins**: Perform joins across different datasources seamlessly
-- **API Translation**: Automatically translates MySQL queries into REST APIs, GraphQL, and native protocols
-- **Unstructured Data Support**: Query unstructured data as if it were structured
+- **Federated Database**: Connect and query hundreds of datasources through a
+single SQL interface
+- **Cross-Datasource Joins**: Perform joins across different datasources
+seamlessly
+- **API Translation**: Automatically translates MySQL queries into REST APIs,
+GraphQL, and native protocols
+- **Unstructured Data Support**: Query unstructured data as if it were
+structured
 - **ML as Virtual Tables**: Train and use ML models as virtual tables
 - **MySQL Wire Protocol**: Compatible with standard MySQL clients and tools
 
@@ -30,6 +45,7 @@ MindsDB also enables you to use ML frameworks to train and use models as virtual
 MindsDB supports hundreds of datasources, including:
 
 ### **Business Applications**
 
 - **Salesforce**: Query leads, opportunities, accounts, and custom objects
 - **Jira**: Access issues, projects, workflows, and team data
 - **GitHub**: Query repositories, commits, pull requests, and issues
@@ -37,22 +53,23 @@ MindsDB supports hundreds of datasources, including:
 - **HubSpot**: Query contacts, companies, deals, and marketing data
 
 ### **Databases & Storage**
 
 - **MongoDB**: Query NoSQL collections as structured tables
 - **Redis**: Key-value stores and caching layers
 - **Elasticsearch**: Search and analytics data
 - **S3/Google Cloud Storage**: File storage and data lakes
 
 ### **Communication & Email**
 
 - **Gmail/Outlook**: Query emails, attachments, and metadata
 - **Slack**: Access workspace data and conversations
 - **Microsoft Teams**: Team communications and files
 - **Discord**: Server data and message history
 
 ## Example Queries
 
 ### Cross-Datasource Analytics
 
 ```sql
 -- Join Salesforce opportunities with GitHub activity
 SELECT
```

### Email & Communication Analysis

```sql
-- Analyze email patterns with Slack activity
SELECT
@@ -81,6 +99,7 @@ GROUP BY e.sender, e.subject, s.channel_name;
```

### ML Model Predictions

```sql
-- Use ML model to predict customer churn
SELECT
@@ -96,9 +115,13 @@ WHERE predicted_churn_probability > 0.8;

### Database User

This source uses standard MySQL authentication since MindsDB implements the
MySQL wire protocol. You will need to [create a MindsDB user][mindsdb-users] to
log in to the database with. If MindsDB is configured without authentication,
you can omit the password field.

[mindsdb-users]: https://docs.mindsdb.com/

## Example

```yaml
@@ -136,26 +159,32 @@ instead of hardcoding your secrets into the configuration file.

With MindsDB integration, you can:

- **Query Multiple Datasources**: Connect to databases, APIs, file systems, and
  more through a single SQL interface
- **Cross-Datasource Analytics**: Perform joins and analytics across different
  data sources
- **ML Model Integration**: Use trained ML models as virtual tables for
  predictions and insights
- **Unstructured Data Processing**: Query documents, images, and other
  unstructured data as structured tables
- **Real-time Predictions**: Get real-time predictions from ML models through
  SQL queries
- **API Abstraction**: Write SQL queries that automatically translate to REST
  APIs, GraphQL, and native protocols

## Reference

| **field**    | **type** | **required** | **description**                                                                                              |
|--------------|:--------:|:------------:|--------------------------------------------------------------------------------------------------------------|
| kind         | string   | true         | Must be "mindsdb".                                                                                           |
| host         | string   | true         | IP address to connect to (e.g. "127.0.0.1").                                                                 |
| port         | string   | true         | Port to connect to (e.g. "3306").                                                                            |
| database     | string   | true         | Name of the MindsDB database to connect to (e.g. "my_db").                                                   |
| user         | string   | true         | Name of the MindsDB user to connect as (e.g. "my-mindsdb-user").                                             |
| password     | string   | false        | Password of the MindsDB user (e.g. "my-password"). Optional if MindsDB is configured without authentication. |
| queryTimeout | string   | false        | Maximum time to wait for query execution (e.g. "30s", "2m"). By default, no timeout is applied.              |
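
Putting the reference fields together, a minimal source entry might look like
the sketch below; the source name and all values are placeholders to adapt to
your deployment:

```yaml
sources:
  my-mindsdb-source:
    kind: mindsdb
    host: 127.0.0.1
    port: "3306"
    database: my_db
    user: my-mindsdb-user
    # Optional: omit entirely if MindsDB runs without authentication.
    password: my-password
    # Optional: cancel queries that run longer than 30 seconds.
    queryTimeout: 30s
```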

## Resources

- [MindsDB Documentation][mindsdb-docs] - Official documentation and guides
- [MindsDB GitHub][mindsdb-github] - Source code and community
@@ -33,7 +33,8 @@ amount of data through a structured format.
This source only uses standard authentication. You will need to [create a
SQL Server user][mssql-users] to log in to the database with.

[mssql-users]:
    https://learn.microsoft.com/en-us/sql/relational-databases/security/authentication-access/create-a-database-user?view=sql-server-ver16

## Example
@@ -56,12 +57,12 @@ instead of hardcoding your secrets into the configuration file.

## Reference

| **field** | **type** | **required** | **description** |
|-----------|:--------:|:------------:|------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| kind      | string   | true         | Must be "mssql". |
| host      | string   | true         | IP address to connect to (e.g. "127.0.0.1"). |
| port      | string   | true         | Port to connect to (e.g. "1433"). |
| database  | string   | true         | Name of the SQL Server database to connect to (e.g. "my_db"). |
| user      | string   | true         | Name of the SQL Server user to connect as (e.g. "my-user"). |
| password  | string   | true         | Password of the SQL Server user (e.g. "my-password"). |
| encrypt   | string   | false        | Encryption level for data transmitted between the client and server (e.g., "strict"). If not specified, defaults to the [github.com/microsoft/go-mssqldb](https://github.com/microsoft/go-mssqldb?tab=readme-ov-file#common-parameters) package's default encrypt value. |
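
For instance, the optional `encrypt` field slots in alongside the required
fields; this is a sketch with placeholder values, not a tested configuration:

```yaml
sources:
  my-mssql-source:
    kind: mssql
    host: 127.0.0.1
    port: "1433"
    database: my_db
    user: my-user
    password: my-password
    # Optional: require an encrypted connection to the server.
    encrypt: strict
```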
@@ -82,4 +82,4 @@ suitable for large-scale applications.
### Strong Consistency

OceanBase provides strong consistency guarantees, ensuring that all transactions
are ACID compliant.
@@ -8,7 +8,10 @@ description: >

## About

[Oracle Database][oracle-docs] is a multi-model database management system
produced and marketed by Oracle Corporation. It is commonly used for running
online transaction processing (OLTP), data warehousing (DW), and mixed (OLTP &
DW) database workloads.

[oracle-docs]: https://www.oracle.com/database/
@@ -24,33 +27,44 @@ description: >

### Database User

This source uses standard authentication. You will need to [create an Oracle
user][oracle-users] to log in to the database with the necessary permissions.

[oracle-users]:
    https://docs.oracle.com/en/database/oracle/oracle-database/21/sqlrf/CREATE-USER.html

## Connection Methods

You can configure the connection to your Oracle database using one of the
following three methods. **You should only use one method** in your source
configuration.

### Basic Connection (Host/Port/Service Name)

This is the most straightforward method, where you provide the connection
details as separate fields:

- `host`: The IP address or hostname of the database server.
- `port`: The port number the Oracle listener is running on (typically 1521).
- `serviceName`: The service name for the database instance you wish to connect
  to.

### Connection String

As an alternative, you can provide all the connection details in a single
`connectionString`. This is a convenient way to consolidate the connection
information. The typical format is `hostname:port/servicename`.

### TNS Alias

For environments that use a `tnsnames.ora` configuration file, you can connect
using a TNS (Transparent Network Substrate) alias.

- `tnsAlias`: Specify the alias name defined in your `tnsnames.ora` file.
- `tnsAdmin` (Optional): If your configuration file is not in a standard
  location, you can use this field to provide the path to the directory
  containing it. This setting will override the `TNS_ADMIN` environment
  variable.
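
As a sketch of how the three methods differ, the same source could be declared
in any one of the following ways; the field values (and the `kind` name) are
placeholders, and only one method should be left uncommented:

```yaml
sources:
  my-oracle-source:
    kind: oracle          # placeholder kind name
    user: my-user
    password: my-password
    # Method 1: basic connection
    host: 127.0.0.1
    port: 1521
    serviceName: my_service
    # Method 2: connection string (instead of host/port/serviceName)
    # connectionString: 127.0.0.1:1521/my_service
    # Method 3: TNS alias (instead of the above)
    # tnsAlias: MY_DB_ALIAS
    # tnsAdmin: /path/to/tns/config
```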

## Example
@@ -42,7 +42,7 @@ reputation for reliability, feature robustness, and performance.
  List schemas in a PostgreSQL database.

- [`postgres-database-overview`](../tools/postgres/postgres-database-overview.md)
  Fetches the current state of the PostgreSQL server.

- [`postgres-list-triggers`](../tools/postgres/postgres-list-triggers.md)
  List triggers in a PostgreSQL database.
@@ -8,18 +8,22 @@ description: >

## About

[SingleStore][singlestore-docs] is a distributed SQL database built to power
intelligent applications. It is both relational and multi-model, enabling
developers to easily build and scale applications and workloads.

SingleStore is built around Universal Storage, which combines in-memory rowstore
and on-disk columnstore data formats to deliver a single table type that is
optimized to handle both transactional and analytical workloads.

[singlestore-docs]: https://docs.singlestore.com/

## Available Tools

- [`singlestore-sql`](../tools/singlestore/singlestore-sql.md)
  Execute pre-defined prepared SQL queries in SingleStore.

- [`singlestore-execute-sql`](../tools/singlestore/singlestore-execute-sql.md)
  Run parameterized SQL queries in SingleStore.

## Requirements
@@ -29,7 +33,8 @@ SingleStore is built around Universal Storage which combines in-memory rowstore
This source only uses standard authentication. You will need to [create a
database user][singlestore-user] to log in to the database with.

[singlestore-user]:
    https://docs.singlestore.com/cloud/reference/sql-reference/security-management-commands/create-user/

## Example
@@ -53,7 +58,7 @@ instead of hardcoding your secrets into the configuration file.
## Reference

| **field**    | **type** | **required** | **description**                                                                                 |
|--------------|:--------:|:------------:|-------------------------------------------------------------------------------------------------|
| kind         | string   | true         | Must be "singlestore".                                                                          |
| host         | string   | true         | IP address to connect to (e.g. "127.0.0.1").                                                    |
| port         | string   | true         | Port to connect to (e.g. "3306").                                                               |
@@ -8,14 +8,19 @@ aliases: [/resources/tools/alloydb-create-cluster]

## About

The `alloydb-create-cluster` tool creates a new AlloyDB for PostgreSQL cluster
in a specified project and location. It is compatible with the
[alloydb-admin](../../sources/alloydb-admin.md) source.

This tool provisions a cluster with a **private IP address** within the
specified VPC network.

**Permissions & APIs Required:**
Before using, ensure the following on your GCP project:

1. The [AlloyDB
   API](https://console.cloud.google.com/apis/library/alloydb.googleapis.com) is
   enabled.
2. The user or service account executing the tool has one of the following IAM
   roles:

- `roles/alloydb.admin` (the AlloyDB Admin predefined IAM role)
- `roles/owner` (the Owner basic IAM role)
@@ -24,7 +29,7 @@ This tool provisions a cluster with a **private IP address** within the specifie
The tool takes the following input parameters:

| Parameter  | Type   | Description                                           | Required |
|:-----------|:-------|:------------------------------------------------------|:---------|
| `project`  | string | The GCP project ID where the cluster will be created. | Yes      |
| `cluster`  | string | A unique identifier for the new AlloyDB cluster.      | Yes      |
| `password` | string | A secure password for the initial user.               | Yes      |
@@ -44,8 +49,8 @@ tools:

## Reference

| **field**   | **type** | **required** | **description**                                      |
|-------------|:--------:|:------------:|------------------------------------------------------|
| kind        | string   | true         | Must be alloydb-create-cluster.                      |
| source      | string   | true         | The name of an `alloydb-admin` source.               |
| description | string   | false        | Description of the tool that is passed to the agent. |
@@ -22,7 +22,6 @@ This tool provisions a new instance with a **public IP address**.
2. The user or service account executing the tool has one of the following IAM
   roles:

- `roles/alloydb.admin` (the AlloyDB Admin predefined IAM role)
- `roles/owner` (the Owner basic IAM role)
- `roles/editor` (the Editor basic IAM role)
@@ -8,10 +8,12 @@ aliases: [/resources/tools/alloydb-get-instance]

## About

The `alloydb-get-instance` tool retrieves detailed information for a single,
specified AlloyDB instance. It is compatible with the
[alloydb-admin](../../sources/alloydb-admin.md) source.

| Parameter  | Type   | Description                                         | Required |
|:-----------|:-------|:----------------------------------------------------|:---------|
| `project`  | string | The GCP project ID of the instance.                 | Yes      |
| `location` | string | The location of the instance (e.g., 'us-central1'). | Yes      |
| `cluster`  | string | The ID of the cluster.                              | Yes      |
@@ -30,7 +32,7 @@ tools:
## Reference

| **field**   | **type** | **required** | **description**                                      |
|-------------|:--------:|:------------:|------------------------------------------------------|
| kind        | string   | true         | Must be alloydb-get-instance.                        |
| source      | string   | true         | The name of an `alloydb-admin` source.               |
| description | string   | false        | Description of the tool that is passed to the agent. |
@@ -34,7 +34,8 @@ database layer.

{{< notice tip >}} AlloyDB AI natural language is currently in gated public
preview. For more information on availability and limitations, please see the
[AlloyDB AI natural language
overview](https://cloud.google.com/alloydb/docs/ai/natural-language-overview).
{{< /notice >}}

To enable AlloyDB AI natural language for your AlloyDB cluster, please follow
@@ -46,15 +47,17 @@ context for your application.
As of AlloyDB AI NL v1.0.3+, the signature of `execute_nl_query` has been
updated. Run `SELECT extversion FROM pg_extension WHERE extname =
'alloydb_ai_nl';` to check which version your instance is using.
AlloyDB AI NL v1.0.3+ is required for Toolbox v0.19.0+. Starting with Toolbox
v0.19.0, users who previously used the `create_configuration` operation for the
natural language configuration must update it. To do so, please drop the
existing configuration and redefine it using the instructions
[here](https://docs.cloud.google.com/alloydb/docs/ai/use-natural-language-generate-sql-queries#create-config).
{{< /notice >}}

[alloydb-ai-nl-overview]:
    https://cloud.google.com/alloydb/docs/ai/natural-language-overview
[alloydb-ai-gen-nl]:
    https://cloud.google.com/alloydb/docs/ai/generate-sql-queries-natural-language

## Configuration
@@ -84,12 +87,17 @@ Parameters](../#array-parameters) or Bound Parameters to provide secure
access to queries generated using natural language, as these parameters are not
visible to the LLM.

[alloydb-psv]:
    https://cloud.google.com/alloydb/docs/parameterized-secure-views-overview

{{< notice tip >}} Make sure to enable the `parameterized_views` extension
before running this tool. You can do so by running this command in the AlloyDB
studio:

```sql
CREATE EXTENSION IF NOT EXISTS parameterized_views;
```

{{< /notice >}}

## Example
@@ -112,12 +120,13 @@ tools:
        - name: my_google_service
          field: email
```

## Reference

| **field**          | **type**                                | **required** | **description**                                                          |
|--------------------|:---------------------------------------:|:------------:|--------------------------------------------------------------------------|
| kind               | string                                  | true         | Must be "alloydb-ai-nl".                                                 |
| source             | string                                  | true         | Name of the AlloyDB source the natural language query should execute on. |
| description        | string                                  | true         | Description of the tool that is passed to the LLM.                       |
| nlConfig           | string                                  | true         | The name of the `nl_config` in AlloyDB.                                  |
| nlConfigParameters | [parameters](../#specifying-parameters) | true         | List of PSV parameters defined in the `nl_config`.                       |
@@ -39,20 +39,27 @@ It's compatible with the following sources:
  insights. Can be `'NO_PRUNING'` or `'PRUNE_REDUNDANT_INSIGHTS'`. Defaults to
  `'PRUNE_REDUNDANT_INSIGHTS'`.

The behavior of this tool is influenced by the `writeMode` setting on its
`bigquery` source:

- **`allowed` (default) and `blocked`:** These modes do not impose any special
  restrictions on the `bigquery-analyze-contribution` tool.
- **`protected`:** This mode enables session-based execution. The tool will
  operate within the same BigQuery session as other tools using the same source.
  This allows the `input_data` parameter to be a query that references temporary
  resources (e.g., `TEMP` tables) created within that session.

The tool's behavior is also influenced by the `allowedDatasets` restriction on
the `bigquery` source:

- **Without `allowedDatasets` restriction:** The tool can use any table or query
  for the `input_data` parameter.
- **With `allowedDatasets` restriction:** The tool verifies that the
  `input_data` parameter only accesses tables within the allowed datasets.
  - If `input_data` is a table ID, the tool checks if the table's dataset is in
    the allowed list.
  - If `input_data` is a query, the tool performs a dry run to analyze the query
    and rejects it if it accesses any table outside the allowed list.
|
|
||||||
## Example
|
## Example
|
||||||
|
|
||||||
@@ -65,6 +72,7 @@ tools:
|
|||||||
```
|
```
|
||||||
|
|
||||||
## Sample Prompt
|
## Sample Prompt
|
||||||
|
|
||||||
You can prepare a sample table following
|
You can prepare a sample table following
|
||||||
https://cloud.google.com/bigquery/docs/get-contribution-analysis-insights.
|
https://cloud.google.com/bigquery/docs/get-contribution-analysis-insights.
|
||||||
And use the following sample prompts to call this tool:
|
And use the following sample prompts to call this tool:
|
||||||
@@ -78,8 +86,8 @@ And use the following sample prompts to call this tool:
|
|||||||
|
|
||||||
## Reference
|
## Reference
|
||||||
|
|
||||||
| **field** | **type** | **required** | **description** |
|
| **field** | **type** | **required** | **description** |
|
||||||
|-------------|:--------:|:------------:|------------------------------------------------------------|
|
|-------------|:--------:|:------------:|----------------------------------------------------|
|
||||||
| kind | string | true | Must be "bigquery-analyze-contribution". |
|
| kind | string | true | Must be "bigquery-analyze-contribution". |
|
||||||
| source | string | true | Name of the source the tool should execute on. |
|
| source | string | true | Name of the source the tool should execute on. |
|
||||||
| description | string | true | Description of the tool that is passed to the LLM. |
|
| description | string | true | Description of the tool that is passed to the LLM. |
|
||||||
|
|||||||
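Pulling together the reference fields above, a minimal configuration sketch for this tool might look like the following. The tool name `analyze_contribution` and the source name `my-bigquery-source` are illustrative assumptions, not taken from this commit:

```yaml
tools:
  analyze_contribution:
    kind: bigquery-analyze-contribution
    source: my-bigquery-source
    description: Run contribution analysis on a BigQuery table or query.
```

The `input_data` and pruning options described above are supplied at call time rather than in the configuration.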
@@ -37,15 +37,18 @@ It's compatible with the following sources:
conversation history and system instructions for context.
- **`table_references`:** A JSON string of a list of BigQuery tables to use as
  context. Each object in the list must contain `projectId`, `datasetId`, and
  `tableId`. Example: `'[{"projectId": "my-gcp-project", "datasetId":
  "my_dataset", "tableId": "my_table"}]'`

The tool's behavior regarding these parameters is influenced by the
`allowedDatasets` restriction on the `bigquery` source:

- **Without `allowedDatasets` restriction:** The tool can use tables from any
  dataset specified in the `table_references` parameter.
- **With `allowedDatasets` restriction:** Before processing the request, the
  tool verifies that every table in `table_references` belongs to a dataset in
  the allowed list. If any table is from a dataset that is not in the list, the
  request is denied.

## Example

@@ -16,23 +16,31 @@ It's compatible with the following sources:
- [bigquery](../../sources/bigquery.md)

`bigquery-execute-sql` accepts the following parameters:

- **`sql`** (required): The GoogleSQL statement to execute.
- **`dry_run`** (optional): If set to `true`, the query is validated but not
  run, returning information about the execution instead. Defaults to `false`.

The behavior of this tool is influenced by the `writeMode` setting on its
`bigquery` source:

- **`allowed` (default):** All SQL statements are permitted.
- **`blocked`:** Only `SELECT` statements are allowed. Any other type of
  statement (e.g., `INSERT`, `UPDATE`, `CREATE`) will be rejected.
- **`protected`:** This mode enables session-based execution. `SELECT`
  statements can be used on all tables, while write operations are allowed only
  for the session's temporary dataset (e.g., `CREATE TEMP TABLE ...`). This
  prevents modifications to permanent datasets while allowing stateful,
  multi-step operations within a secure session.

The tool's behavior is influenced by the `allowedDatasets` restriction on the
`bigquery` source. Similar to `writeMode`, this setting provides an additional
layer of security by controlling which datasets can be accessed:

- **Without `allowedDatasets` restriction:** The tool can execute any valid
  GoogleSQL query.
- **With `allowedDatasets` restriction:** Before execution, the tool performs a
  dry run to analyze the query.
  It will reject the query if it attempts to access any table outside the
  allowed `datasets` list. To enforce this restriction, the following operations
  are also disallowed:

@@ -40,7 +48,8 @@ The tool's behavior is influenced by the `allowedDatasets` restriction on the
- **Unanalyzable operations** where the accessed tables cannot be determined
  statically (e.g., `EXECUTE IMMEDIATE`, `CREATE PROCEDURE`, `CALL`).

> **Note:** This tool is intended for developer assistant workflows with
> human-in-the-loop and shouldn't be used for production agents.

## Example

@@ -54,8 +63,8 @@ tools:

## Reference

| **field**   | **type** | **required** | **description**                                    |
|-------------|:--------:|:------------:|----------------------------------------------------|
| kind        | string   | true         | Must be "bigquery-execute-sql".                    |
| source      | string   | true         | Name of the source the SQL should execute on.      |
| description | string   | true         | Description of the tool that is passed to the LLM. |

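Since the example block above is elided by the diff, a minimal configuration sketch consistent with the reference fields might look like this. The tool and source names are illustrative assumptions:

```yaml
tools:
  execute_sql:
    kind: bigquery-execute-sql
    source: my-bigquery-source
    description: Execute an ad-hoc GoogleSQL statement against BigQuery.
```

The `sql` and `dry_run` values documented above are passed by the caller at invocation time, not configured here.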
@@ -33,19 +33,27 @@ query based on the provided parameters:
- **horizon** (integer, optional): The number of future time steps you want to
  predict. It defaults to 10 if not specified.

The behavior of this tool is influenced by the `writeMode` setting on its
`bigquery` source:

- **`allowed` (default) and `blocked`:** These modes do not impose any special
  restrictions on the `bigquery-forecast` tool.
- **`protected`:** This mode enables session-based execution. The tool will
  operate within the same BigQuery session as other tools using the same source.
  This allows the `history_data` parameter to be a query that references
  temporary resources (e.g., `TEMP` tables) created within that session.

The tool's behavior is also influenced by the `allowedDatasets` restriction on
the `bigquery` source:

- **Without `allowedDatasets` restriction:** The tool can use any table or query
  for the `history_data` parameter.
- **With `allowedDatasets` restriction:** The tool verifies that the
  `history_data` parameter only accesses tables within the allowed datasets.
  - If `history_data` is a table ID, the tool checks if the table's dataset is
    in the allowed list.
  - If `history_data` is a query, the tool performs a dry run to analyze the
    query and rejects it if it accesses any table outside the allowed list.

## Example

@@ -58,11 +66,13 @@ tools:
```

## Sample Prompt

You can use the following sample prompts to call this tool:

- Can you forecast the history time series data in bigquery table
  `bqml_tutorial.google_analytic`? Use project_id `myproject`.
- What are the future `total_visits` in bigquery table
  `bqml_tutorial.google_analytic`?

## Reference

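To make the section above concrete, a minimal configuration sketch for the forecasting tool might look like the following (names are illustrative assumptions, not from this commit):

```yaml
tools:
  forecast_visits:
    kind: bigquery-forecast
    source: my-bigquery-source
    description: Forecast a time series stored in BigQuery.
```

The `history_data` and `horizon` parameters described above are supplied per call; `horizon` defaults to 10 when omitted.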
@@ -16,12 +16,14 @@ It's compatible with the following sources:
- [bigquery](../../sources/bigquery.md)

`bigquery-get-dataset-info` accepts the following parameters:

- **`dataset`** (required): Specifies the dataset for which to retrieve metadata.
- **`project`** (optional): Defines the Google Cloud project ID. If not provided,
  the tool defaults to the project from the source configuration.

The tool's behavior regarding these parameters is influenced by the
`allowedDatasets` restriction on the `bigquery` source:

- **Without `allowedDatasets` restriction:** The tool can retrieve metadata for
  any dataset specified by the `dataset` and `project` parameters.
- **With `allowedDatasets` restriction:** Before retrieving metadata, the tool

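As a sketch, a minimal configuration for this metadata tool might look like the following (the tool and source names are illustrative assumptions; `dataset` and `project` are call-time parameters, not config fields):

```yaml
tools:
  get_dataset_info:
    kind: bigquery-get-dataset-info
    source: my-bigquery-source
    description: Retrieve metadata about a BigQuery dataset.
```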
@@ -16,6 +16,7 @@ It's compatible with the following sources:
- [bigquery](../../sources/bigquery.md)

`bigquery-get-table-info` accepts the following parameters:

- **`table`** (required): The name of the table for which to retrieve metadata.
- **`dataset`** (required): The dataset containing the specified table.
- **`project`** (optional): The Google Cloud project ID. If not provided, the

@@ -23,6 +24,7 @@ It's compatible with the following sources:

The tool's behavior regarding these parameters is influenced by the
`allowedDatasets` restriction on the `bigquery` source:

- **Without `allowedDatasets` restriction:** The tool can retrieve metadata for
  any table specified by the `table`, `dataset`, and `project` parameters.
- **With `allowedDatasets` restriction:** Before retrieving metadata, the tool

@@ -16,11 +16,13 @@ It's compatible with the following sources:
|
|||||||
- [bigquery](../../sources/bigquery.md)
|
- [bigquery](../../sources/bigquery.md)
|
||||||
|
|
||||||
`bigquery-list-dataset-ids` accepts the following parameter:
|
`bigquery-list-dataset-ids` accepts the following parameter:
|
||||||
|
|
||||||
- **`project`** (optional): Defines the Google Cloud project ID. If not provided,
|
- **`project`** (optional): Defines the Google Cloud project ID. If not provided,
|
||||||
the tool defaults to the project from the source configuration.
|
the tool defaults to the project from the source configuration.
|
||||||
|
|
||||||
The tool's behavior regarding this parameter is influenced by the
|
The tool's behavior regarding this parameter is influenced by the
|
||||||
`allowedDatasets` restriction on the `bigquery` source:
|
`allowedDatasets` restriction on the `bigquery` source:
|
||||||
|
|
||||||
- **Without `allowedDatasets` restriction:** The tool can list datasets from any
|
- **Without `allowedDatasets` restriction:** The tool can list datasets from any
|
||||||
project specified by the `project` parameter.
|
project specified by the `project` parameter.
|
||||||
- **With `allowedDatasets` restriction:** The tool directly returns the
|
- **With `allowedDatasets` restriction:** The tool directly returns the
|
||||||
|
|||||||
@@ -15,10 +15,16 @@ the following sources:
|
|||||||
|
|
||||||
- [bigquery](../../sources/bigquery.md)
|
- [bigquery](../../sources/bigquery.md)
|
||||||
|
|
||||||
The behavior of this tool is influenced by the `writeMode` setting on its `bigquery` source:
|
The behavior of this tool is influenced by the `writeMode` setting on its
|
||||||
|
`bigquery` source:
|
||||||
|
|
||||||
- **`allowed` (default) and `blocked`:** These modes do not impose any restrictions on the `bigquery-sql` tool. The pre-defined SQL statement will be executed as-is.
|
- **`allowed` (default) and `blocked`:** These modes do not impose any
|
||||||
- **`protected`:** This mode enables session-based execution. The tool will operate within the same BigQuery session as other tools using the same source, allowing it to interact with temporary resources like `TEMP` tables created within that session.
|
restrictions on the `bigquery-sql` tool. The pre-defined SQL statement will be
|
||||||
|
executed as-is.
|
||||||
|
- **`protected`:** This mode enables session-based execution. The tool will
|
||||||
|
operate within the same BigQuery session as other tools using the same source,
|
||||||
|
allowing it to interact with temporary resources like `TEMP` tables created
|
||||||
|
within that session.
|
||||||
|
|
||||||
### GoogleSQL
|
### GoogleSQL
|
||||||
|
|
||||||
@@ -28,7 +34,8 @@ parameters can be inserted into the query. BigQuery supports both named paramete
|
|||||||
(e.g., `@name`) and positional parameters (`?`), but they cannot be mixed in the
|
(e.g., `@name`) and positional parameters (`?`), but they cannot be mixed in the
|
||||||
same query.
|
same query.
|
||||||
|
|
||||||
[bigquery-googlesql]: https://cloud.google.com/bigquery/docs/reference/standard-sql/
|
[bigquery-googlesql]:
|
||||||
|
https://cloud.google.com/bigquery/docs/reference/standard-sql/
|
||||||
|
|
||||||
## Example
|
## Example
|
||||||
|
|
||||||
@@ -100,11 +107,11 @@ tools:
|
|||||||
|
|
||||||
## Reference
|
## Reference
|
||||||
|
|
||||||
| **field** | **type** | **required** | **description** |
|
| **field** | **type** | **required** | **description** |
|
||||||
|--------------------|:------------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------------------------------------------------|
|
|--------------------|:---------------------------------------------:|:------------:|-----------------------------------------------------------------------------------------------------------------------------------------|
|
||||||
| kind | string | true | Must be "bigquery-sql". |
|
| kind | string | true | Must be "bigquery-sql". |
|
||||||
| source | string | true | Name of the source the GoogleSQL should execute on. |
|
| source | string | true | Name of the source the GoogleSQL should execute on. |
|
||||||
| description | string | true | Description of the tool that is passed to the LLM. |
|
| description | string | true | Description of the tool that is passed to the LLM. |
|
||||||
| statement | string | true | The GoogleSQL statement to execute. |
|
| statement | string | true | The GoogleSQL statement to execute. |
|
||||||
| parameters | [parameters](../#specifying-parameters) | false | List of [parameters](../#specifying-parameters) that will be inserted into the SQL statement. |
|
| parameters | [parameters](../#specifying-parameters) | false | List of [parameters](../#specifying-parameters) that will be inserted into the SQL statement. |
|
||||||
| templateParameters | [templateParameters](../#template-parameters) | false | List of [templateParameters](../#template-parameters) that will be inserted into the SQL statement before executing prepared statement. |
|
| templateParameters | [templateParameters](../#template-parameters) | false | List of [templateParameters](../#template-parameters) that will be inserted into the SQL statement before executing prepared statement. |
|
||||||
|
|||||||
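The named-parameter form described under GoogleSQL above can be sketched as a configuration like this. The tool, source, dataset, and column names are illustrative assumptions, and the parameter block follows the shape described under [parameters](../#specifying-parameters):

```yaml
tools:
  search_flights_by_number:
    kind: bigquery-sql
    source: my-bigquery-source
    description: Search flights by flight number.
    statement: SELECT * FROM my_dataset.flights WHERE number = @number
    parameters:
      - name: number
        type: string
        description: The flight number to search for.
```

The same statement could instead use a positional `?` placeholder, but named and positional parameters cannot be mixed in one query.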
@@ -100,13 +100,13 @@ tools:

## Reference

| **field**          | **type**                                     | **required** | **description**                                                                                                                        |
|--------------------|:--------------------------------------------:|:------------:|----------------------------------------------------------------------------------------------------------------------------------------|
| kind               | string                                       | true         | Must be "bigtable-sql".                                                                                                                |
| source             | string                                       | true         | Name of the source the SQL should execute on.                                                                                          |
| description        | string                                       | true         | Description of the tool that is passed to the LLM.                                                                                     |
| statement          | string                                       | true         | SQL statement to execute on.                                                                                                           |
| parameters         | [parameters](../#specifying-parameters)      | false        | List of [parameters](../#specifying-parameters) that will be inserted into the SQL statement.                                          |
| templateParameters | [templateParameters](..#template-parameters) | false        | List of [templateParameters](..#template-parameters) that will be inserted into the SQL statement before executing prepared statement. |

## Tips

@@ -119,6 +119,8 @@ tools:
workaround would be to leverage Bigtable [Logical
Views][bigtable-logical-view] to rename the columns.

[bigtable-studio]:
  https://cloud.google.com/bigtable/docs/manage-data-using-console
[bigtable-logical-view]:
  https://cloud.google.com/bigtable/docs/create-manage-logical-views
[bigtable-querybuilder]: https://cloud.google.com/bigtable/docs/query-builder

@@ -16,17 +16,19 @@ database. It's compatible with any of the following sources:

- [cassandra](../../sources/cassandra.md)

The specified CQL statement is executed as a [prepared
statement][cassandra-prepare], and expects parameters in the CQL query to be in
the form of placeholders `?`.

[cassandra-prepare]:
  https://docs.datastax.com/en/datastax-drivers/developing/prepared-statements.html

## Example

> **Note:** This tool uses parameterized queries to prevent CQL injections.
> Query parameters can be used as substitutes for arbitrary expressions.
> Parameters cannot be used as substitutes for keyspaces, table names, column
> names, or other parts of the query.

```yaml
tools:

@@ -85,12 +87,12 @@ tools:

## Reference

| **field**          | **type**                                      | **required** | **description**                                                                                                                         |
|--------------------|:---------------------------------------------:|:------------:|-----------------------------------------------------------------------------------------------------------------------------------------|
| kind               | string                                        | true         | Must be "cassandra-cql".                                                                                                                |
| source             | string                                        | true         | Name of the source the CQL should execute on.                                                                                           |
| description        | string                                        | true         | Description of the tool that is passed to the LLM.                                                                                      |
| statement          | string                                        | true         | CQL statement to execute.                                                                                                               |
| authRequired       | []string                                      | false        | List of authentication requirements for the source.                                                                                     |
| parameters         | [parameters](../#specifying-parameters)       | false        | List of [parameters](../#specifying-parameters) that will be inserted into the CQL statement.                                           |
| templateParameters | [templateParameters](../#template-parameters) | false        | List of [templateParameters](../#template-parameters) that will be inserted into the CQL statement before executing prepared statement. |

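The prepared-statement behavior described above, with `?` placeholders bound from the parameter list, might look like this as a sketch. The tool, source, keyspace, and column names are illustrative assumptions:

```yaml
tools:
  select_user_by_id:
    kind: cassandra-cql
    source: my-cassandra-source
    description: Look up a user row by its id.
    statement: SELECT * FROM my_keyspace.users WHERE id = ?
    parameters:
      - name: id
        type: string
        description: Primary key of the user row.
```

Note that, per the warning above, `?` can bind values only; the keyspace and table names must be literal in the statement (or supplied via templateParameters).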
@@ -44,4 +44,4 @@ tools:
|-------------|:--------:|:------------:|-------------------------------------------------------|
| kind        | string   | true         | Must be "clickhouse-execute-sql".                     |
| source      | string   | true         | Name of the ClickHouse source to execute SQL against. |
| description | string   | true         | Description of the tool that is passed to the LLM.    |

@@ -10,11 +10,12 @@ aliases:
|
|||||||
|
|
||||||
## About
|
## About
|
||||||
|
|
||||||
A `clickhouse-list-tables` tool lists all available tables in a specified
|
A `clickhouse-list-tables` tool lists all available tables in a specified
|
||||||
ClickHouse database. It's compatible with the [clickhouse](../../sources/clickhouse.md) source.
|
ClickHouse database. It's compatible with the
|
||||||
|
[clickhouse](../../sources/clickhouse.md) source.
|
||||||
|
|
||||||
This tool executes the `SHOW TABLES FROM <database>` command and returns a list
|
This tool executes the `SHOW TABLES FROM <database>` command and returns a list
|
||||||
of all tables in the specified database that are accessible to the configured
|
of all tables in the specified database that are accessible to the configured
|
||||||
user, making it useful for schema exploration and table discovery tasks.
|
user, making it useful for schema exploration and table discovery tasks.
|
||||||
|
|
||||||
## Example
|
## Example
|
||||||
@@ -29,17 +30,19 @@ tools:
|
|||||||
|
|
||||||
## Parameters
|
## Parameters
|
||||||
|
|
||||||
| **parameter** | **type** | **required** | **description** |
|
| **parameter** | **type** | **required** | **description** |
|
||||||
|---------------|:--------:|:------------:|---------------------------------------------|
|
|---------------|:--------:|:------------:|-----------------------------------|
|
||||||
| database | string | true | The database to list tables from. |
|
| database | string | true | The database to list tables from. |
|
||||||
|
|
||||||
## Return Value
|
## Return Value
|
||||||
|
|
||||||
The tool returns an array of objects, where each object contains:
|
The tool returns an array of objects, where each object contains:
|
||||||
|
|
||||||
- `name`: The name of the table
|
- `name`: The name of the table
|
||||||
- `database`: The database the table belongs to
|
- `database`: The database the table belongs to
|
||||||
|
|
||||||
Example response:
|
Example response:
|
||||||
|
|
||||||
```json
|
```json
|
||||||
[
|
[
|
||||||
{"name": "users", "database": "analytics"},
|
{"name": "users", "database": "analytics"},
|
||||||
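A caller consuming this return value might pull out just the table names. A minimal sketch in Python, assuming only the response shape shown in the example above:

```python
import json

# Example return value of `clickhouse-list-tables`: an array of
# {"name": ..., "database": ...} objects, as documented above.
response = '[{"name": "users", "database": "analytics"}]'

def table_names(payload: str) -> list[str]:
    """Return the table names from a list-tables response."""
    return [row["name"] for row in json.loads(payload)]

print(table_names(response))  # ['users']
```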
@@ -51,10 +54,10 @@ Example response:
|
|||||||
|
|
||||||
## Reference
|
## Reference
|
||||||
|
|
||||||
| **field** | **type** | **required** | **description** |
|
| **field** | **type** | **required** | **description** |
|
||||||
|--------------------|:------------------:|:------------:|-----------------------------------------------------------|
|
|--------------|:------------------:|:------------:|---------------------------------------------------------|
|
||||||
| kind | string | true | Must be "clickhouse-list-tables". |
|
| kind | string | true | Must be "clickhouse-list-tables". |
|
||||||
| source | string | true | Name of the ClickHouse source to list tables from. |
|
| source | string | true | Name of the ClickHouse source to list tables from. |
|
||||||
| description | string | true | Description of the tool that is passed to the LLM. |
|
| description | string | true | Description of the tool that is passed to the LLM. |
|
||||||
| authRequired | array of string | false | Authentication services required to use this tool. |
|
| authRequired | array of string | false | Authentication services required to use this tool. |
|
||||||
| parameters | array of Parameter | false | Parameters for the tool (see Parameters section above). |
|
| parameters | array of Parameter | false | Parameters for the tool (see Parameters section above). |
|
||||||
|
|||||||
@@ -14,8 +14,8 @@ A `clickhouse-sql` tool executes SQL queries as prepared statements against a
|
|||||||
ClickHouse database. It's compatible with the
|
ClickHouse database. It's compatible with the
|
||||||
[clickhouse](../../sources/clickhouse.md) source.
|
[clickhouse](../../sources/clickhouse.md) source.
|
||||||
|
|
||||||
This tool supports both template parameters (for SQL statement customization)
|
This tool supports both template parameters (for SQL statement customization)
|
||||||
and regular parameters (for prepared statement values), providing flexible
|
and regular parameters (for prepared statement values), providing flexible
|
||||||
query execution capabilities.
|
query execution capabilities.
|
||||||
|
|
||||||
## Example
|
## Example
|
||||||
|
|||||||
@@ -10,13 +10,15 @@ aliases:
|
|||||||
|
|
||||||
## About
|
## About
|
||||||
|
|
||||||
A `cloud-healthcare-fhir-fetch-page` tool fetches a page of FHIR resources from a given URL. It's
|
A `cloud-healthcare-fhir-fetch-page` tool fetches a page of FHIR resources from
|
||||||
compatible with the following sources:
|
a given URL. It's compatible with the following sources:
|
||||||
|
|
||||||
- [cloud-healthcare](../../sources/cloud-healthcare.md)
|
- [cloud-healthcare](../../sources/cloud-healthcare.md)
|
||||||
|
|
||||||
`cloud-healthcare-fhir-fetch-page` can be used for pagination when a previous tool call (like
|
`cloud-healthcare-fhir-fetch-page` can be used for pagination when a previous
|
||||||
`cloud-healthcare-fhir-patient-search` or `cloud-healthcare-fhir-patient-everything`) returns a 'next' link in the response bundle.
|
tool call (like `cloud-healthcare-fhir-patient-search` or
|
||||||
|
`cloud-healthcare-fhir-patient-everything`) returns a 'next' link in the
|
||||||
|
response bundle.
|
||||||
|
|
||||||
## Example
|
## Example
|
||||||
|
|
||||||
|
|||||||
@@ -10,14 +10,14 @@ aliases:
|
|||||||
|
|
||||||
## About
|
## About
|
||||||
|
|
||||||
A `cloud-healthcare-fhir-patient-everything` tool retrieves resources related to a given patient
|
A `cloud-healthcare-fhir-patient-everything` tool retrieves resources related to
|
||||||
from a FHIR store. It's compatible with the following sources:
|
a given patient from a FHIR store. It's compatible with the following sources:
|
||||||
|
|
||||||
- [cloud-healthcare](../../sources/cloud-healthcare.md)
|
- [cloud-healthcare](../../sources/cloud-healthcare.md)
|
||||||
|
|
||||||
`cloud-healthcare-fhir-patient-everything` returns all the information available for a given
|
`cloud-healthcare-fhir-patient-everything` returns all the information available
|
||||||
patient ID. It can be configured to only return certain resource types, or only
|
for a given patient ID. It can be configured to only return certain resource
|
||||||
resources that have been updated after a given time.
|
types, or only resources that have been updated after a given time.
|
||||||
|
|
||||||
## Example
|
## Example
|
||||||
|
|
||||||
@@ -46,4 +46,5 @@ tools:
|
|||||||
| sinceFilter | string | false | If provided, only resources updated after this time are returned. The time uses the format YYYY-MM-DDThh:mm:ss.sss+zz:zz. The time must be specified to the second and include a time zone. For example, 2015-02-07T13:28:17.239+02:00 or 2017-01-01T00:00:00Z. |
|
| sinceFilter | string | false | If provided, only resources updated after this time are returned. The time uses the format YYYY-MM-DDThh:mm:ss.sss+zz:zz. The time must be specified to the second and include a time zone. For example, 2015-02-07T13:28:17.239+02:00 or 2017-01-01T00:00:00Z. |
|
||||||
| storeID | string | true* | The FHIR store ID to search in. |
|
| storeID | string | true* | The FHIR store ID to search in. |
|
||||||
|
|
||||||
*If the `allowedFHIRStores` in the source has length 1, then the `storeID` parameter is not needed.
|
*If the `allowedFHIRStores` in the source has length 1, then the `storeID`
|
||||||
|
parameter is not needed.
|
||||||
|
|||||||
@@ -10,12 +10,13 @@ aliases:
|
|||||||
|
|
||||||
## About
|
## About
|
||||||
|
|
||||||
A `cloud-healthcare-fhir-patient-search` tool searches for patients in a FHIR store based on a
|
A `cloud-healthcare-fhir-patient-search` tool searches for patients in a FHIR
|
||||||
set of criteria. It's compatible with the following sources:
|
store based on a set of criteria. It's compatible with the following sources:
|
||||||
|
|
||||||
- [cloud-healthcare](../../sources/cloud-healthcare.md)
|
- [cloud-healthcare](../../sources/cloud-healthcare.md)
|
||||||
|
|
||||||
`cloud-healthcare-fhir-patient-search` returns a list of patients that match the given criteria.
|
`cloud-healthcare-fhir-patient-search` returns a list of patients that match the
|
||||||
|
given criteria.
|
||||||
|
|
||||||
## Example
|
## Example
|
||||||
|
|
||||||
@@ -60,4 +61,5 @@ tools:
|
|||||||
| summary | boolean | false | Requests the server to return a subset of the resource. True by default. |
|
| summary | boolean | false | Requests the server to return a subset of the resource. True by default. |
|
||||||
| storeID | string | true* | The FHIR store ID to search in. |
|
| storeID | string | true* | The FHIR store ID to search in. |
|
||||||
|
|
||||||
*If the `allowedFHIRStores` in the source has length 1, then the `storeID` parameter is not needed.
|
*If the `allowedFHIRStores` in the source has length 1, then the `storeID`
|
||||||
|
parameter is not needed.
|
||||||
|
|||||||
@@ -10,8 +10,8 @@ aliases:
|
|||||||
|
|
||||||
## About
|
## About
|
||||||
|
|
||||||
A `cloud-healthcare-get-dicom-store-metrics` tool retrieves metrics for a DICOM store. It's
|
A `cloud-healthcare-get-dicom-store-metrics` tool retrieves metrics for a DICOM
|
||||||
compatible with the following sources:
|
store. It's compatible with the following sources:
|
||||||
|
|
||||||
- [cloud-healthcare](../../sources/cloud-healthcare.md)
|
- [cloud-healthcare](../../sources/cloud-healthcare.md)
|
||||||
|
|
||||||
@@ -41,4 +41,5 @@ tools:
|
|||||||
|-----------|:--------:|:------------:|----------------------------------------|
|
|-----------|:--------:|:------------:|----------------------------------------|
|
||||||
| storeID | string | true* | The DICOM store ID to get metrics for. |
|
| storeID | string | true* | The DICOM store ID to get metrics for. |
|
||||||
|
|
||||||
*If the `allowedDICOMStores` in the source has length 1, then the `storeID` parameter is not needed.
|
*If the `allowedDICOMStores` in the source has length 1, then the `storeID`
|
||||||
|
parameter is not needed.
|
||||||
|
|||||||
@@ -11,12 +11,14 @@ aliases:
|
|||||||
|
|
||||||
## About
|
## About
|
||||||
|
|
||||||
A `cloud-healthcare-get-fhir-resource` tool retrieves a specific FHIR resource from a FHIR store.
|
A `cloud-healthcare-get-fhir-resource` tool retrieves a specific FHIR resource
|
||||||
|
from a FHIR store.
|
||||||
It's compatible with the following sources:
|
It's compatible with the following sources:
|
||||||
|
|
||||||
- [cloud-healthcare](../../sources/cloud-healthcare.md)
|
- [cloud-healthcare](../../sources/cloud-healthcare.md)
|
||||||
|
|
||||||
`cloud-healthcare-get-fhir-resource` returns a single FHIR resource, identified by its type and ID.
|
`cloud-healthcare-get-fhir-resource` returns a single FHIR resource, identified
|
||||||
|
by its type and ID.
|
||||||
|
|
||||||
## Example
|
## Example
|
||||||
|
|
||||||
@@ -44,4 +46,5 @@ tools:
|
|||||||
| resourceID | string | true | The ID of the FHIR resource to retrieve. |
|
| resourceID | string | true | The ID of the FHIR resource to retrieve. |
|
||||||
| storeID | string | true* | The FHIR store ID to retrieve the resource from. |
|
| storeID | string | true* | The FHIR store ID to retrieve the resource from. |
|
||||||
|
|
||||||
*If the `allowedFHIRStores` in the source has length 1, then the `storeID` parameter is not needed.
|
*If the `allowedFHIRStores` in the source has length 1, then the `storeID`
|
||||||
|
parameter is not needed.
|
||||||
|
|||||||
@@ -10,13 +10,14 @@ aliases:
|
|||||||
|
|
||||||
## About
|
## About
|
||||||
|
|
||||||
A `cloud-healthcare-list-dicom-stores` lists the available DICOM stores in the healthcare dataset.
|
A `cloud-healthcare-list-dicom-stores` lists the available DICOM stores in the
|
||||||
|
healthcare dataset.
|
||||||
It's compatible with the following sources:
|
It's compatible with the following sources:
|
||||||
|
|
||||||
- [cloud-healthcare](../../sources/cloud-healthcare.md)
|
- [cloud-healthcare](../../sources/cloud-healthcare.md)
|
||||||
|
|
||||||
`cloud-healthcare-list-dicom-stores` returns the details of the available DICOM stores in the
|
`cloud-healthcare-list-dicom-stores` returns the details of the available DICOM
|
||||||
dataset of the healthcare source. It takes no extra parameters.
|
stores in the dataset of the healthcare source. It takes no extra parameters.
|
||||||
|
|
||||||
## Example
|
## Example
|
||||||
|
|
||||||
|
|||||||
@@ -10,13 +10,14 @@ aliases:
|
|||||||
|
|
||||||
## About
|
## About
|
||||||
|
|
||||||
A `cloud-healthcare-list-fhir-stores` lists the available FHIR stores in the healthcare dataset.
|
A `cloud-healthcare-list-fhir-stores` lists the available FHIR stores in the
|
||||||
|
healthcare dataset.
|
||||||
It's compatible with the following sources:
|
It's compatible with the following sources:
|
||||||
|
|
||||||
- [cloud-healthcare](../../sources/cloud-healthcare.md)
|
- [cloud-healthcare](../../sources/cloud-healthcare.md)
|
||||||
|
|
||||||
`cloud-healthcare-list-fhir-stores` returns the details of the available FHIR stores in the
|
`cloud-healthcare-list-fhir-stores` returns the details of the available FHIR
|
||||||
dataset of the healthcare source. It takes no extra parameters.
|
stores in the dataset of the healthcare source. It takes no extra parameters.
|
||||||
|
|
||||||
## Example
|
## Example
|
||||||
|
|
||||||
|
|||||||
@@ -10,12 +10,14 @@ aliases:
|
|||||||
|
|
||||||
## About
|
## About
|
||||||
|
|
||||||
A `cloud-healthcare-retrieve-rendered-dicom-instance` tool retrieves a rendered DICOM instance from a DICOM store.
|
A `cloud-healthcare-retrieve-rendered-dicom-instance` tool retrieves a rendered
|
||||||
|
DICOM instance from a DICOM store.
|
||||||
It's compatible with the following sources:
|
It's compatible with the following sources:
|
||||||
|
|
||||||
- [cloud-healthcare](../../sources/cloud-healthcare.md)
|
- [cloud-healthcare](../../sources/cloud-healthcare.md)
|
||||||
|
|
||||||
`cloud-healthcare-retrieve-rendered-dicom-instance` returns a base64 encoded string of the image in JPEG format.
|
`cloud-healthcare-retrieve-rendered-dicom-instance` returns a base64 encoded
|
||||||
|
string of the image in JPEG format.
|
||||||
|
|
||||||
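Decoding that return value is a one-liner; a short sketch (the sample payload below is fabricated for illustration, not real DICOM output — JPEG data begins with the marker bytes `0xFFD8`):

```python
import base64

# Illustrative stand-in for the tool's base64 response.
sample = base64.b64encode(b"\xff\xd8\xff\xe0fake-jpeg-bytes").decode("ascii")

def decode_rendered_instance(payload: str) -> bytes:
    """Decode the base64 response into raw JPEG bytes."""
    image = base64.b64decode(payload)
    if not image.startswith(b"\xff\xd8"):
        raise ValueError("payload does not look like a JPEG image")
    return image

jpeg = decode_rendered_instance(sample)
```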
## Example
|
## Example
|
||||||
|
|
||||||
@@ -45,4 +47,5 @@ tools:
|
|||||||
| FrameNumber | integer | false | The frame number to retrieve (1-based). Only applicable to multi-frame instances. Defaults to 1. |
|
| FrameNumber | integer | false | The frame number to retrieve (1-based). Only applicable to multi-frame instances. Defaults to 1. |
|
||||||
| storeID | string | true* | The DICOM store ID to retrieve from. |
|
| storeID | string | true* | The DICOM store ID to retrieve from. |
|
||||||
|
|
||||||
*If the `allowedDICOMStores` in the source has length 1, then the `storeID` parameter is not needed.
|
*If the `allowedDICOMStores` in the source has length 1, then the `storeID`
|
||||||
|
parameter is not needed.
|
||||||
|
|||||||
@@ -10,12 +10,14 @@ aliases:
|
|||||||
|
|
||||||
## About
|
## About
|
||||||
|
|
||||||
A `cloud-healthcare-search-dicom-instances` tool searches for DICOM instances in a DICOM store based on a
|
A `cloud-healthcare-search-dicom-instances` tool searches for DICOM instances in
|
||||||
set of criteria. It's compatible with the following sources:
|
a DICOM store based on a set of criteria. It's compatible with the following
|
||||||
|
sources:
|
||||||
|
|
||||||
- [cloud-healthcare](../../sources/cloud-healthcare.md)
|
- [cloud-healthcare](../../sources/cloud-healthcare.md)
|
||||||
|
|
||||||
`search-dicom-instances` returns a list of DICOM instances that match the given criteria.
|
`cloud-healthcare-search-dicom-instances` returns a list of DICOM instances
|
||||||
|
that match the given criteria.
|
||||||
|
|
||||||
## Example
|
## Example
|
||||||
|
|
||||||
@@ -52,4 +54,5 @@ tools:
|
|||||||
| includefield | []string | false | List of attributeIDs to include in the output, such as DICOM tag IDs or keywords. Set to `["all"]` to return all available tags. |
|
| includefield | []string | false | List of attributeIDs to include in the output, such as DICOM tag IDs or keywords. Set to `["all"]` to return all available tags. |
|
||||||
| storeID | string | true* | The DICOM store ID to search in. |
|
| storeID | string | true* | The DICOM store ID to search in. |
|
||||||
|
|
||||||
*If the `allowedDICOMStores` in the source has length 1, then the `storeID` parameter is not needed.
|
*If the `allowedDICOMStores` in the source has length 1, then the `storeID`
|
||||||
|
parameter is not needed.
|
||||||
|
|||||||
@@ -36,7 +36,7 @@ project:
|
|||||||
- **Ad-hoc analysis:** Quickly investigate performance issues by executing
|
- **Ad-hoc analysis:** Quickly investigate performance issues by executing
|
||||||
direct PromQL queries for a database instance.
|
direct PromQL queries for a database instance.
|
||||||
- **Prebuilt Configs:** Use the already added prebuilt tools mentioned in
|
- **Prebuilt Configs:** Use the already added prebuilt tools mentioned in
|
||||||
prebuilt-tools.md to query the database's system/query-level metrics.
|
prebuilt-tools.md to query the database's system/query-level metrics.
|
||||||
|
|
||||||
Here are some common use cases for the `cloud-monitoring-query-prometheus` tool:
|
Here are some common use cases for the `cloud-monitoring-query-prometheus` tool:
|
||||||
|
|
||||||
@@ -54,7 +54,6 @@ Here are some common use cases for the `cloud-monitoring-query-prometheus` tool:
|
|||||||
Here are some examples of how to use the `cloud-monitoring-query-prometheus`
|
Here are some examples of how to use the `cloud-monitoring-query-prometheus`
|
||||||
tool.
|
tool.
|
||||||
|
|
||||||
|
|
||||||
```yaml
|
```yaml
|
||||||
tools:
|
tools:
|
||||||
get_wait_time_metrics:
|
get_wait_time_metrics:
|
||||||
@@ -68,6 +67,7 @@ tools:
|
|||||||
```
|
```
|
||||||
|
|
||||||
## Reference
|
## Reference
|
||||||
|
|
||||||
| **field** | **type** | **required** | **description** |
|
| **field** | **type** | **required** | **description** |
|
||||||
|-------------|:--------:|:------------:|------------------------------------------------------|
|
|-------------|:--------:|:------------:|------------------------------------------------------|
|
||||||
| kind | string | true | Must be cloud-monitoring-query-prometheus. |
|
| kind | string | true | Must be cloud-monitoring-query-prometheus. |
|
||||||
|
|||||||
@@ -11,7 +11,6 @@ long-running Cloud SQL operation to complete. It does this by polling the Cloud
|
|||||||
SQL Admin API operation status endpoint until the operation is finished, using
|
SQL Admin API operation status endpoint until the operation is finished, using
|
||||||
exponential backoff.
|
exponential backoff.
|
||||||
|
|
||||||
|
|
||||||
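Poll-until-done with exponential backoff can be sketched as below. The delay schedule (base of 1s, cap of 30s) is an illustrative assumption; the tool's actual timings are internal to its implementation, and `poll` is a hypothetical stand-in for the operation-status call:

```python
import itertools

def backoff_delays(base: float = 1.0, cap: float = 30.0):
    """Yield exponentially growing delays, capped at `cap` seconds."""
    for attempt in itertools.count():
        yield min(cap, base * (2 ** attempt))

def wait_for_operation(poll, max_attempts: int = 10) -> dict:
    """Poll until the operation reports DONE, backing off between polls."""
    delays = backoff_delays()
    for _ in range(max_attempts):
        op = poll()
        if op.get("status") == "DONE":
            return op
        next(delays)  # in real code: time.sleep(next(delays))
    raise TimeoutError("operation did not finish")
```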
## Example
|
## Example
|
||||||
|
|
||||||
```yaml
|
```yaml
|
||||||
|
|||||||
@@ -89,12 +89,12 @@ tools:
|
|||||||
|
|
||||||
## Reference
|
## Reference
|
||||||
|
|
||||||
| **field** | **type** | **required** | **description** |
|
| **field** | **type** | **required** | **description** |
|
||||||
|--------------------|:------------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------------------------------------------------|
|
|--------------------|:--------------------------------------------:|:------------:|----------------------------------------------------------------------------------------------------------------------------------------|
|
||||||
| kind | string | true | Must be "couchbase-sql". |
|
| kind | string | true | Must be "couchbase-sql". |
|
||||||
| source | string | true | Name of the source the SQL query should execute on. |
|
| source | string | true | Name of the source the SQL query should execute on. |
|
||||||
| description | string | true | Description of the tool that is passed to the LLM. |
|
| description | string | true | Description of the tool that is passed to the LLM. |
|
||||||
| statement | string | true | SQL statement to execute |
|
| statement | string | true | SQL statement to execute |
|
||||||
| parameters | [parameters](../#specifying-parameters) | false | List of [parameters](../#specifying-parameters) that will be used with the SQL statement. |
|
| parameters | [parameters](../#specifying-parameters) | false | List of [parameters](../#specifying-parameters) that will be used with the SQL statement. |
|
||||||
| templateParameters | [templateParameters](..#template-parameters) | false | List of [templateParameters](..#template-parameters) that will be inserted into the SQL statement before executing prepared statement. |
|
| templateParameters | [templateParameters](..#template-parameters) | false | List of [templateParameters](..#template-parameters) that will be inserted into the SQL statement before executing prepared statement. |
|
||||||
| authRequired | array[string] | false | List of auth services that are required to use this tool. |
|
| authRequired | array[string] | false | List of auth services that are required to use this tool. |
|
||||||
|
|||||||
@@ -10,21 +10,26 @@ aliases:
|
|||||||
|
|
||||||
## About
|
## About
|
||||||
|
|
||||||
A `dataform-compile-local` tool runs the `dataform compile` command on a local Dataform project.
|
A `dataform-compile-local` tool runs the `dataform compile` command on a local
|
||||||
|
Dataform project.
|
||||||
|
|
||||||
It is a standalone tool and **is not** compatible with any sources.
|
It is a standalone tool and **is not** compatible with any sources.
|
||||||
|
|
||||||
At invocation time, the tool executes `dataform compile --json` in the specified project directory and returns the resulting JSON object from the CLI.
|
At invocation time, the tool executes `dataform compile --json` in the specified
|
||||||
|
project directory and returns the resulting JSON object from the CLI.
|
||||||
|
|
||||||
`dataform-compile-local` takes the following parameter:
|
`dataform-compile-local` takes the following parameter:
|
||||||
|
|
||||||
- `project_dir` (string): The absolute or relative path to the local Dataform project directory. The server process must have read access to this path.
|
- `project_dir` (string): The absolute or relative path to the local Dataform
|
||||||
|
project directory. The server process must have read access to this path.
|
||||||
|
|
||||||
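What the tool does at invocation time can be sketched roughly as the following, assuming the `dataform` CLI is on `PATH` (the actual shell-out happens inside `compile_local`, so only the command builder is exercised without the CLI installed):

```python
import json
import subprocess

def compile_command(project_dir: str) -> list[str]:
    """Build the CLI invocation the tool wraps (run with cwd=project_dir)."""
    return ["dataform", "compile", "--json"]

def compile_local(project_dir: str) -> dict:
    """Run `dataform compile --json` in the project dir and parse stdout."""
    out = subprocess.run(
        compile_command(project_dir),
        cwd=project_dir, capture_output=True, text=True, check=True,
    )
    return json.loads(out.stdout)
```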
## Requirements
|
## Requirements
|
||||||
|
|
||||||
### Dataform CLI
|
### Dataform CLI
|
||||||
|
|
||||||
This tool executes the `dataform` command-line interface (CLI) via a system call. You must have the **`dataform` CLI** installed and available in the server's system `PATH`.
|
This tool executes the `dataform` command-line interface (CLI) via a system
|
||||||
|
call. You must have the **`dataform` CLI** installed and available in the
|
||||||
|
server's system `PATH`.
|
||||||
|
|
||||||
You can typically install the CLI via `npm`:
|
You can typically install the CLI via `npm`:
|
||||||
|
|
||||||
@@ -32,7 +37,9 @@ You can typically install the CLI via `npm`:
|
|||||||
npm install -g @dataform/cli
|
npm install -g @dataform/cli
|
||||||
```
|
```
|
||||||
|
|
||||||
See the [official Dataform documentation](https://www.google.com/search?q=https://cloud.google.com/dataform/docs/install-dataform-cli) for more details.
|
See the [official Dataform
|
||||||
|
documentation](https://cloud.google.com/dataform/docs/install-dataform-cli)
|
||||||
|
for more details.
|
||||||
|
|
||||||
## Example
|
## Example
|
||||||
|
|
||||||
@@ -45,7 +52,7 @@ tools:
|
|||||||
|
|
||||||
## Reference
|
## Reference
|
||||||
|
|
||||||
| **field** | **type** | **required** | **description** |
|
| **field** | **type** | **required** | **description** |
|
||||||
| :---- | :---- | :---- | :---- |
|
|:------------|:---------|:-------------|:---------------------------------------------------|
|
||||||
| kind | string | true | Must be "dataform-compile-local". |
|
| kind | string | true | Must be "dataform-compile-local". |
|
||||||
| description | string | true | Description of the tool that is passed to the LLM. |
|
| description | string | true | Description of the tool that is passed to the LLM. |
|
||||||
|
|||||||
@@ -13,7 +13,9 @@ Execute ES|QL queries.
|
|||||||
This tool allows you to execute ES|QL queries against your Elasticsearch
|
This tool allows you to execute ES|QL queries against your Elasticsearch
|
||||||
cluster. You can use this to perform complex searches and aggregations.
|
cluster. You can use this to perform complex searches and aggregations.
|
||||||
|
|
||||||
See the [official documentation](https://www.elastic.co/docs/reference/query-languages/esql/esql-getting-started) for more information.
|
See the [official
|
||||||
|
documentation](https://www.elastic.co/docs/reference/query-languages/esql/esql-getting-started)
|
||||||
|
for more information.
|
||||||
|
|
||||||
## Example
|
## Example
|
||||||
|
|
||||||
@@ -36,10 +38,9 @@ tools:
|
|||||||
|
|
||||||
## Parameters
|
## Parameters
|
||||||
|
|
||||||
| **name** | **type** | **required** | **description** |
|
| **name** | **type** | **required** | **description** |
|
||||||
|------------|:--------:|:------------:|-----------------------------------------------------------------------------------------------------------------------------------------------------|
|
|------------|:---------------------------------------:|:------------:|-----------------------------------------------------------------------------------------------------------------------------------------------------|
|
||||||
| query | string | false | The ES\|QL query to run. Can also be passed by parameters. |
|
| query | string | false | The ES\|QL query to run. Can also be passed by parameters. |
|
||||||
| format | string | false | The format of the query. Default is json. Valid values are csv, json, tsv, txt, yaml, cbor, smile, or arrow. |
|
| format | string | false | The format of the query. Default is json. Valid values are csv, json, tsv, txt, yaml, cbor, smile, or arrow. |
|
||||||
| timeout | integer | false | The timeout for the query in seconds. Default is 60 (1 minute). |
|
| timeout | integer | false | The timeout for the query in seconds. Default is 60 (1 minute). |
|
||||||
| parameters | [parameters](../#specifying-parameters) | false | List of [parameters](../#specifying-parameters) that will be used with the ES\|QL query.<br/>Only supports “string”, “integer”, “float”, “boolean”. |
|
| parameters | [parameters](../#specifying-parameters) | false | List of [parameters](../#specifying-parameters) that will be used with the ES\|QL query.<br/>Only supports “string”, “integer”, “float”, “boolean”. |
|
||||||
|
|
||||||
|
|||||||
@@ -18,8 +18,8 @@ database. It's compatible with the following source:
|
|||||||
|
|
||||||
The specified SQL statement is executed as a [prepared statement][fb-prepare],
|
The specified SQL statement is executed as a [prepared statement][fb-prepare],
|
||||||
and supports both positional parameters (`?`) and named parameters (`:param_name`).
|
and supports both positional parameters (`?`) and named parameters (`:param_name`).
|
||||||
Parameters will be inserted according to their position or name. If template
|
Parameters will be inserted according to their position or name. If template
|
||||||
parameters are included, they will be resolved before the execution of the
|
parameters are included, they will be resolved before the execution of the
|
||||||
prepared statement.
|
prepared statement.
|
||||||
|
|
||||||
[fb-prepare]: https://firebirdsql.org/refdocs/langrefupd25-psql-execstat.html
|
[fb-prepare]: https://firebirdsql.org/refdocs/langrefupd25-psql-execstat.html
|
||||||
@@ -125,11 +125,11 @@ tools:
|
|||||||
|
|
||||||
## Reference
|
## Reference
|
||||||
|
|
||||||
| **field** | **type** | **required** | **description** |
|
| **field** | **type** | **required** | **description** |
|
||||||
|--------------------|:------------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------------------------------------------------|
|
|--------------------|:---------------------------------------------:|:------------:|-----------------------------------------------------------------------------------------------------------------------------------------|
|
||||||
| kind | string | true | Must be "firebird-sql". |
|
| kind | string | true | Must be "firebird-sql". |
|
||||||
| source | string | true | Name of the source the SQL should execute on. |
|
| source | string | true | Name of the source the SQL should execute on. |
|
||||||
| description | string | true | Description of the tool that is passed to the LLM. |
|
| description | string | true | Description of the tool that is passed to the LLM. |
|
||||||
| statement | string | true | SQL statement to execute on. |
|
| statement | string | true | SQL statement to execute on. |
|
||||||
| parameters | [parameters](../#specifying-parameters) | false | List of [parameters](../#specifying-parameters) that will be inserted into the SQL statement. |
|
| parameters | [parameters](../#specifying-parameters) | false | List of [parameters](../#specifying-parameters) that will be inserted into the SQL statement. |
|
||||||
| templateParameters | [templateParameters](../#template-parameters) | false | List of [templateParameters](../#template-parameters) that will be inserted into the SQL statement before executing prepared statement. |
|
| templateParameters | [templateParameters](../#template-parameters) | false | List of [templateParameters](../#template-parameters) that will be inserted into the SQL statement before executing prepared statement. |
|
||||||
|
|||||||
@@ -38,6 +38,7 @@ The tool requires Firestore's native JSON format for document data. Each field
|
|||||||
must be wrapped with its type indicator:
|
must be wrapped with its type indicator:
|
||||||
|
|
||||||
### Basic Types
|
### Basic Types
|
||||||
|
|
||||||
- **String**: `{"stringValue": "your string"}`
|
- **String**: `{"stringValue": "your string"}`
|
||||||
- **Integer**: `{"integerValue": "123"}` or `{"integerValue": 123}`
|
- **Integer**: `{"integerValue": "123"}` or `{"integerValue": 123}`
|
||||||
- **Double**: `{"doubleValue": 123.45}`
|
- **Double**: `{"doubleValue": 123.45}`
|
||||||
@@ -47,6 +48,7 @@ must be wrapped with its type indicator:
|
|||||||
- **Timestamp**: `{"timestampValue": "2025-01-07T10:00:00Z"}` (RFC3339 format)
|
- **Timestamp**: `{"timestampValue": "2025-01-07T10:00:00Z"}` (RFC3339 format)
|
||||||
|
|
||||||
### Complex Types
|
### Complex Types
|
||||||
|
|
||||||
- **GeoPoint**: `{"geoPointValue": {"latitude": 34.052235, "longitude": -118.243683}}`
|
- **GeoPoint**: `{"geoPointValue": {"latitude": 34.052235, "longitude": -118.243683}}`
|
||||||
- **Array**: `{"arrayValue": {"values": [{"stringValue": "item1"}, {"integerValue": "2"}]}}`
|
- **Array**: `{"arrayValue": {"values": [{"stringValue": "item1"}, {"integerValue": "2"}]}}`
|
||||||
- **Map**: `{"mapValue": {"fields": {"key1": {"stringValue": "value1"}, "key2": {"booleanValue": true}}}}`
|
- **Map**: `{"mapValue": {"fields": {"key1": {"stringValue": "value1"}, "key2": {"booleanValue": true}}}}`
|
||||||
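A helper that wraps plain values in the type indicators listed above can be sketched as follows. It covers only a subset of the types (no timestamps, bytes, or geo points), so treat it as illustrative rather than a complete mapping:

```python
def to_firestore_value(value):
    """Wrap a plain Python value in Firestore's native JSON typing."""
    if isinstance(value, bool):          # check bool before int
        return {"booleanValue": value}
    if isinstance(value, int):
        return {"integerValue": str(value)}
    if isinstance(value, float):
        return {"doubleValue": value}
    if isinstance(value, str):
        return {"stringValue": value}
    if isinstance(value, list):
        return {"arrayValue": {"values": [to_firestore_value(v) for v in value]}}
    if isinstance(value, dict):
        return {"mapValue": {"fields": {k: to_firestore_value(v)
                                        for k, v in value.items()}}}
    raise TypeError(f"unsupported type: {type(value).__name__}")
```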
@@ -65,6 +67,7 @@ tools:
|
|||||||
```
|
```
|
||||||
|
|
||||||
Usage:
|
Usage:
|
||||||
|
|
||||||
```json
|
```json
|
||||||
{
|
{
|
||||||
"collectionPath": "companies",
|
"collectionPath": "companies",
|
||||||
|
|||||||
@@ -74,11 +74,11 @@ deleted. To delete a field, include it in the `updateMask` but omit it from
|
|||||||
|
|
||||||
## Reference
|
## Reference
|
||||||
|
|
||||||
| **field** | **type** | **required** | **description** |
|
| **field** | **type** | **required** | **description** |
|
||||||
|-------------|:--------------:|:------------:|----------------------------------------------------------|
|
|-------------|:--------:|:------------:|------------------------------------------------------|
|
||||||
| kind | string | true | Must be "firestore-update-document". |
|
| kind | string | true | Must be "firestore-update-document". |
|
||||||
| source | string | true | Name of the Firestore source to update documents in. |
|
| source | string | true | Name of the Firestore source to update documents in. |
|
||||||
| description | string | true | Description of the tool that is passed to the LLM. |
|
| description | string | true | Description of the tool that is passed to the LLM. |
|
||||||
|
|
||||||
## Examples
|
## Examples
|
||||||
|
|
||||||
@@ -91,7 +91,9 @@ tools:
|
|||||||
source: my-firestore
|
source: my-firestore
|
||||||
description: Update a user document
|
description: Update a user document
|
||||||
```
|
```
|
||||||
|
|
||||||
Usage:
|
Usage:
|
||||||
|
|
||||||
```json
|
```json
|
||||||
{
|
{
|
||||||
"documentPath": "users/user123",
|
"documentPath": "users/user123",
|
||||||
@@ -140,7 +142,8 @@ Usage:
|
|||||||
|
|
||||||
### Update with Field Deletion
|
### Update with Field Deletion
|
||||||
|
|
||||||
To delete fields, include them in the `updateMask` but omit them from `documentData`:
|
To delete fields, include them in the `updateMask` but omit them from
|
||||||
|
`documentData`:
|
||||||
|
|
||||||
```json
|
```json
|
||||||
{
|
{
|
||||||
@@ -158,7 +161,8 @@ To delete fields, include them in the `updateMask` but omit them from `documentD
|
|||||||
In this example:
|
In this example:
|
||||||
|
|
||||||
- `name` will be updated to "John Smith"
|
- `name` will be updated to "John Smith"
|
||||||
- `temporaryField` and `obsoleteData` will be deleted from the document (they are in the mask but not in the data)
|
- `temporaryField` and `obsoleteData` will be deleted from the document (they
|
||||||
|
are in the mask but not in the data)
|
||||||
|
|
||||||
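The update-mask semantics described above (masked fields absent from the data are deleted; masked fields present in the data are updated) can be sketched as:

```python
def classify_fields(update_mask: list[str], document_data: dict):
    """Split masked fields into (updated, deleted) per updateMask rules."""
    updated = [f for f in update_mask if f in document_data]
    deleted = [f for f in update_mask if f not in document_data]
    return updated, deleted

updated, deleted = classify_fields(
    ["name", "temporaryField", "obsoleteData"],
    {"name": {"stringValue": "John Smith"}},
)
# updated == ['name']; deleted == ['temporaryField', 'obsoleteData']
```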
### Complex Update with Nested Data
|
### Complex Update with Nested Data
|
||||||
|
|
||||||
@@ -317,28 +321,43 @@ Common errors include:

 ## Best Practices

-1. **Use update masks for precision**: When you only need to update specific fields, use the `updateMask` parameter to avoid unintended changes
-2. **Always use typed values**: Every field must be wrapped with its appropriate type indicator (e.g., `{"stringValue": "text"}`)
-3. **Integer values can be strings**: The tool accepts integer values as strings (e.g., `{"integerValue": "1500"}`)
-4. **Use returnData sparingly**: Only set to true when you need to verify the exact data after the update
-5. **Validate data before sending**: Ensure your data matches Firestore's native JSON format
+1. **Use update masks for precision**: When you only need to update specific
+   fields, use the `updateMask` parameter to avoid unintended changes
+2. **Always use typed values**: Every field must be wrapped with its appropriate
+   type indicator (e.g., `{"stringValue": "text"}`)
+3. **Integer values can be strings**: The tool accepts integer values as strings
+   (e.g., `{"integerValue": "1500"}`)
+4. **Use returnData sparingly**: Only set to true when you need to verify the
+   exact data after the update
+5. **Validate data before sending**: Ensure your data matches Firestore's native
+   JSON format
 6. **Handle timestamps properly**: Use RFC3339 format for timestamp strings
-7. **Base64 encode binary data**: Binary data must be base64 encoded in the `bytesValue` field
-8. **Consider security rules**: Ensure your Firestore security rules allow document updates
-9. **Delete fields using update mask**: To delete fields, include them in the `updateMask` but omit them from `documentData`
-10. **Test with non-production data first**: Always test your updates on non-critical documents first
+7. **Base64 encode binary data**: Binary data must be base64 encoded in the
+   `bytesValue` field
+8. **Consider security rules**: Ensure your Firestore security rules allow
+   document updates
+9. **Delete fields using update mask**: To delete fields, include them in the
+   `updateMask` but omit them from `documentData`
+10. **Test with non-production data first**: Always test your updates on
+    non-critical documents first

 ## Differences from Add Documents

 - **Purpose**: Updates existing documents vs. creating new ones
-- **Document must exist**: For standard updates (though not using updateMask will create if missing with given document id)
+- **Document must exist**: For standard updates (though not using updateMask
+  will create if missing with given document id)
 - **Update mask support**: Allows selective field updates
-- **Field deletion**: Supports removing specific fields by including them in the mask but not in the data
+- **Field deletion**: Supports removing specific fields by including them in the
+  mask but not in the data
 - **Returns updateTime**: Instead of createTime

 ## Related Tools

-- [`firestore-add-documents`](firestore-add-documents.md) - Add new documents to Firestore
-- [`firestore-get-documents`](firestore-get-documents.md) - Retrieve documents by their paths
-- [`firestore-query-collection`](firestore-query-collection.md) - Query documents in a collection
-- [`firestore-delete-documents`](firestore-delete-documents.md) - Delete documents from Firestore
+- [`firestore-add-documents`](firestore-add-documents.md) - Add new documents to
+  Firestore
+- [`firestore-get-documents`](firestore-get-documents.md) - Retrieve documents
+  by their paths
+- [`firestore-query-collection`](firestore-query-collection.md) - Query
+  documents in a collection
+- [`firestore-delete-documents`](firestore-delete-documents.md) - Delete
+  documents from Firestore
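Best practices 2 and 3 in the hunk above concern Firestore's typed-value wrapping. A hedged Python sketch of such a wrapper follows; the type-indicator keys mirror Firestore's REST value representation, but the helper itself is hypothetical and not part of any client library:

```python
import base64

# Illustrative wrapper producing Firestore-style typed values.
def to_typed_value(value):
    if isinstance(value, bool):          # bool is a subclass of int; check first
        return {"booleanValue": value}
    if isinstance(value, int):
        return {"integerValue": str(value)}   # integers may be sent as strings
    if isinstance(value, float):
        return {"doubleValue": value}
    if isinstance(value, str):
        return {"stringValue": value}
    if isinstance(value, bytes):
        return {"bytesValue": base64.b64encode(value).decode("ascii")}
    raise TypeError(f"unsupported type: {type(value).__name__}")

wrapped = to_typed_value(1500)   # {"integerValue": "1500"}
```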
@@ -247,17 +247,17 @@ my-http-tool:

 ## Reference

 | **field** | **type** | **required** | **description** |
-|--------------|:------------------------------------------:|:------------:|----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
+|--------------|:---------------------------------------:|:------------:|----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
 | kind | string | true | Must be "http". |
 | source | string | true | Name of the source the HTTP request should be sent to. |
 | description | string | true | Description of the tool that is passed to the LLM. |
 | path | string | true | The path of the HTTP request. You can include static query parameters in the path string. |
 | method | string | true | The HTTP method to use (e.g., GET, POST, PUT, DELETE). |
 | headers | map[string]string | false | A map of headers to include in the HTTP request (overrides source headers). |
 | requestBody | string | false | The request body payload. Use [go template][go-template-doc] with the parameter name as the placeholder (e.g., `{{.id}}` will be replaced with the value of the parameter that has name `id` in the `bodyParams` section). |
 | queryParams | [parameters](../#specifying-parameters) | false | List of [parameters](../#specifying-parameters) that will be inserted into the query string. |
 | bodyParams | [parameters](../#specifying-parameters) | false | List of [parameters](../#specifying-parameters) that will be inserted into the request body payload. |
 | headerParams | [parameters](../#specifying-parameters) | false | List of [parameters](../#specifying-parameters) that will be inserted as the request headers. |

 [go-template-doc]: <https://pkg.go.dev/text/template#pkg-overview>
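The `requestBody` templating documented in this reference table can be approximated in Python. The real tool uses Go's `text/template`; this regex-based stand-in only illustrates how a `bodyParams` value replaces a `{{.id}}`-style placeholder:

```python
import re

# Rough stand-in for Go template substitution of {{.name}}-style placeholders.
def render_body(template: str, params: dict) -> str:
    return re.sub(
        r"\{\{\.(\w+)\}\}",
        lambda m: str(params[m.group(1)]),
        template,
    )

body = render_body('{"id": {{.id}}, "name": "{{.name}}"}', {"id": 7, "name": "a"})
```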
@@ -11,7 +11,8 @@ aliases:

 ## About

-A `looker-conversational-analytics` tool allows you to ask questions about your Looker data.
+A `looker-conversational-analytics` tool allows you to ask questions about your
+Looker data.

 It's compatible with the following sources:

@@ -19,9 +20,11 @@ It's compatible with the following sources:

 `looker-conversational-analytics` accepts two parameters:

-1. `user_query_with_context`: The question asked of the Conversational Analytics system.
-2. `explore_references`: A list of one to five explores that can be queried to answer the
-   question. The form of the entry is `[{"model": "model name", "explore": "explore name"}, ...]`
+1. `user_query_with_context`: The question asked of the Conversational Analytics
+   system.
+2. `explore_references`: A list of one to five explores that can be queried to
+   answer the question. The form of the entry is `[{"model": "model name",
+   "explore": "explore name"}, ...]`

 ## Example

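The `explore_references` shape documented in the hunk above can be built with a small helper. This is an illustrative sketch under the stated one-to-five constraint; the function name is hypothetical:

```python
# Build explore_references entries of the documented form:
# [{"model": "model name", "explore": "explore name"}, ...]
def make_explore_references(pairs):
    if not 1 <= len(pairs) <= 5:
        raise ValueError("explore_references takes one to five explores")
    return [{"model": m, "explore": e} for m, e in pairs]

refs = make_explore_references([("thelook", "order_items")])
```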
@@ -42,4 +45,4 @@ tools:
 |-------------|:--------:|:------------:|----------------------------------------------------|
 | kind | string | true | Must be "lookerca-conversational-analytics". |
 | source | string | true | Name of the source the SQL should execute on. |
 | description | string | true | Description of the tool that is passed to the LLM. |
@@ -42,4 +42,4 @@ tools:
 |-------------|:--------:|:------------:|----------------------------------------------------|
 | kind | string | true | Must be "looker-create-project-file". |
 | source | string | true | Name of the source Looker instance. |
 | description | string | true | Description of the tool that is passed to the LLM. |
@@ -41,4 +41,4 @@ tools:
 |-------------|:--------:|:------------:|----------------------------------------------------|
 | kind | string | true | Must be "looker-delete-project-file". |
 | source | string | true | Name of the source Looker instance. |
 | description | string | true | Description of the tool that is passed to the LLM. |
@@ -16,8 +16,8 @@ It's compatible with the following sources:

 - [looker](../../sources/looker.md)

-`looker-dev-mode` accepts a boolean parameter, true to enter dev mode and false to exit dev mode.
+`looker-dev-mode` accepts a boolean parameter, true to enter dev mode and false
+to exit dev mode.

 ## Example

@@ -39,4 +39,4 @@ tools:
 |-------------|:--------:|:------------:|----------------------------------------------------|
 | kind | string | true | Must be "looker-dev-mode". |
 | source | string | true | Name of the source Looker instance. |
 | description | string | true | Description of the tool that is passed to the LLM. |
@@ -10,7 +10,10 @@ aliases:

 ## About

-The `looker-generate-embed-url` tool generates an embeddable URL for a given piece of Looker content. The url generated is created for the user authenticated to the Looker source. When opened in the browser it will create a Looker Embed session.
+The `looker-generate-embed-url` tool generates an embeddable URL for a given
+piece of Looker content. The url generated is created for the user authenticated
+to the Looker source. When opened in the browser it will create a Looker Embed
+session.

 It's compatible with the following sources:

@@ -21,7 +24,9 @@ It's compatible with the following sources:
 1. the `type` of content (e.g., "dashboards", "looks", "query-visualization")
 2. the `id` of the content

-It's recommended to use other tools from the Looker MCP toolbox with this tool to do things like fetch dashboard id's, generate a query, etc that can be supplied to this tool.
+It's recommended to use other tools from the Looker MCP toolbox with this tool
+to do things like fetch dashboard id's, generate a query, etc that can be
+supplied to this tool.

 ## Example

@@ -42,6 +47,6 @@ tools:

 | **field** | **type** | **required** | **description** |
 |-------------|:--------:|:------------:|----------------------------------------------------|
 | kind | string | true | Must be "looker-generate-embed-url" |
 | source | string | true | Name of the source the SQL should execute on. |
 | description | string | true | Description of the tool that is passed to the LLM. |
@@ -38,4 +38,4 @@ tools:
 |-------------|:--------:|:------------:|----------------------------------------------------|
 | kind | string | true | Must be "looker-get-connection-databases". |
 | source | string | true | Name of the source Looker instance. |
 | description | string | true | Description of the tool that is passed to the LLM. |
@@ -38,4 +38,4 @@ tools:
 |-------------|:--------:|:------------:|----------------------------------------------------|
 | kind | string | true | Must be "looker-get-connection-schemas". |
 | source | string | true | Name of the source Looker instance. |
 | description | string | true | Description of the tool that is passed to the LLM. |
@@ -12,7 +12,6 @@ aliases:

 A `looker-get-connection-table-columns` tool returns all the columnes for each table specified.

-
 It's compatible with the following sources:

 - [looker](../../sources/looker.md)
@@ -40,4 +39,4 @@ tools:
 |-------------|:--------:|:------------:|----------------------------------------------------|
 | kind | string | true | Must be "looker-get-connection-table-columns". |
 | source | string | true | Name of the source Looker instance. |
 | description | string | true | Description of the tool that is passed to the LLM. |
@@ -16,7 +16,8 @@ It's compatible with the following sources:

 - [looker](../../sources/looker.md)

-`looker-get-connection-tables` accepts a `conn` parameter, a `schema` parameter, and an optional `db` parameter.
+`looker-get-connection-tables` accepts a `conn` parameter, a `schema` parameter,
+and an optional `db` parameter.

 ## Example

@@ -38,4 +39,4 @@ tools:
 |-------------|:--------:|:------------:|----------------------------------------------------|
 | kind | string | true | Must be "looker-get-connection-tables". |
 | source | string | true | Name of the source Looker instance. |
 | description | string | true | Description of the tool that is passed to the LLM. |
@@ -39,4 +39,4 @@ tools:
 |-------------|:--------:|:------------:|----------------------------------------------------|
 | kind | string | true | Must be "looker-get-connections". |
 | source | string | true | Name of the source Looker instance. |
 | description | string | true | Description of the tool that is passed to the LLM. |
@@ -31,6 +31,7 @@ The return type is an array of maps, each map is formatted like:
   "group_label": "group label"
 }
 ```
+
 ## Example

 ```yaml
@@ -52,7 +52,6 @@ The response is a json array with the following elements:
 }
 ```

-
 ## Reference

 | **field** | **type** | **required** | **description** |
@@ -58,7 +58,6 @@ The response is a json array with the following elements:
 }
 ```

-
 ## Reference

 | **field** | **type** | **required** | **description** |
@@ -52,7 +52,6 @@ The response is a json array with the following elements:
 }
 ```

-
 ## Reference

 | **field** | **type** | **required** | **description** |
@@ -38,4 +38,4 @@ tools:
 |-------------|:--------:|:------------:|----------------------------------------------------|
 | kind | string | true | Must be "looker-get-project-files". |
 | source | string | true | Name of the source Looker instance. |
 | description | string | true | Description of the tool that is passed to the LLM. |
@@ -38,4 +38,4 @@ tools:
 |-------------|:--------:|:------------:|----------------------------------------------------|
 | kind | string | true | Must be "looker-get-projects". |
 | source | string | true | Name of the source Looker instance. |
 | description | string | true | Description of the tool that is passed to the LLM. |
@@ -10,11 +10,18 @@ aliases:

 ## About

-The `looker-health-analyze` tool performs various analysis tasks on a Looker instance. The `action` parameter selects the type of analysis to perform:
+The `looker-health-analyze` tool performs various analysis tasks on a Looker
+instance. The `action` parameter selects the type of analysis to perform:

-- `projects`: Analyzes all projects or a specified project, reporting on the number of models and view files, as well as Git connection and validation status.
-- `models`: Analyzes all models or a specified model, providing a count of explores, unused explores, and total query counts.
-- `explores`: Analyzes all explores or a specified explore, reporting on the number of joins, unused joins, fields, unused fields, and query counts. Being classified as **Unused** is determined by whether a field has been used as a field or filter within the past 90 days in production.
+- `projects`: Analyzes all projects or a specified project, reporting on the
+  number of models and view files, as well as Git connection and validation
+  status.
+- `models`: Analyzes all models or a specified model, providing a count of
+  explores, unused explores, and total query counts.
+- `explores`: Analyzes all explores or a specified explore, reporting on the
+  number of joins, unused joins, fields, unused fields, and query counts. Being
+  classified as **Unused** is determined by whether a field has been used as a
+  field or filter within the past 90 days in production.

 ## Parameters

@@ -54,4 +61,4 @@ tools:
 |-------------|:--------:|:------------:|----------------------------------------------------|
 | kind | string | true | Must be "looker-health-analyze" |
 | source | string | true | Looker source name |
 | description | string | true | Description of the tool that is passed to the LLM. |
@@ -10,31 +10,36 @@ aliases:

 ## About

-The `looker-health-pulse` tool performs health checks on a Looker instance. The `action` parameter selects the type of check to perform:
+The `looker-health-pulse` tool performs health checks on a Looker instance. The
+`action` parameter selects the type of check to perform:

-- `check_db_connections`: Checks all database connections, runs supported tests, and reports query counts.
-- `check_dashboard_performance`: Finds dashboards with slow running queries in the last 7 days.
-- `check_dashboard_errors`: Lists dashboards with erroring queries in the last 7 days.
-- `check_explore_performance`: Lists the slowest explores in the last 7 days and reports average query runtime.
-- `check_schedule_failures`: Lists schedules that have failed in the last 7 days.
-- `check_legacy_features`: Lists enabled legacy features. (*To note, this function is not
-  available in Looker Core.*)
+- `check_db_connections`: Checks all database connections, runs supported tests,
+  and reports query counts.
+- `check_dashboard_performance`: Finds dashboards with slow running queries in
+  the last 7 days.
+- `check_dashboard_errors`: Lists dashboards with erroring queries in the last 7
+  days.
+- `check_explore_performance`: Lists the slowest explores in the last 7 days and
+  reports average query runtime.
+- `check_schedule_failures`: Lists schedules that have failed in the last 7
+  days.
+- `check_legacy_features`: Lists enabled legacy features. (*To note, this
+  function is not available in Looker Core.*)

 ## Parameters

 | **field** | **type** | **required** | **description** |
-|---------------|:--------:|:------------:|---------------------------------------------|
+|-----------|:--------:|:------------:|-----------------------------|
 | action | string | true | The health check to perform |

 | **action** | **description** |
-|---------------------------|--------------------------------------------------------------------------------|
+|-----------------------------|---------------------------------------------------------------------|
 | check_db_connections | Checks all database connections and reports query counts and errors |
 | check_dashboard_performance | Finds dashboards with slow queries (>30s) in the last 7 days |
 | check_dashboard_errors | Lists dashboards with erroring queries in the last 7 days |
 | check_explore_performance | Lists slowest explores and average query runtime |
 | check_schedule_failures | Lists failed schedules in the last 7 days |
 | check_legacy_features | Lists enabled legacy features |

 ## Example

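The `action` parameter documented in the hunk above maps each value to one health check. A minimal dispatch-table sketch follows; the stub implementations are hypothetical stand-ins for the real checks:

```python
# Hypothetical dispatch table mirroring the documented pulse actions.
CHECKS = {
    "check_db_connections": lambda: "db connections report",
    "check_dashboard_performance": lambda: "slow dashboards (last 7 days)",
    "check_dashboard_errors": lambda: "erroring dashboards (last 7 days)",
    "check_explore_performance": lambda: "slowest explores",
    "check_schedule_failures": lambda: "failed schedules (last 7 days)",
    "check_legacy_features": lambda: "legacy features",  # not in Looker Core
}

def pulse(action: str) -> str:
    try:
        return CHECKS[action]()
    except KeyError:
        raise ValueError(f"unknown action: {action}") from None

result = pulse("check_schedule_failures")
```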
@@ -66,4 +71,4 @@ tools:
 |-------------|:--------:|:------------:|----------------------------------------------------|
 | kind | string | true | Must be "looker-health-pulse" |
 | source | string | true | Looker source name |
 | description | string | true | Description of the tool that is passed to the LLM. |
@@ -10,7 +10,9 @@ aliases:

 ## About

-The `looker-health-vacuum` tool helps you identify unused LookML objects such as models, explores, joins, and fields. The `action` parameter selects the type of vacuum to perform:
+The `looker-health-vacuum` tool helps you identify unused LookML objects such as
+models, explores, joins, and fields. The `action` parameter selects the type of
+vacuum to perform:

 - `models`: Identifies unused explores within a model.
 - `explores`: Identifies unused joins and fields within an explore.
@@ -28,7 +30,8 @@ The `looker-health-vacuum` tool helps you identify unused LookML objects such as

 ## Example

-Identify unnused fields (*in this case, less than 1 query in the last 20 days*) and joins in the `order_items` explore and `thelook` model
+Identify unnused fields (*in this case, less than 1 query in the last 20 days*)
+and joins in the `order_items` explore and `thelook` model

 ```yaml
 tools:
@@ -52,9 +55,8 @@ tools:
 The result is a list of objects that are candidates for deletion.
 ```

-
 | **field** | **type** | **required** | **description** |
 |-------------|:--------:|:------------:|----------------------------------------------------|
 | kind | string | true | Must be "looker-health-vacuum" |
 | source | string | true | Looker source name |
 | description | string | true | Description of the tool that is passed to the LLM. |
@@ -40,4 +40,4 @@ tools:
 |-------------|:--------:|:------------:|----------------------------------------------------|
 | kind | string | true | Must be "looker-run-dashboard" |
 | source | string | true | Name of the source the SQL should execute on. |
 | description | string | true | Description of the tool that is passed to the LLM. |
@@ -42,4 +42,4 @@ tools:
 |-------------|:--------:|:------------:|----------------------------------------------------|
 | kind | string | true | Must be "looker-update-project-file". |
 | source | string | true | Name of the source Looker instance. |
 | description | string | true | Description of the tool that is passed to the LLM. |
@@ -8,24 +8,36 @@ description: >

## About

MindsDB is the most widely adopted AI federated database that enables you to
query hundreds of datasources and ML models through a single SQL interface. The
following tools work with MindsDB databases:

- [mindsdb-execute-sql](mindsdb-execute-sql.md) - Execute SQL queries directly
  on MindsDB
- [mindsdb-sql](mindsdb-sql.md) - Execute parameterized SQL queries on MindsDB

These tools leverage MindsDB's capabilities to:

- **Connect to Multiple Datasources**: Query databases, APIs, file systems, and
  more through SQL
- **Cross-Datasource Operations**: Perform joins and analytics across different
  data sources
- **ML Model Integration**: Use trained ML models as virtual tables for
  predictions
- **Unstructured Data Processing**: Query documents, images, and other
  unstructured data as structured tables
- **Real-time Predictions**: Get real-time predictions from ML models through
  SQL
- **API Translation**: Automatically translate SQL queries into REST APIs,
  GraphQL, and native protocols

## Supported Datasources

MindsDB automatically translates your SQL queries into the appropriate APIs for
hundreds of datasources:

### **Business Applications**

- **Salesforce**: Query leads, opportunities, accounts, and custom objects
- **Jira**: Access issues, projects, workflows, and team data
- **GitHub**: Query repositories, commits, pull requests, and issues
@@ -33,6 +45,7 @@ MindsDB automatically translates your SQL queries into the appropriate APIs for
- **HubSpot**: Query contacts, companies, deals, and marketing data

### **Databases & Storage**

- **MongoDB**: Query NoSQL collections as structured tables
- **PostgreSQL/MySQL**: Standard relational databases
- **Redis**: Key-value stores and caching layers
@@ -40,11 +53,13 @@ MindsDB automatically translates your SQL queries into the appropriate APIs for
- **S3/Google Cloud Storage**: File storage and data lakes

### **Communication & Email**

- **Gmail/Outlook**: Query emails, attachments, and metadata
- **Microsoft Teams**: Team communications and files
- **Discord**: Server data and message history

### **Analytics & Monitoring**

- **Google Analytics**: Website traffic and user behavior
- **Mixpanel**: Product analytics and user events
- **Datadog**: Infrastructure monitoring and logs
@@ -53,6 +68,7 @@ MindsDB automatically translates your SQL queries into the appropriate APIs for
## Example Use Cases

### Cross-Datasource Analytics

```sql
-- Join Salesforce opportunities with GitHub activity
SELECT
@@ -66,6 +82,7 @@ WHERE s.stage = 'Closed Won';
```

### Email & Communication Analysis

```sql
-- Analyze email patterns with Slack activity
SELECT
@@ -79,6 +96,7 @@ WHERE e.date >= '2024-01-01';
```

### ML Model Predictions

```sql
-- Use ML model to predict customer churn
SELECT
@@ -90,7 +108,9 @@ FROM customer_churn_model
WHERE predicted_churn_probability > 0.8;
```

Since MindsDB implements the MySQL wire protocol, these tools are functionally
compatible with MySQL tools while providing access to MindsDB's advanced
federated database capabilities.

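Because MindsDB speaks the MySQL wire protocol, any standard MySQL client or
driver should be able to connect to it. The sketch below is illustrative and
not taken from this documentation: it builds a MySQL-style connection URL for
a MindsDB instance, where the host, port, user, and database values are all
placeholders (47334 is the commonly documented default for MindsDB's MySQL
API, but verify it for your deployment).

```python
from urllib.parse import quote


def mindsdb_url(host="127.0.0.1", port=47334, user="mindsdb",
                password="", database="mindsdb"):
    """Build a MySQL-style connection URL for a MindsDB instance.

    All defaults here are illustrative placeholders; substitute the
    values for your own deployment.
    """
    # Percent-encode the password so special characters survive in the URL.
    credentials = user if not password else f"{user}:{quote(password)}"
    return f"mysql://{credentials}@{host}:{port}/{database}"


# Any MySQL-compatible driver or tool that accepts a URL of this shape can
# then issue federated SQL such as the example queries in this document.
print(mindsdb_url())
```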
## Working Configuration Example

@@ -113,4 +133,4 @@ tools:
      Execute SQL queries directly on MindsDB database.
      Use this tool to run any SQL statement against your MindsDB instance.
      Example: SELECT * FROM my_table LIMIT 10
```
@@ -19,16 +19,23 @@ federated database. It's compatible with any of the following sources:
`mindsdb-execute-sql` takes one input parameter `sql` and runs the SQL
statement against the `source`. This tool enables you to:

- **Query Multiple Datasources**: Execute SQL across hundreds of connected
  datasources
- **Cross-Datasource Joins**: Perform joins between different databases, APIs,
  and file systems
- **ML Model Predictions**: Query ML models as virtual tables for real-time
  predictions
- **Unstructured Data**: Query documents, images, and other unstructured data as
  structured tables
- **Federated Analytics**: Perform analytics across multiple datasources
  simultaneously
- **API Translation**: Automatically translate SQL queries into REST APIs,
  GraphQL, and native protocols

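Declaring the tool in a `tools.yaml` follows the same shape as the other tool
configurations in these docs. A minimal sketch using only the required fields
from the reference table below — the tool name `my_mindsdb_tool` and source
name `my-mindsdb-instance` are placeholders, not values from this
documentation:

```yaml
tools:
  # "my_mindsdb_tool" and "my-mindsdb-instance" are placeholder names.
  my_mindsdb_tool:
    kind: mindsdb-execute-sql
    source: my-mindsdb-instance
    description: Execute SQL queries directly against MindsDB.
```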
## Example Queries

### Cross-Datasource Analytics

```sql
-- Join Salesforce opportunities with GitHub activity
SELECT
@@ -43,6 +50,7 @@ GROUP BY s.opportunity_name, s.amount, g.repository_name;
```

### Email & Communication Analysis

```sql
-- Analyze email patterns with Slack activity
SELECT
@@ -57,6 +65,7 @@ GROUP BY e.sender, e.subject, s.channel_name;
```

### ML Model Predictions

```sql
-- Use ML model to predict customer churn
SELECT
@@ -69,6 +78,7 @@ WHERE predicted_churn_probability > 0.8;
```

### MongoDB Query

```sql
-- Query MongoDB collections as structured tables
SELECT
@@ -119,8 +129,8 @@ tools:

## Reference

| **field**   | **type** | **required** | **description**                                    |
|-------------|:--------:|:------------:|----------------------------------------------------|
| kind        | string   | true         | Must be "mindsdb-execute-sql".                     |
| source      | string   | true         | Name of the source the SQL should execute on.      |
| description | string   | true         | Description of the tool that is passed to the LLM. |