Compare commits

...

19 Commits

Author SHA1 Message Date
Yuan Teoh
1613cc7606 chore: update yaml tag in tools 2025-09-09 11:18:26 -07:00
Yuan Teoh
66d5a10790 chore: update unmarshal function for ToolsFile 2025-09-05 17:41:01 -07:00
Yuan Teoh
fea5fed265 chore: add preprocessing layer to convert tools file 2025-09-05 17:40:07 -07:00
Yuan Teoh
a060660580 chore!: update kind field to type 2025-09-04 19:16:53 -07:00
Averi Kitsch
0b2121ea72 feat(prebuilt/alloydb-postgres): support ipType and IAM users (#1324)
2025-09-04 21:36:03 +00:00
Wenxin Du
be30c1c08f docs: Fix PR template link (#1340)
Should link to the Toolbox repo
2025-09-04 21:22:03 +00:00
Wenxin Du
36225aa6db fix(looker): Fix Looker client OAuth check (#1338) 2025-09-04 21:00:05 +00:00
Averi Kitsch
e86a3823b0 chore: update contributing guide (#1337)
2025-09-04 20:35:05 +00:00
Averi Kitsch
afba7a57cd fix: do not print usage on runtime error (#1315)
2025-09-04 20:04:09 +00:00
Wenxin Du
88f4b3028d feat(tools/bigquery): Support end-user credential passthrough on multiple BQ tools (#1314)
Support end-user credential passthrough on BQ tools that use clients.
2025-09-04 15:39:39 -04:00
Anushka Saxena
19a7fe2928 docs: restructure contributing guidelines for adding new db source or tool (#1239)
## Description

This PR restructures the `CONTRIBUTING.md` guide to separate the
documentation for adding new database sources from that for adding new tools.
The "Adding a New Database Source and Tool" section has been split into
two distinct, standalone sections.

Fixes #1158

---------

Signed-off-by: SaxenaAnushka102 <anushkasaxenaa@google.com>
Co-authored-by: Averi Kitsch <akitsch@google.com>
2025-09-04 11:06:28 -07:00
Anmol Shukla
54d33fbd90 chore(testing): setup initial directory structure for JS (#1193) 2025-09-04 11:54:59 +05:30
Anushka Saxena
e929942b74 docs: update readme to clarify toolbox server run instructions (#1002)
### Description

Update
[README.md](https://github.com/googleapis/genai-toolbox/blob/main/README.md#running-the-server)
to clarify Toolbox server run instructions for Binary, Docker, and
Source installations.

### Related issue(s)

Addresses and resolves #952

---------

Signed-off-by: Anushka Saxena <anushkasaxenaa@google.com>
Co-authored-by: Averi Kitsch <akitsch@google.com>
2025-09-03 22:46:47 +00:00
Ajaykumar Yadav
0a8351c32d docs: corrected copilot mcp config (#1253)
## Description
- [x] Corrected the config for MSSQL.
- [x] Updated the Copilot MCP config as per the latest official docs.

```json
//correct config
{
  "servers": {
    "postgres": {
      "type": "stdio",
      "command": "path",
      "args": [
        "--prebuilt",
        "postgres",
        "--stdio"
      ],
      "env": {
        "POSTGRES_HOST": "localhost",
        "POSTGRES_PORT": "5432",
        "POSTGRES_DATABASE": "qlms_db",
        "POSTGRES_USER": "",
        "POSTGRES_PASSWORD": ""
      }
    },
  }
}

```

```json
//outdated/wrong config 1

{
  "mcpServers": {
    "postgres": {
      "type": "stdio",
      "command": "path",
      "args": [
        "--prebuilt",
        "postgres",
        "--stdio"
      ],
      "env": {
        "POSTGRES_HOST": "localhost",
        "POSTGRES_PORT": "5432",
        "POSTGRES_DATABASE": "qlms_db",
        "POSTGRES_USER": "",
        "POSTGRES_PASSWORD": ""
      }
    },
  }
}


//outdated/wrong config 2
{
"mcp":{
  "servers": {
    "postgres": {
      "type": "stdio",
      "command": "path",
      "args": [
        "--prebuilt",
        "postgres",
        "--stdio"
      ],
      "env": {
        "POSTGRES_HOST": "localhost",
        "POSTGRES_PORT": "5432",
        "POSTGRES_DATABASE": "qlms_db",
        "POSTGRES_USER": "",
        "POSTGRES_PASSWORD": ""
      }
    },
  }
}
}
```

🛠️ Fixes: https://github.com/googleapis/genai-toolbox/issues/1252

---------

Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
Co-authored-by: Averi Kitsch <akitsch@google.com>
2025-09-03 15:27:56 -07:00
Mend Renovate
e2ddec9675 chore(deps): update dependency go to v1.25.1 (#1325)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [go](https://go.dev/) ([source](https://redirect.github.com/golang/go)) | toolchain | patch | `1.25.0` -> `1.25.1` |

---

### Release Notes

<details>
<summary>golang/go (go)</summary>

###
[`v1.25.1`](https://redirect.github.com/golang/go/compare/go1.25.0...go1.25.1)

</details>

2025-09-03 14:59:33 -07:00
Dr. Strangelove
2cad82e510 feat(tools/looker): Report field suggestions to agent (#1267)
## Description
---
Report the LookML suggestions array for `suggest_explore` and
`suggest_dimension` to the caller so that the caller can use those
suggestions to improve filtering.

🛠️ Fixes #1247
2025-09-03 14:17:22 -04:00
Valeriy
3ae2526e0f feat(source/mysql): support queryParams in MySQL source (#1299)
Fixes #1286

### Motivation
* Allow secure connections to MySQL without custom code.

### Changes
#### Sources
* `mysql`: `Config.QueryParams` added; DSN building rewritten via
`url.Values`.

---------

Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
Co-authored-by: Yuan Teoh <yuanteoh@google.com>
2025-09-03 10:48:22 -07:00
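To illustrate the change, here is a minimal, hypothetical sketch of DSN building via `url.Values` with extra query parameters; the helper name and signature are illustrative only, not the actual genai-toolbox code.

```go
package main

import (
	"fmt"
	"net/url"
)

// buildMySQLDSN appends arbitrary query parameters (e.g. TLS settings) to a
// MySQL DSN using url.Values, so no custom connection code is needed.
// This is a sketch, not the real source implementation.
func buildMySQLDSN(user, pass, host, port, db string, queryParams map[string]string) string {
	q := url.Values{}
	for k, v := range queryParams {
		q.Set(k, v)
	}
	dsn := fmt.Sprintf("%s:%s@tcp(%s:%s)/%s", user, pass, host, port, db)
	if enc := q.Encode(); enc != "" {
		dsn += "?" + enc
	}
	return dsn
}

func main() {
	// Example: enable TLS via a query parameter instead of custom code.
	dsn := buildMySQLDSN("my_user", "my_pass", "localhost", "3306", "my_db",
		map[string]string{"tls": "true"})
	fmt.Println(dsn) // my_user:my_pass@tcp(localhost:3306)/my_db?tls=true
}
```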
Dr. Strangelove
8755e3db34 feat(tools/looker): Authenticate via end user credentials (#1257)
Authenticate to Looker with end user credentials

🛠️ Fixes #1258

---------

Co-authored-by: Wenxin Du <117315983+duwenxin99@users.noreply.github.com>
2025-09-03 17:29:25 +00:00
Sri Varshitha
ade9a2515b docs: Redirect firestore doc to official cloud documentation (#1210)
## Description
---

This change redirects the documentation for connecting an IDE to
Firestore to the official Google Cloud documentation.

The new documentation is available at:
https://cloud.google.com/firestore/native/docs/connect-ide-using-mcp-toolbox

Co-authored-by: prernakakkar-google <158031829+prernakakkar-google@users.noreply.github.com>
Co-authored-by: Averi Kitsch <akitsch@google.com>
2025-09-03 17:09:01 +00:00
363 changed files with 3944 additions and 3022 deletions

View File

@@ -1,16 +1,19 @@
## Description
---
> Should include a concise description of the changes (bug or feature), it's
> impact, along with a summary of the solution
## PR Checklist
---
> Thank you for opening a Pull Request! Before submitting your PR, there are a
> few things you can do to make sure it goes smoothly:
- [ ] Make sure you reviewed
[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [ ] Make sure to open an issue as a
[bug/issue](https://github.com/googleapis/langchain-google-alloydb-pg-python/issues/new/choose)
[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
before writing your code! That way we can discuss the change, evaluate
designs, and agree on the general idea
- [ ] Ensure the tests and linter pass
@@ -18,4 +21,4 @@
- [ ] Appropriate docs were updated (if necessary)
- [ ] Make sure to add `!` if this involve a breaking change
🛠️ Fixes #<issue_number_goes_here>
🛠️ Fixes #<issue_number_goes_here>

View File

@@ -25,33 +25,42 @@ This project follows
## Contribution process
### Code reviews
> [!NOTE]
> New contributions should always include both unit and integration tests.
All submissions, including submissions by project members, require review. We
use GitHub pull requests for this purpose. Consult
[GitHub Help](https://help.github.com/articles/about-pull-requests/) for more
information on using pull requests.
Within 2-5 days, a reviewer will review your PR. They may approve it, or request
changes. When requesting changes, reviewers should self-assign the PR to ensure
### Code reviews
* Within 2-5 days, a reviewer will review your PR. They may approve it, or request
changes.
* When requesting changes, reviewers should self-assign the PR to ensure
they are aware of any updates.
If additional changes are needed, push additional commits to your PR branch -
this helps the reviewer know which parts of the PR have changed. Commits will be
* If additional changes are needed, push additional commits to your PR branch -
this helps the reviewer know which parts of the PR have changed.
* Commits will be
squashed when merged.
Please follow up with changes promptly. If a PR is awaiting changes by the
* Please follow up with changes promptly.
* If a PR is awaiting changes by the
author for more than 10 days, maintainers may mark that PR as Draft. PRs that
are inactive for more than 30 days may be closed.
### Adding a New Database Source and Tool
## Adding a New Database Source or Tool
We recommend creating an
Please create an
[issue](https://github.com/googleapis/genai-toolbox/issues) before
implementation to ensure we can accept the contribution and no duplicated work.
If you have any questions, reach out on our
[Discord](https://discord.gg/Dmm69peqjh) to chat directly with the team. New
contributions should be added with both unit tests and integration tests.
implementation to ensure we can accept the contribution and no duplicated work. This issue
should include an overview of the API design. If you have any questions, reach out on our
[Discord](https://discord.gg/Dmm69peqjh) to chat directly with the team.
#### 1. Implement the New Data Source
> [!NOTE]
> New tools can be added for [pre-existing data sources](https://github.com/googleapis/genai-toolbox/tree/main/internal/sources). However, any new database source should also include at least one new tool type.
### Adding a New Database Source
We recommend looking at an [example source
implementation](https://github.com/googleapis/genai-toolbox/blob/main/internal/sources/postgres/postgres.go).
@@ -66,7 +75,7 @@ implementation](https://github.com/googleapis/genai-toolbox/blob/main/internal/s
* **Implement the
[`SourceConfig`](https://github.com/googleapis/genai-toolbox/blob/fd300dc606d88bf9f7bba689e2cee4e3565537dd/internal/sources/sources.go#L57)
interface**. This interface requires two methods:
* `SourceConfigKind() string`: Returns a unique string identifier for your
* `SourceConfigType() string`: Returns a unique string identifier for your
data source (e.g., `"newdb"`).
* `Initialize(ctx context.Context, tracer trace.Tracer) (Source, error)`:
Creates a new instance of your data source and establishes a connection to
@@ -74,11 +83,11 @@ implementation](https://github.com/googleapis/genai-toolbox/blob/main/internal/s
* **Implement the
[`Source`](https://github.com/googleapis/genai-toolbox/blob/fd300dc606d88bf9f7bba689e2cee4e3565537dd/internal/sources/sources.go#L63)
interface**. This interface requires one method:
* `SourceKind() string`: Returns the same string identifier as `SourceConfigKind()`.
* `SourceType() string`: Returns the same string identifier as `SourceConfigType()`.
* **Implement `init()`** to register the new Source.
* **Implement Unit Tests** in a file named `newdb_test.go`.
#### 2. Implement the New Tool
### Adding a New Tool
We recommend looking at an [example tool
implementation](https://github.com/googleapis/genai-toolbox/tree/main/internal/tools/postgres/postgressql).
@@ -91,7 +100,7 @@ tools.
* **Implement the
[`ToolConfig`](https://github.com/googleapis/genai-toolbox/blob/fd300dc606d88bf9f7bba689e2cee4e3565537dd/internal/tools/tools.go#L61)
interface**. This interface requires one method:
* `ToolConfigKind() string`: Returns a unique string identifier for your tool
* `ToolConfigType() string`: Returns a unique string identifier for your tool
(e.g., `"newdb"`).
* `Initialize(sources map[string]Source) (Tool, error)`: Creates a new
instance of your tool and validates that it can connect to the specified
@@ -111,7 +120,7 @@ tools.
* **Implement `init()`** to register the new Tool.
* **Implement Unit Tests** in a file named `newdb_test.go`.
#### 3. Add Integration Tests
### Adding Integration Tests
* **Add a test file** under a new directory `tests/newdb`.
* **Add pre-defined integration test suites** in the
@@ -153,7 +162,7 @@ tools.
[temp-param-doc]:
https://googleapis.github.io/genai-toolbox/resources/tools/#template-parameters
#### 4. Add Documentation
### Adding Documentation
* **Update the documentation** to include information about your new data source
and tool. This includes:
@@ -163,7 +172,7 @@ tools.
* **(Optional) Add samples** to the `docs/en/samples/<newdb>` directory.
#### (Optional) 5. Add Prebuilt Tools
### (Optional) Adding Prebuilt Tools
You can provide developers with a set of "build-time" tools to aid common
software development user journeys like viewing and creating tables/collections
@@ -177,7 +186,7 @@ and data.
[internal/prebuiltconfigs/prebuiltconfigs_test.go](internal/prebuiltconfigs/prebuiltconfigs_test.go)
and [cmd/root_test.go](cmd/root_test.go).
#### 6. Submit a Pull Request
## Submitting a Pull Request
Submit a pull request to the repository with your changes. Be sure to include a
detailed description of your changes and any requests for long term testing
@@ -206,7 +215,7 @@ resources.
| style | Update src code, with only formatting and whitespace updates (e.g. code formatter or linter changes). |
Pull requests should always add scope whenever possible. The scope is
formatted as `<scope-type>/<scope-kind>` (e.g., `sources/postgres`, or
formatted as `<scope-type>/<scope-type>` (e.g., `sources/postgres`, or
`tools/mssql-sql`).
Ideally, **each PR covers only one scope**, if this is
@@ -218,4 +227,4 @@ resources.
* **PR Description:** PR description should **always** be included. It should
include a concise description of the changes, it's impact, along with a
summary of the solution. If the PR is related to a specific issue, the issue
number should be mentioned in the PR description (e.g. `Fixes #1`).
number should be mentioned in the PR description (e.g. `Fixes #1`).
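To make the renamed interface methods above concrete, here is a minimal, hypothetical sketch of a `newdb` source. The stand-in interfaces mirror the wording of the guide and are not the exact definitions from internal/sources/sources.go (for example, the real `Initialize` also takes a `trace.Tracer`, omitted here).

```go
package newdb

import (
	"context"
	"fmt"
)

// Stand-in interfaces mirroring the guide's description; the real ones live
// in internal/sources/sources.go and differ in detail.
type Source interface {
	SourceType() string
}

type SourceConfig interface {
	SourceConfigType() string
	Initialize(ctx context.Context) (Source, error)
}

const sourceType = "newdb"

// Config holds the YAML fields for a hypothetical "newdb" source.
type Config struct {
	Name string `yaml:"name"`
	Type string `yaml:"type"`
	Host string `yaml:"host"`
}

// SourceConfigType returns the unique string identifier for this source type.
func (c Config) SourceConfigType() string { return sourceType }

// Initialize creates the source instance and would establish the connection.
func (c Config) Initialize(ctx context.Context) (Source, error) {
	if c.Host == "" {
		return nil, fmt.Errorf("newdb source %q: host is required", c.Name)
	}
	// A real implementation opens and verifies the database connection here,
	// and an init() in this package registers the config type with the server.
	return &NewDB{name: c.Name}, nil
}

// NewDB is the live source handle.
type NewDB struct {
	name string
}

// SourceType returns the same identifier as SourceConfigType.
func (s *NewDB) SourceType() string { return sourceType }
```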

View File

@@ -165,22 +165,66 @@ go install github.com/googleapis/genai-toolbox@v0.13.0
[Configure](#configuration) a `tools.yaml` to define your tools, and then
execute `toolbox` to start the server:
<details open>
<summary>Binary</summary>
To run Toolbox from binary:
```sh
./toolbox --tools-file "tools.yaml"
```
> [!NOTE]
> Toolbox enables dynamic reloading by default. To disable, use the
> `--disable-reload` flag.
**NOTE:**
Toolbox enables dynamic reloading by default. To disable, use the `--disable-reload` flag.
#### Homebrew Users
</details>
If you installed Toolbox using Homebrew, the `toolbox` binary is available in your system path. You can start the server with the same command:
<details>
<summary>Container image</summary>
To run the server after pulling the [container image](#installing-the-server):
```sh
export VERSION=0.11.0 # Use the version you pulled
docker run -p 5000:5000 \
-v $(pwd)/tools.yaml:/app/tools.yaml \
us-central1-docker.pkg.dev/database-toolbox/toolbox/toolbox:$VERSION \
--tools-file "/app/tools.yaml"
```
**NOTE:**
The `-v` flag mounts your local `tools.yaml` into the container, and `-p` maps the container's port `5000` to your host's port `5000`.
</details>
<details>
<summary>Source</summary>
To run the server directly from source, navigate to the project root directory and run:
```sh
go run .
```
**NOTE:**
This command runs the project from source, and is more suitable for development and testing. It does **not** compile a binary into your `$GOPATH`. If you want to compile a binary instead, refer the [Developer Documentation](./DEVELOPER.md#building-the-binary).
</details>
<details>
<summary>Homebrew</summary>
If you installed Toolbox using [Homebrew](https://brew.sh/), the `toolbox` binary is available in your system path. You can start the server with the same command:
```sh
toolbox --tools-file "tools.yaml"
```
</details>
You can use `toolbox help` for a full list of flags! To stop the server, send a
terminate signal (`ctrl+c` on most platforms).
@@ -188,6 +232,7 @@ For more detailed documentation on deploying to different environments, check
out the resources in the [How-to
section](https://googleapis.github.io/genai-toolbox/how-to/)
### Integrating your application
Once your server is up and running, you can load the tools into your
@@ -681,7 +726,7 @@ For more details on configuring different types of sources, see the
### Tools
The `tools` section of a `tools.yaml` define the actions an agent can take: what
kind of tool it is, which source(s) it affects, what parameters it uses, etc.
type of tool it is, which source(s) it affects, what parameters it uses, etc.
```yaml
tools:

View File

@@ -15,6 +15,7 @@
package cmd
import (
"bytes"
"context"
_ "embed"
"fmt"
@@ -219,6 +220,9 @@ func NewCommand(opts ...Option) *Command {
o(cmd)
}
// Do not print Usage on runtime error
cmd.SilenceUsage = true
// Set server version
cmd.cfg.Version = versionString
@@ -261,7 +265,6 @@ func NewCommand(opts ...Option) *Command {
type ToolsFile struct {
Sources server.SourceConfigs `yaml:"sources"`
AuthSources server.AuthServiceConfigs `yaml:"authSources"` // Deprecated: Kept for compatibility.
AuthServices server.AuthServiceConfigs `yaml:"authServices"`
Tools server.ToolConfigs `yaml:"tools"`
Toolsets server.ToolsetConfigs `yaml:"toolsets"`
@@ -290,6 +293,76 @@ func parseEnv(input string) (string, error) {
return output, err
}
func convertToolsFile(ctx context.Context, raw []byte) ([]byte, error) {
logger, err := util.LoggerFromContext(ctx)
if err != nil {
panic(err)
}
keysToCheck := []string{"sources", "authServices", "authSources", "tools", "toolsets"}
var input map[string]any
if err := yaml.Unmarshal(raw, &input); err != nil {
return nil, fmt.Errorf("error unmarshaling tools file: %s", err)
}
// convert to tools file v2
var toolsFileV1 bool
var buf bytes.Buffer
for _, kind := range keysToCheck {
resource, ok := input[kind]
if !ok {
// if this is skipped for all keys, the tools file is in v2
continue
}
toolsFileV1 = true
// convert `authSources` to `authServices`
if kind == "authSources" {
logger.WarnContext(ctx, "`authSources` is deprecated, use `authServices` instead")
kind = "authServices"
}
r, ok := resource.(map[string]any)
if !ok {
return nil, fmt.Errorf("'%s' is not a map", kind)
}
for name, d := range r {
buf.WriteString(fmt.Sprintf("kind: %s\n", kind))
buf.WriteString(fmt.Sprintf("name: %s\n", name))
if kind == "toolsets" {
b, err := yaml.Marshal(d)
if err != nil {
return nil, fmt.Errorf("error marshaling: %s", err)
}
buf.WriteString(fmt.Sprintf("tools:\n%s", b))
} else {
fields, ok := d.(map[string]any)
if !ok {
return nil, fmt.Errorf("fields for %s is not a map", name)
}
// Copy all existing fields, renaming 'kind' to 'type'.
buf.WriteString(fmt.Sprintf("type: %s\n", fields["kind"]))
for key, value := range fields {
if key == "kind" {
continue
}
o := map[string]any{key: value}
b, err := yaml.Marshal(o)
if err != nil {
return nil, fmt.Errorf("error marshaling: %s", err)
}
buf.Write(b)
}
}
buf.WriteString("---\n")
}
}
if !toolsFileV1 {
return raw, nil
}
return buf.Bytes(), nil
}
// parseToolsFile parses the provided yaml into appropriate configs.
func parseToolsFile(ctx context.Context, raw []byte) (ToolsFile, error) {
var toolsFile ToolsFile
@@ -300,8 +373,13 @@ func parseToolsFile(ctx context.Context, raw []byte) (ToolsFile, error) {
}
raw = []byte(output)
raw, err = convertToolsFile(ctx, raw)
if err != nil {
return toolsFile, fmt.Errorf("error converting tools file: %s", err)
}
// Parse contents
err = yaml.UnmarshalContext(ctx, raw, &toolsFile, yaml.Strict())
toolsFile.Sources, toolsFile.AuthServices, toolsFile.Tools, toolsFile.Toolsets, err = server.UnmarshalResourceConfig(ctx, raw)
if err != nil {
return toolsFile, err
}
@@ -331,15 +409,6 @@ func mergeToolsFiles(files ...ToolsFile) (ToolsFile, error) {
}
}
// Check for conflicts and merge authSources (deprecated, but still support)
for name, authSource := range file.AuthSources {
if _, exists := merged.AuthSources[name]; exists {
conflicts = append(conflicts, fmt.Sprintf("authSource '%s' (file #%d)", name, fileIndex+1))
} else {
merged.AuthSources[name] = authSource
}
}
// Check for conflicts and merge authServices
for name, authService := range file.AuthServices {
if _, exists := merged.AuthServices[name]; exists {
@@ -795,11 +864,6 @@ func run(cmd *Command) error {
}
cmd.cfg.SourceConfigs, cmd.cfg.AuthServiceConfigs, cmd.cfg.ToolConfigs, cmd.cfg.ToolsetConfigs = toolsFile.Sources, toolsFile.AuthServices, toolsFile.Tools, toolsFile.Toolsets
authSourceConfigs := toolsFile.AuthSources
if authSourceConfigs != nil {
cmd.logger.WarnContext(ctx, "`authSources` is deprecated, use `authServices` instead")
cmd.cfg.AuthServiceConfigs = authSourceConfigs
}
instrumentation, err := telemetry.CreateTelemetryInstrumentation(versionString)
if err != nil {

View File

@@ -23,6 +23,7 @@ import (
"os"
"path"
"path/filepath"
"reflect"
"regexp"
"runtime"
"strings"
@@ -31,6 +32,7 @@ import (
"github.com/google/go-cmp/cmp"
yaml "github.com/goccy/go-yaml"
"github.com/googleapis/genai-toolbox/internal/auth/google"
"github.com/googleapis/genai-toolbox/internal/log"
"github.com/googleapis/genai-toolbox/internal/prebuiltconfigs"
@@ -455,6 +457,200 @@ func TestDefaultLogLevel(t *testing.T) {
}
}
func TestConvertToolsFile(t *testing.T) {
ctx, cancelCtx := context.WithTimeout(context.Background(), time.Minute)
defer cancelCtx()
pr, pw := io.Pipe()
defer pw.Close()
defer pr.Close()
logger, err := log.NewStdLogger(pw, pw, "DEBUG")
if err != nil {
t.Fatalf("failed to setup logger %s", err)
}
ctx = util.WithLogger(ctx, logger)
tcs := []struct {
desc string
in string
want string
isErr bool
errStr string
}{
{
desc: "basic convert",
in: `
sources:
my-pg-instance:
kind: cloud-sql-postgres
project: my-project
region: my-region
instance: my-instance
database: my_db
user: my_user
password: my_pass
tools:
example_tool:
kind: postgres-sql
source: my-pg-instance
description: some description
statement: |
SELECT * FROM SQL_STATEMENT;
parameters:
- name: country
type: string
description: some description
toolsets:
example_toolset:
- example_tool`,
want: `
kind: sources
name: my-pg-instance
type: cloud-sql-postgres
project: my-project
region: my-region
instance: my-instance
database: my_db
user: my_user
password: my_pass
---
kind: tools
name: example_tool
type: postgres-sql
source: my-pg-instance
description: some description
statement: |
SELECT * FROM SQL_STATEMENT;
parameters:
- name: country
type: string
description: some description
---
kind: toolsets
name: example_toolset
tools:
- example_tool`,
},
{
desc: "no convertion needed",
in: `
kind: sources
name: my-pg-instance
type: cloud-sql-postgres
project: my-project
region: my-region
instance: my-instance
database: my_db
user: my_user
password: my_pass
---
kind: tools
name: example_tool
type: postgres-sql
source: my-pg-instance
description: some description
statement: |
SELECT * FROM SQL_STATEMENT;
parameters:
- name: country
type: string
description: some description
---
kind: toolsets
name: example_toolset
tools:
- example_tool`,
want: `
kind: sources
name: my-pg-instance
type: cloud-sql-postgres
project: my-project
region: my-region
instance: my-instance
database: my_db
user: my_user
password: my_pass
---
kind: tools
name: example_tool
type: postgres-sql
source: my-pg-instance
description: some description
statement: |
SELECT * FROM SQL_STATEMENT;
parameters:
- name: country
type: string
description: some description
---
kind: toolsets
name: example_toolset
tools:
- example_tool`,
},
{
desc: "invalid source",
in: `sources: invalid`,
isErr: true,
errStr: "'sources' is not a map",
},
{
desc: "invalid toolset",
in: `toolsets: invalid`,
isErr: true,
errStr: "'toolsets' is not a map",
},
}
for _, tc := range tcs {
t.Run(tc.desc, func(t *testing.T) {
output, err := convertToolsFile(ctx, []byte(tc.in))
if tc.isErr {
if err == nil {
t.Fatalf("missing error: %s", tc.errStr)
}
if err.Error() != tc.errStr {
t.Fatalf("invalid error string: got %s, want %s", err, tc.errStr)
}
return
}
if err != nil {
t.Fatalf("unexpected error: %s", err)
}
outputObj, err := unmarshalYAMLUtil(output)
if err != nil {
t.Fatalf("unexpected error: %s", err)
}
wantObj, err := unmarshalYAMLUtil([]byte(tc.want))
if err != nil {
t.Fatalf("unexpected error: %s", err)
}
if !reflect.DeepEqual(outputObj, wantObj) {
t.Fatalf("incorrect output: got %s, want %s", outputObj, wantObj)
}
})
}
}
// unmarshalYAMLUtil decodes a multi-document YAML string into a slice of generic maps.
func unmarshalYAMLUtil(yamlStr []byte) ([]map[string]any, error) {
decoder := yaml.NewDecoder(bytes.NewReader(yamlStr))
var docs []map[string]any
for {
var doc map[string]any
err := decoder.Decode(&doc)
if err != nil {
if err.Error() == "EOF" {
break
}
return nil, err
}
docs = append(docs, doc)
}
return docs, nil
}
func TestParseToolFile(t *testing.T) {
ctx, err := testutils.ContextWithNewLogger()
if err != nil {
@@ -496,7 +692,7 @@ func TestParseToolFile(t *testing.T) {
Sources: server.SourceConfigs{
"my-pg-instance": cloudsqlpgsrc.Config{
Name: "my-pg-instance",
Kind: cloudsqlpgsrc.SourceKind,
Type: cloudsqlpgsrc.SourceType,
Project: "my-project",
Region: "my-region",
Instance: "my-instance",
@@ -509,7 +705,72 @@ func TestParseToolFile(t *testing.T) {
Tools: server.ToolConfigs{
"example_tool": postgressql.Config{
Name: "example_tool",
Kind: "postgres-sql",
Type: "postgres-sql",
Source: "my-pg-instance",
Description: "some description",
Statement: "SELECT * FROM SQL_STATEMENT;\n",
Parameters: []tools.Parameter{
tools.NewStringParameter("country", "some description"),
},
AuthRequired: []string{},
},
},
Toolsets: server.ToolsetConfigs{
"example_toolset": tools.ToolsetConfig{
Name: "example_toolset",
ToolNames: []string{"example_tool"},
},
},
},
},
{
description: "basic example",
in: `
kind: sources
name: my-pg-instance
type: cloud-sql-postgres
project: my-project
region: my-region
instance: my-instance
database: my_db
user: my_user
password: my_pass
---
kind: tools
name: example_tool
type: postgres-sql
source: my-pg-instance
description: some description
statement: |
SELECT * FROM SQL_STATEMENT;
parameters:
- name: country
type: string
description: some description
---
kind: toolsets
name: example_toolset
tools:
- example_tool
`,
wantToolsFile: ToolsFile{
Sources: server.SourceConfigs{
"my-pg-instance": cloudsqlpgsrc.Config{
Name: "my-pg-instance",
Type: cloudsqlpgsrc.SourceType,
Project: "my-project",
Region: "my-region",
Instance: "my-instance",
IPType: "public",
Database: "my_db",
User: "my_user",
Password: "my_pass",
},
},
Tools: server.ToolConfigs{
"example_tool": postgressql.Config{
Name: "example_tool",
Type: "postgres-sql",
Source: "my-pg-instance",
Description: "some description",
Statement: "SELECT * FROM SQL_STATEMENT;\n",
@@ -615,7 +876,7 @@ func TestParseToolFileWithAuth(t *testing.T) {
Sources: server.SourceConfigs{
"my-pg-instance": cloudsqlpgsrc.Config{
Name: "my-pg-instance",
Kind: cloudsqlpgsrc.SourceKind,
Type: cloudsqlpgsrc.SourceType,
Project: "my-project",
Region: "my-region",
Instance: "my-instance",
@@ -628,19 +889,19 @@ func TestParseToolFileWithAuth(t *testing.T) {
AuthServices: server.AuthServiceConfigs{
"my-google-service": google.Config{
Name: "my-google-service",
Kind: google.AuthServiceKind,
Type: google.AuthServiceType,
ClientID: "my-client-id",
},
"other-google-service": google.Config{
Name: "other-google-service",
Kind: google.AuthServiceKind,
Type: google.AuthServiceType,
ClientID: "other-client-id",
},
},
Tools: server.ToolConfigs{
"example_tool": postgressql.Config{
Name: "example_tool",
Kind: "postgres-sql",
Type: "postgres-sql",
Source: "my-pg-instance",
Description: "some description",
Statement: "SELECT * FROM SQL_STATEMENT;\n",
@@ -714,7 +975,7 @@ func TestParseToolFileWithAuth(t *testing.T) {
Sources: server.SourceConfigs{
"my-pg-instance": cloudsqlpgsrc.Config{
Name: "my-pg-instance",
Kind: cloudsqlpgsrc.SourceKind,
Type: cloudsqlpgsrc.SourceType,
Project: "my-project",
Region: "my-region",
Instance: "my-instance",
@@ -724,22 +985,22 @@ func TestParseToolFileWithAuth(t *testing.T) {
Password: "my_pass",
},
},
AuthSources: server.AuthServiceConfigs{
AuthServices: server.AuthServiceConfigs{
"my-google-service": google.Config{
Name: "my-google-service",
Kind: google.AuthServiceKind,
Type: google.AuthServiceType,
ClientID: "my-client-id",
},
"other-google-service": google.Config{
Name: "other-google-service",
Kind: google.AuthServiceKind,
Type: google.AuthServiceType,
ClientID: "other-client-id",
},
},
Tools: server.ToolConfigs{
"example_tool": postgressql.Config{
Name: "example_tool",
Kind: "postgres-sql",
Type: "postgres-sql",
Source: "my-pg-instance",
Description: "some description",
Statement: "SELECT * FROM SQL_STATEMENT;\n",
@@ -815,7 +1076,7 @@ func TestParseToolFileWithAuth(t *testing.T) {
Sources: server.SourceConfigs{
"my-pg-instance": cloudsqlpgsrc.Config{
Name: "my-pg-instance",
Kind: cloudsqlpgsrc.SourceKind,
Type: cloudsqlpgsrc.SourceType,
Project: "my-project",
Region: "my-region",
Instance: "my-instance",
@@ -828,19 +1089,19 @@ func TestParseToolFileWithAuth(t *testing.T) {
AuthServices: server.AuthServiceConfigs{
"my-google-service": google.Config{
Name: "my-google-service",
Kind: google.AuthServiceKind,
Type: google.AuthServiceType,
ClientID: "my-client-id",
},
"other-google-service": google.Config{
Name: "other-google-service",
Kind: google.AuthServiceKind,
Type: google.AuthServiceType,
ClientID: "other-client-id",
},
},
Tools: server.ToolConfigs{
"example_tool": postgressql.Config{
Name: "example_tool",
Kind: "postgres-sql",
Type: "postgres-sql",
Source: "my-pg-instance",
Description: "some description",
Statement: "SELECT * FROM SQL_STATEMENT;\n",
@@ -972,7 +1233,7 @@ func TestEnvVarReplacement(t *testing.T) {
Sources: server.SourceConfigs{
"my-http-instance": httpsrc.Config{
Name: "my-http-instance",
Kind: httpsrc.SourceKind,
Type: httpsrc.SourceType,
BaseURL: "http://test_server/",
Timeout: "10s",
DefaultHeaders: map[string]string{"Authorization": "ACTUAL_HEADER"},
@@ -982,19 +1243,19 @@ func TestEnvVarReplacement(t *testing.T) {
AuthServices: server.AuthServiceConfigs{
"my-google-service": google.Config{
Name: "my-google-service",
Kind: google.AuthServiceKind,
Type: google.AuthServiceType,
ClientID: "ACTUAL_CLIENT_ID",
},
"other-google-service": google.Config{
Name: "other-google-service",
Kind: google.AuthServiceKind,
Type: google.AuthServiceType,
ClientID: "ACTUAL_CLIENT_ID_2",
},
},
Tools: server.ToolConfigs{
"example_tool": http.Config{
Name: "example_tool",
Kind: "http",
Type: "http",
Source: "my-instance",
Method: "GET",
Path: "search?name=alice&pet=cat",

View File

@@ -64,7 +64,7 @@ The structured logging outputs log as JSON:
"timestamp":"2024-11-04T16:45:11.987299-08:00",
"severity":"ERROR",
"logging.googleapis.com/sourceLocation":{...},
"message":"unable to parse tool file at \"tools.yaml\": \"cloud-sql-postgres1\" is not a valid kind of data source"
"message":"unable to parse tool file at \"tools.yaml\": \"cloud-sql-postgres1\" is not a valid type of data source"
}
```

View File

@@ -51,7 +51,7 @@ For more details on configuring different types of sources, see the
### Tools
The `tools` section of your `tools.yaml` defines the actions your agent can
take: what kind of tool it is, which source(s) it affects, what parameters it
take: what type of tool it is, which source(s) it affects, what parameters it
uses, etc.
```yaml

View File

@@ -64,380 +64,26 @@ npm install @google/genai
{{< tabpane persist=header >}}
{{< tab header="LangChain" lang="js" >}}
import { ChatGoogleGenerativeAI } from "@langchain/google-genai";
import { ToolboxClient } from "@toolbox-sdk/core";
import { tool } from "@langchain/core/tools";
import { createReactAgent } from "@langchain/langgraph/prebuilt";
import { MemorySaver } from "@langchain/langgraph";
// Replace it with your API key
process.env.GOOGLE_API_KEY = 'your-api-key';
const prompt = `
You're a helpful hotel assistant. You handle hotel searching, booking, and
cancellations. When the user searches for a hotel, mention its name, id,
location and price tier. Always mention hotel ids while performing any
searches. This is very important for any operations. For any bookings or
cancellations, please provide the appropriate confirmation. Be sure to
update checkin or checkout dates if mentioned by the user.
Don't ask for confirmations from the user.
`;
const queries = [
"Find hotels in Basel with Basel in its name.",
"Can you book the Hilton Basel for me?",
"Oh wait, this is too expensive. Please cancel it and book the Hyatt Regency instead.",
"My check in dates would be from April 10, 2024 to April 19, 2024.",
];
async function main() {
const model = new ChatGoogleGenerativeAI({
model: "gemini-2.0-flash",
});
const client = new ToolboxClient("http://127.0.0.1:5000");
const toolboxTools = await client.loadToolset("my-toolset");
// Define the basics of the tool: name, description, schema and core logic
const getTool = (toolboxTool) => tool(toolboxTool, {
name: toolboxTool.getName(),
description: toolboxTool.getDescription(),
schema: toolboxTool.getParamSchema()
});
const tools = toolboxTools.map(getTool);
const agent = createReactAgent({
llm: model,
tools: tools,
checkpointer: new MemorySaver(),
systemPrompt: prompt,
});
const langGraphConfig = {
configurable: {
thread_id: "test-thread",
},
};
for (const query of queries) {
const agentOutput = await agent.invoke(
{
messages: [
{
role: "user",
content: query,
},
],
verbose: true,
},
langGraphConfig
);
const response = agentOutput.messages[agentOutput.messages.length - 1].content;
console.log(response);
}
}
main();
{{< include "quickstart/js/langchain/quickstart.js" >}}
{{< /tab >}}
{{< tab header="GenkitJS" lang="js" >}}
import { ToolboxClient } from "@toolbox-sdk/core";
import { genkit } from "genkit";
import { googleAI } from '@genkit-ai/googleai';
{{< include "quickstart/js/genkit/quickstart.js" >}}
// Replace it with your API key
process.env.GOOGLE_API_KEY = 'your-api-key';
const systemPrompt = `
You're a helpful hotel assistant. You handle hotel searching, booking, and
cancellations. When the user searches for a hotel, mention its name, id,
location and price tier. Always mention hotel ids while performing any
searches. This is very important for any operations. For any bookings or
cancellations, please provide the appropriate confirmation. Be sure to
update checkin or checkout dates if mentioned by the user.
Don't ask for confirmations from the user.
`;
const queries = [
"Find hotels in Basel with Basel in its name.",
"Can you book the Hilton Basel for me?",
"Oh wait, this is too expensive. Please cancel it and book the Hyatt Regency instead.",
"My check in dates would be from April 10, 2024 to April 19, 2024.",
];
async function main() {
const toolboxClient = new ToolboxClient("http://127.0.0.1:5000");
const ai = genkit({
plugins: [
googleAI({
apiKey: process.env.GEMINI_API_KEY || process.env.GOOGLE_API_KEY
})
],
model: googleAI.model('gemini-2.0-flash'),
});
const toolboxTools = await toolboxClient.loadToolset("my-toolset");
const toolMap = Object.fromEntries(
toolboxTools.map((tool) => {
const definedTool = ai.defineTool(
{
name: tool.getName(),
description: tool.getDescription(),
inputSchema: tool.getParamSchema(),
},
tool
);
return [tool.getName(), definedTool];
})
);
const tools = Object.values(toolMap);
let conversationHistory = [{ role: "system", content: [{ text: systemPrompt }] }];
for (const query of queries) {
conversationHistory.push({ role: "user", content: [{ text: query }] });
const response = await ai.generate({
messages: conversationHistory,
tools: tools,
});
conversationHistory.push(response.message);
const toolRequests = response.toolRequests;
if (toolRequests?.length > 0) {
// Execute tools concurrently and collect their responses.
const toolResponses = await Promise.all(
toolRequests.map(async (call) => {
try {
const toolOutput = await toolMap[call.name].invoke(call.input);
return { role: "tool", content: [{ toolResponse: { name: call.name, output: toolOutput } }] };
} catch (e) {
console.error(`Error executing tool ${call.name}:`, e);
return { role: "tool", content: [{ toolResponse: { name: call.name, output: { error: e.message } } }] };
}
})
);
conversationHistory.push(...toolResponses);
// Call the AI again with the tool results.
response = await ai.generate({ messages: conversationHistory, tools });
conversationHistory.push(response.message);
}
console.log(response.text);
}
}
main();
{{< /tab >}}
{{< tab header="LlamaIndex" lang="js" >}}
import { gemini, GEMINI_MODEL } from "@llamaindex/google";
import { agent } from "@llamaindex/workflow";
import { createMemory, staticBlock, tool } from "llamaindex";
import { ToolboxClient } from "@toolbox-sdk/core";
const TOOLBOX_URL = "http://127.0.0.1:5000"; // Update if needed
process.env.GOOGLE_API_KEY = 'your-api-key'; // Replace it with your API key
const prompt = `
You're a helpful hotel assistant. You handle hotel searching, booking and cancellations.
When the user searches for a hotel, mention its name, id, location and price tier.
Always mention hotel ids while performing any searches — this is very important for operations.
For any bookings or cancellations, please provide the appropriate confirmation.
Update check-in or check-out dates if mentioned by the user.
Don't ask for confirmations from the user.
`;
const queries = [
"Find hotels in Basel with Basel in its name.",
"Can you book the Hilton Basel for me?",
"Oh wait, this is too expensive. Please cancel it and book the Hyatt Regency instead.",
"My check in dates would be from April 10, 2024 to April 19, 2024.",
];
async function main() {
// Connect to MCP Toolbox
const client = new ToolboxClient(TOOLBOX_URL);
const toolboxTools = await client.loadToolset("my-toolset");
const tools = toolboxTools.map((toolboxTool) => {
return tool({
name: toolboxTool.getName(),
description: toolboxTool.getDescription(),
parameters: toolboxTool.getParamSchema(),
execute: toolboxTool,
});
});
// Initialize LLM
const llm = gemini({
model: GEMINI_MODEL.GEMINI_2_0_FLASH,
apiKey: process.env.GOOGLE_API_KEY,
});
const memory = createMemory({
memoryBlocks: [
staticBlock({
content: prompt,
}),
],
});
// Create the Agent
const myAgent = agent({
tools: tools,
llm,
memory,
systemPrompt: prompt,
});
for (const query of queries) {
const result = await myAgent.run(query);
const output = result.data.result;
console.log(`\nUser: ${query}`);
if (typeof output === "string") {
console.log(output.trim());
} else if (typeof output === "object" && "text" in output) {
console.log(output.text.trim());
} else {
console.log(JSON.stringify(output));
}
}
//You may observe some extra logs during execution due to the run method provided by Llama.
console.log("Agent run finished.");
}
main();
{{< include "quickstart/js/llamaindex/quickstart.js" >}}
{{< /tab >}}
{{< tab header="GoogleGenAI" lang="js" >}}
import { GoogleGenAI } from "@google/genai";
import { ToolboxClient } from "@toolbox-sdk/core";
{{< include "quickstart/js/genAI/quickstart.js" >}}
const TOOLBOX_URL = "http://127.0.0.1:5000"; // Update if needed
const GOOGLE_API_KEY = 'enter your api here'; // Replace it with your API key
const prompt = `
You're a helpful hotel assistant. You handle hotel searching, booking, and
cancellations. When the user searches for a hotel, you MUST use the available tools to find information. Mention its name, id,
location and price tier. Always mention hotel id while performing any
searches. This is very important for any operations. For any bookings or
cancellations, please provide the appropriate confirmation. Be sure to
update checkin or checkout dates if mentioned by the user.
Don't ask for confirmations from the user.
`;
const queries = [
"Find hotels in Basel with Basel in its name.",
"Can you book the Hilton Basel for me?",
"Oh wait, this is too expensive. Please cancel it and book the Hyatt Regency instead.",
"My check in dates would be from April 10, 2024 to April 19, 2024.",
];
function mapZodTypeToOpenAPIType(zodTypeName) {
console.log(zodTypeName)
const typeMap = {
'ZodString': 'string',
'ZodNumber': 'number',
'ZodBoolean': 'boolean',
'ZodArray': 'array',
'ZodObject': 'object',
};
return typeMap[zodTypeName] || 'string';
}
async function main() {
const toolboxClient = new ToolboxClient(TOOLBOX_URL);
const toolboxTools = await toolboxClient.loadToolset("my-toolset");
const geminiTools = [{
functionDeclarations: toolboxTools.map(tool => {
const schema = tool.getParamSchema();
const properties = {};
const required = [];
for (const [key, param] of Object.entries(schema.shape)) {
properties[key] = {
type: mapZodTypeToOpenAPIType(param.constructor.name),
description: param.description || '',
};
required.push(key)
}
return {
name: tool.getName(),
description: tool.getDescription(),
parameters: { type: 'object', properties, required },
};
})
}];
const genAI = new GoogleGenAI({ apiKey: GOOGLE_API_KEY });
const chat = genAI.chats.create({
model: "gemini-2.5-flash",
config: {
systemInstruction: prompt,
tools: geminiTools,
}
});
for (const query of queries) {
let currentResult = await chat.sendMessage({ message: query });
let finalResponseGiven = false
while (!finalResponseGiven) {
const response = currentResult;
const functionCalls = response.functionCalls || [];
if (functionCalls.length === 0) {
console.log(response.text)
finalResponseGiven = true;
} else {
const toolResponses = [];
for (const call of functionCalls) {
const toolName = call.name
const toolToExecute = toolboxTools.find(t => t.getName() === toolName);
if (toolToExecute) {
try {
const functionResult = await toolToExecute(call.args);
toolResponses.push({
functionResponse: { name: call.name, response: { result: functionResult } }
});
} catch (e) {
console.error(`Error executing tool '${toolName}':`, e);
toolResponses.push({
functionResponse: { name: call.name, response: { error: e.message } }
});
}
}
}
currentResult = await chat.sendMessage({ message: toolResponses });
}
}
}
}
main();
{{< /tab >}}
{{< /tabpane >}}

View File

@@ -0,0 +1,119 @@
import { GoogleGenAI } from "@google/genai";
import { ToolboxClient } from "@toolbox-sdk/core";
const TOOLBOX_URL = "http://127.0.0.1:5000"; // Update if needed
const GOOGLE_API_KEY = 'enter your api here'; // Replace it with your API key
const prompt = `
You're a helpful hotel assistant. You handle hotel searching, booking, and
cancellations. When the user searches for a hotel, you MUST use the available tools to find information. Mention its name, id,
location and price tier. Always mention hotel id while performing any
searches. This is very important for any operations. For any bookings or
cancellations, please provide the appropriate confirmation. Be sure to
update checkin or checkout dates if mentioned by the user.
Don't ask for confirmations from the user.
`;
const queries = [
"Find hotels in Basel with Basel in its name.",
"Can you book the Hilton Basel for me?",
"Oh wait, this is too expensive. Please cancel it and book the Hyatt Regency instead.",
"My check in dates would be from April 10, 2024 to April 19, 2024.",
];
function mapZodTypeToOpenAPIType(zodTypeName) {
console.log(zodTypeName)
const typeMap = {
'ZodString': 'string',
'ZodNumber': 'number',
'ZodBoolean': 'boolean',
'ZodArray': 'array',
'ZodObject': 'object',
};
return typeMap[zodTypeName] || 'string';
}
async function main() {
const toolboxClient = new ToolboxClient(TOOLBOX_URL);
const toolboxTools = await toolboxClient.loadToolset("my-toolset");
const geminiTools = [{
functionDeclarations: toolboxTools.map(tool => {
const schema = tool.getParamSchema();
const properties = {};
const required = [];
for (const [key, param] of Object.entries(schema.shape)) {
properties[key] = {
type: mapZodTypeToOpenAPIType(param.constructor.name),
description: param.description || '',
};
required.push(key)
}
return {
name: tool.getName(),
description: tool.getDescription(),
parameters: { type: 'object', properties, required },
};
})
}];
const genAI = new GoogleGenAI({ apiKey: GOOGLE_API_KEY });
const chat = genAI.chats.create({
model: "gemini-2.5-flash",
config: {
systemInstruction: prompt,
tools: geminiTools,
}
});
for (const query of queries) {
let currentResult = await chat.sendMessage({ message: query });
let finalResponseGiven = false
while (!finalResponseGiven) {
const response = currentResult;
const functionCalls = response.functionCalls || [];
if (functionCalls.length === 0) {
console.log(response.text)
finalResponseGiven = true;
} else {
const toolResponses = [];
for (const call of functionCalls) {
const toolName = call.name
const toolToExecute = toolboxTools.find(t => t.getName() === toolName);
if (toolToExecute) {
try {
const functionResult = await toolToExecute(call.args);
toolResponses.push({
functionResponse: { name: call.name, response: { result: functionResult } }
});
} catch (e) {
console.error(`Error executing tool '${toolName}':`, e);
toolResponses.push({
functionResponse: { name: call.name, response: { error: e.message } }
});
}
}
}
currentResult = await chat.sendMessage({ message: toolResponses });
}
}
}
}
main();

View File

@@ -0,0 +1,89 @@
import { ToolboxClient } from "@toolbox-sdk/core";
import { genkit } from "genkit";
import { googleAI } from '@genkit-ai/googleai';
// Replace it with your API key
process.env.GOOGLE_API_KEY = 'your-api-key';
const systemPrompt = `
You're a helpful hotel assistant. You handle hotel searching, booking, and
cancellations. When the user searches for a hotel, mention its name, id,
location and price tier. Always mention hotel ids while performing any
searches. This is very important for any operations. For any bookings or
cancellations, please provide the appropriate confirmation. Be sure to
update checkin or checkout dates if mentioned by the user.
Don't ask for confirmations from the user.
`;
const queries = [
"Find hotels in Basel with Basel in its name.",
"Can you book the Hilton Basel for me?",
"Oh wait, this is too expensive. Please cancel it and book the Hyatt Regency instead.",
"My check in dates would be from April 10, 2024 to April 19, 2024.",
];
async function main() {
const toolboxClient = new ToolboxClient("http://127.0.0.1:5000");
const ai = genkit({
plugins: [
googleAI({
apiKey: process.env.GEMINI_API_KEY || process.env.GOOGLE_API_KEY
})
],
model: googleAI.model('gemini-2.0-flash'),
});
const toolboxTools = await toolboxClient.loadToolset("my-toolset");
const toolMap = Object.fromEntries(
toolboxTools.map((tool) => {
const definedTool = ai.defineTool(
{
name: tool.getName(),
description: tool.getDescription(),
inputSchema: tool.getParamSchema(),
},
tool
);
return [tool.getName(), definedTool];
})
);
const tools = Object.values(toolMap);
let conversationHistory = [{ role: "system", content: [{ text: systemPrompt }] }];
for (const query of queries) {
conversationHistory.push({ role: "user", content: [{ text: query }] });
const response = await ai.generate({
messages: conversationHistory,
tools: tools,
});
conversationHistory.push(response.message);
const toolRequests = response.toolRequests;
if (toolRequests?.length > 0) {
// Execute tools concurrently and collect their responses.
const toolResponses = await Promise.all(
toolRequests.map(async (call) => {
try {
const toolOutput = await toolMap[call.name].invoke(call.input);
return { role: "tool", content: [{ toolResponse: { name: call.name, output: toolOutput } }] };
} catch (e) {
console.error(`Error executing tool ${call.name}:`, e);
return { role: "tool", content: [{ toolResponse: { name: call.name, output: { error: e.message } } }] };
}
})
);
conversationHistory.push(...toolResponses);
// Call the AI again with the tool results.
response = await ai.generate({ messages: conversationHistory, tools });
conversationHistory.push(response.message);
}
console.log(response.text);
}
}
main();

View File

@@ -0,0 +1,74 @@
import { ChatGoogleGenerativeAI } from "@langchain/google-genai";
import { ToolboxClient } from "@toolbox-sdk/core";
import { tool } from "@langchain/core/tools";
import { createReactAgent } from "@langchain/langgraph/prebuilt";
import { MemorySaver } from "@langchain/langgraph";
// Replace it with your API key
process.env.GOOGLE_API_KEY = 'your-api-key';
const prompt = `
You're a helpful hotel assistant. You handle hotel searching, booking, and
cancellations. When the user searches for a hotel, mention its name, id,
location and price tier. Always mention hotel ids while performing any
searches. This is very important for any operations. For any bookings or
cancellations, please provide the appropriate confirmation. Be sure to
update checkin or checkout dates if mentioned by the user.
Don't ask for confirmations from the user.
`;
const queries = [
"Find hotels in Basel with Basel in its name.",
"Can you book the Hilton Basel for me?",
"Oh wait, this is too expensive. Please cancel it and book the Hyatt Regency instead.",
"My check in dates would be from April 10, 2024 to April 19, 2024.",
];
async function main() {
const model = new ChatGoogleGenerativeAI({
model: "gemini-2.0-flash",
});
const client = new ToolboxClient("http://127.0.0.1:5000");
const toolboxTools = await client.loadToolset("my-toolset");
// Define the basics of the tool: name, description, schema and core logic
const getTool = (toolboxTool) => tool(toolboxTool, {
name: toolboxTool.getName(),
description: toolboxTool.getDescription(),
schema: toolboxTool.getParamSchema()
});
const tools = toolboxTools.map(getTool);
const agent = createReactAgent({
llm: model,
tools: tools,
checkpointer: new MemorySaver(),
systemPrompt: prompt,
});
const langGraphConfig = {
configurable: {
thread_id: "test-thread",
},
};
for (const query of queries) {
const agentOutput = await agent.invoke(
{
messages: [
{
role: "user",
content: query,
},
],
verbose: true,
},
langGraphConfig
);
const response = agentOutput.messages[agentOutput.messages.length - 1].content;
console.log(response);
}
}
main();

View File

@@ -0,0 +1,79 @@
import { gemini, GEMINI_MODEL } from "@llamaindex/google";
import { agent } from "@llamaindex/workflow";
import { createMemory, staticBlock, tool } from "llamaindex";
import { ToolboxClient } from "@toolbox-sdk/core";

const TOOLBOX_URL = "http://127.0.0.1:5000"; // Update if needed
process.env.GOOGLE_API_KEY = 'your-api-key'; // Replace it with your API key

const prompt = `
You're a helpful hotel assistant. You handle hotel searching, booking and cancellations.
When the user searches for a hotel, mention its name, id, location and price tier.
Always mention hotel ids while performing any searches — this is very important for operations.
For any bookings or cancellations, please provide the appropriate confirmation.
Update check-in or check-out dates if mentioned by the user.
Don't ask for confirmations from the user.
`;

const queries = [
  "Find hotels in Basel with Basel in its name.",
  "Can you book the Hilton Basel for me?",
  "Oh wait, this is too expensive. Please cancel it and book the Hyatt Regency instead.",
  "My check in dates would be from April 10, 2024 to April 19, 2024.",
];

async function main() {
  // Connect to MCP Toolbox and load the toolset
  const client = new ToolboxClient(TOOLBOX_URL);
  const toolboxTools = await client.loadToolset("my-toolset");

  // Wrap each Toolbox tool as a LlamaIndex tool
  const tools = toolboxTools.map((toolboxTool) => {
    return tool({
      name: toolboxTool.getName(),
      description: toolboxTool.getDescription(),
      parameters: toolboxTool.getParamSchema(),
      execute: toolboxTool,
    });
  });

  // Initialize the LLM
  const llm = gemini({
    model: GEMINI_MODEL.GEMINI_2_0_FLASH,
    apiKey: process.env.GOOGLE_API_KEY,
  });

  const memory = createMemory({
    memoryBlocks: [
      staticBlock({
        content: prompt,
      }),
    ],
  });

  // Create the agent
  const myAgent = agent({
    tools: tools,
    llm,
    memory,
    systemPrompt: prompt,
  });

  for (const query of queries) {
    const result = await myAgent.run(query);
    const output = result.data.result;
    console.log(`\nUser: ${query}`);
    if (typeof output === "string") {
      console.log(output.trim());
    } else if (typeof output === "object" && "text" in output) {
      console.log(output.text.trim());
    } else {
      console.log(JSON.stringify(output));
    }
  }

  // You may observe some extra logs during execution due to the run method provided by LlamaIndex.
  console.log("Agent run finished.");
}

main();

View File

@@ -6,328 +6,9 @@ description: >
Connect your IDE to Firestore using Toolbox.
---
[Model Context Protocol (MCP)](https://modelcontextprotocol.io/introduction) is
an open protocol for connecting Large Language Models (LLMs) to data sources
like Firestore. This guide covers how to use [MCP Toolbox for Databases][toolbox]
to expose your developer assistant tools to a Firestore instance:
* [Cursor][cursor]
* [Windsurf][windsurf] (Codium)
* [Visual Studio Code][vscode] (Copilot)
* [Cline][cline] (VS Code extension)
* [Claude desktop][claudedesktop]
* [Claude code][claudecode]
* [Gemini CLI][geminicli]
* [Gemini Code Assist][geminicodeassist]
[toolbox]: https://github.com/googleapis/genai-toolbox
[cursor]: #configure-your-mcp-client
[windsurf]: #configure-your-mcp-client
[vscode]: #configure-your-mcp-client
[cline]: #configure-your-mcp-client
[claudedesktop]: #configure-your-mcp-client
[claudecode]: #configure-your-mcp-client
[geminicli]: #configure-your-mcp-client
[geminicodeassist]: #configure-your-mcp-client
## Set up Firestore
1. Create or select a Google Cloud project.
* [Create a new
project](https://cloud.google.com/resource-manager/docs/creating-managing-projects)
* [Select an existing
project](https://cloud.google.com/resource-manager/docs/creating-managing-projects#identifying_projects)
1. [Enable the Firestore
API](https://console.cloud.google.com/apis/library/firestore.googleapis.com)
for your project.
1. [Create a Firestore
database](https://cloud.google.com/firestore/docs/create-database-web-mobile-client-library)
if you haven't already.
1. Set up authentication for your local environment.
* [Install gcloud CLI](https://cloud.google.com/sdk/docs/install)
* Run `gcloud auth application-default login` to authenticate
## Install MCP Toolbox
1. Download the latest version of Toolbox as a binary. Select the [correct
binary](https://github.com/googleapis/genai-toolbox/releases) corresponding
to your OS and CPU architecture. You must use Toolbox version v0.10.0 or
later:
<!-- {x-release-please-start-version} -->
{{< tabpane persist=header >}}
{{< tab header="linux/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.13.0/linux/amd64/toolbox
{{< /tab >}}
{{< tab header="darwin/arm64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.13.0/darwin/arm64/toolbox
{{< /tab >}}
{{< tab header="darwin/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.13.0/darwin/amd64/toolbox
{{< /tab >}}
{{< tab header="windows/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.13.0/windows/amd64/toolbox
{{< /tab >}}
{{< /tabpane >}}
<!-- {x-release-please-end} -->
1. Make the binary executable:
```bash
chmod +x toolbox
```
1. Verify the installation:
```bash
./toolbox --version
```
## Configure your MCP Client
{{< tabpane text=true >}}
{{% tab header="Claude code" lang="en" %}}
1. Install [Claude
Code](https://docs.anthropic.com/en/docs/agents-and-tools/claude-code/overview).
1. Create a `.mcp.json` file in your project root if it doesn't exist.
1. Add the following configuration, replace the environment variables with your
values, and save:
```json
{
"mcpServers": {
"firestore": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt","firestore","--stdio"],
"env": {
"FIRESTORE_PROJECT": "your-project-id",
"FIRESTORE_DATABASE": "(default)"
}
}
}
}
```
1. Restart Claude code to apply the new configuration.
{{% /tab %}}
{{% tab header="Claude desktop" lang="en" %}}
1. Open [Claude desktop](https://claude.ai/download) and navigate to Settings.
1. Under the Developer tab, tap Edit Config to open the configuration file.
1. Add the following configuration, replace the environment variables with your
values, and save:
```json
{
"mcpServers": {
"firestore": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt","firestore","--stdio"],
"env": {
"FIRESTORE_PROJECT": "your-project-id",
"FIRESTORE_DATABASE": "(default)"
}
}
}
}
```
1. Restart Claude desktop.
1. From the new chat screen, you should see a hammer (MCP) icon appear with the
new MCP server available.
{{% /tab %}}
{{% tab header="Cline" lang="en" %}}
1. Open the [Cline](https://github.com/cline/cline) extension in VS Code and tap
the **MCP Servers** icon.
1. Tap Configure MCP Servers to open the configuration file.
1. Add the following configuration, replace the environment variables with your
values, and save:
```json
{
"mcpServers": {
"firestore": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt","firestore","--stdio"],
"env": {
"FIRESTORE_PROJECT": "your-project-id",
"FIRESTORE_DATABASE": "(default)"
}
}
}
}
```
1. You should see a green active status after the server is successfully
connected.
{{% /tab %}}
{{% tab header="Cursor" lang="en" %}}
1. Create a `.cursor` directory in your project root if it doesn't exist.
1. Create a `.cursor/mcp.json` file if it doesn't exist and open it.
1. Add the following configuration, replace the environment variables with your
values, and save:
```json
{
"mcpServers": {
"firestore": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt","firestore","--stdio"],
"env": {
"FIRESTORE_PROJECT": "your-project-id",
"FIRESTORE_DATABASE": "(default)"
}
}
}
}
```
1. Open [Cursor](https://www.cursor.com/) and navigate to **Settings > Cursor
   Settings > MCP**. You should see a green active status after the server is
   successfully connected.
{{% /tab %}}
{{% tab header="Visual Studio Code (Copilot)" lang="en" %}}
1. Open [VS Code](https://code.visualstudio.com/docs/copilot/overview) and
create a `.vscode` directory in your project root if it doesn't exist.
1. Create a `.vscode/mcp.json` file if it doesn't exist and open it.
1. Add the following configuration, replace the environment variables with your
values, and save:
```json
{
"mcpServers": {
"firestore": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt","firestore","--stdio"],
"env": {
"FIRESTORE_PROJECT": "your-project-id",
"FIRESTORE_DATABASE": "(default)"
}
}
}
}
```
{{% /tab %}}
{{% tab header="Windsurf" lang="en" %}}
1. Open [Windsurf](https://docs.codeium.com/windsurf) and navigate to the
Cascade assistant.
1. Tap on the hammer (MCP) icon, then Configure to open the configuration file.
1. Add the following configuration, replace the environment variables with your
values, and save:
```json
{
"mcpServers": {
"firestore": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt","firestore","--stdio"],
"env": {
"FIRESTORE_PROJECT": "your-project-id",
"FIRESTORE_DATABASE": "(default)"
}
}
}
}
```
{{% /tab %}}
{{% tab header="Gemini CLI" lang="en" %}}
1. Install the [Gemini
CLI](https://github.com/google-gemini/gemini-cli?tab=readme-ov-file#quickstart).
1. In your working directory, create a folder named `.gemini`. Within it, create
a `settings.json` file.
1. Add the following configuration, replace the environment variables with your
values, and then save:
```json
{
"mcpServers": {
"firestore": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt","firestore","--stdio"],
"env": {
"FIRESTORE_PROJECT": "your-project-id",
"FIRESTORE_DATABASE": "(default)"
}
}
}
}
```
{{% /tab %}}
{{% tab header="Gemini Code Assist" lang="en" %}}
1. Install the [Gemini Code
Assist](https://marketplace.visualstudio.com/items?itemName=Google.geminicodeassist)
extension in Visual Studio Code.
1. Enable Agent Mode in Gemini Code Assist chat.
1. In your working directory, create a folder named `.gemini`. Within it, create
a `settings.json` file.
1. Add the following configuration, replace the environment variables with your
values, and then save:
```json
{
"mcpServers": {
"firestore": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt","firestore","--stdio"],
"env": {
"FIRESTORE_PROJECT": "your-project-id",
"FIRESTORE_DATABASE": "(default)"
}
}
}
}
```
{{% /tab %}}
{{< /tabpane >}}
## Use Tools
Your AI tool is now connected to Firestore using MCP. Try asking your AI
assistant to list collections, get documents, query collections, or manage
security rules.
The following tools are available to the LLM:
1. **firestore-get-documents**: Gets multiple documents from Firestore by their
paths
1. **firestore-list-collections**: List Firestore collections for a given parent
path
1. **firestore-delete-documents**: Delete multiple documents from Firestore
1. **firestore-query-collection**: Query documents from a collection with
filtering, ordering, and limit options
1. **firestore-get-rules**: Retrieves the active Firestore security rules for
the current project
1. **firestore-validate-rules**: Validates Firestore security rules syntax and
errors
{{< notice note >}}
Prebuilt tools are pre-1.0, so expect some tool changes between versions. LLMs
will adapt to the tools available, so this shouldn't affect most users.
{{< /notice >}}
<html>
<head>
<link rel="canonical" href="https://cloud.google.com/firestore/native/docs/connect-ide-using-mcp-toolbox"/>
<meta http-equiv="refresh" content="0;url=https://cloud.google.com/firestore/native/docs/connect-ide-using-mcp-toolbox"/>
</head>
</html>

View File

@@ -235,7 +235,7 @@ curl -O https://storage.googleapis.com/genai-toolbox/v0.13.0/windows/amd64/toolb
```json
{
"mcpServers": {
"servers": {
"looker-toolbox": {
"command": "./PATH/TO/toolbox",
"args": ["--stdio", "--prebuilt", "looker"],

View File

@@ -182,19 +182,17 @@ curl -O https://storage.googleapis.com/genai-toolbox/v0.13.0/windows/amd64/toolb
```json
{
"mcp" : {
"servers": {
"cloud-sql-sqlserver": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt","cloud-sql-mssql","--stdio"],
"env": {
"MSSQL_HOST": "",
"MSSQL_PORT": "",
"MSSQL_DATABASE": "",
"MSSQL_USER": "",
"MSSQL_PASSWORD": ""
}
}
"servers": {
"mssql": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt","mssql","--stdio"],
"env": {
"MSSQL_HOST": "",
"MSSQL_PORT": "",
"MSSQL_DATABASE": "",
"MSSQL_USER": "",
"MSSQL_PASSWORD": ""
}
}
}
}
@@ -286,4 +284,4 @@ The following tools are available to the LLM:
{{< notice note >}}
Prebuilt tools are pre-1.0, so expect some tool changes between versions. LLMs will adapt to the tools available, so this shouldn't affect most users.
{{< /notice >}}

View File

@@ -182,7 +182,7 @@ curl -O https://storage.googleapis.com/genai-toolbox/v0.13.0/windows/amd64/toolb
```json
{
"mcpServers": {
"servers": {
"mysql": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt","mysql","--stdio"],

View File

@@ -217,7 +217,7 @@ curl -O https://storage.googleapis.com/genai-toolbox/v0.13.0/windows/amd64/toolb
```json
{
"mcpServers": {
"servers": {
"postgres": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt","postgres","--stdio"],

View File

@@ -19,8 +19,10 @@ See guides, [Connect from your IDE](../how-to/connect-ide/_index.md), for detail
* `ALLOYDB_POSTGRES_CLUSTER`: The ID of your AlloyDB cluster.
* `ALLOYDB_POSTGRES_INSTANCE`: The ID of your AlloyDB instance.
* `ALLOYDB_POSTGRES_DATABASE`: The name of the database to connect to.
* `ALLOYDB_POSTGRES_USER`: The database username.
* `ALLOYDB_POSTGRES_PASSWORD`: The password for the database user.
* `ALLOYDB_POSTGRES_USER`: The database username. Defaults to IAM authentication if unspecified.
* `ALLOYDB_POSTGRES_PASSWORD`: The password for the database user. Defaults to IAM authentication if unspecified.
* `ALLOYDB_POSTGRES_IP_TYPE`: The IP type, either "Public" or "Private".
  Defaults to "Public".
* **Permissions:**
* **AlloyDB Client** (`roles/alloydb.client`) to connect to the instance.
* Database-level permissions (e.g., `SELECT`, `INSERT`) are required to execute queries.

View File

@@ -28,7 +28,7 @@ The following configurations are placed at the top level of a `tools.yaml` file.
{{< notice tip >}}
If you are accessing Toolbox with multiple applications, each
application should register their own Client ID even if they use the same
"kind" of auth provider.
"type" of auth provider.
{{< /notice >}}
```yaml
@@ -300,4 +300,4 @@ func main() {
}
```
## Kinds of Auth Services
## Types of Auth Services

View File

@@ -55,5 +55,5 @@ instead of hardcoding your secrets into the configuration file.
| **field** | **type** | **required** | **description** |
|-----------|:--------:|:------------:|------------------------------------------------------------------|
| kind | string | true | Must be "google". |
| type | string | true | Must be "google". |
| clientId | string | true | Client ID of your application from registering your application. |
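For illustration only, a minimal auth service entry using the new `type` field might look like the following sketch (the `authServices` block, service name, and `GOOGLE_CLIENT_ID` environment variable are placeholders chosen here, not values from this diff):

```yaml
authServices:
  my-google-auth:
    type: google
    clientId: ${GOOGLE_CLIENT_ID}   # placeholder; register your own Client ID
```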

View File

@@ -132,7 +132,7 @@ instead of hardcoding your secrets into the configuration file.
| **field** | **type** | **required** | **description** |
|-----------|:--------:|:------------:|--------------------------------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "alloydb-postgres". |
| type | string | true | Must be "alloydb-postgres". |
| project | string | true | Id of the GCP project that the cluster was created in (e.g. "my-project-id"). |
| region | string | true | Name of the GCP region that the cluster was created in (e.g. "us-central1"). |
| cluster | string | true | Name of the AlloyDB cluster (e.g. "my-cluster"). |

View File

@@ -128,7 +128,7 @@ sources:
| **field** | **type** | **required** | **description** |
|----------------|:--------:|:------------:|-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "bigquery". |
| type | string | true | Must be "bigquery". |
| project | string | true | Id of the Google Cloud project to use for billing and as the default project for BigQuery resources. |
| location | string | false | Specifies the location (e.g., 'us', 'asia-northeast1') in which to run the query job. This location must match the location of any tables referenced in the query. Defaults to the table's location or 'US' if the location cannot be determined. [Learn More](https://cloud.google.com/bigquery/docs/locations) |
| useClientOAuth | bool | false | If true, forwards the client's OAuth access token from the "Authorization" header to downstream queries. |
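As a sketch based on the fields above, a BigQuery source that forwards the client's OAuth token might be declared like this (the source name, project, and location are placeholders):

```yaml
sources:
  my-bigquery-source:
    type: bigquery
    project: my-project-id   # placeholder billing/default project
    location: us             # optional; must match the queried tables
    useClientOAuth: true     # forward the caller's Authorization header
```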

View File

@@ -70,6 +70,6 @@ sources:
| **field** | **type** | **required** | **description** |
|-----------|:--------:|:------------:|-------------------------------------------------------------------------------|
| kind | string | true | Must be "bigtable". |
| type | string | true | Must be "bigtable". |
| project | string | true | Id of the GCP project that the cluster was created in (e.g. "my-project-id"). |
| instance | string | true | Name of the Bigtable instance. |

View File

@@ -81,7 +81,7 @@ instead of hardcoding your secrets into the configuration file.
| **field** | **type** | **required** | **description** |
|-------------|:--------:|:------------:|------------------------------------------------------------------------------------|
| kind | string | true | Must be "clickhouse". |
| type | string | true | Must be "clickhouse". |
| host | string | true | IP address or hostname to connect to (e.g. "127.0.0.1" or "clickhouse.example.com") |
| port | string | true | Port to connect to (e.g. "8443" for HTTPS, "8123" for HTTP) |
| database | string | true | Name of the ClickHouse database to connect to (e.g. "my_database"). |

View File

@@ -106,7 +106,7 @@ instead of hardcoding your secrets into the configuration file.
| **field** | **type** | **required** | **description** |
|-----------|:--------:|:------------:|---------------------------------------------------------------------------------------------|
| kind | string | true | Must be "cloud-sql-mssql". |
| type | string | true | Must be "cloud-sql-mssql". |
| project | string | true | Id of the GCP project that the cluster was created in (e.g. "my-project-id"). |
| region | string | true | Name of the GCP region that the cluster was created in (e.g. "us-central1"). |
| instance | string | true | Name of the Cloud SQL instance within the cluster (e.g. "my-instance"). |

View File

@@ -106,7 +106,7 @@ instead of hardcoding your secrets into the configuration file.
| **field** | **type** | **required** | **description** |
|-----------|:--------:|:------------:|---------------------------------------------------------------------------------------------|
| kind | string | true | Must be "cloud-sql-mysql". |
| type | string | true | Must be "cloud-sql-mysql". |
| project | string | true | Id of the GCP project that the cluster was created in (e.g. "my-project-id"). |
| region | string | true | Name of the GCP region that the cluster was created in (e.g. "us-central1"). |
| instance | string | true | Name of the Cloud SQL instance within the cluster (e.g. "my-instance"). |

View File

@@ -131,7 +131,7 @@ instead of hardcoding your secrets into the configuration file.
| **field** | **type** | **required** | **description** |
|-----------|:--------:|:------------:|--------------------------------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "cloud-sql-postgres". |
| type | string | true | Must be "cloud-sql-postgres". |
| project | string | true | Id of the GCP project that the cluster was created in (e.g. "my-project-id"). |
| region | string | true | Name of the GCP region that the cluster was created in (e.g. "us-central1"). |
| instance | string | true | Name of the Cloud SQL instance within the cluster (e.g. "my-instance"). |

View File

@@ -37,7 +37,7 @@ For more details about alternate addresses and custom ports refer to [Managing C
| **field** | **type** | **required** | **description** |
|----------------------|:--------:|:------------:|---------------------------------------------------------|
| kind | string | true | Must be "couchbase". |
| type | string | true | Must be "couchbase". |
| connectionString | string | true | Connection string for the Couchbase cluster. |
| bucket | string | true | Name of the bucket to connect to. |
| scope | string | true | Name of the scope within the bucket. |

View File

@@ -320,5 +320,5 @@ Logical operators are case-sensitive. `OR` and `AND` are acceptable whereas `or`
| **field** | **type** | **required** | **description** |
|-----------|:--------:|:------------:|----------------------------------------------------------------------------------|
| kind | string | true | Must be "dataplex". |
| type | string | true | Must be "dataplex". |
| project | string | true | ID of the GCP project used for quota and billing purposes (e.g. "my-project-id").|

View File

@@ -59,7 +59,7 @@ instead of hardcoding your secrets into the configuration file.
| **Field** | **Type** | **Required** | **Description** |
|-------------|:--------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "dgraph". |
| type | string | true | Must be "dgraph". |
| dgraphUrl | string | true | Connection URI (e.g. "<https://xxx.cloud.dgraph.io>", "<https://localhost:8080>"). |
| user | string | false | Name of the Dgraph user to connect as (e.g., "groot"). |
| password | string | false | Password of the Dgraph user (e.g., "password"). |

View File

@@ -51,7 +51,7 @@ instead of hardcoding your secrets into the configuration file.
| **field** | **type** | **required** | **description** |
|-----------|:--------:|:------------:|------------------------------------------------------------------------|
| kind | string | true | Must be "firebird". |
| type | string | true | Must be "firebird". |
| host | string | true | IP address to connect to (e.g. "127.0.0.1") |
| port | string | true | Port to connect to (e.g. "3050") |
| database | string | true | Path to the Firebird database file (e.g. "/var/lib/firebird/data/test.fdb"). |

View File

@@ -72,6 +72,6 @@ sources:
| **field** | **type** | **required** | **description** |
|-----------|:--------:|:------------:|----------------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "firestore". |
| type | string | true | Must be "firestore". |
| project | string | true | Id of the GCP project that contains the Firestore database (e.g. "my-project-id"). |
| database | string | false | Name of the Firestore database to connect to. Defaults to "(default)" if not specified. |

View File

@@ -44,7 +44,7 @@ instead of hardcoding your secrets into the configuration file.
| **field** | **type** | **required** | **description** |
|------------------------|:-----------------:|:------------:|------------------------------------------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "http". |
| type | string | true | Must be "http". |
| baseUrl | string | true | The base URL for the HTTP requests (e.g., `https://api.example.com`). |
| timeout | string | false | The timeout for HTTP requests (e.g., "5s", "1m", refer to [ParseDuration][parse-duration-doc] for more examples). Defaults to 30s. |
| headers | map[string]string | false | Default headers to include in the HTTP requests. |

View File

@@ -47,7 +47,8 @@ a self-signed ssl certificate for the Looker server. Anything other than "true"
will be interpreted as false.
The client id and client secret are seemingly random character sequences
assigned by the looker server.
assigned by the looker server. If you are using Looker OAuth, you don't need
these settings.
{{< notice tip >}}
Use environment variable replacement with the format ${ENV_NAME}
@@ -58,12 +59,13 @@ instead of hardcoding your secrets into the configuration file.
| **field** | **type** | **required** | **description** |
| -------------------- | :------: | :----------: | ----------------------------------------------------------------------------------------- |
| kind | string | true | Must be "looker". |
| type | string | true | Must be "looker". |
| base_url | string | true | The URL of your Looker server (with no trailing `/`). |
| client_id | string | true | The client id assigned by Looker. |
| client_secret | string | true | The client secret assigned by Looker. |
| verify_ssl | string | true | Whether to check the ssl certificate of the server. |
| client_id | string | false | The client id assigned by Looker. |
| client_secret | string | false | The client secret assigned by Looker. |
| verify_ssl | string | false | Whether to check the ssl certificate of the server. |
| timeout | string | false | Maximum time to wait for query execution (e.g. "30s", "2m"). By default, 120s is applied. |
| use_client_oauth | string | false | Use OAuth tokens instead of client_id and client_secret. (default: false) |
| show_hidden_models | string | false | Show or hide hidden models. (default: true) |
| show_hidden_explores | string | false | Show or hide hidden explores. (default: true) |
| show_hidden_fields | string | false | Show or hide hidden fields. (default: true) |
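For illustration, a Looker source that relies on client OAuth (so `client_id` and `client_secret` can be omitted) might be configured as in this sketch; the source name and URL are placeholders:

```yaml
sources:
  my-looker-source:
    type: looker
    base_url: https://looker.example.com   # no trailing slash
    verify_ssl: "true"
    use_client_oauth: "true"   # OAuth tokens are used instead of client_id/client_secret
```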

View File

@@ -28,5 +28,5 @@ sources:
| **field** | **type** | **required** | **description** |
|-----------|:--------:|:------------:|-------------------------------------------------------------------|
| kind | string | true | Must be "mongodb". |
| type | string | true | Must be "mongodb". |
| uri | string | true | connection string to connect to MongoDB |

View File

@@ -55,7 +55,7 @@ instead of hardcoding your secrets into the configuration file.
| **field** | **type** | **required** | **description** |
|-----------|:--------:|:------------:|--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "mssql". |
| type | string | true | Must be "mssql". |
| host | string | true | IP address to connect to (e.g. "127.0.0.1"). |
| port | string | true | Port to connect to (e.g. "1433"). |
| database | string | true | Name of the SQL Server database to connect to (e.g. "my_db"). |

View File

@@ -42,6 +42,9 @@ sources:
database: my_db
user: ${USER_NAME}
password: ${PASSWORD}
# Optional TLS and other driver parameters. For example, enable preferred TLS:
# queryParams:
# tls: preferred
queryTimeout: 30s # Optional: query timeout duration
```
@@ -54,10 +57,11 @@ instead of hardcoding your secrets into the configuration file.
| **field** | **type** | **required** | **description** |
| ------------ | :------: | :----------: | ----------------------------------------------------------------------------------------------- |
| kind | string | true | Must be "mysql". |
| type | string | true | Must be "mysql". |
| host | string | true | IP address to connect to (e.g. "127.0.0.1"). |
| port | string | true | Port to connect to (e.g. "3306"). |
| database | string | true | Name of the MySQL database to connect to (e.g. "my_db"). |
| user | string | true | Name of the MySQL user to connect as (e.g. "my-mysql-user"). |
| password | string | true | Password of the MySQL user (e.g. "my-password"). |
| queryTimeout | string | false | Maximum time to wait for query execution (e.g. "30s", "2m"). By default, no timeout is applied. |
| queryParams | map<string,string> | false | Arbitrary DSN parameters passed to the driver (e.g. `tls: preferred`, `charset: utf8mb4`). Useful for enabling TLS or other connection options. |
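Putting these fields together, a MySQL source that enables preferred TLS through `queryParams` might look like the following sketch (host, credentials, and parameter values are placeholders):

```yaml
sources:
  my-mysql-source:
    type: mysql
    host: 127.0.0.1
    port: "3306"
    database: my_db
    user: ${USER_NAME}
    password: ${PASSWORD}
    queryTimeout: 30s        # optional query timeout
    queryParams:             # optional DSN parameters passed to the driver
      tls: preferred
      charset: utf8mb4
```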

View File

@@ -51,7 +51,7 @@ instead of hardcoding your secrets into the configuration file.
| **field** | **type** | **required** | **description** |
|-----------|:--------:|:------------:|----------------------------------------------------------------------|
| kind | string | true | Must be "neo4j". |
| type | string | true | Must be "neo4j". |
| uri | string | true | Connect URI ("bolt://localhost", "neo4j+s://xxx.databases.neo4j.io") |
| user | string | true | Name of the Neo4j user to connect as (e.g. "neo4j"). |
| password | string | true | Password of the Neo4j user (e.g. "my-password"). |

View File

@@ -45,7 +45,7 @@ instead of hardcoding your secrets into the configuration file.
| **field** | **type** | **required** | **description** |
| ------------ | :------: | :----------: |-------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "oceanbase". |
| type | string | true | Must be "oceanbase". |
| host | string | true | IP address to connect to (e.g. "127.0.0.1"). |
| port | string | true | Port to connect to (e.g. "2881"). |
| database | string | true | Name of the OceanBase database to connect to (e.g. "my_db"). |

View File

@@ -59,7 +59,7 @@ instead of hardcoding your secrets into the configuration file.
| **field** | **type** | **required** | **description** |
|-------------|:------------------:|:------------:|------------------------------------------------------------------------|
| kind | string | true | Must be "postgres". |
| type | string | true | Must be "postgres". |
| host | string | true | IP address to connect to (e.g. "127.0.0.1") |
| port | string | true | Port to connect to (e.g. "5432") |
| database | string | true | Name of the Postgres database to connect to (e.g. "my_db"). |

View File

@@ -91,7 +91,7 @@ sources:
| **field** | **type** | **required** | **description** |
|----------------|:--------:|:------------:|---------------------------------------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "memorystore-redis". |
| type | string | true | Must be "memorystore-redis". |
| address | string | true | Primary endpoint for the Memorystore Redis instance to connect to. |
| username | string | false | If you are using a non-default user, specify the user name here. If you are using Memorystore for Redis, leave this field blank |
| password | string | false | If you have [Redis AUTH][auth] enabled, specify the AUTH string here |

View File

@@ -71,7 +71,7 @@ sources:
| **field** | **type** | **required** | **description** |
|-----------|:--------:|:------------:|---------------------------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "spanner". |
| type | string | true | Must be "spanner". |
| project | string | true | Id of the GCP project that the cluster was created in (e.g. "my-project-id"). |
| instance | string | true | Name of the Spanner instance. |
| database | string | true | Name of the database on the Spanner instance |

View File

@@ -61,7 +61,7 @@ sources:
| **field** | **type** | **required** | **description** |
|-----------|:--------:|:------------:|---------------------------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "sqlite". |
| type | string | true | Must be "sqlite". |
| database | string | true | Path to SQLite database file, or ":memory:" for an in-memory database. |
### Connection Properties

View File

@@ -72,7 +72,7 @@ instead of hardcoding your secrets into the configuration file.
| **field** | **type** | **required** | **description** |
|-----------|:--------:|:------------:|--------------------------------------------------------------------------------------------|
| kind | string | true | Must be "tidb". |
| type | string | true | Must be "tidb". |
| host | string | true | IP address or hostname to connect to (e.g. "127.0.0.1" or "gateway01.*.tidbcloud.com"). |
| port | string | true | Port to connect to (typically "4000" for TiDB). |
| database | string | true | Name of the TiDB database to connect to (e.g. "my_db"). |

View File

@@ -49,7 +49,7 @@ instead of hardcoding your secrets into the configuration file.
| **field** | **type** | **required** | **description** |
|-------------|:------------------:|:------------:|------------------------------------------------------------------------|
| kind | string | true | Must be "trino". |
| type | string | true | Must be "trino". |
| host | string | true | Trino coordinator hostname (e.g. "trino.example.com") |
| port | string | true | Trino coordinator port (e.g. "8080", "8443") |
| user | string | false | Username for authentication (e.g. "analyst"). Optional for anonymous access. |

View File

@@ -65,7 +65,7 @@ sources:
| **field** | **type** | **required** | **description** |
|--------------|:--------:|:------------:|----------------------------------------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "valkey". |
| type | string | true | Must be "valkey". |
| address | []string | true | Endpoints for the Valkey instance to connect to. |
| username | string | false | If you are using a non-default user, specify the user name here. If you are using Memorystore for Valkey, leave this field blank |
| password | string | false | Password for the Valkey instance |

View File

@@ -259,4 +259,4 @@ tools:
- other-auth-service
```
## Kinds of tools
## Types of tools

View File

@@ -105,7 +105,7 @@ tools:
| **field** | **type** | **required** | **description** |
|--------------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------|
| kind | string | true | Must be "alloydb-ai-nl". |
| type | string | true | Must be "alloydb-ai-nl". |
| source | string | true | Name of the AlloyDB source the natural language query should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |
| nlConfig | string | true | The name of the `nl_config` in AlloyDB |

View File

@@ -48,7 +48,7 @@ tools:
## Reference
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "bigquery-conversational-analytics". |
| type | string | true | Must be "bigquery-conversational-analytics". |
| source | string | true | Name of the source for chat. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -33,6 +33,6 @@ tools:
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "bigquery-execute-sql". |
| type | string | true | Must be "bigquery-execute-sql". |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -44,6 +44,6 @@ You can use the following sample prompts to call this tool:
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "bigquery-forecast". |
| type | string | true | Must be "bigquery-forecast". |
| source | string | true | Name of the source the forecast tool should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -34,6 +34,6 @@ tools:
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "bigquery-get-dataset-info". |
| type | string | true | Must be "bigquery-get-dataset-info". |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -34,6 +34,6 @@ tools:
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "bigquery-get-table-info". |
| type | string | true | Must be "bigquery-get-table-info". |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -33,6 +33,6 @@ tools:
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "bigquery-list-dataset-ids". |
| type | string | true | Must be "bigquery-list-dataset-ids". |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -34,6 +34,6 @@ tools:
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "bigquery-list-table-ids". |
| type | string | true | Must be "bigquery-list-table-ids". |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -97,7 +97,7 @@ tools:
| **field** | **type** | **required** | **description** |
|--------------------|:------------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "bigquery-sql". |
| type | string | true | Must be "bigquery-sql". |
| source | string | true | Name of the source the GoogleSQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |
| statement | string | true | The GoogleSQL statement to execute. |

View File

@@ -102,7 +102,7 @@ tools:
| **field** | **type** | **required** | **description** |
|--------------------|:------------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "bigtable-sql". |
| type | string | true | Must be "bigtable-sql". |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |
| statement | string | true | SQL statement to execute on. |

View File

@@ -41,6 +41,6 @@ tools:
| **field** | **type** | **required** | **description** |
|-------------|:--------:|:------------:|---------------------------------------------------------|
| kind | string | true | Must be "clickhouse-execute-sql". |
| type | string | true | Must be "clickhouse-execute-sql". |
| source | string | true | Name of the ClickHouse source to execute SQL against. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -73,7 +73,7 @@ tools:
| **field** | **type** | **required** | **description** |
|--------------------|:------------------:|:------------:|-----------------------------------------------------------|
| kind | string | true | Must be "clickhouse-sql". |
| type | string | true | Must be "clickhouse-sql". |
| source | string | true | Name of the ClickHouse source to execute SQL against. |
| description | string | true | Description of the tool that is passed to the LLM. |
| statement | string | true | The SQL statement template to execute. |

View File

@@ -91,7 +91,7 @@ tools:
| **field** | **type** | **required** | **description** |
|--------------------|:------------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "couchbase-sql". |
| type | string | true | Must be "couchbase-sql". |
| source | string | true | Name of the source the SQL query should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |
| statement | string | true | SQL statement to execute |

View File

@@ -55,6 +55,6 @@ tools:
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "dataplex-lookup-entry". |
| type | string | true | Must be "dataplex-lookup-entry". |
| source | string | true | Name of the source the tool should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -57,6 +57,6 @@ tools:
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "dataplex-search-aspect-types". |
| type | string | true | Must be "dataplex-search-aspect-types". |
| source | string | true | Name of the source the tool should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -59,6 +59,6 @@ tools:
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "dataplex-search-entries". |
| type | string | true | Must be "dataplex-search-entries". |
| source | string | true | Name of the source the tool should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -115,7 +115,7 @@ tools:
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|----------------------------------------------------------------------------------------------|
| kind | string | true | Must be "dgraph-dql". |
| type | string | true | Must be "dgraph-dql". |
| source | string | true | Name of the source the dql query should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |
| statement | string | true | dql statement to execute |

View File

@@ -36,6 +36,6 @@ tools:
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "firebird-execute-sql". |
| type | string | true | Must be "firebird-execute-sql". |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -127,7 +127,7 @@ tools:
| **field** | **type** | **required** | **description** |
|---------------------|:---------------------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "firebird-sql". |
| type | string | true | Must be "firebird-sql". |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |
| statement | string | true | SQL statement to execute on. |

View File

@@ -34,6 +34,6 @@ tools:
| **field** | **type** | **required** | **description** |
|-------------|:--------------:|:------------:|----------------------------------------------------------|
| kind | string | true | Must be "firestore-delete-documents". |
| type | string | true | Must be "firestore-delete-documents". |
| source | string | true | Name of the Firestore source to delete documents from. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -34,6 +34,6 @@ tools:
| **field** | **type** | **required** | **description** |
|-------------|:--------------:|:------------:|------------------------------------------------------------|
| kind | string | true | Must be "firestore-get-documents". |
| type | string | true | Must be "firestore-get-documents". |
| source | string | true | Name of the Firestore source to retrieve documents from. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -34,6 +34,6 @@ tools:
| **field** | **type** | **required** | **description** |
|-------------|:-------------:|:------------:|-------------------------------------------------------|
| kind | string | true | Must be "firestore-get-rules". |
| type | string | true | Must be "firestore-get-rules". |
| source | string | true | Name of the Firestore source to retrieve rules from. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -37,6 +37,6 @@ tools:
| **field** | **type** | **required** | **description** |
|-------------|:----------------:|:------------:|--------------------------------------------------------|
| kind | string | true | Must be "firestore-list-collections". |
| type | string | true | Must be "firestore-list-collections". |
| source | string | true | Name of the Firestore source to list collections from. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -63,7 +63,7 @@ When `updateMask` is provided, only the fields listed in the mask are updated. T
| **field** | **type** | **required** | **description** |
|-------------|:--------------:|:------------:|----------------------------------------------------------|
| kind | string | true | Must be "firestore-update-document". |
| type | string | true | Must be "firestore-update-document". |
| source | string | true | Name of the Firestore source to update documents in. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -249,7 +249,7 @@ my-http-tool:
| **field** | **type** | **required** | **description** |
|--------------|:------------------------------------------:|:------------:|----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "http". |
| type | string | true | Must be "http". |
| source | string | true | Name of the source the HTTP request should be sent to. |
| description | string | true | Description of the tool that is passed to the LLM. |
| path | string | true | The path of the HTTP request. You can include static query parameters in the path string. |

View File

@@ -58,6 +58,6 @@ tools:
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "looker-add-dashboard-element" |
| type | string | true | Must be "looker-add-dashboard-element" |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -55,6 +55,6 @@ tools:
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "looker-get-dashboards" |
| type | string | true | Must be "looker-get-dashboards" |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -33,6 +33,12 @@ tools:
It takes two parameters, the model_name looked up from get_models and the
explore_name looked up from get_explores.
If this returns a suggestions field for a dimension, the contents of suggestions
can be used as filters for this field. If this returns a suggest_explore and
suggest_dimension, a query against that explore and dimension can be used to find
valid filters for this field.
```
The response is a json array with the following elements:
@@ -45,7 +51,10 @@ The response is a json array with the following elements:
"label": "field label",
"label_short": "field short label",
"tags": ["tags", ...],
"synonyms": ["synonyms", ...]
"synonyms": ["synonyms", ...],
"suggestions": ["suggestion", ...],
"suggest_explore": "explore",
"suggest_dimension": "dimension"
}
```
@@ -53,6 +62,6 @@ The response is a json array with the following elements:
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "looker-get-dimensions". |
| type | string | true | Must be "looker-get-dimensions". |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -49,6 +49,6 @@ tools:
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "looker-get-explores". |
| type | string | true | Must be "looker-get-explores". |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -45,7 +45,10 @@ The response is a json array with the following elements:
"label": "field label",
"label_short": "field short label",
"tags": ["tags", ...],
"synonyms": ["synonyms", ...]
"synonyms": ["synonyms", ...],
"suggestions": ["suggestion", ...],
"suggest_explore": "explore",
"suggest_dimension": "dimension"
}
```
@@ -54,6 +57,6 @@ The response is a json array with the following elements:
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "looker-get-filters". |
| type | string | true | Must be "looker-get-filters". |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -55,6 +55,6 @@ tools:
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "looker-get-looks" |
| type | string | true | Must be "looker-get-looks" |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -33,6 +33,12 @@ tools:
It takes two parameters, the model_name looked up from get_models and the
explore_name looked up from get_explores.
If this returns a suggestions field for a measure, the contents of suggestions
can be used as filters for this field. If this returns a suggest_explore and
suggest_dimension, a query against that explore and dimension can be used to find
valid filters for this field.
```
The response is a json array with the following elements:
@@ -45,7 +51,10 @@ The response is a json array with the following elements:
"label": "field label",
"label_short": "field short label",
"tags": ["tags", ...],
"synonyms": ["synonyms", ...]
"synonyms": ["synonyms", ...],
"suggestions": ["suggestion", ...],
"suggest_explore": "explore",
"suggest_dimension": "dimension"
}
```
@@ -54,6 +63,6 @@ The response is a json array with the following elements:
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "looker-get-measures". |
| type | string | true | Must be "looker-get-measures". |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -35,6 +35,6 @@ tools:
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "looker-get-models". |
| type | string | true | Must be "looker-get-models". |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -45,7 +45,10 @@ The response is a json array with the following elements:
"label": "field label",
"label_short": "field short label",
"tags": ["tags", ...],
"synonyms": ["synonyms", ...]
"synonyms": ["synonyms", ...],
"suggestions": ["suggestion", ...],
"suggest_explore": "explore",
"suggest_dimension": "dimension"
}
```
@@ -54,6 +57,6 @@ The response is a json array with the following elements:
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "looker-get-parameters". |
| type | string | true | Must be "looker-get-parameters". |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -48,6 +48,6 @@ tools:
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "looker-make-dashboard" |
| type | string | true | Must be "looker-make-dashboard" |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -60,6 +60,6 @@ tools:
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "looker-make-look" |
| type | string | true | Must be "looker-make-look" |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -75,6 +75,6 @@ tools:
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "looker-query-sql" |
| type | string | true | Must be "looker-query-sql" |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -487,6 +487,6 @@ tools:
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "looker-query-url" |
| type | string | true | Must be "looker-query-url" |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -75,6 +75,6 @@ tools:
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "looker-query" |
| type | string | true | Must be "looker-query" |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -38,6 +38,6 @@ tools:
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "looker-run-look" |
| type | string | true | Must be "looker-run-look" |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -22,7 +22,7 @@ array of documents produced by the final stage of the pipeline.
A `readOnly` flag can be set to `true` as a safety measure to ensure the
pipeline does not contain any write stages (like `$out` or `$merge`).
-This tool is compatible with the following source kind:
+This tool is compatible with the following source type:
* [`mongodb`](../../sources/mongodb.md)
@@ -70,7 +70,7 @@ tools:
| **field** | **type** | **required** | **description** |
|:----------------|:---------|:-------------|:---------------------------------------------------------------------------------------------------------------|
-| kind | string | true | Must be `mongodb-aggregate`. |
+| type | string | true | Must be `mongodb-aggregate`. |
| source | string | true | The name of the `mongodb` source to use. |
| description | string | true | A description of the tool that is passed to the LLM. |
| database | string | true | The name of the MongoDB database containing the collection. |

View File
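As a sketch of the renamed field in context, a `mongodb-aggregate` entry might look like the following. The tool, source, and database names and the description are placeholders, the `readOnly` flag comes from the prose above, and any remaining fields (such as the collection or pipeline parameters) are omitted because they fall outside the visible hunks.

```yaml
tools:
  orders_summary:               # hypothetical tool name
    type: mongodb-aggregate     # previously: kind: mongodb-aggregate
    source: my_mongodb          # placeholder mongodb source
    description: Runs an aggregation pipeline and returns the final stage's documents.
    database: sales             # placeholder database name
    readOnly: true              # reject write stages such as $out or $merge
```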

@@ -17,7 +17,7 @@ The tool returns the total count of documents that were deleted. If the filter
does not match any documents (i.e., the deleted count is 0), the tool will
return an error.
-This tool is compatible with the following source kind:
+This tool is compatible with the following source type:
* [`mongodb`](../../sources/mongodb.md)
@@ -48,7 +48,7 @@ tools:
| **field** | **type** | **required** | **description** |
|:--------------|:---------|:-------------|:--------------------------------------------------------------------------------------------------------------------|
-| kind | string | true | Must be `mongodb-delete-many`. |
+| type | string | true | Must be `mongodb-delete-many`. |
| source | string | true | The name of the `mongodb` source to use. |
| description | string | true | A description of the tool that is passed to the LLM. |
| database | string | true | The name of the MongoDB database containing the collection. |

View File
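A comparable placeholder sketch for `mongodb-delete-many`, showing only the fields visible in the hunk; as the prose above notes, the tool returns the total deleted count and reports an error when the filter matches no documents.

```yaml
tools:
  purge_expired_sessions:        # hypothetical tool name
    type: mongodb-delete-many    # previously: kind: mongodb-delete-many
    source: my_mongodb           # placeholder mongodb source
    description: Deletes every document matching the supplied filter.
    database: app                # placeholder database name
    # Returns the total deleted count; a count of 0 is surfaced as an error.
```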

@@ -21,7 +21,7 @@ such as a user account or a single item from an inventory based on a unique ID.
The tool returns the number of documents deleted, which will be either `1` if a
document was found and deleted, or `0` if no matching document was found.
-This tool is compatible with the following source kind:
+This tool is compatible with the following source type:
* [`mongodb`](../../sources/mongodb.md)
@@ -52,7 +52,7 @@ tools:
| **field** | **type** | **required** | **description** |
|:--------------|:---------|:-------------|:-------------------------------------------------------------------------------------------------------------------|
-| kind | string | true | Must be `mongodb-delete-one`. |
+| type | string | true | Must be `mongodb-delete-one`. |
| source | string | true | The name of the `mongodb` source to use. |
| description | string | true | A description of the tool that is passed to the LLM. |
| database | string | true | The name of the MongoDB database containing the collection. |

View File

@@ -18,7 +18,7 @@ returned. Otherwise, the selection is not guaranteed.
The tool returns a single JSON object representing the document, wrapped in a
JSON array.
-This tool is compatible with the following source kind:
+This tool is compatible with the following source type:
* [`mongodb`](../../sources/mongodb.md)
@@ -55,7 +55,7 @@ tools:
| **field** | **type** | **required** | **description** |
|:---------------|:---------|:-------------|:---------------------------------------------------------------------------------------------------------------------------------------------|
-| kind | string | true | Must be `mongodb-find-one`. |
+| type | string | true | Must be `mongodb-find-one`. |
| source | string | true | The name of the `mongodb` source to use. |
| description | string | true | A description of the tool that is passed to the LLM. |
| database | string | true | The name of the MongoDB database to query. |

View File
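For `mongodb-find-one`, a minimal entry under the new field name might be the following placeholder sketch; per the prose above, the result is a single document wrapped in a JSON array, and which document is returned is only guaranteed when a sort order is applied.

```yaml
tools:
  lookup_user:                 # hypothetical tool name
    type: mongodb-find-one     # previously: kind: mongodb-find-one
    source: my_mongodb         # placeholder mongodb source
    description: Fetches one matching document, returned inside a JSON array.
    database: accounts         # placeholder database name
```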

@@ -18,7 +18,7 @@ results (**sorting**), and restricting the number of documents returned
The tool returns a JSON array of the documents found.
-This tool is compatible with the following source kind:
+This tool is compatible with the following source type:
* [`mongodb`](../../sources/mongodb.md)
@@ -62,7 +62,7 @@ tools:
| **field** | **type** | **required** | **description** |
|:---------------|:---------|:-------------|:----------------------------------------------------------------------------------------------------------------------------|
-| kind | string | true | Must be `mongodb-find`. |
+| type | string | true | Must be `mongodb-find`. |
| source | string | true | The name of the `mongodb` source to use. |
| description | string | true | A description of the tool that is passed to the LLM. |
| database | string | true | The name of the MongoDB database to query. |

View File
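And for `mongodb-find`, whose prose describes filtering, sorting, and limiting the result set, a placeholder sketch with only the hunk-visible fields:

```yaml
tools:
  search_products:          # hypothetical tool name
    type: mongodb-find      # previously: kind: mongodb-find
    source: my_mongodb      # placeholder mongodb source
    description: Finds matching documents and returns them as a JSON array.
    database: catalog       # placeholder database name
    # Filter, sort, and limit options live in fields not shown in this hunk.
```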

@@ -19,7 +19,7 @@ be a string containing a **JSON array of document objects**. Upon successful
insertion, the tool returns a JSON array containing the unique `_id` of **each**
new document that was created.
-This tool is compatible with the following source kind:
+This tool is compatible with the following source type:
* [`mongodb`](../../sources/mongodb.md)
@@ -50,7 +50,7 @@ in the `data` parameter, like this:
| **field** | **type** | **required** | **description** |
|:------------|:---------|:-------------|:---------------------------------------------------------------------------------------------------|
-| kind | string | true | Must be `mongodb-insert-many`. |
+| type | string | true | Must be `mongodb-insert-many`. |
| source | string | true | The name of the `mongodb` source to use. |
| description | string | true | A description of the tool that is passed to the LLM. |
| database | string | true | The name of the MongoDB database containing the collection. |

View File
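The `mongodb-insert-many` entry follows the same shape; per the prose above, the tool takes a single `data` parameter holding a JSON array of documents and returns the `_id` of each inserted document. All names below are placeholders.

```yaml
tools:
  bulk_load_events:              # hypothetical tool name
    type: mongodb-insert-many    # previously: kind: mongodb-insert-many
    source: my_mongodb           # placeholder mongodb source
    description: Inserts a batch of documents supplied as a JSON array string.
    database: telemetry          # placeholder database name
    # At call time the LLM supplies one `data` parameter: a string containing a JSON array of documents.
```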

@@ -17,7 +17,7 @@ This tool takes one required parameter named `data`, which must be a string
containing the JSON object you want to insert. Upon successful insertion, the
tool returns the unique `_id` of the newly created document.
-This tool is compatible with the following source kind:
+This tool is compatible with the following source type:
* [`mongodb`](../../sources/mongodb.md)
@@ -45,7 +45,7 @@ An LLM would call this tool by providing the document as a JSON string in the
| **field** | **type** | **required** | **description** |
|:------------|:---------|:-------------|:---------------------------------------------------------------------------------------------------|
-| kind | string | true | Must be `mongodb-insert-one`. |
+| type | string | true | Must be `mongodb-insert-one`. |
| source | string | true | The name of the `mongodb` source to use. |
| description | string | true | A description of the tool that is passed to the LLM. |
| database | string | true | The name of the MongoDB database containing the collection. |

View File
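`mongodb-insert-one` differs only in that `data` carries a single JSON object and the tool returns that document's new `_id`; a placeholder sketch:

```yaml
tools:
  record_signup:                # hypothetical tool name
    type: mongodb-insert-one    # previously: kind: mongodb-insert-one
    source: my_mongodb          # placeholder mongodb source
    description: Inserts one document supplied as a JSON object string.
    database: accounts          # placeholder database name
    # At call time the LLM supplies `data`: a string containing the JSON object to insert.
```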

@@ -17,7 +17,7 @@ MongoDB collection that match a given filter. It locates the documents using a
The tool returns an array of three integers: `[ModifiedCount, UpsertedCount,
MatchedCount]`.
-This tool is compatible with the following source kind:
+This tool is compatible with the following source type:
* [`mongodb`](../../sources/mongodb.md)
@@ -59,7 +59,7 @@ tools:
| **field** | **type** | **required** | **description** |
|:--------------|:---------|:-------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
-| kind | string | true | Must be `mongodb-update-many`. |
+| type | string | true | Must be `mongodb-update-many`. |
| source | string | true | The name of the `mongodb` source to use. |
| description | string | true | A description of the tool that is passed to the LLM. |
| database | string | true | The name of the MongoDB database containing the collection. |

View File
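For `mongodb-update-many`, the prose above describes a `filterPayload` that selects the documents, an `updatePayload` with the modifications, and a return value of `[ModifiedCount, UpsertedCount, MatchedCount]`. A placeholder sketch of the config fields visible in the hunk (the same rename applies to `mongodb-update-one` below, which only updates the first match):

```yaml
tools:
  flag_stale_orders:             # hypothetical tool name
    type: mongodb-update-many    # previously: kind: mongodb-update-many
    source: my_mongodb           # placeholder mongodb source
    description: Applies an update to every document matching the filter.
    database: sales              # placeholder database name
    # Callers pass filterPayload and updatePayload; the result is [ModifiedCount, UpsertedCount, MatchedCount].
```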

@@ -15,7 +15,7 @@ collection. It locates the document to be updated using a `filterPayload` and
applies modifications defined in an `updatePayload`. If the filter matches
multiple documents, only the first one found will be updated.
-This tool is compatible with the following source kind:
+This tool is compatible with the following source type:
* [`mongodb`](../../sources/mongodb.md)
@@ -59,7 +59,7 @@ tools:
| **field** | **type** | **required** | **description** |
|:--------------|:---------|:-------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
-| kind | string | true | Must be `mongodb-update-one`. |
+| type | string | true | Must be `mongodb-update-one`. |
| source | string | true | The name of the `mongodb` source to use. |
| description | string | true | A description of the tool that is passed to the LLM. |
| database | string | true | The name of the MongoDB database containing the collection. |

View File

@@ -37,6 +37,6 @@ tools:
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|----------------------------------------------------|
| kind | string | true | Must be "mssql-execute-sql". |
| type | string | true | Must be "mssql-execute-sql". |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -103,7 +103,7 @@ tools:
| **field** | **type** | **required** | **description** |
|--------------------|:------------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "mssql-sql". |
| type | string | true | Must be "mssql-sql". |
| source | string | true | Name of the source the T-SQL statement should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |
| statement | string | true | SQL statement to execute. |

View File
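The SQL-backed tools carry a `statement` field alongside the common ones, so an `mssql-sql` entry under the renamed field might be sketched as below. The tool name, source name, table, and query are placeholders, and the same shape applies to `mysql-sql` further down.

```yaml
tools:
  top_customers:          # hypothetical tool name
    type: mssql-sql       # previously: kind: mssql-sql
    source: my_mssql      # placeholder SQL Server source
    description: Returns the ten highest-spending customers.
    statement: |
      SELECT TOP 10 name, total_spend
      FROM customers
      ORDER BY total_spend DESC;
```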

@@ -37,6 +37,6 @@ tools:
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "mysql-execute-sql". |
| type | string | true | Must be "mysql-execute-sql". |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -98,7 +98,7 @@ tools:
| **field** | **type** | **required** | **description** |
|--------------------|:------------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "mysql-sql". |
| type | string | true | Must be "mysql-sql". |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |
| statement | string | true | SQL statement to execute on. |

Some files were not shown because too many files have changed in this diff.