docs: fix docs lint (#1677)

fix docs lint for release
Wenxin Du
2025-10-09 19:25:05 -04:00
committed by GitHub
parent 7e22cb455d
commit 98f7ee2e36
22 changed files with 78 additions and 45 deletions


@@ -134,8 +134,10 @@ go test -race -v ./...
```shell
go test -race -v ./tests/alloydbpg
```
1. **Timeout:** The integration test should have a timeout on the server.
Look for code like this:
```go
ctx, cancel := context.WithTimeout(context.Background(), time.Minute)
defer cancel()
@@ -146,6 +148,7 @@ go test -race -v ./...
}
defer cleanup()
```
Be sure to set the timeout to a reasonable value for your tests.
#### Running on Pull Requests
@@ -260,7 +263,6 @@ There are 3 GHA workflows we use to achieve document versioning:
To rebuild and redeploy documentation for an already released version that was released before this new system was in place. This workflow can be started from the UI by providing the git version tag for which you want to create the documentation.
The specific versioned subdirectory and the root docs are updated on the versioned-gh-pages branch.
#### Contributors
Request a repo owner to run the preview deployment workflow on your PR. A


@@ -118,6 +118,7 @@ To install Toolbox as a binary:
> <summary>Linux (AMD64)</summary>
>
> To install Toolbox as a binary on Linux (AMD64):
>
> ```sh
> # see releases page for other versions
> export VERSION=0.16.0
@@ -130,6 +131,7 @@ To install Toolbox as a binary:
> <summary>macOS (Apple Silicon)</summary>
>
> To install Toolbox as a binary on macOS (Apple Silicon):
>
> ```sh
> # see releases page for other versions
> export VERSION=0.16.0
@@ -142,6 +144,7 @@ To install Toolbox as a binary:
> <summary>macOS (Intel)</summary>
>
> To install Toolbox as a binary on macOS (Intel):
>
> ```sh
> # see releases page for other versions
> export VERSION=0.16.0
@@ -154,6 +157,7 @@ To install Toolbox as a binary:
> <summary>Windows (AMD64)</summary>
>
> To install Toolbox as a binary on Windows (AMD64):
>
> ```powershell
> # see releases page for other versions
> $VERSION = "0.16.0"


@@ -14,9 +14,8 @@ This tool provisions a cluster with a **private IP address** within the specifie
**Permissions & APIs Required:**
Before using, ensure the following on your GCP project:
1. The [AlloyDB API](https://console.cloud.google.com/apis/library/alloydb.googleapis.com) is enabled.
2. The user or service account executing the tool has one of the following IAM roles:
- `roles/alloydb.admin` (the AlloyDB Admin predefined IAM role)
- `roles/owner` (the Owner basic IAM role)


@@ -15,14 +15,14 @@ or `ALLOYDB_IAM_USER`) within a specified cluster. It is compatible with
**Permissions & APIs Required:**
Before using, ensure the following on your GCP project:
1. The [AlloyDB
API](https://console.cloud.google.com/apis/library/alloydb.googleapis.com)
is enabled.
2. The user or service account executing the tool has one of the following IAM
roles:
- `roles/alloydb.admin` (the AlloyDB Admin predefined IAM role)
- `roles/owner` (the Owner basic IAM role)
- `roles/editor` (the Editor basic IAM role)
The tool takes the following input parameters:


@@ -16,13 +16,15 @@ It's compatible with the following sources:
- [bigquery](../../sources/bigquery.md)
`bigquery-list-table-ids` accepts the following parameters:
- **`dataset`** (required): Specifies the dataset from which to list table IDs.
- **`project`** (optional): Defines the Google Cloud project ID. If not provided,
the tool defaults to the project from the source configuration.
The tool's behavior regarding these parameters is influenced by the
`allowedDatasets` restriction on the `bigquery` source:
- **Without `allowedDatasets` restriction:** The tool can list tables from any
dataset specified by the `dataset` and `project` parameters.
- **With `allowedDatasets` restriction:** Before listing tables, the tool verifies
that the requested dataset is in the allowed list. If it is not, the request is


@@ -61,4 +61,4 @@ tools:
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "bigquery-search-catalog". |
| source | string | true | Name of the source the tool should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |


@@ -14,8 +14,8 @@ A `clickhouse-list-databases` tool lists all available databases in a ClickHouse
instance. It's compatible with the [clickhouse](../../sources/clickhouse.md)
source.
This tool executes the `SHOW DATABASES` command and returns a list of all
databases accessible to the configured user, making it useful for database
discovery and exploration tasks.
## Example
@@ -31,9 +31,11 @@ tools:
## Return Value
The tool returns an array of objects, where each object contains:
- `name`: The name of the database
Example response:
```json
[
{"name": "default"},


@@ -57,4 +57,4 @@ Example response:
| source | string | true | Name of the ClickHouse source to list tables from. |
| description | string | true | Description of the tool that is passed to the LLM. |
| authRequired | array of string | false | Authentication services required to use this tool. |
| parameters | array of Parameter | false | Parameters for the tool (see Parameters section above). |


@@ -79,4 +79,4 @@ tools:
| description | string | true | Description of the tool that is passed to the LLM. |
| statement | string | true | The SQL statement template to execute. |
| parameters | array of Parameter | false | Parameters for prepared statement values. |
| templateParameters | array of Parameter | false | Parameters for SQL statement template customization. |


@@ -16,7 +16,8 @@ It is a standalone tool and **is not** compatible with any sources.
At invocation time, the tool executes `dataform compile --json` in the specified project directory and returns the resulting JSON object from the CLI.
`dataform-compile-local` takes the following parameter:
- `project_dir` (string): The absolute or relative path to the local Dataform project directory. The server process must have read access to this path.
## Requirements
@@ -26,6 +27,7 @@ At invocation time, the tool executes `dataform compile --json` in the specified
This tool executes the `dataform` command-line interface (CLI) via a system call. You must have the **`dataform` CLI** installed and available in the server's system `PATH`.
You can typically install the CLI via `npm`:
```bash
npm install -g @dataform/cli
```
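The tool's behavior — shelling out to the CLI and returning its JSON — can be sketched as below. This is our own illustration, not the tool's actual source; `compileLocal` and `decodeCompileOutput` are hypothetical names, and the sketch assumes the `dataform` CLI is on `PATH`.

```go
package main

import (
	"encoding/json"
	"fmt"
	"os/exec"
)

// compileLocal runs `dataform compile --json` in projectDir and
// decodes the JSON document the CLI prints to stdout.
func compileLocal(projectDir string) (map[string]any, error) {
	cmd := exec.Command("dataform", "compile", "--json")
	cmd.Dir = projectDir
	out, err := cmd.Output()
	if err != nil {
		return nil, fmt.Errorf("dataform compile: %w", err)
	}
	return decodeCompileOutput(out)
}

// decodeCompileOutput parses the CLI's JSON output into a generic map.
func decodeCompileOutput(out []byte) (map[string]any, error) {
	var result map[string]any
	if err := json.Unmarshal(out, &result); err != nil {
		return nil, err
	}
	return result, nil
}

func main() {
	// Decoding a sample payload (stand-in for real CLI output):
	res, _ := decodeCompileOutput([]byte(`{"tables":[{"name":"orders"}]}`))
	fmt.Println(len(res)) // 1
}
```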
@@ -42,6 +44,7 @@ tools:
```
## Reference
| **field** | **type** | **required** | **description** |
| :---- | :---- | :---- | :---- |
| kind | string | true | Must be "dataform-compile-local". |


@@ -21,14 +21,15 @@ form: projects/{project}/locations/{location} and also a required `entry`
parameter which is the resource name of the entry in the following form:
projects/{project}/locations/{location}/entryGroups/{entryGroup}/entries/{entry}.
It also optionally accepts following parameters:
- `view` - View to control which parts of an entry the service should return.
It takes integer values from 1-4 corresponding to type of view - BASIC,
FULL, CUSTOM, ALL
- `aspectTypes` - Limits the aspects returned to the provided aspect types in
the format
`projects/{project}/locations/{location}/aspectTypes/{aspectType}`. It only
works for CUSTOM view.
- `paths` - Limits the aspects returned to those associated with the provided
paths within the Entry. It only works for CUSTOM view.
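The 1-4 integer-to-view mapping described above can be written out as a small lookup. This sketch is ours, for illustration only — the function name is not part of the tool.

```go
package main

import "fmt"

// viewName maps the integer `view` parameter (1-4) to the entry view
// it selects, per the parameter description above.
func viewName(v int) string {
	switch v {
	case 1:
		return "BASIC"
	case 2:
		return "FULL"
	case 3:
		return "CUSTOM"
	case 4:
		return "ALL"
	}
	return "UNKNOWN"
}

func main() {
	fmt.Println(viewName(3)) // CUSTOM
}
```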
## Requirements
@@ -36,13 +37,13 @@ It also optionally accepts following parameters:
### IAM Permissions
Dataplex uses [Identity and Access Management (IAM)][iam-overview] to control
user and group access to Dataplex resources. Toolbox will use your
[Application Default Credentials (ADC)][adc] to authorize and authenticate when
interacting with [Dataplex][dataplex-docs].
In addition to [setting the ADC for your server][set-adc], you need to ensure
the IAM identity has been given the correct IAM permissions for the tasks you
intend to perform. See [Dataplex Universal Catalog IAM permissions][iam-permissions]
and [Dataplex Universal Catalog IAM roles][iam-roles] for more information on
applying IAM permissions and roles to an identity.
@@ -68,4 +69,4 @@ tools:
|-------------|:--------:|:------------:|----------------------------------------------------|
| kind | string | true | Must be "dataplex-lookup-entry". |
| source | string | true | Name of the source the tool should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |


@@ -62,4 +62,4 @@ tools:
|-------------|:--------:|:------------:|----------------------------------------------------|
| kind | string | true | Must be "dataplex-search-aspect-types". |
| source | string | true | Name of the source the tool should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |


@@ -256,6 +256,7 @@ tools:
## Error Handling
Common errors include:
- Invalid collection path
- Missing or invalid document data
- Permission denied (if Firestore security rules block the operation)


@@ -8,7 +8,7 @@ aliases:
- /resources/tools/firestore-query-collection
---
## About
The `firestore-query-collection` tool allows you to query Firestore collections
with filters, ordering, and limit capabilities.


@@ -146,6 +146,7 @@ templates throughout the configuration.
## Filter Format
### Simple Filter
```json
{
"field": "age",
@@ -155,6 +156,7 @@ templates throughout the configuration.
```
### AND Filter
```json
{
"and": [
@@ -165,6 +167,7 @@ templates throughout the configuration.
```
### OR Filter
```json
{
"or": [
@@ -175,6 +178,7 @@ templates throughout the configuration.
```
### Nested Filters
```json
{
"or": [
@@ -208,6 +212,7 @@ The tool supports all Firestore native JSON value types:
### Complex Type Examples
**GeoPoint:**
```json
{
"field": "location",
@@ -222,6 +227,7 @@ The tool supports all Firestore native JSON value types:
```
**Array:**
```json
{
"field": "tags",
@@ -366,6 +372,7 @@ curl -X POST http://localhost:5000/api/tool/your-tool-name/invoke \
### Response Format
**Without analyzeQuery:**
```json
[
{
@@ -384,6 +391,7 @@ curl -X POST http://localhost:5000/api/tool/your-tool-name/invoke \
```
**With analyzeQuery:**
```json
{
"documents": [...],


@@ -41,6 +41,7 @@ The tool requires Firestore's native JSON format for document data. Each field
must be wrapped with its type indicator:
### Basic Types
- **String**: `{"stringValue": "your string"}`
- **Integer**: `{"integerValue": "123"}` or `{"integerValue": 123}`
- **Double**: `{"doubleValue": 123.45}`
@@ -50,6 +51,7 @@ must be wrapped with its type indicator:
- **Timestamp**: `{"timestampValue": "2025-01-07T10:00:00Z"}` (RFC3339 format)
### Complex Types
- **GeoPoint**: `{"geoPointValue": {"latitude": 34.052235, "longitude": -118.243683}}`
- **Array**: `{"arrayValue": {"values": [{"stringValue": "item1"}, {"integerValue": "2"}]}}`
- **Map**: `{"mapValue": {"fields": {"key1": {"stringValue": "value1"}, "key2": {"booleanValue": true}}}}`
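Building a document in this native-JSON format amounts to wrapping each Go value in its type indicator. A minimal sketch of that wrapping; the `typed` helper is our own illustration, not part of the tool.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// typed wraps a plain value in Firestore's native-JSON type indicator,
// following the basic-type table above.
func typed(v any) map[string]any {
	switch x := v.(type) {
	case string:
		return map[string]any{"stringValue": x}
	case int:
		return map[string]any{"integerValue": x}
	case float64:
		return map[string]any{"doubleValue": x}
	case bool:
		return map[string]any{"booleanValue": x}
	default:
		return map[string]any{"nullValue": nil}
	}
}

func main() {
	doc := map[string]any{
		"name": typed("Ada"),
		"age":  typed(36),
	}
	b, _ := json.Marshal(doc)
	// json.Marshal sorts map keys, so "age" prints before "name".
	fmt.Println(string(b))
}
```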
@@ -78,7 +80,6 @@ deleted. To delete a field, include it in the `updateMask` but omit it from
| source | string | true | Name of the Firestore source to update documents in. |
| description | string | true | Description of the tool that is passed to the LLM. |
## Examples
### Basic Document Update (Full Merge)
@@ -155,6 +156,7 @@ To delete fields, include them in the `updateMask` but omit them from `documentD
```
In this example:
- `name` will be updated to "John Smith"
- `temporaryField` and `obsoleteData` will be deleted from the document (they are in the mask but not in the data)
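The mask semantics in this example — fields in the mask but absent from the data are deleted — can be expressed as a small classification step. This is a sketch of the rule, not the tool's implementation; `splitMask` is a name we invented.

```go
package main

import "fmt"

// splitMask classifies each field named in the update mask: fields
// present in documentData are updated, fields absent from it are
// deleted, matching the behavior described above.
func splitMask(mask []string, data map[string]any) (updates, deletes []string) {
	for _, f := range mask {
		if _, ok := data[f]; ok {
			updates = append(updates, f)
		} else {
			deletes = append(deletes, f)
		}
	}
	return updates, deletes
}

func main() {
	data := map[string]any{
		"name": map[string]any{"stringValue": "John Smith"},
	}
	u, d := splitMask([]string{"name", "temporaryField", "obsoleteData"}, data)
	fmt.Println(u, d) // [name] [temporaryField obsoleteData]
}
```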
@@ -302,8 +304,11 @@ tools:
- google-oauth
- api-key
```
## Error Handling
Common errors include:
- Document not found (when using update with a non-existent document)
- Invalid document path
- Missing or invalid document data
@@ -311,6 +316,7 @@ Common errors include:
- Invalid data type conversions
## Best Practices
1. **Use update masks for precision**: When you only need to update specific fields, use the `updateMask` parameter to avoid unintended changes
2. **Always use typed values**: Every field must be wrapped with its appropriate type indicator (e.g., `{"stringValue": "text"}`)
3. **Integer values can be strings**: The tool accepts integer values as strings (e.g., `{"integerValue": "1500"}`)


@@ -3,7 +3,8 @@ title: "postgres-list-available-extensions"
type: docs
weight: 1
description: >
The "postgres-list-available-extensions" tool retrieves all PostgreSQL
extensions available for installation on a Postgres database.
aliases:
- /resources/tools/postgres-list-available-extensions
---


@@ -3,7 +3,8 @@ title: "postgres-list-installed-extensions"
type: docs
weight: 1
description: >
The "postgres-list-installed-extensions" tool retrieves all PostgreSQL
extensions installed on a Postgres database.
aliases:
- /resources/tools/postgres-list-installed-extensions
---


@@ -3,7 +3,8 @@ title: "postgres-list-tables"
type: docs
weight: 1
description: >
The "postgres-list-tables" tool lists schema information for all or specified
tables in a Postgres database.
aliases:
- /resources/tools/postgres-list-tables
---
@@ -20,12 +21,12 @@ following sources:
`postgres-list-tables` lists detailed schema information (object type, columns,
constraints, indexes, triggers, owner, comment) as JSON for user-created tables
(ordinary or partitioned). The tool takes the following input parameters:
* `table_names` (optional): Filters by a comma-separated list of names. By
default, it lists all tables in user schemas.
* `output_format` (optional): Indicates the output format of the table schema.
  `simple` returns only the table names; `detailed` returns the full
  table information. Default: `detailed`.
## Example
@@ -34,7 +35,8 @@ tools:
postgres_list_tables:
kind: postgres-list-tables
source: postgres-source
description: Use this tool to retrieve schema information for all or
specified tables. Output format can be simple (only table names) or detailed.
```
## Reference


@@ -34,9 +34,10 @@ tools:
## Reference
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "spanner-execute-sql". |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |
| readOnly | bool | false | When set to `true`, the `statement` is run as a read-only transaction. Default: `false`. |


@@ -6,4 +6,4 @@ description: >
An end to end tutorial for building an ADK agent using the AlloyDB AI NL tool.
---
{{< ipynb "alloydb_ai_nl.ipynb" >}}


@@ -361,4 +361,4 @@ Run the Toolbox server, pointing to the `tools.yaml` file created earlier:
- Learn more about [MCP Inspector](../../how-to/connect_via_mcp.md).
- Learn more about [Toolbox Resources](../../resources/).
- Learn more about [Toolbox How-to guides](../../how-to/).