This PR introduces a significant update to the Toolbox configuration file format, which is one of the primary **breaking changes** required for the implementation of the Advanced Control Plane.

# Summary of Changes

The configuration schema has been updated to enforce resource isolation and facilitate atomic, incremental updates.

* **Resource Isolation:** Resource definitions are now separated into individual blocks, using a distinct structure for each resource type (Source, Tool, Toolset, etc.). This improves readability, management, and auditing of configuration files.
* **Field Name Modification:** Internal field names have been modified to align with declarative methodologies. Specifically, the configuration now separates `kind` (the general resource type, e.g., Source) from `type` (the specific implementation, e.g., Postgres).

# User Impact

Existing `tools.yaml` configuration files are now in an outdated format. Users must eventually update their files to the new YAML format.

# Mitigation & Compatibility

Backward compatibility is maintained during this transition to ensure no immediate user action is required for existing files.

* **Immediate Backward Compatibility:** The source code includes a pre-processing layer that automatically detects outdated configuration files (v1 format) and converts them to the new v2 format under the hood.
* **[COMING SOON] Migration Support:** The new `toolbox migrate` subcommand will be introduced to allow users to automatically convert their old configuration files to the latest format.

# Example

Example of a config file in the v2 format (a rough sketch of the older v1-style layout follows at the end of this description):

```
kind: sources
name: my-pg-instance
type: cloud-sql-postgres
project: my-project
region: my-region
instance: my-instance
database: my_db
user: my_user
password: my_pass
---
kind: authServices
name: my-google-auth
type: google
clientId: testing-id
---
kind: tools
name: example_tool
type: postgres-sql
source: my-pg-instance
description: some description
statement: SELECT * FROM SQL_STATEMENT;
parameters:
  - name: country
    type: string
    description: some description
---
kind: tools
name: example_tool_2
type: postgres-sql
source: my-pg-instance
description: returning the number one
statement: SELECT 1;
---
kind: toolsets
name: example_toolset
tools:
  - example_tool
```

---------

Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
Co-authored-by: Averi Kitsch <akitsch@google.com>
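For comparison, below is a rough sketch of how the same resources might be laid out in the older v1-style `tools.yaml`. The structure is inferred from the change description above (resources nested under top-level maps, with `kind` naming the specific implementation), so the exact v1 schema may differ.

```yaml
# Hypothetical v1-style layout, inferred from the change description above;
# the actual pre-existing schema may differ in detail.
sources:
  my-pg-instance:
    kind: cloud-sql-postgres
    project: my-project
    region: my-region
    instance: my-instance
    database: my_db
    user: my_user
    password: my_pass
authServices:
  my-google-auth:
    kind: google
    clientId: testing-id
tools:
  example_tool:
    kind: postgres-sql
    source: my-pg-instance
    description: some description
    statement: SELECT * FROM SQL_STATEMENT;
    parameters:
      - name: country
        type: string
        description: some description
toolsets:
  example_toolset:
    - example_tool
```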
| title | type | weight | description | aliases |
|---|---|---|---|---|
| bigtable-sql | docs | 1 | A "bigtable-sql" tool executes a pre-defined SQL statement against a Google Cloud Bigtable instance. | |
## About

A bigtable-sql tool executes a pre-defined SQL statement against a Bigtable
instance. It's compatible with any of the following sources:

- bigtable
### GoogleSQL

Bigtable supports SQL queries. The Toolbox integration uses the GoogleSQL
dialect: the specified SQL statement is executed as a data manipulation
language (DML) statement, and the specified parameters are inserted according
to their name, e.g. `@name`.

Note: Bigtable's GoogleSQL support for DML statements might be limited to certain query types. For detailed information on supported DML statements and use cases, refer to the Bigtable GoogleSQL use cases.
## Example
Note: This tool uses parameterized queries to prevent SQL injections. Query parameters can be used as substitutes for arbitrary expressions. Parameters cannot be used as substitutes for identifiers, column names, table names, or other parts of the query.
```yaml
kind: tools
name: search_user_by_id_or_name
type: bigtable-sql
source: my-bigtable-instance
statement: |
  SELECT
    TO_INT64(cf[ 'id' ]) as id,
    CAST(cf[ 'name' ] AS string) as name,
  FROM
    mytable
  WHERE
    TO_INT64(cf[ 'id' ]) = @id
    OR CAST(cf[ 'name' ] AS string) = @name;
description: |
  Use this tool to get information for a specific user.
  Takes an id number or a name and returns info on the user.
  Example:
  {{
    "id": 123,
    "name": "Alice",
  }}
parameters:
  - name: id
    type: integer
    description: User ID
  - name: name
    type: string
    description: Name of the user
```
## Example with Template Parameters
Note: This tool allows direct modifications to the SQL statement, including identifiers, column names, and table names. This makes it more vulnerable to SQL injections. Using basic parameters only (see above) is recommended for performance and safety reasons. For more details, please check templateParameters.
```yaml
kind: tools
name: list_table
type: bigtable-sql
source: my-bigtable-instance
statement: |
  SELECT * FROM {{.tableName}};
description: |
  Use this tool to list all information from a specific table.
  Example:
  {{
    "tableName": "flights",
  }}
templateParameters:
  - name: tableName
    type: string
    description: Table to select from
```
## Reference
| field | type | required | description |
|---|---|---|---|
| type | string | true | Must be "bigtable-sql". |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |
| statement | string | true | SQL statement to execute. |
| parameters | parameters | false | List of parameters that will be inserted into the SQL statement. |
| templateParameters | templateParameters | false | List of templateParameters that will be inserted into the SQL statement before the prepared statement is executed. |
## Tips
- Bigtable Studio is a useful tool to explore and manage your Bigtable data. If you're unfamiliar with the query syntax, Query Builder lets you build a query, run it against a table, and then view the results in the console.
- Some Python libraries limit the use of columns with a leading underscore, such as `_key`. A workaround is to use Bigtable Logical Views to rename such columns (see the sketch after this list).
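As an illustration of that workaround, here is a minimal sketch of a tool that reads from a hypothetical logical view. The view name (`users_view`), its defining query, and the renamed columns are assumptions for illustration only; the view itself would be created separately (for example, in the Google Cloud console).

```yaml
# Minimal sketch, not from the official docs. Assumes a Bigtable logical view
# named "users_view" was created separately with a defining query roughly like:
#   SELECT CAST(_key AS string) AS row_key,
#          CAST(cf['name'] AS string) AS name
#   FROM mytable
# so that clients never need to reference the underscore column `_key` directly.
kind: tools
name: get_user_by_row_key
type: bigtable-sql
source: my-bigtable-instance
statement: |
  SELECT row_key, name FROM users_view WHERE row_key = @row_key;
description: |
  Use this tool to look up a user by row key via the renamed
  logical-view columns.
parameters:
  - name: row_key
    type: string
    description: Row key of the user to look up
```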