Compare commits: v0.2.1...llm-testin (312 commits)
```diff
@@ -17,15 +17,8 @@ steps:
     waitFor: ['-']
     script: |
       #!/usr/bin/env bash
-      docker buildx build --build-arg METADATA_TAGS=$(git rev-parse HEAD) -t ${_DOCKER_URI}:$REF_NAME .
 
   - id: "push-docker"
     name: "gcr.io/cloud-builders/docker"
     waitFor:
       - "build-docker"
     script: |
       #!/usr/bin/env bash
-      docker push ${_DOCKER_URI}:$REF_NAME
+      docker buildx create --name container-builder --driver docker-container --bootstrap --use
+      docker buildx build --platform linux/amd64,linux/arm64 --build-arg COMMIT_SHA=$(git rev-parse HEAD) -t ${_DOCKER_URI}:$REF_NAME --push .
 
   - id: "install-dependencies"
     name: golang:1
@@ -50,7 +43,7 @@ steps:
     script: |
       #!/usr/bin/env bash
       CGO_ENABLED=0 GOOS=linux GOARCH=amd64 \
-      go build -ldflags "-X github.com/googleapis/genai-toolbox/cmd.metadataString=binary.linux.amd64.$REF_NAME" -o toolbox.linux.amd64
+      go build -ldflags "-X github.com/googleapis/genai-toolbox/cmd.commitSha=$(git rev-parse HEAD)" -o toolbox.linux.amd64
 
   - id: "store-linux-amd64"
     name: "gcr.io/cloud-builders/gcloud:latest"
@@ -72,7 +65,7 @@ steps:
     script: |
       #!/usr/bin/env bash
       CGO_ENABLED=0 GOOS=darwin GOARCH=arm64 \
-      go build -ldflags "-X github.com/googleapis/genai-toolbox/cmd.metadataString=binary.darwin.arm64.$REF_NAME" -o toolbox.darwin.arm64
+      go build -ldflags "-X github.com/googleapis/genai-toolbox/cmd.commitSha=$(git rev-parse HEAD)" -o toolbox.darwin.arm64
 
   - id: "store-darwin-arm64"
     name: "gcr.io/cloud-builders/gcloud:latest"
@@ -94,7 +87,7 @@ steps:
     script: |
       #!/usr/bin/env bash
       CGO_ENABLED=0 GOOS=darwin GOARCH=amd64 \
-      go build -ldflags "-X github.com/googleapis/genai-toolbox/cmd.metadataString=binary.darwin.amd64.$REF_NAME" -o toolbox.darwin.amd64
+      go build -ldflags "-X github.com/googleapis/genai-toolbox/cmd.commitSha=$(git rev-parse HEAD)" -o toolbox.darwin.amd64
 
   - id: "store-darwin-amd64"
     name: "gcr.io/cloud-builders/gcloud:latest"
@@ -116,7 +109,7 @@ steps:
     script: |
       #!/usr/bin/env bash
       CGO_ENABLED=0 GOOS=windows GOARCH=amd64 \
-      go build -ldflags "-X github.com/googleapis/genai-toolbox/cmd.metadataString=binary.windows.amd64.$REF_NAME" -o toolbox.windows.amd64
+      go build -ldflags "-X github.com/googleapis/genai-toolbox/cmd.commitSha=$(git rev-parse HEAD)" -o toolbox.windows.amd64
 
   - id: "store-windows-amd64"
     name: "gcr.io/cloud-builders/gcloud:latest"
@@ -124,12 +117,13 @@ steps:
       - "build-windows-amd64"
     script: |
       #!/usr/bin/env bash
-      gcloud storage cp toolbox.windows.amd64 gs://$_BUCKET_NAME/$REF_NAME/windows/amd64/toolbox
+      gcloud storage cp toolbox.windows.amd64 gs://$_BUCKET_NAME/$REF_NAME/windows/amd64/toolbox.exe
 
 options:
   automapSubstitutions: true
   dynamicSubstitutions: true
   logging: CLOUD_LOGGING_ONLY # Necessary for custom service account
   machineType: 'E2_HIGHCPU_32'
 
 substitutions:
   _REGION: us-central1
```
```diff
@@ -11,20 +11,37 @@ fi
 FILES=("linux.amd64" "darwin.arm64" "darwin.amd64" "windows.amd64")
 output_string=""
 
+# Define the descriptions - ensure this array's order matches FILES
+DESCRIPTIONS=(
+  "For **Linux** systems running on **Intel/AMD 64-bit processors**."
+  "For **macOS** systems running on **Apple Silicon** (M1, M2, M3, etc.) processors."
+  "For **macOS** systems running on **Intel processors**."
+  "For **Windows** systems running on **Intel/AMD 64-bit processors**."
+)
+
 # Write the table header
-ROW_FMT="| %-93s | %73s |"
-output_string+=$(printf "$ROW_FMT" "**os/arch**" "**sha256**")$'\n'
-output_string+=$(printf "$ROW_FMT" $(printf -- '-%0.s' {1..93}) $(printf -- '-%0.s' {1..73}))$'\n'
+ROW_FMT="| %-105s | %-120s | %-67s |\n"
+output_string+=$(printf "$ROW_FMT" "**OS/Architecture**" "**Description**" "**SHA256 Hash**")$'\n'
+output_string+=$(printf "$ROW_FMT" "$(printf -- '-%0.s' {1..105})" "$(printf -- '-%0.s' {1..120})" "$(printf -- '-%0.s' {1..67})")$'\n'
 
 # Loop through all files matching the pattern "toolbox.*.*"
-for file in "${FILES[@]}"
+for i in "${!FILES[@]}"
 do
+  file_key="${FILES[$i]}" # e.g., "linux.amd64"
+  description_text="${DESCRIPTIONS[$i]}"
+
   # Extract OS and ARCH from the filename
-  OS=$(echo "$file" | cut -d '.' -f 1)
-  ARCH=$(echo "$file" | cut -d '.' -f 2)
+  OS=$(echo "$file_key" | cut -d '.' -f 1)
+  ARCH=$(echo "$file_key" | cut -d '.' -f 2)
 
   # Get release URL
-  URL=https://storage.googleapis.com/genai-toolbox/$VERSION/$OS/$ARCH/toolbox
+  if [ "$OS" = 'windows' ];
+  then
+    URL="https://storage.googleapis.com/genai-toolbox/$VERSION/$OS/$ARCH/toolbox.exe"
+  else
+    URL="https://storage.googleapis.com/genai-toolbox/$VERSION/$OS/$ARCH/toolbox"
+  fi
 
   curl "$URL" --fail --output toolbox || exit 1
 
@@ -32,10 +49,10 @@ do
   SHA256=$(shasum -a 256 toolbox | awk '{print $1}')
 
   # Write the table row
   # output_string+="| [$OS/$ARCH]($URL) | $SHA256 |\n"
-  output_string+=$(printf "$ROW_FMT" "[$OS/$ARCH]($URL)" "$SHA256")$'\n'
+  output_string+=$(printf "$ROW_FMT" "[$OS/$ARCH]($URL)" "$description_text" "$SHA256")$'\n'
 
   rm toolbox
 done
 
 printf "$output_string\n"
```
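The printf-based row formatting in the release-notes script is easiest to see with small widths. This is a standalone sketch, with made-up column widths and cell values so the output stays narrow:

```shell
#!/usr/bin/env bash
# Sketch of the printf-based Markdown table generation; widths and cell
# values here are made up for illustration.
ROW_FMT="| %-22s | %-26s | %-10s |\n"
output_string=""
# Header row, divider row, then one data row; $() strips the trailing
# newline from printf, so we re-append one with $'\n'.
output_string+=$(printf "$ROW_FMT" "**OS/Architecture**" "**Description**" "**SHA256 Hash**")$'\n'
output_string+=$(printf "$ROW_FMT" "$(printf -- '-%0.s' {1..22})" "$(printf -- '-%0.s' {1..26})" "$(printf -- '-%0.s' {1..10})")$'\n'
output_string+=$(printf "$ROW_FMT" "[linux/amd64](https://example.invalid/toolbox)" "For **Linux** systems." "deadbeef")$'\n'
printf '%s' "$output_string"
```

The `%-22s`-style specifiers left-pad each cell to a fixed width, which is what keeps the generated Markdown table readable as plain text.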
```diff
@@ -24,9 +24,22 @@ steps:
     script: |
       go get -d ./...
 
-  - id: "cloud-sql-pg"
+  - id: "compile-test-binary"
     name: golang:1
     waitFor: ["install-dependencies"]
     env:
       - "GOPATH=/gopath"
     volumes:
       - name: "go"
         path: "/gopath"
+    script: |
+      go test -c -race -cover \
+        -coverpkg=./internal/sources/...,./internal/tools/... ./tests/...
+      chmod +x .ci/test_with_coverage.sh
+
+  - id: "cloud-sql-pg"
+    name: golang:1
+    waitFor: ["compile-test-binary"]
     entrypoint: /bin/bash
     env:
       - "GOPATH=/gopath"
@@ -43,11 +56,16 @@ steps:
     args:
       - -c
      - |
-        go test -race -v -tags=integration,cloudsqlpg ./tests
+        .ci/test_with_coverage.sh \
+          "Cloud SQL Postgres" \
+          cloudsqlpg \
+          postgressql \
+          postgresexecutesql
 
   - id: "alloydb-pg"
     name: golang:1
-    waitFor: ["install-dependencies"]
+    waitFor: ["compile-test-binary"]
     entrypoint: /bin/bash
     env:
       - "GOPATH=/gopath"
@@ -64,11 +82,81 @@ steps:
     args:
       - -c
       - |
-        go test -race -v -tags=integration,alloydb ./tests
+        .ci/test_with_coverage.sh \
+          "AlloyDB Postgres" \
+          alloydbpg \
+          postgressql \
+          postgresexecutesql
+
+  - id: "alloydb-ai-nl"
+    name: golang:1
+    waitFor: ["compile-test-binary"]
+    entrypoint: /bin/bash
+    env:
+      - "GOPATH=/gopath"
+      - "ALLOYDB_AI_NL_PROJECT=$PROJECT_ID"
+      - "ALLOYDB_AI_NL_CLUSTER=$_ALLOYDB_AI_NL_CLUSTER"
+      - "ALLOYDB_AI_NL_INSTANCE=$_ALLOYDB_AI_NL_INSTANCE"
+      - "ALLOYDB_AI_NL_DATABASE=$_DATABASE_NAME"
+      - "ALLOYDB_AI_NL_REGION=$_REGION"
+      - "SERVICE_ACCOUNT_EMAIL=$SERVICE_ACCOUNT_EMAIL"
+    secretEnv: ["ALLOYDB_AI_NL_USER", "ALLOYDB_AI_NL_PASS", "CLIENT_ID"]
+    volumes:
+      - name: "go"
+        path: "/gopath"
+    args:
+      - -c
+      - |
+        .ci/test_with_coverage.sh \
+          "AlloyDB AI NL" \
+          alloydbainl \
+          alloydbainl
+
+  - id: "bigtable"
+    name: golang:1
+    waitFor: ["compile-test-binary"]
+    entrypoint: /bin/bash
+    env:
+      - "GOPATH=/gopath"
+      - "BIGTABLE_PROJECT=$PROJECT_ID"
+      - "BIGTABLE_INSTANCE=$_BIGTABLE_INSTANCE"
+      - "SERVICE_ACCOUNT_EMAIL=$SERVICE_ACCOUNT_EMAIL"
+    secretEnv:
+      ["CLIENT_ID"]
+    volumes:
+      - name: "go"
+        path: "/gopath"
+    args:
+      - -c
+      - |
+        .ci/test_with_coverage.sh \
+          "Bigtable" \
+          bigtable \
+          bigtable
+
+  - id: "bigquery"
+    name: golang:1
+    waitFor: ["compile-test-binary"]
+    entrypoint: /bin/bash
+    env:
+      - "GOPATH=/gopath"
+      - "BIGQUERY_PROJECT=$PROJECT_ID"
+      - "SERVICE_ACCOUNT_EMAIL=$SERVICE_ACCOUNT_EMAIL"
+    secretEnv: ["CLIENT_ID"]
+    volumes:
+      - name: "go"
+        path: "/gopath"
+    args:
+      - -c
+      - |
+        .ci/test_with_coverage.sh \
+          "BigQuery" \
+          bigquery \
+          bigquery
 
   - id: "postgres"
     name: golang:1
-    waitFor: ["install-dependencies"]
+    waitFor: ["compile-test-binary"]
     entrypoint: /bin/bash
     env:
       - "GOPATH=/gopath"
@@ -83,28 +171,37 @@ steps:
     args:
       - -c
       - |
-        go test -race -v -tags=integration,postgres ./tests
+        .ci/test_with_coverage.sh \
+          "Postgres" \
+          postgres \
+          postgressql \
+          postgresexecutesql
 
   - id: "spanner"
     name: golang:1
-    waitFor: ["install-dependencies"]
+    waitFor: ["compile-test-binary"]
     entrypoint: /bin/bash
     env:
       - "GOPATH=/gopath"
       - "SPANNER_PROJECT=$PROJECT_ID"
       - "SPANNER_DATABASE=$_DATABASE_NAME"
       - "SPANNER_INSTANCE=$_SPANNER_INSTANCE"
       - "SERVICE_ACCOUNT_EMAIL=$SERVICE_ACCOUNT_EMAIL"
     secretEnv: ["CLIENT_ID"]
     volumes:
       - name: "go"
         path: "/gopath"
     args:
       - -c
       - |
-        go test -race -v -tags=integration,spanner ./tests
+        .ci/test_with_coverage.sh \
+          "Spanner" \
+          spanner \
+          spanner
 
   - id: "neo4j"
     name: golang:1
-    waitFor: ["install-dependencies"]
+    waitFor: ["compile-test-binary"]
     entrypoint: /bin/bash
     env:
       - "GOPATH=/gopath"
@@ -117,11 +214,14 @@ steps:
     args:
       - -c
       - |
-        go test -race -v -tags=integration,neo4j ./tests
+        .ci/test_with_coverage.sh \
+          "Neo4j" \
+          neo4j \
+          neo4j
 
   - id: "cloud-sql-mssql"
     name: golang:1
-    waitFor: ["install-dependencies"]
+    waitFor: ["compile-test-binary"]
     entrypoint: /bin/bash
     env:
       - "GOPATH=/gopath"
@@ -138,11 +238,14 @@ steps:
     args:
       - -c
       - |
-        go test -race -v -tags=integration,cloudsqlmssql ./tests
+        .ci/test_with_coverage.sh \
+          "Cloud SQL MSSQL" \
+          cloudsqlmssql \
+          mssql
 
   - id: "cloud-sql-mysql"
     name: golang:1
-    waitFor: ["install-dependencies"]
+    waitFor: ["compile-test-binary"]
     entrypoint: /bin/bash
     env:
       - "GOPATH=/gopath"
@@ -159,11 +262,14 @@ steps:
     args:
       - -c
       - |
-        go test -race -v -tags=integration,cloudsqlmysql ./tests
+        .ci/test_with_coverage.sh \
+          "Cloud SQL MySQL" \
+          cloudsqlmysql \
+          mysql
 
   - id: "mysql"
     name: golang:1
-    waitFor: ["install-dependencies"]
+    waitFor: ["compile-test-binary"]
     entrypoint: /bin/bash
     env:
       - "GOPATH=/gopath"
@@ -178,11 +284,14 @@ steps:
     args:
       - -c
       - |
-        go test -race -v -tags=integration,mysql ./tests
+        .ci/test_with_coverage.sh \
+          "MySQL" \
+          mysql \
+          mysql
 
   - id: "mssql"
     name: golang:1
-    waitFor: ["install-dependencies"]
+    waitFor: ["compile-test-binary"]
     entrypoint: /bin/bash
     env:
       - "GOPATH=/gopath"
@@ -197,11 +306,14 @@ steps:
     args:
       - -c
       - |
-        go test -race -v -tags=integration,mssql ./tests
+        .ci/test_with_coverage.sh \
+          "MSSQL" \
+          mssql \
+          mssql
 
   - id: "dgraph"
     name: golang:1
-    waitFor: ["install-dependencies"]
+    waitFor: ["compile-test-binary"]
     entrypoint: /bin/bash
     env:
       - "GOPATH=/gopath"
@@ -212,7 +324,108 @@ steps:
     args:
       - -c
       - |
-        go test -race -v -tags=integration,dgraph ./tests
+        .ci/test_with_coverage.sh \
+          "Dgraph" \
+          dgraph \
+          dgraph
+
+  - id: "http"
+    name: golang:1
+    waitFor: ["compile-test-binary"]
+    entrypoint: /bin/bash
+    env:
+      - "GOPATH=/gopath"
+    secretEnv: ["CLIENT_ID"]
+    volumes:
+      - name: "go"
+        path: "/gopath"
+    args:
+      - -c
+      - |
+        .ci/test_with_coverage.sh \
+          "HTTP" \
+          http \
+          http
+
+  - id: "sqlite"
+    name: golang:1
+    waitFor: ["compile-test-binary"]
+    entrypoint: /bin/bash
+    env:
+      - "GOPATH=/gopath"
+      - "SERVICE_ACCOUNT_EMAIL=$SERVICE_ACCOUNT_EMAIL"
+    volumes:
+      - name: "go"
+        path: "/gopath"
+    secretEnv: ["CLIENT_ID"]
+    args:
+      - -c
+      - |
+        .ci/test_with_coverage.sh \
+          "SQLite" \
+          sqlite \
+          sqlite
+
+  - id: "couchbase"
+    name : golang:1
+    waitFor: ["compile-test-binary"]
+    entrypoint: /bin/bash
+    env:
+      - "GOPATH=/gopath"
+      - "COUCHBASE_SCOPE=$_COUCHBASE_SCOPE"
+      - "COUCHBASE_BUCKET=$_COUCHBASE_BUCKET"
+      - "SERVICE_ACCOUNT_EMAIL=$SERVICE_ACCOUNT_EMAIL"
+    secretEnv: ["COUCHBASE_CONNECTION", "COUCHBASE_USER", "COUCHBASE_PASS", "CLIENT_ID"]
+    volumes:
+      - name: "go"
+        path: "/gopath"
+    args:
+      - -c
+      - |
+        .ci/test_with_coverage.sh \
+          "Couchbase" \
+          couchbase \
+          couchbase
+
+  - id: "redis"
+    name : golang:1
+    waitFor: ["compile-test-binary"]
+    entrypoint: /bin/bash
+    env:
+      - "GOPATH=/gopath"
+      - "SERVICE_ACCOUNT_EMAIL=$SERVICE_ACCOUNT_EMAIL"
+    secretEnv: ["REDIS_ADDRESS", "REDIS_PASS", "CLIENT_ID"]
+    volumes:
+      - name: "go"
+        path: "/gopath"
+    args:
+      - -c
+      - |
+        .ci/test_with_coverage.sh \
+          "Redis" \
+          redis \
+          redis
+
+  - id: "valkey"
+    name : golang:1
+    waitFor: ["compile-test-binary"]
+    entrypoint: /bin/bash
+    env:
+      - "GOPATH=/gopath"
+      - "VALKEY_DATABASE=$_VALKEY_DATABASE"
+      - "SERVICE_ACCOUNT_EMAIL=$SERVICE_ACCOUNT_EMAIL"
+    secretEnv: ["VALKEY_ADDRESS", "CLIENT_ID"]
+    volumes:
+      - name: "go"
+        path: "/gopath"
+    args:
+      - -c
+      - |
+        .ci/test_with_coverage.sh \
+          "Valkey" \
+          valkey \
+          valkey
 
 availableSecrets:
   secretManager:
@@ -224,6 +437,10 @@ availableSecrets:
       env: ALLOYDB_POSTGRES_USER
     - versionName: projects/$PROJECT_ID/secrets/alloydb_pg_pass/versions/latest
       env: ALLOYDB_POSTGRES_PASS
+    - versionName: projects/$PROJECT_ID/secrets/alloydb_ai_nl_user/versions/latest
+      env: ALLOYDB_AI_NL_USER
+    - versionName: projects/$PROJECT_ID/secrets/alloydb_ai_nl_pass/versions/latest
+      env: ALLOYDB_AI_NL_PASS
     - versionName: projects/$PROJECT_ID/secrets/postgres_user/versions/latest
       env: POSTGRES_USER
     - versionName: projects/$PROJECT_ID/secrets/postgres_pass/versions/latest
@@ -247,9 +464,21 @@ availableSecrets:
     - versionName: projects/$PROJECT_ID/secrets/mysql_pass/versions/latest
       env: MYSQL_PASS
     - versionName: projects/$PROJECT_ID/secrets/mssql_user/versions/latest
       env: MSSQL_USER
     - versionName: projects/$PROJECT_ID/secrets/mssql_pass/versions/latest
       env: MSSQL_PASS
+    - versionName: projects/$PROJECT_ID/secrets/couchbase_connection/versions/latest
+      env: COUCHBASE_CONNECTION
+    - versionName: projects/$PROJECT_ID/secrets/couchbase_user/versions/latest
+      env: COUCHBASE_USER
+    - versionName: projects/$PROJECT_ID/secrets/couchbase_pass/versions/latest
+      env: COUCHBASE_PASS
+    - versionName: projects/$PROJECT_ID/secrets/memorystore_redis_address/versions/latest
+      env: REDIS_ADDRESS
+    - versionName: projects/$PROJECT_ID/secrets/memorystore_redis_pass/versions/latest
+      env: REDIS_PASS
+    - versionName: projects/$PROJECT_ID/secrets/memorystore_valkey_address/versions/latest
+      env: VALKEY_ADDRESS
 
 options:
@@ -266,6 +495,9 @@ substitutions:
   _CLOUD_SQL_POSTGRES_INSTANCE: "cloud-sql-pg-testing"
   _ALLOYDB_POSTGRES_CLUSTER: "alloydb-pg-testing"
   _ALLOYDB_POSTGRES_INSTANCE: "alloydb-pg-testing-instance"
+  _ALLOYDB_AI_NL_CLUSTER: "alloydb-ai-nl-testing"
+  _ALLOYDB_AI_NL_INSTANCE: "alloydb-ai-nl-testing-instance"
+  _BIGTABLE_INSTANCE: "bigtable-testing-instance"
   _POSTGRES_HOST: 127.0.0.1
   _POSTGRES_PORT: "5432"
   _SPANNER_INSTANCE: "spanner-testing"
@@ -277,3 +509,5 @@ substitutions:
   _MSSQL_HOST: 127.0.0.1
   _MSSQL_PORT: "1433"
   _DGRAPHURL: "https://play.dgraph.io"
+  _COUCHBASE_BUCKET: "couchbase-bucket"
+  _COUCHBASE_SCOPE: "couchbase-scope"
```
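Every integration step above invokes the same helper with the same positional convention: display name first, then the source package, then any number of tool packages. A minimal sketch of that argument handling, using a hypothetical `describe_invocation` function in place of the real script:

```shell
# Sketch of the positional-argument convention used when invoking
# .ci/test_with_coverage.sh; describe_invocation is a made-up stand-in.
describe_invocation() {
  local display_name="$1"      # e.g. "Cloud SQL Postgres"
  local source_package="$2"    # e.g. cloudsqlpg
  shift 2                      # remaining args are tool package names
  local tools=("$@")
  echo "suite=${display_name} source=${source_package} tools=${tools[*]}"
}

describe_invocation "Cloud SQL Postgres" cloudsqlpg postgressql postgresexecutesql
```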
`.ci/test_with_coverage.sh` (new executable file, 60 lines):

```bash
#!/bin/bash

# Arguments:
#   $1: Display name for logs (e.g., "Cloud SQL Postgres")
#   $2: Source package name (e.g., cloudsqlpg)
#   $3, $4, ...: Tool package names for grep (e.g., postgressql)

DISPLAY_NAME="$1"
SOURCE_PACKAGE_NAME="$2"

# Construct the test binary name
TEST_BINARY="${SOURCE_PACKAGE_NAME}.test"

# Construct the full source path
SOURCE_PATH="sources/${SOURCE_PACKAGE_NAME}/"

# Shift arguments so that $3 and onwards become the list of tool package names
shift 2
TOOL_PACKAGE_NAMES=("$@")

COVERAGE_FILE="${TEST_BINARY%.test}_coverage.out"
FILTERED_COVERAGE_FILE="${TEST_BINARY%.test}_filtered_coverage.out"

export path="github.com/googleapis/genai-toolbox/internal/"

GREP_PATTERN="^mode:|${path}${SOURCE_PATH}"
# Add each tool package path to the grep pattern
for tool_name in "${TOOL_PACKAGE_NAMES[@]}"; do
  if [ -n "$tool_name" ]; then
    full_tool_path="tools/${tool_name}/"
    GREP_PATTERN="${GREP_PATTERN}|${path}${full_tool_path}"
  fi
done

# Run integration test
if ! ./"${TEST_BINARY}" -test.v -test.coverprofile="${COVERAGE_FILE}"; then
  echo "Error: Tests for ${DISPLAY_NAME} failed. Exiting."
  exit 1
fi

# Filter source/tool packages
if ! grep -E "${GREP_PATTERN}" "${COVERAGE_FILE}" > "${FILTERED_COVERAGE_FILE}"; then
  echo "Warning: Could not filter coverage for ${DISPLAY_NAME}. Filtered file might be empty or invalid."
fi

# Calculate coverage
echo "Calculating coverage for ${DISPLAY_NAME}..."
total_coverage=$(go tool cover -func="${FILTERED_COVERAGE_FILE}" 2>/dev/null | grep "total:" | awk '{print $3}')

echo "${DISPLAY_NAME} total coverage: $total_coverage"
coverage_numeric=$(echo "$total_coverage" | sed 's/%//')

# Check coverage threshold
if awk -v coverage="$coverage_numeric" 'BEGIN {exit !(coverage < 50)}'; then
  echo "Coverage failure: ${DISPLAY_NAME} total coverage($total_coverage) is below 50%."
  exit 1
else
  echo "Coverage for ${DISPLAY_NAME} is sufficient."
fi
```
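The two load-bearing pieces of that script, the grep-based package filter and the awk float comparison, can be exercised in isolation. The coverage-profile lines below are fabricated for illustration:

```shell
#!/usr/bin/env bash
# Exercising the script's grep filter and awk threshold check in isolation.
path="github.com/googleapis/genai-toolbox/internal/"
GREP_PATTERN="^mode:|${path}sources/cloudsqlpg/"
for tool_name in postgressql postgresexecutesql; do
  GREP_PATTERN="${GREP_PATTERN}|${path}tools/${tool_name}/"
done

# A fabricated coverage profile: three in-scope lines plus one out-of-scope.
coverage_file=$(mktemp)
cat > "$coverage_file" <<EOF
mode: set
${path}sources/cloudsqlpg/cloud_sql_pg.go:10.2,12.3 1 1
${path}tools/postgressql/postgressql.go:5.1,7.2 1 0
${path}server/server.go:20.1,22.2 1 1
EOF

# The server/server.go line is dropped; mode: and the named packages survive.
filtered=$(grep -E "$GREP_PATTERN" "$coverage_file")
printf '%s\n' "$filtered"

# Same shape as the script's threshold check: awk exits 0 when below 50%.
is_below_threshold() {
  local pct="${1%\%}"   # "49.5%" -> "49.5"
  awk -v coverage="$pct" 'BEGIN {exit !(coverage < 50)}'
}
is_below_threshold "72.3%" && echo "72.3% fails" || echo "72.3% passes"
is_below_threshold "49.5%" && echo "49.5% fails" || echo "49.5% passes"
```

Delegating the comparison to awk sidesteps the fact that bash's `[ -lt ]` only handles integers, while coverage percentages like `49.5` are floats.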
```diff
@@ -18,19 +18,13 @@ steps:
     script: |
       #!/usr/bin/env bash
       export VERSION=$(cat ./cmd/version.txt)
-      docker buildx build --build-arg METADATA_TAGS=$(git rev-parse HEAD) -t ${_DOCKER_URI}:$VERSION -t ${_DOCKER_URI}:latest .
+      docker buildx create --name container-builder --driver docker-container --bootstrap --use
 
   - id: "push-docker"
     name: "gcr.io/cloud-builders/docker"
     waitFor:
       - "build-docker"
     script: |
       #!/usr/bin/env bash
       export VERSION=$(cat ./cmd/version.txt)
-      docker push ${_DOCKER_URI}:$VERSION
+      export TAGS="-t ${_DOCKER_URI}:$VERSION"
       if [[ $_PUSH_LATEST == 'true' ]]; then
-        docker push ${_DOCKER_URI}:latest
+        export TAGS="$TAGS -t ${_DOCKER_URI}:latest"
       fi
+      docker buildx build --platform linux/amd64,linux/arm64 --build-arg BUILD_TYPE=container.release --build-arg COMMIT_SHA=$(git rev-parse HEAD) $TAGS --push .
 
   - id: "install-dependencies"
     name: golang:1
@@ -56,7 +50,7 @@ steps:
       #!/usr/bin/env bash
       export VERSION=$(cat ./cmd/version.txt)
       CGO_ENABLED=0 GOOS=linux GOARCH=amd64 \
-      go build -ldflags "-X github.com/googleapis/genai-toolbox/cmd.metadataString=binary.linux.amd64.$VERSION.$(git rev-parse HEAD)" -o toolbox.linux.amd64
+      go build -ldflags "-X github.com/googleapis/genai-toolbox/cmd.buildType=binary -X github.com/googleapis/genai-toolbox/cmd.commitSha=$(git rev-parse HEAD)" -o toolbox.linux.amd64
 
   - id: "store-linux-amd64"
     name: "gcr.io/cloud-builders/gcloud:latest"
@@ -80,7 +74,7 @@ steps:
       #!/usr/bin/env bash
       export VERSION=$(cat ./cmd/version.txt)
       CGO_ENABLED=0 GOOS=darwin GOARCH=arm64 \
-      go build -ldflags "-X github.com/googleapis/genai-toolbox/cmd.metadataString=binary.darwin.arm64.$VERSION.$(git rev-parse HEAD)" -o toolbox.darwin.arm64
+      go build -ldflags "-X github.com/googleapis/genai-toolbox/cmd.buildType=binary -X github.com/googleapis/genai-toolbox/cmd.commitSha=$(git rev-parse HEAD)" -o toolbox.darwin.arm64
 
   - id: "store-darwin-arm64"
     name: "gcr.io/cloud-builders/gcloud:latest"
@@ -104,7 +98,7 @@ steps:
       #!/usr/bin/env bash
       export VERSION=$(cat ./cmd/version.txt)
       CGO_ENABLED=0 GOOS=darwin GOARCH=amd64 \
-      go build -ldflags "-X github.com/googleapis/genai-toolbox/cmd.metadataString=binary.darwin.amd64.$VERSION.$(git rev-parse HEAD)" -o toolbox.darwin.amd64
+      go build -ldflags "-X github.com/googleapis/genai-toolbox/cmd.buildType=binary -X github.com/googleapis/genai-toolbox/cmd.commitSha=$(git rev-parse HEAD)" -o toolbox.darwin.amd64
 
   - id: "store-darwin-amd64"
     name: "gcr.io/cloud-builders/gcloud:latest"
@@ -128,7 +122,7 @@ steps:
       #!/usr/bin/env bash
       export VERSION=$(cat ./cmd/version.txt)
       CGO_ENABLED=0 GOOS=windows GOARCH=amd64 \
-      go build -ldflags "-X github.com/googleapis/genai-toolbox/cmd.metadataString=binary.windows.amd64.$VERSION.$(git rev-parse HEAD)" -o toolbox.windows.amd64
+      go build -ldflags "-X github.com/googleapis/genai-toolbox/cmd.buildType=binary -X github.com/googleapis/genai-toolbox/cmd.commitSha=$(git rev-parse HEAD)" -o toolbox.windows.amd64
 
   - id: "store-windows-amd64"
     name: "gcr.io/cloud-builders/gcloud:latest"
@@ -137,12 +131,13 @@ steps:
     script: |
       #!/usr/bin/env bash
       export VERSION=v$(cat ./cmd/version.txt)
-      gcloud storage cp toolbox.windows.amd64 gs://$_BUCKET_NAME/$VERSION/windows/amd64/toolbox
+      gcloud storage cp toolbox.windows.amd64 gs://$_BUCKET_NAME/$VERSION/windows/amd64/toolbox.exe
 
 options:
   automapSubstitutions: true
   dynamicSubstitutions: true
   logging: CLOUD_LOGGING_ONLY # Necessary for custom service account
   machineType: 'E2_HIGHCPU_32'
 
 substitutions:
   _REGION: us-central1
```
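The tag handling in the new release push step reduces to a small shell pattern. `_DOCKER_URI` and `_PUSH_LATEST` are Cloud Build substitutions; they are faked below as plain variables with made-up values:

```shell
#!/usr/bin/env bash
# Sketch of the release step's conditional tag accumulation. _DOCKER_URI and
# _PUSH_LATEST are Cloud Build substitutions; here they are plain variables.
_DOCKER_URI="example.dev/toolbox/toolbox"
_PUSH_LATEST="true"
VERSION="0.3.0"

TAGS="-t ${_DOCKER_URI}:$VERSION"
if [[ $_PUSH_LATEST == 'true' ]]; then
  TAGS="$TAGS -t ${_DOCKER_URI}:latest"
fi
# $TAGS is then expanded unquoted into the single multi-platform build:
#   docker buildx build ... $TAGS --push .
echo "$TAGS"
```

Accumulating `-t` flags into one `buildx build --push` means both tags point at the same multi-arch manifest, instead of the two separate `docker push` calls the old config used.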
`.github/ISSUE_TEMPLATE/bug_report.yml` (new vendored file, 119 lines; content truncated at the end of this excerpt):

````yaml
# Copyright 2025 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

name: 🐞 Bug Report
description: File a report for unexpected or undesired behavior.
title: "<brief summary of what bug or error was observed>"
labels: ["type: bug"]
type: "bug"

body:
  - type: markdown
    attributes:
      value: |
        Thanks for helping us improve! 🙏 Please answer these questions and provide as much information as possible about your problem.

  - id: preamble
    type: checkboxes
    attributes:
      label: Prerequisites
      description: |
        Please run through the following list and make sure you've tried the usual "quick fixes":
        - Search the [current open issues](https://github.com/googleapis/genai-toolbox/issues)
        - Update to the [latest version of Toolbox](https://github.com/googleapis/genai-toolbox/releases)
      options:
        - label: "I've searched the current open issues"
          required: true
        - label: "I've updated to the latest version of Toolbox"

  - type: input
    id: version
    attributes:
      label: Toolbox version
      description: |
        What version of Toolbox are you using (`toolbox --version`)? e.g.
        - toolbox version 0.3.0
        - us-central1-docker.pkg.dev/database-toolbox/toolbox/toolbox:0.3.0
      placeholder: ex. toolbox version 0.3.0
    validations:
      required: true

  - type: textarea
    id: environment
    attributes:
      label: Environment
      description: "Let us know some details about the environment in which you are seeing the bug!"
      value: |
        1. OS type and version: (output of `uname -a`)
        2. How are you running Toolbox:
           - As a downloaded binary (e.g. from `curl -O https://storage.googleapis.com/genai-toolbox/v$VERSION/linux/amd64/toolbox`)
           - As a container (e.g. from `us-central1-docker.pkg.dev/database-toolbox/toolbox/toolbox:$VERSION`)
           - Compiled from source (include the command used to build)

  - type: textarea
    id: client
    attributes:
      label: Client
      description: "How are you connecting to Toolbox?"
      value: |
        1. Client: <name and link to the client are you using>
        2. Version: <what exact version of the client are you using>
        3. Example: If possible, please include your code of configuration:

        ```python
        # Code goes here!
        ```

  - id: expected-behavior
    type: textarea
    attributes:
      label: Expected Behavior
      description: |
        Please enter a detailed description of the behavior you expected, and any information about what behavior you
        noticed and why it is defective or unintentional.
    validations:
      required: true

  - id: current-behavior
    type: textarea
    attributes:
      label: Current Behavior
      description: "Please enter a detailed description of the behavior you encountered instead."
    validations:
      required: true

  - type: textarea
    id: repro
    attributes:
      label: Steps to reproduce?
      description: |
        How can we reproduce this bug? Please walk us through it step by step,
        with as much relevant detail as possible. A 'minimal' reproduction is
        preferred, which means removing as much of the examples as possible so
````
|
||||
only the minimum required to run and reproduce the bug is left.
|
||||
value: |
|
||||
1. ?
|
||||
2. ?
|
||||
3. ?
|
||||
...
|
||||
validations:
|
||||
required: true
|
||||
|
||||
- type: textarea
|
||||
id: additional-details
|
||||
attributes:
|
||||
label: Additional Details
|
||||
description: |
|
||||
Any other information you want us to know? Things such as tools config,
|
||||
server logs, etc. can be included here.
|
||||
.github/ISSUE_TEMPLATE/config.yml (vendored, new file, 5 lines)
@@ -0,0 +1,5 @@
blank_issues_enabled: false
contact_links:
  - name: Google Cloud Support
    url: https://cloud.google.com/support/
    about: If you have a support contract with Google, please open an issue here and also open a case in the Google Cloud Support portal with a link to the issue.
.github/ISSUE_TEMPLATE/feature_request.yml (vendored, new file, 60 lines)
@@ -0,0 +1,60 @@
# Copyright 2025 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

name: ✨ Feature Request
description: Suggest an idea for new or improved behavior.
title: "<brief summary of the proposed feature>"
labels: ["type: feature request"]
type: feature
body:
  - type: markdown
    attributes:
      value: |
        Thanks for helping us improve! 🙏 Please answer these questions and provide as much information as possible about your feature request.

  - id: preamble
    type: checkboxes
    attributes:
      label: Prerequisites
      description: |
        Please run through the following list and make sure you've tried the usual "quick fixes":
      options:
        - label: "Search the [current open issues](https://github.com/googleapis/genai-toolbox/issues)"
          required: true

  - type: textarea
    id: use-case
    attributes:
      label: What are you trying to do that currently feels hard or impossible?
      description: "A clear and concise description of what the end goal for the feature should be -- avoid generalizing and try to provide a specific use-case."
    validations:
      required: true

  - type: textarea
    id: suggested-solution
    attributes:
      label: Suggested Solution(s)
      description: "If you have a suggestion for how this use-case can be solved, please feel free to include it."

  - type: textarea
    id: alternatives-considered
    attributes:
      label: Alternatives Considered
      description: "Are there any workarounds or third-party tools that replicate this behavior? Why would adding this feature be preferred over them?"

  - type: textarea
    id: additional-details
    attributes:
      label: Additional Details
      description: "Any additional information we should know? Please reference it here (issues, PRs, descriptions, or screenshots)"
.github/ISSUE_TEMPLATE/question.yml (vendored, new file, 55 lines)
@@ -0,0 +1,55 @@
# Copyright 2025 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

name: 💬 Question
description: Questions on how something works or the best way to do something?
title: "<brief summary of the question>"
labels: ["type: question"]

body:
  - type: markdown
    attributes:
      value: |
        Thanks for helping us improve! 🙏 Please provide as much information as possible about your question.

  - id: preamble
    type: checkboxes
    attributes:
      label: Prerequisites
      description: |
        Please run through the following list and make sure you've tried the usual "quick fixes":
      options:
        - label: "Search the [current open issues](https://github.com/googleapis/genai-toolbox/issues)"
          required: true

  - type: textarea
    id: question
    attributes:
      label: Question
      description: "What's your question? Please provide as much relevant information as possible to reduce turnaround time. Include information like what environment, language, or framework you are using."
      placeholder: "Example: How do I connect using private IP with the AlloyDB source?"
    validations:
      required: true

  - type: textarea
    id: code
    attributes:
      label: Code
      description: "Please paste any useful application code that might be relevant to your question. (if your code is in a public repo, feel free to paste a link!)"

  - type: textarea
    id: additional-details
    attributes:
      label: Additional Details
      description: "Any other information you want us to know that might be helpful in answering your question? (link issues, PRs, descriptions, or screenshots)."
.github/blunderbuss.yml (vendored, new file, 15 lines)
@@ -0,0 +1,15 @@
assign_issues:
  - kurtisvg
  - Yuan325
  - duwenxin99
assign_issues_by:
  - labels:
      - 'product: bigquery'
    to:
      - Genesis929
      - shobsi
      - jiaxunwu
assign_prs:
  - kurtisvg
  - Yuan325
  - duwenxin99
.github/label-sync.yml (vendored, new file, 2 lines)
@@ -0,0 +1,2 @@
---
ignored: true
.github/labels.yaml (vendored, 22 changed lines)
@@ -68,3 +68,25 @@
- name: 'tests: run'
  color: 3DED97
  description: Label to trigger Github Action tests.

- name: 'docs: deploy-preview'
  color: BFDADC
  description: Label to trigger Github Action docs preview.

- name: 'status: contribution welcome'
  color: 8befd7
  description: Status - Contributions welcome.

- name: 'status: awaiting response'
  color: 8befd7
  description: Status - Awaiting response from author.

- name: 'status: awaiting codeowners'
  color: 8befd7
  description: Status - Awaiting response from code owners.

# Product Labels
- name: 'product: bigquery'
  color: 5065c7
  description: Product - Assigned to the BigQuery team.
.github/release-please.yml (vendored, 13 changed lines)
@@ -20,5 +20,16 @@ extraFiles: [
  "README.md",
  "docs/en/getting-started/introduction/_index.md",
  "docs/en/getting-started/local_quickstart.md",
  "docs/en/how-to/deploy_gke.md",
  "docs/en/getting-started/mcp_quickstart/_index.md",
  "docs/en/samples/bigquery/local_quickstart.md",
  "docs/en/samples/bigquery/mcp_quickstart/_index.md",
  "docs/en/getting-started/colab_quickstart.ipynb",
  "docs/en/samples/bigquery/colab_quickstart_bigquery.ipynb",
  "docs/en/how-to/connect-ide/bigquery_mcp.md",
  "docs/en/how-to/connect-ide/spanner_mcp.md",
  "docs/en/how-to/connect-ide/alloydb_pg_mcp.md",
  "docs/en/how-to/connect-ide/cloud_sql_mysql_mcp.md",
  "docs/en/how-to/connect-ide/cloud_sql_pg_mcp.md",
  "docs/en/how-to/connect-ide/postgres_mcp.md",
  "docs/en/how-to/connect-ide/cloud_sql_mssql_mcp.md",
]
.github/renovate.json5 (vendored, new file, 28 lines)
@@ -0,0 +1,28 @@
{
  extends: [
    'config:recommended',
    ':semanticCommitTypeAll(chore)',
    ':ignoreUnstable',
    ':separateMajorReleases',
    ':prConcurrentLimitNone',
    ':prHourlyLimitNone',
    ':preserveSemverRanges',
  ],
  minimumReleaseAge: '3',
  rebaseWhen: 'conflicted',
  dependencyDashboardLabels: [
    'type: process',
  ],
  postUpdateOptions: [
    'gomodTidy',
  ],
  packageRules: [
    {
      groupName: 'GitHub Actions',
      matchManagers: [
        'github-actions',
      ],
      pinDigests: true,
    },
  ],
}
.github/sync-repo-settings.yaml (vendored, 2 changed lines)
@@ -31,6 +31,8 @@ branchProtectionRules:
      - "header-check"
      # - Add required status checks like presubmit tests
      - "unit tests (ubuntu-latest)"
      - "unit tests (windows-latest)"
      - "unit tests (macos-latest)"
      - "integration-test-pr (toolbox-testing-438616)"
    requiredApprovingReviewCount: 1
    requiresCodeOwnerReviews: true
.github/workflows/cloud_build_failure_reporter.yml (vendored, new file, 179 lines)
@@ -0,0 +1,179 @@
# Copyright 2025 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

name: Cloud Build Failure Reporter

on:
  workflow_call:
    inputs:
      trigger_names:
        required: true
        type: string
  workflow_dispatch:
    inputs:
      trigger_names:
        description: 'Cloud Build trigger names separated by comma.'
        required: true
        default: ''

jobs:
  report:

    permissions:
      issues: 'write'
      checks: 'read'

    runs-on: 'ubuntu-latest'

    steps:
      - uses: 'actions/github-script@60a0d83039c74a4aee543508d2ffcb1c3799cdea' # v7
        with:
          script: |-
            // parse test names
            const testNameSubstring = '${{ inputs.trigger_names }}';
            const testNameFound = new Map(); // keeps track of whether each test is found
            testNameSubstring.split(',').forEach(testName => {
              testNameFound.set(testName, false);
            });

            // label for all issues opened by reporter
            const periodicLabel = 'periodic-failure';

            // check if any reporter opened any issues previously
            const prevIssues = await github.paginate(github.rest.issues.listForRepo, {
              ...context.repo,
              state: 'open',
              creator: 'github-actions[bot]',
              labels: [periodicLabel]
            });

            // createOrCommentIssue creates a new issue or comments on an existing issue.
            const createOrCommentIssue = async function (title, txt) {
              if (prevIssues.length < 1) {
                console.log('no previous issues found, creating one');
                await github.rest.issues.create({
                  ...context.repo,
                  title: title,
                  body: txt,
                  labels: [periodicLabel]
                });
                return;
              }
              // only comment on issue related to the current test
              for (const prevIssue of prevIssues) {
                if (prevIssue.title.includes(title)) {
                  console.log(
                    `found previous issue ${prevIssue.html_url}, adding comment`
                  );

                  await github.rest.issues.createComment({
                    ...context.repo,
                    issue_number: prevIssue.number,
                    body: txt
                  });
                  return;
                }
              }
            };

            // updateIssues comments on any existing issues. No-op if no issue exists.
            const updateIssues = async function (checkName, txt) {
              if (prevIssues.length < 1) {
                console.log('no previous issues found.');
                return;
              }
              // only comment on issue related to the current test
              for (const prevIssue of prevIssues) {
                if (prevIssue.title.includes(checkName)) {
                  console.log(`found previous issue ${prevIssue.html_url}, adding comment`);
                  await github.rest.issues.createComment({
                    ...context.repo,
                    issue_number: prevIssue.number,
                    body: txt
                  });
                }
              }
            };

            // Find status of check runs.
            // We will find check runs for each commit and then filter for the periodic.
            // Checks API only allows for ref and if we use main there could be edge cases where
            // the check run happened on a SHA that is different from head.
            const commits = await github.paginate(github.rest.repos.listCommits, {
              ...context.repo
            });

            const relevantChecks = new Map();
            for (const commit of commits) {
              console.log(
                `checking runs at ${commit.html_url}: ${commit.commit.message}`
              );
              const checks = await github.rest.checks.listForRef({
                ...context.repo,
                ref: commit.sha
              });

              // Iterate through each check and find matching names
              for (const check of checks.data.check_runs) {
                console.log(`Handling test name ${check.name}`);
                for (const testName of testNameFound.keys()) {
                  if (testNameFound.get(testName) === true) {
                    // skip if a check is already found for this name
                    continue;
                  }
                  if (check.name.includes(testName)) {
                    relevantChecks.set(check, commit);
                    testNameFound.set(testName, true);
                  }
                }
              }
              // Break out of the loop early if all tests are found
              const allTestsFound = Array.from(testNameFound.values()).every(value => value === true);
              if (allTestsFound) {
                break;
              }
            }

            // Handle each relevant check
            relevantChecks.forEach((commit, check) => {
              if (
                check.status === 'completed' &&
                check.conclusion === 'success'
              ) {
                updateIssues(
                  check.name,
                  `[Tests are passing](${check.html_url}) for commit [${commit.sha}](${commit.html_url}).`
                );
              } else if (check.status === 'in_progress') {
                console.log(
                  `Check is pending ${check.html_url} for ${commit.html_url}. Retry again later.`
                );
              } else {
                createOrCommentIssue(
                  `Cloud Build Failure Reporter: ${check.name} failed`,
                  `Cloud Build Failure Reporter found test failure for [**${check.name}**](${check.html_url}) at [${commit.sha}](${commit.html_url}). Please fix the error and then close the issue after the **${check.name}** test passes.`
                );
              }
            });

            // no periodic checks found across all commits, report it
            const noTestFound = Array.from(testNameFound.values()).every(value => value === false);
            if (noTestFound) {
              createOrCommentIssue(
                'Missing periodic tests: ${{ inputs.trigger_names }}',
                `No periodic test is found for triggers: ${{ inputs.trigger_names }}. Last checked from ${
                  commits[0].html_url
                } to ${commits[commits.length - 1].html_url}.`
              );
            }
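One subtlety in the reporter script above: `relevantChecks` maps each check run (key) to its commit (value), and `Map.prototype.forEach` invokes its callback as `(value, key)`, which is why the handler parameters read `(commit, check)` in that order. A minimal sketch, using made-up stand-in objects for the check run and commit:

```javascript
// Map.prototype.forEach invokes the callback as (value, key, map).
// Since the workflow stores relevantChecks.set(check, commit), the
// callback receives the commit (value) first and the check (key) second.
const relevantChecks = new Map();
const check = { name: 'toolbox-test-nightly', status: 'completed' };
const commit = { sha: 'abc123' };
relevantChecks.set(check, commit);

relevantChecks.forEach((commit, check) => {
  console.log(`${check.name} ran at ${commit.sha}`);
  // → toolbox-test-nightly ran at abc123
});
```

Writing the parameters the other way around would silently swap the two objects rather than raise an error, so the naming convention is doing real work here.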
.github/workflows/docs_deploy.yaml (vendored, 18 changed lines)
@@ -24,14 +24,14 @@ on:
    paths:
      - 'docs/**'
      - 'github/workflows/docs**'
      - '.hugo'
      - '.hugo/**'

  # Allow triggering manually.
  workflow_dispatch:

jobs:
  deploy:
    runs-on: ubuntu-22.04
    runs-on: ubuntu-24.04
    defaults:
      run:
        working-directory: .hugo
@@ -39,23 +39,23 @@ jobs:
      group: ${{ github.workflow }}-${{ github.ref }}
      cancel-in-progress: true
    steps:
      - uses: actions/checkout@v4
      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
        with:
          fetch-depth: 0 # Fetch all history for .GitInfo and .Lastmod

      - name: Setup Hugo
        uses: peaceiris/actions-hugo@v3
        uses: peaceiris/actions-hugo@75d2e84710de30f6ff7268e08f310b60ef14033f # v3
        with:
          hugo-version: "latest"
          hugo-version: "0.145.0"
          extended: true

      - name: Setup Node
        uses: actions/setup-node@v4
        uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4
        with:
          node-version: "22"

      - name: Cache dependencies
        uses: actions/cache@v4
        uses: actions/cache@5a3ec84eff668545956fd18022155c47e93e2684 # v4
        with:
          path: ~/.npm
          key: ${{ runner.os }}-node-${{ hashFiles('**/package-lock.json') }}
@@ -69,11 +69,11 @@ jobs:
          HUGO_RELATIVEURLS: false

      - name: Deploy
        uses: peaceiris/actions-gh-pages@v4
        uses: peaceiris/actions-gh-pages@4f9cc6602d3f66b9c108549d475ec49e8ef4d45e # v4
        with:
          github_token: ${{ secrets.GITHUB_TOKEN }}
          publish_dir: .hugo/public
          # Do not delete previews on each production deploy.
          # CSS or JS changes will require manual clean-up.
          keep_files: true
          commit_message: "deploy: ${{ github.event.head_commit.message }}"
          commit_message: "deploy: ${{ github.event.head_commit.message }}"
.github/workflows/docs_preview_clean.yaml (vendored, 4 changed lines)
@@ -34,7 +34,7 @@ jobs:
      group: "preview-${{ github.event.number }}"
      cancel-in-progress: true
    steps:
      - uses: actions/checkout@v4
      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
        with:
          ref: gh-pages

@@ -48,7 +48,7 @@ jobs:
          git push

      - name: Comment
        uses: actions/github-script@v7
        uses: actions/github-script@60a0d83039c74a4aee543508d2ffcb1c3799cdea # v7
        with:
          script: |
            github.rest.issues.createComment({
.github/workflows/docs_preview_deploy.yaml (vendored, 35 changed lines)
@@ -17,7 +17,7 @@ name: "docs"
permissions:
  contents: write
  pull-requests: write

# This Workflow depends on 'github.event.number',
# not compatible with branch or manual triggers.
on:
@@ -26,10 +26,20 @@ on:
    paths:
      - 'docs/**'
      - 'github/workflows/docs**'
      - '.hugo'
      - '.hugo/**'
  pull_request_target:
    types: [labeled]
    paths:
      - 'docs/**'
      - 'github/workflows/docs**'
      - '.hugo/**'

jobs:
  preview:
    # run job on proper workflow event triggers (skip job for pull_request event
    # from forks and only run pull_request_target for "docs: deploy-preview"
    # label)
    if: "${{ (github.event.action != 'labeled' && github.event.pull_request.head.repo.full_name == github.event.pull_request.base.repo.full_name) || github.event.label.name == 'docs: deploy-preview' }}"
    runs-on: ubuntu-24.04
    defaults:
      run:
@@ -39,23 +49,24 @@ jobs:
      group: "preview-${{ github.event.number }}"
      cancel-in-progress: true
    steps:
      - uses: actions/checkout@v4
      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
        with:
          ref: ${{ github.event.pull_request.head.sha }}
          fetch-depth: 0 # Fetch all history for .GitInfo and .Lastmod

      - name: Setup Hugo
        uses: peaceiris/actions-hugo@v3
        uses: peaceiris/actions-hugo@75d2e84710de30f6ff7268e08f310b60ef14033f # v3
        with:
          hugo-version: "latest"
          hugo-version: "0.145.0"
          extended: true

      - name: Setup Node
        uses: actions/setup-node@v4
        uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4
        with:
          node-version: "22"

      - name: Cache dependencies
        uses: actions/cache@v4
        uses: actions/cache@5a3ec84eff668545956fd18022155c47e93e2684 # v4
        with:
          path: ~/.npm
          key: ${{ runner.os }}-node-${{ hashFiles('**/package-lock.json') }}
@@ -70,9 +81,7 @@ jobs:
          HUGO_RELATIVEURLS: false

      - name: Deploy
        # If run from a fork, GitHub write operations will fail.
        if: ${{ !github.event.pull_request.head.repo.fork }}
        uses: peaceiris/actions-gh-pages@v4
        uses: peaceiris/actions-gh-pages@4f9cc6602d3f66b9c108549d475ec49e8ef4d45e # v4
        with:
          github_token: ${{ secrets.GITHUB_TOKEN }}
          publish_dir: .hugo/public
@@ -80,9 +89,7 @@ jobs:
          commit_message: "stage: PR-${{ github.event.number }}: ${{ github.event.head_commit.message }}"

      - name: Comment
        # If run from a fork, GitHub write operations will fail.
        if: ${{ !github.event.pull_request.head.repo.fork }}
        uses: actions/github-script@v7
        uses: actions/github-script@60a0d83039c74a4aee543508d2ffcb1c3799cdea # v7
        with:
          script: |
            github.rest.issues.createComment({
@@ -90,4 +97,4 @@ jobs:
              owner: context.repo.owner,
              repo: context.repo.repo,
              body: "🔎 Preview at https://${{ github.repository_owner }}.github.io/${{ github.event.repository.name }}/previews/PR-${{ github.event.number }}/"
            })
            })
.github/workflows/lint.yaml (vendored, 6 changed lines)
@@ -51,11 +51,11 @@ jobs:
            console.log('Failed to remove label. Another job may have already removed it!');
          }
      - name: Setup Go
        uses: actions/setup-go@cdcb36043654635271a94b9a6d1392de5bb323a7 # v5.0.1
        uses: actions/setup-go@d35c59abb061a4a6fb18e82ac0862c26744d6ab5 # v5.5.0
        with:
          go-version: "1.22"
      - name: Checkout code
        uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29 # v4.1.6
        uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
        with:
          ref: ${{ github.event.pull_request.head.sha }}
          repository: ${{ github.event.pull_request.head.repo.full_name }}
@@ -66,7 +66,7 @@ jobs:
        run: |
          go mod tidy && git diff --exit-code
      - name: golangci-lint
        uses: golangci/golangci-lint-action@a4f60bb28d35aeee14e6880718e0c85ff1882e64 # v6.0.1
        uses: golangci/golangci-lint-action@4afd733a84b1f43292c63897423277bb7f4313a9 # v8.0.0
        with:
          version: latest
          args: --timeout 3m
.github/workflows/schedule_reporter.yml (vendored, new file, 29 lines)
@@ -0,0 +1,29 @@
# Copyright 2025 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

name: Schedule Reporter

on:
  schedule:
    - cron: '0 6 * * *' # Runs at 6 AM every morning

jobs:
  run_reporter:
    permissions:
      issues: 'write'
      checks: 'read'
      contents: 'read'
    uses: ./.github/workflows/cloud_build_failure_reporter.yml
    with:
      trigger_names: "toolbox-test-nightly,toolbox-test-on-merge"
.github/workflows/sync-labels.yaml (vendored, 2 changed lines)
@@ -29,7 +29,7 @@ jobs:
      issues: 'write'
      pull-requests: 'write'
    steps:
      - uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29 # v4.1.6
      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
      - uses: micnncim/action-label-syncer@3abd5ab72fda571e69fffd97bd4e0033dd5f495c # v1.3.0
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
.github/workflows/tests.yaml (vendored, 42 changed lines)
@@ -26,14 +26,13 @@ permissions: read-all

jobs:
  integration:
    # run job on proper workflow event triggers (skip job for pull_request event from forks and only run pull_request_target for "tests: run" label)
    # run job on proper workflow event triggers (skip job for pull_request event from forks and only run pull_request_target for "tests: run" label)
    if: "${{ (github.event.action != 'labeled' && github.event.pull_request.head.repo.full_name == github.event.pull_request.base.repo.full_name) || github.event.label.name == 'tests: run' }}"
    name: unit tests
    runs-on: ${{ matrix.os }}
    strategy:
      matrix:
        # os: [macos-latest, windows-latest, ubuntu-latest]
        os: [ubuntu-latest]
        os: [macos-latest, windows-latest, ubuntu-latest]
      fail-fast: false
    permissions:
      contents: 'read'
@@ -58,12 +57,12 @@ jobs:
          }

      - name: Setup Go
        uses: actions/setup-go@cdcb36043654635271a94b9a6d1392de5bb323a7 # v5.0.1
        uses: actions/setup-go@d35c59abb061a4a6fb18e82ac0862c26744d6ab5 # v5.5.0
        with:
          go-version: "1.22"

      - name: Checkout code
        uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29 # v4.1.6
        uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
        with:
          ref: ${{ github.event.pull_request.head.sha }}
          repository: ${{ github.event.pull_request.head.repo.full_name }}
@@ -75,5 +74,32 @@ jobs:
      - name: Build
        run: go build -v ./...

      - name: Run tests
        run: go test -race -tags="!integration" -v ./...
      - name: Run tests with coverage
        if: ${{ runner.os == 'Linux' }}
        run: |
          source_dir="./internal/sources/*"
          tool_dir="./internal/tools/*"
          auth_dir="./internal/auth/*"
          int_test_dir="./tests/*"
          included_packages=$(go list ./... | grep -v -e "$source_dir" -e "$tool_dir" -e "$auth_dir" -e "$int_test_dir")
          go test -race -cover -coverprofile=coverage.out -v $included_packages
          go test -race -v ./internal/sources/... ./internal/tools/... ./internal/auth/...

      - name: Run tests without coverage
        if: ${{ runner.os != 'Linux' }}
        run: |
          go test -race -v ./internal/... ./cmd/...

      - name: Check coverage
        if: ${{ runner.os == 'Linux' }}
        run: |
          FILE_TO_EXCLUDE="github.com/googleapis/genai-toolbox/internal/server/config.go"
          ESCAPED_PATH=$(echo "$FILE_TO_EXCLUDE" | sed 's/\//\\\//g; s/\./\\\./g')
          sed -i "/^${ESCAPED_PATH}:/d" coverage.out
          total_coverage=$(go tool cover -func=coverage.out | grep "total:" | awk '{print $3}')
          echo "Total coverage: $total_coverage"
          coverage_numeric=$(echo "$total_coverage" | sed 's/%//')
          if (( $(echo "$coverage_numeric < 40" | bc -l) )); then
            echo "Coverage failure: total coverage($total_coverage) is below 40%."
            exit 1
          fi
.gitignore (vendored, 2 changed lines)
@@ -13,4 +13,4 @@ node_modules
.hugo_build.lock

# coverage
.coverage
.coverage
@@ -12,34 +12,26 @@
# See the License for the specific language governing permissions and
# limitations under the License.

version: "2"
linters:
  enable:
    - errcheck
    - goimports
    - gosimple
    - govet
    - ineffassign
    - staticcheck
    - unused
linters-settings:
  gofmt:
    rewrite-rules:
      - pattern: 'interface{}'
        replacement: 'any'
      - pattern: 'a[b:len(a)]'
        replacement: 'a[b:]'
  exclusions:
    presets:
      - std-error-handling
issues:
  fix: true
run:
  build-tags:
    - integration
    - cloudsqlpg
    - postgres
    - alloydb
    - spanner
    - cloudsqlmssql
    - cloudsqlmysql
    - neo4j
    - dgraph
    - mssql
    - mysql
formatters:
  enable:
    - goimports
  settings:
    gofmt:
      rewrite-rules:
        - pattern: interface{}
          replacement: any
        - pattern: a[b:len(a)]
          replacement: a[b:]
(image file changed: before 10 KiB, after 34 KiB)
.hugo/assets/scss/_styles_project.scss (new file, 1 line)
@@ -0,0 +1 @@
@import 'td/code-dark';
@@ -1,2 +1,2 @@
$primary: #D84040;
$secondary: #8E1616;
$primary: #80a7e9;
$secondary: #4484f4;
@@ -1,8 +1,7 @@
title = 'Gen AI Toolbox'
title = 'MCP Toolbox for Databases'
relativeURLs = true

languageCode = 'en-us'
contentDir = "../docs/en"
defaultContentLanguage = "en"
defaultContentLanguageInSubdir = false

@@ -18,7 +17,10 @@ enableRobotsTXT = true
proxy = "direct"
[module.hugoVersion]
  extended = true
  min = "0.73.0"
  min = "0.146.0"
[[module.mounts]]
  source = "../docs/en"
  target = 'content'
[[module.imports]]
  path = "github.com/google/docsy"
  disable = false
@@ -26,6 +28,7 @@ enableRobotsTXT = true
  path = "github.com/martignoni/hugo-notice"

[params]
description = "MCP Toolbox for Databases is an open source MCP server for databases. It enables you to develop tools easier, faster, and more securely by handling the complexities such as connection pooling, authentication, and more."
copyright = "Google LLC"
github_repo = "https://github.com/googleapis/genai-toolbox"
github_project_repo = "https://github.com/googleapis/genai-toolbox"
@@ -45,4 +48,23 @@ enableRobotsTXT = true
pre = "<i class='fa-brands fa-github'></i>"

[markup.goldmark.renderer]
  unsafe = true
  unsafe = true

[markup.highlight]
  noClasses = false
  style = "tango"

[outputFormats]
  [outputFormats.LLMS]
    mediaType = "text/plain"
    baseName = "llms"
    isPlainText = true
    root = true
  [outputFormats.LLMS-FULL]
    mediaType = "text/plain"
    baseName = "llms-full"
    isPlainText = true
    root = true

[outputs]
home = ["HTML", "RSS", "LLMS", "LLMS-FULL"]
@@ -1 +0,0 @@
{{ template "_default/_markup/td-render-heading.html" . }}
14
.hugo/layouts/index.llms-full.txt
Normal file
@@ -0,0 +1,14 @@
{{ .Site.Params.description }}

{{ range .Site.Sections }}
# {{ .Title }}
{{ .Description }}
{{ range .Pages }}
# {{ .Title }}
{{ .Description }}
{{ .RawContent }}
{{ range .Pages }}
# {{ .Title }}
{{ .Description }}
{{ .RawContent }}
{{end }}{{ end }}{{ end }}
9
.hugo/layouts/index.llms.txt
Normal file
@@ -0,0 +1,9 @@
# {{ .Site.Title }}

> {{ .Site.Params.description }}

## Docs
{{ range .Site.Sections }}
### {{ .Title }}

{{ .Description }}{{ range .Pages }}- [{{ .Title }}]({{ .Permalink }}): {{ .Description }}{{ range .Pages }} - [{{ .Title }}]({{ .Permalink }}): {{ .Description }}{{end }}{{ end }}{{ end }}
52
.hugo/layouts/partials/page-meta-links.html
Normal file
@@ -0,0 +1,52 @@
{{/* cSpell:ignore querify subdir */ -}}
{{/* Class names ending with `--KIND` are deprecated in favor of `__KIND`, but we're keeping them for a few releases after 0.9.0 */ -}}

{{ if .File -}}
  {{ $path := urls.JoinPath $.Language.Lang $.File.Path -}}
  {{ $gh_repo := $.Param "github_repo" -}}
  {{ $gh_url := $.Param "github_url" -}}
  {{ $gh_subdir := $.Param "github_subdir" | default "" -}}
  {{ $gh_project_repo := $.Param "github_project_repo" -}}
  {{ $gh_branch := $.Param "github_branch" | default "main" -}}
  <div class="td-page-meta ms-2 pb-1 pt-2 mb-0">
    {{ if $gh_url -}}
      {{ warnf "Warning: use of `github_url` is deprecated. For details, see https://www.docsy.dev/docs/adding-content/repository-links/#github_url-optional" -}}
      <a href="{{ $gh_url }}" target="_blank"><i class="fa-solid fa-pen-to-square fa-fw"></i> {{ T "post_edit_this" }}</a>
    {{ else if $gh_repo -}}

      {{/* Adjust $path based on path_base_for_github_subdir */ -}}
      {{ $ghs_base := $.Param "path_base_for_github_subdir" -}}
      {{ $ghs_rename := "" -}}
      {{ if reflect.IsMap $ghs_base -}}
        {{ $ghs_rename = $ghs_base.to -}}
        {{ $ghs_base = $ghs_base.from -}}
      {{ end -}}
      {{ with $ghs_base -}}
        {{ $path = replaceRE . $ghs_rename $path -}}
      {{ end -}}

      {{ $gh_repo_path := printf "%s/%s/%s" $gh_branch $gh_subdir $path -}}
      {{ $gh_repo_path = replaceRE "//+" "/" $gh_repo_path -}}

      {{ $viewURL := printf "%s/tree/%s" $gh_repo $gh_repo_path -}}
      {{ $editURL := printf "%s/edit/%s" $gh_repo $gh_repo_path -}}
      {{ $issuesURL := printf "%s/issues/new?title=%s" $gh_repo (safeURL $.Title ) -}}
      {{ $newPageStub := resources.Get "stubs/new-page-template.md" -}}
      {{ $newPageQS := querify "value" $newPageStub.Content "filename" "change-me.md" | safeURL -}}
      {{ $newPageURL := printf "%s/new/%s?%s" $gh_repo (path.Dir $gh_repo_path) $newPageQS -}}

      <a href="{{ $viewURL }}" class="td-page-meta--view td-page-meta__view" target="_blank" rel="noopener"><i class="fa-solid fa-file-lines fa-fw"></i> {{ T "post_view_this" }}</a>
      <a href="{{ $editURL }}" class="td-page-meta--edit td-page-meta__edit" target="_blank" rel="noopener"><i class="fa-solid fa-pen-to-square fa-fw"></i> {{ T "post_edit_this" }}</a>
      <a href="{{ $newPageURL }}" class="td-page-meta--child td-page-meta__child" target="_blank" rel="noopener"><i class="fa-solid fa-pen-to-square fa-fw"></i> {{ T "post_create_child_page" }}</a>
      <a href="{{ $issuesURL }}" class="td-page-meta--issue td-page-meta__issue" target="_blank" rel="noopener"><i class="fa-solid fa-list-check fa-fw"></i> {{ T "post_create_issue" }}</a>
      {{ with $gh_project_repo -}}
        {{ $project_issueURL := printf "%s/issues/new" . -}}
        <a href="{{ $project_issueURL }}" class="td-page-meta--project td-page-meta__project-issue" target="_blank" rel="noopener"><i class="fa-solid fa-list-check fa-fw"></i> {{ T "post_create_project_issue" }}</a>
      {{ end -}}

    {{ end -}}
    {{ with .CurrentSection.AlternativeOutputFormats.Get "print" -}}
      <a id="print" href="{{ .RelPermalink | safeURL }}"><i class="fa-solid fa-print fa-fw"></i> {{ T "print_entire_section" }}</a>
    {{ end }}
  </div>
{{ end -}}
1
.hugo/layouts/partials/td/render-heading.html
Normal file
@@ -0,0 +1 @@
{{ template "partials/td/render-heading.html" . }}
@@ -1,6 +0,0 @@
This favicon was generated using the following graphics from Twitter Twemoji:

- Graphics Title: 1f9f0.svg
- Graphics Author: Copyright 2020 Twitter, Inc and other contributors (https://github.com/twitter/twemoji)
- Graphics Source: https://github.com/twitter/twemoji/blob/master/assets/svg/1f9f0.svg
- Graphics License: CC-BY 4.0 (https://creativecommons.org/licenses/by/4.0/)
Before Width: | Height: | Size: 4.4 KiB After Width: | Height: | Size: 7.4 KiB |
Before Width: | Height: | Size: 15 KiB After Width: | Height: | Size: 27 KiB |
Before Width: | Height: | Size: 4.2 KiB After Width: | Height: | Size: 6.7 KiB |
Before Width: | Height: | Size: 460 B After Width: | Height: | Size: 576 B |
Before Width: | Height: | Size: 793 B After Width: | Height: | Size: 1.0 KiB |
Before Width: | Height: | Size: 15 KiB After Width: | Height: | Size: 15 KiB |
117
CHANGELOG.md
@@ -1,5 +1,122 @@
# Changelog

## [0.8.0](https://github.com/googleapis/genai-toolbox/compare/v0.7.0...v0.8.0) (2025-07-02)


### ⚠ BREAKING CHANGES

* **postgres,mssql,cloudsqlmssql:** encode source connection url for sources ([#727](https://github.com/googleapis/genai-toolbox/issues/727))

### Features

* Add support for multiple YAML configuration files ([#760](https://github.com/googleapis/genai-toolbox/issues/760)) ([40679d7](https://github.com/googleapis/genai-toolbox/commit/40679d700eded50d19569923e2a71c51e907a8bf))
* Add support for optional parameters ([#617](https://github.com/googleapis/genai-toolbox/issues/617)) ([4827771](https://github.com/googleapis/genai-toolbox/commit/4827771b78dee9a1284a898b749509b472061527)), closes [#475](https://github.com/googleapis/genai-toolbox/issues/475)
* **mcp:** Support MCP version 2025-03-26 ([#755](https://github.com/googleapis/genai-toolbox/issues/755)) ([474df57](https://github.com/googleapis/genai-toolbox/commit/474df57d62de683079f8d12c31db53396a545fd1))
* **sources/http:** Support disable SSL verification for HTTP Source ([#674](https://github.com/googleapis/genai-toolbox/issues/674)) ([4055b0c](https://github.com/googleapis/genai-toolbox/commit/4055b0c3569c527560d7ad34262963b3dd4e282d))
* **tools/bigquery:** Add templateParameters field for bigquery ([#699](https://github.com/googleapis/genai-toolbox/issues/699)) ([f5f771b](https://github.com/googleapis/genai-toolbox/commit/f5f771b0f3d159630ff602ff55c6c66b61981446))
* **tools/bigtable:** Add templateParameters field for bigtable ([#692](https://github.com/googleapis/genai-toolbox/issues/692)) ([1c06771](https://github.com/googleapis/genai-toolbox/commit/1c067715fac06479eb0060d7067b73dba099ed92))
* **tools/couchbase:** Add templateParameters field for couchbase ([#723](https://github.com/googleapis/genai-toolbox/issues/723)) ([9197186](https://github.com/googleapis/genai-toolbox/commit/9197186b8bea1ac4ec1b39c9c5c110807c8b2ba9))
* **tools/http:** Add support for HTTP Tool pathParams ([#726](https://github.com/googleapis/genai-toolbox/issues/726)) ([fd300dc](https://github.com/googleapis/genai-toolbox/commit/fd300dc606d88bf9f7bba689e2cee4e3565537dd))
* **tools/redis:** Add Redis Source and Tool ([#519](https://github.com/googleapis/genai-toolbox/issues/519)) ([f0aef29](https://github.com/googleapis/genai-toolbox/commit/f0aef29b0c2563e2a00277fbe2784f39f16d2835))
* **tools/spanner:** Add templateParameters field for spanner ([#691](https://github.com/googleapis/genai-toolbox/issues/691)) ([075dfa4](https://github.com/googleapis/genai-toolbox/commit/075dfa47e1fd92be4847bd0aec63296146b66455))
* **tools/sqlitesql:** Add templateParameters field for sqlitesql ([#687](https://github.com/googleapis/genai-toolbox/issues/687)) ([75e254c](https://github.com/googleapis/genai-toolbox/commit/75e254c0a4ce690ca5fa4d1741550ce54734b226))
* **tools/valkey:** Add Valkey Source and Tool ([#532](https://github.com/googleapis/genai-toolbox/issues/532)) ([054ec19](https://github.com/googleapis/genai-toolbox/commit/054ec198b97ba9f36f67dd12b2eff0cc6bc4d080))


### Bug Fixes

* **bigquery,mssql:** Fix panic on tools with array param ([#722](https://github.com/googleapis/genai-toolbox/issues/722)) ([7a6644c](https://github.com/googleapis/genai-toolbox/commit/7a6644cf0c5413e5c803955c88a2cfd0a2233ed3))
* **postgres,mssql,cloudsqlmssql:** Encode source connection url for sources ([#727](https://github.com/googleapis/genai-toolbox/issues/727)) ([67964d9](https://github.com/googleapis/genai-toolbox/commit/67964d939f27320b63b5759f4b3f3fdaa0c76fbf)), closes [#717](https://github.com/googleapis/genai-toolbox/issues/717)
* Set default value to field's type during unmarshalling ([#774](https://github.com/googleapis/genai-toolbox/issues/774)) ([fafed24](https://github.com/googleapis/genai-toolbox/commit/fafed2485839cf1acc1350e8a24103d2e6356ee0)), closes [#771](https://github.com/googleapis/genai-toolbox/issues/771)
* **server/mcp:** Do not listen from port for stdio ([#719](https://github.com/googleapis/genai-toolbox/issues/719)) ([d51dbc7](https://github.com/googleapis/genai-toolbox/commit/d51dbc759ba493021d3ec6f5417fc04c21f7044f)), closes [#711](https://github.com/googleapis/genai-toolbox/issues/711)
* **tools/mysqlexecutesql:** Handle nil panic and connection leak in Invoke ([#757](https://github.com/googleapis/genai-toolbox/issues/757)) ([7badba4](https://github.com/googleapis/genai-toolbox/commit/7badba42eefb34252be77b852a57d6bd78dd267d))
* **tools/mysqlsql:** Handle nil panic and connection leak in invoke ([#758](https://github.com/googleapis/genai-toolbox/issues/758)) ([cbb4a33](https://github.com/googleapis/genai-toolbox/commit/cbb4a333517313744800d148840312e56340f3fd))

## [0.7.0](https://github.com/googleapis/genai-toolbox/compare/v0.6.0...v0.7.0) (2025-06-10)


### Features

* Add templateParameters field for mssqlsql ([#671](https://github.com/googleapis/genai-toolbox/issues/671)) ([b81fc6a](https://github.com/googleapis/genai-toolbox/commit/b81fc6aa6ccdfbc15676fee4d87041d9ad9682fa))
* Add templateParameters field for mysqlsql ([#663](https://github.com/googleapis/genai-toolbox/issues/663)) ([0a08d2c](https://github.com/googleapis/genai-toolbox/commit/0a08d2c15dcbec18bb556f4dc49792ba0c69db46))
* **metrics:** Add user agent for prebuilt tools ([#669](https://github.com/googleapis/genai-toolbox/issues/669)) ([29aa0a7](https://github.com/googleapis/genai-toolbox/commit/29aa0a70da3c2eb409a38993b3782da8bec7cb85))
* **tools/postgressql:** Add templateParameters field ([#615](https://github.com/googleapis/genai-toolbox/issues/615)) ([b763469](https://github.com/googleapis/genai-toolbox/commit/b76346993f298b4f7493a51405d0a287bacce05f))


### Bug Fixes

* Improve versionString ([#658](https://github.com/googleapis/genai-toolbox/issues/658)) ([cf96f4c](https://github.com/googleapis/genai-toolbox/commit/cf96f4c249f0692e3eb19fc56c794ca6a3079307))
* **server/stdio:** Notifications should not return a response ([#638](https://github.com/googleapis/genai-toolbox/issues/638)) ([69d047a](https://github.com/googleapis/genai-toolbox/commit/69d047af46f1ec00f236db8a978a7a7627217fd2))
* **tools/mysqlsql:** Handled the null value for string case in mysqlsql tools ([#641](https://github.com/googleapis/genai-toolbox/issues/641)) ([ef94648](https://github.com/googleapis/genai-toolbox/commit/ef94648455c3b20adda4f8cff47e70ddccac8c06))
* Update path library ([#678](https://github.com/googleapis/genai-toolbox/issues/678)) ([4998f82](https://github.com/googleapis/genai-toolbox/commit/4998f8285287b5daddd0043540f2cf871e256db5)), closes [#662](https://github.com/googleapis/genai-toolbox/issues/662)

## [0.6.0](https://github.com/googleapis/genai-toolbox/compare/v0.5.0...v0.6.0) (2025-05-28)


### Features

* Add Execute sql tool for SQL Server(MSSQL) ([#585](https://github.com/googleapis/genai-toolbox/issues/585)) ([6083a22](https://github.com/googleapis/genai-toolbox/commit/6083a224aa650caf4e132b4a704323c5f18c4986))
* Add mysql-execute-sql tool ([#577](https://github.com/googleapis/genai-toolbox/issues/577)) ([8590061](https://github.com/googleapis/genai-toolbox/commit/8590061ae4908da0e4b1bd6f7cf7ee8d972fa5ba))
* Add new BigQuery tools: execute_sql, list_datatset_ids, list_table_ids, get_dataset_info, get_table_info ([0fd88b5](https://github.com/googleapis/genai-toolbox/commit/0fd88b574b4ab0d3bee4585999b814675d3b74ed))
* Add spanner-execute-sql tool ([#576](https://github.com/googleapis/genai-toolbox/issues/576)) ([d65747a](https://github.com/googleapis/genai-toolbox/commit/d65747a2dcf3022f22c86a1524ee28c8229f7c20))
* Add support for read-only in Spanner tool ([#563](https://github.com/googleapis/genai-toolbox/issues/563)) ([6512704](https://github.com/googleapis/genai-toolbox/commit/6512704e77088d92fea53a85c6e6cbf4b99c988d))
* Adding support for the --prebuilt flag ([#604](https://github.com/googleapis/genai-toolbox/issues/604)) ([a29c800](https://github.com/googleapis/genai-toolbox/commit/a29c80012eec4729187c12968b53051d20b263a7))
* Support MCP stdio transport protocol ([#607](https://github.com/googleapis/genai-toolbox/issues/607)) ([1702ce1](https://github.com/googleapis/genai-toolbox/commit/1702ce1e00a52170a4271ac999caf534ba00196f))


### Bug Fixes

* Explicitly set query location for BigQuery queries ([#586](https://github.com/googleapis/genai-toolbox/issues/586)) ([eb52b66](https://github.com/googleapis/genai-toolbox/commit/eb52b66d82aaa11be6b1489335f49cba8168099b))
* Fix spellings in comments ([#561](https://github.com/googleapis/genai-toolbox/issues/561)) ([b58bf76](https://github.com/googleapis/genai-toolbox/commit/b58bf76ddaba407e3fd995dfe86d00a09484e14a))
* Prevent tool calls through MCP when auth is required ([#544](https://github.com/googleapis/genai-toolbox/issues/544)) ([e747b6e](https://github.com/googleapis/genai-toolbox/commit/e747b6e289730c17f68be8dec0c6fa6021bb23bd))
* Reinitialize required slice if nil ([#571](https://github.com/googleapis/genai-toolbox/issues/571)) ([04dcf47](https://github.com/googleapis/genai-toolbox/commit/04dcf4791272e1dd034b9a03664dd8dbe77fdddd)), closes [#564](https://github.com/googleapis/genai-toolbox/issues/564)

## [0.5.0](https://github.com/googleapis/genai-toolbox/compare/v0.4.0...v0.5.0) (2025-05-06)


### Features

* Add Couchbase as Source and Tool ([#307](https://github.com/googleapis/genai-toolbox/issues/307)) ([d7390b0](https://github.com/googleapis/genai-toolbox/commit/d7390b06b7bcb15411388e9a4dbcfe75afcca1ee))
* Add postgres-execute-sql tool ([#490](https://github.com/googleapis/genai-toolbox/issues/490)) ([11ea7bc](https://github.com/googleapis/genai-toolbox/commit/11ea7bc584aa4ca8e8b0e7a355f6666ccbea2883))


## [0.4.0](https://github.com/googleapis/genai-toolbox/compare/v0.3.0...v0.4.0) (2025-04-23)


### Features

* Add `AuthRequired` to Neo4j & Dgraph Tools ([#434](https://github.com/googleapis/genai-toolbox/issues/434)) ([afbf4b2](https://github.com/googleapis/genai-toolbox/commit/afbf4b2daeb01119a22ce18469bffb9e9f57d2f8))
* Add `AuthRequired` to tool manifest ([#433](https://github.com/googleapis/genai-toolbox/issues/433)) ([d9388ad](https://github.com/googleapis/genai-toolbox/commit/d9388ad57e832570aab56b9b357c1fb0ba994852))
* Add BigQuery source and tool ([#463](https://github.com/googleapis/genai-toolbox/issues/463)) ([8055aa5](https://github.com/googleapis/genai-toolbox/commit/8055aa519fe6e7993ba524f8f7e684fbfdecc1b9))
* Add Bigtable source and tool ([#418](https://github.com/googleapis/genai-toolbox/issues/418)) ([ae53b8e](https://github.com/googleapis/genai-toolbox/commit/ae53b8eeff9d0e9ec14d9c6d4286c856cc8f1811))
* Add IAM AuthN to AlloyDB Source ([#399](https://github.com/googleapis/genai-toolbox/issues/399)) ([e8ed447](https://github.com/googleapis/genai-toolbox/commit/e8ed447d9153c60a1d6321285587e6e4ca930f87))
* Add IAM AuthN to Cloud SQL Sources ([#414](https://github.com/googleapis/genai-toolbox/issues/414)) ([be85b82](https://github.com/googleapis/genai-toolbox/commit/be85b820785dbce79133b0cf8788bde75ff25fee))
* Add toolset feature to mcp ([#425](https://github.com/googleapis/genai-toolbox/issues/425)) ([e307857](https://github.com/googleapis/genai-toolbox/commit/e307857085ac4c8c2ee8292c914daa5534ba74bf)), closes [#403](https://github.com/googleapis/genai-toolbox/issues/403)
* Add SQLite source and tool ([#438](https://github.com/googleapis/genai-toolbox/issues/438)) ([fc14cbf](https://github.com/googleapis/genai-toolbox/commit/fc14cbfd078d465591e4fefb80542759e82a2731))
* Support env replacement for tools.yaml ([#462](https://github.com/googleapis/genai-toolbox/issues/462)) ([eadb678](https://github.com/googleapis/genai-toolbox/commit/eadb678a7bd4ce74a3b1160f5ed8966ffbb13c61))


### Bug Fixes

* [#419](https://github.com/googleapis/genai-toolbox/issues/419) TLS https URL for SSE endpoint ([#420](https://github.com/googleapis/genai-toolbox/issues/420)) ([0a7d3ff](https://github.com/googleapis/genai-toolbox/commit/0a7d3ff06b88051c752b6d53bc964ed6e6be400e))
* **docs:** Fix link 'Edit this page' ([#454](https://github.com/googleapis/genai-toolbox/issues/454)) ([969065e](https://github.com/googleapis/genai-toolbox/commit/969065e561f28ddb9755d99bbe0b288040198296)), closes [#427](https://github.com/googleapis/genai-toolbox/issues/427)
* Update http error code from invocation ([#468](https://github.com/googleapis/genai-toolbox/issues/468)) ([ff7c0ff](https://github.com/googleapis/genai-toolbox/commit/ff7c0ffc65172a335e8d3321e5a6b92d38dc7e6d)), closes [#465](https://github.com/googleapis/genai-toolbox/issues/465)

## [0.3.0](https://github.com/googleapis/genai-toolbox/compare/v0.2.1...v0.3.0) (2025-04-04)


### Features

* Add 'alloydb-ai-nl' tool ([#358](https://github.com/googleapis/genai-toolbox/issues/358)) ([f02885f](https://github.com/googleapis/genai-toolbox/commit/f02885fd4a919103fdabaa4ca38d975dc8497542))
* Add HTTP Source and Tool ([#332](https://github.com/googleapis/genai-toolbox/issues/332)) ([64da5b4](https://github.com/googleapis/genai-toolbox/commit/64da5b4efe7d948ceb366c37fdaabd42405bc932))
* Adding support for Model Context Protocol (MCP). ([#396](https://github.com/googleapis/genai-toolbox/issues/396)) ([a7d1d4e](https://github.com/googleapis/genai-toolbox/commit/a7d1d4eb2ae337b463d1b25ccb25c3c0eb30df6f))
* Added [toolbox-core](https://pypi.org/project/toolbox-core/) SDK – easily integrate Toolbox into any Python function calling framework


### Bug Fixes

* Add `tools-file` flag and deprecate `tools_file` ([#384](https://github.com/googleapis/genai-toolbox/issues/384)) ([34a7263](https://github.com/googleapis/genai-toolbox/commit/34a7263fdce40715de20ef5677f94be29f9f5c98)), closes [#383](https://github.com/googleapis/genai-toolbox/issues/383)

## [0.2.1](https://github.com/googleapis/genai-toolbox/compare/v0.2.0...v0.2.1) (2025-03-20)
@@ -14,21 +14,21 @@ race, religion, or sexual identity and orientation.
Examples of behavior that contributes to creating a positive environment
include:

* Using welcoming and inclusive language
* Being respectful of differing viewpoints and experiences
* Gracefully accepting constructive criticism
* Focusing on what is best for the community
* Showing empathy towards other community members
* Using welcoming and inclusive language
* Being respectful of differing viewpoints and experiences
* Gracefully accepting constructive criticism
* Focusing on what is best for the community
* Showing empathy towards other community members

Examples of unacceptable behavior by participants include:

* The use of sexualized language or imagery and unwelcome sexual attention or
* The use of sexualized language or imagery and unwelcome sexual attention or
  advances
* Trolling, insulting/derogatory comments, and personal or political attacks
* Public or private harassment
* Publishing others' private information, such as a physical or electronic
* Trolling, insulting/derogatory comments, and personal or political attacks
* Public or private harassment
* Publishing others' private information, such as a physical or electronic
  address, without explicit permission
* Other conduct which could reasonably be considered inappropriate in a
* Other conduct which could reasonably be considered inappropriate in a
  professional setting

## Our Responsibilities

@@ -75,7 +75,7 @@ receive and address reported violations of the code of conduct. They will then
work with a committee consisting of representatives from the Open Source
Programs Office and the Google Open Source Strategy team. If for any reason you
are uncomfortable reaching out to the Project Steward, please email
opensource@google.com.
<opensource@google.com>.

We will investigate every complaint, but you may not receive a direct response.
We will use our discretion in determining when and how to follow up on reported
@@ -90,4 +90,4 @@ harassment or threats to anyone's safety, we may take action without notice.

This Code of Conduct is adapted from the Contributor Covenant, version 1.4,
available at
https://www.contributor-covenant.org/version/1/4/code-of-conduct.html
<https://www.contributor-covenant.org/version/1/4/code-of-conduct.html>
151
CONTRIBUTING.md
@@ -30,4 +30,153 @@ This project follows
All submissions, including submissions by project members, require review. We
use GitHub pull requests for this purpose. Consult
[GitHub Help](https://help.github.com/articles/about-pull-requests/) for more
information on using pull requests.
information on using pull requests.

Within 2-5 days, a reviewer will review your PR. They may approve it, or request
changes. When requesting changes, reviewers should self-assign the PR to ensure
they are aware of any updates.
If additional changes are needed, push additional commits to your PR branch -
this helps the reviewer know which parts of the PR have changed. Commits will be
squashed when merged.
Please follow up with changes promptly. If a PR is awaiting changes by the
author for more than 10 days, maintainers may mark that PR as Draft. PRs that
are inactive for more than 30 days may be closed.

### Adding a New Database Source and Tool

We recommend creating an
[issue](https://github.com/googleapis/genai-toolbox/issues) before
implementation to ensure we can accept the contribution and to avoid duplicated
work. If you have any questions, reach out on our
[Discord](https://discord.gg/Dmm69peqjh) to chat directly with the team. New
contributions should include both unit tests and integration tests.

#### 1. Implement the New Data Source

We recommend looking at an [example source
implementation](https://github.com/googleapis/genai-toolbox/blob/main/internal/sources/postgres/postgres.go).

* **Create a new directory** under `internal/sources` for your database type
  (e.g., `internal/sources/newdb`).
* **Define a configuration struct** for your data source in a file named
  `newdb.go`. Create a `Config` struct to include all the necessary parameters
  for connecting to the database (e.g., host, port, username, password, database
  name) and a `Source` struct to store necessary parameters for tools (e.g.,
  Name, Kind, connection object, additional config).
* **Implement the
  [`SourceConfig`](https://github.com/googleapis/genai-toolbox/blob/fd300dc606d88bf9f7bba689e2cee4e3565537dd/internal/sources/sources.go#L57)
  interface**. This interface requires two methods:
  * `SourceConfigKind() string`: Returns a unique string identifier for your
    data source (e.g., `"newdb"`).
  * `Initialize(ctx context.Context, tracer trace.Tracer) (Source, error)`:
    Creates a new instance of your data source and establishes a connection to
    the database.
* **Implement the
  [`Source`](https://github.com/googleapis/genai-toolbox/blob/fd300dc606d88bf9f7bba689e2cee4e3565537dd/internal/sources/sources.go#L63)
  interface**. This interface requires one method:
  * `SourceKind() string`: Returns the same string identifier as `SourceConfigKind()`.
* **Implement `init()`** to register the new Source.
* **Implement Unit Tests** in a file named `newdb_test.go`.
#### 2. Implement the New Tool

We recommend looking at an [example tool
implementation](https://github.com/googleapis/genai-toolbox/tree/main/internal/tools/postgressql).

* **Create a new directory** under `internal/tools` for your tool type (e.g.,
  `internal/tools/newdb` or `internal/tools/newdb<tool_name>`).
* **Define a configuration struct** for your tool in a file named `newdbtool.go`.
  Create a `Config` struct and a `Tool` struct to store necessary parameters for
  tools.
* **Implement the
  [`ToolConfig`](https://github.com/googleapis/genai-toolbox/blob/fd300dc606d88bf9f7bba689e2cee4e3565537dd/internal/tools/tools.go#L61)
  interface**. This interface requires two methods:
  * `ToolConfigKind() string`: Returns a unique string identifier for your tool
    (e.g., `"newdb"`).
  * `Initialize(sources map[string]Source) (Tool, error)`: Creates a new
    instance of your tool and validates that it can connect to the specified
    data source.
* **Implement the `Tool` interface**. This interface requires the following
  methods:
  * `Invoke(ctx context.Context, params map[string]any) ([]any, error)`:
    Executes the operation on the database using the provided parameters.
  * `ParseParams(data map[string]any, claims map[string]map[string]any)
    (ParamValues, error)`: Parses and validates the input parameters.
  * `Manifest() Manifest`: Returns a manifest describing the tool's capabilities
    and parameters.
  * `McpManifest() McpManifest`: Returns an MCP manifest describing the tool for
    use with the Model Context Protocol.
  * `Authorized(services []string) bool`: Checks if the tool is authorized to
    run based on the provided authentication services.
* **Implement `init()`** to register the new Tool.
* **Implement Unit Tests** in a file named `newdb_test.go`.
#### 3. Add Integration Tests

* **Add a test file** under a new directory `tests/newdb`.
* **Add pre-defined integration test suites** in
  `/tests/newdb/newdb_test.go` that are **required** to be run as long as your
  code contains related features:

  1. [RunToolGetTest][tool-get]: tests for the `GET` endpoint that returns the
     tool's manifest.

  2. [RunToolInvokeTest][tool-call]: tests for tool calling through the native
     Toolbox endpoints.

  3. [RunMCPToolCallMethod][mcp-call]: tests tool calling through the MCP
     endpoints.

  4. (Optional) [RunExecuteSqlToolInvokeTest][execute-sql]: tests an
     `execute-sql` tool for any source. Only run this test if you are adding an
     `execute-sql` tool.

  5. (Optional) [RunToolInvokeWithTemplateParameters][temp-param]: tests for
     [template parameters][temp-param-doc]. Only run this test if template
     parameters apply to your tool.

* **Add the new database to the test config** in
  [integration.cloudbuild.yaml](.ci/integration.cloudbuild.yaml).

[tool-get]: <https://github.com/googleapis/genai-toolbox/blob/fd300dc606d88bf9f7bba689e2cee4e3565537dd/tests/tool.go#L31>
[tool-call]: <https://github.com/googleapis/genai-toolbox/blob/fd300dc606d88bf9f7bba689e2cee4e3565537dd/tests/tool.go#L79>
[mcp-call]: <https://github.com/googleapis/genai-toolbox/blob/fd300dc606d88bf9f7bba689e2cee4e3565537dd/tests/tool.go#L554>
[execute-sql]: <https://github.com/googleapis/genai-toolbox/blob/fd300dc606d88bf9f7bba689e2cee4e3565537dd/tests/tool.go#L431>
[temp-param]: <https://github.com/googleapis/genai-toolbox/blob/fd300dc606d88bf9f7bba689e2cee4e3565537dd/tests/tool.go#L297>
[temp-param-doc]: <https://googleapis.github.io/genai-toolbox/resources/tools/#template-parameters>

#### 4. Add Documentation

* **Update the documentation** to include information about your new data source
  and tool. This includes:
  * Adding a new page to the `docs/en/resources/sources` directory for your data
    source.
  * Adding a new page to the `docs/en/resources/tools` directory for your tool.

* **(Optional) Add samples** to the `docs/en/samples/<newdb>` directory.

#### 5. (Optional) Add Prebuilt Tools

You can provide developers with a set of "build-time" tools to aid common
software development user journeys like viewing and creating tables/collections
and data.

* **Create a set of prebuilt tools** by defining a new `tools.yaml` and adding
  it to `internal/tools`. Make sure the file name matches the source (i.e., for
  source "alloydb-postgres", create a file named "alloydb-postgres.yaml").
* **Update `cmd/root.go`** to add the new source to the `prebuilt` flag.
* **Add tests** in
  [internal/prebuiltconfigs/prebuiltconfigs_test.go](internal/prebuiltconfigs/prebuiltconfigs_test.go)
  and [cmd/root_test.go](cmd/root_test.go).
||||
#### 6. Submit a Pull Request
|
||||
|
||||
* **Submit a pull request** to the repository with your changes. Be sure to
|
||||
include a detailed description of your changes and any requests for long term
|
||||
testing resources.
|
||||
|
||||
DEVELOPER.md
@@ -1,12 +1,16 @@

# DEVELOPER.md

## Before you begin
This document provides instructions for setting up your development environment
and contributing to the Toolbox project.

1. Make sure you've setup your databases.
## Prerequisites

1. Install the latest version of [Go](https://go.dev/doc/install).
Before you begin, ensure you have the following:

1. Locate and download dependencies:
1. **Databases:** Set up the necessary databases for your development
   environment.
1. **Go:** Install the latest version of [Go](https://go.dev/doc/install).
1. **Dependencies:** Download and manage project dependencies:

```bash
go get
```

@@ -15,91 +19,224 @@

## Developing Toolbox

### Run Toolbox from local source
### Running from Local Source

1. Open a local connection to your database by starting the [Cloud SQL Auth Proxy][cloudsql-proxy].

1. You should already have a `tools.yaml` created with your [sources and tools configurations](./README.md#Configuration).

1. You can specify flags for the Toolbox server. Execute the following to list the possible CLI flags:
1. **Configuration:** Create a `tools.yaml` file to configure your sources and
   tools. See the [Configuration section in the
   README](./README.md#Configuration) for details.
1. **CLI Flags:** List available command-line flags for the Toolbox server:

```bash
go run . --help
```

1. To run the server, execute the following (with any flags, if applicable):
1. **Running the Server:** Start the Toolbox server with optional flags. The
   server listens on port 5000 by default.

```bash
go run .
```

The server will listen on port 5000 (by default).

1. Test endpoint using the following:
1. **Testing the Endpoint:** Verify the server is running by sending a request
   to the endpoint:

```bash
curl http://127.0.0.1:5000
```
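
For reference, a minimal `tools.yaml` for the configuration step above might look like the following sketch. The source name, connection fields, and tool are hypothetical placeholders modeled on the project's hotel example, not values from this repository; see the README's Configuration section for the full set of options:

```yaml
sources:
  my-pg-source:              # hypothetical source name
    kind: cloud-sql-postgres
    project: my-project
    region: us-central1
    instance: my-instance
    database: my_db
    user: ${USER_NAME}
    password: ${PASSWORD}

tools:
  search-hotels-by-name:     # hypothetical tool name
    kind: postgres-sql
    source: my-pg-source
    description: Search for hotels based on name.
    parameters:
      - name: name
        type: string
        description: The name of the hotel.
    statement: SELECT * FROM hotels WHERE name ILIKE '%' || $1 || '%';
```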

### Run tests locally
## Testing

1. Run lint with the following:
### Infrastructure

```bash
golangci-lint run --fix
```

Toolbox uses both GitHub Actions and Cloud Build to run test workflows. Cloud
Build is used when Google credentials are required. Cloud Build uses the test
project "toolbox-testing-438616".

### Linting

Run the lint check to ensure code quality:

```bash
golangci-lint run --fix
```

### Unit Tests

Execute unit tests locally:

```bash
go test -race -v ./...
```

### Integration Tests

#### Running Locally

1. **Environment Variables:** Set the required environment variables. Refer to
   the [Cloud Build testing configuration](./.ci/integration.cloudbuild.yaml)
   for a complete list of variables for each source.
   * `SERVICE_ACCOUNT_EMAIL`: Use your own GCP email.
   * `CLIENT_ID`: Use the Google Cloud SDK application Client ID. Contact the
     Toolbox maintainers if you don't have it.
1. **Running Tests:** Run the integration test for your target source. Specify
   the required Go build tags listed at the top of each integration test file.

```shell
go test -race -v ./tests/<YOUR_TEST_DIR>
```

1. Run all tests with the following:
For example, to run the AlloyDB integration test:

```bash
go test -race -v ./...
```

```shell
go test -race -v ./tests/alloydbpg
```

## Compile the app locally
#### Running on Pull Requests

### Compile Toolbox binary
* **Internal Contributors:** Testing workflows should trigger automatically.
* **External Contributors:** Request Toolbox maintainers to trigger the testing
  workflows on your PR.

1. Run build to compile binary:
#### Test Resources

The following databases have been added as test resources. To add a new database
to test against, please contact the Toolbox maintainer team via an issue or PR.
Refer to the [Cloud Build testing
configuration](./.ci/integration.cloudbuild.yaml) for a complete list of
variables for each source.

* AlloyDB - set up in the test project
  * AI Natural Language ([setup
    instructions](https://cloud.google.com/alloydb/docs/ai/use-natural-language-generate-sql-queries))
    has been configured for `alloydb-ai-nl` tool tests
  * The Cloud Build service account is a user
* Bigtable - set up in the test project
  * The Cloud Build service account is a user
* BigQuery - set up in the test project
  * The Cloud Build service account is a user
* Cloud SQL Postgres - set up in the test project
  * The Cloud Build service account is a user
* Cloud SQL MySQL - set up in the test project
  * The Cloud Build service account is a user
* Cloud SQL SQL Server - set up in the test project
  * The Cloud Build service account is a user
* Couchbase - set up in the test project via the Marketplace
* DGraph - using the public Dgraph interface <https://play.dgraph.io> for
  testing
* Memorystore Redis - set up in the test project using a Memorystore for Redis
  standalone instance
  * Memorystore Redis Cluster, Memorystore Valkey standalone, and Memorystore
    Valkey Cluster instances all require PSC connections, which requires extra
    security setup to connect from Cloud Build. Memorystore Redis standalone is
    the only one allowing a PSA connection.
  * The Cloud Build service account is a user
* Memorystore Valkey - set up in the test project using a Memorystore for Redis
  standalone instance
  * The Cloud Build service account is a user
* MySQL - set up in the test project using a Cloud SQL instance
* Neo4j - set up in the test project on a GCE VM
* Postgres - set up in the test project using an AlloyDB instance
* Spanner - set up in the test project
  * The Cloud Build service account is a user
* SQL Server - set up in the test project using a Cloud SQL instance
* SQLite - set up in the integration test, which creates a temporary database
  file

### Other GitHub Checks

* License header check (`.github/header-checker-lint.yml`) - Ensures files have
  the appropriate license header
* CLA/google - Ensures the developer has signed the CLA:
  <https://cla.developers.google.com/>
* conventionalcommits.org - Ensures the commit messages are in the correct
  format. This repository uses the tool [Release
  Please](https://github.com/googleapis/release-please) to create GitHub
  releases. It does so by parsing your git history, looking for [Conventional
  Commit messages](https://www.conventionalcommits.org/), and creating release
  PRs. Learn more by reading [How should I write my
  commits?](https://github.com/googleapis/release-please?tab=readme-ov-file#how-should-i-write-my-commits)

## Developing Documentation

### Running a Local Hugo Server

Follow these steps to preview documentation changes locally using a Hugo server:

1. **Install Hugo:** Ensure you have
   [Hugo](https://gohugo.io/installation/macos/) extended edition, version
   0.146.0 or later, installed.
1. **Navigate to the Hugo Directory:**

```bash
cd .hugo
```

1. **Install Dependencies:**

```bash
npm ci
```

1. **Start the Server:**

```bash
hugo server
```

### Previewing Documentation on Pull Requests

#### Contributors

Request a repo owner to run the preview deployment workflow on your PR. A
preview link will be automatically added as a comment to your PR.

#### Maintainers

1. **Inspect Changes:** Review the proposed changes in the PR to ensure they are
   safe and do not contain malicious code. Pay close attention to changes in the
   `.github/workflows/` directory.
1. **Deploy Preview:** Apply the `docs: deploy-preview` label to the PR to
   deploy a documentation preview.

## Building Toolbox

### Building the Binary

1. **Build Command:** Compile the Toolbox binary:

```bash
go build -o toolbox
```

1. You can specify flags for the Toolbox server. Execute the following to list the possible CLI flags:

```bash
./toolbox --help
```

1. To run the binary, execute the following (with any flags, if applicable):
1. **Running the Binary:** Execute the compiled binary with optional flags. The
   server listens on port 5000 by default:

```bash
./toolbox
```

The server will listen on port 5000 (by default).

1. Test endpoint using the following:
1. **Testing the Endpoint:** Verify the server is running by sending a request
   to the endpoint:

```bash
curl http://127.0.0.1:5000
```

### Compile Toolbox container images
### Building Container Images

1. Run build to compile container image:
1. **Build Command:** Build the Toolbox container image:

```bash
docker build -t toolbox:dev .
```

1. Execute the following to view image:
1. **View Image:** List available Docker images to confirm the build:

```bash
docker images
```

1. Run container image with Docker:
1. **Run Container:** Run the Toolbox container image using Docker:

```bash
docker run -d toolbox:dev
```

@@ -107,57 +244,118 @@

## Developing Toolbox SDKs

Please refer to the [SDK developer guide](https://github.com/googleapis/genai-toolbox-langchain-python/blob/main/DEVELOPER.md)
Refer to the [SDK developer
guide](https://github.com/googleapis/mcp-toolbox-sdk-python/blob/main/DEVELOPER.md)
for instructions on developing Toolbox SDKs.

## CI/CD Details
## Maintainer Information

Cloud Build is used to run tests against Google Cloud resources in the test project.
### Team

The team `@googleapis/senseai-eco` has been set as the
[CODEOWNERS](.github/CODEOWNERS). The GitHub TeamSync tool is used to create
this team from the MDB group `senseai-eco`.

### Releasing

There are two types of release for Toolbox, including a versioned release and continuous release.
Toolbox has two types of releases: versioned and continuous. It uses the Google
Cloud project `database-toolbox`.

- Versioned release: Official supported distributions with the `latest` tag. The release process for versioned release is in [versioned.release.cloudbuild.yaml](https://github.com/googleapis/genai-toolbox/blob/main/versioned.release.cloudbuild.yaml).
- Continuous release: Used for early testing features between official supported releases and end-to-end testings.
* **Versioned Release:** Official, supported distributions tagged as `latest`.
  The release process is defined in
  [versioned.release.cloudbuild.yaml](.ci/versioned.release.cloudbuild.yaml).
* **Continuous Release:** Used for early testing of features between official
  releases and for end-to-end testing. The release process is defined in
  [continuous.release.cloudbuild.yaml](.ci/continuous.release.cloudbuild.yaml).
* **GitHub Release:** `.github/release-please.yml` automatically creates GitHub
  Releases and release PRs.

#### Supported OS and Architecture binaries
### How to Release a New Version

The following OS and computer architecture is supported within the binary releases.
1. [Optional] If you want to override the version number, send a
   [PR](https://github.com/googleapis/genai-toolbox/pull/31) to trigger
   [release-please](https://github.com/googleapis/release-please?tab=readme-ov-file#how-do-i-change-the-version-number).
   You can generate a commit with the following line: `git commit -m "chore:
   release 0.1.0" -m "Release-As: 0.1.0" --allow-empty`
1. [Optional] If you want to edit the changelog, send commits to the release PR.
1. Approve and merge the PR with the title “[chore(main): release
   x.x.x](https://github.com/googleapis/genai-toolbox/pull/16)”.
1. The
   [trigger](https://pantheon.corp.google.com/cloud-build/triggers;region=us-central1/edit/27bd0d21-264a-4446-b2d7-0df4e9915fb3?e=13802955&inv=1&invt=AbhU8A&mods=logs_tg_staging&project=database-toolbox)
   should automatically run when a new tag is pushed. You can view [triggered
   builds here to check the
   status](https://pantheon.corp.google.com/cloud-build/builds;region=us-central1?query=trigger_id%3D%2227bd0d21-264a-4446-b2d7-0df4e9915fb3%22&e=13802955&inv=1&invt=AbhU8A&mods=logs_tg_staging&project=database-toolbox)
1. Update the Github release notes to include the following table:
1. Run the following command (from the root directory):

- linux/amd64
- darwin/arm64
- darwin/amd64
- windows/amd64

```bash
export VERSION="v0.0.0"
.ci/generate_release_table.sh
```

#### Supported container images
1. Copy the table output.
1. In the GitHub UI, navigate to Releases and click the `edit` button.
1. Paste the table at the bottom of the release notes and click `Update release`.
1. Post the release in internal chat and on Discord.

The following base container images is supported within the container image releases.
#### Supported Binaries

- distroless
The following operating systems and architectures are supported for binary
releases:

### Automated tests
* linux/amd64
* darwin/arm64
* darwin/amd64
* windows/amd64

Integration and unit tests are automatically triggered via CloudBuild during each PR creation.
#### Supported Container Images

The following base container images are supported for container image releases:

* distroless

### Automated Tests

Integration and unit tests are automatically triggered via Cloud Build on each
pull request. Integration tests run on merge and nightly.

#### Failure notifications

On-merge and nightly tests that fail send notifications via the Cloud Build
Failure Reporter [GitHub Actions
Workflow](.github/workflows/schedule_reporter.yml).

#### Trigger Setup

Create a Cloud Build trigger via the UI or `gcloud` with the following specs:
Configure a Cloud Build trigger using the UI or `gcloud` with the following
settings:

* Event: Pull request
* Region:
  * global - for default worker pools
* Source:
* **Event:** Pull request
* **Region:** global (for default worker pools)
* **Source:**
  * Generation: 1st gen
  * Repo: googleapis/genai-toolbox (GitHub App)
  * Base branch: `^main$`
* Comment control: Required except for owners and collaborators
* Filters: add directory filter
* Config: Cloud Build configuration file
* **Comment control:** Required except for owners and collaborators
* **Filters:** Add a directory filter
* **Config:** Cloud Build configuration file
  * Location: Repository (add path to file)
* Service account: set for demo service to enable ID token creation to use to authenticated services
* **Service account:** Set for the demo service to enable ID token creation for
  authenticated services

### Trigger
### Triggering Tests

To run Cloud Build tests on GitHub from external contributors, i.e. RenovateBot, comment: `/gcbrun`.
Trigger pull request tests for external contributors by:

[cloudsql-proxy]: https://cloud.google.com/sql/docs/mysql/sql-proxy
* **Cloud Build tests:** Comment `/gcbrun`
* **Unit tests:** Add the `tests:run` label

## Repo Setup & Automation

* .github/blunderbuss.yml - Auto-assigns issues and PRs from GitHub teams
* .github/renovate.json5 - Tooling for dependency updates. Dependabot is built
  into the GitHub repo for GitHub security warnings
* go/github-issue-mirror - GitHub issues are automatically mirrored into Buganizer
* (Suspended) .github/sync-repo-settings.yaml - Configures repo settings
* .github/release-please.yml - Creates GitHub releases
* .github/ISSUE_TEMPLATE - Templates for GitHub issues

@@ -13,18 +13,19 @@

# limitations under the License.

# Use the latest stable golang 1.x to compile to a binary
FROM --platform=$BUILDPLATFORM golang:1 as build
FROM --platform=$BUILDPLATFORM golang:1 AS build

WORKDIR /go/src/genai-toolbox
COPY . .

ARG TARGETOS
ARG TARGETARCH
ARG METADATA_TAGS=dev
ARG BUILD_TYPE="container.dev"
ARG COMMIT_SHA=""

RUN go get ./...
RUN CGO_ENABLED=0 GOOS=${TARGETOS} GOARCH=${TARGETARCH} \
    go build -ldflags "-X github.com/googleapis/genai-toolbox/cmd.metadataString=container.${METADATA_TAGS}"
    go build -ldflags "-X github.com/googleapis/genai-toolbox/cmd.buildType=container.${BUILD_TYPE} -X github.com/googleapis/genai-toolbox/cmd.commitSha=${COMMIT_SHA}"

# Final Stage
FROM gcr.io/distroless/static:nonroot
README.md
@@ -1,18 +1,26 @@



# 🧰 Gen AI Toolbox for Databases
# MCP Toolbox for Databases

> [!NOTE]
> Gen AI Toolbox for Databases is currently in beta, and may see breaking
[](https://discord.gg/Dmm69peqjh)
[](https://goreportcard.com/report/github.com/googleapis/genai-toolbox)

> [!NOTE]
> MCP Toolbox for Databases is currently in beta, and may see breaking
> changes until the first stable release (v1.0).

Gen AI Toolbox for Databases is an open source server that makes it easier to
build Gen AI tools for interacting with databases. It enables you to develop
tools easier, faster, and more securely by handling the complexities such as
connection pooling, authentication, and more.
MCP Toolbox for Databases is an open source MCP server for databases. It enables
you to develop tools more easily, faster, and more securely by handling
complexities such as connection pooling, authentication, and more.

This README provides a brief overview. For comprehensive details, see the [full
documentation](https://googleapis.github.io/genai-toolbox/).

> [!NOTE]
> This solution was originally named “Gen AI Toolbox for Databases” as
> its initial development predated MCP, but was renamed to align with recently
> added MCP compatibility.

<!-- TOC ignore:true -->
## Table of Contents

@@ -21,23 +29,23 @@ documentation](https://googleapis.github.io/genai-toolbox/).

- [Why Toolbox?](#why-toolbox)
- [General Architecture](#general-architecture)
- [Getting Started](#getting-started)
  - [Installing the server](#installing-the-server)
  - [Running the server](#running-the-server)
  - [Integrating your application](#integrating-your-application)
- [Configuration](#configuration)
  - [Sources](#sources)
  - [Tools](#tools)
  - [Toolsets](#toolsets)
- [Versioning](#versioning)
- [Contributing](#contributing)

<!-- /TOC -->

## Why Toolbox?

Toolbox helps you build Gen AI tools that let your agents access data in your
database. Toolbox provides:

- **Simplified development**: Integrate tools into your agent in less than 10
  lines of code, reuse tools between multiple agents or frameworks, and deploy
  new versions of tools more easily.

@@ -47,6 +55,33 @@ database. Toolbox provides:

- **End-to-end observability**: Out of the box metrics and tracing with built-in
  support for OpenTelemetry.

**⚡ Supercharge Your Workflow with an AI Database Assistant ⚡**

Stop context-switching and let your AI assistant become a true co-developer. By
[connecting your IDE to your databases with MCP Toolbox][connect-ide], you can
delegate complex and time-consuming database tasks, allowing you to build faster
and focus on what matters. This isn't just about code completion; it's about
giving your AI the context it needs to handle the entire development lifecycle.

Here’s how it will save you time:

- **Query in Plain English**: Interact with your data using natural language
  right from your IDE. Ask complex questions like, *"How many orders were
  delivered in 2024, and what items were in them?"* without writing any SQL.
- **Automate Database Management**: Simply describe your data needs, and let the
  AI assistant manage your database for you. It can handle generating queries,
  creating tables, adding indexes, and more.
- **Generate Context-Aware Code**: Empower your AI assistant to generate
  application code and tests with a deep understanding of your real-time
  database schema. This accelerates the development cycle by ensuring the
  generated code is directly usable.
- **Slash Development Overhead**: Radically reduce the time spent on manual
  setup and boilerplate. MCP Toolbox helps streamline lengthy database
  configurations, repetitive code, and error-prone schema migrations.

Learn [how to connect your AI tools (IDEs) to Toolbox using MCP][connect-ide].

[connect-ide]: https://googleapis.github.io/genai-toolbox/how-to/connect-ide/

## General Architecture

@@ -62,6 +97,7 @@ redeploying your application.

## Getting Started

### Installing the server

For the latest version, check the [releases page][releases] and use the
following instructions for your OS and CPU architecture.

@@ -75,7 +111,7 @@ To install Toolbox as a binary:

<!-- {x-release-please-start-version} -->
```sh
# see releases page for other versions
export VERSION=0.2.1
export VERSION=0.8.0
curl -O https://storage.googleapis.com/genai-toolbox/v$VERSION/linux/amd64/toolbox
chmod +x toolbox
```

@@ -88,7 +124,7 @@ You can also install Toolbox as a container:

```sh
# see releases page for other versions
export VERSION=0.2.1
export VERSION=0.8.0
docker pull us-central1-docker.pkg.dev/database-toolbox/toolbox/toolbox:$VERSION
```

@@ -101,7 +137,7 @@ To install from source, ensure you have the latest version of

[Go installed](https://go.dev/doc/install), and then run the following command:

```sh
go install github.com/googleapis/genai-toolbox@v0.2.1
go install github.com/googleapis/genai-toolbox@v0.8.0
```
<!-- {x-release-please-end} -->

@@ -113,8 +149,10 @@ go install github.com/googleapis/genai-toolbox@v0.2.1

execute `toolbox` to start the server:

```sh
./toolbox --tools_file "tools.yaml"
./toolbox --tools-file "tools.yaml"
```

> [!NOTE]
> Toolbox enables dynamic reloading by default. To disable, use the `--disable-reload` flag.

You can use `toolbox help` for a full list of flags! To stop the server, send a
terminate signal (`ctrl+c` on most platforms).
@@ -129,65 +167,221 @@ Once your server is up and running, you can load the tools into your

application. See below for a list of Client SDKs for using various frameworks:

<details open>
<summary>LangChain / LangGraph</summary>
<summary>Python (<a href="https://github.com/googleapis/mcp-toolbox-sdk-python">Github</a>)</summary>
<br>
<blockquote>

<details open>
<summary>Core</summary>

1. Install [Toolbox Core SDK][toolbox-core]:

```bash
pip install toolbox-core
```

1. Load tools:

```python
from toolbox_core import ToolboxClient

# update the url to point to your server
async with ToolboxClient("http://127.0.0.1:5000") as client:

    # these tools can be passed to your application!
    tools = await client.load_toolset("toolset_name")
```

For more detailed instructions on using the Toolbox Core SDK, see the
[project's README][toolbox-core-readme].

[toolbox-core]: https://pypi.org/project/toolbox-core/
[toolbox-core-readme]: https://github.com/googleapis/mcp-toolbox-sdk-python/tree/main/packages/toolbox-core/README.md

</details>
<details>
<summary>LangChain / LangGraph</summary>

1. Install [Toolbox LangChain SDK][toolbox-langchain]:

```bash
pip install toolbox-langchain
```

1. Load tools:

```python
from toolbox_langchain import ToolboxClient

# update the url to point to your server
client = ToolboxClient("http://127.0.0.1:5000")
async with ToolboxClient("http://127.0.0.1:5000") as client:

    # these tools can be passed to your application!
    tools = client.load_toolset()
```

For more detailed instructions on using the Toolbox LangChain SDK, see the
[project's README][toolbox-langchain-readme].

[toolbox-langchain]: https://github.com/googleapis/genai-toolbox-langchain-python
[toolbox-langchain-readme]: https://github.com/googleapis/genai-toolbox-langchain-python/blob/main/README.md
[toolbox-langchain]: https://pypi.org/project/toolbox-langchain/
[toolbox-langchain-readme]: https://github.com/googleapis/mcp-toolbox-sdk-python/blob/main/packages/toolbox-langchain/README.md

</details>
<details>
<summary>LlamaIndex</summary>
</details>
<details>
<summary>LlamaIndex</summary>

1. Install [Toolbox Llamaindex SDK][toolbox-llamaindex]:

```bash
pip install toolbox-llamaindex
```

1. Load tools:

```python
from toolbox_llamaindex import ToolboxClient

# update the url to point to your server
client = ToolboxClient("http://127.0.0.1:5000")
async with ToolboxClient("http://127.0.0.1:5000") as client:

    # these tools can be passed to your application!
    tools = client.load_toolset()
```

For more detailed instructions on using the Toolbox Llamaindex SDK, see the
[project's README][toolbox-llamaindex-readme].

[toolbox-llamaindex]: https://github.com/googleapis/genai-toolbox-llamaindex-python
[toolbox-llamaindex-readme]: https://github.com/googleapis/genai-toolbox-llamaindex-python/blob/main/README.md
[toolbox-llamaindex]: https://pypi.org/project/toolbox-llamaindex/
[toolbox-llamaindex-readme]: https://github.com/googleapis/genai-toolbox-llamaindex-python/blob/main/README.md

</details>
</details>
</blockquote>
<details>
<summary>Javascript/Typescript (<a href="https://github.com/googleapis/mcp-toolbox-sdk-js">Github</a>)</summary>
<br>
<blockquote>

<details open>
<summary>Core</summary>

1. Install [Toolbox Core SDK][toolbox-core-js]:

```bash
npm install @toolbox-sdk/core
```

1. Load tools:

```javascript
import { ToolboxClient } from '@toolbox-sdk/core';

// update the url to point to your server
const URL = 'http://127.0.0.1:5000';
let client = new ToolboxClient(URL);

// these tools can be passed to your application!
const tools = await client.loadToolset('toolsetName');
```

For more detailed instructions on using the Toolbox Core SDK, see the
[project's README][toolbox-core-js-readme].

[toolbox-core-js]: https://www.npmjs.com/package/@toolbox-sdk/core
[toolbox-core-js-readme]: https://github.com/googleapis/mcp-toolbox-sdk-js/blob/main/packages/toolbox-core/README.md

</details>
<details>
<summary>LangChain / LangGraph</summary>

1. Install [Toolbox Core SDK][toolbox-core-js]:

```bash
npm install @toolbox-sdk/core
```

2. Load tools:

```javascript
import { ToolboxClient } from '@toolbox-sdk/core';
// the `tool` helper comes from LangChain's core package
import { tool } from '@langchain/core/tools';

// update the url to point to your server
const URL = 'http://127.0.0.1:5000';
let client = new ToolboxClient(URL);

// these tools can be passed to your application!
const toolboxTools = await client.loadToolset('toolsetName');

// Define the basics of the tool: name, description, schema and core logic
const getTool = (toolboxTool) => tool(toolboxTool, {
    name: toolboxTool.getName(),
    description: toolboxTool.getDescription(),
    schema: toolboxTool.getParamSchema()
});

// Use these tools in your LangChain/LangGraph applications
const tools = toolboxTools.map(getTool);
```

</details>
<details>
<summary>Genkit</summary>

1. Install [Toolbox Core SDK][toolbox-core-js]:

```bash
npm install @toolbox-sdk/core
```

2. Load tools:

```javascript
import { ToolboxClient } from '@toolbox-sdk/core';
import { genkit } from 'genkit';
import { googleAI } from '@genkit-ai/googleai';

// Initialise genkit
const ai = genkit({
    plugins: [
        googleAI({
            apiKey: process.env.GEMINI_API_KEY || process.env.GOOGLE_API_KEY
        })
    ],
    model: googleAI.model('gemini-2.0-flash'),
});

// update the url to point to your server
const URL = 'http://127.0.0.1:5000';
let client = new ToolboxClient(URL);

// these tools can be passed to your application!
const toolboxTools = await client.loadToolset('toolsetName');

// Define the basics of the tool: name, description, schema and core logic
const getTool = (toolboxTool) => ai.defineTool({
    name: toolboxTool.getName(),
    description: toolboxTool.getDescription(),
    schema: toolboxTool.getParamSchema()
}, toolboxTool);

// Use these tools in your Genkit applications
const tools = toolboxTools.map(getTool);
```

</details>
</details>
</blockquote>

</details>

## Configuration
The primary way to configure Toolbox is through the `tools.yaml` file. If you
have multiple files, you can tell Toolbox which to load with the `--tools-file
tools.yaml` flag.

You can find more detailed reference documentation for all resource types in the
[Resources](https://googleapis.github.io/genai-toolbox/resources/).
### Sources

The `sources` section of your `tools.yaml` defines what data sources your
@@ -210,9 +404,8 @@ For more details on configuring different types of sources, see the
### Tools

The `tools` section of a `tools.yaml` defines the actions an agent can take: what
kind of tool it is, which source(s) it affects, what parameters it uses, etc.
```yaml
tools:
@@ -230,7 +423,6 @@ tools:
For more details on configuring different types of tools, see the
[Tools](https://googleapis.github.io/genai-toolbox/resources/tools).

### Toolsets

The `toolsets` section of your `tools.yaml` allows you to define groups of tools
@@ -267,7 +459,7 @@ This project uses [semantic versioning](https://semver.org/), including a
- PATCH version when we make backward compatible bug fixes

The public API that this applies to is the CLI associated with Toolbox, the
interactions with official SDKs, and the definitions in the `tools.yaml` file.

## Contributing

cmd/root.go
@@ -19,26 +19,84 @@ import (
	_ "embed"
	"fmt"
	"io"
	"maps"
	"os"
	"os/signal"
	"path/filepath"
	"regexp"
	"runtime"
	"slices"
	"strings"
	"syscall"
	"time"

	"github.com/fsnotify/fsnotify"
	yaml "github.com/goccy/go-yaml"
	"github.com/googleapis/genai-toolbox/internal/auth"
	"github.com/googleapis/genai-toolbox/internal/log"
	"github.com/googleapis/genai-toolbox/internal/prebuiltconfigs"
	"github.com/googleapis/genai-toolbox/internal/server"
	"github.com/googleapis/genai-toolbox/internal/sources"
	"github.com/googleapis/genai-toolbox/internal/telemetry"
	"github.com/googleapis/genai-toolbox/internal/tools"
	"github.com/googleapis/genai-toolbox/internal/util"

	// Import tool packages for side effect of registration
	_ "github.com/googleapis/genai-toolbox/internal/tools/alloydbainl"
	_ "github.com/googleapis/genai-toolbox/internal/tools/bigquery/bigqueryexecutesql"
	_ "github.com/googleapis/genai-toolbox/internal/tools/bigquery/bigquerygetdatasetinfo"
	_ "github.com/googleapis/genai-toolbox/internal/tools/bigquery/bigquerygettableinfo"
	_ "github.com/googleapis/genai-toolbox/internal/tools/bigquery/bigquerylistdatasetids"
	_ "github.com/googleapis/genai-toolbox/internal/tools/bigquery/bigquerylisttableids"
	_ "github.com/googleapis/genai-toolbox/internal/tools/bigquery/bigquerysql"
	_ "github.com/googleapis/genai-toolbox/internal/tools/bigtable"
	_ "github.com/googleapis/genai-toolbox/internal/tools/couchbase"
	_ "github.com/googleapis/genai-toolbox/internal/tools/dgraph"
	_ "github.com/googleapis/genai-toolbox/internal/tools/http"
	_ "github.com/googleapis/genai-toolbox/internal/tools/mssql/mssqlexecutesql"
	_ "github.com/googleapis/genai-toolbox/internal/tools/mssql/mssqlsql"
	_ "github.com/googleapis/genai-toolbox/internal/tools/mysql/mysqlexecutesql"
	_ "github.com/googleapis/genai-toolbox/internal/tools/mysql/mysqlsql"
	_ "github.com/googleapis/genai-toolbox/internal/tools/neo4j"
	_ "github.com/googleapis/genai-toolbox/internal/tools/postgres/postgresexecutesql"
	_ "github.com/googleapis/genai-toolbox/internal/tools/postgres/postgressql"
	_ "github.com/googleapis/genai-toolbox/internal/tools/redis"
	_ "github.com/googleapis/genai-toolbox/internal/tools/spanner/spannerexecutesql"
	_ "github.com/googleapis/genai-toolbox/internal/tools/spanner/spannersql"
	_ "github.com/googleapis/genai-toolbox/internal/tools/sqlitesql"
	_ "github.com/googleapis/genai-toolbox/internal/tools/valkey"

	"github.com/spf13/cobra"

	_ "github.com/googleapis/genai-toolbox/internal/sources/alloydbpg"
	_ "github.com/googleapis/genai-toolbox/internal/sources/bigquery"
	_ "github.com/googleapis/genai-toolbox/internal/sources/bigtable"
	_ "github.com/googleapis/genai-toolbox/internal/sources/cloudsqlmssql"
	_ "github.com/googleapis/genai-toolbox/internal/sources/cloudsqlmysql"
	_ "github.com/googleapis/genai-toolbox/internal/sources/cloudsqlpg"
	_ "github.com/googleapis/genai-toolbox/internal/sources/couchbase"
	_ "github.com/googleapis/genai-toolbox/internal/sources/dgraph"
	_ "github.com/googleapis/genai-toolbox/internal/sources/http"
	_ "github.com/googleapis/genai-toolbox/internal/sources/mssql"
	_ "github.com/googleapis/genai-toolbox/internal/sources/mysql"
	_ "github.com/googleapis/genai-toolbox/internal/sources/neo4j"
	_ "github.com/googleapis/genai-toolbox/internal/sources/postgres"
	_ "github.com/googleapis/genai-toolbox/internal/sources/redis"
	_ "github.com/googleapis/genai-toolbox/internal/sources/spanner"
	_ "github.com/googleapis/genai-toolbox/internal/sources/sqlite"
	_ "github.com/googleapis/genai-toolbox/internal/sources/valkey"
)
var (
	// versionString indicates the version of this library.
	//go:embed version.txt
	// versionString stores the full semantic version, including build metadata.
	versionString string
	// versionNum indicates the numerical part of the version
	//go:embed version.txt
	versionNum string
	// metadataString indicates additional build or distribution metadata.
	metadataString string
	buildType string = "dev" // should be one of "dev", "binary", or "container"
	// commitSha is the git commit it was built from
	commitSha string
)

func init() {
@@ -47,10 +105,11 @@ func init() {

// semanticVersion returns the version of the CLI including a compile-time metadata.
func semanticVersion() string {
	v := strings.TrimSpace(versionString)
	if metadataString != "" {
		v += "+" + metadataString
	metadataStrings := []string{buildType, runtime.GOOS, runtime.GOARCH}
	if commitSha != "" {
		metadataStrings = append(metadataStrings, commitSha)
	}
	v := strings.TrimSpace(versionNum) + "+" + strings.Join(metadataStrings, ".")
	return v
}
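The new `semanticVersion` builds the `+` build-metadata suffix from build type, OS, architecture, and an optional commit SHA. A standalone sketch of that scheme (the helper name and sample values here are illustrative, not the package's own):

```go
package main

import (
	"fmt"
	"strings"
)

// buildVersion mimics the scheme above: the trimmed numeric version,
// then '+', then dot-separated build metadata, with the commit SHA
// appended only when it is known.
func buildVersion(versionNum, buildType, goos, goarch, commitSha string) string {
	metadata := []string{buildType, goos, goarch}
	if commitSha != "" {
		metadata = append(metadata, commitSha)
	}
	return strings.TrimSpace(versionNum) + "+" + strings.Join(metadata, ".")
}

func main() {
	// The embedded version.txt typically ends in a newline, hence TrimSpace.
	fmt.Println(buildVersion("0.2.1\n", "binary", "linux", "amd64", "be31376"))
	// 0.2.1+binary.linux.amd64.be31376
}
```

Keeping the numeric version and the metadata in separate embedded variables means the User-Agent can later append more metadata (as `run` does for `--prebuilt`) without re-parsing the string.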
@@ -67,15 +126,20 @@ func Execute() {
type Command struct {
	*cobra.Command

	cfg        server.ServerConfig
	logger     log.Logger
	tools_file string
	outStream  io.Writer
	errStream  io.Writer
	cfg            server.ServerConfig
	logger         log.Logger
	tools_file     string
	tools_files    []string
	tools_folder   string
	prebuiltConfig string
	inStream       io.Reader
	outStream      io.Writer
	errStream      io.Writer
}
// NewCommand returns a Command object representing an invocation of the CLI.
func NewCommand(opts ...Option) *Command {
	in := os.Stdin
	out := os.Stdout
	err := os.Stderr

@@ -86,6 +150,7 @@ func NewCommand(opts ...Option) *Command {
	}
	cmd := &Command{
		Command:   baseCmd,
		inStream:  in,
		outStream: out,
		errStream: err,
	}
@@ -97,7 +162,8 @@ func NewCommand(opts ...Option) *Command {
	// Set server version
	cmd.cfg.Version = versionString

	// set baseCmd out and err the same as cmd.
	// set baseCmd in, out and err the same as cmd.
	baseCmd.SetIn(cmd.inStream)
	baseCmd.SetOut(cmd.outStream)
	baseCmd.SetErr(cmd.errStream)

@@ -105,12 +171,21 @@ func NewCommand(opts ...Option) *Command {
	flags.StringVarP(&cmd.cfg.Address, "address", "a", "127.0.0.1", "Address of the interface the server will listen on.")
	flags.IntVarP(&cmd.cfg.Port, "port", "p", 5000, "Port the server will listen on.")

	flags.StringVar(&cmd.tools_file, "tools_file", "tools.yaml", "File path specifying the tool configuration.")
	flags.StringVar(&cmd.tools_file, "tools_file", "", "File path specifying the tool configuration. Cannot be used with --prebuilt.")
	// deprecate tools_file
	_ = flags.MarkDeprecated("tools_file", "please use --tools-file instead")
	flags.StringVar(&cmd.tools_file, "tools-file", "", "File path specifying the tool configuration. Cannot be used with --prebuilt, --tools-files, or --tools-folder.")
	flags.StringSliceVar(&cmd.tools_files, "tools-files", []string{}, "Multiple file paths specifying tool configurations. Files will be merged. Cannot be used with --prebuilt, --tools-file, or --tools-folder.")
	flags.StringVar(&cmd.tools_folder, "tools-folder", "", "Directory path containing YAML tool configuration files. All .yaml and .yml files in the directory will be loaded and merged. Cannot be used with --prebuilt, --tools-file, or --tools-files.")
	flags.Var(&cmd.cfg.LogLevel, "log-level", "Specify the minimum level logged. Allowed: 'DEBUG', 'INFO', 'WARN', 'ERROR'.")
	flags.Var(&cmd.cfg.LoggingFormat, "logging-format", "Specify logging format to use. Allowed: 'standard' or 'JSON'.")
	flags.BoolVar(&cmd.cfg.TelemetryGCP, "telemetry-gcp", false, "Enable exporting directly to Google Cloud Monitoring.")
	flags.StringVar(&cmd.cfg.TelemetryOTLP, "telemetry-otlp", "", "Enable exporting using OpenTelemetry Protocol (OTLP) to the specified endpoint (e.g. 'http://127.0.0.1:4318')")
	flags.StringVar(&cmd.cfg.TelemetryServiceName, "telemetry-service-name", "toolbox", "Sets the value of the service.name resource attribute for telemetry data.")
	flags.StringVar(&cmd.prebuiltConfig, "prebuilt", "", "Use a prebuilt tool configuration by source type. Cannot be used with --tools-file. Allowed: 'alloydb-postgres', 'bigquery', 'cloud-sql-mysql', 'cloud-sql-postgres', 'cloud-sql-mssql', 'postgres', 'spanner', 'spanner-postgres'.")
	flags.BoolVar(&cmd.cfg.Stdio, "stdio", false, "Listens via MCP STDIO instead of acting as a remote HTTP server.")
	flags.BoolVar(&cmd.cfg.DisableReload, "disable-reload", false, "Disables dynamic reloading of tools file.")
	flags.BoolVar(&cmd.cfg.UI, "ui", false, "Launches the Toolbox UI web server.")

	// wrap RunE command so that we have access to original Command object
	cmd.RunE = func(*cobra.Command, []string) error { return run(cmd) }
@@ -126,9 +201,31 @@ type ToolsFile struct {
	Toolsets server.ToolsetConfigs `yaml:"toolsets"`
}

// parseEnv replaces environment variables ${ENV_NAME} with their values.
func parseEnv(input string) string {
	re := regexp.MustCompile(`\$\{(\w+)\}`)

	return re.ReplaceAllStringFunc(input, func(match string) string {
		parts := re.FindStringSubmatch(match)
		if len(parts) < 2 {
			// technically shouldn't happen
			return match
		}

		// extract the variable name
		variableName := parts[1]
		if value, found := os.LookupEnv(variableName); found {
			return value
		}
		return match
	})
}
// parseToolsFile parses the provided yaml into appropriate configs.
func parseToolsFile(ctx context.Context, raw []byte) (ToolsFile, error) {
	var toolsFile ToolsFile
	// Replace environment variables if found
	raw = []byte(parseEnv(string(raw)))
	// Parse contents
	err := yaml.UnmarshalContext(ctx, raw, &toolsFile, yaml.Strict())
	if err != nil {
@@ -137,7 +234,353 @@ func parseToolsFile(ctx context.Context, raw []byte) (ToolsFile, error) {
	return toolsFile, nil
}

// mergeToolsFiles merges multiple ToolsFile structs into one.
// Detects and raises errors for resource conflicts in sources, authServices, tools, and toolsets.
// All resource names (sources, authServices, tools, toolsets) must be unique across all files.
func mergeToolsFiles(files ...ToolsFile) (ToolsFile, error) {
	merged := ToolsFile{
		Sources:      make(server.SourceConfigs),
		AuthSources:  make(server.AuthServiceConfigs),
		AuthServices: make(server.AuthServiceConfigs),
		Tools:        make(server.ToolConfigs),
		Toolsets:     make(server.ToolsetConfigs),
	}

	var conflicts []string

	for fileIndex, file := range files {
		// Check for conflicts and merge sources
		for name, source := range file.Sources {
			if _, exists := merged.Sources[name]; exists {
				conflicts = append(conflicts, fmt.Sprintf("source '%s' (file #%d)", name, fileIndex+1))
			} else {
				merged.Sources[name] = source
			}
		}

		// Check for conflicts and merge authSources (deprecated, but still supported)
		for name, authSource := range file.AuthSources {
			if _, exists := merged.AuthSources[name]; exists {
				conflicts = append(conflicts, fmt.Sprintf("authSource '%s' (file #%d)", name, fileIndex+1))
			} else {
				merged.AuthSources[name] = authSource
			}
		}

		// Check for conflicts and merge authServices
		for name, authService := range file.AuthServices {
			if _, exists := merged.AuthServices[name]; exists {
				conflicts = append(conflicts, fmt.Sprintf("authService '%s' (file #%d)", name, fileIndex+1))
			} else {
				merged.AuthServices[name] = authService
			}
		}

		// Check for conflicts and merge tools
		for name, tool := range file.Tools {
			if _, exists := merged.Tools[name]; exists {
				conflicts = append(conflicts, fmt.Sprintf("tool '%s' (file #%d)", name, fileIndex+1))
			} else {
				merged.Tools[name] = tool
			}
		}

		// Check for conflicts and merge toolsets
		for name, toolset := range file.Toolsets {
			if _, exists := merged.Toolsets[name]; exists {
				conflicts = append(conflicts, fmt.Sprintf("toolset '%s' (file #%d)", name, fileIndex+1))
			} else {
				merged.Toolsets[name] = toolset
			}
		}
	}

	// If conflicts were detected, return an error
	if len(conflicts) > 0 {
		return ToolsFile{}, fmt.Errorf("resource conflicts detected:\n - %s\n\nPlease ensure each source, authService, tool, and toolset has a unique name across all files", strings.Join(conflicts, "\n - "))
	}

	return merged, nil
}
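The merge policy above is first-wins per name, with every duplicate recorded as a conflict and reported in one error rather than silently overwritten. The same idea in miniature, over plain string maps (types simplified from the real config structs):

```go
package main

import (
	"fmt"
	"strings"
)

// mergeNamed merges name->config maps, recording a conflict for any
// name seen twice instead of overwriting -- the policy mergeToolsFiles
// applies to sources, authServices, tools, and toolsets.
func mergeNamed(files ...map[string]string) (map[string]string, error) {
	merged := map[string]string{}
	var conflicts []string
	for i, file := range files {
		for name, cfg := range file {
			if _, exists := merged[name]; exists {
				conflicts = append(conflicts, fmt.Sprintf("'%s' (file #%d)", name, i+1))
			} else {
				merged[name] = cfg
			}
		}
	}
	if len(conflicts) > 0 {
		return nil, fmt.Errorf("resource conflicts detected: %s", strings.Join(conflicts, ", "))
	}
	return merged, nil
}

func main() {
	_, err := mergeNamed(
		map[string]string{"my-pg": "postgres"},
		map[string]string{"my-pg": "cloud-sql-pg"},
	)
	fmt.Println(err)
	// resource conflicts detected: 'my-pg' (file #2)
}
```

Collecting all conflicts before failing means one run surfaces every duplicate name, instead of forcing the user to fix them one at a time.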
// loadAndMergeToolsFiles loads multiple YAML files and merges them
func loadAndMergeToolsFiles(ctx context.Context, filePaths []string) (ToolsFile, error) {
	var toolsFiles []ToolsFile

	for _, filePath := range filePaths {
		buf, err := os.ReadFile(filePath)
		if err != nil {
			return ToolsFile{}, fmt.Errorf("unable to read tool file at %q: %w", filePath, err)
		}

		toolsFile, err := parseToolsFile(ctx, buf)
		if err != nil {
			return ToolsFile{}, fmt.Errorf("unable to parse tool file at %q: %w", filePath, err)
		}

		toolsFiles = append(toolsFiles, toolsFile)
	}

	mergedFile, err := mergeToolsFiles(toolsFiles...)
	if err != nil {
		return ToolsFile{}, fmt.Errorf("unable to merge tools files: %w", err)
	}

	return mergedFile, nil
}
// loadAndMergeToolsFolder loads all YAML files from a directory and merges them
func loadAndMergeToolsFolder(ctx context.Context, folderPath string) (ToolsFile, error) {
	// Check if directory exists
	info, err := os.Stat(folderPath)
	if err != nil {
		return ToolsFile{}, fmt.Errorf("unable to access tools folder at %q: %w", folderPath, err)
	}
	if !info.IsDir() {
		return ToolsFile{}, fmt.Errorf("path %q is not a directory", folderPath)
	}

	// Find all YAML files in the directory
	pattern := filepath.Join(folderPath, "*.yaml")
	yamlFiles, err := filepath.Glob(pattern)
	if err != nil {
		return ToolsFile{}, fmt.Errorf("error finding YAML files in %q: %w", folderPath, err)
	}

	// Also find .yml files
	ymlPattern := filepath.Join(folderPath, "*.yml")
	ymlFiles, err := filepath.Glob(ymlPattern)
	if err != nil {
		return ToolsFile{}, fmt.Errorf("error finding YML files in %q: %w", folderPath, err)
	}

	// Combine both file lists
	allFiles := append(yamlFiles, ymlFiles...)

	if len(allFiles) == 0 {
		return ToolsFile{}, fmt.Errorf("no YAML files found in directory %q", folderPath)
	}

	// Use existing loadAndMergeToolsFiles function
	return loadAndMergeToolsFiles(ctx, allFiles)
}
func handleDynamicReload(ctx context.Context, toolsFile ToolsFile, s *server.Server) error {
	logger, err := util.LoggerFromContext(ctx)
	if err != nil {
		panic(err)
	}

	sourcesMap, authServicesMap, toolsMap, toolsetsMap, err := validateReloadEdits(ctx, toolsFile)
	if err != nil {
		errMsg := fmt.Errorf("unable to validate reloaded edits: %w", err)
		logger.WarnContext(ctx, errMsg.Error())
		return err
	}

	s.ResourceMgr.SetResources(sourcesMap, authServicesMap, toolsMap, toolsetsMap)

	return nil
}

// validateReloadEdits checks that the reloaded tools file configs can be initialized without failing
func validateReloadEdits(
	ctx context.Context, toolsFile ToolsFile,
) (map[string]sources.Source, map[string]auth.AuthService, map[string]tools.Tool, map[string]tools.Toolset, error,
) {
	logger, err := util.LoggerFromContext(ctx)
	if err != nil {
		panic(err)
	}

	instrumentation, err := util.InstrumentationFromContext(ctx)
	if err != nil {
		panic(err)
	}

	logger.DebugContext(ctx, "Attempting to parse and validate reloaded tools file.")

	ctx, span := instrumentation.Tracer.Start(ctx, "toolbox/server/reload")
	defer span.End()

	reloadedConfig := server.ServerConfig{
		Version:            versionString,
		SourceConfigs:      toolsFile.Sources,
		AuthServiceConfigs: toolsFile.AuthServices,
		ToolConfigs:        toolsFile.Tools,
		ToolsetConfigs:     toolsFile.Toolsets,
	}

	sourcesMap, authServicesMap, toolsMap, toolsetsMap, err := server.InitializeConfigs(ctx, reloadedConfig)
	if err != nil {
		errMsg := fmt.Errorf("unable to initialize reloaded configs: %w", err)
		logger.WarnContext(ctx, errMsg.Error())
		return nil, nil, nil, nil, err
	}

	return sourcesMap, authServicesMap, toolsMap, toolsetsMap, nil
}
// watchChanges checks for changes in the provided yaml tools file(s) or folder.
func watchChanges(ctx context.Context, watchDirs map[string]bool, watchedFiles map[string]bool, s *server.Server) {
	logger, err := util.LoggerFromContext(ctx)
	if err != nil {
		panic(err)
	}

	w, err := fsnotify.NewWatcher()
	if err != nil {
		logger.WarnContext(ctx, fmt.Sprintf("error setting up new watcher: %s", err))
		return
	}
	defer w.Close()

	watchingFolder := false
	var folderToWatch string

	// if watchedFiles is empty, it indicates that the user passed an entire folder instead
	if len(watchedFiles) == 0 {
		watchingFolder = true

		// validate that watchDirs only has a single element
		if len(watchDirs) > 1 {
			logger.WarnContext(ctx, "error setting watcher, expected single tools folder if no file(s) are defined.")
			return
		}

		for onlyKey := range watchDirs {
			folderToWatch = onlyKey
			break
		}
	}

	for dir := range watchDirs {
		err := w.Add(dir)
		if err != nil {
			logger.WarnContext(ctx, fmt.Sprintf("Error adding path %s to watcher: %s", dir, err))
			break
		}
		logger.DebugContext(ctx, fmt.Sprintf("Added directory %s to watcher.", dir))
	}

	// debounce timer is used to prevent multiple writes triggering multiple reloads
	debounceDelay := 100 * time.Millisecond
	debounce := time.NewTimer(1 * time.Minute)
	debounce.Stop()

	for {
		select {
		case <-ctx.Done():
			logger.DebugContext(ctx, "file watcher context cancelled")
			return
		case err, ok := <-w.Errors:
			if !ok {
				logger.WarnContext(ctx, "file watcher was closed unexpectedly")
				return
			}
			if err != nil {
				logger.WarnContext(ctx, fmt.Sprintf("file watcher error: %s", err))
				return
			}

		case e, ok := <-w.Events:
			if !ok {
				logger.WarnContext(ctx, "file watcher already closed")
				return
			}

			// only check for write events which indicate the user saved a new tools file
			if !e.Has(fsnotify.Write) {
				continue
			}

			cleanedFilename := filepath.Clean(e.Name)
			logger.DebugContext(ctx, fmt.Sprintf("WRITE event detected in %s", cleanedFilename))

			folderChanged := watchingFolder &&
				(strings.HasSuffix(cleanedFilename, ".yaml") || strings.HasSuffix(cleanedFilename, ".yml"))

			if folderChanged || watchedFiles[cleanedFilename] {
				// indicates the write event is on a relevant file
				debounce.Reset(debounceDelay)
			}

		case <-debounce.C:
			debounce.Stop()
			var reloadedToolsFile ToolsFile

			if watchingFolder {
				logger.DebugContext(ctx, "Reloading tools folder.")
				reloadedToolsFile, err = loadAndMergeToolsFolder(ctx, folderToWatch)
				if err != nil {
					logger.WarnContext(ctx, fmt.Sprintf("error loading tools folder: %s", err))
					continue
				}
			} else {
				logger.DebugContext(ctx, "Reloading tools file(s).")
				reloadedToolsFile, err = loadAndMergeToolsFiles(ctx, slices.Collect(maps.Keys(watchedFiles)))
				if err != nil {
					logger.WarnContext(ctx, fmt.Sprintf("error loading tools files: %s", err))
					continue
				}
			}

			err = handleDynamicReload(ctx, reloadedToolsFile, s)
			if err != nil {
				errMsg := fmt.Errorf("unable to apply reloaded tools configuration: %w", err)
				logger.WarnContext(ctx, errMsg.Error())
				continue
			}
		}
	}
}
// updateLogLevel checks if Toolbox has to update the existing log level set by users.
// stdio doesn't support "debug" and "info" logs.
func updateLogLevel(stdio bool, logLevel string) bool {
	if stdio {
		switch strings.ToUpper(logLevel) {
		case log.Debug, log.Info:
			return true
		default:
			return false
		}
	}
	return false
}
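Self-contained, the rule is: in stdio mode, a configured level of DEBUG or INFO gets raised (per the comment above, stdio doesn't support those levels), and everything else is left alone. A sketch with simplified level constants, since the real ones live in the `log` package:

```go
package main

import (
	"fmt"
	"strings"
)

// forceWarn reports whether the configured level must be raised:
// only in stdio mode, and only for DEBUG and INFO. Mirrors
// updateLogLevel above with string constants in place of log.Debug/log.Info.
func forceWarn(stdio bool, logLevel string) bool {
	if !stdio {
		return false
	}
	switch strings.ToUpper(logLevel) {
	case "DEBUG", "INFO":
		return true
	default:
		return false
	}
}

func main() {
	fmt.Println(forceWarn(true, "debug"))  // true
	fmt.Println(forceWarn(true, "error"))  // false
	fmt.Println(forceWarn(false, "debug")) // false
}
```

The caller in `run` uses this to bump the effective level to WARN before any logging starts.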
func resolveWatcherInputs(toolsFile string, toolsFiles []string, toolsFolder string) (map[string]bool, map[string]bool) {
	var relevantFiles []string

	// map for efficiently checking if a file is relevant
	watchedFiles := make(map[string]bool)

	// dirs that will be added to the watcher (fsnotify prefers watching a directory and then filtering for the file)
	watchDirs := make(map[string]bool)

	if len(toolsFiles) > 0 {
		relevantFiles = toolsFiles
	} else if toolsFolder != "" {
		watchDirs[filepath.Clean(toolsFolder)] = true
	} else {
		relevantFiles = []string{toolsFile}
	}

	// extract parent dir for relevant files and dedup
	for _, f := range relevantFiles {
		cleanFile := filepath.Clean(f)
		watchedFiles[cleanFile] = true
		watchDirs[filepath.Dir(cleanFile)] = true
	}

	return watchDirs, watchedFiles
}
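Deduplicating the parent directories of the watched files (fsnotify watches directories, not individual files) comes down to two maps keyed by cleaned paths. A standalone sketch of that part, with an illustrative helper name:

```go
package main

import (
	"fmt"
	"path/filepath"
)

// watchTargets records each file under its cleaned path and collects
// the distinct parent directories to hand to the watcher, as
// resolveWatcherInputs does for --tools-file / --tools-files.
func watchTargets(files []string) (dirs, watched map[string]bool) {
	dirs = make(map[string]bool)
	watched = make(map[string]bool)
	for _, f := range files {
		clean := filepath.Clean(f)
		watched[clean] = true
		dirs[filepath.Dir(clean)] = true
	}
	return dirs, watched
}

func main() {
	// Two files share "conf"; "./tools.yaml" cleans to "tools.yaml" with parent ".".
	dirs, watched := watchTargets([]string{"conf/a.yaml", "conf/b.yaml", "./tools.yaml"})
	fmt.Println(len(dirs), len(watched)) // 2 3
}
```

Cleaning with `filepath.Clean` before keying matters: event paths from the watcher are also cleaned before lookup, so `./tools.yaml` and `tools.yaml` hit the same map entry.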
func run(cmd *Command) error {
	if updateLogLevel(cmd.cfg.Stdio, cmd.cfg.LogLevel.String()) {
		cmd.cfg.LogLevel = server.StringLevel(log.Warn)
	}

	ctx, cancel := context.WithCancel(cmd.Context())
	defer cancel()

@@ -176,13 +619,13 @@ func run(cmd *Command) error {
		}
		cmd.logger = logger
	default:
		return fmt.Errorf("logging format invalid.")
		return fmt.Errorf("logging format invalid")
	}

	ctx = util.WithLogger(ctx, cmd.logger)

	// Set up OpenTelemetry
	otelShutdown, err := telemetry.SetupOTel(ctx, cmd.Command.Version, cmd.cfg.TelemetryOTLP, cmd.cfg.TelemetryGCP, cmd.cfg.TelemetryServiceName)
	otelShutdown, err := telemetry.SetupOTel(ctx, cmd.cfg.Version, cmd.cfg.TelemetryOTLP, cmd.cfg.TelemetryGCP, cmd.cfg.TelemetryServiceName)
	if err != nil {
		errMsg := fmt.Errorf("error setting up OpenTelemetry: %w", err)
		cmd.logger.ErrorContext(ctx, errMsg.Error())
@@ -196,14 +639,86 @@ func run(cmd *Command) error {
		}
	}()
	// Read tool file contents
	buf, err := os.ReadFile(cmd.tools_file)
	if err != nil {
		errMsg := fmt.Errorf("unable to read tool file at %q: %w", cmd.tools_file, err)
		cmd.logger.ErrorContext(ctx, errMsg.Error())
		return errMsg
	var toolsFile ToolsFile

	if cmd.prebuiltConfig != "" {
		// Make sure --prebuilt and --tools-file/--tools-files/--tools-folder flags are mutually exclusive
		if cmd.tools_file != "" || len(cmd.tools_files) > 0 || cmd.tools_folder != "" {
			errMsg := fmt.Errorf("--prebuilt and --tools-file/--tools-files/--tools-folder flags cannot be used simultaneously")
			cmd.logger.ErrorContext(ctx, errMsg.Error())
			return errMsg
		}
		// Use prebuilt tools
		buf, err := prebuiltconfigs.Get(cmd.prebuiltConfig)
		if err != nil {
			cmd.logger.ErrorContext(ctx, err.Error())
			return err
		}
		logMsg := fmt.Sprint("Using prebuilt tool configuration for ", cmd.prebuiltConfig)
		cmd.logger.InfoContext(ctx, logMsg)
		// Append prebuilt.source to Version string for the User Agent
		cmd.cfg.Version += "+prebuilt." + cmd.prebuiltConfig

		toolsFile, err = parseToolsFile(ctx, buf)
		if err != nil {
			errMsg := fmt.Errorf("unable to parse prebuilt tool configuration: %w", err)
			cmd.logger.ErrorContext(ctx, errMsg.Error())
			return errMsg
		}
	} else if len(cmd.tools_files) > 0 {
		// Make sure --tools-file, --tools-files, and --tools-folder flags are mutually exclusive
		if cmd.tools_file != "" || cmd.tools_folder != "" {
			errMsg := fmt.Errorf("--tools-file, --tools-files, and --tools-folder flags cannot be used simultaneously")
			cmd.logger.ErrorContext(ctx, errMsg.Error())
			return errMsg
		}

		// Use multiple tools files
		cmd.logger.InfoContext(ctx, fmt.Sprintf("Loading and merging %d tool configuration files", len(cmd.tools_files)))
		var err error
		toolsFile, err = loadAndMergeToolsFiles(ctx, cmd.tools_files)
		if err != nil {
			cmd.logger.ErrorContext(ctx, err.Error())
			return err
		}
	} else if cmd.tools_folder != "" {
		// Make sure --tools-folder and other flags are mutually exclusive
		if cmd.tools_file != "" || len(cmd.tools_files) > 0 {
			errMsg := fmt.Errorf("--tools-file, --tools-files, and --tools-folder flags cannot be used simultaneously")
			cmd.logger.ErrorContext(ctx, errMsg.Error())
			return errMsg
		}

		// Use tools folder
		cmd.logger.InfoContext(ctx, fmt.Sprintf("Loading and merging all YAML files from directory: %s", cmd.tools_folder))
		var err error
		toolsFile, err = loadAndMergeToolsFolder(ctx, cmd.tools_folder)
		if err != nil {
			cmd.logger.ErrorContext(ctx, err.Error())
			return err
		}
	} else {
		// Set default value of tools-file flag to tools.yaml
		if cmd.tools_file == "" {
			cmd.tools_file = "tools.yaml"
		}

		// Read single tool file contents
		buf, err := os.ReadFile(cmd.tools_file)
		if err != nil {
			errMsg := fmt.Errorf("unable to read tool file at %q: %w", cmd.tools_file, err)
			cmd.logger.ErrorContext(ctx, errMsg.Error())
			return errMsg
		}

		toolsFile, err = parseToolsFile(ctx, buf)
		if err != nil {
			errMsg := fmt.Errorf("unable to parse tool file at %q: %w", cmd.tools_file, err)
			cmd.logger.ErrorContext(ctx, errMsg.Error())
			return errMsg
		}
	}
	toolsFile, err := parseToolsFile(ctx, buf)
cmd.cfg.SourceConfigs, cmd.cfg.AuthServiceConfigs, cmd.cfg.ToolConfigs, cmd.cfg.ToolsetConfigs = toolsFile.Sources, toolsFile.AuthServices, toolsFile.Tools, toolsFile.Toolsets
|
||||
authSourceConfigs := toolsFile.AuthSources
|
||||
if authSourceConfigs != nil {
|
||||
@@ -216,31 +731,60 @@ func run(cmd *Command) error {
return errMsg
}

instrumentation, err := telemetry.CreateTelemetryInstrumentation(versionString)
if err != nil {
errMsg := fmt.Errorf("unable to create telemetry instrumentation: %w", err)
cmd.logger.ErrorContext(ctx, errMsg.Error())
return errMsg
}

ctx = util.WithInstrumentation(ctx, instrumentation)

// start server
s, err := server.NewServer(ctx, cmd.cfg, cmd.logger)
s, err := server.NewServer(ctx, cmd.cfg)
if err != nil {
errMsg := fmt.Errorf("toolbox failed to initialize: %w", err)
cmd.logger.ErrorContext(ctx, errMsg.Error())
return errMsg
}

err = s.Listen(ctx)
if err != nil {
errMsg := fmt.Errorf("toolbox failed to start listener: %w", err)
cmd.logger.ErrorContext(ctx, errMsg.Error())
return errMsg
}
cmd.logger.InfoContext(ctx, "Server ready to serve!")

// run server in background
srvErr := make(chan error)
go func() {
defer close(srvErr)
err = s.Serve()
if cmd.cfg.Stdio {
go func() {
defer close(srvErr)
err = s.ServeStdio(ctx, cmd.inStream, cmd.outStream)
if err != nil {
srvErr <- err
}
}()
} else {
err = s.Listen(ctx)
if err != nil {
srvErr <- err
errMsg := fmt.Errorf("toolbox failed to start listener: %w", err)
cmd.logger.ErrorContext(ctx, errMsg.Error())
return errMsg
}
}()
cmd.logger.InfoContext(ctx, "Server ready to serve!")
if cmd.cfg.UI {
cmd.logger.InfoContext(ctx, "Toolbox UI is up and running at: http://localhost:5000/ui")
}

go func() {
defer close(srvErr)
err = s.Serve(ctx)
if err != nil {
srvErr <- err
}
}()
}

watchDirs, watchedFiles := resolveWatcherInputs(cmd.tools_file, cmd.tools_files, cmd.tools_folder)

if !cmd.cfg.DisableReload {
// start watching the file(s) or folder for changes to trigger dynamic reloading
go watchChanges(ctx, watchDirs, watchedFiles, s)
}

// wait for either the server to error out or the command's context to be canceled
select {
@@ -256,7 +800,7 @@ func run(cmd *Command) error {
cmd.logger.WarnContext(shutdownContext, "Shutting down gracefully...")
err := s.Shutdown(shutdownContext)
if err == context.DeadlineExceeded {
return fmt.Errorf("graceful shutdown timed out... forcing exit.")
return fmt.Errorf("graceful shutdown timed out... forcing exit")
}
}
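The serve/shutdown flow above follows a common Go pattern: run the server in a goroutine that feeds an error channel, then `select` on that channel against context cancellation. A minimal, self-contained sketch of that shape (the `serve` function here is a hypothetical stand-in for `s.Serve`/`s.ServeStdio`, not the Toolbox implementation; it fails immediately so the error branch is exercised):

```go
package main

import (
	"context"
	"errors"
	"fmt"
	"time"
)

// serve stands in for the real server's Serve method; it fails
// immediately so the error path of the select below is exercised.
func serve() error {
	return errors.New("listener closed")
}

func main() {
	ctx, cancel := context.WithTimeout(context.Background(), time.Second)
	defer cancel()

	// Run the server in the background, feeding failures into a channel,
	// the same shape run() uses above.
	srvErr := make(chan error, 1)
	go func() {
		defer close(srvErr)
		if err := serve(); err != nil {
			srvErr <- err
		}
	}()

	// Wait for either a server error or the command's context to be canceled.
	select {
	case err := <-srvErr:
		fmt.Println("server error:", err)
	case <-ctx.Done():
		fmt.Println("shutting down gracefully")
	}
}
```

With a real server, the `ctx.Done()` branch is where the graceful `Shutdown` with a deadline would go, mirroring the `context.DeadlineExceeded` check in the hunk above.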
cmd/root_test.go (768 changes)
@@ -16,25 +16,41 @@ package cmd

import (
"bytes"
"context"
_ "embed"
"fmt"
"io"
"os"
"path"
"path/filepath"
"regexp"
"runtime"
"strings"
"testing"
"time"

"github.com/google/go-cmp/cmp"

"github.com/googleapis/genai-toolbox/internal/auth/google"
"github.com/googleapis/genai-toolbox/internal/log"
"github.com/googleapis/genai-toolbox/internal/prebuiltconfigs"
"github.com/googleapis/genai-toolbox/internal/server"
cloudsqlpgsrc "github.com/googleapis/genai-toolbox/internal/sources/cloudsqlpg"
httpsrc "github.com/googleapis/genai-toolbox/internal/sources/http"
"github.com/googleapis/genai-toolbox/internal/telemetry"
"github.com/googleapis/genai-toolbox/internal/testutils"
"github.com/googleapis/genai-toolbox/internal/tools"
"github.com/googleapis/genai-toolbox/internal/tools/postgressql"
"github.com/googleapis/genai-toolbox/internal/tools/http"
"github.com/googleapis/genai-toolbox/internal/tools/postgres/postgressql"
"github.com/googleapis/genai-toolbox/internal/util"
"github.com/spf13/cobra"
)

func withDefaults(c server.ServerConfig) server.ServerConfig {
data, _ := os.ReadFile("version.txt")
c.Version = strings.TrimSpace(string(data))
version := strings.TrimSpace(string(data)) // Preserving 'data', new var for clarity
c.Version = version + "+" + strings.Join([]string{"dev", runtime.GOOS, runtime.GOARCH}, ".")

if c.Address == "" {
c.Address = "127.0.0.1"
}
@@ -161,6 +177,20 @@ func TestServerConfigFlags(t *testing.T) {
TelemetryServiceName: "toolbox-custom",
}),
},
{
desc: "stdio",
args: []string{"--stdio"},
want: withDefaults(server.ServerConfig{
Stdio: true,
}),
},
{
desc: "disable reload",
args: []string{"--disable-reload"},
want: withDefaults(server.ServerConfig{
DisableReload: true,
}),
},
}
for _, tc := range tcs {
t.Run(tc.desc, func(t *testing.T) {
@@ -185,18 +215,118 @@ func TestToolFileFlag(t *testing.T) {
{
desc: "default value",
args: []string{},
want: "tools.yaml",
want: "",
},
{
desc: "foo file",
args: []string{"--tools_file", "foo.yaml"},
args: []string{"--tools-file", "foo.yaml"},
want: "foo.yaml",
},
{
desc: "address long",
args: []string{"--tools_file", "bar.yaml"},
args: []string{"--tools-file", "bar.yaml"},
want: "bar.yaml",
},
{
desc: "deprecated flag",
args: []string{"--tools_file", "foo.yaml"},
want: "foo.yaml",
},
}
for _, tc := range tcs {
t.Run(tc.desc, func(t *testing.T) {
c, _, err := invokeCommand(tc.args)
if err != nil {
t.Fatalf("unexpected error invoking command: %s", err)
}
if c.tools_file != tc.want {
t.Fatalf("got %v, want %v", c.tools_file, tc.want)
}
})
}
}

func TestToolsFilesFlag(t *testing.T) {
tcs := []struct {
desc string
args []string
want []string
}{
{
desc: "no value",
args: []string{},
want: []string{},
},
{
desc: "single file",
args: []string{"--tools-files", "foo.yaml"},
want: []string{"foo.yaml"},
},
{
desc: "multiple files",
args: []string{"--tools-files", "foo.yaml,bar.yaml"},
want: []string{"foo.yaml", "bar.yaml"},
},
}
for _, tc := range tcs {
t.Run(tc.desc, func(t *testing.T) {
c, _, err := invokeCommand(tc.args)
if err != nil {
t.Fatalf("unexpected error invoking command: %s", err)
}
if diff := cmp.Diff(c.tools_files, tc.want); diff != "" {
t.Fatalf("got %v, want %v", c.tools_files, tc.want)
}
})
}
}

func TestToolsFolderFlag(t *testing.T) {
tcs := []struct {
desc string
args []string
want string
}{
{
desc: "no value",
args: []string{},
want: "",
},
{
desc: "folder set",
args: []string{"--tools-folder", "test-folder"},
want: "test-folder",
},
}
for _, tc := range tcs {
t.Run(tc.desc, func(t *testing.T) {
c, _, err := invokeCommand(tc.args)
if err != nil {
t.Fatalf("unexpected error invoking command: %s", err)
}
if c.tools_folder != tc.want {
t.Fatalf("got %v, want %v", c.tools_folder, tc.want)
}
})
}
}

func TestPrebuiltFlag(t *testing.T) {
tcs := []struct {
desc string
args []string
want string
}{
{
desc: "default value",
args: []string{},
want: "",
},
{
desc: "custom pre built flag",
args: []string{"--prebuilt", "alloydb"},
want: "alloydb",
},
}
for _, tc := range tcs {
t.Run(tc.desc, func(t *testing.T) {
@@ -313,13 +443,14 @@ func TestParseToolFile(t *testing.T) {
Tools: server.ToolConfigs{
"example_tool": postgressql.Config{
Name: "example_tool",
Kind: postgressql.ToolKind,
Kind: "postgres-sql",
Source: "my-pg-instance",
Description: "some description",
Statement: "SELECT * FROM SQL_STATEMENT;\n",
Parameters: []tools.Parameter{
tools.NewStringParameter("country", "some description"),
},
AuthRequired: []string{},
},
},
Toolsets: server.ToolsetConfigs{
@@ -442,11 +573,12 @@ func TestParseToolFileWithAuth(t *testing.T) {
},
Tools: server.ToolConfigs{
"example_tool": postgressql.Config{
Name: "example_tool",
Kind: postgressql.ToolKind,
Source: "my-pg-instance",
Description: "some description",
Statement: "SELECT * FROM SQL_STATEMENT;\n",
Name: "example_tool",
Kind: "postgres-sql",
Source: "my-pg-instance",
Description: "some description",
Statement: "SELECT * FROM SQL_STATEMENT;\n",
AuthRequired: []string{},
Parameters: []tools.Parameter{
tools.NewStringParameter("country", "some description"),
tools.NewIntParameterWithAuth("id", "user id", []tools.ParamAuthService{{Name: "my-google-service", Field: "user_id"}}),
@@ -540,11 +672,113 @@ func TestParseToolFileWithAuth(t *testing.T) {
},
Tools: server.ToolConfigs{
"example_tool": postgressql.Config{
Name: "example_tool",
Kind: postgressql.ToolKind,
Source: "my-pg-instance",
Description: "some description",
Statement: "SELECT * FROM SQL_STATEMENT;\n",
Name: "example_tool",
Kind: "postgres-sql",
Source: "my-pg-instance",
Description: "some description",
Statement: "SELECT * FROM SQL_STATEMENT;\n",
AuthRequired: []string{},
Parameters: []tools.Parameter{
tools.NewStringParameter("country", "some description"),
tools.NewIntParameterWithAuth("id", "user id", []tools.ParamAuthService{{Name: "my-google-service", Field: "user_id"}}),
tools.NewStringParameterWithAuth("email", "user email", []tools.ParamAuthService{{Name: "my-google-service", Field: "email"}, {Name: "other-google-service", Field: "other_email"}}),
},
},
},
Toolsets: server.ToolsetConfigs{
"example_toolset": tools.ToolsetConfig{
Name: "example_toolset",
ToolNames: []string{"example_tool"},
},
},
},
},
{
description: "basic example with authRequired",
in: `
sources:
my-pg-instance:
kind: cloud-sql-postgres
project: my-project
region: my-region
instance: my-instance
database: my_db
user: my_user
password: my_pass
authServices:
my-google-service:
kind: google
clientId: my-client-id
other-google-service:
kind: google
clientId: other-client-id

tools:
example_tool:
kind: postgres-sql
source: my-pg-instance
description: some description
statement: |
SELECT * FROM SQL_STATEMENT;
authRequired:
- my-google-service
parameters:
- name: country
type: string
description: some description
- name: id
type: integer
description: user id
authServices:
- name: my-google-service
field: user_id
- name: email
type: string
description: user email
authServices:
- name: my-google-service
field: email
- name: other-google-service
field: other_email

toolsets:
example_toolset:
- example_tool
`,
wantToolsFile: ToolsFile{
Sources: server.SourceConfigs{
"my-pg-instance": cloudsqlpgsrc.Config{
Name: "my-pg-instance",
Kind: cloudsqlpgsrc.SourceKind,
Project: "my-project",
Region: "my-region",
Instance: "my-instance",
IPType: "public",
Database: "my_db",
User: "my_user",
Password: "my_pass",
},
},
AuthServices: server.AuthServiceConfigs{
"my-google-service": google.Config{
Name: "my-google-service",
Kind: google.AuthServiceKind,
ClientID: "my-client-id",
},
"other-google-service": google.Config{
Name: "other-google-service",
Kind: google.AuthServiceKind,
ClientID: "other-client-id",
},
},
Tools: server.ToolConfigs{
"example_tool": postgressql.Config{
Name: "example_tool",
Kind: "postgres-sql",
Source: "my-pg-instance",
Description: "some description",
Statement: "SELECT * FROM SQL_STATEMENT;\n",
AuthRequired: []string{"my-google-service"},
Parameters: []tools.Parameter{
tools.NewStringParameter("country", "some description"),
tools.NewIntParameterWithAuth("id", "user id", []tools.ParamAuthService{{Name: "my-google-service", Field: "user_id"}}),
@@ -583,3 +817,505 @@ func TestParseToolFileWithAuth(t *testing.T) {
}

}

func TestEnvVarReplacement(t *testing.T) {
ctx, err := testutils.ContextWithNewLogger()
os.Setenv("TestHeader", "ACTUAL_HEADER")
os.Setenv("API_KEY", "ACTUAL_API_KEY")
os.Setenv("clientId", "ACTUAL_CLIENT_ID")
os.Setenv("clientId2", "ACTUAL_CLIENT_ID_2")
os.Setenv("toolset_name", "ACTUAL_TOOLSET_NAME")
os.Setenv("cat_string", "cat")
os.Setenv("food_string", "food")

if err != nil {
t.Fatalf("unexpected error: %s", err)
}
tcs := []struct {
description string
in string
wantToolsFile ToolsFile
}{
{
description: "file with env var example",
in: `
sources:
my-http-instance:
kind: http
baseUrl: http://test_server/
timeout: 10s
headers:
Authorization: ${TestHeader}
queryParams:
api-key: ${API_KEY}
authServices:
my-google-service:
kind: google
clientId: ${clientId}
other-google-service:
kind: google
clientId: ${clientId2}

tools:
example_tool:
kind: http
source: my-instance
method: GET
path: "search?name=alice&pet=${cat_string}"
description: some description
authRequired:
- my-google-auth-service
- other-auth-service
queryParams:
- name: country
type: string
description: some description
authServices:
- name: my-google-auth-service
field: user_id
- name: other-auth-service
field: user_id
requestBody: |
{
"age": {{.age}},
"city": "{{.city}}",
"food": "${food_string}",
"other": "$OTHER"
}
bodyParams:
- name: age
type: integer
description: age num
- name: city
type: string
description: city string
headers:
Authorization: API_KEY
Content-Type: application/json
headerParams:
- name: Language
type: string
description: language string

toolsets:
${toolset_name}:
- example_tool
`,
wantToolsFile: ToolsFile{
Sources: server.SourceConfigs{
"my-http-instance": httpsrc.Config{
Name: "my-http-instance",
Kind: httpsrc.SourceKind,
BaseURL: "http://test_server/",
Timeout: "10s",
DefaultHeaders: map[string]string{"Authorization": "ACTUAL_HEADER"},
QueryParams: map[string]string{"api-key": "ACTUAL_API_KEY"},
},
},
AuthServices: server.AuthServiceConfigs{
"my-google-service": google.Config{
Name: "my-google-service",
Kind: google.AuthServiceKind,
ClientID: "ACTUAL_CLIENT_ID",
},
"other-google-service": google.Config{
Name: "other-google-service",
Kind: google.AuthServiceKind,
ClientID: "ACTUAL_CLIENT_ID_2",
},
},
Tools: server.ToolConfigs{
"example_tool": http.Config{
Name: "example_tool",
Kind: "http",
Source: "my-instance",
Method: "GET",
Path: "search?name=alice&pet=cat",
Description: "some description",
AuthRequired: []string{"my-google-auth-service", "other-auth-service"},
QueryParams: []tools.Parameter{
tools.NewStringParameterWithAuth("country", "some description",
[]tools.ParamAuthService{{Name: "my-google-auth-service", Field: "user_id"},
{Name: "other-auth-service", Field: "user_id"}}),
},
RequestBody: `{
"age": {{.age}},
"city": "{{.city}}",
"food": "food",
"other": "$OTHER"
}
`,
BodyParams: []tools.Parameter{tools.NewIntParameter("age", "age num"), tools.NewStringParameter("city", "city string")},
Headers: map[string]string{"Authorization": "API_KEY", "Content-Type": "application/json"},
HeaderParams: []tools.Parameter{tools.NewStringParameter("Language", "language string")},
},
},
Toolsets: server.ToolsetConfigs{
"ACTUAL_TOOLSET_NAME": tools.ToolsetConfig{
Name: "ACTUAL_TOOLSET_NAME",
ToolNames: []string{"example_tool"},
},
},
},
},
}
for _, tc := range tcs {
t.Run(tc.description, func(t *testing.T) {
toolsFile, err := parseToolsFile(ctx, testutils.FormatYaml(tc.in))
if err != nil {
t.Fatalf("failed to parse input: %v", err)
}
if diff := cmp.Diff(tc.wantToolsFile.Sources, toolsFile.Sources); diff != "" {
t.Fatalf("incorrect sources parse: diff %v", diff)
}
if diff := cmp.Diff(tc.wantToolsFile.AuthServices, toolsFile.AuthServices); diff != "" {
t.Fatalf("incorrect authServices parse: diff %v", diff)
}
if diff := cmp.Diff(tc.wantToolsFile.Tools, toolsFile.Tools); diff != "" {
t.Fatalf("incorrect tools parse: diff %v", diff)
}
if diff := cmp.Diff(tc.wantToolsFile.Toolsets, toolsFile.Toolsets); diff != "" {
t.Fatalf("incorrect toolsets parse: diff %v", diff)
}
})
}

}

// normalizeFilepaths is a helper function to allow same filepath formats for Mac and Windows.
// this prevents needing multiple "want" cases for TestResolveWatcherInputs
func normalizeFilepaths(m map[string]bool) map[string]bool {
newMap := make(map[string]bool)
for k, v := range m {
newMap[filepath.ToSlash(k)] = v
}
return newMap
}

func TestResolveWatcherInputs(t *testing.T) {
tcs := []struct {
description string
toolsFile string
toolsFiles []string
toolsFolder string
wantWatchDirs map[string]bool
wantWatchedFiles map[string]bool
}{
{
description: "single tools file",
toolsFile: "tools_folder/example_tools.yaml",
toolsFiles: []string{},
toolsFolder: "",
wantWatchDirs: map[string]bool{"tools_folder": true},
wantWatchedFiles: map[string]bool{"tools_folder/example_tools.yaml": true},
},
{
description: "default tools file (root dir)",
toolsFile: "tools.yaml",
toolsFiles: []string{},
toolsFolder: "",
wantWatchDirs: map[string]bool{".": true},
wantWatchedFiles: map[string]bool{"tools.yaml": true},
},
{
description: "multiple files in different folders",
toolsFile: "",
toolsFiles: []string{"tools_folder/example_tools.yaml", "tools_folder2/example_tools.yaml"},
toolsFolder: "",
wantWatchDirs: map[string]bool{"tools_folder": true, "tools_folder2": true},
wantWatchedFiles: map[string]bool{
"tools_folder/example_tools.yaml": true,
"tools_folder2/example_tools.yaml": true,
},
},
{
description: "multiple files in same folder",
toolsFile: "",
toolsFiles: []string{"tools_folder/example_tools.yaml", "tools_folder/example_tools2.yaml"},
toolsFolder: "",
wantWatchDirs: map[string]bool{"tools_folder": true},
wantWatchedFiles: map[string]bool{
"tools_folder/example_tools.yaml": true,
"tools_folder/example_tools2.yaml": true,
},
},
{
description: "multiple files in different levels",
toolsFile: "",
toolsFiles: []string{
"tools_folder/example_tools.yaml",
"tools_folder/special_tools/example_tools2.yaml"},
toolsFolder: "",
wantWatchDirs: map[string]bool{"tools_folder": true, "tools_folder/special_tools": true},
wantWatchedFiles: map[string]bool{
"tools_folder/example_tools.yaml": true,
"tools_folder/special_tools/example_tools2.yaml": true,
},
},
{
description: "tools folder",
toolsFile: "",
toolsFiles: []string{},
toolsFolder: "tools_folder",
wantWatchDirs: map[string]bool{"tools_folder": true},
wantWatchedFiles: map[string]bool{},
},
}
for _, tc := range tcs {
t.Run(tc.description, func(t *testing.T) {
gotWatchDirs, gotWatchedFiles := resolveWatcherInputs(tc.toolsFile, tc.toolsFiles, tc.toolsFolder)

normalizedGotWatchDirs := normalizeFilepaths(gotWatchDirs)
normalizedGotWatchedFiles := normalizeFilepaths(gotWatchedFiles)

if diff := cmp.Diff(tc.wantWatchDirs, normalizedGotWatchDirs); diff != "" {
t.Errorf("incorrect watchDirs: diff %v", diff)
}
if diff := cmp.Diff(tc.wantWatchedFiles, normalizedGotWatchedFiles); diff != "" {
t.Errorf("incorrect watchedFiles: diff %v", diff)
}
})
}
}
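The mapping these cases verify is: every tools file contributes its parent directory to the watch set plus itself to the file filter, while a tools folder is watched whole with no per-file filter. A minimal re-creation of that logic (hypothetical `resolveWatchTargets`, sketching the behavior the table asserts, not the Toolbox source):

```go
package main

import (
	"fmt"
	"path/filepath"
)

// resolveWatchTargets collects the parent directory of every tools file
// into watchDirs, records each file in watchedFiles, and watches a whole
// folder (no file filter) when one is given.
func resolveWatchTargets(toolsFile string, toolsFiles []string, toolsFolder string) (map[string]bool, map[string]bool) {
	watchDirs := map[string]bool{}
	watchedFiles := map[string]bool{}

	files := append([]string{}, toolsFiles...)
	if toolsFile != "" {
		files = append(files, toolsFile)
	}
	for _, f := range files {
		f = filepath.Clean(f)
		watchDirs[filepath.Dir(f)] = true
		watchedFiles[f] = true
	}
	if toolsFolder != "" {
		watchDirs[filepath.Clean(toolsFolder)] = true
	}
	return watchDirs, watchedFiles
}

func main() {
	dirs, files := resolveWatchTargets("tools.yaml", nil, "")
	fmt.Println(dirs["."], files["tools.yaml"]) // true true
}
```

Watching directories rather than the files themselves is the usual fsnotify-style approach: many editors replace files on save, which would otherwise drop the watch.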

// helper function for testing file detection in dynamic reloading
func tmpFileWithCleanup(content []byte) (string, func(), error) {
f, err := os.CreateTemp("", "*")
if err != nil {
return "", nil, err
}
cleanup := func() { os.Remove(f.Name()) }

if _, err := f.Write(content); err != nil {
cleanup()
return "", nil, err
}
if err := f.Close(); err != nil {
cleanup()
return "", nil, err
}
return f.Name(), cleanup, nil
}

func TestSingleEdit(t *testing.T) {
ctx, cancelCtx := context.WithTimeout(context.Background(), time.Minute)
defer cancelCtx()

pr, pw := io.Pipe()
defer pw.Close()
defer pr.Close()

fileToWatch, cleanup, err := tmpFileWithCleanup([]byte("initial content"))
if err != nil {
t.Fatalf("error editing tools file %s", err)
}
defer cleanup()

logger, err := log.NewStdLogger(pw, pw, "DEBUG")
if err != nil {
t.Fatalf("failed to setup logger %s", err)
}
ctx = util.WithLogger(ctx, logger)

instrumentation, err := telemetry.CreateTelemetryInstrumentation(versionString)
if err != nil {
t.Fatalf("failed to setup instrumentation %s", err)
}
ctx = util.WithInstrumentation(ctx, instrumentation)

mockServer := &server.Server{}

cleanFileToWatch := filepath.Clean(fileToWatch)
watchDir := filepath.Dir(cleanFileToWatch)

watchedFiles := map[string]bool{cleanFileToWatch: true}
watchDirs := map[string]bool{watchDir: true}

go watchChanges(ctx, watchDirs, watchedFiles, mockServer)

// escape backslash so regex doesn't fail on windows filepaths
regexEscapedPathFile := strings.ReplaceAll(cleanFileToWatch, `\`, `\\\\*\\`)
regexEscapedPathFile = path.Clean(regexEscapedPathFile)

regexEscapedPathDir := strings.ReplaceAll(watchDir, `\`, `\\\\*\\`)
regexEscapedPathDir = path.Clean(regexEscapedPathDir)

begunWatchingDir := regexp.MustCompile(fmt.Sprintf(`DEBUG "Added directory %s to watcher."`, regexEscapedPathDir))
_, err = testutils.WaitForString(ctx, begunWatchingDir, pr)
if err != nil {
t.Fatalf("timeout or error waiting for watcher to start: %s", err)
}

err = os.WriteFile(fileToWatch, []byte("modification"), 0777)
if err != nil {
t.Fatalf("error writing to file: %v", err)
}

detectedFileChange := regexp.MustCompile(fmt.Sprintf(`DEBUG "WRITE event detected in %s"`, regexEscapedPathFile))
_, err = testutils.WaitForString(ctx, detectedFileChange, pr)
if err != nil {
t.Fatalf("timeout or error waiting for file to detect write: %s", err)
}
}

func TestPrebuiltTools(t *testing.T) {
alloydb_config, _ := prebuiltconfigs.Get("alloydb-postgres")
bigquery_config, _ := prebuiltconfigs.Get("bigquery")
cloudsqlpg_config, _ := prebuiltconfigs.Get("cloud-sql-postgres")
cloudsqlmysql_config, _ := prebuiltconfigs.Get("cloud-sql-mysql")
cloudsqlmssql_config, _ := prebuiltconfigs.Get("cloud-sql-mssql")
postgresconfig, _ := prebuiltconfigs.Get("postgres")
spanner_config, _ := prebuiltconfigs.Get("spanner")
spannerpg_config, _ := prebuiltconfigs.Get("spanner-postgres")
ctx, err := testutils.ContextWithNewLogger()
if err != nil {
t.Fatalf("unexpected error: %s", err)
}
tcs := []struct {
name string
in []byte
wantToolset server.ToolsetConfigs
}{
{
name: "alloydb prebuilt tools",
in: alloydb_config,
wantToolset: server.ToolsetConfigs{
"alloydb-postgres-database-tools": tools.ToolsetConfig{
Name: "alloydb-postgres-database-tools",
ToolNames: []string{"execute_sql", "list_tables"},
},
},
},
{
name: "bigquery prebuilt tools",
in: bigquery_config,
wantToolset: server.ToolsetConfigs{
"bigquery-database-tools": tools.ToolsetConfig{
Name: "bigquery-database-tools",
ToolNames: []string{"execute_sql", "get_dataset_info", "get_table_info", "list_dataset_ids", "list_table_ids"},
},
},
},
{
name: "cloudsqlpg prebuilt tools",
in: cloudsqlpg_config,
wantToolset: server.ToolsetConfigs{
"cloud-sql-postgres-database-tools": tools.ToolsetConfig{
Name: "cloud-sql-postgres-database-tools",
ToolNames: []string{"execute_sql", "list_tables"},
},
},
},
{
name: "cloudsqlmysql prebuilt tools",
in: cloudsqlmysql_config,
wantToolset: server.ToolsetConfigs{
"cloud-sql-mysql-database-tools": tools.ToolsetConfig{
Name: "cloud-sql-mysql-database-tools",
ToolNames: []string{"execute_sql", "list_tables"},
},
},
},
{
name: "cloudsqlmssql prebuilt tools",
in: cloudsqlmssql_config,
wantToolset: server.ToolsetConfigs{
"cloud-sql-mssql-database-tools": tools.ToolsetConfig{
Name: "cloud-sql-mssql-database-tools",
ToolNames: []string{"execute_sql", "list_tables"},
},
},
},
{
name: "postgres prebuilt tools",
in: postgresconfig,
wantToolset: server.ToolsetConfigs{
"postgres-database-tools": tools.ToolsetConfig{
Name: "postgres-database-tools",
ToolNames: []string{"execute_sql", "list_tables"},
},
},
},
{
name: "spanner prebuilt tools",
in: spanner_config,
wantToolset: server.ToolsetConfigs{
"spanner-database-tools": tools.ToolsetConfig{
Name: "spanner-database-tools",
ToolNames: []string{"execute_sql", "execute_sql_dql", "list_tables"},
},
},
},
{
name: "spanner pg prebuilt tools",
in: spannerpg_config,
wantToolset: server.ToolsetConfigs{
"spanner-postgres-database-tools": tools.ToolsetConfig{
Name: "spanner-postgres-database-tools",
ToolNames: []string{"execute_sql", "execute_sql_dql", "list_tables"},
},
},
},
}

for _, tc := range tcs {
t.Run(tc.name, func(t *testing.T) {
toolsFile, err := parseToolsFile(ctx, tc.in)
if err != nil {
t.Fatalf("failed to parse input: %v", err)
}
if diff := cmp.Diff(tc.wantToolset, toolsFile.Toolsets); diff != "" {
t.Fatalf("incorrect toolsets parse: diff %v", diff)
}
})
}
}

func TestUpdateLogLevel(t *testing.T) {
tcs := []struct {
desc string
stdio bool
logLevel string
want bool
}{
{
desc: "no stdio",
stdio: false,
logLevel: "info",
want: false,
},
{
desc: "stdio with info log",
stdio: true,
logLevel: "info",
want: true,
},
{
desc: "stdio with debug log",
stdio: true,
logLevel: "debug",
want: true,
},
{
desc: "stdio with warn log",
stdio: true,
logLevel: "warn",
want: false,
},
{
desc: "stdio with error log",
stdio: true,
logLevel: "error",
want: false,
},
}
for _, tc := range tcs {
t.Run(tc.desc, func(t *testing.T) {
got := updateLogLevel(tc.stdio, tc.logLevel)
if got != tc.want {
t.Fatalf("incorrect indication to update log level: got %t, want %t", got, tc.want)
}
})
}
}
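The semantics this table pins down: in stdio mode, "info" and "debug" logging would share the protocol's output stream, so the level must be raised; "warn"/"error" (and any non-stdio run) are left alone. A sketch of that decision (hypothetical `shouldRaiseLogLevel`, mirroring the asserted behavior rather than quoting the Toolbox symbol):

```go
package main

import (
	"fmt"
	"strings"
)

// shouldRaiseLogLevel reports whether the configured log level must be
// bumped: only when serving over stdio AND the level is chattier than
// "warn" does logging risk polluting the protocol stream.
func shouldRaiseLogLevel(stdio bool, level string) bool {
	if !stdio {
		return false
	}
	switch strings.ToLower(level) {
	case "debug", "info":
		return true
	default:
		return false
	}
}

func main() {
	fmt.Println(shouldRaiseLogLevel(true, "info"))  // true
	fmt.Println(shouldRaiseLogLevel(true, "warn"))  // false
	fmt.Println(shouldRaiseLogLevel(false, "info")) // false
}
```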
|
||||
|
||||
@@ -1 +1 @@
|
||||
0.2.1
|
||||
0.8.0
|
||||
|
||||
@@ -12,4 +12,4 @@ description: >
|
||||
<link rel="canonical" href="getting-started/introduction/"/>
|
||||
<meta http-equiv="refresh" content="0;url=getting-started/introduction"/>
|
||||
</head>
|
||||
</html>
|
||||
</html>
|
||||
|
||||
@@ -1,6 +1,7 @@
|
||||
---
|
||||
title: "About"
|
||||
type: docs
|
||||
weight: 5
|
||||
description: A list of other information related to Toolbox.
|
||||
weight: 6
|
||||
description: >
|
||||
A list of other information related to Toolbox.
|
||||
---
|
||||
|
||||
@@ -7,24 +7,24 @@ description: Frequently asked questions about Toolbox.
|
||||
|
||||
## How can I deploy or run Toolbox?
|
||||
|
||||
Gen AI Toolbox for Databases is open-source and can be ran or deployed to a
|
||||
MCP Toolbox for Databases is open-source and can be ran or deployed to a
|
||||
multitude of environments. For convenience, we release [compiled binaries and
|
||||
docker images][release-notes] (but you can always compile yourself as well!).
|
||||
docker images][release-notes] (but you can always compile yourself as well!).
|
||||
|
||||
For detailed instructions, check our these resources:
|
||||
|
||||
- [Quickstart: How to Run Locally](../getting-started/local_quickstart.md)
|
||||
- [Deploy to Cloud Run](../how-to/deploy_toolbox.md)
|
||||
|
||||
[release-notes]: https://github.com/googleapis/genai-toolbox/releases/

## Do I need a Google Cloud account/project to get started with Toolbox?

Nope! While some of the sources Toolbox connects to may require GCP credentials,
Toolbox doesn't require them and can connect to a bunch of different resources
that don't.

## Does Toolbox take contributions from external users?

Absolutely! Please check out our [DEVELOPER.md][] for instructions on how to get
started developing _on_ Toolbox instead of with it, and the [CONTRIBUTING.md][]
for instructions on completing the CLA and getting a PR accepted.

[DEVELOPER.md]: https://github.com/googleapis/genai-toolbox/blob/main/DEVELOPER.md
[CONTRIBUTING.md]: https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md

## Can Toolbox support a feature to let me do _$FOO_?

Maybe? The best place to start is by [opening an issue][github-issue] for
discussion (or seeing if there is already one open), so we can better understand
your use case and the best way to solve it. Generally we aim to prioritize the
most popular issues, so make sure to +1 the ones you are most interested in.

[github-issue]: https://github.com/googleapis/genai-toolbox/issues

## Can Toolbox be used for non-database tools?

Currently, Toolbox is primarily focused on making it easier to create and
develop tools for interacting with databases. We believe that there are a
number of cases where GRPC tools might be helpful in assisting with migrating
to Toolbox or in accomplishing more complicated workflows. We're looking into
what that might best look like in Toolbox.

## Can I use _$BAR_ orchestration framework to use tools from Toolbox?

Currently, Toolbox only supports a limited number of client SDKs at our initial
launch. We are investigating support for more frameworks as well as more general
approaches for users without a framework -- look forward to seeing an update
soon.

## Why does Toolbox use a server-client architecture pattern?

Toolbox's server-client architecture allows us to more easily support a wide
variety of languages and frameworks with a centralized implementation. It also
allows us to tackle problems like connection pooling, auth, or caching more
completely than entirely client-side solutions.
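
To make the pooling benefit concrete, here is a minimal sketch (not Toolbox's
actual implementation; `MAX_CONNS`, `pool`, and `handle_request` are
illustrative names) of how a central server can cap total database connections
across many concurrent clients:

```python
import queue
import threading

MAX_CONNS = 3  # cap on real database connections, shared by all clients

# The server owns a single pool of connection handles; each incoming
# request borrows one and returns it, so 20 concurrent clients never
# hold more than MAX_CONNS connections at once.
pool = queue.Queue()
for i in range(MAX_CONNS):
    pool.put(f"conn-{i}")

lock = threading.Lock()
in_use = 0
peak = 0

def handle_request(req_id: int) -> None:
    global in_use, peak
    conn = pool.get()  # blocks until a pooled connection is free
    with lock:
        in_use += 1
        peak = max(peak, in_use)
    try:
        pass  # the query for req_id would run on `conn` here
    finally:
        with lock:
            in_use -= 1
        pool.put(conn)  # hand the connection back for reuse

threads = [threading.Thread(target=handle_request, args=(i,)) for i in range(20)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print("peak concurrent connections:", peak)  # never exceeds MAX_CONNS
```

With a purely client-side design, each of the 20 clients would typically open
its own connection; centralizing the pool bounds the load on the database.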

## Why was Toolbox written in Go?

While a large part of the Gen AI ecosystem is predominantly Python, we opted to
use Go. We chose Go because it's still easy and simple to use, but also easier
to write fast, efficient, and concurrent servers. Additionally, given the
server-client architecture, we can still meet many developers where they are
with clients in their preferred language. As Gen AI matures, we want developers
to be able to use Toolbox on the serving path of mission-critical applications.
It's easier to build the needed robustness, performance, and scalability in Go
than in Python.

## Is Toolbox compatible with Model Context Protocol (MCP)?

Yes! Toolbox is compatible with [Anthropic's Model Context Protocol
(MCP)](https://modelcontextprotocol.io/). Please check out [Connect via
MCP](../how-to/connect_via_mcp.md) on how to connect to Toolbox with an MCP
client.

---
title: "Concepts"
type: docs
weight: 2
description: >
  Some core concepts in Toolbox
---

---
title: "Telemetry"
type: docs
weight: 2
description: >
  An overview of telemetry and observability in Toolbox.
---

## About

through [OpenTelemetry](https://opentelemetry.io/). Additional flags can be
passed to Toolbox to enable different logging behavior, or to export metrics
through a specific [exporter](#exporter).

## Logging

The following flags can be used to customize Toolbox logging:

| **flag**           | **description**                                                                         |
|--------------------|-----------------------------------------------------------------------------------------|
| `--log-level`      | Preferred log level, allowed values: `debug`, `info`, `warn`, `error`. Default: `info`. |
| `--logging-format` | Preferred logging format, allowed values: `standard`, `json`. Default: `standard`.      |

**Example:**

```bash
./toolbox --tools-file "tools.yaml" --log-level warn --logging-format json
```

### Level

Toolbox supports the following log levels:

| **Log level** | **Description** |
|---------------|-----------------|
| Debug         | Debug logs typically contain information that is only useful during the debugging phase and may be of little value during production. |

Toolbox will only output logs that are equal to or more severe than the level
that it is set to. Below are the log levels that Toolbox supports in order of
severity.
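
The severity ordering can be sketched as a simple filter (illustrative only;
`should_log` is not part of Toolbox):

```python
# Severity order used for filtering, least to most severe.
LEVELS = ["debug", "info", "warn", "error"]

def should_log(record_level: str, configured_level: str) -> bool:
    """A record is emitted only if it is at least as severe as the configured level."""
    return LEVELS.index(record_level) >= LEVELS.index(configured_level)

print(should_log("warn", "info"))   # True: warn is more severe than info
print(should_log("debug", "warn"))  # False: debug is filtered out
```

So with `--log-level warn`, only `warn` and `error` records are emitted.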

### Format

Toolbox supports both standard and structured logging formats.

Standard logging outputs logs as strings:

```
2024-11-12T15:08:11.451377-08:00 INFO "Initialized 0 sources.\n"
```

Structured logging outputs logs as JSON:

```
{
  "timestamp":"2024-11-04T16:45:11.987299-08:00",
}
```
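
JSON output makes the logs easy to consume programmatically. A small sketch
(the record below, including the `severity` and `message` field names, is a
hypothetical example, not captured Toolbox output):

```python
import json

# A hypothetical structured log record in the JSON format shown above.
line = '{"timestamp":"2024-11-04T16:45:11.987299-08:00","severity":"INFO","message":"Initialized 0 sources."}'

record = json.loads(line)
if record.get("severity") in ("WARNING", "ERROR"):
    print("alert:", record["message"])
else:
    print(record["timestamp"], record["message"])
```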

{{< notice tip >}}
`logging.googleapis.com/sourceLocation` shows the source code
location information associated with the log entry, if any.
{{< /notice >}}

## Telemetry

### Metrics

A metric is a measurement of a service captured at runtime. The collected data
can be used to provide important insights into the service. Toolbox provides the
following custom metrics:

| **Metric Name**                    | **Description**                                         |
|------------------------------------|---------------------------------------------------------|
| `toolbox.server.toolset.get.count` | Counts the number of toolset manifest requests served   |
| `toolbox.server.tool.get.count`    | Counts the number of tool manifest requests served      |
| `toolbox.server.tool.get.invoke`   | Counts the number of tool invocation requests served    |
| `toolbox.server.mcp.sse.count`     | Counts the number of MCP SSE connection requests served |
| `toolbox.server.mcp.post.count`    | Counts the number of MCP POST requests served           |

All custom metrics have the following attributes/labels:

| **Metric Attributes**      | **Description**                                           |
|----------------------------|-----------------------------------------------------------|
| `toolbox.name`             | Name of the toolset or tool, if applicable.               |
| `toolbox.operation.status` | Operation status code, for example: `success`, `failure`. |
| `toolbox.sse.sessionId`    | Session ID for an SSE connection, if applicable.          |
| `toolbox.method`           | Method of the JSON-RPC request, if applicable.            |
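
The attribute set means each metric is effectively a family of counters keyed
by its labels. A rough sketch of how such labeled counts aggregate
(illustrative only, not Toolbox's implementation):

```python
from collections import Counter

# Each metric point carries attributes; counts aggregate per label combination.
invocations = Counter()

def record_invoke(tool_name: str, status: str) -> None:
    """Increment the invocation counter for one (tool, status) label pair."""
    invocations[(tool_name, status)] += 1

record_invoke("search-hotels-by-name", "success")
record_invoke("search-hotels-by-name", "success")
record_invoke("search-hotels-by-name", "failure")

print(invocations[("search-hotels-by-name", "success")])  # 2
```

A monitoring backend can then compute, say, a failure rate per tool by dividing
across the `status` label.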

### Traces

unified [resource][resource]. The list of resource attributes included are:

| **Resource Attribute** | **Description** |
|------------------------|-----------------|
| `service.name`    | OpenTelemetry service name. Defaults to `toolbox`. Users can set the service name via the flag mentioned above to distinguish between different Toolbox services. |
| `service.version` | The version of Toolbox used. |

[resource]: https://opentelemetry.io/docs/languages/go/resources/

### Exporter

Exporter][gcp-trace-exporter].

[gcp-trace-exporter]:
https://github.com/GoogleCloudPlatform/opentelemetry-operations-go/tree/main/exporter/trace

{{< notice note >}}
If you're using Google Cloud Monitoring, the following APIs will need to be
enabled:

- [Cloud Logging API](https://cloud.google.com/logging/docs/api/enable-api)
- [Cloud Monitoring API](https://cloud.google.com/monitoring/api/enable-api)
- [Cloud Trace API](https://cloud.google.com/apis/enableflow?apiid=cloudtrace.googleapis.com)
{{< /notice >}}

The following flags are used to determine Toolbox's telemetry configuration:

| **flag**                   | **type** | **description** |
|----------------------------|----------|-----------------|
| `--telemetry-gcp`          | bool     | Enable exporting directly to Google Cloud Monitoring. Default is `false`. |
| `--telemetry-otlp`         | string   | Enable exporting using OpenTelemetry Protocol (OTLP) to the specified endpoint (e.g. `http://127.0.0.1:4318`). |
| `--telemetry-service-name` | string   | Sets the value of the `service.name` resource attribute. Default is `toolbox`. |

In addition to the flags noted above, you can also make additional configuration
through [SDK configuration][sdk-configuration] environment variables.

[sdk-configuration]:
https://opentelemetry.io/docs/languages/sdk-configuration/general/

**Examples:**

To enable the Google Cloud Exporter:

```bash
./toolbox --telemetry-gcp
```

To enable the OTLP Exporter, provide the Collector endpoint:

```bash
./toolbox --telemetry-otlp="http://127.0.0.1:4553"
```

---
title: "Getting Started"
type: docs
weight: 1
description: >
  How to get started with Toolbox.
---

docs/en/getting-started/colab_quickstart.ipynb

---
title: "Configuration"
type: docs
weight: 4
description: >
  How to configure Toolbox's tools.yaml file.
---

The primary way to configure Toolbox is through the `tools.yaml` file. If you
have multiple files, you can tell Toolbox which to load with the `--tools-file
tools.yaml` flag.

You can find more detailed reference documentation for all resource types in
the [Resources](../resources/) section.

### Using Environment Variables

To avoid hardcoding certain secret fields like passwords, usernames, API keys,
etc., you can use environment variables instead with the format `${ENV_NAME}`.

```yaml
user: ${USER_NAME}
password: ${PASSWORD}
```
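
A sketch of how this kind of `${ENV_NAME}` substitution typically works
(illustrative only; `expand_env` is not Toolbox's actual parser):

```python
import os
import re

def expand_env(text: str) -> str:
    """Replace ${NAME} placeholders with values from the environment."""
    def lookup(match: re.Match) -> str:
        name = match.group(1)
        value = os.environ.get(name)
        if value is None:
            raise KeyError(f"environment variable {name} is not set")
        return value
    return re.sub(r"\$\{(\w+)\}", lookup, text)

os.environ["USER_NAME"] = "toolbox_user"
os.environ["PASSWORD"] = "my-password"
print(expand_env("user: ${USER_NAME}\npassword: ${PASSWORD}"))
```

Failing loudly on an unset variable (rather than substituting an empty string)
makes missing secrets obvious at startup instead of at connection time.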

### Sources

The `sources` section of your `tools.yaml` defines what data sources your

```yaml
sources:
  my-pg-source:
    host: 127.0.0.1
    port: 5432
    database: toolbox_db
    user: ${USER_NAME}
    password: ${PASSWORD}
```

For more details on configuring different types of sources, see the

For more details on configuring different types of tools, see the
[Tools](../resources/tools/).

### Toolsets

The `toolsets` section of your `tools.yaml` allows you to define groups of tools

```python
all_tools = client.load_toolset()

# This will only load the tools listed in 'my_second_toolset'
my_second_toolset = client.load_toolset("my_second_toolset")
```

---
title: "Introduction"
type: docs
weight: 1
description: >
  An introduction to MCP Toolbox for Databases.
---

MCP Toolbox for Databases is an open source MCP server for databases. It enables
you to develop tools more easily, quickly, and securely by handling complexities
such as connection pooling, authentication, and more.

{{< notice note >}}
This solution was originally named "Gen AI Toolbox for Databases" as its
initial development predated MCP, but was renamed to align with recently added
MCP compatibility.
{{< /notice >}}

## Why Toolbox?

Toolbox helps you build Gen AI tools that let your agents access data in your
database. Toolbox provides:

- **Simplified development**: Integrate tools into your agent in less than 10
  lines of code, reuse tools between multiple agents or frameworks, and deploy
  new versions of tools more easily.
- **End-to-end observability**: Out-of-the-box metrics and tracing with built-in
  support for OpenTelemetry.

**⚡ Supercharge Your Workflow with an AI Database Assistant ⚡**

Stop context-switching and let your AI assistant become a true co-developer. By
[connecting your IDE to your databases with MCP Toolbox][connect-ide], you can
delegate complex and time-consuming database tasks, allowing you to build faster
and focus on what matters. This isn't just about code completion; it's about
giving your AI the context it needs to handle the entire development lifecycle.

Here's how it will save you time:

- **Query in Plain English**: Interact with your data using natural language
  right from your IDE. Ask complex questions like, *"How many orders were
  delivered in 2024, and what items were in them?"* without writing any SQL.
- **Automate Database Management**: Simply describe your data needs, and let the
  AI assistant manage your database for you. It can handle generating queries,
  creating tables, adding indexes, and more.
- **Generate Context-Aware Code**: Empower your AI assistant to generate
  application code and tests with a deep understanding of your real-time
  database schema. This accelerates the development cycle by ensuring the
  generated code is directly usable.
- **Slash Development Overhead**: Radically reduce the time spent on manual
  setup and boilerplate. MCP Toolbox helps streamline lengthy database
  configurations, repetitive code, and error-prone schema migrations.

Learn [how to connect your AI tools (IDEs) to Toolbox using MCP][connect-ide].

[connect-ide]: ../../how-to/connect-ide/

## General Architecture

redeploying your application.

## Getting Started

### Installing the server

For the latest version, check the [releases page][releases] and use the
following instructions for your OS and CPU architecture.

To install Toolbox as a binary:

```sh
# see releases page for other versions
export VERSION=0.8.0
curl -O https://storage.googleapis.com/genai-toolbox/v$VERSION/linux/amd64/toolbox
chmod +x toolbox
```

You can also install Toolbox as a container:

```sh
# see releases page for other versions
export VERSION=0.8.0
docker pull us-central1-docker.pkg.dev/database-toolbox/toolbox/toolbox:$VERSION
```

To install from source, ensure you have the latest version of
[Go installed](https://go.dev/doc/install), and then run the following command:

```sh
go install github.com/googleapis/genai-toolbox@v0.8.0
```

execute `toolbox` to start the server:

```sh
./toolbox --tools-file "tools.yaml"
```

{{< notice note >}}
Toolbox enables dynamic reloading by default. To disable, use the
`--disable-reload` flag.
{{< /notice >}}

You can use `toolbox help` for a full list of flags! To stop the server, send a
terminate signal (`ctrl+c` on most platforms).

Once your server is up and running, you can load the tools into your
application. See below for the list of Client SDKs for various frameworks:

#### Python

{{< tabpane text=true persist=header >}}
{{% tab header="Core" lang="en" %}}

Once you've installed the [Toolbox Core
SDK](https://pypi.org/project/toolbox-core/), you can load tools:

{{< highlight python >}}
from toolbox_core import ToolboxClient

# update the url to point to your server
async with ToolboxClient("http://127.0.0.1:5000") as client:

    # these tools can be passed to your application!
    tools = await client.load_toolset("toolset_name")
{{< /highlight >}}

For more detailed instructions on using the Toolbox Core SDK, see the
[project's README](https://github.com/googleapis/mcp-toolbox-sdk-python/blob/main/packages/toolbox-core/README.md).

{{% /tab %}}
{{% tab header="LangChain" lang="en" %}}

Once you've installed the [Toolbox LangChain
SDK](https://pypi.org/project/toolbox-langchain/), you can load tools:

{{< highlight python >}}
from toolbox_langchain import ToolboxClient

# update the url to point to your server
async with ToolboxClient("http://127.0.0.1:5000") as client:

    # these tools can be passed to your application!
    tools = client.load_toolset()
{{< /highlight >}}

For more detailed instructions on using the Toolbox LangChain SDK, see the
[project's README](https://github.com/googleapis/mcp-toolbox-sdk-python/blob/main/packages/toolbox-langchain/README.md).

{{% /tab %}}
{{% tab header="Llamaindex" lang="en" %}}

{{< highlight python >}}
from toolbox_llamaindex import ToolboxClient

# update the url to point to your server
async with ToolboxClient("http://127.0.0.1:5000") as client:

    # these tools can be passed to your application
    tools = client.load_toolset()
{{< /highlight >}}

For more detailed instructions on using the Toolbox Llamaindex SDK, see the
{{% /tab %}}
{{< /tabpane >}}

#### Javascript/Typescript

Once you've installed the [Toolbox Core
SDK](https://www.npmjs.com/package/@toolbox-sdk/core), you can load tools:

{{< tabpane text=true persist=header >}}
{{% tab header="Core" lang="en" %}}

{{< highlight javascript >}}
import { ToolboxClient } from '@toolbox-sdk/core';

// update the url to point to your server
const URL = 'http://127.0.0.1:5000';
let client = new ToolboxClient(URL);

// these tools can be passed to your application!
const toolboxTools = await client.loadToolset('toolsetName');
{{< /highlight >}}

{{% /tab %}}
{{% tab header="LangChain/Langraph" lang="en" %}}

{{< highlight javascript >}}
import { ToolboxClient } from '@toolbox-sdk/core';
import { tool } from '@langchain/core/tools';

// update the url to point to your server
const URL = 'http://127.0.0.1:5000';
let client = new ToolboxClient(URL);

// these tools can be passed to your application!
const toolboxTools = await client.loadToolset('toolsetName');

// Define the basics of the tool: name, description, schema and core logic
const getTool = (toolboxTool) => tool(toolboxTool, {
    name: toolboxTool.getName(),
    description: toolboxTool.getDescription(),
    schema: toolboxTool.getParamSchema()
});

// Use these tools in your Langchain/Langraph applications
const tools = toolboxTools.map(getTool);
{{< /highlight >}}

{{% /tab %}}
{{% tab header="Genkit" lang="en" %}}

{{< highlight javascript >}}
import { ToolboxClient } from '@toolbox-sdk/core';
import { genkit } from 'genkit';
import { googleAI } from '@genkit-ai/googleai';

// Initialise genkit
const ai = genkit({
    plugins: [
        googleAI({
            apiKey: process.env.GEMINI_API_KEY || process.env.GOOGLE_API_KEY
        })
    ],
    model: googleAI.model('gemini-2.0-flash'),
});

// update the url to point to your server
const URL = 'http://127.0.0.1:5000';
let client = new ToolboxClient(URL);

// these tools can be passed to your application!
const toolboxTools = await client.loadToolset('toolsetName');

// Define the basics of the tool: name, description, schema and core logic
const getTool = (toolboxTool) => ai.defineTool({
    name: toolboxTool.getName(),
    description: toolboxTool.getDescription(),
    schema: toolboxTool.getParamSchema()
}, toolboxTool)

// Use these tools in your Genkit applications
const tools = toolboxTools.map(getTool);
{{< /highlight >}}

{{% /tab %}}
{{% tab header="LlamaIndex" lang="en" %}}

{{< highlight javascript >}}
import { ToolboxClient } from '@toolbox-sdk/core';
import { tool } from "llamaindex";

// update the url to point to your server
const URL = 'http://127.0.0.1:5000';
let client = new ToolboxClient(URL);

// these tools can be passed to your application!
const toolboxTools = await client.loadToolset('toolsetName');

// Define the basics of the tool: name, description, schema and core logic
const getTool = (toolboxTool) => tool({
    name: toolboxTool.getName(),
    description: toolboxTool.getDescription(),
    parameters: toolboxTool.getParams(),
    execute: toolboxTool
});

// Use these tools in your LlamaIndex applications
const tools = toolboxTools.map(getTool);
{{< /highlight >}}

{{% /tab %}}
{{< /tabpane >}}

For more detailed instructions on using the Toolbox Core SDK, see the
[project's README](https://github.com/googleapis/mcp-toolbox-sdk-js/blob/main/packages/toolbox-core/README.md).

---
title: "Quickstart (Local)"
type: docs
weight: 2
description: >
  How to get started running Toolbox locally with Python, PostgreSQL, and
  [Agent Development Kit](https://google.github.io/adk-docs/),
  [LangGraph](https://www.langchain.com/langgraph),
  [LlamaIndex](https://www.llamaindex.ai/), or
  [GoogleGenAI](https://pypi.org/project/google-genai/).
---

[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/googleapis/genai-toolbox/blob/main/docs/en/getting-started/colab_quickstart.ipynb)

## Before you begin

This guide assumes you have already done the following:

1. Installed [Python 3.9+][install-python] (including [pip][install-pip] and
   your preferred virtual environment tool for managing dependencies, e.g.
   [venv][install-venv])
1. Installed [PostgreSQL 16+ and the `psql` client][install-postgres]
1. Completed setup for usage with an LLM model, such as:

    {{< tabpane text=true persist=header >}}
    {{% tab header="LangChain" lang="en" %}}
- [langchain-vertexai](https://python.langchain.com/docs/integrations/llms/google_vertex_ai_palm/#setup) package.
- [langchain-google-genai](https://python.langchain.com/docs/integrations/chat/google_generative_ai/#setup) package.
- [langchain-anthropic](https://python.langchain.com/docs/integrations/chat/anthropic/#setup) package.
    {{% /tab %}}
    {{% tab header="LlamaIndex" lang="en" %}}
- [llama-index-llms-google-genai](https://pypi.org/project/llama-index-llms-google-genai/) package.
- [llama-index-llms-anthropic](https://docs.llamaindex.ai/en/stable/examples/llm/anthropic) package.
    {{% /tab %}}
    {{< /tabpane >}}

[install-python]: https://wiki.python.org/moin/BeginnersGuide/Download
[install-pip]: https://pip.pypa.io/en/stable/installation/

## Step 1: Set up your database

In this section, we will create a database, insert some data that needs to be
accessed by our agent, and create a database user for Toolbox to connect with.

1. Connect to postgres using the `psql` command:

    Here, `postgres` denotes the default postgres superuser.

    {{< notice info >}}

    #### **Having trouble connecting?**

    * **Password Prompt:** If you are prompted for a password for the `postgres`
      user and do not know it (or a blank password doesn't work), your PostgreSQL
      installation might require a password or a different authentication method.
    * **`FATAL: role "postgres" does not exist`:** This error means the default
      `postgres` superuser role isn't available under that name on your system.
    * **`Connection refused`:** Ensure your PostgreSQL server is actually running.
      You can typically check with `sudo systemctl status postgresql` and start it
      with `sudo systemctl start postgresql` on Linux systems.

    <br/>

    #### **Common Solution**

    For password issues or if the `postgres` role seems inaccessible directly, try
    switching to the `postgres` operating system user first. This user often has
    permission to connect without a password for local connections (this is called
    peer authentication).

    ```bash
    sudo -i -u postgres
    psql -h 127.0.0.1
    ```

    Once you are in the `psql` shell using this method, you can proceed with the
    database creation steps below. Afterwards, type `\q` to exit `psql`, and then
    `exit` to return to your normal user shell.

    If desired, once connected to `psql` as the `postgres` OS user, you can set a
    password for the `postgres` *database* user using: `ALTER USER postgres WITH
    PASSWORD 'your_chosen_password';`. This would allow direct connection with `-U
    postgres` and a password next time.
    {{< /notice >}}

1. Create a new database and a new user:

    {{< notice tip >}}
    For a real application, it's best to follow the principle of least privilege
    and only grant the privileges your application needs.
    {{< /notice >}}

    ```sql
    ALTER DATABASE toolbox_db OWNER TO toolbox_user;
    ```

1. End the database session:

    ```bash
    \q
    ```

    (If you used `sudo -i -u postgres` and then `psql`, remember you might also
    need to type `exit` after `\q` to leave the `postgres` user's shell
    session.)

1. Connect to your database with your new user:

    ```bash
    \q
    ```

## Step 2: Install and configure Toolbox
|
||||
|
||||
In this section, we will download Toolbox, configure our tools in a
|
||||
@@ -123,15 +148,15 @@ In this section, we will download Toolbox, configure our tools in a
|
||||
|
||||
1. Download the latest version of Toolbox as a binary:
|
||||
|
||||
{{< notice tip >}}
|
||||
Select the
|
||||
[correct binary](https://github.com/googleapis/genai-toolbox/releases)
|
||||
corresponding to your OS and CPU architecture.
|
||||
{{< notice tip >}}
|
||||
Select the
|
||||
[correct binary](https://github.com/googleapis/genai-toolbox/releases)
|
||||
corresponding to your OS and CPU architecture.
|
||||
{{< /notice >}}
|
||||
<!-- {x-release-please-start-version} -->
|
||||
```bash
|
||||
export OS="linux/amd64" # one of linux/amd64, darwin/arm64, darwin/amd64, or windows/amd64
|
||||
curl -O https://storage.googleapis.com/genai-toolbox/v0.2.1/$OS/toolbox
|
||||
curl -O https://storage.googleapis.com/genai-toolbox/v0.8.0/$OS/toolbox
|
||||
```
|
||||
<!-- {x-release-please-end} -->
|
||||
|
||||
@@ -145,6 +170,11 @@ In this section, we will download Toolbox, configure our tools in a

such as `user`, `password`, or `database` that you may have customized in the
previous step.

{{< notice tip >}}
In practice, use environment variable replacement with the format ${ENV_NAME}
instead of hardcoding your secrets into the configuration file.
{{< /notice >}}

```yaml
sources:
  my-pg-source:
@@ -152,8 +182,8 @@ In this section, we will download Toolbox, configure our tools in a
    host: 127.0.0.1
    port: 5432
    database: toolbox_db
    user: ${USER_NAME}
    password: ${PASSWORD}
tools:
  search-hotels-by-name:
    kind: postgres-sql
@@ -211,23 +241,43 @@ In this section, we will download Toolbox, configure our tools in a
        type: string
        description: The ID of the hotel to cancel.
    statement: UPDATE hotels SET booked = B'0' WHERE id = $1;
toolsets:
  my-toolset:
    - search-hotels-by-name
    - search-hotels-by-location
    - book-hotel
    - update-hotel
    - cancel-hotel
```
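
The `${ENV_NAME}` replacement recommended in the tip above can be previewed locally. This is a minimal sketch using only the Python standard library, not Toolbox's actual loader; the variable names and values are illustrative only:

```python
import os
from string import Template

# Hypothetical values for illustration; in a real setup these would already
# be exported in the shell that launches ./toolbox.
os.environ["USER_NAME"] = "toolbox_user"
os.environ["PASSWORD"] = "my-password"

# A fragment of the config with ${ENV_NAME}-style placeholders.
snippet = "user: ${USER_NAME}\npassword: ${PASSWORD}"

# Resolve the placeholders against the process environment.
resolved = Template(snippet).substitute(os.environ)
print(resolved)
```

Running this prints the fragment with both placeholders filled in from the environment, which is the behavior the tip relies on.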

For more info on tools, check out the `Resources` section of the docs.

1. Run the Toolbox server, pointing to the `tools.yaml` file created earlier:

```bash
./toolbox --tools-file "tools.yaml"
```

{{< notice note >}}
Toolbox enables dynamic reloading by default. To disable, use the `--disable-reload` flag.
{{< /notice >}}

## Step 3: Connect your agent to Toolbox

In this section, we will write and run an agent that will load the Tools
from Toolbox.

{{< notice tip >}} If you prefer to experiment within a Google Colab environment,
you can connect to a
[local runtime](https://research.google.com/colaboratory/local-runtimes.html).
{{< /notice >}}

1. In a new terminal, install the SDK package.

{{< tabpane persist=header >}}
{{< tab header="ADK" lang="bash" >}}
pip install toolbox-core
{{< /tab >}}
{{< tab header="Langchain" lang="bash" >}}
pip install toolbox-langchain
@@ -236,46 +286,143 @@ pip install toolbox-langchain
pip install toolbox-llamaindex
{{< /tab >}}
{{< tab header="Core" lang="bash" >}}
pip install toolbox-core
{{< /tab >}}
{{< /tabpane >}}

1. Install other required dependencies:

{{< tabpane persist=header >}}
{{< tab header="ADK" lang="bash" >}}
pip install google-adk
{{< /tab >}}
{{< tab header="Langchain" lang="bash" >}}
# TODO(developer): replace with correct package if needed
pip install langgraph langchain-google-vertexai
# pip install langchain-google-genai
# pip install langchain-anthropic
{{< /tab >}}
{{< tab header="LlamaIndex" lang="bash" >}}
# TODO(developer): replace with correct package if needed
pip install llama-index-llms-google-genai
# pip install llama-index-llms-anthropic
{{< /tab >}}
{{< tab header="Core" lang="bash" >}}
pip install google-genai
{{< /tab >}}
{{< /tabpane >}}

1. Create a new file named `hotel_agent.py` and copy the following
code to create an agent:
{{< tabpane persist=header >}}
{{< tab header="ADK" lang="python" >}}
from google.adk.agents import Agent
from google.adk.runners import Runner
from google.adk.sessions import InMemorySessionService
from google.adk.artifacts.in_memory_artifact_service import InMemoryArtifactService
from google.genai import types
from toolbox_core import ToolboxSyncClient

import asyncio
import os

# TODO(developer): replace this with your Google API key
os.environ['GOOGLE_API_KEY'] = 'your-api-key'

async def main():
    with ToolboxSyncClient("http://127.0.0.1:5000") as toolbox_client:

        prompt = """
        You're a helpful hotel assistant. You handle hotel searching, booking and
        cancellations. When the user searches for a hotel, mention its name, id,
        location and price tier. Always mention hotel ids while performing any
        searches. This is very important for any operations. For any bookings or
        cancellations, please provide the appropriate confirmation. Be sure to
        update checkin or checkout dates if mentioned by the user.
        Don't ask for confirmations from the user.
        """

        root_agent = Agent(
            model='gemini-2.0-flash-001',
            name='hotel_agent',
            description='A helpful AI assistant.',
            instruction=prompt,
            tools=toolbox_client.load_toolset("my-toolset"),
        )

        session_service = InMemorySessionService()
        artifacts_service = InMemoryArtifactService()
        session = await session_service.create_session(
            state={}, app_name='hotel_agent', user_id='123'
        )
        runner = Runner(
            app_name='hotel_agent',
            agent=root_agent,
            artifact_service=artifacts_service,
            session_service=session_service,
        )

        queries = [
            "Find hotels in Basel with Basel in its name.",
            "Can you book the Hilton Basel for me?",
            "Oh wait, this is too expensive. Please cancel it and book the Hyatt Regency instead.",
            "My check in dates would be from April 10, 2024 to April 19, 2024.",
        ]

        for query in queries:
            content = types.Content(role='user', parts=[types.Part(text=query)])
            events = runner.run(session_id=session.id,
                                user_id='123', new_message=content)

            responses = (
                part.text
                for event in events
                for part in event.content.parts
                if part.text is not None
            )

            for text in responses:
                print(text)

asyncio.run(main())
{{< /tab >}}
{{< tab header="LangChain" lang="python" >}}
import asyncio

from langgraph.prebuilt import create_react_agent

# TODO(developer): replace this with another import if needed
from langchain_google_vertexai import ChatVertexAI
# from langchain_google_genai import ChatGoogleGenerativeAI
# from langchain_anthropic import ChatAnthropic

from langgraph.checkpoint.memory import MemorySaver

from toolbox_langchain import ToolboxClient

prompt = """
You're a helpful hotel assistant. You handle hotel searching, booking and
cancellations. When the user searches for a hotel, mention its name, id,
location and price tier. Always mention hotel ids while performing any
searches. This is very important for any operations. For any bookings or
cancellations, please provide the appropriate confirmation. Be sure to
update checkin or checkout dates if mentioned by the user.
Don't ask for confirmations from the user.
"""
@@ -287,25 +434,25 @@ queries = [
"My check in dates would be from April 10, 2024 to April 19, 2024.",
]

async def run_application():
    # TODO(developer): replace this with another model if needed
    model = ChatVertexAI(model_name="gemini-2.0-flash-001")
    # model = ChatGoogleGenerativeAI(model="gemini-2.0-flash-001")
    # model = ChatAnthropic(model="claude-3-5-sonnet-20240620")

    # Load the tools from the Toolbox server
    async with ToolboxClient("http://127.0.0.1:5000") as client:
        tools = await client.aload_toolset()

        agent = create_react_agent(model, tools, checkpointer=MemorySaver())

        config = {"configurable": {"thread_id": "thread-1"}}
        for query in queries:
            inputs = {"messages": [("user", prompt + query)]}
            response = agent.invoke(inputs, stream_mode="values", config=config)
            print(response["messages"][-1].content)

asyncio.run(run_application())
{{< /tab >}}
{{< tab header="LlamaIndex" lang="python" >}}
import asyncio
@@ -315,18 +462,20 @@ from llama_index.core.agent.workflow import AgentWorkflow

from llama_index.core.workflow import Context

# TODO(developer): replace this with another import if needed
from llama_index.llms.google_genai import GoogleGenAI
# from llama_index.llms.anthropic import Anthropic

from toolbox_llamaindex import ToolboxClient

prompt = """
You're a helpful hotel assistant. You handle hotel searching, booking and
cancellations. When the user searches for a hotel, mention its name, id,
location and price tier. Always mention hotel ids while performing any
searches. This is very important for any operations. For any bookings or
cancellations, please provide the appropriate confirmation. Be sure to
update checkin or checkout dates if mentioned by the user.
Don't ask for confirmations from the user.
"""
@@ -338,48 +487,172 @@ queries = [
"My check in dates would be from April 10, 2024 to April 19, 2024.",
]

async def run_application():
    # TODO(developer): replace this with another model if needed
    llm = GoogleGenAI(
        model="gemini-2.0-flash-001",
        vertexai_config={"project": "project-id", "location": "us-central1"},
    )
    # llm = GoogleGenAI(
    #     api_key=os.getenv("GOOGLE_API_KEY"),
    #     model="gemini-2.0-flash-001",
    # )
    # llm = Anthropic(
    #     model="claude-3-7-sonnet-latest",
    #     api_key=os.getenv("ANTHROPIC_API_KEY")
    # )

    # Load the tools from the Toolbox server
    async with ToolboxClient("http://127.0.0.1:5000") as client:
        tools = await client.aload_toolset()

        agent = AgentWorkflow.from_tools_or_functions(
            tools,
            llm=llm,
            system_prompt=prompt,
        )
        ctx = Context(agent)
        for query in queries:
            response = await agent.run(user_msg=query, ctx=ctx)
            print(f"---- {query} ----")
            print(str(response))

asyncio.run(run_application())
{{< /tab >}}
{{< tab header="Core" lang="python" >}}
import asyncio

from google import genai
from google.genai.types import (
    Content,
    FunctionDeclaration,
    GenerateContentConfig,
    Part,
    Tool,
)

from toolbox_core import ToolboxClient

prompt = """
You're a helpful hotel assistant. You handle hotel searching, booking and
cancellations. When the user searches for a hotel, mention its name, id,
location and price tier. Always mention hotel id while performing any
searches. This is very important for any operations. For any bookings or
cancellations, please provide the appropriate confirmation. Be sure to
update checkin or checkout dates if mentioned by the user.
Don't ask for confirmations from the user.
"""

queries = [
    "Find hotels in Basel with Basel in its name.",
    "Please book the hotel Hilton Basel for me.",
    "This is too expensive. Please cancel it.",
    "Please book Hyatt Regency for me",
    "My check in dates for my booking would be from April 10, 2024 to April 19, 2024.",
]

async def run_application():
    async with ToolboxClient("http://127.0.0.1:5000") as toolbox_client:

        # The toolbox_tools list contains Python callables (functions/methods) designed for LLM tool-use
        # integration. While this example uses Google's genai client, these callables can be adapted for
        # various function-calling or agent frameworks. For easier integration with supported frameworks
        # (https://github.com/googleapis/mcp-toolbox-python-sdk/tree/main/packages), use the
        # provided wrapper packages, which handle framework-specific boilerplate.
        toolbox_tools = await toolbox_client.load_toolset("my-toolset")
        genai_client = genai.Client(
            vertexai=True, project="project-id", location="us-central1"
        )

        genai_tools = [
            Tool(
                function_declarations=[
                    FunctionDeclaration.from_callable_with_api_option(callable=tool)
                ]
            )
            for tool in toolbox_tools
        ]
        history = []
        for query in queries:
            user_prompt_content = Content(
                role="user",
                parts=[Part.from_text(text=query)],
            )
            history.append(user_prompt_content)

            response = genai_client.models.generate_content(
                model="gemini-2.0-flash-001",
                contents=history,
                config=GenerateContentConfig(
                    system_instruction=prompt,
                    tools=genai_tools,
                ),
            )
            history.append(response.candidates[0].content)
            function_response_parts = []
            for function_call in response.function_calls:
                fn_name = function_call.name
                # The tools are sorted alphabetically
                if fn_name == "search-hotels-by-name":
                    function_result = await toolbox_tools[3](**function_call.args)
                elif fn_name == "search-hotels-by-location":
                    function_result = await toolbox_tools[2](**function_call.args)
                elif fn_name == "book-hotel":
                    function_result = await toolbox_tools[0](**function_call.args)
                elif fn_name == "update-hotel":
                    function_result = await toolbox_tools[4](**function_call.args)
                elif fn_name == "cancel-hotel":
                    function_result = await toolbox_tools[1](**function_call.args)
                else:
                    raise ValueError("Function name not present.")
                function_response = {"result": function_result}
                function_response_part = Part.from_function_response(
                    name=function_call.name,
                    response=function_response,
                )
                function_response_parts.append(function_response_part)

            if function_response_parts:
                tool_response_content = Content(role="tool", parts=function_response_parts)
                history.append(tool_response_content)

                response2 = genai_client.models.generate_content(
                    model="gemini-2.0-flash-001",
                    contents=history,
                    config=GenerateContentConfig(
                        tools=genai_tools,
                    ),
                )
                final_model_response_content = response2.candidates[0].content
                history.append(final_model_response_content)
                print(response2.text)

asyncio.run(run_application())
{{< /tab >}}
{{< /tabpane >}}
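
The Core example above dispatches on `toolbox_tools` list indices, which relies on the alphabetical ordering of tool names. A name-keyed dictionary is less brittle. This is a sketch with stand-in callables; in real code the dictionary entries would be the loaded Toolbox tools, and the assumption that each loaded tool exposes its tool name via `__name__` should be verified against the `toolbox-core` SDK you install:

```python
# Stand-in callables for illustration only; real code would use the entries
# of `toolbox_tools` returned by toolbox_client.load_toolset("my-toolset").
def book_hotel(hotel_id):
    return f"booked {hotel_id}"

def cancel_hotel(hotel_id):
    return f"cancelled {hotel_id}"

# Give the stand-ins the same names the model reports in function calls.
book_hotel.__name__ = "book-hotel"
cancel_hotel.__name__ = "cancel-hotel"

# Build the name -> callable map once, after loading the toolset.
tools_by_name = {tool.__name__: tool for tool in [book_hotel, cancel_hotel]}

# Dispatch by the model's reported function name instead of a list index.
result = tools_by_name["book-hotel"](hotel_id="1")
print(result)  # booked 1
```

With this map, the `if fn_name == ...` chain collapses to a single lookup, and adding a tool to the toolset no longer shifts any indices.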

{{< tabpane text=true persist=header >}}
{{% tab header="ADK" lang="en" %}}
To learn more about Agent Development Kit, check out the [ADK
documentation.](https://google.github.io/adk-docs/)
{{% /tab %}}
{{% tab header="Langchain" lang="en" %}}
To learn more about Agents in LangChain, check out the [LangGraph Agent
documentation.](https://langchain-ai.github.io/langgraph/reference/prebuilt/#langgraph.prebuilt.chat_agent_executor.create_react_agent)
{{% /tab %}}
{{% tab header="LlamaIndex" lang="en" %}}
To learn more about Agents in LlamaIndex, check out the [LlamaIndex
AgentWorkflow
documentation.](https://docs.llamaindex.ai/en/stable/examples/agent/agent_workflow_basic/)
{{% /tab %}}
{{% tab header="Core" lang="en" %}}
To learn more about tool calling with Google GenAI, check out the
[Google GenAI
Documentation](https://github.com/googleapis/python-genai?tab=readme-ov-file#manually-declare-and-invoke-a-function-for-function-calling).
{{% /tab %}}
{{< /tabpane >}}

1. Run your agent, and observe the results:

```sh
||||
241
docs/en/getting-started/mcp_quickstart/_index.md
Normal file
@@ -0,0 +1,241 @@
---
title: "Quickstart (MCP)"
type: docs
weight: 3
description: >
  How to get started running Toolbox locally with MCP Inspector.
---

## Overview

[Model Context Protocol](https://modelcontextprotocol.io) is an open protocol
that standardizes how applications provide context to LLMs. Check out this page
on how to [connect to Toolbox via MCP](../../how-to/connect_via_mcp.md).

## Step 1: Set up your database

In this section, we will create a database, insert some data that needs to be
accessed by our agent, and create a database user for Toolbox to connect with.

1. Connect to postgres using the `psql` command:

```bash
psql -h 127.0.0.1 -U postgres
```

Here, `postgres` denotes the default postgres superuser.

1. Create a new database and a new user:

{{< notice tip >}}
For a real application, it's best to follow the principle of least permission
and only grant the privileges your application needs.
{{< /notice >}}

```sql
CREATE USER toolbox_user WITH PASSWORD 'my-password';

CREATE DATABASE toolbox_db;
GRANT ALL PRIVILEGES ON DATABASE toolbox_db TO toolbox_user;

ALTER DATABASE toolbox_db OWNER TO toolbox_user;
```

1. End the database session:

```bash
\q
```

1. Connect to your database with your new user:

```bash
psql -h 127.0.0.1 -U toolbox_user -d toolbox_db
```

1. Create a table using the following command:

```sql
CREATE TABLE hotels(
  id INTEGER NOT NULL PRIMARY KEY,
  name VARCHAR NOT NULL,
  location VARCHAR NOT NULL,
  price_tier VARCHAR NOT NULL,
  checkin_date DATE NOT NULL,
  checkout_date DATE NOT NULL,
  booked BIT NOT NULL
);
```
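
The columns above map naturally onto a record type. As a sketch only, the same schema expressed as a Python dataclass (field names mirror the SQL columns; a `bool` stands in for the `BIT` column):

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Hotel:
    # One field per column of the hotels table.
    id: int
    name: str
    location: str
    price_tier: str
    checkin_date: date
    checkout_date: date
    booked: bool

# Mirrors the first row inserted in the next step.
row = Hotel(1, "Hilton Basel", "Basel", "Luxury",
            date(2024, 4, 22), date(2024, 4, 20), False)
print(row.name)  # Hilton Basel
```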

1. Insert data into the table.

```sql
INSERT INTO hotels(id, name, location, price_tier, checkin_date, checkout_date, booked)
VALUES
  (1, 'Hilton Basel', 'Basel', 'Luxury', '2024-04-22', '2024-04-20', B'0'),
  (2, 'Marriott Zurich', 'Zurich', 'Upscale', '2024-04-14', '2024-04-21', B'0'),
  (3, 'Hyatt Regency Basel', 'Basel', 'Upper Upscale', '2024-04-02', '2024-04-20', B'0'),
  (4, 'Radisson Blu Lucerne', 'Lucerne', 'Midscale', '2024-04-24', '2024-04-05', B'0'),
  (5, 'Best Western Bern', 'Bern', 'Upper Midscale', '2024-04-23', '2024-04-01', B'0'),
  (6, 'InterContinental Geneva', 'Geneva', 'Luxury', '2024-04-23', '2024-04-28', B'0'),
  (7, 'Sheraton Zurich', 'Zurich', 'Upper Upscale', '2024-04-27', '2024-04-02', B'0'),
  (8, 'Holiday Inn Basel', 'Basel', 'Upper Midscale', '2024-04-24', '2024-04-09', B'0'),
  (9, 'Courtyard Zurich', 'Zurich', 'Upscale', '2024-04-03', '2024-04-13', B'0'),
  (10, 'Comfort Inn Bern', 'Bern', 'Midscale', '2024-04-04', '2024-04-16', B'0');
```
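
The `search-hotels-by-name` tool configured in the next step filters these rows with `ILIKE '%' || $1 || '%'`, i.e. a case-insensitive substring match. In Python terms (a sketch over a few of the rows inserted above, not the actual database query):

```python
# A few (id, name, location) tuples from the rows inserted above.
hotels = [
    (1, "Hilton Basel", "Basel"),
    (3, "Hyatt Regency Basel", "Basel"),
    (2, "Marriott Zurich", "Zurich"),
]

def search_by_name(rows, pattern):
    # Mirrors: SELECT * FROM hotels WHERE name ILIKE '%' || $1 || '%'
    return [row for row in rows if pattern.lower() in row[1].lower()]

print(search_by_name(hotels, "basel"))  # both Basel hotels match
```

This is why the query "Find hotels in Basel with Basel in its name" later returns the Hilton Basel and the Hyatt Regency Basel.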

1. End the database session:

```bash
\q
```

## Step 2: Install and configure Toolbox

In this section, we will download Toolbox, configure our tools in a
`tools.yaml`, and then run the Toolbox server.

1. Download the latest version of Toolbox as a binary:

{{< notice tip >}}
Select the
[correct binary](https://github.com/googleapis/genai-toolbox/releases)
corresponding to your OS and CPU architecture.
{{< /notice >}}
<!-- {x-release-please-start-version} -->
```bash
export OS="linux/amd64" # one of linux/amd64, darwin/arm64, darwin/amd64, or windows/amd64
curl -O https://storage.googleapis.com/genai-toolbox/v0.8.0/$OS/toolbox
```
<!-- {x-release-please-end} -->

1. Make the binary executable:

```bash
chmod +x toolbox
```

1. Write the following into a `tools.yaml` file. Be sure to update any fields
such as `user`, `password`, or `database` that you may have customized in the
previous step.

{{< notice tip >}}
In practice, use environment variable replacement with the format ${ENV_NAME}
instead of hardcoding your secrets into the configuration file.
{{< /notice >}}

```yaml
sources:
  my-pg-source:
    kind: postgres
    host: 127.0.0.1
    port: 5432
    database: toolbox_db
    user: toolbox_user
    password: my-password
tools:
  search-hotels-by-name:
    kind: postgres-sql
    source: my-pg-source
    description: Search for hotels based on name.
    parameters:
      - name: name
        type: string
        description: The name of the hotel.
    statement: SELECT * FROM hotels WHERE name ILIKE '%' || $1 || '%';
  search-hotels-by-location:
    kind: postgres-sql
    source: my-pg-source
    description: Search for hotels based on location.
    parameters:
      - name: location
        type: string
        description: The location of the hotel.
    statement: SELECT * FROM hotels WHERE location ILIKE '%' || $1 || '%';
  book-hotel:
    kind: postgres-sql
    source: my-pg-source
    description: >-
      Book a hotel by its ID. If the hotel is successfully booked, returns a
      NULL; raises an error if not.
    parameters:
      - name: hotel_id
        type: string
        description: The ID of the hotel to book.
    statement: UPDATE hotels SET booked = B'1' WHERE id = $1;
  update-hotel:
    kind: postgres-sql
    source: my-pg-source
    description: >-
      Update a hotel's check-in and check-out dates by its ID. Returns a message
      indicating whether the hotel was successfully updated or not.
    parameters:
      - name: hotel_id
        type: string
        description: The ID of the hotel to update.
      - name: checkin_date
        type: string
        description: The new check-in date of the hotel.
      - name: checkout_date
        type: string
        description: The new check-out date of the hotel.
    statement: >-
      UPDATE hotels SET checkin_date = CAST($2 as date), checkout_date = CAST($3
      as date) WHERE id = $1;
  cancel-hotel:
    kind: postgres-sql
    source: my-pg-source
    description: Cancel a hotel by its ID.
    parameters:
      - name: hotel_id
        type: string
        description: The ID of the hotel to cancel.
    statement: UPDATE hotels SET booked = B'0' WHERE id = $1;
toolsets:
  my-toolset:
    - search-hotels-by-name
    - search-hotels-by-location
    - book-hotel
    - update-hotel
    - cancel-hotel
```
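
The `book-hotel` and `cancel-hotel` statements above simply flip the `booked` bit for one row. As a minimal in-memory sketch of that behavior (illustrative only; the real state lives in Postgres):

```python
# booked False/True stands in for the BIT column's B'0'/B'1'.
hotels = {
    1: {"name": "Hilton Basel", "booked": False},
    3: {"name": "Hyatt Regency Basel", "booked": False},
}

def book_hotel(hotel_id):
    # Mirrors: UPDATE hotels SET booked = B'1' WHERE id = $1
    hotels[hotel_id]["booked"] = True

def cancel_hotel(hotel_id):
    # Mirrors: UPDATE hotels SET booked = B'0' WHERE id = $1
    hotels[hotel_id]["booked"] = False

book_hotel(1)
cancel_hotel(1)
print(hotels[1]["booked"])  # False
```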

For more info on tools, check out the
[Tools](../../resources/tools/_index.md) section.

1. Run the Toolbox server, pointing to the `tools.yaml` file created earlier:

```bash
./toolbox --tools-file "tools.yaml"
```

## Step 3: Connect to MCP Inspector

1. Run the MCP Inspector:

```bash
npx @modelcontextprotocol/inspector
```

1. Type `y` when it asks to install the inspector package.

1. It should show the following when the MCP Inspector is up and running:

```bash
🔍 MCP Inspector is up and running at http://127.0.0.1:5173 🚀
```

1. Open the above link in your browser.

1. For `Transport Type`, select `SSE`.

1. For `URL`, type in `http://127.0.0.1:5000/mcp/sse`.

1. Click Connect.



1. Select `List Tools`; you will see a list of tools configured in `tools.yaml`.


1. Test out your tools here!
BIN docs/en/getting-started/mcp_quickstart/inspector.png (new file, 22 KiB)
BIN docs/en/getting-started/mcp_quickstart/inspector_tools.png (new file, 24 KiB)
9
docs/en/how-to/connect-ide/_index.md
Normal file
@@ -0,0 +1,9 @@
---
title: "Connect from your IDE"
type: docs
weight: 1
description: >
  List of guides detailing how to connect your AI tools (IDEs) to Toolbox using MCP.
aliases:
- /how-to/connect_tools_using_mcp
---
13
docs/en/how-to/connect-ide/alloydb_pg_mcp.md
Normal file
@@ -0,0 +1,13 @@
---
title: "AlloyDB using MCP"
type: docs
weight: 2
description: >
  Connect your IDE to AlloyDB using Toolbox.
---
<html>
<head>
  <link rel="canonical" href="https://cloud.google.com/alloydb/docs/pre-built-tools-with-mcp-toolbox"/>
  <meta http-equiv="refresh" content="0;url=https://cloud.google.com/alloydb/docs/pre-built-tools-with-mcp-toolbox"/>
</head>
</html>
13
docs/en/how-to/connect-ide/bigquery_mcp.md
Normal file
@@ -0,0 +1,13 @@
---
title: "BigQuery using MCP"
type: docs
weight: 2
description: >
  Connect your IDE to BigQuery using Toolbox.
---
<html>
<head>
  <link rel="canonical" href="https://cloud.google.com/bigquery/docs/pre-built-tools-with-mcp-toolbox"/>
  <meta http-equiv="refresh" content="0;url=https://cloud.google.com/bigquery/docs/pre-built-tools-with-mcp-toolbox"/>
</head>
</html>
13
docs/en/how-to/connect-ide/cloud_sql_mssql_mcp.md
Normal file
@@ -0,0 +1,13 @@
---
title: "Cloud SQL for SQL Server using MCP"
type: docs
weight: 2
description: >
  Connect your IDE to Cloud SQL for SQL Server using Toolbox.
---
<html>
<head>
  <link rel="canonical" href="https://cloud.google.com/sql/docs/sqlserver/pre-built-tools-with-mcp-toolbox"/>
  <meta http-equiv="refresh" content="0;url=https://cloud.google.com/sql/docs/sqlserver/pre-built-tools-with-mcp-toolbox"/>
</head>
</html>
13
docs/en/how-to/connect-ide/cloud_sql_mysql_mcp.md
Normal file
@@ -0,0 +1,13 @@
---
title: "Cloud SQL for MySQL using MCP"
type: docs
weight: 2
description: >
  Connect your IDE to Cloud SQL for MySQL using Toolbox.
---
<html>
<head>
  <link rel="canonical" href="https://cloud.google.com/sql/docs/mysql/pre-built-tools-with-mcp-toolbox"/>
  <meta http-equiv="refresh" content="0;url=https://cloud.google.com/sql/docs/mysql/pre-built-tools-with-mcp-toolbox"/>
</head>
</html>
13
docs/en/how-to/connect-ide/cloud_sql_pg_mcp.md
Normal file
@@ -0,0 +1,13 @@
---
title: "Cloud SQL for Postgres using MCP"
type: docs
weight: 2
description: >
  Connect your IDE to Cloud SQL for Postgres using Toolbox.
---
<html>
<head>
  <link rel="canonical" href="https://cloud.google.com/sql/docs/postgres/pre-built-tools-with-mcp-toolbox"/>
  <meta http-equiv="refresh" content="0;url=https://cloud.google.com/sql/docs/postgres/pre-built-tools-with-mcp-toolbox"/>
</head>
</html>
278
docs/en/how-to/connect-ide/postgres_mcp.md
Normal file
@@ -0,0 +1,278 @@
|
||||
---
|
||||
title: "PostgreSQL using MCP"
|
||||
type: docs
|
||||
weight: 2
|
||||
description: >
|
||||
Connect your IDE to PostgreSQL using Toolbox.
|
||||
---
|
||||
|
||||
[Model Context Protocol (MCP)](https://modelcontextprotocol.io/introduction) is
|
||||
an open protocol for connecting Large Language Models (LLMs) to data sources
|
||||
like Postgres. This guide covers how to use [MCP Toolbox for Databases][toolbox]
|
||||
to expose your developer assistant tools to a Postgres instance:
|
||||
|
||||
* [Cursor][cursor]
|
||||
* [Windsurf][windsurf] (Codium)
|
||||
* [Visual Studio Code][vscode] (Copilot)
|
||||
* [Cline][cline] (VS Code extension)
|
||||
* [Claude desktop][claudedesktop]
|
||||
* [Claude code][claudecode]
|
||||
|
||||
[toolbox]: https://github.com/googleapis/genai-toolbox
|
||||
[cursor]: #configure-your-mcp-client
|
||||
[windsurf]: #configure-your-mcp-client
|
||||
[vscode]: #configure-your-mcp-client
|
||||
[cline]: #configure-your-mcp-client
|
||||
[claudedesktop]: #configure-your-mcp-client
|
||||
[claudecode]: #configure-your-mcp-client
|
||||
|
||||
{{< notice tip >}}
|
||||
This guide can be used with [AlloyDB
|
||||
Omni](https://cloud.google.com/alloydb/omni/current/docs/overview).
|
||||
{{< /notice >}}
|
||||
|
||||
## Set up the database
|
||||
|
||||
1. Create or select a PostgreSQL instance.
|
||||
|
||||
* [Install PostgreSQL locally](https://www.postgresql.org/download/)
|
||||
* [Install AlloyDB Omni](https://cloud.google.com/alloydb/omni/current/docs/quickstart)
|
||||
|
||||
1. Create or reuse [a database
|
||||
user](https://cloud.google.com/alloydb/omni/current/docs/database-users/manage-users)
|
||||
and have the username and password ready.
|
||||
|
||||
## Install MCP Toolbox
|
||||
|
||||
1. Download the latest version of Toolbox as a binary. Select the [correct
|
||||
binary](https://github.com/googleapis/genai-toolbox/releases) corresponding
|
||||
to your OS and CPU architecture. You are required to use Toolbox version
|
||||
V0.6.0+:
|
||||
|

   <!-- {x-release-please-start-version} -->
   {{< tabpane persist=header >}}
{{< tab header="linux/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.8.0/linux/amd64/toolbox
{{< /tab >}}

{{< tab header="darwin/arm64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.8.0/darwin/arm64/toolbox
{{< /tab >}}

{{< tab header="darwin/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.8.0/darwin/amd64/toolbox
{{< /tab >}}

{{< tab header="windows/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.8.0/windows/amd64/toolbox
{{< /tab >}}
   {{< /tabpane >}}
   <!-- {x-release-please-end} -->

1. Make the binary executable:

   ```bash
   chmod +x toolbox
   ```

1. Verify the installation:

   ```bash
   ./toolbox --version
   ```

## Configure your MCP Client

{{< tabpane text=true >}}
{{% tab header="Claude code" lang="en" %}}

1. Install [Claude
   Code](https://docs.anthropic.com/en/docs/agents-and-tools/claude-code/overview).
1. Create a `.mcp.json` file in your project root if it doesn't exist.
1. Add the following configuration, replace the environment variables with your
   values, and save:

   ```json
   {
     "mcpServers": {
       "postgres": {
         "command": "./PATH/TO/toolbox",
         "args": ["--prebuilt","postgres","--stdio"],
         "env": {
           "POSTGRES_HOST": "",
           "POSTGRES_PORT": "",
           "POSTGRES_DATABASE": "",
           "POSTGRES_USER": "",
           "POSTGRES_PASSWORD": ""
         }
       }
     }
   }
   ```

1. Restart Claude code to apply the new configuration.
{{% /tab %}}

{{% tab header="Claude desktop" lang="en" %}}

1. Open [Claude desktop](https://claude.ai/download) and navigate to Settings.
1. Under the Developer tab, tap Edit Config to open the configuration file.
1. Add the following configuration, replace the environment variables with your
   values, and save:

   ```json
   {
     "mcpServers": {
       "postgres": {
         "command": "./PATH/TO/toolbox",
         "args": ["--prebuilt","postgres","--stdio"],
         "env": {
           "POSTGRES_HOST": "",
           "POSTGRES_PORT": "",
           "POSTGRES_DATABASE": "",
           "POSTGRES_USER": "",
           "POSTGRES_PASSWORD": ""
         }
       }
     }
   }
   ```

1. Restart Claude desktop.
1. From the new chat screen, you should see a hammer (MCP) icon appear with the
   new MCP server available.
{{% /tab %}}

{{% tab header="Cline" lang="en" %}}

1. Open the [Cline](https://github.com/cline/cline) extension in VS Code and tap
   the **MCP Servers** icon.
1. Tap Configure MCP Servers to open the configuration file.
1. Add the following configuration, replace the environment variables with your
   values, and save:

   ```json
   {
     "mcpServers": {
       "postgres": {
         "command": "./PATH/TO/toolbox",
         "args": ["--prebuilt","postgres","--stdio"],
         "env": {
           "POSTGRES_HOST": "",
           "POSTGRES_PORT": "",
           "POSTGRES_DATABASE": "",
           "POSTGRES_USER": "",
           "POSTGRES_PASSWORD": ""
         }
       }
     }
   }
   ```

1. You should see a green active status after the server is successfully
   connected.
{{% /tab %}}

{{% tab header="Cursor" lang="en" %}}

1. Create a `.cursor` directory in your project root if it doesn't exist.
1. Create a `.cursor/mcp.json` file if it doesn't exist and open it.
1. Add the following configuration, replace the environment variables with your
   values, and save:

   ```json
   {
     "mcpServers": {
       "postgres": {
         "command": "./PATH/TO/toolbox",
         "args": ["--prebuilt","postgres","--stdio"],
         "env": {
           "POSTGRES_HOST": "",
           "POSTGRES_PORT": "",
           "POSTGRES_DATABASE": "",
           "POSTGRES_USER": "",
           "POSTGRES_PASSWORD": ""
         }
       }
     }
   }
   ```

1. Open [Cursor](https://www.cursor.com/) and navigate to **Settings > Cursor
   Settings > MCP**. You should see a green active status after the server is
   successfully connected.
{{% /tab %}}

{{% tab header="Visual Studio Code (Copilot)" lang="en" %}}

1. Open [VS Code](https://code.visualstudio.com/docs/copilot/overview) and
   create a `.vscode` directory in your project root if it doesn't exist.
1. Create a `.vscode/mcp.json` file if it doesn't exist and open it.
1. Add the following configuration, replace the environment variables with your
   values, and save:

   ```json
   {
     "mcpServers": {
       "postgres": {
         "command": "./PATH/TO/toolbox",
         "args": ["--prebuilt","postgres","--stdio"],
         "env": {
           "POSTGRES_HOST": "",
           "POSTGRES_PORT": "",
           "POSTGRES_DATABASE": "",
           "POSTGRES_USER": "",
           "POSTGRES_PASSWORD": ""
         }
       }
     }
   }
   ```

{{% /tab %}}

{{% tab header="Windsurf" lang="en" %}}

1. Open [Windsurf](https://docs.codeium.com/windsurf) and navigate to the
   Cascade assistant.
1. Tap on the hammer (MCP) icon, then Configure to open the configuration file.
1. Add the following configuration, replace the environment variables with your
   values, and save:

   ```json
   {
     "mcpServers": {
       "postgres": {
         "command": "./PATH/TO/toolbox",
         "args": ["--prebuilt","postgres","--stdio"],
         "env": {
           "POSTGRES_HOST": "",
           "POSTGRES_PORT": "",
           "POSTGRES_DATABASE": "",
           "POSTGRES_USER": "",
           "POSTGRES_PASSWORD": ""
         }
       }
     }
   }
   ```

{{% /tab %}}
{{< /tabpane >}}
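The client tabs above all share the same server entry. The following sketch generates that JSON programmatically; the binary path and connection values are placeholders you would replace with your own:

```python
import json

def make_postgres_server_entry(toolbox_path, env):
    # Builds the "postgres" MCP server entry used by all of the clients above.
    return {
        "mcpServers": {
            "postgres": {
                "command": toolbox_path,
                "args": ["--prebuilt", "postgres", "--stdio"],
                "env": env,
            }
        }
    }

config = make_postgres_server_entry(
    "./PATH/TO/toolbox",  # placeholder: path to the toolbox binary
    {
        "POSTGRES_HOST": "127.0.0.1",
        "POSTGRES_PORT": "5432",
        "POSTGRES_DATABASE": "mydb",
        "POSTGRES_USER": "myuser",
        "POSTGRES_PASSWORD": "",  # fill in from a secret store, not source control
    },
)
print(json.dumps(config, indent=2))
```

Write the output to the config file your client expects (`.mcp.json`, `.cursor/mcp.json`, `.vscode/mcp.json`, and so on).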

## Use Tools

Your AI tool is now connected to Postgres using MCP. Try asking your AI
assistant to list tables, create a table, or define and execute other SQL
statements.

The following tools are available to the LLM:

1. **list_tables**: lists tables and descriptions
1. **execute_sql**: execute any SQL statement

{{< notice note >}}
Prebuilt tools are pre-1.0, so expect some tool changes between versions. LLMs
will adapt to the tools available, so this shouldn't affect most users.
{{< /notice >}}

docs/en/how-to/connect-ide/spanner_mcp.md (new file, 13 lines)
@@ -0,0 +1,13 @@

---
title: "Spanner using MCP"
type: docs
weight: 2
description: >
  Connect your IDE to Spanner using Toolbox.
---
<html>
<head>
<link rel="canonical" href="https://cloud.google.com/spanner/docs/pre-built-tools-with-mcp-toolbox"/>
<meta http-equiv="refresh" content="0;url=https://cloud.google.com/spanner/docs/pre-built-tools-with-mcp-toolbox"/>
</head>
</html>
docs/en/how-to/connect_via_mcp.md (new file, 147 lines)
@@ -0,0 +1,147 @@

---
title: "Connect via MCP Client"
type: docs
weight: 1
description: >
  How to connect to Toolbox from an MCP Client.
---

## Toolbox SDKs vs Model Context Protocol (MCP)

Toolbox now supports connections via both the native Toolbox SDKs and via [Model
Context Protocol (MCP)](https://modelcontextprotocol.io/). However, Toolbox has
several features which are not supported in the MCP specification (such as
Authenticated Parameters and Authorized Invocations).

We recommend using the native SDKs over MCP clients to leverage these features.
The native SDKs can be combined with MCP clients in many cases.

### Protocol Versions

Toolbox currently supports the following versions of the MCP specification:

* [2024-11-05](https://spec.modelcontextprotocol.io/specification/2024-11-05/)

### Features Not Supported by MCP

Toolbox has several features that are not yet supported in the MCP specification:

* **AuthZ/AuthN:** There is no auth implementation in the `2024-11-05`
  specification. This includes:
  * [Authenticated Parameters](../resources/tools/_index.md#authenticated-parameters)
  * [Authorized Invocations](../resources/tools/_index.md#authorized-invocations)
* **Notifications:** Currently, editing Toolbox Tools requires a server restart.
  Clients should reload tools on disconnect to get the latest version.

## Connecting to Toolbox with an MCP client

### Before you begin

{{< notice note >}}
MCP is only compatible with Toolbox version 0.3.0 and above.
{{< /notice >}}

1. [Install](../getting-started/introduction/_index.md#installing-the-server)
   Toolbox version 0.3.0+.

1. Make sure you've set up and initialized your database.

1. [Set up](../getting-started/configure.md) your `tools.yaml` file.

### Connecting via Standard Input/Output (stdio)

Toolbox supports the
[stdio](https://modelcontextprotocol.io/docs/concepts/transports#standard-input%2Foutput-stdio)
transport protocol. Users who wish to use stdio must include the
`--stdio` flag when running Toolbox:

```bash
./toolbox --stdio
```

When running with stdio, Toolbox listens on stdio instead of acting as a
remote HTTP server. Logs are set to the `warn` level by default; `debug` and
`info` logs are not supported with stdio.
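Over stdio, the client and server exchange newline-delimited JSON-RPC 2.0 messages. A minimal sketch of the first message an MCP client writes to Toolbox's stdin, with field shapes following the `2024-11-05` spec (treat the exact `clientInfo` values as placeholders):

```python
import json

def initialize_request(request_id=1):
    # JSON-RPC 2.0 "initialize" request; stdio transport frames one JSON
    # message per line. Field shapes follow the 2024-11-05 MCP spec.
    msg = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "initialize",
        "params": {
            "protocolVersion": "2024-11-05",
            "capabilities": {},
            "clientInfo": {"name": "example-client", "version": "0.0.1"},
        },
    }
    return json.dumps(msg) + "\n"  # newline-delimited framing

line = initialize_request()
```

In practice an MCP client library handles this handshake for you; the sketch only shows what travels over the pipe.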

{{< notice note >}}
Toolbox enables dynamic reloading by default. To disable it, use the `--disable-reload` flag.
{{< /notice >}}

### Connecting via HTTP

Toolbox supports the HTTP transport protocol with and without SSE.

{{< tabpane text=true >}} {{% tab header="HTTP with SSE" lang="en" %}}
Add the following configuration to your MCP client configuration:

```json
{
  "mcpServers": {
    "toolbox": {
      "type": "sse",
      "url": "http://127.0.0.1:5000/mcp/sse"
    }
  }
}
```

If you would like to connect to a specific toolset, replace `url` with
`"http://127.0.0.1:5000/mcp/{toolset_name}/sse"`.
{{% /tab %}} {{% tab header="HTTP POST" lang="en" %}}
Connect to Toolbox via HTTP POST at `http://127.0.0.1:5000/mcp`.

If you would like to connect to a specific toolset, connect via
`http://127.0.0.1:5000/mcp/{toolset_name}`.
{{% /tab %}} {{< /tabpane >}}
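The endpoint patterns above compose from the base address, an optional toolset name, and an optional `/sse` suffix. A small helper makes the scheme explicit (the default port matches the examples above):

```python
def toolbox_mcp_url(base="http://127.0.0.1:5000", toolset_name=None, sse=False):
    # Mirrors the endpoints above: /mcp, /mcp/{toolset_name},
    # and the /sse suffix for the SSE transport.
    url = f"{base}/mcp"
    if toolset_name:
        url += f"/{toolset_name}"
    if sse:
        url += "/sse"
    return url

print(toolbox_mcp_url(sse=True))                  # http://127.0.0.1:5000/mcp/sse
print(toolbox_mcp_url(toolset_name="my-toolset")) # http://127.0.0.1:5000/mcp/my-toolset
```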

### Using the MCP Inspector with Toolbox

Use the MCP [Inspector](https://github.com/modelcontextprotocol/inspector) for
testing and debugging the Toolbox server.

{{< tabpane text=true >}}
{{% tab header="STDIO" lang="en" %}}

1. Run Inspector with Toolbox as a subprocess:

   ```bash
   npx @modelcontextprotocol/inspector ./toolbox --stdio
   ```

1. In the `Transport Type` dropdown menu, select `STDIO`.

1. In `Command`, make sure that it is set to `./toolbox` (or the correct path
   to where the Toolbox binary is installed).

1. In `Arguments`, make sure that it's filled with `--stdio`.

1. Click the `Connect` button. It might take a while to spin up Toolbox. Voila!
   You should be able to inspect your toolbox tools!
{{% /tab %}}
{{% tab header="HTTP with SSE" lang="en" %}}

1. [Run Toolbox](../getting-started/introduction/_index.md#running-the-server).

1. In a separate terminal, run Inspector directly through `npx`:

   ```bash
   npx @modelcontextprotocol/inspector
   ```

1. In the `Transport Type` dropdown menu, select `SSE`.

1. For `URL`, type in `http://127.0.0.1:5000/mcp/sse` to use all tools, or
   `http://127.0.0.1:5000/mcp/{toolset_name}/sse` to use a specific toolset.

1. Click the `Connect` button. Voila! You should be able to inspect your toolbox
   tools!
{{% /tab %}} {{< /tabpane >}}

### Tested Clients

| Client | SSE Works | MCP Config Docs |
|--------|-----------|-----------------|
| Claude Desktop | ✅ | <https://modelcontextprotocol.io/quickstart/user#1-download-claude-for-desktop> |
| MCP Inspector | ✅ | <https://github.com/modelcontextprotocol/inspector> |
| Cursor | ✅ | <https://docs.cursor.com/context/model-context-protocol> |
| Windsurf | ✅ | <https://docs.windsurf.com/windsurf/mcp> |
| VS Code (Insiders) | ✅ | <https://code.visualstudio.com/docs/copilot/chat/mcp-servers> |

@@ -1,14 +1,13 @@
---
title: "Deploy using Docker Compose"
type: docs
-weight: 3
+weight: 4
description: >
  How to deploy Toolbox using Docker Compose.
---

<!-- Contributor: Sujith R Pillai <sujithrpillai@gmail.com> -->

## Before you begin

1. [Install Docker Compose.](https://docs.docker.com/compose/install/)

@@ -35,7 +34,7 @@ services:
      - "5000:5000"
    volumes:
      - ./config:/config
-    command: [ "toolbox", "--tools_file", "/config/tools.yaml", "--address", "0.0.0.0"]
+    command: [ "toolbox", "--tools-file", "/config/tools.yaml", "--address", "0.0.0.0"]
    depends_on:
      db:
        condition: service_healthy

@@ -74,19 +73,16 @@ networks:
docker-compose up -d
```

{{< notice tip >}}
You can use this setup to quickly set up Toolbox + Postgres to follow along in
our [Quickstart](../getting-started/local_quickstart.md).
{{< /notice >}}

## Connecting with Toolbox Client SDK

Next, we will use Toolbox with the Client SDKs:

1. The URL for the Toolbox server running via docker-compose will be:

@@ -101,14 +97,14 @@ Next, we will use Toolbox with the Client SDKs:
from toolbox_langchain import ToolboxClient

# Replace with the cloud run service URL generated above
-toolbox = ToolboxClient("http://$YOUR_URL")
+async with ToolboxClient("http://$YOUR_URL") as toolbox:
{{< /tab >}}
{{< tab header="Llamaindex" lang="Python" >}}
from toolbox_llamaindex import ToolboxClient

# Replace with the cloud run service URL generated above
-toolbox = ToolboxClient("http://$YOUR_URL")
+async with ToolboxClient("http://$YOUR_URL") as toolbox:
{{< /tab >}}
{{< /tabpane >}}

@@ -1,7 +1,7 @@
---
title: "Deploy to Kubernetes"
type: docs
-weight: 2
+weight: 4
description: >
  How to set up and configure Toolbox to deploy on Kubernetes with Google Kubernetes Engine (GKE).
---

@@ -9,7 +9,6 @@ description: >

## Before you begin

1. Set the PROJECT_ID environment variable:

   ```bash

@@ -40,7 +39,6 @@ description: >
   ```bash
   kubectl version --client
   ```

1. If needed, install the `kubectl` component using the Google Cloud CLI:

@@ -62,7 +60,7 @@ description: >
   gcloud iam service-accounts create $SA_NAME
   ```

1. Grant any IAM roles necessary to the IAM service account. Each source has a
   list of necessary IAM permissions listed on its page. The example below is
   for the Cloud SQL for PostgreSQL source:

@@ -76,7 +74,7 @@ description: >
   - [CloudSQL IAM Identity](../resources/sources/cloud-sql-pg.md#iam-permissions)
   - [Spanner IAM Identity](../resources/sources/spanner.md#iam-permissions)

## Deploy to Kubernetes

1. Set environment variables:

@@ -94,7 +92,7 @@ description: >

   ```bash
   gcloud container clusters create-auto $CLUSTER_NAME \
     --location=us-central1
   ```

1. Get authentication credentials to interact with the cluster. This also

@@ -254,6 +252,7 @@ description: >
   ```

## Clean up resources

1. Delete secret.

   ```bash
@@ -1,7 +1,7 @@
---
title: "Deploy to Cloud Run"
type: docs
-weight: 1
+weight: 3
description: >
  How to set up and configure Toolbox to run on Cloud Run.
---

@@ -33,7 +33,7 @@ description: >
   cloudbuild.googleapis.com \
   artifactregistry.googleapis.com \
   iam.googleapis.com \
   secretmanager.googleapis.com
   ```

@@ -48,21 +48,12 @@ description: >
   - Cloud Run Developer (roles/run.developer)
   - Service Account User role (roles/iam.serviceAccountUser)

{{< notice note >}}
If you are under a domain restriction organization policy
[restricting](https://cloud.google.com/run/docs/authenticating/public#domain-restricted-sharing)
unauthenticated invocations for your project, you will need to access your
deployed service as described under [Testing private
services](https://cloud.google.com/run/docs/triggering/https-request#testing-private).
{{< /notice >}}

{{< notice note >}}
If you are using sources that require VPC access (such as
AlloyDB or Cloud SQL over private IP), make sure your Cloud Run service and the
database are in the same VPC network.
{{< /notice >}}

## Create a service account

1. Create a backend service account if you don't already have one:

@@ -71,7 +62,7 @@ database are in the same VPC network.
   gcloud iam service-accounts create toolbox-identity
   ```

1. Grant permissions to use secret manager:

   ```bash
   gcloud projects add-iam-policy-binding $PROJECT_ID \

@@ -79,7 +70,8 @@ database are in the same VPC network.
     --role roles/secretmanager.secretAccessor
   ```

1. Grant additional permissions to the service account that are specific to the
   source, e.g.:
   - [AlloyDB for PostgreSQL](../resources/sources/alloydb-pg.md#iam-permissions)
   - [Cloud SQL for PostgreSQL](../resources/sources/cloud-sql-pg.md#iam-permissions)

@@ -87,7 +79,7 @@ database are in the same VPC network.

Create a `tools.yaml` file that contains your configuration for Toolbox. For
details, see the
-[configuration](https://github.com/googleapis/genai-toolbox/blob/main/README.md#configuration)
+[configuration](https://googleapis.github.io/genai-toolbox/resources/sources/)
section.

## Deploy to Cloud Run

@@ -105,7 +97,8 @@ section.
   gcloud secrets versions add tools --data-file=tools.yaml
   ```

1. Set an environment variable to the container image that you want to use for
   Cloud Run:

   ```bash
   export IMAGE=us-central1-docker.pkg.dev/database-toolbox/toolbox/toolbox:latest

@@ -119,7 +112,7 @@ section.
     --service-account toolbox-identity \
     --region us-central1 \
     --set-secrets "/app/tools.yaml=tools:latest" \
-    --args="--tools_file=/app/tools.yaml","--address=0.0.0.0","--port=8080"
+    --args="--tools-file=/app/tools.yaml","--address=0.0.0.0","--port=8080"
   # --allow-unauthenticated # https://cloud.google.com/run/docs/authenticating/public#gcloud
   ```

@@ -131,34 +124,30 @@ section.
     --service-account toolbox-identity \
     --region us-central1 \
     --set-secrets "/app/tools.yaml=tools:latest" \
-    --args="--tools_file=/app/tools.yaml","--address=0.0.0.0","--port=8080" \
+    --args="--tools-file=/app/tools.yaml","--address=0.0.0.0","--port=8080" \
   # TODO(dev): update the following to match your VPC if necessary
     --network default \
     --subnet default
   # --allow-unauthenticated # https://cloud.google.com/run/docs/authenticating/public#gcloud
   ```

## Connecting to Cloud Run

Next, we will use `gcloud` to authenticate requests to our Cloud Run instance:

1. Run the `run services proxy` to proxy connections to Cloud Run:

   ```bash
   gcloud run services proxy toolbox --port=8080 --region=us-central1
   ```

   If you are prompted to install the proxy, reply *Y* to install.

1. Finally, use `curl` to verify the endpoint works:

   ```bash
   curl http://127.0.0.1:8080
   ```

## Connecting with Toolbox Client SDK

-Next, we will use Toolbox with client SDK:
+You can connect to Toolbox Cloud Run instances directly through the SDK.

1. [Set up `Cloud Run Invoker` role
   access](https://cloud.google.com/run/docs/securing/managing-access#service-add-principals)
   to your Cloud Run service.

1. Set up [Application Default
   Credentials](https://cloud.google.com/docs/authentication/set-up-adc-local-dev-environment)
   for the principal you set up the `Cloud Run Invoker` role access for.

   {{< notice tip >}}
   If you're working in some environment other than local, set up
   [environment-specific Default
   Credentials](https://cloud.google.com/docs/authentication/provide-credentials-adc).
   {{< /notice >}}

1. Run the following to retrieve a non-deterministic URL for the Cloud Run service:

@@ -168,18 +157,16 @@ Next, we will use Toolbox with client SDK:

1. Import and initialize the toolbox client with the URL retrieved above:

-   {{< tabpane persist=header >}}
-   {{< tab header="LangChain" lang="Python" >}}
-   from toolbox_langchain import ToolboxClient
+   ```python
+   from toolbox_core import ToolboxClient, auth_methods

-   # Replace with the cloud run service URL generated above
-   toolbox = ToolboxClient("http://$YOUR_URL")
-   {{< /tab >}}
-   {{< tab header="Llamaindex" lang="Python" >}}
-   from toolbox_llamaindex import ToolboxClient
+   auth_token_provider = auth_methods.aget_google_id_token # can also use sync method

-   # Replace with the cloud run service URL generated above
-   toolbox = ToolboxClient("http://$YOUR_URL")
-   {{< /tab >}}
-   {{< /tabpane >}}
+   # Replace with the Cloud Run service URL generated in the previous step.
+   async with ToolboxClient(
+       URL,
+       client_headers={"Authorization": auth_token_provider},
+   ) as toolbox:
+   ```

Now, you can use this client to connect to the deployed Cloud Run instance!
@@ -1,13 +1,13 @@
---
title: "Export Telemetry"
type: docs
-weight: 4
+weight: 5
description: >
  How to set up and configure Toolbox to use the Otel Collector.
---

## About

The [OpenTelemetry Collector][about-collector] offers a vendor-agnostic
implementation of how to receive, process and export telemetry data. It removes

@@ -20,6 +20,7 @@ the need to run, operate, and maintain multiple agents/collectors.
To configure the collector, you will have to provide a configuration file. The
configuration file consists of four classes of pipeline component that access
telemetry data:

- `Receivers`
- `Processors`
- `Exporters`

@@ -56,7 +57,7 @@ service:
      exporters: ["googlecloud"]
   ```

-## Running the Connector
+## Running the Collector

There are a couple of steps to run and use a Collector.
@@ -2,5 +2,6 @@
title: "Resources"
type: docs
weight: 4
-description: List of reference documentation for resources in Toolbox.
+description: >
+  List of reference documentation for resources in Toolbox.
---
@@ -7,11 +7,11 @@ description: >
---

AuthServices represent services that handle authentication and authorization. It
can primarily be used by [Tools](../tools) in two different ways:

- [**Authorized Invocation**][auth-invoke] is when a tool
-  to be validate by the auth service before the call can be invoked. Toolbox
-  will rejected an calls that fail to validate or have an invalid token.
+  is validated by the auth service before the call can be invoked. Toolbox
+  will reject any calls that fail to validate or have an invalid token.
- [**Authenticated Parameters**][auth-params] replace the value of a parameter
  with a field from an [OIDC][openid-claims] claim. Toolbox will automatically
  resolve the ID token provided by the client and replace the parameter in the

@@ -35,19 +35,24 @@ If you are accessing Toolbox with multiple applications, each
authServices:
  my_auth_app_1:
    kind: google
-    clientId: YOUR_CLIENT_ID_1
+    clientId: ${YOUR_CLIENT_ID_1}
  my_auth_app_2:
    kind: google
-    clientId: YOUR_CLIENT_ID_2
+    clientId: ${YOUR_CLIENT_ID_2}
```

{{< notice tip >}}
Use environment variable replacement with the format ${ENV_NAME}
instead of hardcoding your secrets into the configuration file.
{{< /notice >}}
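Toolbox performs the `${ENV_NAME}` substitution itself when it loads the file; the sketch below only illustrates the semantics of the tip above (the variable name is just an example):

```python
import os
import re

def replace_env_refs(text):
    # Substitute ${ENV_NAME} references with environment variable values,
    # failing loudly when a referenced variable is unset.
    def lookup(match):
        name = match.group(1)
        if name not in os.environ:
            raise KeyError(f"environment variable {name} is not set")
        return os.environ[name]
    return re.sub(r"\$\{([A-Za-z_][A-Za-z0-9_]*)\}", lookup, text)

os.environ["YOUR_CLIENT_ID_1"] = "abc123.apps.googleusercontent.com"
print(replace_env_refs("clientId: ${YOUR_CLIENT_ID_1}"))
```

Keeping the secret in the environment means the config file can be committed without exposing credentials.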

After you've configured an `authService`, you'll need to reference it in the
configuration for each tool that should use it:

-- **Authorized Invocations** for authorizing a tool call, [use the
-  `requiredAuth` field in a tool config][auth-invoke]
-- **Authenticated Parameters** for using the value from a ODIC claim, [use the
-  `authServices` field in a parameter config][auth-params]
+- **Authorized Invocations** for authorizing a tool call, [use the
+  `authRequired` field in a tool config][auth-invoke]
+- **Authenticated Parameters** for using the value from an OIDC claim, [use the
+  `authServices` field in a parameter config][auth-params]

## Specifying ID Tokens from Clients

@@ -55,59 +60,125 @@ After [configuring](#example) your `authServices` section, use a Toolbox SDK to
add your ID tokens to the header of a Tool invocation request. When specifying a
token you will provide a function (that returns an ID token). This function is
called when the tool is invoked. This allows you to cache and refresh the ID
token as needed.

The primary method for providing these getters is via the `auth_token_getters`
parameter when loading tools, or the `add_auth_token_getter()` /
`add_auth_token_getters()` methods on a loaded tool object.
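Because the getter runs on every invocation, caching the token is worthwhile. A minimal sketch of a caching wrapper (a hypothetical helper, not part of the Toolbox SDKs; the TTL and clock are illustrative):

```python
import time

def make_cached_getter(fetch_token, ttl_seconds=300, clock=time.monotonic):
    # Wraps a token-fetching function so repeated tool invocations reuse a
    # cached ID token until it is ttl_seconds old, then fetch a fresh one.
    cache = {"token": None, "fetched_at": 0.0}

    def getter():
        now = clock()
        if cache["token"] is None or now - cache["fetched_at"] >= ttl_seconds:
            cache["token"] = fetch_token()
            cache["fetched_at"] = now
        return cache["token"]

    return getter
```

You would pass the returned `getter` wherever an `auth_token_getters` entry is expected, so the underlying fetch (an OAuth flow, a metadata-server call) only runs when the cached token has aged out.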
||||
### Specifying tokens during load
|
||||
|
||||
{{< tabpane persist=header >}}
|
||||
{{< tab header="LangChain" lang="Python" >}}
|
||||
{{< tab header="Core" lang="Python" >}}
|
||||
import asyncio
|
||||
from toolbox_core import ToolboxClient
|
||||
|
||||
async def get_auth_token():
|
||||
# ... Logic to retrieve ID token (e.g., from local storage, OAuth flow)
|
||||
# This example just returns a placeholder. Replace with your actual token retrieval.
|
||||
return "YOUR_ID_TOKEN" # Placeholder
|
||||
|
||||
# for a single tool use:
|
||||
authorized_tool = toolbox.load_tool("my-tool-name", auth_tokens={"my_auth": get_auth_token})
|
||||
async def main():
|
||||
async with ToolboxClient("<http://127.0.0.1:5000>") as toolbox:
|
||||
auth_tool = await toolbox.load_tool(
|
||||
"get_sensitive_data",
|
||||
auth_token_getters={"my_auth_app_1": get_auth_token}
|
||||
)
|
||||
result = await auth_tool(param="value")
|
||||
print(result)
|
||||
|
||||
# for a toolset use:
|
||||
authorized_tools = toolbox.load_toolset("my-toolset-name", auth_tokens={"my_auth": get_auth_token})
|
||||
if **name** == "**main**":
|
||||
asyncio.run(main())
|
||||
{{< /tab >}}
|
||||
{{< tab header="LangChain" lang="Python" >}}
|
||||
import asyncio
|
||||
from toolbox_langchain import ToolboxClient
|
||||
|
||||
async def get_auth_token():
|
||||
# ... Logic to retrieve ID token (e.g., from local storage, OAuth flow)
|
||||
# This example just returns a placeholder. Replace with your actual token retrieval.
|
||||
return "YOUR_ID_TOKEN" # Placeholder
|
||||
|
||||
async def main():
|
||||
toolbox = ToolboxClient("<http://127.0.0.1:5000>")
|
||||
|
||||
auth_tool = await toolbox.aload_tool(
|
||||
"get_sensitive_data",
|
||||
auth_token_getters={"my_auth_app_1": get_auth_token}
|
||||
)
|
||||
result = await auth_tool.ainvoke({"param": "value"})
|
||||
print(result)
|
||||
|
||||
if **name** == "**main**":
|
||||
asyncio.run(main())
|
||||
{{< /tab >}}
|
||||
{{< tab header="Llamaindex" lang="Python" >}}
|
||||
import asyncio
|
||||
from toolbox_llamaindex import ToolboxClient
|
||||
|
||||
async def get_auth_token():
|
||||
# ... Logic to retrieve ID token (e.g., from local storage, OAuth flow)
|
||||
# This example just returns a placeholder. Replace with your actual token retrieval.
|
||||
return "YOUR_ID_TOKEN" # Placeholder
|
||||
|
||||
# for a single tool use:
|
||||
authorized_tool = toolbox.load_tool("my-tool-name", auth_tokens={"my_auth": get_auth_token})
|
||||
async def main():
|
||||
toolbox = ToolboxClient("<http://127.0.0.1:5000>")
|
||||
|
||||
# for a toolset use:
|
||||
authorized_tools = toolbox.load_toolset("my-toolset-name", auth_tokens={"my_auth": get_auth_token})
|
||||
{{< /tab >}}
|
||||
auth_tool = await toolbox.aload_tool(
|
||||
"get_sensitive_data",
|
||||
auth_token_getters={"my_auth_app_1": get_auth_token}
|
||||
)
|
||||
# result = await auth_tool.acall(param="value")
|
||||
# print(result.content)
|
||||
|
||||
if **name** == "**main**":
|
||||
asyncio.run(main()){{< /tab >}}
|
||||
{{< /tabpane >}}
|
||||
|
||||
|
||||
### Specifying tokens for existing tools

{{< tabpane persist=header >}}
{{< tab header="Core" lang="Python" >}}
tools = await toolbox.load_toolset()

# for a single token
authorized_tool = tools[0].add_auth_token_getter("my_auth", get_auth_token)

# OR, if multiple tokens are needed
authorized_tool = tools[0].add_auth_token_getters({
    "my_auth1": get_auth1_token,
    "my_auth2": get_auth2_token,
})
{{< /tab >}}
{{< tab header="LangChain" lang="Python" >}}
tools = toolbox.load_toolset()

# for a single token
authorized_tool = tools[0].add_auth_token_getter("my_auth", get_auth_token)

# OR, if multiple tokens are needed
authorized_tool = tools[0].add_auth_token_getters({
    "my_auth1": get_auth1_token,
    "my_auth2": get_auth2_token,
})
{{< /tab >}}
{{< tab header="Llamaindex" lang="Python" >}}
tools = toolbox.load_toolset()

# for a single token
authorized_tool = tools[0].add_auth_token_getter("my_auth", get_auth_token)

# OR, if multiple tokens are needed
authorized_tool = tools[0].add_auth_token_getters({
    "my_auth1": get_auth1_token,
    "my_auth2": get_auth2_token,
})
{{< /tab >}}
{{< /tabpane >}}
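The `get_auth_token` getters above are invoked whenever a tool needs a token, so a small caching wrapper can avoid refetching on every call. This is a sketch, not part of the Toolbox SDKs: `make_cached_token_getter` and `fetch_id_token` are illustrative names, and the real fetching logic is whatever OAuth/ID-token flow your application uses.

```python
import time


def make_cached_token_getter(fetch_token, ttl_seconds=300):
    """Wrap a token-fetching function so repeated calls reuse a cached token."""
    cache = {"token": None, "expires_at": 0.0}

    def get_token():
        now = time.monotonic()
        if cache["token"] is None or now >= cache["expires_at"]:
            cache["token"] = fetch_token()          # your real OAuth/ID-token flow
            cache["expires_at"] = now + ttl_seconds
        return cache["token"]

    return get_token


def fetch_id_token():
    # Stand-in fetcher; replace with your application's actual token retrieval.
    return "YOUR_ID_TOKEN"


get_auth_token = make_cached_token_getter(fetch_id_token)
```

The resulting `get_auth_token` can then be passed anywhere the examples above expect a token getter.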
@@ -43,9 +43,14 @@ id-token][provided-claims] can be used for the parameter.

```yaml
authServices:
  my-google-auth:
    kind: google
    clientId: ${YOUR_GOOGLE_CLIENT_ID}
```

{{< notice tip >}}
Use environment variable replacement with the format ${ENV_NAME}
instead of hardcoding your secrets into the configuration file.
{{< /notice >}}

## Reference

| **field** | **type** | **required** | **description** |
@@ -11,6 +11,11 @@ Sources as a map in the `sources` section of your `tools.yaml` file. Typically,

a source configuration will contain any information needed to connect with and
interact with the database.

{{< notice tip >}}
Use environment variable replacement with the format ${ENV_NAME}
instead of hardcoding your secrets into the configuration file.
{{< /notice >}}

```yaml
sources:
  my-cloud-sql-source:
```

@@ -19,11 +24,11 @@ sources:

```yaml
    region: us-central1
    instance: my-instance-name
    database: my_db
    user: ${USER_NAME}
    password: ${PASSWORD}
```
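Since `${ENV_NAME}` placeholders are resolved from the server's environment, export the referenced variables before starting Toolbox. A minimal sketch (the binary name and flag below are illustrative; match them to your installation):

```shell
# Export the variables referenced in tools.yaml, then start the server.
export USER_NAME=my-user
export PASSWORD=my-password

# Hypothetical launch command; adjust the binary path and flag for your setup.
# ./toolbox --tools-file tools.yaml
echo "$USER_NAME"   # prints "my-user"
```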

In implementation, each source is a different connection pool or client that is
used to connect to the database and execute the tool.

## Available Sources

@@ -24,7 +24,6 @@ cluster][alloydb-free-trial].

## Requirements

### IAM Permissions

By default, AlloyDB for PostgreSQL source uses the [AlloyDB Go

@@ -46,24 +45,47 @@ permissions):

### Networking

AlloyDB supports connecting both from external networks via the internet
([public IP][public-ip]) and from internal networks ([private IP][private-ip]).
For more information on choosing between the two options, see the AlloyDB page
[Connection overview][conn-overview].

You can configure the `ipType` parameter in your source configuration to
`public` or `private` to match your cluster's configuration. Regardless of which
you choose, all connections use IAM-based authorization and are encrypted with
mTLS.

[private-ip]: https://cloud.google.com/alloydb/docs/private-ip
[public-ip]: https://cloud.google.com/alloydb/docs/connect-public-ip
[conn-overview]: https://cloud.google.com/alloydb/docs/connection-overview

### Authentication

This source supports both password-based authentication and IAM
authentication (using your [Application Default Credentials][adc]).

#### Standard Authentication

To connect using user/password, [create
a PostgreSQL user][alloydb-users] and input your credentials in the `user` and
`password` fields.

```yaml
user: ${USER_NAME}
password: ${PASSWORD}
```

#### IAM Authentication

To connect using IAM authentication:

1. Prepare your database instance and user following this [guide][iam-guide].
2. You can choose one of two ways to log in:
   - Specify your IAM email as the `user`.
   - Leave your `user` field blank. Toolbox will fetch the [ADC][adc]
     automatically and log in using the email associated with it.
3. Leave the `password` field blank.

[iam-guide]: https://cloud.google.com/alloydb/docs/database-users/manage-iam-auth
[alloydb-users]: https://cloud.google.com/alloydb/docs/database-users/about
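Put together, an IAM-based source entry is a sketch like the following (illustrative values; it assumes the instance and IAM user were prepared per the guide above):

```yaml
sources:
  my-alloydb-iam-source:
    kind: alloydb-postgres
    project: my-project-id
    region: us-central1
    cluster: my-cluster
    instance: my-instance
    database: my_db
    # user and password omitted: Toolbox falls back to IAM authentication,
    # using the email associated with your Application Default Credentials.
```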
## Example
@@ -71,27 +93,32 @@ a PostgreSQL user][alloydb-users] to login to the database with.

```yaml
sources:
  my-alloydb-pg-source:
    kind: alloydb-postgres
    project: my-project-id
    region: us-central1
    cluster: my-cluster
    instance: my-instance
    database: my_db
    user: ${USER_NAME}
    password: ${PASSWORD}
    # ipType: "public"
```

{{< notice tip >}}
Use environment variable replacement with the format ${ENV_NAME}
instead of hardcoding your secrets into the configuration file.
{{< /notice >}}

## Reference

| **field** | **type** | **required** | **description** |
|-----------|:--------:|:------------:|--------------------------------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "alloydb-postgres". |
| project | string | true | Id of the GCP project that the cluster was created in (e.g. "my-project-id"). |
| region | string | true | Name of the GCP region that the cluster was created in (e.g. "us-central1"). |
| cluster | string | true | Name of the AlloyDB cluster (e.g. "my-cluster"). |
| instance | string | true | Name of the AlloyDB instance within the cluster (e.g. "my-instance"). |
| database | string | true | Name of the Postgres database to connect to (e.g. "my_db"). |
| user | string | false | Name of the Postgres user to connect as (e.g. "my-pg-user"). Defaults to IAM auth using [ADC][adc] email if unspecified. |
| password | string | false | Password of the Postgres user (e.g. "my-password"). Defaults to attempting IAM authentication if unspecified. |
| ipType | string | false | IP Type of the AlloyDB instance; must be one of `public` or `private`. Default: `public`. |

docs/en/resources/sources/bigquery.md (new file)

@@ -0,0 +1,73 @@

---
title: "BigQuery"
type: docs
weight: 1
description: >
  BigQuery is Google Cloud's fully managed, petabyte-scale, and cost-effective
  analytics data warehouse that lets you run analytics over vast amounts of
  data in near real time. With BigQuery, there's no infrastructure to set
  up or manage, letting you focus on finding meaningful insights using
  GoogleSQL and taking advantage of flexible pricing models across on-demand
  and flat-rate options.
---

# BigQuery Source

[BigQuery][bigquery-docs] is Google Cloud's fully managed, petabyte-scale,
and cost-effective analytics data warehouse that lets you run analytics
over vast amounts of data in near real time. With BigQuery, there's no
infrastructure to set up or manage, letting you focus on finding meaningful
insights using GoogleSQL and taking advantage of flexible pricing models
across on-demand and flat-rate options.

If you are new to BigQuery, you can try to
[load and query data with the bq tool][bigquery-quickstart-cli].

BigQuery uses [GoogleSQL][bigquery-googlesql] for querying data. GoogleSQL
is an ANSI-compliant structured query language (SQL) that is also implemented
for other Google Cloud services. The usual SQL best practices apply when
writing queries to run against your BigQuery data, such as avoiding full
table scans or overly complex filters.

[bigquery-docs]: https://cloud.google.com/bigquery/docs
[bigquery-quickstart-cli]: https://cloud.google.com/bigquery/docs/quickstarts/quickstart-command-line
[bigquery-googlesql]: https://cloud.google.com/bigquery/docs/reference/standard-sql/

## Requirements

### IAM Permissions

BigQuery uses [Identity and Access Management (IAM)][iam-overview] to control
user and group access to BigQuery resources like projects, datasets, and tables.
Toolbox will use your [Application Default Credentials (ADC)][adc] to authorize
and authenticate when interacting with [BigQuery][bigquery-docs].

In addition to [setting the ADC for your server][set-adc], you need to ensure
the IAM identity has been given the correct IAM permissions for the queries
you intend to run. Common roles include `roles/bigquery.user` (which includes
permissions to run jobs and read data) or `roles/bigquery.dataViewer`. See
[Introduction to BigQuery IAM][grant-permissions] for more information on
applying IAM permissions and roles to an identity.

[iam-overview]: https://cloud.google.com/bigquery/docs/access-control
[adc]: https://cloud.google.com/docs/authentication#adc
[set-adc]: https://cloud.google.com/docs/authentication/provide-credentials-adc
[grant-permissions]: https://cloud.google.com/bigquery/docs/access-control

## Example

```yaml
sources:
  my-bigquery-source:
    kind: "bigquery"
    project: "my-project-id"
```

## Reference

| **field** | **type** | **required** | **description** |
|-----------|:--------:|:------------:|-------------------------------------------------------------------------------|
| kind | string | true | Must be "bigquery". |
| project | string | true | Id of the GCP project to run BigQuery jobs in (e.g. "my-project-id"). |
| location | string | false | Specifies the location (e.g., 'us', 'asia-northeast1') in which to run the query job. This location must match the location of any tables referenced in the query. Defaults to the US multi-region. |

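For instance, a source that pins query jobs to a single region could use the optional `location` field described above (illustrative value):

```yaml
sources:
  my-bigquery-source:
    kind: "bigquery"
    project: "my-project-id"
    location: "asia-northeast1"
```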
docs/en/resources/sources/bigtable.md (new file)

@@ -0,0 +1,70 @@

---
title: "Bigtable"
type: docs
weight: 1
description: >
  Bigtable is a low-latency NoSQL database service for machine learning,
  operational analytics, and user-facing operations. It's a wide-column,
  key-value store that can scale to billions of rows and thousands of columns.
  With Bigtable, you can replicate your data to regions across the world for
  high availability and data resiliency.
---

# Bigtable Source

[Bigtable][bigtable-docs] is a low-latency NoSQL database service for machine
learning, operational analytics, and user-facing operations. It's a wide-column,
key-value store that can scale to billions of rows and thousands of columns.
With Bigtable, you can replicate your data to regions across the world for high
availability and data resiliency.

If you are new to Bigtable, you can try to [create an instance and write data
with the cbt CLI][bigtable-quickstart-with-cli].

You can use [GoogleSQL statements][bigtable-googlesql] to query your Bigtable
data. GoogleSQL is an ANSI-compliant structured query language (SQL) that is
also implemented for other Google Cloud services. SQL queries are handled by
cluster nodes in the same way as NoSQL data requests. Therefore, the same best
practices apply when creating SQL queries to run against your Bigtable data,
such as avoiding full table scans or complex filters.

[bigtable-docs]: https://cloud.google.com/bigtable/docs
[bigtable-quickstart-with-cli]: https://cloud.google.com/bigtable/docs/create-instance-write-data-cbt-cli
[bigtable-googlesql]: https://cloud.google.com/bigtable/docs/googlesql-overview

## Requirements

### IAM Permissions

Bigtable uses [Identity and Access Management (IAM)][iam-overview] to control
user and group access to Bigtable resources at the project, instance, table, and
backup level. Toolbox will use your [Application Default Credentials (ADC)][adc]
to authorize and authenticate when interacting with [Bigtable][bigtable-docs].

In addition to [setting the ADC for your server][set-adc], you need to ensure
the IAM identity has been given the correct IAM permissions for the query
provided. See [Apply IAM roles][grant-permissions] for more information on
applying IAM permissions and roles to an identity.

[iam-overview]: https://cloud.google.com/bigtable/docs/access-control
[adc]: https://cloud.google.com/docs/authentication#adc
[set-adc]: https://cloud.google.com/docs/authentication/provide-credentials-adc
[grant-permissions]: https://cloud.google.com/bigtable/docs/access-control#iam-management-instance

## Example

```yaml
sources:
  my-bigtable-source:
    kind: "bigtable"
    project: "my-project-id"
    instance: "test-instance"
```

## Reference

| **field** | **type** | **required** | **description** |
|-----------|:--------:|:------------:|-------------------------------------------------------------------------------|
| kind | string | true | Must be "bigtable". |
| project | string | true | Id of the GCP project that the instance was created in (e.g. "my-project-id"). |
| instance | string | true | Name of the Bigtable instance. |

@@ -34,26 +34,25 @@ permissions):

- `roles/cloudsql.client`

{{< notice tip >}}
If you are connecting from Compute Engine, make sure your VM
also has the [proper
scope](https://cloud.google.com/compute/docs/access/service-accounts#accesscopesiam)
to connect using the Cloud SQL Admin API.
{{< /notice >}}

[csql-go-conn]: https://github.com/GoogleCloudPlatform/cloud-sql-go-connector
[adc]: https://cloud.google.com/docs/authentication#adc
[set-adc]: https://cloud.google.com/docs/authentication/provide-credentials-adc
[gce-access-scopes]: https://cloud.google.com/compute/docs/access/service-accounts#accesscopesiam

### Networking

Cloud SQL supports connecting both from external networks via the internet
([public IP][public-ip]) and from internal networks ([private IP][private-ip]).
For more information on choosing between the two options, see the Cloud SQL page
[Connection overview][conn-overview].

You can configure the `ipType` parameter in your source configuration to
`public` or `private` to match your cluster's configuration. Regardless of which
you choose, all connections use IAM-based authorization and are encrypted with
mTLS.

@@ -64,8 +63,8 @@ mTLS.

### Database User

Currently, this source only uses standard authentication. You will need to
[create a SQL Server user][cloud-sql-users] to login to the database with.

[cloud-sql-users]: https://cloud.google.com/sql/docs/sqlserver/create-manage-users

@@ -80,9 +79,16 @@ sources:

```yaml
    instance: my-instance
    database: my_db
    ipAddress: localhost
    user: ${USER_NAME}
    password: ${PASSWORD}
    # ipType: private
```

{{< notice tip >}}
Use environment variable replacement with the format ${ENV_NAME}
instead of hardcoding your secrets into the configuration file.
{{< /notice >}}

## Reference

| **field** | **type** | **required** | **description** |

@@ -90,7 +96,7 @@ sources:

| kind | string | true | Must be "cloud-sql-mssql". |
| project | string | true | Id of the GCP project that the cluster was created in (e.g. "my-project-id"). |
| region | string | true | Name of the GCP region that the cluster was created in (e.g. "us-central1"). |
| instance | string | true | Name of the Cloud SQL instance within the cluster (e.g. "my-instance"). |
| database | string | true | Name of the Cloud SQL database to connect to (e.g. "my_db"). |
| ipAddress | string | true | IP address of the Cloud SQL instance to connect to. |
| user | string | true | Name of the SQL Server user to connect as (e.g. "my-user"). |

@@ -11,7 +11,7 @@ description: >

## About

[Cloud SQL for MySQL][csql-mysql-docs] is a fully-managed database service
that helps you set up, maintain, manage, and administer your MySQL
relational databases on Google Cloud Platform.

If you are new to Cloud SQL for MySQL, you can try [creating and connecting

@@ -35,29 +35,28 @@ permissions):

- `roles/cloudsql.client`

{{< notice tip >}}
If you are connecting from Compute Engine, make sure your VM
also has the [proper
scope](https://cloud.google.com/compute/docs/access/service-accounts#accesscopesiam)
to connect using the Cloud SQL Admin API.
{{< /notice >}}

[csql-go-conn]: https://github.com/GoogleCloudPlatform/cloud-sql-go-connector
[adc]: https://cloud.google.com/docs/authentication#adc
[set-adc]: https://cloud.google.com/docs/authentication/provide-credentials-adc
[gce-access-scopes]: https://cloud.google.com/compute/docs/access/service-accounts#accesscopesiam

### Networking

Cloud SQL supports connecting both from external networks via the internet
([public IP][public-ip]) and from internal networks ([private IP][private-ip]).
For more information on choosing between the two options, see the Cloud SQL page
[Connection overview][conn-overview].

You can configure the `ipType` parameter in your source configuration to
`public` or `private` to match your cluster's configuration. Regardless of which
you choose, all connections use IAM-based authorization and are encrypted with
mTLS.

[private-ip]: https://cloud.google.com/sql/docs/mysql/configure-private-ip
[public-ip]: https://cloud.google.com/sql/docs/mysql/configure-ip

@@ -65,7 +64,7 @@ mTLS.

### Database User

Currently, this source only uses standard authentication. You will need to [create
a MySQL user][cloud-sql-users] to login to the database with.

[cloud-sql-users]: https://cloud.google.com/sql/docs/mysql/create-manage-users

@@ -75,16 +74,21 @@ a MySQL user][cloud-sql-users] to login to the database with.

```yaml
sources:
  my-cloud-sql-mysql-source:
    kind: cloud-sql-mysql
    project: my-project-id
    region: us-central1
    instance: my-instance
    database: my_db
    user: ${USER_NAME}
    password: ${PASSWORD}
    # ipType: "private"
```

{{< notice tip >}}
Use environment variable replacement with the format ${ENV_NAME}
instead of hardcoding your secrets into the configuration file.
{{< /notice >}}

## Reference

| **field** | **type** | **required** | **description** |

@@ -35,65 +35,93 @@ permissions):

- `roles/cloudsql.client`

{{< notice tip >}}
If you are connecting from Compute Engine, make sure your VM
also has the [proper
scope](https://cloud.google.com/compute/docs/access/service-accounts#accesscopesiam)
to connect using the Cloud SQL Admin API.
{{< /notice >}}

[csql-go-conn]: <https://github.com/GoogleCloudPlatform/cloud-sql-go-connector>
[adc]: <https://cloud.google.com/docs/authentication#adc>
[set-adc]: <https://cloud.google.com/docs/authentication/provide-credentials-adc>

### Networking

Cloud SQL supports connecting both from external networks via the internet
([public IP][public-ip]) and from internal networks ([private IP][private-ip]).
For more information on choosing between the two options, see the Cloud SQL page
[Connection overview][conn-overview].

You can configure the `ipType` parameter in your source configuration to
`public` or `private` to match your cluster's configuration. Regardless of which
you choose, all connections use IAM-based authorization and are encrypted with
mTLS.

[private-ip]: https://cloud.google.com/sql/docs/postgres/configure-private-ip
[public-ip]: https://cloud.google.com/sql/docs/postgres/configure-ip
[conn-overview]: https://cloud.google.com/sql/docs/postgres/connect-overview

### Authentication

This source supports both password-based authentication and IAM
authentication (using your [Application Default Credentials][adc]).

#### Standard Authentication

To connect using user/password, [create
a PostgreSQL user][cloudsql-users] and input your credentials in the `user` and
`password` fields.

```yaml
user: ${USER_NAME}
password: ${PASSWORD}
```

#### IAM Authentication

To connect using IAM authentication:

1. Prepare your database instance and user following this [guide][iam-guide].
2. You can choose one of two ways to log in:
   - Specify your IAM email as the `user`.
   - Leave your `user` field blank. Toolbox will fetch the [ADC][adc]
     automatically and log in using the email associated with it.
3. Leave the `password` field blank.

[iam-guide]: https://cloud.google.com/sql/docs/postgres/iam-logins
[cloudsql-users]: https://cloud.google.com/sql/docs/postgres/create-manage-users

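Under these steps, a minimal IAM-based source entry is a sketch like the following (illustrative values; it assumes the instance and IAM user were prepared per the guide above):

```yaml
sources:
  my-cloud-sql-iam-source:
    kind: cloud-sql-postgres
    project: my-project-id
    region: us-central1
    instance: my-instance
    database: my_db
    # user and password omitted: Toolbox falls back to IAM authentication,
    # using the email associated with your Application Default Credentials.
```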
## Example

```yaml
sources:
  my-cloud-sql-pg-source:
    kind: cloud-sql-postgres
    project: my-project-id
    region: us-central1
    instance: my-instance
    database: my_db
    user: ${USER_NAME}
    password: ${PASSWORD}
    # ipType: "private"
```

{{< notice tip >}}
Use environment variable replacement with the format ${ENV_NAME}
instead of hardcoding your secrets into the configuration file.
{{< /notice >}}

## Reference

| **field** | **type** | **required** | **description** |
|-----------|:--------:|:------------:|--------------------------------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "cloud-sql-postgres". |
| project | string | true | Id of the GCP project that the cluster was created in (e.g. "my-project-id"). |
| region | string | true | Name of the GCP region that the cluster was created in (e.g. "us-central1"). |
| instance | string | true | Name of the Cloud SQL instance within the cluster (e.g. "my-instance"). |
| database | string | true | Name of the Postgres database to connect to (e.g. "my_db"). |
| user | string | false | Name of the Postgres user to connect as (e.g. "my-pg-user"). Defaults to IAM auth using [ADC][adc] email if unspecified. |
| password | string | false | Password of the Postgres user (e.g. "my-password"). Defaults to attempting IAM authentication if unspecified. |
| ipType | string | false | IP Type of the Cloud SQL instance; must be one of `public` or `private`. Default: `public`. |

docs/en/resources/sources/couchbase.md (new file)

@@ -0,0 +1,44 @@
|
||||
---
|
||||
title: "couchbase"
|
||||
type: docs
|
||||
weight: 1
|
||||
description: >
|
||||
A "couchbase" source connects to a Couchbase database.
|
||||
---
|
||||
|
||||
## About
|
||||
|
||||
A `couchbase` source establishes a connection to a Couchbase database cluster,
|
||||
allowing tools to execute SQL queries against it.
|
||||
|
||||
## Example
|
||||
|
||||
```yaml
|
||||
sources:
|
||||
my-couchbase-instance:
|
||||
kind: couchbase
|
||||
connectionString: couchbase://localhost:8091
    bucket: travel-sample
    scope: inventory
    username: Administrator
    password: password
```

## Reference

| **field**            | **type** | **required** | **description** |
|----------------------|:--------:|:------------:|---------------------------------------------------------|
| kind                 | string   | true         | Must be "couchbase". |
| connectionString     | string   | true         | Connection string for the Couchbase cluster. |
| bucket               | string   | true         | Name of the bucket to connect to. |
| scope                | string   | true         | Name of the scope within the bucket. |
| username             | string   | false        | Username for authentication. |
| password             | string   | false        | Password for authentication. |
| clientCert           | string   | false        | Path to the client certificate file for TLS authentication. |
| clientCertPassword   | string   | false        | Password for the client certificate. |
| clientKey            | string   | false        | Path to the client key file for TLS authentication. |
| clientKeyPassword    | string   | false        | Password for the client key. |
| caCert               | string   | false        | Path to the CA certificate file. |
| noSslVerify          | boolean  | false        | If true, skip server certificate verification. **Warning:** Use this option only in development or testing environments. Disabling SSL verification in production exposes the connection to man-in-the-middle attacks. |
| profile              | string   | false        | Name of the connection profile to apply. |
| queryScanConsistency | integer  | false        | Query scan consistency, controlling the consistency guarantee for index scans. Values: 1 for "not_bounded" (fastest, but results may omit the most recent mutations), 2 for "request_plus" (strongest consistency; includes all mutations up to the start of the query, at a performance cost). If not specified, defaults to the Couchbase Go SDK default. |
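As a sketch of how `queryScanConsistency` fits into a full source definition (the connection string and credentials below are illustrative):

```yaml
sources:
  my-couchbase-source:
    kind: couchbase
    connectionString: couchbase://localhost
    bucket: travel-sample
    scope: inventory
    username: ${USER_NAME}
    password: ${PASSWORD}
    queryScanConsistency: 2  # request_plus: include all mutations up to query start
```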
@@ -9,7 +9,10 @@ description: >

## About

[Dgraph][dgraph-docs] is an open-source graph database. It is designed for
real-time workloads, horizontal scalability, and data flexibility. Implemented
as a distributed system, Dgraph processes queries in parallel to deliver the
fastest result.

This source can connect to either a self-managed Dgraph cluster or one hosted on
Dgraph Cloud. If you're new to Dgraph, the fastest way to get started is to
@@ -18,7 +21,7 @@ Dgraph Cloud. If you're new to Dgraph, the fastest way to get started is to
[dgraph-docs]: https://dgraph.io/docs
[dgraph-login]: https://cloud.dgraph.io/login

## Requirements

### Database User

@@ -34,20 +37,25 @@ and user credentials for that namespace.
```yaml
sources:
  my-dgraph-source:
    kind: dgraph
    dgraphUrl: https://xxxx.cloud.dgraph.io
    user: ${USER_NAME}
    password: ${PASSWORD}
    apiKey: ${API_KEY}
    namespace: 0
```

{{< notice tip >}}
Use environment variable replacement with the format ${ENV_NAME}
instead of hardcoding your secrets into the configuration file.
{{< /notice >}}

## Reference

| **Field**   | **Type** | **Required** | **Description** |
|-------------|:--------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind        | string   | true         | Must be "dgraph". |
| dgraphUrl   | string   | true         | Connection URI (e.g. "<https://xxx.cloud.dgraph.io>", "<https://localhost:8080>"). |
| user        | string   | false        | Name of the Dgraph user to connect as (e.g., "groot"). |
| password    | string   | false        | Password of the Dgraph user (e.g., "password"). |
| apiKey      | string   | false        | API key to connect to a Dgraph Cloud instance. |
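For a self-managed cluster, the Cloud-specific `apiKey` can be omitted; a minimal sketch (the URL is illustrative):

```yaml
sources:
  my-self-managed-dgraph:
    kind: dgraph
    dgraphUrl: https://localhost:8080
    user: ${USER_NAME}
    password: ${PASSWORD}
```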
docs/en/resources/sources/http.md (new file, 49 lines)
@@ -0,0 +1,49 @@
---
title: "HTTP"
linkTitle: "HTTP"
type: docs
weight: 1
description: >
  The HTTP source enables the Toolbox to retrieve data from a remote server using HTTP requests.
---

## About

The HTTP Source allows Toolbox to retrieve data from arbitrary HTTP
endpoints. This enables Generative AI applications to access data from web APIs
and other HTTP-accessible resources.

## Example

```yaml
sources:
  my-http-source:
    kind: http
    baseUrl: https://api.example.com/data
    timeout: 10s # defaults to 30s
    headers:
      Authorization: Bearer ${API_KEY}
      Content-Type: application/json
    queryParams:
      param1: value1
      param2: value2
    # disableSslVerification: false
```

{{< notice tip >}}
Use environment variable replacement with the format ${ENV_NAME}
instead of hardcoding your secrets into the configuration file.
{{< /notice >}}

## Reference

| **field**              | **type**          | **required** | **description** |
|------------------------|:-----------------:|:------------:|------------------------------------------------------------------------------------------------------------------------------------|
| kind                   | string            | true         | Must be "http". |
| baseUrl                | string            | true         | The base URL for the HTTP requests (e.g., `https://api.example.com`). |
| timeout                | string            | false        | The timeout for HTTP requests (e.g., "5s", "1m"; refer to [ParseDuration][parse-duration-doc] for more examples). Defaults to 30s. |
| headers                | map[string]string | false        | Default headers to include in the HTTP requests. |
| queryParams            | map[string]string | false        | Default query parameters to include in the HTTP requests. |
| disableSslVerification | bool              | false        | Disable SSL certificate verification. This should only be used for local development. Defaults to `false`. |

[parse-duration-doc]: https://pkg.go.dev/time#ParseDuration
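Since `timeout` is parsed with Go's duration syntax, compound values are also valid; a minimal sketch (the endpoint is illustrative):

```yaml
sources:
  slow-report-api:
    kind: http
    baseUrl: https://api.example.com/reports
    timeout: 1m30s  # ParseDuration syntax: "300ms", "5s", "1m30s", ...
```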
@@ -7,7 +7,7 @@ description: >

---

## About

[SQL Server][mssql-docs] is a relational database management system (RDBMS)
developed by Microsoft that allows users to store, retrieve, and manage large
@@ -20,7 +20,7 @@ amounts of data through a structured format.

### Database User

This source only uses standard authentication. You will need to [create a
SQL Server user][mssql-users] to log in to the database with.

[mssql-users]: https://learn.microsoft.com/en-us/sql/relational-databases/security/authentication-access/create-a-database-user?view=sql-server-ver16

@@ -29,14 +29,19 @@ SQL Server user][mssql-users] to log in to the database with.
```yaml
sources:
  my-mssql-source:
    kind: mssql
    host: 127.0.0.1
    port: 1433
    database: my_db
    user: ${USER_NAME}
    password: ${PASSWORD}
```

{{< notice tip >}}
Use environment variable replacement with the format ${ENV_NAME}
instead of hardcoding your secrets into the configuration file.
{{< /notice >}}

## Reference

| **field** | **type** | **required** | **description** |
@@ -11,7 +11,7 @@ description: >

[MySQL][mysql-docs] is a relational database management system (RDBMS) that
stores and manages data. It's a popular choice for developers because of its
reliability, performance, and ease of use.

[mysql-docs]: https://www.mysql.com/

@@ -20,7 +20,7 @@ reliability, performance, and ease of use.

### Database User

This source only uses standard authentication. You will need to [create a
MySQL user][mysql-users] to log in to the database with.

[mysql-users]: https://dev.mysql.com/doc/refman/8.4/en/user-names.html

@@ -29,14 +29,19 @@ MySQL user][mysql-users] to log in to the database with.
```yaml
sources:
  my-mysql-source:
    kind: mysql
    host: 127.0.0.1
    port: 3306
    database: my_db
    user: ${USER_NAME}
    password: ${PASSWORD}
```

{{< notice tip >}}
Use environment variable replacement with the format ${ENV_NAME}
instead of hardcoding your secrets into the configuration file.
{{< /notice >}}

## Reference

| **field** | **type** | **required** | **description** |
@@ -13,13 +13,13 @@ reliability, feature robustness, and performance.

[neo4j-docs]: https://neo4j.com/docs

## Requirements

### Database User

This source only uses standard authentication. You will need to [create a Neo4j
user][neo4j-users] to log in to the database with, or use the default `neo4j`
user if available.

[neo4j-users]: https://neo4j.com/docs/operations-manual/current/authentication-authorization/manage-users/

@@ -28,13 +28,18 @@ user if available.
```yaml
sources:
  my-neo4j-source:
    kind: neo4j
    uri: neo4j+s://xxxx.databases.neo4j.io:7687
    user: ${USER_NAME}
    password: ${PASSWORD}
    database: neo4j
```

{{< notice tip >}}
Use environment variable replacement with the format ${ENV_NAME}
instead of hardcoding your secrets into the configuration file.
{{< /notice >}}

## Reference

| **field** | **type** | **required** | **description** |
@@ -44,5 +49,3 @@ sources:
| user     | string | true | Name of the Neo4j user to connect as (e.g. "neo4j"). |
| password | string | true | Password of the Neo4j user (e.g. "my-password"). |
| database | string | true | Name of the Neo4j database to connect to (e.g. "neo4j"). |
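For a self-hosted deployment, a `bolt://` or `neo4j://` URI can be used in place of the Aura endpoint; a sketch with illustrative values:

```yaml
sources:
  my-local-neo4j:
    kind: neo4j
    uri: bolt://localhost:7687
    user: ${USER_NAME}
    password: ${PASSWORD}
    database: neo4j
```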
@@ -7,7 +7,7 @@ description: >

---

## About

[PostgreSQL][pg-docs] is a powerful, open source object-relational database
system with over 35 years of active development that has earned it a strong
@@ -20,7 +20,7 @@ reputation for reliability, feature robustness, and performance.

### Database User

This source only uses standard authentication. You will need to [create a
PostgreSQL user][pg-users] to log in to the database with.

[pg-users]: https://www.postgresql.org/docs/current/sql-createuser.html

@@ -29,14 +29,19 @@ PostgreSQL user][pg-users] to log in to the database with.
```yaml
sources:
  my-pg-source:
    kind: postgres
    host: 127.0.0.1
    port: 5432
    database: my_db
    user: ${USER_NAME}
    password: ${PASSWORD}
```

{{< notice tip >}}
Use environment variable replacement with the format ${ENV_NAME}
instead of hardcoding your secrets into the configuration file.
{{< /notice >}}

## Reference

| **field** | **type** | **required** | **description** |
@@ -47,5 +52,3 @@ sources:
| database | string | true | Name of the Postgres database to connect to (e.g. "my_db"). |
| user     | string | true | Name of the Postgres user to connect as (e.g. "my-pg-user"). |
| password | string | true | Password of the Postgres user (e.g. "my-password"). |
docs/en/resources/sources/redis.md (new file, 96 lines)
@@ -0,0 +1,96 @@
---
title: "Redis"
linkTitle: "Redis"
type: docs
weight: 1
description: >
  Redis is an open-source, in-memory data structure store.

---

## About

Redis is an open-source, in-memory data structure store, used as a database,
cache, and message broker. It supports data structures such as strings, hashes,
lists, sets, sorted sets with range queries, bitmaps, hyperloglogs, and
geospatial indexes with radius queries.

If you are new to Redis, you can find installation and getting started guides on
the [official Redis website](https://redis.io/docs/getting-started/).

## Requirements

### Redis

An [AUTH string][auth] is a password for connecting to Redis. If you have the
`requirepass` directive set in your Redis configuration, incoming client
connections must authenticate in order to connect.

Specify your AUTH string in the password field:

```yaml
sources:
  my-redis-instance:
    kind: redis
    address:
      - 127.0.0.1
    username: ${MY_USER_NAME}
    password: ${MY_AUTH_STRING} # Omit this field if you don't have a password.
    # database: 0
    # clusterEnabled: false
    # useGCPIAM: false
```

{{< notice tip >}}
Use environment variable replacement with the format ${ENV_NAME}
instead of hardcoding your secrets into the configuration file.
{{< /notice >}}

### Memorystore for Redis

Memorystore standalone instances support authentication using an [AUTH][auth]
string.

Here is an example tools.yaml config with [AUTH][auth] enabled:

```yaml
sources:
  my-redis-cluster-instance:
    kind: memorystore-redis
    address:
      - 127.0.0.1
    password: ${MY_AUTH_STRING}
    # useGCPIAM: false
    # clusterEnabled: false
```

Memorystore Redis Cluster supports IAM authentication instead. Grant your
account the required [IAM role][iam] and make sure to set `useGCPIAM` to `true`.

Here is an example tools.yaml config for Memorystore Redis Cluster instances
using IAM authentication:

```yaml
sources:
  my-redis-cluster-instance:
    kind: memorystore-redis
    address:
      - 127.0.0.1
    useGCPIAM: true
    clusterEnabled: true
```

[iam]: https://cloud.google.com/memorystore/docs/cluster/about-iam-auth

## Reference

| **field**      | **type** | **required** | **description** |
|----------------|:--------:|:------------:|-----------------------------------------------------------------------------------------------------------------------------------|
| kind           | string   | true         | Must be "redis" or "memorystore-redis". |
| address        | []string | true         | Endpoints for the Redis instance to connect to. |
| username       | string   | false        | If you are using a non-default user, specify the user name here. If you are using Memorystore for Redis, leave this field blank. |
| password       | string   | false        | If you have [Redis AUTH][auth] enabled, specify the AUTH string here. |
| database       | int      | false        | The Redis database to connect to. Not applicable for cluster-enabled instances. The default database is `0`. |
| clusterEnabled | bool     | false        | Set to `true` if using a Redis Cluster instance. Defaults to `false`. |
| useGCPIAM      | bool     | false        | Set to `true` if you are using GCP's IAM authentication. Defaults to `false`. |

[auth]: https://cloud.google.com/memorystore/docs/redis/about-redis-auth
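For a self-managed Redis Cluster (outside Memorystore), the same fields apply; a sketch with illustrative endpoints:

```yaml
sources:
  my-redis-cluster:
    kind: redis
    address:
      - 10.0.0.1:6379
      - 10.0.0.2:6379
    clusterEnabled: true
    password: ${MY_AUTH_STRING}
```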
@@ -8,7 +8,7 @@ description: >

---

# Spanner Source

[Spanner][spanner-docs] is a fully managed, mission-critical database service
that brings together relational, graph, key-value, and search. It offers
@@ -23,7 +23,7 @@ the Google Cloud console][spanner-quickstart].
[spanner-quickstart]:
https://cloud.google.com/spanner/docs/create-query-database-console

## Requirements

### IAM Permissions
docs/en/resources/sources/sqlite.md (new file, 67 lines)
@@ -0,0 +1,67 @@
---
title: "SQLite"
linkTitle: "SQLite"
type: docs
weight: 1
description: >
  SQLite is a C-language library that implements a small, fast, self-contained,
  high-reliability, full-featured, SQL database engine.
---

## About

[SQLite](https://sqlite.org/) is a software library that provides a relational
database management system. The "lite" in SQLite means lightweight in terms of
setup, database administration, and required resources.

SQLite has the following notable characteristics:

- Self-contained with no external dependencies
- Serverless - the SQLite library accesses its storage files directly
- Single database file that can be easily copied or moved
- Zero-configuration - no setup or administration needed
- Transactional with ACID properties

## Requirements

### Database File

You need a SQLite database file. This can be:

- An existing database file
- A path where a new database file should be created
- `:memory:` for an in-memory database

## Example

```yaml
sources:
  my-sqlite-db:
    kind: "sqlite"
    database: "/path/to/database.db"
```

For an in-memory database:

```yaml
sources:
  my-sqlite-memory-db:
    kind: "sqlite"
    database: ":memory:"
```

## Reference

### Configuration Fields

| **field** | **type** | **required** | **description** |
|-----------|:--------:|:------------:|---------------------------------------------------------------------------|
| kind      | string   | true         | Must be "sqlite". |
| database  | string   | true         | Path to the SQLite database file, or ":memory:" for an in-memory database. |

### Connection Properties

SQLite connections are configured with these defaults for optimal performance:

- `MaxOpenConns`: 1 (SQLite only supports one writer at a time)
- `MaxIdleConns`: 1
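Because `database` is just a file path, it can also be supplied through the usual environment variable replacement; a sketch (the variable name is illustrative):

```yaml
sources:
  my-sqlite-db:
    kind: "sqlite"
    database: ${SQLITE_DB_PATH}
```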
docs/en/resources/sources/valkey.md (new file, 69 lines)
@@ -0,0 +1,69 @@
---
title: "Valkey"
linkTitle: "Valkey"
type: docs
weight: 1
description: >
  Valkey is an open-source, in-memory data structure store, forked from Redis.

---

## About

Valkey is an open-source, in-memory data structure store that originated as a
fork of Redis. It's designed to be used as a database, cache, and message
broker, supporting a wide range of data structures like strings, hashes, lists,
sets, sorted sets with range queries, bitmaps, hyperloglogs, and geospatial
indexes with radius queries.

If you're new to Valkey, you can find installation and getting started guides on
the [official Valkey website](https://valkey.io/docs/getting-started/).

## Example

```yaml
sources:
  my-valkey-instance:
    kind: valkey
    address:
      - 127.0.0.1
    username: ${YOUR_USERNAME}
    password: ${YOUR_PASSWORD}
    # database: 0
    # useGCPIAM: false
    # disableCache: false
```

{{< notice tip >}}
Use environment variable replacement with the format ${ENV_NAME}
instead of hardcoding your secrets into the configuration file.
{{< /notice >}}

### IAM Authentication

If you are using GCP's Memorystore for Valkey, you can connect using IAM
authentication. Grant your account the required [IAM role][iam] and set
`useGCPIAM` to `true`:

```yaml
sources:
  my-valkey-instance:
    kind: valkey
    address:
      - 127.0.0.1
    useGCPIAM: true
```

[iam]: https://cloud.google.com/memorystore/docs/valkey/about-iam-auth

## Reference

| **field**    | **type** | **required** | **description** |
|--------------|:--------:|:------------:|------------------------------------------------------------------------------------------------------------------------------------|
| kind         | string   | true         | Must be "valkey". |
| address      | []string | true         | Endpoints for the Valkey instance to connect to. |
| username     | string   | false        | If you are using a non-default user, specify the user name here. If you are using Memorystore for Valkey, leave this field blank. |
| password     | string   | false        | Password for the Valkey instance. |
| database     | int      | false        | The Valkey database to connect to. Not applicable for cluster-enabled instances. The default database is `0`. |
| useGCPIAM    | bool     | false        | Set to `true` if you are using GCP's IAM authentication. Defaults to `false`. |
| disableCache | bool     | false        | Set to `true` to disable client-side caching. Defaults to `false`. |
@@ -11,7 +11,6 @@ A tool represents an action your agent can take, such as running a SQL
statement. You can define Tools as a map in the `tools` section of your
`tools.yaml` file. Typically, a tool will require a source to act on:

```yaml
tools:
  search_flights_by_number:
@@ -50,7 +49,6 @@ tools:
        description: 1 to 4 digit number
```

## Specifying Parameters

Parameters for each Tool will define what inputs the agent will need to provide
@@ -79,44 +77,53 @@ the parameter.
        description: Airline unique 2 letter identifier
```

| **field**   | **type**       | **required** | **description** |
|-------------|:--------------:|:------------:|-----------------------------------------------------------------------------|
| name        | string         | true         | Name of the parameter. |
| type        | string         | true         | Must be one of "string", "integer", "float", "boolean", or "array". |
| default     | parameter type | false        | Default value of the parameter. If provided, the parameter is not required. |
| description | string         | true         | Natural language description of the parameter to describe it to the agent. |
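For instance, giving a parameter a `default` makes it optional for the agent (the default value below is illustrative):

```yaml
parameters:
  - name: airline
    type: string
    default: "CY"
    description: Airline unique 2 letter identifier
```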
### Array Parameters

The `array` type is a list of items passed in as a single parameter.
To use the `array` type, you must also specify what kind of items are
in the list using the items field:

```yaml
    parameters:
      - name: preferred_airlines
        type: array
        description: A list of airlines, ordered by preference.
        items:
          name: name
          type: string
          description: Name of the airline.
    statement: |
      SELECT * FROM airlines WHERE preferred_airlines = ANY($1);
```

| **field**   | **type**         | **required** | **description** |
|-------------|:----------------:|:------------:|-----------------------------------------------------------------------------|
| name        | string           | true         | Name of the parameter. |
| type        | string           | true         | Must be "array". |
| default     | parameter type   | false        | Default value of the parameter. If provided, the parameter is not required. |
| description | string           | true         | Natural language description of the parameter to describe it to the agent. |
| items       | parameter object | true         | Specify a Parameter object for the type of the values in the array. |
{{< notice note >}}
Items in an array should not have a default value. If one is provided, it will be ignored.
{{< /notice >}}

### Authenticated Parameters

Authenticated parameters are automatically populated with user
information decoded from [ID
tokens](../authsources/#specifying-id-tokens-from-clients) that are passed in
request headers. They do not take input values in request bodies like other
parameters. To use authenticated parameters, you must configure the tool to map
the required [authServices](../authservices) to specific claims within the
user's ID token.

```yaml
tools:
@@ -141,6 +148,60 @@ specific claims within the user's ID token.
| name  | string | true | Name of the [authServices](../authservices) used to verify the OIDC auth token. |
| field | string | true | Claim field decoded from the OIDC token used to auto-populate this parameter. |
### Template Parameters

Template parameter types include `string`, `integer`, `float`, and `boolean`.
In most cases, the description will be provided to the LLM as context on
specifying the parameter. Template parameters will be inserted into the SQL
statement before executing the prepared statement. They will be inserted without
quotes, so to insert a string using template parameters, quotes must be
explicitly added within the string.

Template parameter arrays can also be used similarly to basic parameters, and
array items must be strings. Once inserted into the SQL statement, the outer
layer of quotes will be removed. Therefore, to insert strings into the SQL
statement, a set of quotes must be explicitly added within the string.

{{< notice warning >}}
Because template parameters can directly replace identifiers, column names, and
table names, they are prone to SQL injection. Basic parameters are preferred
for performance and safety reasons.
{{< /notice >}}

```yaml
tools:
  select_columns_from_table:
    kind: postgres-sql
    source: my-pg-instance
    statement: |
      SELECT {{array .columnNames}} FROM {{.tableName}}
    description: |
      Use this tool to list all information from a specific table.
      Example:
      {{
        "tableName": "flights",
        "columnNames": ["id", "name"]
      }}
    templateParameters:
      - name: tableName
        type: string
        description: Table to select from
      - name: columnNames
        type: array
        description: The columns to select
        items:
          name: column
          type: string
          description: Name of a column to select
```

| **field**   | **type**         | **required**    | **description** |
|-------------|:----------------:|:---------------:|--------------------------------------------------------------------------------------|
| name        | string           | true            | Name of the template parameter. |
| type        | string           | true            | Must be one of "string", "integer", "float", "boolean", or "array". |
| description | string           | true            | Natural language description of the template parameter to describe it to the agent. |
| items       | parameter object | true (if array) | Specify a Parameter object for the type of the values in the array (string only). |

## Authorized Invocations

You can require an authorization check for any Tool invocation request by
docs/en/resources/tools/alloydbainl/_index.md (new file, 7 lines)
@@ -0,0 +1,7 @@
---
title: "AlloyDB AI NL"
type: docs
weight: 1
description: >
  AlloyDB AI NL Tool.
---
||||