Mirror of https://github.com/googleapis/genai-toolbox.git, synced 2026-01-11 08:28:11 -05:00.

Comparing commits: `v0.4.0...mongodb-in` (292 commits).
Only the abbreviated SHA1 column of the commit table survived in this mirror; the Author and Date columns are empty. The commit SHAs, in listed order:

```
8c6f290481 768cbdc07c 6a7edab6e6 71b751c400 7815dce6a0 6b4dd50a20
7e814804aa 296a7a7aa8 b37aedd880 3a19e46d63 2c6eb4f47f 8a8649b789
9159550089 78e9752f62 dfde52ca9a a7474752d8 be65924aa6 59e23e1725
74dbd6124d c600c30374 ccc3498cf0 f55dd6fcd0 1526b8ab8c 2a1b1ff787
86ccff0b43 d61e552ead 9c289da638 0b28b72aa0 3f6ec2944e c67e01bcf9
81d05053b2 15417d4e0c 9aa6aa079d fa3e9ac04b 4ee8cfa1f4 552e86bc43
1387f858b6 481cc608ba d16728e5c6 49b1562f73 a1def43b35 480d76dfff
9334368a42 2a650349cb 5f7cc32127 abdab54503 e78bce32dc 3727b1d053
7eff0f9ac7 4468bc920b 9a55b80482 e5ac5ba9ee 2bb790e4f8 2083ba5048
53afed5b76 2b2732ec39 9b1505e4bd b7795c8857 000831c15b 5c54cc973d
8cc91ee3f7 ed5ef4caea 208df0a428 313d3ca0d0 4dae5a6ed7 26bdba46ca
a817b120ca 8ce311f256 86227e3104 206bea4575 65843621c5 e681a7e36c
0d1cadb245 6d27dabfb2 f312fc01b2 4998cae260 2b69700c5e c081ace46b
edf32abd84 aa8dbec970 a7963c5a83 ea3c805467 ebbbe4c409 f9743ecf7e
d3693c0d6b 1a1815d822 391cb5bfe8 2bdcc0841a a6693ab8b0 e1325880d1
b7230a93df 32712fa018 35e0919184 72a7282797 29fe3b93cd fb3f66acf4
1f95eb134b 4c240ac3c9 c6ab74c5da 04e2529ba9 53dd247e6e 648eede62b
9b2dfcc553 cb514209b6 0a93b0482c f13e9635ba fafed24858 6337434623
822708afaa 010c278cbf 40679d700e 5fb056ee43 a1b60100c2 cb92883330
bd2f1956bd cbb4a33351 7badba42ee f72e426314 7a6644cf0c 184c681797
474df57d62 fc1a3813ea c7fe3c7f38 dc2690bd39 b78f7480cf ffe9b74211
e1355660d4 d8e2abe2dd 7b3539e9ff 1d658c3b14 fd300dc606 4827771b78
a8df414b11 0bf4ebabf1 67964d939f f77c829271 d2977ed1ba 52e8bf4de1
a3aaf93525 9197186b8b e3844ff76d ef6e3f1c32 f5f771b0f3 12b6636a9b
d51dbc759b 4055b0c356 65dba4cabc 447cda2daf c54ef61fc6 eb98cdc7d1
1c067715fa cb87f765a6 a982314900 054ec198b9 f0aef29b0c 075dfa47e1
ad62d14cd5 5d183b0efe 904d04bc45 75e254c0a4 850b32c5b0 927ef3c508
15d3c45159 714d990c34 e6c2fb324b cf96f4c249 8569c6b59f 44d41a4888
4998f82852 b81fc6aa6c 29aa0a70da 5638ef520a 2f42de9507 0a08d2c15d
71250e1ced 702dbc355b ef0cbdb4bf 518a0e4c70 d7ba2736eb 33ae70ec02
1c9ad5ea24 b76346993f 1830702fd8 b4862825e8 f5de1af5bd 46d7cdf4ba
9ecf1755ab 1596f5d772 594066f9f4 0ddc7240b7 0880e16c05 a6c49007bf
ef94648455 69d047af46 4700dd363c 386bb23e7c ba8a6f3a3b ad97578fdf
3d10f85302 5a4cc9af6b 15f90a3773 87380f629d 953ab9336f 032b333961
25afd63496 3180830403 a29c80012e 0fd88b574b e9a6018526 1702ce1e00
5292e12588 1bf6003eae 0857be0aa8 22edbea579 8df757b280 5c66977877
e615e355d5 9b7e7a0b3e 72c236be03 1d0ed42067 301dfa1114 6083a224aa
d6dc0c5269 eb52b66d82 8dec385538 6a832a1d7e 8df5568901 0c07e15c2c
0e4564f383 4b4fbc656a 4d4b3ebeb9 d65747a2dc 8590061ae4 e89abac29f
6512704e77 04dcf47912 0e53829703 a890d0beee ca4491b0a9 2068f26302
d7579861e8 e8e0125eaa c98bc4b1a3 b58bf76dda 5a5e06f1a6 5c166d0651
2c3e7a0c1c 7138ed5f42 54f2614edf 6bcbadf948 e375317914 01e089cc34
352b3ed91c df31fa6680 b2ff195831 00e3a87258 e747b6e289 8834a36445
9a5d76e2dc a087280fe2 c26b2f4d9e 31a1fe971a 1d096de82f 652dc5c2dd
1dce40fc26 717f43420a 7c5ae0bf0b 8b68764ef6 c7189e9fcf 8b635955fc
91cc3e366d d7390b06b7 724957b4a9 c891c8e7bc 11ea7bc584 b2176c0e2f
f629df642b 50ec7f4a06 570d7caf4d a6c17c96f3 b6cd99e859 9ba6235106
ad040cfb8b 644aecd9d2 8646989f80 d1c870c004 038262bbe1 7add34b544
b8dd50aded 1e1348f5f0 59f4452755 a352045116
```
```diff
@@ -17,15 +17,8 @@ steps:
     waitFor: ['-']
     script: |
       #!/usr/bin/env bash
-      docker buildx build --build-arg METADATA_TAGS=$(git rev-parse HEAD) -t ${_DOCKER_URI}:$REF_NAME .
-
-  - id: "push-docker"
-    name: "gcr.io/cloud-builders/docker"
-    waitFor:
-      - "build-docker"
-    script: |
-      #!/usr/bin/env bash
-      docker push ${_DOCKER_URI}:$REF_NAME
+      docker buildx create --name container-builder --driver docker-container --bootstrap --use
+      docker buildx build --platform linux/amd64,linux/arm64 --build-arg COMMIT_SHA=$(git rev-parse HEAD) -t ${_DOCKER_URI}:$REF_NAME --push .
 
   - id: "install-dependencies"
     name: golang:1
@@ -50,7 +43,7 @@ steps:
     script: |
       #!/usr/bin/env bash
       CGO_ENABLED=0 GOOS=linux GOARCH=amd64 \
-      go build -ldflags "-X github.com/googleapis/genai-toolbox/cmd.metadataString=binary.linux.amd64.$REF_NAME" -o toolbox.linux.amd64
+      go build -ldflags "-X github.com/googleapis/genai-toolbox/cmd.commitSha=$(git rev-parse HEAD)" -o toolbox.linux.amd64
 
   - id: "store-linux-amd64"
     name: "gcr.io/cloud-builders/gcloud:latest"
@@ -72,7 +65,7 @@ steps:
     script: |
       #!/usr/bin/env bash
       CGO_ENABLED=0 GOOS=darwin GOARCH=arm64 \
-      go build -ldflags "-X github.com/googleapis/genai-toolbox/cmd.metadataString=binary.darwin.arm64.$REF_NAME" -o toolbox.darwin.arm64
+      go build -ldflags "-X github.com/googleapis/genai-toolbox/cmd.commitSha=$(git rev-parse HEAD)" -o toolbox.darwin.arm64
 
   - id: "store-darwin-arm64"
     name: "gcr.io/cloud-builders/gcloud:latest"
@@ -94,7 +87,7 @@ steps:
     script: |
       #!/usr/bin/env bash
       CGO_ENABLED=0 GOOS=darwin GOARCH=amd64 \
-      go build -ldflags "-X github.com/googleapis/genai-toolbox/cmd.metadataString=binary.darwin.amd64.$REF_NAME" -o toolbox.darwin.amd64
+      go build -ldflags "-X github.com/googleapis/genai-toolbox/cmd.commitSha=$(git rev-parse HEAD)" -o toolbox.darwin.amd64
 
   - id: "store-darwin-amd64"
     name: "gcr.io/cloud-builders/gcloud:latest"
@@ -116,7 +109,7 @@ steps:
     script: |
       #!/usr/bin/env bash
      CGO_ENABLED=0 GOOS=windows GOARCH=amd64 \
-      go build -ldflags "-X github.com/googleapis/genai-toolbox/cmd.metadataString=binary.windows.amd64.$REF_NAME" -o toolbox.windows.amd64
+      go build -ldflags "-X github.com/googleapis/genai-toolbox/cmd.commitSha=$(git rev-parse HEAD)" -o toolbox.windows.amd64
 
   - id: "store-windows-amd64"
     name: "gcr.io/cloud-builders/gcloud:latest"
@@ -124,12 +117,13 @@ steps:
       - "build-windows-amd64"
     script: |
       #!/usr/bin/env bash
-      gcloud storage cp toolbox.windows.amd64 gs://$_BUCKET_NAME/$REF_NAME/windows/amd64/toolbox
+      gcloud storage cp toolbox.windows.amd64 gs://$_BUCKET_NAME/$REF_NAME/windows/amd64/toolbox.exe
 
 options:
   automapSubstitutions: true
   dynamicSubstitutions: true
+  logging: CLOUD_LOGGING_ONLY # Necessary for custom service account
   machineType: 'E2_HIGHCPU_32'
 
 substitutions:
   _REGION: us-central1
```
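As a hedged illustration of the `-ldflags` change above: the new build steps stamp the commit SHA into the binary through the `cmd.commitSha` link-time variable. A minimal sketch of how that flag value is assembled (the SHA literal here is a placeholder; the real pipeline uses `$(git rev-parse HEAD)`):

```shell
# Sketch: assembling the -ldflags value that stamps the commit SHA into the binary.
# COMMIT_SHA is a placeholder value standing in for $(git rev-parse HEAD).
COMMIT_SHA="a352045116"
LDFLAGS="-X github.com/googleapis/genai-toolbox/cmd.commitSha=${COMMIT_SHA}"
echo "$LDFLAGS"
```

Passed as `go build -ldflags "$LDFLAGS" ...`, this sets the package-level string variable at link time without recompiling any source.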
```diff
@@ -11,20 +11,37 @@ fi
 FILES=("linux.amd64" "darwin.arm64" "darwin.amd64" "windows.amd64")
 output_string=""
 
+# Define the descriptions - ensure this array's order matches FILES
+DESCRIPTIONS=(
+  "For **Linux** systems running on **Intel/AMD 64-bit processors**."
+  "For **macOS** systems running on **Apple Silicon** (M1, M2, M3, etc.) processors."
+  "For **macOS** systems running on **Intel processors**."
+  "For **Windows** systems running on **Intel/AMD 64-bit processors**."
+)
+
 # Write the table header
-ROW_FMT="| %-93s | %73s |"
-output_string+=$(printf "$ROW_FMT" "**os/arch**" "**sha256**")$'\n'
-output_string+=$(printf "$ROW_FMT" $(printf -- '-%0.s' {1..93}) $(printf -- '-%0.s' {1..73}))$'\n'
+ROW_FMT="| %-105s | %-120s | %-67s |\n"
+output_string+=$(printf "$ROW_FMT" "**OS/Architecture**" "**Description**" "**SHA256 Hash**")$'\n'
+output_string+=$(printf "$ROW_FMT" "$(printf -- '-%0.s' {1..105})" "$(printf -- '-%0.s' {1..120})" "$(printf -- '-%0.s' {1..67})")$'\n'
 
 # Loop through all files matching the pattern "toolbox.*.*"
-for file in "${FILES[@]}"
+for i in "${!FILES[@]}"
 do
+  file_key="${FILES[$i]}" # e.g., "linux.amd64"
+  description_text="${DESCRIPTIONS[$i]}"
+
   # Extract OS and ARCH from the filename
-  OS=$(echo "$file" | cut -d '.' -f 1)
-  ARCH=$(echo "$file" | cut -d '.' -f 2)
+  OS=$(echo "$file_key" | cut -d '.' -f 1)
+  ARCH=$(echo "$file_key" | cut -d '.' -f 2)
 
   # Get release URL
-  URL=https://storage.googleapis.com/genai-toolbox/$VERSION/$OS/$ARCH/toolbox
+  if [ "$OS" = 'windows' ];
+  then
+    URL="https://storage.googleapis.com/genai-toolbox/$VERSION/$OS/$ARCH/toolbox.exe"
+  else
+    URL="https://storage.googleapis.com/genai-toolbox/$VERSION/$OS/$ARCH/toolbox"
+  fi
 
   curl "$URL" --fail --output toolbox || exit 1
 
@@ -32,10 +49,10 @@ do
   SHA256=$(shasum -a 256 toolbox | awk '{print $1}')
 
   # Write the table row
   # output_string+="| [$OS/$ARCH]($URL) | $SHA256 |\n"
-  output_string+=$(printf "$ROW_FMT" "[$OS/$ARCH]($URL)" "$SHA256")$'\n'
+  output_string+=$(printf "$ROW_FMT" "[$OS/$ARCH]($URL)" "$description_text" "$SHA256")$'\n'
 
   rm toolbox
 done
 
 printf "$output_string\n"
```
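The three-column row format introduced in the diff above can be exercised in isolation. A sketch, not part of the repository; the URL and hash below are placeholders:

```shell
# Sketch: rendering one row of the new release-notes table with the same ROW_FMT.
# The link target and hash are placeholder values for illustration only.
ROW_FMT="| %-105s | %-120s | %-67s |\n"
row=$(printf "$ROW_FMT" \
  "[linux/amd64](https://example.com/toolbox)" \
  "For **Linux** systems." \
  "0000000000000000000000000000000000000000000000000000000000000000")
echo "$row"
```

The `%-105s` style conversions left-justify and pad each cell so the generated markdown table columns line up in the raw release notes.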
```diff
@@ -24,9 +24,22 @@ steps:
     script: |
       go get -d ./...
 
-  - id: "cloud-sql-pg"
+  - id: "compile-test-binary"
     name: golang:1
     waitFor: ["install-dependencies"]
+    env:
+      - "GOPATH=/gopath"
+    volumes:
+      - name: "go"
+        path: "/gopath"
+    script: |
+      go test -c -race -cover \
+        -coverpkg=./internal/sources/...,./internal/tools/... ./tests/...
+      chmod +x .ci/test_with_coverage.sh
+
+  - id: "cloud-sql-pg"
+    name: golang:1
+    waitFor: ["compile-test-binary"]
     entrypoint: /bin/bash
     env:
       - "GOPATH=/gopath"
@@ -43,11 +56,16 @@ steps:
     args:
      - -c
      - |
-        go test -race -v -tags=integration,cloudsqlpg ./tests
+        .ci/test_with_coverage.sh \
+          "Cloud SQL Postgres" \
+          cloudsqlpg \
+          postgressql \
+          postgresexecutesql
 
   - id: "alloydb-pg"
     name: golang:1
-    waitFor: ["install-dependencies"]
+    waitFor: ["compile-test-binary"]
     entrypoint: /bin/bash
     env:
       - "GOPATH=/gopath"
@@ -64,11 +82,15 @@ steps:
    args:
      - -c
      - |
-        go test -race -v -tags=integration,alloydb ./tests
+        .ci/test_with_coverage.sh \
+          "AlloyDB Postgres" \
+          alloydbpg \
+          postgressql \
+          postgresexecutesql
 
   - id: "alloydb-ai-nl"
     name: golang:1
-    waitFor: ["install-dependencies"]
+    waitFor: ["compile-test-binary"]
     entrypoint: /bin/bash
     env:
       - "GOPATH=/gopath"
@@ -85,11 +107,14 @@ steps:
    args:
      - -c
      - |
-        go test -race -v -tags=integration,alloydb_ai_nl ./tests
+        .ci/test_with_coverage.sh \
+          "AlloyDB AI NL" \
+          alloydbainl \
+          alloydbainl
 
   - id: "bigtable"
     name: golang:1
-    waitFor: ["install-dependencies"]
+    waitFor: ["compile-test-binary"]
     entrypoint: /bin/bash
     env:
       - "GOPATH=/gopath"
@@ -104,11 +129,14 @@ steps:
    args:
      - -c
      - |
-        go test -race -v -tags=integration,bigtable ./tests
+        .ci/test_with_coverage.sh \
+          "Bigtable" \
+          bigtable \
+          bigtable
 
   - id: "bigquery"
     name: golang:1
-    waitFor: ["install-dependencies"]
+    waitFor: ["compile-test-binary"]
     entrypoint: /bin/bash
     env:
       - "GOPATH=/gopath"
@@ -121,11 +149,14 @@ steps:
    args:
      - -c
      - |
-        go test -race -v -tags=integration,bigquery ./tests
+        .ci/test_with_coverage.sh \
+          "BigQuery" \
+          bigquery \
+          bigquery
 
   - id: "postgres"
     name: golang:1
-    waitFor: ["install-dependencies"]
+    waitFor: ["compile-test-binary"]
     entrypoint: /bin/bash
     env:
       - "GOPATH=/gopath"
@@ -140,11 +171,15 @@ steps:
    args:
      - -c
      - |
-        go test -race -v -tags=integration,postgres ./tests
+        .ci/test_with_coverage.sh \
+          "Postgres" \
+          postgres \
+          postgressql \
+          postgresexecutesql
 
   - id: "spanner"
     name: golang:1
-    waitFor: ["install-dependencies"]
+    waitFor: ["compile-test-binary"]
     entrypoint: /bin/bash
     env:
       - "GOPATH=/gopath"
@@ -159,11 +194,14 @@ steps:
    args:
      - -c
      - |
-        go test -race -v -tags=integration,spanner ./tests
+        .ci/test_with_coverage.sh \
+          "Spanner" \
+          spanner \
+          spanner
 
   - id: "neo4j"
     name: golang:1
-    waitFor: ["install-dependencies"]
+    waitFor: ["compile-test-binary"]
     entrypoint: /bin/bash
     env:
       - "GOPATH=/gopath"
@@ -176,11 +214,14 @@ steps:
    args:
      - -c
      - |
-        go test -race -v -tags=integration,neo4j ./tests
+        .ci/test_with_coverage.sh \
+          "Neo4j" \
+          neo4j \
+          neo4j
 
   - id: "cloud-sql-mssql"
     name: golang:1
-    waitFor: ["install-dependencies"]
+    waitFor: ["compile-test-binary"]
     entrypoint: /bin/bash
     env:
       - "GOPATH=/gopath"
@@ -197,11 +238,14 @@ steps:
    args:
      - -c
      - |
-        go test -race -v -tags=integration,cloudsqlmssql ./tests
+        .ci/test_with_coverage.sh \
+          "Cloud SQL MSSQL" \
+          cloudsqlmssql \
+          mssql
 
   - id: "cloud-sql-mysql"
     name: golang:1
-    waitFor: ["install-dependencies"]
+    waitFor: ["compile-test-binary"]
     entrypoint: /bin/bash
     env:
       - "GOPATH=/gopath"
@@ -218,11 +262,14 @@ steps:
    args:
      - -c
      - |
-        go test -race -v -tags=integration,cloudsqlmysql ./tests
+        .ci/test_with_coverage.sh \
+          "Cloud SQL MySQL" \
+          cloudsqlmysql \
+          mysql
 
   - id: "mysql"
     name: golang:1
-    waitFor: ["install-dependencies"]
+    waitFor: ["compile-test-binary"]
     entrypoint: /bin/bash
     env:
       - "GOPATH=/gopath"
@@ -237,11 +284,14 @@ steps:
    args:
      - -c
      - |
-        go test -race -v -tags=integration,mysql ./tests
+        .ci/test_with_coverage.sh \
+          "MySQL" \
+          mysql \
+          mysql
 
   - id: "mssql"
     name: golang:1
-    waitFor: ["install-dependencies"]
+    waitFor: ["compile-test-binary"]
     entrypoint: /bin/bash
     env:
       - "GOPATH=/gopath"
@@ -256,11 +306,14 @@ steps:
    args:
      - -c
      - |
-        go test -race -v -tags=integration,mssql ./tests
+        .ci/test_with_coverage.sh \
+          "MSSQL" \
+          mssql \
+          mssql
 
   - id: "dgraph"
     name: golang:1
-    waitFor: ["install-dependencies"]
+    waitFor: ["compile-test-binary"]
     entrypoint: /bin/bash
     env:
       - "GOPATH=/gopath"
@@ -271,11 +324,14 @@ steps:
    args:
      - -c
      - |
-        go test -race -v -tags=integration,dgraph ./tests
+        .ci/test_with_coverage.sh \
+          "Dgraph" \
+          dgraph \
+          dgraph
 
   - id: "http"
     name: golang:1
-    waitFor: ["install-dependencies"]
+    waitFor: ["compile-test-binary"]
     entrypoint: /bin/bash
     env:
       - "GOPATH=/gopath"
@@ -286,11 +342,14 @@ steps:
    args:
      - -c
      - |
-        go test -race -v -tags=integration,http ./tests
+        .ci/test_with_coverage.sh \
+          "HTTP" \
+          http \
+          http
 
   - id: "sqlite"
     name: golang:1
-    waitFor: ["install-dependencies"]
+    waitFor: ["compile-test-binary"]
     entrypoint: /bin/bash
     env:
       - "GOPATH=/gopath"
@@ -302,8 +361,133 @@ steps:
    args:
      - -c
      - |
-        go test -race -v -tags=integration,sqlite ./tests
+        .ci/test_with_coverage.sh \
+          "SQLite" \
+          sqlite \
+          sqlite
+
+  - id: "couchbase"
+    name : golang:1
+    waitFor: ["compile-test-binary"]
+    entrypoint: /bin/bash
+    env:
+      - "GOPATH=/gopath"
+      - "COUCHBASE_SCOPE=$_COUCHBASE_SCOPE"
+      - "COUCHBASE_BUCKET=$_COUCHBASE_BUCKET"
+      - "SERVICE_ACCOUNT_EMAIL=$SERVICE_ACCOUNT_EMAIL"
+    secretEnv: ["COUCHBASE_CONNECTION", "COUCHBASE_USER", "COUCHBASE_PASS", "CLIENT_ID"]
+    volumes:
+      - name: "go"
+        path: "/gopath"
+    args:
+      - -c
+      - |
+        .ci/test_with_coverage.sh \
+          "Couchbase" \
+          couchbase \
+          couchbase
+
+  - id: "redis"
+    name : golang:1
+    waitFor: ["compile-test-binary"]
+    entrypoint: /bin/bash
+    env:
+      - "GOPATH=/gopath"
+      - "SERVICE_ACCOUNT_EMAIL=$SERVICE_ACCOUNT_EMAIL"
+    secretEnv: ["REDIS_ADDRESS", "REDIS_PASS", "CLIENT_ID"]
+    volumes:
+      - name: "go"
+        path: "/gopath"
+    args:
+      - -c
+      - |
+        .ci/test_with_coverage.sh \
+          "Redis" \
+          redis \
+          redis
+
+  - id: "valkey"
+    name : golang:1
+    waitFor: ["compile-test-binary"]
+    entrypoint: /bin/bash
+    env:
+      - "GOPATH=/gopath"
+      - "VALKEY_DATABASE=$_VALKEY_DATABASE"
+      - "SERVICE_ACCOUNT_EMAIL=$SERVICE_ACCOUNT_EMAIL"
+    secretEnv: ["VALKEY_ADDRESS", "CLIENT_ID"]
+    volumes:
+      - name: "go"
+        path: "/gopath"
+    args:
+      - -c
+      - |
+        .ci/test_with_coverage.sh \
+          "Valkey" \
+          valkey \
+          valkey
+
+  - id: "firestore"
+    name: golang:1
+    waitFor: ["compile-test-binary"]
+    entrypoint: /bin/bash
+    env:
+      - "GOPATH=/gopath"
+      - "FIRESTORE_PROJECT=$PROJECT_ID"
+      - "SERVICE_ACCOUNT_EMAIL=$SERVICE_ACCOUNT_EMAIL"
+    secretEnv: ["CLIENT_ID"]
+    volumes:
+      - name: "go"
+        path: "/gopath"
+    args:
+      - -c
+      - |
+        .ci/test_with_coverage.sh \
+          "Firestore" \
+          firestore \
+          firestore
+
+  - id: "looker"
+    name: golang:1
+    waitFor: ["compile-test-binary"]
+    entrypoint: /bin/bash
+    env:
+      - "GOPATH=/gopath"
+      - "FIRESTORE_PROJECT=$PROJECT_ID"
+      - "SERVICE_ACCOUNT_EMAIL=$SERVICE_ACCOUNT_EMAIL"
+      - "LOOKER_VERIFY_SSL=$_LOOKER_VERIFY_SSL"
+    secretEnv: ["CLIENT_ID", "LOOKER_BASE_URL", "LOOKER_CLIENT_ID", "LOOKER_CLIENT_SECRET"]
+    volumes:
+      - name: "go"
+        path: "/gopath"
+    args:
+      - -c
+      - |
+        .ci/test_with_coverage.sh \
+          "Looker" \
+          looker \
+          looker
+
+  - id: "alloydbwaitforoperation"
+    name: golang:1
+    waitFor: ["compile-test-binary"]
+    entrypoint: /bin/bash
+    env:
+      - "GOPATH=/gopath"
+      - "API_KEY=$(gcloud auth print-access-token)"
+    secretEnv: ["CLIENT_ID"]
+    volumes:
+      - name: "go"
+        path: "/gopath"
+    args:
+      - -c
+      - |
+        .ci/test_with_coverage.sh \
+          "Alloydb Wait for Operation" \
+          utility \
+          utility/alloydbwaitforoperation
 
 availableSecrets:
   secretManager:
     - versionName: projects/$PROJECT_ID/secrets/cloud_sql_pg_user/versions/latest
@@ -341,9 +525,27 @@ availableSecrets:
     - versionName: projects/$PROJECT_ID/secrets/mysql_pass/versions/latest
       env: MYSQL_PASS
     - versionName: projects/$PROJECT_ID/secrets/mssql_user/versions/latest
-     env: MSSQL_USER
+      env: MSSQL_USER
     - versionName: projects/$PROJECT_ID/secrets/mssql_pass/versions/latest
       env: MSSQL_PASS
+    - versionName: projects/$PROJECT_ID/secrets/couchbase_connection/versions/latest
+      env: COUCHBASE_CONNECTION
+    - versionName: projects/$PROJECT_ID/secrets/couchbase_user/versions/latest
+      env: COUCHBASE_USER
+    - versionName: projects/$PROJECT_ID/secrets/couchbase_pass/versions/latest
+      env: COUCHBASE_PASS
+    - versionName: projects/$PROJECT_ID/secrets/memorystore_redis_address/versions/latest
+      env: REDIS_ADDRESS
+    - versionName: projects/$PROJECT_ID/secrets/memorystore_redis_pass/versions/latest
+      env: REDIS_PASS
+    - versionName: projects/$PROJECT_ID/secrets/memorystore_valkey_address/versions/latest
+      env: VALKEY_ADDRESS
+    - versionName: projects/107716898620/secrets/looker_base_url/versions/latest
+      env: LOOKER_BASE_URL
+    - versionName: projects/107716898620/secrets/looker_client_id/versions/latest
+      env: LOOKER_CLIENT_ID
+    - versionName: projects/107716898620/secrets/looker_client_secret/versions/latest
+      env: LOOKER_CLIENT_SECRET
 
 options:
@@ -374,3 +576,6 @@ substitutions:
   _MSSQL_HOST: 127.0.0.1
   _MSSQL_PORT: "1433"
   _DGRAPHURL: "https://play.dgraph.io"
+  _COUCHBASE_BUCKET: "couchbase-bucket"
+  _COUCHBASE_SCOPE: "couchbase-scope"
+  _LOOKER_VERIFY_SSL: "true"
```
`.ci/test_with_coverage.sh` (new executable file, 62 lines):

```shell
#!/bin/bash

# Arguments:
# $1: Display name for logs (e.g., "Cloud SQL Postgres")
# $2: Integration test's package name (e.g., cloudsqlpg)
# $3, $4, ...: Tool package names for grep (e.g., postgressql). If the
# integration test specifically checks a separate package inside a folder, please
# specify the full path instead (e.g., postgressql/postgresexecutesql).

DISPLAY_NAME="$1"
SOURCE_PACKAGE_NAME="$2"

# Construct the test binary name
TEST_BINARY="${SOURCE_PACKAGE_NAME}.test"

# Construct the full source path
SOURCE_PATH="sources/${SOURCE_PACKAGE_NAME}/"

# Shift arguments so that $3 and onwards become the list of tool package names
shift 2
TOOL_PACKAGE_NAMES=("$@")

COVERAGE_FILE="${TEST_BINARY%.test}_coverage.out"
FILTERED_COVERAGE_FILE="${TEST_BINARY%.test}_filtered_coverage.out"

export path="github.com/googleapis/genai-toolbox/internal/"

GREP_PATTERN="^mode:|${path}${SOURCE_PATH}"
# Add each tool package path to the grep pattern
for tool_name in "${TOOL_PACKAGE_NAMES[@]}"; do
  if [ -n "$tool_name" ]; then
    full_tool_path="tools/${tool_name}/"
    GREP_PATTERN="${GREP_PATTERN}|${path}${full_tool_path}"
  fi
done

# Run integration test
if ! ./"${TEST_BINARY}" -test.v -test.coverprofile="${COVERAGE_FILE}"; then
  echo "Error: Tests for ${DISPLAY_NAME} failed. Exiting."
  exit 1
fi

# Filter source/tool packages
if ! grep -E "${GREP_PATTERN}" "${COVERAGE_FILE}" > "${FILTERED_COVERAGE_FILE}"; then
  echo "Warning: Could not filter coverage for ${DISPLAY_NAME}. Filtered file might be empty or invalid."
fi

# Calculate coverage
echo "Calculating coverage for ${DISPLAY_NAME}..."
total_coverage=$(go tool cover -func="${FILTERED_COVERAGE_FILE}" 2>/dev/null | grep "total:" | awk '{print $3}')

echo "${DISPLAY_NAME} total coverage: $total_coverage"
coverage_numeric=$(echo "$total_coverage" | sed 's/%//')

# Check coverage threshold
if awk -v coverage="$coverage_numeric" 'BEGIN {exit !(coverage < 50)}'; then
  echo "Coverage failure: ${DISPLAY_NAME} total coverage($total_coverage) is below 50%."
  exit 1
else
  echo "Coverage for ${DISPLAY_NAME} is sufficient."
fi
```
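The coverage gate in `test_with_coverage.sh` delegates the floating-point comparison to awk, since bash's own arithmetic only handles integers. A standalone sketch of just that check, with a placeholder value in place of the `go tool cover` output:

```shell
# Sketch: the awk-based float comparison used for the 50% coverage gate.
# coverage_numeric is a placeholder; the script derives it from `go tool cover`.
coverage_numeric="47.5"
if awk -v coverage="$coverage_numeric" 'BEGIN {exit !(coverage < 50)}'; then
  result="below threshold"   # awk exits 0 when coverage < 50, so the if-branch fires
else
  result="sufficient"
fi
echo "$result"
```

The inverted `exit !(...)` maps the truth of the comparison onto awk's exit status, which `if` then interprets: exit 0 (coverage below 50) takes the failure branch.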
```diff
@@ -18,19 +18,13 @@ steps:
     script: |
       #!/usr/bin/env bash
       export VERSION=$(cat ./cmd/version.txt)
-      docker buildx build --build-arg METADATA_TAGS=$(git rev-parse HEAD) -t ${_DOCKER_URI}:$VERSION -t ${_DOCKER_URI}:latest .
+      docker buildx create --name container-builder --driver docker-container --bootstrap --use
 
   - id: "push-docker"
     name: "gcr.io/cloud-builders/docker"
     waitFor:
       - "build-docker"
     script: |
       #!/usr/bin/env bash
       export VERSION=$(cat ./cmd/version.txt)
-      docker push ${_DOCKER_URI}:$VERSION
+      export TAGS="-t ${_DOCKER_URI}:$VERSION"
       if [[ $_PUSH_LATEST == 'true' ]]; then
-        docker push ${_DOCKER_URI}:latest
+        export TAGS="$TAGS -t ${_DOCKER_URI}:latest"
       fi
+      docker buildx build --platform linux/amd64,linux/arm64 --build-arg BUILD_TYPE=container.release --build-arg COMMIT_SHA=$(git rev-parse HEAD) $TAGS --push .
 
   - id: "install-dependencies"
     name: golang:1
@@ -56,7 +50,7 @@ steps:
       #!/usr/bin/env bash
       export VERSION=$(cat ./cmd/version.txt)
       CGO_ENABLED=0 GOOS=linux GOARCH=amd64 \
-      go build -ldflags "-X github.com/googleapis/genai-toolbox/cmd.metadataString=binary.linux.amd64.$VERSION.$(git rev-parse HEAD)" -o toolbox.linux.amd64
+      go build -ldflags "-X github.com/googleapis/genai-toolbox/cmd.buildType=binary -X github.com/googleapis/genai-toolbox/cmd.commitSha=$(git rev-parse HEAD)" -o toolbox.linux.amd64
 
   - id: "store-linux-amd64"
     name: "gcr.io/cloud-builders/gcloud:latest"
@@ -80,7 +74,7 @@ steps:
       #!/usr/bin/env bash
       export VERSION=$(cat ./cmd/version.txt)
       CGO_ENABLED=0 GOOS=darwin GOARCH=arm64 \
-      go build -ldflags "-X github.com/googleapis/genai-toolbox/cmd.metadataString=binary.darwin.arm64.$VERSION.$(git rev-parse HEAD)" -o toolbox.darwin.arm64
+      go build -ldflags "-X github.com/googleapis/genai-toolbox/cmd.buildType=binary -X github.com/googleapis/genai-toolbox/cmd.commitSha=$(git rev-parse HEAD)" -o toolbox.darwin.arm64
 
   - id: "store-darwin-arm64"
     name: "gcr.io/cloud-builders/gcloud:latest"
@@ -104,7 +98,7 @@ steps:
       #!/usr/bin/env bash
       export VERSION=$(cat ./cmd/version.txt)
       CGO_ENABLED=0 GOOS=darwin GOARCH=amd64 \
-      go build -ldflags "-X github.com/googleapis/genai-toolbox/cmd.metadataString=binary.darwin.amd64.$VERSION.$(git rev-parse HEAD)" -o toolbox.darwin.amd64
+      go build -ldflags "-X github.com/googleapis/genai-toolbox/cmd.buildType=binary -X github.com/googleapis/genai-toolbox/cmd.commitSha=$(git rev-parse HEAD)" -o toolbox.darwin.amd64
 
   - id: "store-darwin-amd64"
     name: "gcr.io/cloud-builders/gcloud:latest"
@@ -128,7 +122,7 @@ steps:
       #!/usr/bin/env bash
       export VERSION=$(cat ./cmd/version.txt)
       CGO_ENABLED=0 GOOS=windows GOARCH=amd64 \
-      go build -ldflags "-X github.com/googleapis/genai-toolbox/cmd.metadataString=binary.windows.amd64.$VERSION.$(git rev-parse HEAD)" -o toolbox.windows.amd64
+      go build -ldflags "-X github.com/googleapis/genai-toolbox/cmd.buildType=binary -X github.com/googleapis/genai-toolbox/cmd.commitSha=$(git rev-parse HEAD)" -o toolbox.windows.amd64
 
   - id: "store-windows-amd64"
     name: "gcr.io/cloud-builders/gcloud:latest"
@@ -137,12 +131,13 @@ steps:
     script: |
       #!/usr/bin/env bash
       export VERSION=v$(cat ./cmd/version.txt)
-      gcloud storage cp toolbox.windows.amd64 gs://$_BUCKET_NAME/$VERSION/windows/amd64/toolbox
+      gcloud storage cp toolbox.windows.amd64 gs://$_BUCKET_NAME/$VERSION/windows/amd64/toolbox.exe
 
 options:
   automapSubstitutions: true
   dynamicSubstitutions: true
+  logging: CLOUD_LOGGING_ONLY # Necessary for custom service account
   machineType: 'E2_HIGHCPU_32'
 
 substitutions:
   _REGION: us-central1
```
`.github/blunderbuss.yml` (vendored, 13 lines changed):

```diff
@@ -1,8 +1,15 @@
 assign_issues:
-  - kurtisvg
   - Yuan325
   - duwenxin99
+  - akitsch
+assign_issues_by:
+  - labels:
+      - 'product: bigquery'
+    to:
+      - Genesis929
+      - shobsi
+      - jiaxunwu
 assign_prs:
-  - kurtisvg
   - Yuan325
   - duwenxin99
+  - akitsch
```
`.github/label-sync.yml` (vendored, new file, 2 lines):

```diff
@@ -0,0 +1,2 @@
+---
+ignored: true
```
`.github/labels.yaml` (vendored, 22 lines changed):

```diff
@@ -50,7 +50,7 @@
   color: ffffc7
   description: Desirable enhancement or fix. May not be included in next release.
 
-- name: do not merge
+- name: 'do not merge'
   color: d93f0b
   description: Indicates a pull request not ready for merge, due to either quality
     or timing.
@@ -65,6 +65,26 @@
   color: ededed
   description: Release please has completed a release for this.
 
+- name: 'blunderbuss: assign'
+  color: 3DED97
+  description: Have blunderbuss assign this to someone new.
+
+- name: 'tests: run'
+  color: 3DED97
+  description: Label to trigger Github Action tests.
+
+- name: 'docs: deploy-preview'
+  color: BFDADC
+  description: Label to trigger Github Action docs preview.
+
+- name: 'status: help wanted'
+  color: 8befd7
+  description: 'Status: Unplanned work open to contributions from the community.'
+- name: 'status: feedback wanted'
+  color: 8befd7
+  description: 'Status: waiting for feedback from community or issue author.'
+
 # Product Labels
 - name: 'product: bigquery'
   color: 5065c7
   description: 'Product: Assigned to the BigQuery team.'
```
14 .github/release-please.yml vendored
@@ -20,6 +20,18 @@ extraFiles: [
  "README.md",
  "docs/en/getting-started/introduction/_index.md",
  "docs/en/getting-started/local_quickstart.md",
  "docs/en/getting-started/local_quickstart_js.md",
  "docs/en/getting-started/local_quickstart_go.md",
  "docs/en/getting-started/mcp_quickstart/_index.md",
  "docs/en/how-to/deploy_gke.md",
  "docs/en/samples/bigquery/local_quickstart.md",
  "docs/en/samples/bigquery/mcp_quickstart/_index.md",
  "docs/en/getting-started/colab_quickstart.ipynb",
  "docs/en/samples/bigquery/colab_quickstart_bigquery.ipynb",
  "docs/en/how-to/connect-ide/bigquery_mcp.md",
  "docs/en/how-to/connect-ide/spanner_mcp.md",
  "docs/en/how-to/connect-ide/alloydb_pg_mcp.md",
  "docs/en/how-to/connect-ide/cloud_sql_mysql_mcp.md",
  "docs/en/how-to/connect-ide/cloud_sql_pg_mcp.md",
  "docs/en/how-to/connect-ide/postgres_mcp.md",
  "docs/en/how-to/connect-ide/cloud_sql_mssql_mcp.md",
]
2 .github/renovate.json5 vendored
@@ -1,7 +1,7 @@
{
  extends: [
    'config:recommended',
    ':semanticCommits',
    ':semanticCommitTypeAll(chore)',
    ':ignoreUnstable',
    ':separateMajorReleases',
    ':prConcurrentLimitNone',
2 .github/sync-repo-settings.yaml vendored
@@ -31,6 +31,8 @@ branchProtectionRules:
      - "header-check"
      # - Add required status checks like presubmit tests
      - "unit tests (ubuntu-latest)"
      - "unit tests (windows-latest)"
      - "unit tests (macos-latest)"
      - "integration-test-pr (toolbox-testing-438616)"
    requiredApprovingReviewCount: 1
    requiresCodeOwnerReviews: true
2 .github/workflows/docs_deploy.yaml vendored
@@ -31,7 +31,7 @@ on:

jobs:
  deploy:
    runs-on: ubuntu-22.04
    runs-on: ubuntu-24.04
    defaults:
      run:
        working-directory: .hugo
22 .github/workflows/docs_preview_deploy.yaml vendored
@@ -17,7 +17,7 @@ name: "docs"
permissions:
  contents: write
  pull-requests: write

# This Workflow depends on 'github.event.number',
# not compatible with branch or manual triggers.
on:
@@ -26,10 +26,20 @@ on:
    paths:
      - 'docs/**'
      - 'github/workflows/docs**'
      - '.hugo'
      - '.hugo/**'
  pull_request_target:
    types: [labeled]
    paths:
      - 'docs/**'
      - 'github/workflows/docs**'
      - '.hugo/**'

jobs:
  preview:
    # run job on proper workflow event triggers (skip job for pull_request event
    # from forks and only run pull_request_target for "docs: deploy-preview"
    # label)
    if: "${{ (github.event.action != 'labeled' && github.event.pull_request.head.repo.full_name == github.event.pull_request.base.repo.full_name) || github.event.label.name == 'docs: deploy-preview' }}"
    runs-on: ubuntu-24.04
    defaults:
      run:
@@ -41,6 +51,8 @@ jobs:
    steps:
      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
        with:
          # Checkout the PR's HEAD commit (supports forks).
          ref: ${{ github.event.pull_request.head.sha }}
          fetch-depth: 0 # Fetch all history for .GitInfo and .Lastmod

      - name: Setup Hugo
@@ -70,8 +82,6 @@ jobs:
          HUGO_RELATIVEURLS: false

      - name: Deploy
        # If run from a fork, GitHub write operations will fail.
        if: ${{ !github.event.pull_request.head.repo.fork }}
        uses: peaceiris/actions-gh-pages@4f9cc6602d3f66b9c108549d475ec49e8ef4d45e # v4
        with:
          github_token: ${{ secrets.GITHUB_TOKEN }}
@@ -80,8 +90,6 @@ jobs:
          commit_message: "stage: PR-${{ github.event.number }}: ${{ github.event.head_commit.message }}"

      - name: Comment
        # If run from a fork, GitHub write operations will fail.
        if: ${{ !github.event.pull_request.head.repo.fork }}
        uses: actions/github-script@60a0d83039c74a4aee543508d2ffcb1c3799cdea # v7
        with:
          script: |
@@ -90,4 +98,4 @@ jobs:
              owner: context.repo.owner,
              repo: context.repo.repo,
              body: "🔎 Preview at https://${{ github.repository_owner }}.github.io/${{ github.event.repository.name }}/previews/PR-${{ github.event.number }}/"
            })
          })
4 .github/workflows/lint.yaml vendored
@@ -51,7 +51,7 @@ jobs:
            console.log('Failed to remove label. Another job may have already removed it!');
          }
      - name: Setup Go
        uses: actions/setup-go@0aaccfd150d50ccaeb58ebd88d36e91967a5f35b # v5.4.0
        uses: actions/setup-go@d35c59abb061a4a6fb18e82ac0862c26744d6ab5 # v5.5.0
        with:
          go-version: "1.22"
      - name: Checkout code
@@ -66,7 +66,7 @@ jobs:
        run: |
          go mod tidy && git diff --exit-code
      - name: golangci-lint
        uses: golangci/golangci-lint-action@55c2c1448f86e01eaae002a5a3a9624417608d84 # v6.5.2
        uses: golangci/golangci-lint-action@4afd733a84b1f43292c63897423277bb7f4313a9 # v8.0.0
        with:
          version: latest
          args: --timeout 3m
17 .github/workflows/tests.yaml vendored
@@ -32,8 +32,7 @@ jobs:
    runs-on: ${{ matrix.os }}
    strategy:
      matrix:
        # os: [macos-latest, windows-latest, ubuntu-latest]
        os: [ubuntu-latest]
        os: [macos-latest, windows-latest, ubuntu-latest]
      fail-fast: false
    permissions:
      contents: 'read'
@@ -58,7 +57,7 @@ jobs:
          }

      - name: Setup Go
        uses: actions/setup-go@0aaccfd150d50ccaeb58ebd88d36e91967a5f35b # v5.4.0
        uses: actions/setup-go@d35c59abb061a4a6fb18e82ac0862c26744d6ab5 # v5.5.0
        with:
          go-version: "1.22"

@@ -75,16 +74,24 @@ jobs:
      - name: Build
        run: go build -v ./...

      - name: Run tests
      - name: Run tests with coverage
        if: ${{ runner.os == 'Linux' }}
        run: |
          source_dir="./internal/sources/*"
          tool_dir="./internal/tools/*"
          auth_dir="./internal/auth/*"
          included_packages=$(go list ./... | grep -v -e "$source_dir" -e "$tool_dir" -e "$auth_dir")
          int_test_dir="./tests/*"
          included_packages=$(go list ./... | grep -v -e "$source_dir" -e "$tool_dir" -e "$auth_dir" -e "$int_test_dir")
          go test -race -cover -coverprofile=coverage.out -v $included_packages
          go test -race -v ./internal/sources/... ./internal/tools/... ./internal/auth/...

      - name: Run tests without coverage
        if: ${{ runner.os != 'Linux' }}
        run: |
          go test -race -v ./internal/... ./cmd/...

      - name: Check coverage
        if: ${{ runner.os == 'Linux' }}
        run: |
          FILE_TO_EXCLUDE="github.com/googleapis/genai-toolbox/internal/server/config.go"
          ESCAPED_PATH=$(echo "$FILE_TO_EXCLUDE" | sed 's/\//\\\//g; s/\./\\\./g')
7 .gitignore vendored
@@ -4,6 +4,9 @@
# vscode
.vscode/

# idea
.idea/

# npm
node_modules

@@ -14,3 +17,7 @@ node_modules

# coverage
.coverage

# executable
genai-toolbox
toolbox
@@ -12,39 +12,26 @@
# See the License for the specific language governing permissions and
# limitations under the License.

version: "2"
linters:
  enable:
    - errcheck
    - goimports
    - gosimple
    - govet
    - ineffassign
    - staticcheck
    - unused
linters-settings:
  gofmt:
    rewrite-rules:
      - pattern: 'interface{}'
        replacement: 'any'
      - pattern: 'a[b:len(a)]'
        replacement: 'a[b:]'
  exclusions:
    presets:
      - std-error-handling
issues:
  fix: true
run:
  build-tags:
    - integration
    - cloudsqlpg
    - postgres
    - alloydb
    - spanner
    - cloudsqlmssql
    - cloudsqlmysql
    - neo4j
    - dgraph
    - mssql
    - mysql
    - http
    - alloydb_ai_nl
    - bigtable
    - bigquery
    - sqlite
formatters:
  enable:
    - goimports
  settings:
    gofmt:
      rewrite-rules:
        - pattern: interface{}
          replacement: any
        - pattern: a[b:len(a)]
          replacement: a[b:]
1 .hugo/assets/scss/_styles_project.scss Normal file
@@ -0,0 +1 @@
@import 'td/code-dark';
@@ -17,7 +17,7 @@ enableRobotsTXT = true
proxy = "direct"
[module.hugoVersion]
  extended = true
  min = "0.73.0"
  min = "0.146.0"
[[module.mounts]]
  source = "../docs/en"
  target = 'content'
@@ -28,6 +28,7 @@ enableRobotsTXT = true
  path = "github.com/martignoni/hugo-notice"

[params]
description = "MCP Toolbox for Databases is an open source MCP server for databases. It enables you to develop tools easier, faster, and more securely by handling the complexities such as connection pooling, authentication, and more."
copyright = "Google LLC"
github_repo = "https://github.com/googleapis/genai-toolbox"
github_project_repo = "https://github.com/googleapis/genai-toolbox"
@@ -47,4 +48,23 @@ enableRobotsTXT = true
pre = "<i class='fa-brands fa-github'></i>"

[markup.goldmark.renderer]
unsafe= true

[markup.highlight]
noClasses = false
style = "tango"

[outputFormats]
[outputFormats.LLMS]
mediaType = "text/plain"
baseName = "llms"
isPlainText = true
root = true
[outputFormats.LLMS-FULL]
mediaType = "text/plain"
baseName = "llms-full"
isPlainText = true
root = true

[outputs]
home = ["HTML", "RSS", "LLMS", "LLMS-FULL"]
@@ -1 +0,0 @@
|
||||
{{ template "_default/_markup/td-render-heading.html" . }}
|
||||
14 .hugo/layouts/index.llms-full.txt Normal file
@@ -0,0 +1,14 @@
{{ .Site.Params.description }}

{{ range .Site.Sections }}
# {{ .Title }}
{{ .Description }}
{{ range .Pages }}
# {{ .Title }}
{{ .Description }}
{{ .RawContent }}
{{ range .Pages }}
# {{ .Title }}
{{ .Description }}
{{ .RawContent }}
{{end }}{{ end }}{{ end }}
9 .hugo/layouts/index.llms.txt Normal file
@@ -0,0 +1,9 @@
# {{ .Site.Title }}

> {{ .Site.Params.description }}

## Docs
{{ range .Site.Sections }}
### {{ .Title }}

{{ .Description }}{{ range .Pages }}- [{{ .Title }}]({{ .Permalink }}): {{ .Description }}{{ range .Pages }}  - [{{ .Title }}]({{ .Permalink }}): {{ .Description }}{{end }}{{ end }}{{ end }}
1 .hugo/layouts/partials/td/render-heading.html Normal file
@@ -0,0 +1 @@
{{ template "partials/td/render-heading.html" . }}
2 .hugo/layouts/shortcodes/include.html Normal file
@@ -0,0 +1,2 @@
{{ $file := .Get 0 }}
{{ (printf "%s%s" .Page.File.Dir $file) | readFile | replaceRE "^---[\\s\\S]+?---" "" | safeHTML }}
97 CHANGELOG.md
@@ -1,5 +1,102 @@
# Changelog

## [0.9.0](https://github.com/googleapis/genai-toolbox/compare/v0.8.0...v0.9.0) (2025-07-11)

### Features

* Dynamic reloading for toolbox config ([#800](https://github.com/googleapis/genai-toolbox/issues/800)) ([4c240ac](https://github.com/googleapis/genai-toolbox/commit/4c240ac3c961cd14738c998ba2d10d5235ef523e))
* **sources/mysql:** Add queryTimeout support to MySQL source ([#830](https://github.com/googleapis/genai-toolbox/issues/830)) ([391cb5b](https://github.com/googleapis/genai-toolbox/commit/391cb5bfe845e554411240a1d9838df5331b25fa))
* **tools/bigquery:** Add optional projectID parameter to bigquery tools ([#799](https://github.com/googleapis/genai-toolbox/issues/799)) ([c6ab74c](https://github.com/googleapis/genai-toolbox/commit/c6ab74c5dad53a0e7885a18438ab3be36b9b7cb3))

### Bug Fixes

* Cleanup unassigned err log ([#857](https://github.com/googleapis/genai-toolbox/issues/857)) ([c081ace](https://github.com/googleapis/genai-toolbox/commit/c081ace46bb24cb3fd2adb21d519489be0d3f3c3))
* Fix docs preview deployment pipeline ([#787](https://github.com/googleapis/genai-toolbox/issues/787)) ([0a93b04](https://github.com/googleapis/genai-toolbox/commit/0a93b0482c8d3c64b324e67408d408f5576ecaf3))
* **tools:** Nil parameter error when arrays are used ([#801](https://github.com/googleapis/genai-toolbox/issues/801)) ([2bdcc08](https://github.com/googleapis/genai-toolbox/commit/2bdcc0841ab37d18e2f0d6fe63fb6f10da3e302b))
* Trigger reload on additional fsnotify operations ([#854](https://github.com/googleapis/genai-toolbox/issues/854)) ([aa8dbec](https://github.com/googleapis/genai-toolbox/commit/aa8dbec97095cf0d7ac771c8084a84e2d3d8ce4e))

## [0.8.0](https://github.com/googleapis/genai-toolbox/compare/v0.7.0...v0.8.0) (2025-07-02)

### ⚠ BREAKING CHANGES

* **postgres,mssql,cloudsqlmssql:** encode source connection url for sources ([#727](https://github.com/googleapis/genai-toolbox/issues/727))

### Features

* Add support for multiple YAML configuration files ([#760](https://github.com/googleapis/genai-toolbox/issues/760)) ([40679d7](https://github.com/googleapis/genai-toolbox/commit/40679d700eded50d19569923e2a71c51e907a8bf))
* Add support for optional parameters ([#617](https://github.com/googleapis/genai-toolbox/issues/617)) ([4827771](https://github.com/googleapis/genai-toolbox/commit/4827771b78dee9a1284a898b749509b472061527)), closes [#475](https://github.com/googleapis/genai-toolbox/issues/475)
* **mcp:** Support MCP version 2025-03-26 ([#755](https://github.com/googleapis/genai-toolbox/issues/755)) ([474df57](https://github.com/googleapis/genai-toolbox/commit/474df57d62de683079f8d12c31db53396a545fd1))
* **sources/http:** Support disable SSL verification for HTTP Source ([#674](https://github.com/googleapis/genai-toolbox/issues/674)) ([4055b0c](https://github.com/googleapis/genai-toolbox/commit/4055b0c3569c527560d7ad34262963b3dd4e282d))
* **tools/bigquery:** Add templateParameters field for bigquery ([#699](https://github.com/googleapis/genai-toolbox/issues/699)) ([f5f771b](https://github.com/googleapis/genai-toolbox/commit/f5f771b0f3d159630ff602ff55c6c66b61981446))
* **tools/bigtable:** Add templateParameters field for bigtable ([#692](https://github.com/googleapis/genai-toolbox/issues/692)) ([1c06771](https://github.com/googleapis/genai-toolbox/commit/1c067715fac06479eb0060d7067b73dba099ed92))
* **tools/couchbase:** Add templateParameters field for couchbase ([#723](https://github.com/googleapis/genai-toolbox/issues/723)) ([9197186](https://github.com/googleapis/genai-toolbox/commit/9197186b8bea1ac4ec1b39c9c5c110807c8b2ba9))
* **tools/http:** Add support for HTTP Tool pathParams ([#726](https://github.com/googleapis/genai-toolbox/issues/726)) ([fd300dc](https://github.com/googleapis/genai-toolbox/commit/fd300dc606d88bf9f7bba689e2cee4e3565537dd))
* **tools/redis:** Add Redis Source and Tool ([#519](https://github.com/googleapis/genai-toolbox/issues/519)) ([f0aef29](https://github.com/googleapis/genai-toolbox/commit/f0aef29b0c2563e2a00277fbe2784f39f16d2835))
* **tools/spanner:** Add templateParameters field for spanner ([#691](https://github.com/googleapis/genai-toolbox/issues/691)) ([075dfa4](https://github.com/googleapis/genai-toolbox/commit/075dfa47e1fd92be4847bd0aec63296146b66455))
* **tools/sqlitesql:** Add templateParameters field for sqlitesql ([#687](https://github.com/googleapis/genai-toolbox/issues/687)) ([75e254c](https://github.com/googleapis/genai-toolbox/commit/75e254c0a4ce690ca5fa4d1741550ce54734b226))
* **tools/valkey:** Add Valkey Source and Tool ([#532](https://github.com/googleapis/genai-toolbox/issues/532)) ([054ec19](https://github.com/googleapis/genai-toolbox/commit/054ec198b97ba9f36f67dd12b2eff0cc6bc4d080))

### Bug Fixes

* **bigquery,mssql:** Fix panic on tools with array param ([#722](https://github.com/googleapis/genai-toolbox/issues/722)) ([7a6644c](https://github.com/googleapis/genai-toolbox/commit/7a6644cf0c5413e5c803955c88a2cfd0a2233ed3))
* **postgres,mssql,cloudsqlmssql:** Encode source connection url for sources ([#727](https://github.com/googleapis/genai-toolbox/issues/727)) ([67964d9](https://github.com/googleapis/genai-toolbox/commit/67964d939f27320b63b5759f4b3f3fdaa0c76fbf)), closes [#717](https://github.com/googleapis/genai-toolbox/issues/717)
* Set default value to field's type during unmarshalling ([#774](https://github.com/googleapis/genai-toolbox/issues/774)) ([fafed24](https://github.com/googleapis/genai-toolbox/commit/fafed2485839cf1acc1350e8a24103d2e6356ee0)), closes [#771](https://github.com/googleapis/genai-toolbox/issues/771)
* **server/mcp:** Do not listen from port for stdio ([#719](https://github.com/googleapis/genai-toolbox/issues/719)) ([d51dbc7](https://github.com/googleapis/genai-toolbox/commit/d51dbc759ba493021d3ec6f5417fc04c21f7044f)), closes [#711](https://github.com/googleapis/genai-toolbox/issues/711)
* **tools/mysqlexecutesql:** Handle nil panic and connection leak in Invoke ([#757](https://github.com/googleapis/genai-toolbox/issues/757)) ([7badba4](https://github.com/googleapis/genai-toolbox/commit/7badba42eefb34252be77b852a57d6bd78dd267d))
* **tools/mysqlsql:** Handle nil panic and connection leak in invoke ([#758](https://github.com/googleapis/genai-toolbox/issues/758)) ([cbb4a33](https://github.com/googleapis/genai-toolbox/commit/cbb4a333517313744800d148840312e56340f3fd))

## [0.7.0](https://github.com/googleapis/genai-toolbox/compare/v0.6.0...v0.7.0) (2025-06-10)

### Features

* Add templateParameters field for mssqlsql ([#671](https://github.com/googleapis/genai-toolbox/issues/671)) ([b81fc6a](https://github.com/googleapis/genai-toolbox/commit/b81fc6aa6ccdfbc15676fee4d87041d9ad9682fa))
* Add templateParameters field for mysqlsql ([#663](https://github.com/googleapis/genai-toolbox/issues/663)) ([0a08d2c](https://github.com/googleapis/genai-toolbox/commit/0a08d2c15dcbec18bb556f4dc49792ba0c69db46))
* **metrics:** Add user agent for prebuilt tools ([#669](https://github.com/googleapis/genai-toolbox/issues/669)) ([29aa0a7](https://github.com/googleapis/genai-toolbox/commit/29aa0a70da3c2eb409a38993b3782da8bec7cb85))
* **tools/postgressql:** Add templateParameters field ([#615](https://github.com/googleapis/genai-toolbox/issues/615)) ([b763469](https://github.com/googleapis/genai-toolbox/commit/b76346993f298b4f7493a51405d0a287bacce05f))

### Bug Fixes

* Improve versionString ([#658](https://github.com/googleapis/genai-toolbox/issues/658)) ([cf96f4c](https://github.com/googleapis/genai-toolbox/commit/cf96f4c249f0692e3eb19fc56c794ca6a3079307))
* **server/stdio:** Notifications should not return a response ([#638](https://github.com/googleapis/genai-toolbox/issues/638)) ([69d047a](https://github.com/googleapis/genai-toolbox/commit/69d047af46f1ec00f236db8a978a7a7627217fd2))
* **tools/mysqlsql:** Handled the null value for string case in mysqlsql tools ([#641](https://github.com/googleapis/genai-toolbox/issues/641)) ([ef94648](https://github.com/googleapis/genai-toolbox/commit/ef94648455c3b20adda4f8cff47e70ddccac8c06))
* Update path library ([#678](https://github.com/googleapis/genai-toolbox/issues/678)) ([4998f82](https://github.com/googleapis/genai-toolbox/commit/4998f8285287b5daddd0043540f2cf871e256db5)), closes [#662](https://github.com/googleapis/genai-toolbox/issues/662)

## [0.6.0](https://github.com/googleapis/genai-toolbox/compare/v0.5.0...v0.6.0) (2025-05-28)

### Features

* Add Execute sql tool for SQL Server(MSSQL) ([#585](https://github.com/googleapis/genai-toolbox/issues/585)) ([6083a22](https://github.com/googleapis/genai-toolbox/commit/6083a224aa650caf4e132b4a704323c5f18c4986))
* Add mysql-execute-sql tool ([#577](https://github.com/googleapis/genai-toolbox/issues/577)) ([8590061](https://github.com/googleapis/genai-toolbox/commit/8590061ae4908da0e4b1bd6f7cf7ee8d972fa5ba))
* Add new BigQuery tools: execute_sql, list_datatset_ids, list_table_ids, get_dataset_info, get_table_info ([0fd88b5](https://github.com/googleapis/genai-toolbox/commit/0fd88b574b4ab0d3bee4585999b814675d3b74ed))
* Add spanner-execute-sql tool ([#576](https://github.com/googleapis/genai-toolbox/issues/576)) ([d65747a](https://github.com/googleapis/genai-toolbox/commit/d65747a2dcf3022f22c86a1524ee28c8229f7c20))
* Add support for read-only in Spanner tool ([#563](https://github.com/googleapis/genai-toolbox/issues/563)) ([6512704](https://github.com/googleapis/genai-toolbox/commit/6512704e77088d92fea53a85c6e6cbf4b99c988d))
* Adding support for the --prebuilt flag ([#604](https://github.com/googleapis/genai-toolbox/issues/604)) ([a29c800](https://github.com/googleapis/genai-toolbox/commit/a29c80012eec4729187c12968b53051d20b263a7))
* Support MCP stdio transport protocol ([#607](https://github.com/googleapis/genai-toolbox/issues/607)) ([1702ce1](https://github.com/googleapis/genai-toolbox/commit/1702ce1e00a52170a4271ac999caf534ba00196f))

### Bug Fixes

* Explicitly set query location for BigQuery queries ([#586](https://github.com/googleapis/genai-toolbox/issues/586)) ([eb52b66](https://github.com/googleapis/genai-toolbox/commit/eb52b66d82aaa11be6b1489335f49cba8168099b))
* Fix spellings in comments ([#561](https://github.com/googleapis/genai-toolbox/issues/561)) ([b58bf76](https://github.com/googleapis/genai-toolbox/commit/b58bf76ddaba407e3fd995dfe86d00a09484e14a))
* Prevent tool calls through MCP when auth is required ([#544](https://github.com/googleapis/genai-toolbox/issues/544)) ([e747b6e](https://github.com/googleapis/genai-toolbox/commit/e747b6e289730c17f68be8dec0c6fa6021bb23bd))
* Reinitialize required slice if nil ([#571](https://github.com/googleapis/genai-toolbox/issues/571)) ([04dcf47](https://github.com/googleapis/genai-toolbox/commit/04dcf4791272e1dd034b9a03664dd8dbe77fdddd)), closes [#564](https://github.com/googleapis/genai-toolbox/issues/564)

## [0.5.0](https://github.com/googleapis/genai-toolbox/compare/v0.4.0...v0.5.0) (2025-05-06)

### Features

* Add Couchbase as Source and Tool ([#307](https://github.com/googleapis/genai-toolbox/issues/307)) ([d7390b0](https://github.com/googleapis/genai-toolbox/commit/d7390b06b7bcb15411388e9a4dbcfe75afcca1ee))
* Add postgres-execute-sql tool ([#490](https://github.com/googleapis/genai-toolbox/issues/490)) ([11ea7bc](https://github.com/googleapis/genai-toolbox/commit/11ea7bc584aa4ca8e8b0e7a355f6666ccbea2883))

## [0.4.0](https://github.com/googleapis/genai-toolbox/compare/v0.3.0...v0.4.0) (2025-04-23)
@@ -14,21 +14,21 @@ race, religion, or sexual identity and orientation.
Examples of behavior that contributes to creating a positive environment
include:

* Using welcoming and inclusive language
* Being respectful of differing viewpoints and experiences
* Gracefully accepting constructive criticism
* Focusing on what is best for the community
* Showing empathy towards other community members

Examples of unacceptable behavior by participants include:

* The use of sexualized language or imagery and unwelcome sexual attention or
  advances
* Trolling, insulting/derogatory comments, and personal or political attacks
* Public or private harassment
* Publishing others' private information, such as a physical or electronic
  address, without explicit permission
* Other conduct which could reasonably be considered inappropriate in a
  professional setting

## Our Responsibilities
@@ -75,7 +75,7 @@ receive and address reported violations of the code of conduct. They will then
work with a committee consisting of representatives from the Open Source
Programs Office and the Google Open Source Strategy team. If for any reason you
are uncomfortable reaching out to the Project Steward, please email
opensource@google.com.
<opensource@google.com>.

We will investigate every complaint, but you may not receive a direct response.
We will use our discretion in determining when and how to follow up on reported
@@ -90,4 +90,4 @@ harassment or threats to anyone's safety, we may take action without notice.

This Code of Conduct is adapted from the Contributor Covenant, version 1.4,
available at
https://www.contributor-covenant.org/version/1/4/code-of-conduct.html
<https://www.contributor-covenant.org/version/1/4/code-of-conduct.html>
151 CONTRIBUTING.md
@@ -30,4 +30,153 @@ This project follows
All submissions, including submissions by project members, require review. We
use GitHub pull requests for this purpose. Consult
[GitHub Help](https://help.github.com/articles/about-pull-requests/) for more
information on using pull requests.

Within 2-5 days, a reviewer will review your PR. They may approve it, or request
changes. When requesting changes, reviewers should self-assign the PR to ensure
they are aware of any updates.
If additional changes are needed, push additional commits to your PR branch -
this helps the reviewer know which parts of the PR have changed. Commits will be
squashed when merged.
Please follow up with changes promptly. If a PR is awaiting changes by the
author for more than 10 days, maintainers may mark that PR as Draft. PRs that
are inactive for more than 30 days may be closed.

### Adding a New Database Source and Tool

We recommend creating an
[issue](https://github.com/googleapis/genai-toolbox/issues) before
implementation to ensure we can accept the contribution and avoid duplicated
work. If you have any questions, reach out on our
[Discord](https://discord.gg/Dmm69peqjh) to chat directly with the team. New
contributions should be added with both unit tests and integration tests.

#### 1. Implement the New Data Source

We recommend looking at an [example source
implementation](https://github.com/googleapis/genai-toolbox/blob/main/internal/sources/postgres/postgres.go).

* **Create a new directory** under `internal/sources` for your database type
  (e.g., `internal/sources/newdb`).
* **Define a configuration struct** for your data source in a file named
  `newdb.go`. Create a `Config` struct to include all the necessary parameters
  for connecting to the database (e.g., host, port, username, password, database
  name) and a `Source` struct to store necessary parameters for tools (e.g.,
  Name, Kind, connection object, additional config).
* **Implement the
  [`SourceConfig`](https://github.com/googleapis/genai-toolbox/blob/fd300dc606d88bf9f7bba689e2cee4e3565537dd/internal/sources/sources.go#L57)
  interface**. This interface requires two methods:
  * `SourceConfigKind() string`: Returns a unique string identifier for your
    data source (e.g., `"newdb"`).
  * `Initialize(ctx context.Context, tracer trace.Tracer) (Source, error)`:
    Creates a new instance of your data source and establishes a connection to
    the database.
* **Implement the
  [`Source`](https://github.com/googleapis/genai-toolbox/blob/fd300dc606d88bf9f7bba689e2cee4e3565537dd/internal/sources/sources.go#L63)
  interface**. This interface requires one method:
  * `SourceKind() string`: Returns the same string identifier as `SourceConfigKind()`.
* **Implement `init()`** to register the new Source.
* **Implement Unit Tests** in a file named `newdb_test.go`.
#### 2. Implement the New Tool

We recommend looking at an [example tool
implementation](https://github.com/googleapis/genai-toolbox/tree/main/internal/tools/postgressql).

* **Create a new directory** under `internal/tools` for your tool type (e.g.,
  `internal/tools/newdb` or `internal/tools/newdb<tool_name>`).
* **Define a configuration struct** for your tool in a file named `newdbtool.go`.
  Create a `Config` struct and a `Tool` struct to store necessary parameters for
  tools.
* **Implement the
  [`ToolConfig`](https://github.com/googleapis/genai-toolbox/blob/fd300dc606d88bf9f7bba689e2cee4e3565537dd/internal/tools/tools.go#L61)
  interface**. This interface requires two methods:
  * `ToolConfigKind() string`: Returns a unique string identifier for your tool
    (e.g., `"newdb"`).
  * `Initialize(sources map[string]Source) (Tool, error)`: Creates a new
    instance of your tool and validates that it can connect to the specified
    data source.
* **Implement the `Tool` interface**. This interface requires the following
  methods:
  * `Invoke(ctx context.Context, params map[string]any) ([]any, error)`:
    Executes the operation on the database using the provided parameters.
  * `ParseParams(data map[string]any, claims map[string]map[string]any)
    (ParamValues, error)`: Parses and validates the input parameters.
  * `Manifest() Manifest`: Returns a manifest describing the tool's capabilities
    and parameters.
  * `McpManifest() McpManifest`: Returns an MCP manifest describing the tool for
    use with the Model Context Protocol.
  * `Authorized(services []string) bool`: Checks if the tool is authorized to
    run based on the provided authentication services.
* **Implement `init()`** to register the new Tool.
* **Implement Unit Tests** in a file named `newdb_test.go`.
#### 3. Add Integration Tests
|
||||
|
||||
* **Add a test file** under a new directory, `tests/newdb`.
* **Add the pre-defined integration test suites** in
  `/tests/newdb/newdb_test.go`. These are **required** whenever your code
  contains the related features:

  1. [RunToolGetTest][tool-get]: tests the `GET` endpoint that returns the
     tool's manifest.

  2. [RunToolInvokeTest][tool-call]: tests tool calling through the native
     Toolbox endpoints.

  3. [RunMCPToolCallMethod][mcp-call]: tests tool calling through the MCP
     endpoints.

  4. (Optional) [RunExecuteSqlToolInvokeTest][execute-sql]: tests an
     `execute-sql` tool for any source. Only run this test if you are adding an
     `execute-sql` tool.

  5. (Optional) [RunToolInvokeWithTemplateParameters][temp-param]: tests
     [template parameters][temp-param-doc]. Only run this test if template
     parameters apply to your tool.

* **Add the new database to the test config** in
  [integration.cloudbuild.yaml](.ci/integration.cloudbuild.yaml).

[tool-get]:
https://github.com/googleapis/genai-toolbox/blob/fd300dc606d88bf9f7bba689e2cee4e3565537dd/tests/tool.go#L31
[tool-call]:
https://github.com/googleapis/genai-toolbox/blob/fd300dc606d88bf9f7bba689e2cee4e3565537dd/tests/tool.go#L79
[mcp-call]:
https://github.com/googleapis/genai-toolbox/blob/fd300dc606d88bf9f7bba689e2cee4e3565537dd/tests/tool.go#L554
[execute-sql]:
https://github.com/googleapis/genai-toolbox/blob/fd300dc606d88bf9f7bba689e2cee4e3565537dd/tests/tool.go#L431
[temp-param]:
https://github.com/googleapis/genai-toolbox/blob/fd300dc606d88bf9f7bba689e2cee4e3565537dd/tests/tool.go#L297
[temp-param-doc]:
https://googleapis.github.io/genai-toolbox/resources/tools/#template-parameters

#### 4. Add Documentation

* **Update the documentation** to include information about your new data source
  and tool. This includes:
  * Adding a new page to the `docs/en/resources/sources` directory for your data
    source.
  * Adding a new page to the `docs/en/resources/tools` directory for your tool.

* **(Optional) Add samples** to the `docs/en/samples/<newdb>` directory.

#### (Optional) 5. Add Prebuilt Tools

You can provide developers with a set of "build-time" tools to aid common
software development user journeys, such as viewing and creating
tables/collections and data.

* **Create a set of prebuilt tools** by defining a new `tools.yaml` and adding
  it to `internal/tools`. Make sure the file name matches the source (e.g., for
  the source "alloydb-postgres", create a file named "alloydb-postgres.yaml").
* **Update `cmd/root.go`** to add the new source to the `prebuilt` flag.
* **Add tests** in
  [internal/prebuiltconfigs/prebuiltconfigs_test.go](internal/prebuiltconfigs/prebuiltconfigs_test.go)
  and [cmd/root_test.go](cmd/root_test.go).

#### 6. Submit a Pull Request

* **Submit a pull request** to the repository with your changes. Be sure to
  include a detailed description of your changes and any requests for long-term
  testing resources.

# DEVELOPER.md

This document provides instructions for setting up your development environment
and contributing to the Toolbox project.

## Prerequisites

Before you begin, ensure you have the following:

1. **Databases:** Set up the necessary databases for your development
   environment.
1. **Go:** Install the latest version of [Go](https://go.dev/doc/install).
1. **Dependencies:** Download and manage project dependencies:

   ```bash
   go get
   ```

## Developing Toolbox

### Running from Local Source

1. **Configuration:** Create a `tools.yaml` file to configure your sources and
   tools. See the [Configuration section in the
   README](./README.md#Configuration) for details.
1. **CLI Flags:** List available command-line flags for the Toolbox server:

   ```bash
   go run . --help
   ```

1. **Running the Server:** Start the Toolbox server with optional flags. The
   server listens on port 5000 by default.

   ```bash
   go run .
   ```

1. **Testing the Endpoint:** Verify the server is running by sending a request
   to the endpoint:

   ```bash
   curl http://127.0.0.1:5000
   ```

## Testing

### Infrastructure

Toolbox uses both GitHub Actions and Cloud Build to run test workflows. Cloud
Build is used when Google credentials are required. Cloud Build uses the test
project "toolbox-testing-438616".

### Linting

Run the lint check to ensure code quality:

```bash
golangci-lint run --fix
```

### Unit Tests

Execute unit tests locally:

```bash
go test -race -v ./...
```

### Integration Tests

#### Running Locally

1. **Environment Variables:** Set the required environment variables. Refer to
   the [Cloud Build testing configuration](./.ci/integration.cloudbuild.yaml)
   for a complete list of variables for each source.
   * `SERVICE_ACCOUNT_EMAIL`: Use your own GCP email.
   * `CLIENT_ID`: Use the Google Cloud SDK application Client ID. Contact the
     Toolbox maintainers if you don't have it.
1. **Running Tests:** Run the integration test for your target source. Specify
   the required Go build tags listed at the top of each integration test file.

   ```shell
   go test -race -v ./tests/<YOUR_TEST_DIR>
   ```

   For example, to run the AlloyDB integration test:

   ```shell
   go test -race -v ./tests/alloydbpg
   ```

#### Running on Pull Requests

* **Internal Contributors:** Testing workflows should trigger automatically.
* **External Contributors:** Request the Toolbox maintainers to trigger the
  testing workflows on your PR.

#### Test Resources

The following databases have been added as test resources. To add a new database
to test against, please contact the Toolbox maintainer team via an issue or PR.
Refer to the [Cloud Build testing
configuration](./.ci/integration.cloudbuild.yaml) for a complete list of
variables for each source.

* AlloyDB - set up in the test project
  * AI Natural Language ([setup
    instructions](https://cloud.google.com/alloydb/docs/ai/use-natural-language-generate-sql-queries))
    has been configured for `alloydb-ai-nl` tool tests
  * The Cloud Build service account is a user
* Bigtable - set up in the test project
  * The Cloud Build service account is a user
* BigQuery - set up in the test project
  * The Cloud Build service account is a user
* Cloud SQL Postgres - set up in the test project
  * The Cloud Build service account is a user
* Cloud SQL MySQL - set up in the test project
  * The Cloud Build service account is a user
* Cloud SQL SQL Server - set up in the test project
  * The Cloud Build service account is a user
* Couchbase - set up in the test project via the Marketplace
* Dgraph - using the public Dgraph interface <https://play.dgraph.io> for
  testing
* Memorystore Redis - set up in the test project using a Memorystore for Redis
  standalone instance
  * Memorystore Redis Cluster, Memorystore Valkey standalone, and Memorystore
    Valkey Cluster instances all require PSC connections, which require extra
    security setup to connect from Cloud Build. Memorystore Redis standalone is
    the only one allowing a PSA connection.
  * The Cloud Build service account is a user
* Memorystore Valkey - set up in the test project using a Memorystore for Redis
  standalone instance
  * The Cloud Build service account is a user
* MySQL - set up in the test project using a Cloud SQL instance
* Neo4j - set up in the test project on a GCE VM
* Postgres - set up in the test project using an AlloyDB instance
* Spanner - set up in the test project
  * The Cloud Build service account is a user
* SQL Server - set up in the test project using a Cloud SQL instance
* SQLite - set up in the integration test, which creates a temporary database
  file

### Other GitHub Checks

* License header check (`.github/header-checker-lint.yml`) - Ensures files have
  the appropriate license header
* CLA/google - Ensures the developer has signed the CLA:
  <https://cla.developers.google.com/>
* conventionalcommits.org - Ensures commit messages are in the correct format.
  This repository uses [Release
  Please](https://github.com/googleapis/release-please) to create GitHub
  releases. It does so by parsing your git history, looking for [Conventional
  Commit messages](https://www.conventionalcommits.org/), and creating release
  PRs. Learn more by reading [How should I write my
  commits?](https://github.com/googleapis/release-please?tab=readme-ov-file#how-should-i-write-my-commits)

## Developing Documentation

### Running a Local Hugo Server

Follow these steps to preview documentation changes locally using a Hugo server:

1. **Install Hugo:** Ensure you have
   [Hugo](https://gohugo.io/installation/macos/) extended edition, version
   0.146.0 or later, installed.
1. **Navigate to the Hugo Directory:**

   ```bash
   cd .hugo
   ```

1. **Install Dependencies:**

   ```bash
   npm ci
   ```

1. **Start the Server:**

   ```bash
   hugo server
   ```

### Previewing Documentation on Pull Requests

#### Contributors

Request a repo owner to run the preview deployment workflow on your PR. A
preview link will be automatically added as a comment on your PR.

#### Maintainers

1. **Inspect Changes:** Review the proposed changes in the PR to ensure they are
   safe and do not contain malicious code. Pay close attention to changes in the
   `.github/workflows/` directory.
1. **Deploy Preview:** Apply the `docs: deploy-preview` label to the PR to
   deploy a documentation preview.

## Building Toolbox

### Building the Binary

1. **Build Command:** Compile the Toolbox binary:

   ```bash
   go build -o toolbox
   ```

1. **CLI Flags:** List available command-line flags for the Toolbox server:

   ```bash
   ./toolbox --help
   ```

1. **Running the Binary:** Execute the compiled binary with optional flags. The
   server listens on port 5000 by default:

   ```bash
   ./toolbox
   ```

1. **Testing the Endpoint:** Verify the server is running by sending a request
   to the endpoint:

   ```bash
   curl http://127.0.0.1:5000
   ```

### Building Container Images

1. **Build Command:** Build the Toolbox container image:

   ```bash
   docker build -t toolbox:dev .
   ```

1. **View Image:** List available Docker images to confirm the build:

   ```bash
   docker images
   ```

1. **Run Container:** Run the Toolbox container image using Docker:

   ```bash
   docker run -d toolbox:dev
   ```

## Developing Toolbox SDKs

Refer to the [SDK developer
guide](https://github.com/googleapis/mcp-toolbox-sdk-python/blob/main/DEVELOPER.md)
for instructions on developing Toolbox SDKs.

## Maintainer Information

### Team

The team `@googleapis/senseai-eco` has been set as the
[CODEOWNERS](.github/CODEOWNERS). The GitHub TeamSync tool is used to create
this team from the MDB group `senseai-eco`.

### Releasing

Toolbox has two types of releases: versioned and continuous. Both use the
Google Cloud project `database-toolbox`.

* **Versioned Release:** Official, supported distributions tagged as `latest`.
  The release process is defined in
  [versioned.release.cloudbuild.yaml](.ci/versioned.release.cloudbuild.yaml).
* **Continuous Release:** Used for early testing of features between official
  releases and for end-to-end testing. The release process is defined in
  [continuous.release.cloudbuild.yaml](.ci/continuous.release.cloudbuild.yaml).
* **GitHub Release:** `.github/release-please.yml` automatically creates GitHub
  releases and release PRs.

### How-to Release a New Version

1. [Optional] If you want to override the version number, send a
   [PR](https://github.com/googleapis/genai-toolbox/pull/31) to trigger
   [release-please](https://github.com/googleapis/release-please?tab=readme-ov-file#how-do-i-change-the-version-number).
   You can generate a commit with the following line: `git commit -m "chore:
   release 0.1.0" -m "Release-As: 0.1.0" --allow-empty`
1. [Optional] If you want to edit the changelog, send commits to the release PR.
1. Approve and merge the PR with the title “[chore(main): release
   x.x.x](https://github.com/googleapis/genai-toolbox/pull/16)”.
1. The
   [trigger](https://pantheon.corp.google.com/cloud-build/triggers;region=us-central1/edit/27bd0d21-264a-4446-b2d7-0df4e9915fb3?e=13802955&inv=1&invt=AbhU8A&mods=logs_tg_staging&project=database-toolbox)
   should run automatically when a new tag is pushed. You can view [triggered
   builds here to check the
   status](https://pantheon.corp.google.com/cloud-build/builds;region=us-central1?query=trigger_id%3D%2227bd0d21-264a-4446-b2d7-0df4e9915fb3%22&e=13802955&inv=1&invt=AbhU8A&mods=logs_tg_staging&project=database-toolbox).
1. Update the GitHub release notes to include the release table:
   1. Run the following command (from the root directory):

      ```bash
      export VERSION="v0.0.0"
      .ci/generate_release_table.sh
      ```

   1. Copy the table output.
   1. In the GitHub UI, navigate to Releases and click the `edit` button.
   1. Paste the table at the bottom of the release notes and click `Update
      release`.
1. Post the release in internal chat and on Discord.

#### Supported Binaries

The following operating systems and architectures are supported for binary
releases:

* linux/amd64
* darwin/arm64
* darwin/amd64
* windows/amd64

#### Supported Container Images

The following base container images are supported for container image releases:

* distroless

### Automated Tests

Integration and unit tests are automatically triggered via Cloud Build on each
pull request. Integration tests also run on merge and nightly.

#### Failure Notifications

On-merge and nightly test failures trigger notifications via the Cloud Build
Failure Reporter [GitHub Actions
workflow](.github/workflows/schedule_reporter.yml).

#### Trigger Setup

Configure a Cloud Build trigger using the UI or `gcloud` with the following
settings:

* **Event:** Pull request
* **Region:** global (for default worker pools)
* **Source:**
  * Generation: 1st gen
  * Repo: googleapis/genai-toolbox (GitHub App)
  * Base branch: `^main$`
* **Comment control:** Required except for owners and collaborators
* **Filters:** Add directory filter
* **Config:** Cloud Build configuration file
  * Location: Repository (add path to file)
* **Service account:** Set for the demo service to enable ID token creation for
  authenticated services

### Triggering Tests

Trigger pull request tests for external contributors by:

* **Cloud Build tests:** Comment `/gcbrun`
* **Unit tests:** Add the `tests:run` label

## Repo Setup & Automation

* `.github/blunderbuss.yml` - Auto-assigns issues and PRs from GitHub teams
* `.github/renovate.json5` - Tooling for dependency updates. Dependabot is built
  into the GitHub repo for GitHub security warnings
* go/github-issue-mirror - GitHub issues are automatically mirrored into
  Buganizer
* (Suspended) `.github/sync-repo-settings.yaml` - Configures repo settings
* `.github/release-please.yml` - Creates GitHub releases
* `.github/ISSUE_TEMPLATE` - Templates for GitHub issues

Dockerfile

# limitations under the License.

# Use the latest stable golang 1.x to compile to a binary
FROM --platform=$BUILDPLATFORM golang:1 AS build

WORKDIR /go/src/genai-toolbox
COPY . .

ARG TARGETOS
ARG TARGETARCH
ARG BUILD_TYPE="container.dev"
ARG COMMIT_SHA=""

RUN go get ./...
RUN CGO_ENABLED=0 GOOS=${TARGETOS} GOARCH=${TARGETARCH} \
    go build -ldflags "-X github.com/googleapis/genai-toolbox/cmd.buildType=container.${BUILD_TYPE} -X github.com/googleapis/genai-toolbox/cmd.commitSha=${COMMIT_SHA}"

# Final Stage
FROM gcr.io/distroless/static:nonroot
README.md


|
||||
|
||||
# MCP Toolbox for Databases
|
||||
|
||||
> [!NOTE]
|
||||
> MCP Toolbox for Databases is currently in beta, and may see breaking
|
||||
[](https://googleapis.github.io/genai-toolbox/)
|
||||
[](https://discord.gg/Dmm69peqjh)
|
||||
[](https://medium.com/@mcp_toolbox)
|
||||
[](https://goreportcard.com/report/github.com/googleapis/genai-toolbox)
|
||||
|
||||
> [!NOTE]
|
||||
> MCP Toolbox for Databases is currently in beta, and may see breaking
|
||||
> changes until the first stable release (v1.0).
|
||||
|
||||
MCP Toolbox for Databases is an open source MCP server for databases It was
|
||||
designed with enterprise-grade and production-quality in mind. It enables you to
|
||||
develop tools easier, faster, and more securely by handling the complexities
|
||||
MCP Toolbox for Databases is an open source MCP server for databases. It enables
|
||||
you to develop tools easier, faster, and more securely by handling the complexities
|
||||
such as connection pooling, authentication, and more.
|
||||
|
||||
This README provides a brief overview. For comprehensive details, see the [full
|
||||
documentation](https://googleapis.github.io/genai-toolbox/).
|
||||
|
||||
|
||||
> [!NOTE]
> This solution was originally named “Gen AI Toolbox for Databases” as
> its initial development predated MCP, but was renamed to align with recently
> added MCP compatibility.

<!-- TOC ignore:true -->
## Table of Contents

- [Why Toolbox?](#why-toolbox)
- [General Architecture](#general-architecture)
- [Getting Started](#getting-started)
  - [Installing the server](#installing-the-server)
  - [Running the server](#running-the-server)
  - [Integrating your application](#integrating-your-application)
- [Configuration](#configuration)
  - [Sources](#sources)
  - [Tools](#tools)
  - [Toolsets](#toolsets)
- [Versioning](#versioning)
- [Contributing](#contributing)
- [Community](#community)

<!-- /TOC -->

## Why Toolbox?

Toolbox helps you build Gen AI tools that let your agents access data in your
database. Toolbox provides:

- **Simplified development**: Integrate tools to your agent in less than 10
  lines of code, reuse tools between multiple agents or frameworks, and deploy
  new versions of tools more easily.
- **End-to-end observability**: Out-of-the-box metrics and tracing with built-in
  support for OpenTelemetry.

**⚡ Supercharge Your Workflow with an AI Database Assistant ⚡**

Stop context-switching and let your AI assistant become a true co-developer. By
[connecting your IDE to your databases with MCP Toolbox][connect-ide], you can
delegate complex and time-consuming database tasks, allowing you to build faster
and focus on what matters. This isn't just about code completion; it's about
giving your AI the context it needs to handle the entire development lifecycle.

Here’s how it will save you time:

- **Query in Plain English**: Interact with your data using natural language
  right from your IDE. Ask complex questions like, *"How many orders were
  delivered in 2024, and what items were in them?"* without writing any SQL.
- **Automate Database Management**: Simply describe your data needs, and let the
  AI assistant manage your database for you. It can handle generating queries,
  creating tables, adding indexes, and more.
- **Generate Context-Aware Code**: Empower your AI assistant to generate
  application code and tests with a deep understanding of your real-time
  database schema. This accelerates the development cycle by ensuring the
  generated code is directly usable.
- **Slash Development Overhead**: Radically reduce the time spent on manual
  setup and boilerplate. MCP Toolbox helps streamline lengthy database
  configurations, repetitive code, and error-prone schema migrations.

Learn [how to connect your AI tools (IDEs) to Toolbox using MCP][connect-ide].

[connect-ide]: https://googleapis.github.io/genai-toolbox/how-to/connect-ide/

## General Architecture

## Getting Started

### Installing the server

For the latest version, check the [releases page][releases] and use the
following instructions for your OS and CPU architecture.

To install Toolbox as a binary:

<!-- {x-release-please-start-version} -->
```sh
# see releases page for other versions
export VERSION=0.9.0
curl -O https://storage.googleapis.com/genai-toolbox/v$VERSION/linux/amd64/toolbox
chmod +x toolbox
```

You can also install Toolbox as a container:

```sh
# see releases page for other versions
export VERSION=0.9.0
docker pull us-central1-docker.pkg.dev/database-toolbox/toolbox/toolbox:$VERSION
```

To install from source, ensure you have the latest version of
[Go installed](https://go.dev/doc/install), and then run the following command:

```sh
go install github.com/googleapis/genai-toolbox@v0.9.0
```
<!-- {x-release-please-end} -->

Execute `toolbox` to start the server:

```sh
./toolbox --tools-file "tools.yaml"
```

> [!NOTE]
> Toolbox enables dynamic reloading by default. To disable it, use the
> `--disable-reload` flag.

You can use `toolbox help` for a full list of flags! To stop the server, send a
terminate signal (`ctrl+c` on most platforms).

Once your server is up and running, you can load the tools into your
application. See below the list of Client SDKs for using various frameworks:

<details open>
<summary>Python (<a href="https://github.com/googleapis/mcp-toolbox-sdk-python">Github</a>)</summary>
<br>
<blockquote>

<details open>
<summary>Core</summary>

1. Install [Toolbox Core SDK][toolbox-core]:

    ```bash
    pip install toolbox-core
    ```

1. Load tools:

    ```python
    from toolbox_core import ToolboxClient

    # update the url to point to your server
    async with ToolboxClient("http://127.0.0.1:5000") as client:

        # these tools can be passed to your application!
        tools = await client.load_toolset("toolset_name")
    ```

For more detailed instructions on using the Toolbox Core SDK, see the
[project's README][toolbox-core-readme].
[toolbox-core]: https://pypi.org/project/toolbox-core/
[toolbox-core-readme]: https://github.com/googleapis/mcp-toolbox-sdk-python/tree/main/packages/toolbox-core/README.md

</details>
<details>
<summary>LangChain / LangGraph</summary>

1. Install [Toolbox LangChain SDK][toolbox-langchain]:

    ```bash
    pip install toolbox-langchain
    ```

1. Load tools:

    ```python
    from toolbox_langchain import ToolboxClient

    # update the url to point to your server
    async with ToolboxClient("http://127.0.0.1:5000") as client:

        # these tools can be passed to your application!
        tools = client.load_toolset()
    ```

For more detailed instructions on using the Toolbox LangChain SDK, see the
[project's README][toolbox-langchain-readme].

[toolbox-langchain]: https://pypi.org/project/toolbox-langchain/
[toolbox-langchain-readme]: https://github.com/googleapis/mcp-toolbox-sdk-python/blob/main/packages/toolbox-langchain/README.md

</details>
|
||||
|
||||
<details>
<summary>LlamaIndex</summary>

1. Install [Toolbox Llamaindex SDK][toolbox-llamaindex]:

   ```bash
   pip install toolbox-llamaindex
   ```

1. Load tools:

   ```python
   from toolbox_llamaindex import ToolboxClient

   # update the url to point to your server
   async with ToolboxClient("http://127.0.0.1:5000") as client:

       # these tools can be passed to your application!
       tools = client.load_toolset()
   ```

For more detailed instructions on using the Toolbox Llamaindex SDK, see the
[project's README][toolbox-llamaindex-readme].

[toolbox-llamaindex]: https://pypi.org/project/toolbox-llamaindex/
[toolbox-llamaindex-readme]: https://github.com/googleapis/genai-toolbox-llamaindex-python/blob/main/README.md

</details>
</blockquote>
</details>
<details>
<summary>Javascript/Typescript (<a href="https://github.com/googleapis/mcp-toolbox-sdk-js">Github</a>)</summary>
<br>
<blockquote>

<details open>
<summary>Core</summary>

1. Install [Toolbox Core SDK][toolbox-core-js]:

   ```bash
   npm install @toolbox-sdk/core
   ```

1. Load tools:

   ```javascript
   import { ToolboxClient } from '@toolbox-sdk/core';

   // update the url to point to your server
   const URL = 'http://127.0.0.1:5000';
   let client = new ToolboxClient(URL);

   // these tools can be passed to your application!
   const tools = await client.loadToolset('toolsetName');
   ```

For more detailed instructions on using the Toolbox Core SDK, see the
[project's README][toolbox-core-js-readme].

[toolbox-core-js]: https://www.npmjs.com/package/@toolbox-sdk/core
[toolbox-core-js-readme]: https://github.com/googleapis/mcp-toolbox-sdk-js/blob/main/packages/toolbox-core/README.md

</details>
<details>
<summary>LangChain / LangGraph</summary>

1. Install [Toolbox Core SDK][toolbox-core-js]:

   ```bash
   npm install @toolbox-sdk/core
   ```

2. Load tools:

   ```javascript
   import { ToolboxClient } from '@toolbox-sdk/core';
   // LangChain's `tool` wrapper, from its core package
   import { tool } from '@langchain/core/tools';

   // update the url to point to your server
   const URL = 'http://127.0.0.1:5000';
   let client = new ToolboxClient(URL);

   // these tools can be passed to your application!
   const toolboxTools = await client.loadToolset('toolsetName');

   // Define the basics of the tool: name, description, schema and core logic
   const getTool = (toolboxTool) => tool(toolboxTool, {
       name: toolboxTool.getName(),
       description: toolboxTool.getDescription(),
       schema: toolboxTool.getParamSchema()
   });

   // Use these tools in your LangChain/LangGraph applications
   const tools = toolboxTools.map(getTool);
   ```

</details>
<details>
<summary>Genkit</summary>

1. Install [Toolbox Core SDK][toolbox-core-js]:

   ```bash
   npm install @toolbox-sdk/core
   ```

2. Load tools:

   ```javascript
   import { ToolboxClient } from '@toolbox-sdk/core';
   import { genkit } from 'genkit';
   import { googleAI } from '@genkit-ai/googleai';

   // Initialise genkit
   const ai = genkit({
       plugins: [
           googleAI({
               apiKey: process.env.GEMINI_API_KEY || process.env.GOOGLE_API_KEY
           })
       ],
       model: googleAI.model('gemini-2.0-flash'),
   });

   // update the url to point to your server
   const URL = 'http://127.0.0.1:5000';
   let client = new ToolboxClient(URL);

   // these tools can be passed to your application!
   const toolboxTools = await client.loadToolset('toolsetName');

   // Define the basics of the tool: name, description, schema and core logic
   const getTool = (toolboxTool) => ai.defineTool({
       name: toolboxTool.getName(),
       description: toolboxTool.getDescription(),
       schema: toolboxTool.getParamSchema()
   }, toolboxTool);

   // Use these tools in your Genkit applications
   const tools = toolboxTools.map(getTool);
   ```

</details>
</blockquote>
</details>
<details>
<summary>Go (<a href="https://github.com/googleapis/mcp-toolbox-sdk-go">Github</a>)</summary>
<br>
<blockquote>

<details open>
<summary>Core</summary>

1. Install [Toolbox Go SDK][toolbox-go]:

   ```bash
   go get github.com/googleapis/mcp-toolbox-sdk-go
   ```

1. Load tools:

   ```go
   package main

   import (
       "context"

       "github.com/googleapis/mcp-toolbox-sdk-go/core"
   )

   func main() {
       // Make sure to add the error checks
       // update the url to point to your server
       URL := "http://127.0.0.1:5000"
       ctx := context.Background()

       client, err := core.NewToolboxClient(URL)

       // Framework agnostic tools
       tools, err := client.LoadToolset("toolsetName", ctx)
   }
   ```

For more detailed instructions on using the Toolbox Go SDK, see the
[project's README][toolbox-core-go-readme].

[toolbox-go]: https://pkg.go.dev/github.com/googleapis/mcp-toolbox-sdk-go/core
[toolbox-core-go-readme]: https://github.com/googleapis/mcp-toolbox-sdk-go/blob/main/core/README.md

</details>
<details>
<summary>LangChain Go</summary>

1. Install [Toolbox Go SDK][toolbox-go]:

   ```bash
   go get github.com/googleapis/mcp-toolbox-sdk-go
   ```

2. Load tools:

   ```go
   package main

   import (
       "context"
       "encoding/json"

       "github.com/googleapis/mcp-toolbox-sdk-go/core"
       "github.com/tmc/langchaingo/llms"
   )

   func main() {
       // Make sure to add the error checks
       // update the url to point to your server
       URL := "http://127.0.0.1:5000"
       ctx := context.Background()

       client, err := core.NewToolboxClient(URL)

       // Framework agnostic tool
       tool, err := client.LoadTool("toolName", ctx)

       // Fetch the tool's input schema
       inputschema, err := tool.InputSchema()

       var paramsSchema map[string]any
       _ = json.Unmarshal(inputschema, &paramsSchema)

       // Use this tool with LangChainGo
       langChainTool := llms.Tool{
           Type: "function",
           Function: &llms.FunctionDefinition{
               Name:        tool.Name(),
               Description: tool.Description(),
               Parameters:  paramsSchema,
           },
       }
   }
   ```

</details>
<details>
<summary>Genkit</summary>

1. Install [Toolbox Go SDK][toolbox-go]:

   ```bash
   go get github.com/googleapis/mcp-toolbox-sdk-go
   ```

2. Load tools:

   ```go
   package main

   import (
       "context"
       "log"

       "github.com/firebase/genkit/go/genkit"
       "github.com/googleapis/mcp-toolbox-sdk-go/core"
       "github.com/googleapis/mcp-toolbox-sdk-go/tbgenkit"
   )

   func main() {
       // Make sure to add the error checks
       // Update the url to point to your server
       URL := "http://127.0.0.1:5000"
       ctx := context.Background()
       g, err := genkit.Init(ctx)

       client, err := core.NewToolboxClient(URL)

       // Framework agnostic tool
       tool, err := client.LoadTool("toolName", ctx)

       // Convert the tool using the tbgenkit package
       // Use this tool with Genkit Go
       genkitTool, err := tbgenkit.ToGenkitTool(tool, g)
       if err != nil {
           log.Fatalf("Failed to convert tool: %v\n", err)
       }
   }
   ```

</details>
<details>
<summary>Go GenAI</summary>

1. Install [Toolbox Go SDK][toolbox-go]:

   ```bash
   go get github.com/googleapis/mcp-toolbox-sdk-go
   ```

2. Load tools:

   ```go
   package main

   import (
       "context"
       "encoding/json"

       "github.com/googleapis/mcp-toolbox-sdk-go/core"
       "google.golang.org/genai"
   )

   func main() {
       // Make sure to add the error checks
       // Update the url to point to your server
       URL := "http://127.0.0.1:5000"
       ctx := context.Background()

       client, err := core.NewToolboxClient(URL)

       // Framework agnostic tool
       tool, err := client.LoadTool("toolName", ctx)

       // Fetch the tool's input schema
       inputschema, err := tool.InputSchema()

       var schema *genai.Schema
       _ = json.Unmarshal(inputschema, &schema)

       funcDeclaration := &genai.FunctionDeclaration{
           Name:        tool.Name(),
           Description: tool.Description(),
           Parameters:  schema,
       }

       // Use this tool with Go GenAI
       genAITool := &genai.Tool{
           FunctionDeclarations: []*genai.FunctionDeclaration{funcDeclaration},
       }
   }
   ```

</details>
<details>
<summary>OpenAI Go</summary>

1. Install [Toolbox Go SDK][toolbox-go]:

   ```bash
   go get github.com/googleapis/mcp-toolbox-sdk-go
   ```

2. Load tools:

   ```go
   package main

   import (
       "context"
       "encoding/json"

       "github.com/googleapis/mcp-toolbox-sdk-go/core"
       openai "github.com/openai/openai-go"
   )

   func main() {
       // Make sure to add the error checks
       // Update the url to point to your server
       URL := "http://127.0.0.1:5000"
       ctx := context.Background()

       client, err := core.NewToolboxClient(URL)

       // Framework agnostic tool
       tool, err := client.LoadTool("toolName", ctx)

       // Fetch the tool's input schema
       inputschema, err := tool.InputSchema()

       var paramsSchema openai.FunctionParameters
       _ = json.Unmarshal(inputschema, &paramsSchema)

       // Use this tool with OpenAI Go
       openAITool := openai.ChatCompletionToolParam{
           Function: openai.FunctionDefinitionParam{
               Name:        tool.Name(),
               Description: openai.String(tool.Description()),
               Parameters:  paramsSchema,
           },
       }
   }
   ```

</details>
</blockquote>
</details>
## Configuration

The primary way to configure Toolbox is through the `tools.yaml` file. If you
have multiple files, you can tell toolbox which to load with the `--tools-file
tools.yaml` flag.

You can find more detailed reference documentation to all resource types in the
[Resources](https://googleapis.github.io/genai-toolbox/resources/).
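As a sketch, a minimal `tools.yaml` wiring together one source, one tool, and
one toolset could look like the following (the names, credentials, and SQL
statement are illustrative, not taken from this document):

```yaml
sources:
  my-pg-source:
    kind: postgres
    host: 127.0.0.1
    port: 5432
    database: toolbox_db
    user: toolbox_user
    password: my-password
tools:
  search-hotels-by-name:
    kind: postgres-sql
    source: my-pg-source
    description: Search for hotels based on name.
    parameters:
      - name: name
        type: string
        description: The name of the hotel.
    statement: SELECT * FROM hotels WHERE name ILIKE '%' || $1 || '%';
toolsets:
  my_toolset:
    - search-hotels-by-name
```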
### Sources

The `sources` section of your `tools.yaml` defines what data sources your
Toolbox should have access to.

### Tools

For more details on configuring different types of tools, see the
[Tools](https://googleapis.github.io/genai-toolbox/resources/tools).

### Toolsets

The `toolsets` section of your `tools.yaml` allows you to define groups of tools
that you want to be able to load together.

## Versioning

This project uses [semantic versioning](https://semver.org/), including a
`MAJOR.MINOR.PATCH` version number that increments the:

- MAJOR version when we make incompatible API changes
- MINOR version when we add functionality in a backward compatible manner
- PATCH version when we make backward compatible bug fixes

The public API that this applies to is the CLI associated with Toolbox, the
interactions with official SDKs, and the definitions in the `tools.yaml` file.

## Contributing

Contributions are welcome. Please see [CONTRIBUTING](CONTRIBUTING.md)
to get started.

Please note that this project is released with a Contributor Code of Conduct.
By participating in this project you agree to abide by its terms. See
[Contributor Code of Conduct](CODE_OF_CONDUCT.md) for more information.

## Community

Join our [discord community](https://discord.gg/GQrFB3Ec3W) to connect with our developers!
cmd/root.go
```go
import (
    _ "embed"
    "fmt"
    "io"
    "maps"
    "os"
    "os/signal"
    "path/filepath"
    "regexp"
    "runtime"
    "slices"
    "strings"
    "syscall"
    "time"

    "github.com/fsnotify/fsnotify"
    yaml "github.com/goccy/go-yaml"
    "github.com/googleapis/genai-toolbox/internal/auth"
    "github.com/googleapis/genai-toolbox/internal/log"
    "github.com/googleapis/genai-toolbox/internal/prebuiltconfigs"
    "github.com/googleapis/genai-toolbox/internal/server"
    "github.com/googleapis/genai-toolbox/internal/sources"
    "github.com/googleapis/genai-toolbox/internal/telemetry"
    "github.com/googleapis/genai-toolbox/internal/tools"
    "github.com/googleapis/genai-toolbox/internal/util"

    // Import tool packages for side effect of registration
    _ "github.com/googleapis/genai-toolbox/internal/tools/alloydbainl"
    _ "github.com/googleapis/genai-toolbox/internal/tools/bigquery/bigqueryexecutesql"
    _ "github.com/googleapis/genai-toolbox/internal/tools/bigquery/bigquerygetdatasetinfo"
    _ "github.com/googleapis/genai-toolbox/internal/tools/bigquery/bigquerygettableinfo"
    _ "github.com/googleapis/genai-toolbox/internal/tools/bigquery/bigquerylistdatasetids"
    _ "github.com/googleapis/genai-toolbox/internal/tools/bigquery/bigquerylisttableids"
    _ "github.com/googleapis/genai-toolbox/internal/tools/bigquery/bigquerysql"
    _ "github.com/googleapis/genai-toolbox/internal/tools/bigtable"
    _ "github.com/googleapis/genai-toolbox/internal/tools/couchbase"
    _ "github.com/googleapis/genai-toolbox/internal/tools/dgraph"
    _ "github.com/googleapis/genai-toolbox/internal/tools/firestore/firestoredeletedocuments"
    _ "github.com/googleapis/genai-toolbox/internal/tools/firestore/firestoregetdocuments"
    _ "github.com/googleapis/genai-toolbox/internal/tools/firestore/firestoregetrules"
    _ "github.com/googleapis/genai-toolbox/internal/tools/firestore/firestorelistcollections"
    _ "github.com/googleapis/genai-toolbox/internal/tools/firestore/firestorequerycollection"
    _ "github.com/googleapis/genai-toolbox/internal/tools/firestore/firestorevalidaterules"
    _ "github.com/googleapis/genai-toolbox/internal/tools/http"
    _ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookergetdimensions"
    _ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookergetexplores"
    _ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookergetfilters"
    _ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookergetlooks"
    _ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookergetmeasures"
    _ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookergetmodels"
    _ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookergetparameters"
    _ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookerquery"
    _ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookerquerysql"
    _ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookerrunlook"
    _ "github.com/googleapis/genai-toolbox/internal/tools/mongodb/mongodbdeletemany"
    _ "github.com/googleapis/genai-toolbox/internal/tools/mongodb/mongodbdeleteone"
    _ "github.com/googleapis/genai-toolbox/internal/tools/mongodb/mongodbfind"
    _ "github.com/googleapis/genai-toolbox/internal/tools/mongodb/mongodbfindone"
    _ "github.com/googleapis/genai-toolbox/internal/tools/mongodb/mongodbinsertmany"
    _ "github.com/googleapis/genai-toolbox/internal/tools/mongodb/mongodbinsertone"
    _ "github.com/googleapis/genai-toolbox/internal/tools/mongodb/mongodbupdatemany"
    _ "github.com/googleapis/genai-toolbox/internal/tools/mongodb/mongodbupdateone"
    _ "github.com/googleapis/genai-toolbox/internal/tools/mssql/mssqlexecutesql"
    _ "github.com/googleapis/genai-toolbox/internal/tools/mssql/mssqlsql"
    _ "github.com/googleapis/genai-toolbox/internal/tools/mysql/mysqlexecutesql"
    _ "github.com/googleapis/genai-toolbox/internal/tools/mysql/mysqlsql"
    _ "github.com/googleapis/genai-toolbox/internal/tools/neo4j/neo4jcypher"
    _ "github.com/googleapis/genai-toolbox/internal/tools/neo4j/neo4jexecutecypher"
    _ "github.com/googleapis/genai-toolbox/internal/tools/postgres/postgresexecutesql"
    _ "github.com/googleapis/genai-toolbox/internal/tools/postgres/postgressql"
    _ "github.com/googleapis/genai-toolbox/internal/tools/redis"
    _ "github.com/googleapis/genai-toolbox/internal/tools/spanner/spannerexecutesql"
    _ "github.com/googleapis/genai-toolbox/internal/tools/spanner/spannersql"
    _ "github.com/googleapis/genai-toolbox/internal/tools/sqlitesql"
    _ "github.com/googleapis/genai-toolbox/internal/tools/utility/alloydbwaitforoperation"
    _ "github.com/googleapis/genai-toolbox/internal/tools/utility/wait"
    _ "github.com/googleapis/genai-toolbox/internal/tools/valkey"

    "github.com/spf13/cobra"

    _ "github.com/googleapis/genai-toolbox/internal/sources/alloydbpg"
    _ "github.com/googleapis/genai-toolbox/internal/sources/bigquery"
    _ "github.com/googleapis/genai-toolbox/internal/sources/bigtable"
    _ "github.com/googleapis/genai-toolbox/internal/sources/cloudsqlmssql"
    _ "github.com/googleapis/genai-toolbox/internal/sources/cloudsqlmysql"
    _ "github.com/googleapis/genai-toolbox/internal/sources/cloudsqlpg"
    _ "github.com/googleapis/genai-toolbox/internal/sources/couchbase"
    _ "github.com/googleapis/genai-toolbox/internal/sources/dgraph"
    _ "github.com/googleapis/genai-toolbox/internal/sources/firestore"
    _ "github.com/googleapis/genai-toolbox/internal/sources/http"
    _ "github.com/googleapis/genai-toolbox/internal/sources/looker"
    _ "github.com/googleapis/genai-toolbox/internal/sources/mongodb"
    _ "github.com/googleapis/genai-toolbox/internal/sources/mssql"
    _ "github.com/googleapis/genai-toolbox/internal/sources/mysql"
    _ "github.com/googleapis/genai-toolbox/internal/sources/neo4j"
    _ "github.com/googleapis/genai-toolbox/internal/sources/postgres"
    _ "github.com/googleapis/genai-toolbox/internal/sources/redis"
    _ "github.com/googleapis/genai-toolbox/internal/sources/spanner"
    _ "github.com/googleapis/genai-toolbox/internal/sources/sqlite"
    _ "github.com/googleapis/genai-toolbox/internal/sources/valkey"
)
```
```go
var (
    // versionString stores the full semantic version, including build metadata.
    versionString string
    // versionNum indicates the numerical part of the version
    //go:embed version.txt
    versionNum string
    buildType  string = "dev" // should be one of "dev", "binary", or "container"
    // commitSha is the git commit it was built from
    commitSha string
)

func init() {
```

```go
// semanticVersion returns the version of the CLI including compile-time metadata.
func semanticVersion() string {
    metadataStrings := []string{buildType, runtime.GOOS, runtime.GOARCH}
    if commitSha != "" {
        metadataStrings = append(metadataStrings, commitSha)
    }
    v := strings.TrimSpace(versionNum) + "+" + strings.Join(metadataStrings, ".")
    return v
}
```
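The version string built above (the numeric version plus `+` build metadata of
the form `buildType.GOOS.GOARCH[.commitSha]`) can be sketched in Python; the
function name is illustrative, and `platform` values stand in for Go's
`runtime.GOOS`/`runtime.GOARCH`:

```python
import platform


def semantic_version(version_num: str, build_type: str = "dev", commit_sha: str = "") -> str:
    """Build a semver string with '+' build metadata, mirroring the Go logic."""
    metadata = [build_type, platform.system().lower(), platform.machine().lower()]
    if commit_sha:  # the commit is only appended when it is known
        metadata.append(commit_sha)
    return version_num.strip() + "+" + ".".join(metadata)
```

On a Linux x86-64 binary build, `semantic_version("0.5.0\n", "binary", "8c6f290")`
would produce something like `0.5.0+binary.linux.x86_64.8c6f290`.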
```go
type Command struct {
    *cobra.Command

    cfg            server.ServerConfig
    logger         log.Logger
    tools_file     string
    tools_files    []string
    tools_folder   string
    prebuiltConfig string
    inStream       io.Reader
    outStream      io.Writer
    errStream      io.Writer
}

// NewCommand returns a Command object representing an invocation of the CLI.
func NewCommand(opts ...Option) *Command {
    in := os.Stdin
    out := os.Stdout
    err := os.Stderr
```

```go
    cmd := &Command{
        Command:   baseCmd,
        inStream:  in,
        outStream: out,
        errStream: err,
    }
```

```go
    // Set server version
    cmd.cfg.Version = versionString

    // set baseCmd in, out and err the same as cmd.
    baseCmd.SetIn(cmd.inStream)
    baseCmd.SetOut(cmd.outStream)
    baseCmd.SetErr(cmd.errStream)
```
```go
    flags.StringVarP(&cmd.cfg.Address, "address", "a", "127.0.0.1", "Address of the interface the server will listen on.")
    flags.IntVarP(&cmd.cfg.Port, "port", "p", 5000, "Port the server will listen on.")

    flags.StringVar(&cmd.tools_file, "tools_file", "", "File path specifying the tool configuration. Cannot be used with --prebuilt.")
    // deprecate tools_file
    _ = flags.MarkDeprecated("tools_file", "please use --tools-file instead")
    flags.StringVar(&cmd.tools_file, "tools-file", "", "File path specifying the tool configuration. Cannot be used with --prebuilt, --tools-files, or --tools-folder.")
    flags.StringSliceVar(&cmd.tools_files, "tools-files", []string{}, "Multiple file paths specifying tool configurations. Files will be merged. Cannot be used with --prebuilt, --tools-file, or --tools-folder.")
    flags.StringVar(&cmd.tools_folder, "tools-folder", "", "Directory path containing YAML tool configuration files. All .yaml and .yml files in the directory will be loaded and merged. Cannot be used with --prebuilt, --tools-file, or --tools-files.")
    flags.Var(&cmd.cfg.LogLevel, "log-level", "Specify the minimum level logged. Allowed: 'DEBUG', 'INFO', 'WARN', 'ERROR'.")
    flags.Var(&cmd.cfg.LoggingFormat, "logging-format", "Specify logging format to use. Allowed: 'standard' or 'JSON'.")
    flags.BoolVar(&cmd.cfg.TelemetryGCP, "telemetry-gcp", false, "Enable exporting directly to Google Cloud Monitoring.")
    flags.StringVar(&cmd.cfg.TelemetryOTLP, "telemetry-otlp", "", "Enable exporting using OpenTelemetry Protocol (OTLP) to the specified endpoint (e.g. 'http://127.0.0.1:4318')")
    flags.StringVar(&cmd.cfg.TelemetryServiceName, "telemetry-service-name", "toolbox", "Sets the value of the service.name resource attribute for telemetry data.")
    flags.StringVar(&cmd.prebuiltConfig, "prebuilt", "", "Use a prebuilt tool configuration by source type. Cannot be used with --tools-file. Allowed: 'alloydb-postgres-admin', 'alloydb-postgres', 'bigquery', 'cloud-sql-mysql', 'cloud-sql-postgres', 'cloud-sql-mssql', 'firestore', 'mssql', 'mysql', 'postgres', 'spanner', 'spanner-postgres'.")
    flags.BoolVar(&cmd.cfg.Stdio, "stdio", false, "Listens via MCP STDIO instead of acting as a remote HTTP server.")
    flags.BoolVar(&cmd.cfg.DisableReload, "disable-reload", false, "Disables dynamic reloading of tools file.")

    // wrap RunE command so that we have access to original Command object
    cmd.RunE = func(*cobra.Command, []string) error { return run(cmd) }
```
```go
    return toolsFile, nil
}

// mergeToolsFiles merges multiple ToolsFile structs into one.
// Detects and raises errors for resource conflicts in sources, authServices, tools, and toolsets.
// All resource names (sources, authServices, tools, toolsets) must be unique across all files.
func mergeToolsFiles(files ...ToolsFile) (ToolsFile, error) {
    merged := ToolsFile{
        Sources:      make(server.SourceConfigs),
        AuthServices: make(server.AuthServiceConfigs),
        Tools:        make(server.ToolConfigs),
        Toolsets:     make(server.ToolsetConfigs),
    }

    var conflicts []string

    for fileIndex, file := range files {
        // Check for conflicts and merge sources
        for name, source := range file.Sources {
            if _, exists := merged.Sources[name]; exists {
                conflicts = append(conflicts, fmt.Sprintf("source '%s' (file #%d)", name, fileIndex+1))
            } else {
                merged.Sources[name] = source
            }
        }

        // Check for conflicts and merge authSources (deprecated, but still supported)
        for name, authSource := range file.AuthSources {
            if _, exists := merged.AuthSources[name]; exists {
                conflicts = append(conflicts, fmt.Sprintf("authSource '%s' (file #%d)", name, fileIndex+1))
            } else {
                merged.AuthSources[name] = authSource
            }
        }

        // Check for conflicts and merge authServices
        for name, authService := range file.AuthServices {
            if _, exists := merged.AuthServices[name]; exists {
                conflicts = append(conflicts, fmt.Sprintf("authService '%s' (file #%d)", name, fileIndex+1))
            } else {
                merged.AuthServices[name] = authService
            }
        }

        // Check for conflicts and merge tools
        for name, tool := range file.Tools {
            if _, exists := merged.Tools[name]; exists {
                conflicts = append(conflicts, fmt.Sprintf("tool '%s' (file #%d)", name, fileIndex+1))
            } else {
                merged.Tools[name] = tool
            }
        }

        // Check for conflicts and merge toolsets
        for name, toolset := range file.Toolsets {
            if _, exists := merged.Toolsets[name]; exists {
                conflicts = append(conflicts, fmt.Sprintf("toolset '%s' (file #%d)", name, fileIndex+1))
            } else {
                merged.Toolsets[name] = toolset
            }
        }
    }

    // If conflicts were detected, return an error
    if len(conflicts) > 0 {
        return ToolsFile{}, fmt.Errorf("resource conflicts detected:\n  - %s\n\nPlease ensure each source, authService, tool, and toolset has a unique name across all files", strings.Join(conflicts, "\n  - "))
    }

    return merged, nil
}
```
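The merge rule implemented above (every resource name must be unique across all
merged files, with all duplicates collected before failing) can be sketched in
Python; the dict shapes and function name are illustrative, not the actual
implementation:

```python
def merge_tools_files(*files):
    """Merge dicts of {section: {name: config}}; duplicate names are conflicts."""
    merged = {"sources": {}, "authServices": {}, "tools": {}, "toolsets": {}}
    conflicts = []
    for index, file in enumerate(files, start=1):
        for section, resources in file.items():
            for name, config in resources.items():
                if name in merged[section]:
                    # record the conflict but keep scanning, so all are reported at once
                    conflicts.append(f"{section} '{name}' (file #{index})")
                else:
                    merged[section][name] = config
    if conflicts:
        raise ValueError("resource conflicts detected: " + "; ".join(conflicts))
    return merged
```

Merging two files that both define a resource of the same name in the same
section fails with a message naming each duplicate and the file it came from.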
```go
// loadAndMergeToolsFiles loads multiple YAML files and merges them
func loadAndMergeToolsFiles(ctx context.Context, filePaths []string) (ToolsFile, error) {
    var toolsFiles []ToolsFile

    for _, filePath := range filePaths {
        buf, err := os.ReadFile(filePath)
        if err != nil {
            return ToolsFile{}, fmt.Errorf("unable to read tool file at %q: %w", filePath, err)
        }

        toolsFile, err := parseToolsFile(ctx, buf)
        if err != nil {
            return ToolsFile{}, fmt.Errorf("unable to parse tool file at %q: %w", filePath, err)
        }

        toolsFiles = append(toolsFiles, toolsFile)
    }

    mergedFile, err := mergeToolsFiles(toolsFiles...)
    if err != nil {
        return ToolsFile{}, fmt.Errorf("unable to merge tools files: %w", err)
    }

    return mergedFile, nil
}

// loadAndMergeToolsFolder loads all YAML files from a directory and merges them
func loadAndMergeToolsFolder(ctx context.Context, folderPath string) (ToolsFile, error) {
    // Check if directory exists
    info, err := os.Stat(folderPath)
    if err != nil {
        return ToolsFile{}, fmt.Errorf("unable to access tools folder at %q: %w", folderPath, err)
    }
    if !info.IsDir() {
        return ToolsFile{}, fmt.Errorf("path %q is not a directory", folderPath)
    }

    // Find all YAML files in the directory
    pattern := filepath.Join(folderPath, "*.yaml")
    yamlFiles, err := filepath.Glob(pattern)
    if err != nil {
        return ToolsFile{}, fmt.Errorf("error finding YAML files in %q: %w", folderPath, err)
    }

    // Also find .yml files
    ymlPattern := filepath.Join(folderPath, "*.yml")
    ymlFiles, err := filepath.Glob(ymlPattern)
    if err != nil {
        return ToolsFile{}, fmt.Errorf("error finding YML files in %q: %w", folderPath, err)
    }

    // Combine both file lists
    allFiles := append(yamlFiles, ymlFiles...)

    if len(allFiles) == 0 {
        return ToolsFile{}, fmt.Errorf("no YAML files found in directory %q", folderPath)
    }

    // Use existing loadAndMergeToolsFiles function
    return loadAndMergeToolsFiles(ctx, allFiles)
}
```
|
||||
|
||||
func handleDynamicReload(ctx context.Context, toolsFile ToolsFile, s *server.Server) error {
|
||||
logger, err := util.LoggerFromContext(ctx)
|
||||
if err != nil {
|
||||
panic(err)
|
||||
}
|
||||
|
||||
sourcesMap, authServicesMap, toolsMap, toolsetsMap, err := validateReloadEdits(ctx, toolsFile)
|
||||
if err != nil {
|
||||
errMsg := fmt.Errorf("unable to validate reloaded edits: %w", err)
|
||||
logger.WarnContext(ctx, errMsg.Error())
|
||||
return err
|
||||
}
|
||||
|
||||
s.ResourceMgr.SetResources(sourcesMap, authServicesMap, toolsMap, toolsetsMap)
|
||||
|
||||
return nil
|
||||
}
|
||||
|
||||
// validateReloadEdits checks that the reloaded tools file configs can be initialized without failing.
func validateReloadEdits(
	ctx context.Context, toolsFile ToolsFile,
) (map[string]sources.Source, map[string]auth.AuthService, map[string]tools.Tool, map[string]tools.Toolset, error) {
	logger, err := util.LoggerFromContext(ctx)
	if err != nil {
		panic(err)
	}

	instrumentation, err := util.InstrumentationFromContext(ctx)
	if err != nil {
		panic(err)
	}

	logger.DebugContext(ctx, "Attempting to parse and validate reloaded tools file.")

	ctx, span := instrumentation.Tracer.Start(ctx, "toolbox/server/reload")
	defer span.End()

	reloadedConfig := server.ServerConfig{
		Version:            versionString,
		SourceConfigs:      toolsFile.Sources,
		AuthServiceConfigs: toolsFile.AuthServices,
		ToolConfigs:        toolsFile.Tools,
		ToolsetConfigs:     toolsFile.Toolsets,
	}

	sourcesMap, authServicesMap, toolsMap, toolsetsMap, err := server.InitializeConfigs(ctx, reloadedConfig)
	if err != nil {
		errMsg := fmt.Errorf("unable to initialize reloaded configs: %w", err)
		logger.WarnContext(ctx, errMsg.Error())
		return nil, nil, nil, nil, err
	}

	return sourcesMap, authServicesMap, toolsMap, toolsetsMap, nil
}

// watchChanges checks for changes in the provided YAML tools file(s) or folder.
func watchChanges(ctx context.Context, watchDirs map[string]bool, watchedFiles map[string]bool, s *server.Server) {
	logger, err := util.LoggerFromContext(ctx)
	if err != nil {
		panic(err)
	}

	w, err := fsnotify.NewWatcher()
	if err != nil {
		logger.WarnContext(ctx, fmt.Sprintf("error setting up new watcher: %s", err))
		return
	}
	defer w.Close()

	watchingFolder := false
	var folderToWatch string

	// an empty watchedFiles indicates that the user passed an entire folder instead
	if len(watchedFiles) == 0 {
		watchingFolder = true

		// validate that watchDirs only has a single element
		if len(watchDirs) > 1 {
			logger.WarnContext(ctx, "error setting watcher, expected a single tools folder if no file(s) are defined.")
			return
		}

		for onlyKey := range watchDirs {
			folderToWatch = onlyKey
			break
		}
	}

	for dir := range watchDirs {
		err := w.Add(dir)
		if err != nil {
			logger.WarnContext(ctx, fmt.Sprintf("Error adding path %s to watcher: %s", dir, err))
			break
		}
		logger.DebugContext(ctx, fmt.Sprintf("Added directory %s to watcher.", dir))
	}

	// the debounce timer prevents multiple writes from triggering multiple reloads
	debounceDelay := 100 * time.Millisecond
	debounce := time.NewTimer(1 * time.Minute)
	debounce.Stop()

	for {
		select {
		case <-ctx.Done():
			logger.DebugContext(ctx, "file watcher context cancelled")
			return
		case err, ok := <-w.Errors:
			if !ok {
				logger.WarnContext(ctx, "file watcher was closed unexpectedly")
				return
			}
			if err != nil {
				logger.WarnContext(ctx, fmt.Sprintf("file watcher error: %s", err))
				return
			}

		case e, ok := <-w.Events:
			if !ok {
				logger.WarnContext(ctx, "file watcher already closed")
				return
			}

			// only check for events which indicate the user saved a new tools file;
			// multiple operations are checked due to the various file update methods across editors
			if !e.Has(fsnotify.Write | fsnotify.Create | fsnotify.Rename) {
				continue
			}

			cleanedFilename := filepath.Clean(e.Name)
			logger.DebugContext(ctx, fmt.Sprintf("%s event detected in %s", e.Op, cleanedFilename))

			folderChanged := watchingFolder &&
				(strings.HasSuffix(cleanedFilename, ".yaml") || strings.HasSuffix(cleanedFilename, ".yml"))

			if folderChanged || watchedFiles[cleanedFilename] {
				// indicates the write event is on a relevant file
				debounce.Reset(debounceDelay)
			}

		case <-debounce.C:
			debounce.Stop()
			var reloadedToolsFile ToolsFile

			if watchingFolder {
				logger.DebugContext(ctx, "Reloading tools folder.")
				reloadedToolsFile, err = loadAndMergeToolsFolder(ctx, folderToWatch)
				if err != nil {
					logger.WarnContext(ctx, fmt.Sprintf("error loading tools folder: %s", err))
					continue
				}
			} else {
				logger.DebugContext(ctx, "Reloading tools file(s).")
				reloadedToolsFile, err = loadAndMergeToolsFiles(ctx, slices.Collect(maps.Keys(watchedFiles)))
				if err != nil {
					logger.WarnContext(ctx, fmt.Sprintf("error loading tools files: %s", err))
					continue
				}
			}

			err = handleDynamicReload(ctx, reloadedToolsFile, s)
			if err != nil {
				errMsg := fmt.Errorf("unable to apply reloaded tools configuration: %w", err)
				logger.WarnContext(ctx, errMsg.Error())
				continue
			}
		}
	}
}

// updateLogLevel checks whether Toolbox has to override the log level set by the user.
// stdio doesn't support "debug" and "info" logs.
func updateLogLevel(stdio bool, logLevel string) bool {
	if stdio {
		switch strings.ToUpper(logLevel) {
		case log.Debug, log.Info:
			return true
		default:
			return false
		}
	}
	return false
}

func resolveWatcherInputs(toolsFile string, toolsFiles []string, toolsFolder string) (map[string]bool, map[string]bool) {
	var relevantFiles []string

	// map for efficiently checking whether a file is relevant
	watchedFiles := make(map[string]bool)

	// dirs that will be added to the watcher (fsnotify prefers watching a directory, then filtering for the file)
	watchDirs := make(map[string]bool)

	if len(toolsFiles) > 0 {
		relevantFiles = toolsFiles
	} else if toolsFolder != "" {
		watchDirs[filepath.Clean(toolsFolder)] = true
	} else {
		relevantFiles = []string{toolsFile}
	}

	// extract the parent dir for each relevant file and dedup
	for _, f := range relevantFiles {
		cleanFile := filepath.Clean(f)
		watchedFiles[cleanFile] = true
		watchDirs[filepath.Dir(cleanFile)] = true
	}

	return watchDirs, watchedFiles
}

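The clean-and-dedup step above can be shown in a self-contained sketch (hypothetical `watchTargets`, mirroring but not reusing the function): files sharing a parent directory collapse into a single watch entry.

```go
package main

import (
	"fmt"
	"path/filepath"
)

// watchTargets is a simplified sketch of the dedup logic: each file is
// tracked individually, but since fsnotify watches parent directories,
// the parents are deduplicated via a set.
func watchTargets(files []string) (dirs, watched map[string]bool) {
	dirs = make(map[string]bool)
	watched = make(map[string]bool)
	for _, f := range files {
		clean := filepath.Clean(f)
		watched[clean] = true
		dirs[filepath.Dir(clean)] = true
	}
	return dirs, watched
}

func main() {
	dirs, watched := watchTargets([]string{"cfg/a.yaml", "cfg/b.yaml"})
	fmt.Println(len(dirs), len(watched)) // both files share one parent dir
}
```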
func run(cmd *Command) error {
	if updateLogLevel(cmd.cfg.Stdio, cmd.cfg.LogLevel.String()) {
		cmd.cfg.LogLevel = server.StringLevel(log.Warn)
	}

	ctx, cancel := context.WithCancel(cmd.Context())
	defer cancel()

@@ -202,13 +649,13 @@ func run(cmd *Command) error {
		}
		cmd.logger = logger
	default:
-		return fmt.Errorf("logging format invalid.")
		return fmt.Errorf("logging format invalid")
	}

	ctx = util.WithLogger(ctx, cmd.logger)

	// Set up OpenTelemetry
-	otelShutdown, err := telemetry.SetupOTel(ctx, cmd.Command.Version, cmd.cfg.TelemetryOTLP, cmd.cfg.TelemetryGCP, cmd.cfg.TelemetryServiceName)
	otelShutdown, err := telemetry.SetupOTel(ctx, cmd.cfg.Version, cmd.cfg.TelemetryOTLP, cmd.cfg.TelemetryGCP, cmd.cfg.TelemetryServiceName)
	if err != nil {
		errMsg := fmt.Errorf("error setting up OpenTelemetry: %w", err)
		cmd.logger.ErrorContext(ctx, errMsg.Error())
@@ -222,51 +669,144 @@ func run(cmd *Command) error {
		}
	}()

-	// Read tool file contents
-	buf, err := os.ReadFile(cmd.tools_file)
-	if err != nil {
-		errMsg := fmt.Errorf("unable to read tool file at %q: %w", cmd.tools_file, err)
-		cmd.logger.ErrorContext(ctx, errMsg.Error())
-		return errMsg
	var toolsFile ToolsFile

	if cmd.prebuiltConfig != "" {
		// Make sure --prebuilt and --tools-file/--tools-files/--tools-folder flags are mutually exclusive
		if cmd.tools_file != "" || len(cmd.tools_files) > 0 || cmd.tools_folder != "" {
			errMsg := fmt.Errorf("--prebuilt and --tools-file/--tools-files/--tools-folder flags cannot be used simultaneously")
			cmd.logger.ErrorContext(ctx, errMsg.Error())
			return errMsg
		}
		// Use prebuilt tools
		buf, err := prebuiltconfigs.Get(cmd.prebuiltConfig)
		if err != nil {
			cmd.logger.ErrorContext(ctx, err.Error())
			return err
		}
		logMsg := fmt.Sprint("Using prebuilt tool configuration for ", cmd.prebuiltConfig)
		cmd.logger.InfoContext(ctx, logMsg)
		// Append prebuilt.source to the Version string for the User Agent
		cmd.cfg.Version += "+prebuilt." + cmd.prebuiltConfig

		toolsFile, err = parseToolsFile(ctx, buf)
		if err != nil {
			errMsg := fmt.Errorf("unable to parse prebuilt tool configuration: %w", err)
			cmd.logger.ErrorContext(ctx, errMsg.Error())
			return errMsg
		}
	} else if len(cmd.tools_files) > 0 {
		// Make sure --tools-file, --tools-files, and --tools-folder flags are mutually exclusive
		if cmd.tools_file != "" || cmd.tools_folder != "" {
			errMsg := fmt.Errorf("--tools-file, --tools-files, and --tools-folder flags cannot be used simultaneously")
			cmd.logger.ErrorContext(ctx, errMsg.Error())
			return errMsg
		}

		// Use multiple tools files
		cmd.logger.InfoContext(ctx, fmt.Sprintf("Loading and merging %d tool configuration files", len(cmd.tools_files)))
		var err error
		toolsFile, err = loadAndMergeToolsFiles(ctx, cmd.tools_files)
		if err != nil {
			cmd.logger.ErrorContext(ctx, err.Error())
			return err
		}
	} else if cmd.tools_folder != "" {
		// Make sure --tools-folder and the other flags are mutually exclusive
		if cmd.tools_file != "" || len(cmd.tools_files) > 0 {
			errMsg := fmt.Errorf("--tools-file, --tools-files, and --tools-folder flags cannot be used simultaneously")
			cmd.logger.ErrorContext(ctx, errMsg.Error())
			return errMsg
		}

		// Use the tools folder
		cmd.logger.InfoContext(ctx, fmt.Sprintf("Loading and merging all YAML files from directory: %s", cmd.tools_folder))
		var err error
		toolsFile, err = loadAndMergeToolsFolder(ctx, cmd.tools_folder)
		if err != nil {
			cmd.logger.ErrorContext(ctx, err.Error())
			return err
		}
	} else {
		// Set the default value of the tools-file flag to tools.yaml
		if cmd.tools_file == "" {
			cmd.tools_file = "tools.yaml"
		}

		// Read the single tool file contents
		buf, err := os.ReadFile(cmd.tools_file)
		if err != nil {
			errMsg := fmt.Errorf("unable to read tool file at %q: %w", cmd.tools_file, err)
			cmd.logger.ErrorContext(ctx, errMsg.Error())
			return errMsg
		}

		toolsFile, err = parseToolsFile(ctx, buf)
		if err != nil {
			errMsg := fmt.Errorf("unable to parse tool file at %q: %w", cmd.tools_file, err)
			cmd.logger.ErrorContext(ctx, errMsg.Error())
			return errMsg
		}
	}
-	toolsFile, err := parseToolsFile(ctx, buf)

	cmd.cfg.SourceConfigs, cmd.cfg.AuthServiceConfigs, cmd.cfg.ToolConfigs, cmd.cfg.ToolsetConfigs = toolsFile.Sources, toolsFile.AuthServices, toolsFile.Tools, toolsFile.Toolsets
	authSourceConfigs := toolsFile.AuthSources
	if authSourceConfigs != nil {
		cmd.logger.WarnContext(ctx, "`authSources` is deprecated, use `authServices` instead")
		cmd.cfg.AuthServiceConfigs = authSourceConfigs
	}

	instrumentation, err := telemetry.CreateTelemetryInstrumentation(versionString)
	if err != nil {
-		errMsg := fmt.Errorf("unable to parse tool file at %q: %w", cmd.tools_file, err)
		errMsg := fmt.Errorf("unable to create telemetry instrumentation: %w", err)
		cmd.logger.ErrorContext(ctx, errMsg.Error())
		return errMsg
	}

	ctx = util.WithInstrumentation(ctx, instrumentation)

	// start server
-	s, err := server.NewServer(ctx, cmd.cfg, cmd.logger)
	s, err := server.NewServer(ctx, cmd.cfg)
	if err != nil {
		errMsg := fmt.Errorf("toolbox failed to initialize: %w", err)
		cmd.logger.ErrorContext(ctx, errMsg.Error())
		return errMsg
	}

-	err = s.Listen(ctx)
-	if err != nil {
-		errMsg := fmt.Errorf("toolbox failed to start listener: %w", err)
-		cmd.logger.ErrorContext(ctx, errMsg.Error())
-		return errMsg
-	}
-	cmd.logger.InfoContext(ctx, "Server ready to serve!")
-
	// run server in background
	srvErr := make(chan error)
-	go func() {
-		defer close(srvErr)
-		err = s.Serve(ctx)
	if cmd.cfg.Stdio {
		go func() {
			defer close(srvErr)
			err = s.ServeStdio(ctx, cmd.inStream, cmd.outStream)
			if err != nil {
				srvErr <- err
			}
		}()
	} else {
		err = s.Listen(ctx)
		if err != nil {
-			srvErr <- err
			errMsg := fmt.Errorf("toolbox failed to start listener: %w", err)
			cmd.logger.ErrorContext(ctx, errMsg.Error())
			return errMsg
		}
-	}()
		cmd.logger.InfoContext(ctx, "Server ready to serve!")

		go func() {
			defer close(srvErr)
			err = s.Serve(ctx)
			if err != nil {
				srvErr <- err
			}
		}()
	}

	watchDirs, watchedFiles := resolveWatcherInputs(cmd.tools_file, cmd.tools_files, cmd.tools_folder)

	if !cmd.cfg.DisableReload {
		// start watching the file(s) or folder for changes to trigger dynamic reloading
		go watchChanges(ctx, watchDirs, watchedFiles, s)
	}

	// wait for either the server to error out or the command's context to be canceled
	select {
@@ -282,7 +822,7 @@ func run(cmd *Command) error {
		cmd.logger.WarnContext(shutdownContext, "Shutting down gracefully...")
		err := s.Shutdown(shutdownContext)
		if err == context.DeadlineExceeded {
-			return fmt.Errorf("graceful shutdown timed out... forcing exit.")
			return fmt.Errorf("graceful shutdown timed out... forcing exit")
		}
	}

533
cmd/root_test.go

@@ -16,27 +16,41 @@ package cmd
import (
	"bytes"
	"context"
	_ "embed"
	"fmt"
	"io"
	"os"
	"path"
	"path/filepath"
	"regexp"
	"runtime"
	"strings"
	"testing"
	"time"

	"github.com/google/go-cmp/cmp"

	"github.com/googleapis/genai-toolbox/internal/auth/google"
	"github.com/googleapis/genai-toolbox/internal/log"
	"github.com/googleapis/genai-toolbox/internal/prebuiltconfigs"
	"github.com/googleapis/genai-toolbox/internal/server"
	cloudsqlpgsrc "github.com/googleapis/genai-toolbox/internal/sources/cloudsqlpg"
	httpsrc "github.com/googleapis/genai-toolbox/internal/sources/http"
	"github.com/googleapis/genai-toolbox/internal/telemetry"
	"github.com/googleapis/genai-toolbox/internal/testutils"
	"github.com/googleapis/genai-toolbox/internal/tools"
	"github.com/googleapis/genai-toolbox/internal/tools/http"
-	"github.com/googleapis/genai-toolbox/internal/tools/postgressql"
	"github.com/googleapis/genai-toolbox/internal/tools/postgres/postgressql"
	"github.com/googleapis/genai-toolbox/internal/util"
	"github.com/spf13/cobra"
)

func withDefaults(c server.ServerConfig) server.ServerConfig {
	data, _ := os.ReadFile("version.txt")
-	c.Version = strings.TrimSpace(string(data))
	version := strings.TrimSpace(string(data)) // preserving 'data'; new var for clarity
	c.Version = version + "+" + strings.Join([]string{"dev", runtime.GOOS, runtime.GOARCH}, ".")

	if c.Address == "" {
		c.Address = "127.0.0.1"
	}
@@ -163,6 +177,20 @@ func TestServerConfigFlags(t *testing.T) {
				TelemetryServiceName: "toolbox-custom",
			}),
		},
		{
			desc: "stdio",
			args: []string{"--stdio"},
			want: withDefaults(server.ServerConfig{
				Stdio: true,
			}),
		},
		{
			desc: "disable reload",
			args: []string{"--disable-reload"},
			want: withDefaults(server.ServerConfig{
				DisableReload: true,
			}),
		},
	}
	for _, tc := range tcs {
		t.Run(tc.desc, func(t *testing.T) {
@@ -187,7 +215,7 @@ func TestToolFileFlag(t *testing.T) {
		{
			desc: "default value",
			args: []string{},
-			want: "tools.yaml",
			want: "",
		},
		{
			desc: "foo file",
@@ -218,6 +246,101 @@ func TestToolFileFlag(t *testing.T) {
	}
}

func TestToolsFilesFlag(t *testing.T) {
	tcs := []struct {
		desc string
		args []string
		want []string
	}{
		{
			desc: "no value",
			args: []string{},
			want: []string{},
		},
		{
			desc: "single file",
			args: []string{"--tools-files", "foo.yaml"},
			want: []string{"foo.yaml"},
		},
		{
			desc: "multiple files",
			args: []string{"--tools-files", "foo.yaml,bar.yaml"},
			want: []string{"foo.yaml", "bar.yaml"},
		},
	}
	for _, tc := range tcs {
		t.Run(tc.desc, func(t *testing.T) {
			c, _, err := invokeCommand(tc.args)
			if err != nil {
				t.Fatalf("unexpected error invoking command: %s", err)
			}
			if diff := cmp.Diff(c.tools_files, tc.want); diff != "" {
				t.Fatalf("got %v, want %v", c.tools_files, tc.want)
			}
		})
	}
}

func TestToolsFolderFlag(t *testing.T) {
	tcs := []struct {
		desc string
		args []string
		want string
	}{
		{
			desc: "no value",
			args: []string{},
			want: "",
		},
		{
			desc: "folder set",
			args: []string{"--tools-folder", "test-folder"},
			want: "test-folder",
		},
	}
	for _, tc := range tcs {
		t.Run(tc.desc, func(t *testing.T) {
			c, _, err := invokeCommand(tc.args)
			if err != nil {
				t.Fatalf("unexpected error invoking command: %s", err)
			}
			if c.tools_folder != tc.want {
				t.Fatalf("got %v, want %v", c.tools_folder, tc.want)
			}
		})
	}
}

func TestPrebuiltFlag(t *testing.T) {
	tcs := []struct {
		desc string
		args []string
		want string
	}{
		{
			desc: "default value",
			args: []string{},
			want: "",
		},
		{
			desc: "custom pre built flag",
			args: []string{"--tools-file", "alloydb"},
			want: "alloydb",
		},
	}
	for _, tc := range tcs {
		t.Run(tc.desc, func(t *testing.T) {
			c, _, err := invokeCommand(tc.args)
			if err != nil {
				t.Fatalf("unexpected error invoking command: %s", err)
			}
			if c.tools_file != tc.want {
				t.Fatalf("got %v, want %v", c.tools_file, tc.want)
			}
		})
	}
}

func TestFailServerConfigFlags(t *testing.T) {
	tcs := []struct {
		desc string
@@ -320,7 +443,7 @@ func TestParseToolFile(t *testing.T) {
		Tools: server.ToolConfigs{
			"example_tool": postgressql.Config{
				Name:        "example_tool",
-				Kind:        postgressql.ToolKind,
				Kind:        "postgres-sql",
				Source:      "my-pg-instance",
				Description: "some description",
				Statement:   "SELECT * FROM SQL_STATEMENT;\n",
@@ -451,7 +574,7 @@ func TestParseToolFileWithAuth(t *testing.T) {
		Tools: server.ToolConfigs{
			"example_tool": postgressql.Config{
				Name:        "example_tool",
-				Kind:        postgressql.ToolKind,
				Kind:        "postgres-sql",
				Source:      "my-pg-instance",
				Description: "some description",
				Statement:   "SELECT * FROM SQL_STATEMENT;\n",
@@ -550,7 +673,7 @@ func TestParseToolFileWithAuth(t *testing.T) {
		Tools: server.ToolConfigs{
			"example_tool": postgressql.Config{
				Name:        "example_tool",
-				Kind:        postgressql.ToolKind,
				Kind:        "postgres-sql",
				Source:      "my-pg-instance",
				Description: "some description",
				Statement:   "SELECT * FROM SQL_STATEMENT;\n",
@@ -651,7 +774,7 @@ func TestParseToolFileWithAuth(t *testing.T) {
		Tools: server.ToolConfigs{
			"example_tool": postgressql.Config{
				Name:        "example_tool",
-				Kind:        postgressql.ToolKind,
				Kind:        "postgres-sql",
				Source:      "my-pg-instance",
				Description: "some description",
				Statement:   "SELECT * FROM SQL_STATEMENT;\n",
@@ -804,7 +927,7 @@ func TestEnvVarReplacement(t *testing.T) {
		Tools: server.ToolConfigs{
			"example_tool": http.Config{
				Name:   "example_tool",
-				Kind:   http.ToolKind,
				Kind:   "http",
				Source: "my-instance",
				Method: "GET",
				Path:   "search?name=alice&pet=cat",
@@ -858,3 +981,397 @@
	}

}

// normalizeFilepaths is a helper function to allow the same filepath format on Mac and Windows.
// This prevents needing multiple "want" cases for TestResolveWatcherInputs.
func normalizeFilepaths(m map[string]bool) map[string]bool {
	newMap := make(map[string]bool)
	for k, v := range m {
		newMap[filepath.ToSlash(k)] = v
	}
	return newMap
}

func TestResolveWatcherInputs(t *testing.T) {
	tcs := []struct {
		description      string
		toolsFile        string
		toolsFiles       []string
		toolsFolder      string
		wantWatchDirs    map[string]bool
		wantWatchedFiles map[string]bool
	}{
		{
			description:      "single tools file",
			toolsFile:        "tools_folder/example_tools.yaml",
			toolsFiles:       []string{},
			toolsFolder:      "",
			wantWatchDirs:    map[string]bool{"tools_folder": true},
			wantWatchedFiles: map[string]bool{"tools_folder/example_tools.yaml": true},
		},
		{
			description:      "default tools file (root dir)",
			toolsFile:        "tools.yaml",
			toolsFiles:       []string{},
			toolsFolder:      "",
			wantWatchDirs:    map[string]bool{".": true},
			wantWatchedFiles: map[string]bool{"tools.yaml": true},
		},
		{
			description: "multiple files in different folders",
			toolsFile:   "",
			toolsFiles:  []string{"tools_folder/example_tools.yaml", "tools_folder2/example_tools.yaml"},
			toolsFolder: "",
			wantWatchDirs: map[string]bool{"tools_folder": true, "tools_folder2": true},
			wantWatchedFiles: map[string]bool{
				"tools_folder/example_tools.yaml":  true,
				"tools_folder2/example_tools.yaml": true,
			},
		},
		{
			description: "multiple files in same folder",
			toolsFile:   "",
			toolsFiles:  []string{"tools_folder/example_tools.yaml", "tools_folder/example_tools2.yaml"},
			toolsFolder: "",
			wantWatchDirs: map[string]bool{"tools_folder": true},
			wantWatchedFiles: map[string]bool{
				"tools_folder/example_tools.yaml":  true,
				"tools_folder/example_tools2.yaml": true,
			},
		},
		{
			description: "multiple files in different levels",
			toolsFile:   "",
			toolsFiles: []string{
				"tools_folder/example_tools.yaml",
				"tools_folder/special_tools/example_tools2.yaml"},
			toolsFolder: "",
			wantWatchDirs: map[string]bool{"tools_folder": true, "tools_folder/special_tools": true},
			wantWatchedFiles: map[string]bool{
				"tools_folder/example_tools.yaml":                true,
				"tools_folder/special_tools/example_tools2.yaml": true,
			},
		},
		{
			description:      "tools folder",
			toolsFile:        "",
			toolsFiles:       []string{},
			toolsFolder:      "tools_folder",
			wantWatchDirs:    map[string]bool{"tools_folder": true},
			wantWatchedFiles: map[string]bool{},
		},
	}
	for _, tc := range tcs {
		t.Run(tc.description, func(t *testing.T) {
			gotWatchDirs, gotWatchedFiles := resolveWatcherInputs(tc.toolsFile, tc.toolsFiles, tc.toolsFolder)

			normalizedGotWatchDirs := normalizeFilepaths(gotWatchDirs)
			normalizedGotWatchedFiles := normalizeFilepaths(gotWatchedFiles)

			if diff := cmp.Diff(tc.wantWatchDirs, normalizedGotWatchDirs); diff != "" {
				t.Errorf("incorrect watchDirs: diff %v", diff)
			}
			if diff := cmp.Diff(tc.wantWatchedFiles, normalizedGotWatchedFiles); diff != "" {
				t.Errorf("incorrect watchedFiles: diff %v", diff)
			}
		})
	}
}

// tmpFileWithCleanup is a helper for testing file detection in dynamic reloading.
func tmpFileWithCleanup(content []byte) (string, func(), error) {
	f, err := os.CreateTemp("", "*")
	if err != nil {
		return "", nil, err
	}
	cleanup := func() { os.Remove(f.Name()) }

	if _, err := f.Write(content); err != nil {
		cleanup()
		return "", nil, err
	}
	if err := f.Close(); err != nil {
		cleanup()
		return "", nil, err
	}
	return f.Name(), cleanup, nil
}

func TestSingleEdit(t *testing.T) {
	ctx, cancelCtx := context.WithTimeout(context.Background(), time.Minute)
	defer cancelCtx()

	pr, pw := io.Pipe()
	defer pw.Close()
	defer pr.Close()

	fileToWatch, cleanup, err := tmpFileWithCleanup([]byte("initial content"))
	if err != nil {
		t.Fatalf("error creating tools file: %s", err)
	}
	defer cleanup()

	logger, err := log.NewStdLogger(pw, pw, "DEBUG")
	if err != nil {
		t.Fatalf("failed to setup logger: %s", err)
	}
	ctx = util.WithLogger(ctx, logger)

	instrumentation, err := telemetry.CreateTelemetryInstrumentation(versionString)
	if err != nil {
		t.Fatalf("failed to setup instrumentation: %s", err)
	}
	ctx = util.WithInstrumentation(ctx, instrumentation)

	mockServer := &server.Server{}

	cleanFileToWatch := filepath.Clean(fileToWatch)
	watchDir := filepath.Dir(cleanFileToWatch)

	watchedFiles := map[string]bool{cleanFileToWatch: true}
	watchDirs := map[string]bool{watchDir: true}

	go watchChanges(ctx, watchDirs, watchedFiles, mockServer)

	// escape backslashes so the regex doesn't fail on Windows filepaths
	regexEscapedPathFile := strings.ReplaceAll(cleanFileToWatch, `\`, `\\\\*\\`)
	regexEscapedPathFile = path.Clean(regexEscapedPathFile)

	regexEscapedPathDir := strings.ReplaceAll(watchDir, `\`, `\\\\*\\`)
	regexEscapedPathDir = path.Clean(regexEscapedPathDir)

	begunWatchingDir := regexp.MustCompile(fmt.Sprintf(`DEBUG "Added directory %s to watcher."`, regexEscapedPathDir))
	_, err = testutils.WaitForString(ctx, begunWatchingDir, pr)
	if err != nil {
		t.Fatalf("timeout or error waiting for watcher to start: %s", err)
	}

	err = os.WriteFile(fileToWatch, []byte("modification"), 0777)
	if err != nil {
		t.Fatalf("error writing to file: %v", err)
	}

	// only check a substring of the DEBUG message since some OS/editor combinations fire different operations
	detectedFileChange := regexp.MustCompile(fmt.Sprintf(`event detected in %s"`, regexEscapedPathFile))
	_, err = testutils.WaitForString(ctx, detectedFileChange, pr)
	if err != nil {
		t.Fatalf("timeout or error waiting for the write to be detected: %s", err)
	}
}

func TestPrebuiltTools(t *testing.T) {
	alloydb_admin_config, _ := prebuiltconfigs.Get("alloydb-postgres-admin")
	alloydb_config, _ := prebuiltconfigs.Get("alloydb-postgres")
	bigquery_config, _ := prebuiltconfigs.Get("bigquery")
	cloudsqlpg_config, _ := prebuiltconfigs.Get("cloud-sql-postgres")
	cloudsqlmysql_config, _ := prebuiltconfigs.Get("cloud-sql-mysql")
	cloudsqlmssql_config, _ := prebuiltconfigs.Get("cloud-sql-mssql")
	firestoreconfig, _ := prebuiltconfigs.Get("firestore")
	mysql_config, _ := prebuiltconfigs.Get("mysql")
	mssql_config, _ := prebuiltconfigs.Get("mssql")
	looker_config, _ := prebuiltconfigs.Get("looker")
	postgresconfig, _ := prebuiltconfigs.Get("postgres")
	spanner_config, _ := prebuiltconfigs.Get("spanner")
	spannerpg_config, _ := prebuiltconfigs.Get("spanner-postgres")
	ctx, err := testutils.ContextWithNewLogger()
	if err != nil {
		t.Fatalf("unexpected error: %s", err)
	}
	tcs := []struct {
		name        string
		in          []byte
		wantToolset server.ToolsetConfigs
	}{
		{
			name: "alloydb postgres admin prebuilt tools",
			in:   alloydb_admin_config,
			wantToolset: server.ToolsetConfigs{
				"alloydb-postgres-admin-tools": tools.ToolsetConfig{
					Name:      "alloydb-postgres-admin-tools",
					ToolNames: []string{"alloydb-create-cluster", "alloydb-operations-get", "alloydb-create-instance"},
				},
			},
		},
		{
			name: "alloydb prebuilt tools",
			in:   alloydb_config,
			wantToolset: server.ToolsetConfigs{
				"alloydb-postgres-database-tools": tools.ToolsetConfig{
					Name:      "alloydb-postgres-database-tools",
					ToolNames: []string{"execute_sql", "list_tables"},
				},
			},
		},
		{
			name: "bigquery prebuilt tools",
			in:   bigquery_config,
			wantToolset: server.ToolsetConfigs{
				"bigquery-database-tools": tools.ToolsetConfig{
					Name:      "bigquery-database-tools",
					ToolNames: []string{"execute_sql", "get_dataset_info", "get_table_info", "list_dataset_ids", "list_table_ids"},
				},
			},
		},
		{
			name: "cloudsqlpg prebuilt tools",
			in:   cloudsqlpg_config,
			wantToolset: server.ToolsetConfigs{
				"cloud-sql-postgres-database-tools": tools.ToolsetConfig{
					Name:      "cloud-sql-postgres-database-tools",
					ToolNames: []string{"execute_sql", "list_tables"},
				},
			},
		},
		{
			name: "cloudsqlmysql prebuilt tools",
			in:   cloudsqlmysql_config,
			wantToolset: server.ToolsetConfigs{
				"cloud-sql-mysql-database-tools": tools.ToolsetConfig{
					Name:      "cloud-sql-mysql-database-tools",
					ToolNames: []string{"execute_sql", "list_tables"},
				},
			},
		},
		{
			name: "cloudsqlmssql prebuilt tools",
			in:   cloudsqlmssql_config,
			wantToolset: server.ToolsetConfigs{
				"cloud-sql-mssql-database-tools": tools.ToolsetConfig{
					Name:      "cloud-sql-mssql-database-tools",
					ToolNames: []string{"execute_sql", "list_tables"},
				},
			},
		},
		{
			name: "firestore prebuilt tools",
			in:   firestoreconfig,
			wantToolset: server.ToolsetConfigs{
				"firestore-database-tools": tools.ToolsetConfig{
					Name:      "firestore-database-tools",
					ToolNames: []string{"firestore-get-documents", "firestore-list-collections", "firestore-delete-documents", "firestore-query-collection", "firestore-get-rules", "firestore-validate-rules"},
				},
			},
		},
		{
			name: "mysql prebuilt tools",
			in:   mysql_config,
			wantToolset: server.ToolsetConfigs{
				"mysql-database-tools": tools.ToolsetConfig{
					Name:      "mysql-database-tools",
					ToolNames: []string{"execute_sql", "list_tables"},
				},
			},
		},
		{
			name: "mssql prebuilt tools",
			in:   mssql_config,
			wantToolset: server.ToolsetConfigs{
				"mssql-database-tools": tools.ToolsetConfig{
					Name:      "mssql-database-tools",
					ToolNames: []string{"execute_sql", "list_tables"},
				},
			},
		},
		{
			name: "looker prebuilt tools",
			in:   looker_config,
			wantToolset: server.ToolsetConfigs{
				"looker-tools": tools.ToolsetConfig{
					Name:      "looker-tools",
					ToolNames: []string{"get_models", "get_explores", "get_dimensions", "get_measures", "get_filters", "get_parameters", "query", "query_sql", "get_looks", "run_look"},
				},
			},
		},
		{
			name: "postgres prebuilt tools",
			in:   postgresconfig,
			wantToolset: server.ToolsetConfigs{
				"postgres-database-tools": tools.ToolsetConfig{
					Name:      "postgres-database-tools",
					ToolNames: []string{"execute_sql", "list_tables"},
				},
			},
		},
		{
			name: "spanner prebuilt tools",
			in:   spanner_config,
			wantToolset: server.ToolsetConfigs{
				"spanner-database-tools": tools.ToolsetConfig{
					Name:      "spanner-database-tools",
					ToolNames: []string{"execute_sql", "execute_sql_dql", "list_tables"},
				},
			},
		},
		{
			name: "spanner pg prebuilt tools",
			in:   spannerpg_config,
			wantToolset: server.ToolsetConfigs{
				"spanner-postgres-database-tools": tools.ToolsetConfig{
					Name:      "spanner-postgres-database-tools",
					ToolNames: []string{"execute_sql", "execute_sql_dql", "list_tables"},
				},
			},
		},
	}

	for _, tc := range tcs {
		t.Run(tc.name, func(t *testing.T) {
			toolsFile, err := parseToolsFile(ctx, tc.in)
			if err != nil {
				t.Fatalf("failed to parse input: %v", err)
			}
			if diff := cmp.Diff(tc.wantToolset, toolsFile.Toolsets); diff != "" {
				t.Fatalf("incorrect tools parse: diff %v", diff)
			}
		})
	}
}

func TestUpdateLogLevel(t *testing.T) {
|
||||
tcs := []struct {
|
||||
desc string
|
||||
stdio bool
|
||||
logLevel string
|
||||
want bool
|
||||
}{
|
||||
{
|
||||
desc: "no stdio",
|
||||
stdio: false,
|
||||
logLevel: "info",
|
||||
want: false,
|
||||
},
|
||||
{
|
||||
desc: "stdio with info log",
|
||||
stdio: true,
|
||||
logLevel: "info",
|
||||
want: true,
|
||||
},
|
||||
{
|
||||
desc: "stdio with debug log",
|
||||
stdio: true,
|
||||
logLevel: "debug",
|
||||
want: true,
|
||||
},
|
||||
{
|
||||
desc: "stdio with warn log",
|
||||
stdio: true,
|
||||
logLevel: "warn",
|
||||
want: false,
|
||||
},
|
||||
{
|
||||
desc: "stdio with error log",
|
||||
stdio: true,
|
||||
logLevel: "error",
|
||||
want: false,
|
||||
},
|
||||
}
|
||||
for _, tc := range tcs {
|
||||
t.Run(tc.desc, func(t *testing.T) {
|
||||
got := updateLogLevel(tc.stdio, tc.logLevel)
|
||||
if got != tc.want {
|
||||
t.Fatalf("incorrect indication to update log level: got %t, want %t", got, tc.want)
|
||||
}
|
||||
})
|
||||
}
|
||||
}
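
The test table above pins down the expected behavior: the log level is only adjusted when serving over stdio, and only for levels quieter than `warn`. A minimal sketch of an `updateLogLevel` consistent with these cases — the name and signature come from the test, while the body is inferred for illustration and is not necessarily Toolbox's actual implementation:

```go
package main

import "fmt"

// updateLogLevel reports whether the configured log level should be changed.
// Inferred from the test table: over stdio, "debug" and "info" output would
// interleave with the protocol stream, so only those levels trigger an update.
func updateLogLevel(stdio bool, logLevel string) bool {
	if !stdio {
		return false
	}
	switch logLevel {
	case "debug", "info":
		return true
	default:
		return false
	}
}

func main() {
	fmt.Println(updateLogLevel(true, "info"))  // true
	fmt.Println(updateLogLevel(false, "info")) // false
}
```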
@@ -1 +1 @@
0.4.0
0.9.0

@@ -12,4 +12,4 @@ description: >
<link rel="canonical" href="getting-started/introduction/"/>
<meta http-equiv="refresh" content="0;url=getting-started/introduction"/>
</head>
</html>

@@ -1,6 +1,7 @@
---
title: "About"
type: docs
weight: 5
description: A list of other information related to Toolbox.
weight: 6
description: >
  A list of other information related to Toolbox.
---

@@ -9,22 +9,22 @@ description: Frequently asked questions about Toolbox.

MCP Toolbox for Databases is open-source and can be run or deployed to a
multitude of environments. For convenience, we release [compiled binaries and
docker images][release-notes] (but you can always compile yourself as well!).

For detailed instructions, check out these resources:

- [Quickstart: How to Run Locally](../getting-started/local_quickstart.md)
- [Deploy to Cloud Run](../how-to/deploy_toolbox.md)

[release-notes]: https://github.com/googleapis/genai-toolbox/releases/

## Do I need a Google Cloud account/project to get started with Toolbox?

Nope! While some of the sources Toolbox connects to may require GCP credentials,
Toolbox doesn't require them and can connect to a bunch of different resources
that don't.

## Does Toolbox take contributions from external users?

Absolutely! Please check out our [DEVELOPER.md][] for instructions on how to get
started developing _on_ Toolbox instead of with it, and the [CONTRIBUTING.md][]

@@ -33,17 +33,16 @@ for instructions on completing the CLA and getting a PR accepted.

[DEVELOPER.md]: https://github.com/googleapis/genai-toolbox/blob/main/DEVELOPER.md
[CONTRIBUTING.md]: https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md

## Can Toolbox support a feature to let me do _$FOO_?

Maybe? The best place to start is by [opening an issue][github-issue] for
discussion (or seeing if there is already one open), so we can better understand
your use case and the best way to solve it. Generally we aim to prioritize the
most popular issues, so make sure to +1 the ones you are most interested in.

[github-issue]: https://github.com/googleapis/genai-toolbox/issues

## Can Toolbox be used for non-database tools?

Currently, Toolbox is primarily focused on making it easier to create and
develop tools focused on interacting with Databases. We believe that there are a

@@ -55,21 +54,21 @@ GRPC tools might be helpful in assisting with migrating to Toolbox or in
accomplishing more complicated workflows. We're looking into what that might
best look like in Toolbox.

## Can I use _$BAR_ orchestration framework to use tools from Toolbox?

Currently, Toolbox only supports a limited number of client SDKs at our initial
launch. We are investigating support for more frameworks as well as more general
approaches for users without a framework -- look forward to seeing an update
soon.

## Why does Toolbox use a server-client architecture pattern?

Toolbox's server-client architecture allows us to more easily support a wide
variety of languages and frameworks with a centralized implementation. It also
allows us to tackle problems like connection pooling, auth, or caching more
completely than entirely client-side solutions.

## Why was Toolbox written in Go?

While a large part of the Gen AI Ecosystem is predominately Python, we opted to
use Go. We chose Go because it's still easy and simple to use, but also easier

@@ -80,8 +79,9 @@ to be able to use Toolbox on the serving path of mission critical applications.
It's easier to build the needed robustness, performance and scalability in Go
than in Python.

## Is Toolbox compatible with Model Context Protocol (MCP)?

Yes! Toolbox is compatible with [Anthropic's Model Context Protocol
(MCP)](https://modelcontextprotocol.io/). Please check out [Connect via
MCP](../how-to/connect_via_mcp.md) on how to connect to Toolbox with an MCP
client.

@@ -2,5 +2,6 @@
title: "Concepts"
type: docs
weight: 2
description: Some core concepts in Toolbox
description: >
  Some core concepts in Toolbox
---

@@ -2,7 +2,8 @@
title: "Telemetry"
type: docs
weight: 2
description: An overview of telemetry and observability in Toolbox.
description: >
  An overview of telemetry and observability in Toolbox.
---

## About

@@ -16,7 +17,6 @@ through [OpenTelemetry](https://opentelemetry.io/). Additional flags can be
passed to Toolbox to enable different logging behavior, or to export metrics
through a specific [exporter](#exporter).

## Logging

The following flags can be used to customize Toolbox logging:

@@ -26,14 +26,16 @@ The following flags can be used to customize Toolbox logging:
| `--log-level` | Preferred log level, allowed values: `debug`, `info`, `warn`, `error`. Default: `info`. |
| `--logging-format` | Preferred logging format, allowed values: `standard`, `json`. Default: `standard`. |

__Example:__
**Example:**

```bash
./toolbox --tools_file "tools.yaml" --log-level warn --logging-format json
./toolbox --tools-file "tools.yaml" --log-level warn --logging-format json
```

### Level

Toolbox supports the following log levels:

| **Log level** | **Description** |
|---------------|-----------------|
| Debug | Debug logs typically contain information that is only useful during the debugging phase and may be of little value during production. |

@@ -45,17 +47,18 @@ Toolbox will only output logs that are equal or more severe to the
level that it is set. Below are the log levels that Toolbox supports in the
order of severity.
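
The severity ordering described above can be made concrete with a small sketch; the ranking below is inferred from the documented levels (`debug` < `info` < `warn` < `error`) and is illustrative, not Toolbox's actual filter:

```go
package main

import "fmt"

// severity ranks the documented log levels from least to most severe.
var severity = map[string]int{"debug": 0, "info": 1, "warn": 2, "error": 3}

// shouldLog reports whether a message at msgLevel is emitted when the logger
// is configured at the given level: only equal-or-more-severe messages pass.
func shouldLog(configured, msgLevel string) bool {
	return severity[msgLevel] >= severity[configured]
}

func main() {
	fmt.Println(shouldLog("warn", "error")) // true: error is more severe
	fmt.Println(shouldLog("warn", "info"))  // false: info is filtered out
}
```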

### Format

Toolbox supports both standard and structured logging format.

The standard logging outputs log as string:

```
2024-11-12T15:08:11.451377-08:00 INFO "Initialized 0 sources.\n"
```

The structured logging outputs log as JSON:

```
{
  "timestamp":"2024-11-04T16:45:11.987299-08:00",

@@ -65,9 +68,9 @@ The structured logging outputs log as JSON:
}
```

{{< notice tip >}}
`logging.googleapis.com/sourceLocation` shows the source code
location information associated with the log entry, if any.
{{< /notice >}}

## Telemetry

@@ -124,7 +127,6 @@ unified [resource][resource]. The list of resource attributes included are:
| `service.name` | Open telemetry service name. Defaulted to `toolbox`. User can set the service name via flag mentioned above to distinguish between different toolbox service. |
| `service.version` | The version of Toolbox used. |

[resource]: https://opentelemetry.io/docs/languages/go/resources/

### Exporter

@@ -150,9 +152,10 @@ Exporter][gcp-trace-exporter].
[gcp-trace-exporter]:
https://github.com/GoogleCloudPlatform/opentelemetry-operations-go/tree/main/exporter/trace

{{< notice note >}}
If you're using Google Cloud Monitoring, the following APIs will need to be
enabled:

- [Cloud Logging API](https://cloud.google.com/logging/docs/api/enable-api)
- [Cloud Monitoring API](https://cloud.google.com/monitoring/api/enable-api)
- [Cloud Trace API](https://cloud.google.com/apis/enableflow?apiid=cloudtrace.googleapis.com)

@@ -183,7 +186,7 @@ The following flags are used to determine Toolbox's telemetry configuration:
| **flag** | **type** | **description** |
|----------|----------|-----------------|
| `--telemetry-gcp` | bool | Enable exporting directly to Google Cloud Monitoring. Default is `false`. |
| `--telemetry-otlp` | string | Enable exporting using OpenTelemetry Protocol (OTLP) to the specified endpoint (e.g. "http://127.0.0.1:4318"). |
| `--telemetry-otlp` | string | Enable exporting using OpenTelemetry Protocol (OTLP) to the specified endpoint (e.g. "<http://127.0.0.1:4318>"). |
| `--telemetry-service-name` | string | Sets the value of the `service.name` resource attribute. Default is `toolbox`. |

In addition to the flags noted above, you can also make additional configuration

@@ -193,14 +196,16 @@ environmental variables.
[sdk-configuration]:
https://opentelemetry.io/docs/languages/sdk-configuration/general/

__Examples:__
**Examples:**

To enable Google Cloud Exporter:

```bash
./toolbox --telemetry-gcp
```

To enable OTLP Exporter, provide Collector endpoint:

```bash
./toolbox --telemetry-otlp="http://127.0.0.1:4553"
```

@@ -3,5 +3,5 @@ title: "Getting Started"
type: docs
weight: 1
description: >
  How to get started with Toolbox
  How to get started with Toolbox.
---
@@ -29,7 +29,7 @@
    "id": "5sZ7_HCYJm4y"
   },
   "source": [
    "[](https://github.com/googleapis/genai-toolbox/tree/main/docs/en/getting-started/colab_quickstart.ipynb)"
    "[](https://colab.research.google.com/github/googleapis/genai-toolbox/blob/main/docs/en/getting-started/colab_quickstart.ipynb)"
   ]
  },
  {
@@ -190,6 +190,18 @@
    "!sudo lsof -i :5432"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Optional: Enable Vertex AI API for Google Cloud\n",
    "\n",
    "If you're using a model hosted on **Vertex AI**, run the following command to enable the API:\n",
    "\n",
    "```bash\n",
    "!gcloud services enable aiplatform.googleapis.com\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
@@ -222,7 +234,8 @@
   },
   "outputs": [],
   "source": [
    "! curl -O https://storage.googleapis.com/genai-toolbox/v0.2.0/linux/amd64/toolbox\n",
    "version = \"0.9.0\" # x-release-please-version\n",
    "! curl -O https://storage.googleapis.com/genai-toolbox/v{version}/linux/amd64/toolbox\n",
    "\n",
    "# Make the binary executable\n",
    "! chmod +x toolbox"
@@ -394,7 +407,7 @@
   "outputs": [],
   "source": [
    "# Start a toolbox server\n",
    "! nohup {TOOLBOX_BINARY_PATH} --tools_file {TOOLS_FILE_PATH} -p {SERVER_PORT} > toolbox.log 2>&1 &"
    "! nohup {TOOLBOX_BINARY_PATH} --tools-file {TOOLS_FILE_PATH} -p {SERVER_PORT} > toolbox.log 2>&1 &"
   ]
  },
  {
@@ -473,165 +486,12 @@
   "source": [
    "> You can either use LangGraph or LlamaIndex to develop a Toolbox based\n",
    "> application. Run one of the sections below\n",
    "> - [Connect using Google GenAI](#scrollTo=Rwgv1LDdNKSn)\n",
    "> - [Connect using Google GenAI](#scrollTo=Fv2-uT4mvYtp)\n",
    "> - [Connect using ADK](#scrollTo=QqRlWqvYNKSo)\n",
    "> - [Connect Using LangGraph](#scrollTo=pbapNMhhL33S)\n",
    "> - [Connect using LlamaIndex](#scrollTo=04iysrm_L_7v)\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "id": "Rwgv1LDdNKSn"
   },
   "source": [
    "### Connect Using Google GenAI"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "id": "HY23RMk4NKSn"
   },
   "outputs": [],
   "source": [
    "# Install the Toolbox Core package\n",
    "!pip install toolbox-core --quiet\n",
    "\n",
    "# Install the Google GenAI package\n",
    "!pip install google-genai --quiet"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "id": "9F1u566sNKSn"
   },
   "source": [
    "Create a Google GenAI Application which can Search, Book and Cancel hotels."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "id": "LAuBIOXvNKSn"
   },
   "outputs": [],
   "source": [
    "import asyncio\n",
    "\n",
    "from google import genai\n",
    "from google.genai.types import (\n",
    "    Content,\n",
    "    FunctionDeclaration,\n",
    "    GenerateContentConfig,\n",
    "    Part,\n",
    "    Tool,\n",
    ")\n",
    "\n",
    "from toolbox_core import ToolboxClient\n",
    "\n",
    "prompt = \"\"\"\n",
    "    You're a helpful hotel assistant. You handle hotel searching, booking and\n",
    "    cancellations. When the user searches for a hotel, mention it's name, id,\n",
    "    location and price tier. Always mention hotel id while performing any\n",
    "    searches. This is very important for any operations. For any bookings or\n",
    "    cancellations, please provide the appropriate confirmation. Be sure to\n",
    "    update checkin or checkout dates if mentioned by the user.\n",
    "    Don't ask for confirmations from the user.\n",
    "\"\"\"\n",
    "\n",
    "queries = [\n",
    "    \"Find hotels in Basel with Basel in it's name.\",\n",
    "    \"Please book the hotel Hilton Basel for me.\",\n",
    "    \"This is too expensive. Please cancel it.\",\n",
    "    \"Please book Hyatt Regency for me\",\n",
    "    \"My check in dates for my booking would be from April 10, 2024 to April 19, 2024.\",\n",
    "]\n",
    "\n",
    "\n",
    "async def run_application():\n",
    "    toolbox_client = ToolboxClient(\"http://127.0.0.1:5000\")\n",
    "\n",
    "    # The toolbox_tools list contains Python callables (functions/methods) designed for LLM tool-use\n",
    "    # integration. While this example uses Google's genai client, these callables can be adapted for\n",
    "    # various function-calling or agent frameworks. For easier integration with supported frameworks\n",
    "    # (https://github.com/googleapis/mcp-toolbox-python-sdk/tree/main/packages), use the\n",
    "    # provided wrapper packages, which handle framework-specific boilerplate.\n",
    "    toolbox_tools = await toolbox_client.load_toolset(\"my-toolset\")\n",
    "    genai_client = genai.Client(\n",
    "        vertexai=True, project=project_id, location=\"us-central1\"\n",
    "    )\n",
    "\n",
    "    genai_tools = [\n",
    "        Tool(\n",
    "            function_declarations=[\n",
    "                FunctionDeclaration.from_callable_with_api_option(callable=tool)\n",
    "            ]\n",
    "        )\n",
    "        for tool in toolbox_tools\n",
    "    ]\n",
    "    history = []\n",
    "    for query in queries:\n",
    "        user_prompt_content = Content(\n",
    "            role=\"user\",\n",
    "            parts=[Part.from_text(text=query)],\n",
    "        )\n",
    "        history.append(user_prompt_content)\n",
    "\n",
    "        response = genai_client.models.generate_content(\n",
    "            model=\"gemini-2.0-flash\",\n",
    "            contents=history,\n",
    "            config=GenerateContentConfig(\n",
    "                system_instruction=prompt,\n",
    "                tools=genai_tools,\n",
    "            ),\n",
    "        )\n",
    "        history.append(response.candidates[0].content)\n",
    "        function_response_parts = []\n",
    "        for function_call in response.function_calls:\n",
    "            fn_name = function_call.name\n",
    "            # The tools are sorted alphabetically\n",
    "            if fn_name == \"search-hotels-by-name\":\n",
    "                function_result = await toolbox_tools[3](**function_call.args)\n",
    "            elif fn_name == \"search-hotels-by-location\":\n",
    "                function_result = await toolbox_tools[2](**function_call.args)\n",
    "            elif fn_name == \"book-hotel\":\n",
    "                function_result = await toolbox_tools[0](**function_call.args)\n",
    "            elif fn_name == \"update-hotel\":\n",
    "                function_result = await toolbox_tools[4](**function_call.args)\n",
    "            elif fn_name == \"cancel-hotel\":\n",
    "                function_result = await toolbox_tools[1](**function_call.args)\n",
    "            else:\n",
    "                raise ValueError(\"Function name not present.\")\n",
    "            function_response = {\"result\": function_result}\n",
    "            function_response_part = Part.from_function_response(\n",
    "                name=function_call.name,\n",
    "                response=function_response,\n",
    "            )\n",
    "            function_response_parts.append(function_response_part)\n",
    "\n",
    "        if function_response_parts:\n",
    "            tool_response_content = Content(role=\"tool\", parts=function_response_parts)\n",
    "            history.append(tool_response_content)\n",
    "\n",
    "        response2 = genai_client.models.generate_content(\n",
    "            model=\"gemini-2.0-flash-001\",\n",
    "            contents=history,\n",
    "            config=GenerateContentConfig(\n",
    "                tools=genai_tools,\n",
    "            ),\n",
    "        )\n",
    "        final_model_response_content = response2.candidates[0].content\n",
    "        history.append(final_model_response_content)\n",
    "        print(response2.text)\n",
    "\n",
    "\n",
    "asyncio.run(run_application())"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
@@ -649,8 +509,8 @@
   },
   "outputs": [],
   "source": [
    "! pip install toolbox-langchain --quiet\n",
    "! pip install google-adk langchain --quiet"
    "! pip install toolbox-core --quiet\n",
    "! pip install google-adk --quiet"
   ]
  },
  {
@@ -662,17 +522,17 @@
   "outputs": [],
   "source": [
    "from google.adk.agents import Agent\n",
    "from google.adk.tools.toolbox_tool import ToolboxTool\n",
    "from google.adk.runners import Runner\n",
    "from google.adk.sessions import InMemorySessionService\n",
    "from google.adk.artifacts.in_memory_artifact_service import InMemoryArtifactService\n",
    "from google.genai import types\n",
    "from toolbox_core import ToolboxSyncClient\n",
    "\n",
    "import os\n",
    "# TODO(developer): replace this with your Google API key\n",
    "os.environ['GOOGLE_API_KEY'] = \"<GOOGLE_API_KEY>\"\n",
    "\n",
    "toolbox_tools = ToolboxTool(\"http://127.0.0.1:5000\")\n",
    "toolbox_client = ToolboxSyncClient(\"http://127.0.0.1:5000\")\n",
    "\n",
    "prompt = \"\"\"\n",
    "    You're a helpful hotel assistant. You handle hotel searching, booking and\n",
@@ -685,16 +545,16 @@
    "\"\"\"\n",
    "\n",
    "root_agent = Agent(\n",
    "    model='gemini-2.0-flash',\n",
    "    model='gemini-2.0-flash-001',\n",
    "    name='hotel_agent',\n",
    "    description='A helpful AI assistant.',\n",
    "    instruction=prompt,\n",
    "    tools=toolbox_tools.get_toolset(\"my-toolset\"),\n",
    "    tools=toolbox_client.load_toolset(\"my-toolset\"),\n",
    ")\n",
    "\n",
    "session_service = InMemorySessionService()\n",
    "artifacts_service = InMemoryArtifactService()\n",
    "session = session_service.create_session(\n",
    "session = await session_service.create_session(\n",
    "    state={}, app_name='hotel_agent', user_id='123'\n",
    ")\n",
    "runner = Runner(\n",
@@ -801,8 +661,8 @@
    "async def run_application():\n",
    "    # Create an LLM to bind with the agent.\n",
    "    # TODO(developer): replace this with another model if needed\n",
    "    model = ChatVertexAI(model_name=\"gemini-1.5-pro\", project=project_id)\n",
    "    # model = ChatGoogleGenerativeAI(model=\"gemini-1.5-pro\")\n",
    "    model = ChatVertexAI(model_name=\"gemini-2.0-flash-001\", project=project_id)\n",
    "    # model = ChatGoogleGenerativeAI(model=\"gemini-2.0-flash-001\")\n",
    "    # model = ChatAnthropic(model=\"claude-3-5-sonnet-20240620\")\n",
    "\n",
    "    # Load the tools from the Toolbox server\n",
@@ -897,12 +757,12 @@
    "    # Create an LLM to bind with the agent.\n",
    "    # TODO(developer): replace this with another model if needed\n",
    "    llm = GoogleGenAI(\n",
    "        model=\"gemini-1.5-pro\",\n",
    "        model=\"gemini-2.0-flash-001\",\n",
    "        vertexai_config={\"project\": project_id, \"location\": \"us-central1\"},\n",
    "    )\n",
    "    # llm = GoogleGenAI(\n",
    "    #     api_key=os.getenv(\"GOOGLE_API_KEY\"),\n",
    "    #     model=\"gemini-1.5-pro\",\n",
    "    #     model=\"gemini-2.0-flash-001\",\n",
    "    # )\n",
    "    # llm = Anthropic(\n",
    "    #     model=\"claude-3-7-sonnet-latest\",\n",
@@ -930,6 +790,159 @@
    "await run_application()"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "id": "Fv2-uT4mvYtp"
   },
   "source": [
    "### Connect Using Google GenAI"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "id": "mHSvk5_AvYtu"
   },
   "outputs": [],
   "source": [
    "# Install the Toolbox Core package\n",
    "!pip install toolbox-core --quiet\n",
    "\n",
    "# Install the Google GenAI package\n",
    "!pip install google-genai --quiet"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "id": "sO_7FGSYvYtu"
   },
   "source": [
    "Create a Google GenAI Application which can Search, Book and Cancel hotels."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "id": "-NVVBiLnvYtu"
   },
   "outputs": [],
   "source": [
    "import asyncio\n",
    "\n",
    "from google import genai\n",
    "from google.genai.types import (\n",
    "    Content,\n",
    "    FunctionDeclaration,\n",
    "    GenerateContentConfig,\n",
    "    Part,\n",
    "    Tool,\n",
    ")\n",
    "\n",
    "from toolbox_core import ToolboxClient\n",
    "\n",
    "prompt = \"\"\"\n",
    "    You're a helpful hotel assistant. You handle hotel searching, booking and\n",
    "    cancellations. When the user searches for a hotel, mention it's name, id,\n",
    "    location and price tier. Always mention hotel id while performing any\n",
    "    searches. This is very important for any operations. For any bookings or\n",
    "    cancellations, please provide the appropriate confirmation. Be sure to\n",
    "    update checkin or checkout dates if mentioned by the user.\n",
    "    Don't ask for confirmations from the user.\n",
    "\"\"\"\n",
    "\n",
    "queries = [\n",
    "    \"Find hotels in Basel with Basel in it's name.\",\n",
    "    \"Please book the hotel Hilton Basel for me.\",\n",
    "    \"This is too expensive. Please cancel it.\",\n",
    "    \"Please book Hyatt Regency for me\",\n",
    "    \"My check in dates for my booking would be from April 10, 2024 to April 19, 2024.\",\n",
    "]\n",
    "\n",
    "\n",
    "async def run_application():\n",
    "    toolbox_client = ToolboxClient(\"http://127.0.0.1:5000\")\n",
    "\n",
    "    # The toolbox_tools list contains Python callables (functions/methods) designed for LLM tool-use\n",
    "    # integration. While this example uses Google's genai client, these callables can be adapted for\n",
    "    # various function-calling or agent frameworks. For easier integration with supported frameworks\n",
    "    # (https://github.com/googleapis/mcp-toolbox-python-sdk/tree/main/packages), use the\n",
    "    # provided wrapper packages, which handle framework-specific boilerplate.\n",
    "    toolbox_tools = await toolbox_client.load_toolset(\"my-toolset\")\n",
    "    genai_client = genai.Client(\n",
    "        vertexai=True, project=project_id, location=\"us-central1\"\n",
    "    )\n",
    "\n",
    "    genai_tools = [\n",
    "        Tool(\n",
    "            function_declarations=[\n",
    "                FunctionDeclaration.from_callable_with_api_option(callable=tool)\n",
    "            ]\n",
    "        )\n",
    "        for tool in toolbox_tools\n",
    "    ]\n",
    "    history = []\n",
    "    for query in queries:\n",
    "        user_prompt_content = Content(\n",
    "            role=\"user\",\n",
    "            parts=[Part.from_text(text=query)],\n",
    "        )\n",
    "        history.append(user_prompt_content)\n",
    "\n",
    "        response = genai_client.models.generate_content(\n",
    "            model=\"gemini-2.0-flash-001\",\n",
    "            contents=history,\n",
    "            config=GenerateContentConfig(\n",
    "                system_instruction=prompt,\n",
    "                tools=genai_tools,\n",
    "            ),\n",
    "        )\n",
    "        history.append(response.candidates[0].content)\n",
    "        function_response_parts = []\n",
    "        for function_call in response.function_calls:\n",
    "            fn_name = function_call.name\n",
    "            # The tools are sorted alphabetically\n",
    "            if fn_name == \"search-hotels-by-name\":\n",
    "                function_result = await toolbox_tools[3](**function_call.args)\n",
    "            elif fn_name == \"search-hotels-by-location\":\n",
    "                function_result = await toolbox_tools[2](**function_call.args)\n",
    "            elif fn_name == \"book-hotel\":\n",
    "                function_result = await toolbox_tools[0](**function_call.args)\n",
    "            elif fn_name == \"update-hotel\":\n",
    "                function_result = await toolbox_tools[4](**function_call.args)\n",
    "            elif fn_name == \"cancel-hotel\":\n",
    "                function_result = await toolbox_tools[1](**function_call.args)\n",
    "            else:\n",
    "                raise ValueError(\"Function name not present.\")\n",
    "            function_response = {\"result\": function_result}\n",
    "            function_response_part = Part.from_function_response(\n",
    "                name=function_call.name,\n",
    "                response=function_response,\n",
    "            )\n",
    "            function_response_parts.append(function_response_part)\n",
    "\n",
    "        if function_response_parts:\n",
    "            tool_response_content = Content(role=\"tool\", parts=function_response_parts)\n",
    "            history.append(tool_response_content)\n",
    "\n",
    "        response2 = genai_client.models.generate_content(\n",
    "            model=\"gemini-2.0-flash-001\",\n",
    "            contents=history,\n",
    "            config=GenerateContentConfig(\n",
    "                tools=genai_tools,\n",
    "            ),\n",
    "        )\n",
    "        final_model_response_content = response2.candidates[0].content\n",
    "        history.append(final_model_response_content)\n",
    "        print(response2.text)\n",
    "\n",
    "\n",
    "asyncio.run(run_application())"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
@@ -1,8 +1,9 @@
|
||||
---
|
||||
title: "Configuration"
|
||||
type: docs
|
||||
weight: 4
|
||||
description: How to configure Toolbox's tools.yaml file.
|
||||
weight: 6
|
||||
description: >
|
||||
How to configure Toolbox's tools.yaml file.
|
||||
---
|
||||
|
||||
The primary way to configure Toolbox is through the `tools.yaml` file. If you
|
||||
|
||||
@@ -2,25 +2,25 @@
title: "Introduction"
type: docs
weight: 1
description: An introduction to MCP Toolbox for Databases.
description: >
  An introduction to MCP Toolbox for Databases.
---

MCP Toolbox for Databases is an open source MCP server for databases. It was
designed with enterprise-grade and production-quality in mind. It enables you to
develop tools easier, faster, and more securely by handling the complexities
MCP Toolbox for Databases is an open source MCP server for databases. It enables
you to develop tools easier, faster, and more securely by handling the complexities
such as connection pooling, authentication, and more.


{{< notice note >}}
This product was originally named “Gen AI Toolbox for
{{< notice note >}}
This solution was originally named “Gen AI Toolbox for
Databases” as its initial development predated MCP, but was renamed to align
with recently added MCP compatibility.
with recently added MCP compatibility.
{{< /notice >}}

## Why Toolbox?
## Why Toolbox?

Toolbox helps you build Gen AI tools that let your agents access data in your
database. Toolbox provides:

- **Simplified development**: Integrate tools to your agent in less than 10
  lines of code, reuse tools between multiple agents or frameworks, and deploy
  new versions of tools more easily.

@@ -30,6 +30,33 @@ database. Toolbox provides:
- **End-to-end observability**: Out of the box metrics and tracing with built-in
  support for OpenTelemetry.

**⚡ Supercharge Your Workflow with an AI Database Assistant ⚡**

Stop context-switching and let your AI assistant become a true co-developer. By
[connecting your IDE to your databases with MCP Toolbox][connect-ide], you can
delegate complex and time-consuming database tasks, allowing you to build faster
and focus on what matters. This isn't just about code completion; it's about
giving your AI the context it needs to handle the entire development lifecycle.

Here’s how it will save you time:

- **Query in Plain English**: Interact with your data using natural language
  right from your IDE. Ask complex questions like, *"How many orders were
  delivered in 2024, and what items were in them?"* without writing any SQL.
- **Automate Database Management**: Simply describe your data needs, and let the
  AI assistant manage your database for you. It can handle generating queries,
  creating tables, adding indexes, and more.
- **Generate Context-Aware Code**: Empower your AI assistant to generate
  application code and tests with a deep understanding of your real-time
  database schema. This accelerates the development cycle by ensuring the
  generated code is directly usable.
- **Slash Development Overhead**: Radically reduce the time spent on manual
  setup and boilerplate. MCP Toolbox helps streamline lengthy database
  configurations, repetitive code, and error-prone schema migrations.

Learn [how to connect your AI tools (IDEs) to Toolbox using MCP][connect-ide].

[connect-ide]: ../../how-to/connect-ide/

## General Architecture

@@ -45,6 +72,7 @@ redeploying your application.
## Getting Started

### Installing the server

For the latest version, check the [releases page][releases] and use the
following instructions for your OS and CPU architecture.

@@ -58,7 +86,7 @@ To install Toolbox as a binary:

```sh
# see releases page for other versions
export VERSION=0.4.0
export VERSION=0.9.0
curl -O https://storage.googleapis.com/genai-toolbox/v$VERSION/linux/amd64/toolbox
chmod +x toolbox
```
@@ -69,7 +97,7 @@ You can also install Toolbox as a container:

```sh
# see releases page for other versions
export VERSION=0.4.0
export VERSION=0.9.0
docker pull us-central1-docker.pkg.dev/database-toolbox/toolbox/toolbox:$VERSION
```

@@ -80,7 +108,7 @@ To install from source, ensure you have the latest version of
[Go installed](https://go.dev/doc/install), and then run the following command:

```sh
go install github.com/googleapis/genai-toolbox@v0.4.0
go install github.com/googleapis/genai-toolbox@v0.9.0
```

{{% /tab %}}
@@ -93,8 +121,11 @@ go install github.com/googleapis/genai-toolbox@v0.4.0
execute `toolbox` to start the server:

```sh
./toolbox --tools_file "tools.yaml"
./toolbox --tools-file "tools.yaml"
```
{{< notice note >}}
Toolbox enables dynamic reloading by default. To disable, use the `--disable-reload` flag.
{{< /notice >}}

You can use `toolbox help` for a full list of flags! To stop the server, send a
terminate signal (`ctrl+c` on most platforms).
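The `--tools-file` flag above points the server at your tool definitions. As a rough, hedged sketch only (the source credentials, the `hotels` table, and the tool and toolset names are illustrative placeholders; the Configuration section of these docs describes the exact fields), a minimal `tools.yaml` might look like:

```yaml
sources:
  my-pg-source:
    kind: postgres
    host: 127.0.0.1
    port: 5432
    database: toolbox_db
    user: toolbox_user
    password: my-password
tools:
  search-hotels-by-name:
    kind: postgres-sql
    source: my-pg-source
    description: Search for hotels based on name.
    parameters:
      - name: name
        type: string
        description: The name of the hotel.
    statement: SELECT * FROM hotels WHERE name ILIKE '%' || $1 || '%';
toolsets:
  my-toolset:
    - search-hotels-by-name
```

With a file like this in place, the server exposes the `my-toolset` toolset that the SDK snippets below load by name.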
@@ -107,6 +138,7 @@ out the resources in the [How-to section](../../how-to/_index.md)
Once your server is up and running, you can load the tools into your
application. See below the list of Client SDKs for using various frameworks:

#### Python
{{< tabpane text=true persist=header >}}
{{% tab header="Core" lang="en" %}}

@@ -118,10 +150,11 @@ tools:
from toolbox_core import ToolboxClient

# update the url to point to your server
client = ToolboxClient("http://127.0.0.1:5000")

# these tools can be passed to your application!
tools = await client.load_toolset("toolset_name")
async with ToolboxClient("http://127.0.0.1:5000") as client:

    # these tools can be passed to your application!
    tools = await client.load_toolset("toolset_name")
{{< /highlight >}}

For more detailed instructions on using the Toolbox Core SDK, see the
@@ -138,10 +171,11 @@ tools:
from toolbox_langchain import ToolboxClient

# update the url to point to your server
client = ToolboxClient("http://127.0.0.1:5000")

# these tools can be passed to your application!
tools = client.load_toolset()
async with ToolboxClient("http://127.0.0.1:5000") as client:

    # these tools can be passed to your application!
    tools = client.load_toolset()
{{< /highlight >}}

For more detailed instructions on using the Toolbox LangChain SDK, see the
@@ -158,10 +192,12 @@ tools:
from toolbox_llamaindex import ToolboxClient

# update the url to point to your server
client = ToolboxClient("http://127.0.0.1:5000")

# these tools can be passed to your application!
tools = client.load_toolset()
async with ToolboxClient("http://127.0.0.1:5000") as client:

    # these tools can be passed to your application

    tools = client.load_toolset()
{{< /highlight >}}

For more detailed instructions on using the Toolbox Llamaindex SDK, see the
@@ -169,3 +205,364 @@ For more detailed instructions on using the Toolbox Llamaindex SDK, see the

{{% /tab %}}
{{< /tabpane >}}

#### Javascript/Typescript

Once you've installed the [Toolbox Core
SDK](https://www.npmjs.com/package/@toolbox-sdk/core), you can load
tools:

{{< tabpane text=true persist=header >}}
{{% tab header="Core" lang="en" %}}

{{< highlight javascript >}}
import { ToolboxClient } from '@toolbox-sdk/core';

// update the url to point to your server
const URL = 'http://127.0.0.1:5000';
let client = new ToolboxClient(URL);

// these tools can be passed to your application!
const toolboxTools = await client.loadToolset('toolsetName');
{{< /highlight >}}

{{% /tab %}}
{{% tab header="LangChain/Langraph" lang="en" %}}

{{< highlight javascript >}}
import { ToolboxClient } from '@toolbox-sdk/core';

// update the url to point to your server
const URL = 'http://127.0.0.1:5000';
let client = new ToolboxClient(URL);

// these tools can be passed to your application!
const toolboxTools = await client.loadToolset('toolsetName');

// Define the basics of the tool: name, description, schema and core logic
const getTool = (toolboxTool) => tool(toolboxTool, {
  name: toolboxTool.getName(),
  description: toolboxTool.getDescription(),
  schema: toolboxTool.getParamSchema()
});

// Use these tools in your Langchain/Langraph applications
const tools = toolboxTools.map(getTool);
{{< /highlight >}}

{{% /tab %}}
{{% tab header="Genkit" lang="en" %}}

{{< highlight javascript >}}
import { ToolboxClient } from '@toolbox-sdk/core';
import { genkit } from 'genkit';

// Initialise genkit
const ai = genkit({
  plugins: [
    googleAI({
      apiKey: process.env.GEMINI_API_KEY || process.env.GOOGLE_API_KEY
    })
  ],
  model: googleAI.model('gemini-2.0-flash'),
});

// update the url to point to your server
const URL = 'http://127.0.0.1:5000';
let client = new ToolboxClient(URL);

// these tools can be passed to your application!
const toolboxTools = await client.loadToolset('toolsetName');

// Define the basics of the tool: name, description, schema and core logic
const getTool = (toolboxTool) => ai.defineTool({
  name: toolboxTool.getName(),
  description: toolboxTool.getDescription(),
  schema: toolboxTool.getParamSchema()
}, toolboxTool)

// Use these tools in your Genkit applications
const tools = toolboxTools.map(getTool);
{{< /highlight >}}

{{% /tab %}}
{{% tab header="LlamaIndex" lang="en" %}}

{{< highlight javascript >}}
import { ToolboxClient } from '@toolbox-sdk/core';
import { tool } from "llamaindex";

// update the url to point to your server
const URL = 'http://127.0.0.1:5000';
let client = new ToolboxClient(URL);

// these tools can be passed to your application!
const toolboxTools = await client.loadToolset('toolsetName');

// Define the basics of the tool: name, description, schema and core logic
const getTool = (toolboxTool) => tool({
  name: toolboxTool.getName(),
  description: toolboxTool.getDescription(),
  parameters: toolboxTool.getParams(),
  execute: toolboxTool
});

// Use these tools in your LlamaIndex applications
const tools = toolboxTools.map(getTool);

{{< /highlight >}}

{{% /tab %}}
{{< /tabpane >}}

For more detailed instructions on using the Toolbox Core SDK, see the
[project's README](https://github.com/googleapis/mcp-toolbox-sdk-js/blob/main/packages/toolbox-core/README.md).

#### Go

Once you've installed the [Toolbox Go
SDK](https://pkg.go.dev/github.com/googleapis/mcp-toolbox-sdk-go/core), you can load
tools:

{{< tabpane text=true persist=header >}}
{{% tab header="Core" lang="en" %}}

{{< highlight go >}}
package main

import (
  "context"
  "log"

  "github.com/googleapis/mcp-toolbox-sdk-go/core"
)

func main() {
  // update the url to point to your server
  URL := "http://127.0.0.1:5000"
  ctx := context.Background()

  client, err := core.NewToolboxClient(URL)
  if err != nil {
    log.Fatalf("Failed to create Toolbox client: %v", err)
  }

  // Framework agnostic tools
  tools, err := client.LoadToolset("toolsetName", ctx)
  if err != nil {
    log.Fatalf("Failed to load tools: %v", err)
  }
  _ = tools // pass these to your framework of choice
}
{{< /highlight >}}

{{% /tab %}}
{{% tab header="LangChain Go" lang="en" %}}

{{< highlight go >}}
package main

import (
  "context"
  "encoding/json"
  "log"

  "github.com/googleapis/mcp-toolbox-sdk-go/core"
  "github.com/tmc/langchaingo/llms"
)

func main() {
  // Make sure to add the error checks
  // update the url to point to your server
  URL := "http://127.0.0.1:5000"
  ctx := context.Background()

  client, err := core.NewToolboxClient(URL)
  if err != nil {
    log.Fatalf("Failed to create Toolbox client: %v", err)
  }

  // Framework agnostic tool
  tool, err := client.LoadTool("toolName", ctx)
  if err != nil {
    log.Fatalf("Failed to load tools: %v", err)
  }

  // Fetch the tool's input schema
  inputschema, err := tool.InputSchema()
  if err != nil {
    log.Fatalf("Failed to fetch inputSchema: %v", err)
  }

  var paramsSchema map[string]any
  _ = json.Unmarshal(inputschema, &paramsSchema)

  // Use this tool with LangChainGo
  langChainTool := llms.Tool{
    Type: "function",
    Function: &llms.FunctionDefinition{
      Name:        tool.Name(),
      Description: tool.Description(),
      Parameters:  paramsSchema,
    },
  }
  _ = langChainTool
}
{{< /highlight >}}

{{% /tab %}}
{{% tab header="Genkit Go" lang="en" %}}

{{< highlight go >}}
package main

import (
  "context"
  "log"

  "github.com/firebase/genkit/go/genkit"
  "github.com/googleapis/mcp-toolbox-sdk-go/core"
  "github.com/googleapis/mcp-toolbox-sdk-go/tbgenkit"
)

func main() {
  // Make sure to add the error checks
  // Update the url to point to your server
  URL := "http://127.0.0.1:5000"
  ctx := context.Background()
  g, err := genkit.Init(ctx)

  client, err := core.NewToolboxClient(URL)
  if err != nil {
    log.Fatalf("Failed to create Toolbox client: %v", err)
  }

  // Framework agnostic tool
  tool, err := client.LoadTool("toolName", ctx)
  if err != nil {
    log.Fatalf("Failed to load tools: %v", err)
  }

  // Convert the tool using the tbgenkit package
  // Use this tool with Genkit Go
  genkitTool, err := tbgenkit.ToGenkitTool(tool, g)
  if err != nil {
    log.Fatalf("Failed to convert tool: %v\n", err)
  }
  _ = genkitTool
}
{{< /highlight >}}

{{% /tab %}}
{{% tab header="Go GenAI" lang="en" %}}

{{< highlight go >}}
package main

import (
  "context"
  "encoding/json"
  "log"

  "github.com/googleapis/mcp-toolbox-sdk-go/core"
  "google.golang.org/genai"
)

func main() {
  // Make sure to add the error checks
  // Update the url to point to your server
  URL := "http://127.0.0.1:5000"
  ctx := context.Background()

  client, err := core.NewToolboxClient(URL)
  if err != nil {
    log.Fatalf("Failed to create Toolbox client: %v", err)
  }

  // Framework agnostic tool
  tool, err := client.LoadTool("toolName", ctx)
  if err != nil {
    log.Fatalf("Failed to load tools: %v", err)
  }

  // Fetch the tool's input schema
  inputschema, err := tool.InputSchema()
  if err != nil {
    log.Fatalf("Failed to fetch inputSchema: %v", err)
  }

  var schema *genai.Schema
  _ = json.Unmarshal(inputschema, &schema)

  funcDeclaration := &genai.FunctionDeclaration{
    Name:        tool.Name(),
    Description: tool.Description(),
    Parameters:  schema,
  }

  // Use this tool with Go GenAI
  genAITool := &genai.Tool{
    FunctionDeclarations: []*genai.FunctionDeclaration{funcDeclaration},
  }
  _ = genAITool
}
{{< /highlight >}}

{{% /tab %}}

{{% tab header="OpenAI Go" lang="en" %}}
|
||||
|
||||
{{< highlight go >}}
|
||||
package main
|
||||
|
||||
import (
|
||||
"context"
|
||||
"encoding/json"
|
||||
"log"
|
||||
|
||||
"github.com/googleapis/mcp-toolbox-sdk-go/core"
|
||||
openai "github.com/openai/openai-go"
|
||||
)
|
||||
|
||||
func main() {
|
||||
// Make sure to add the error checks
|
||||
// Update the url to point to your server
|
||||
URL := "http://127.0.0.1:5000"
|
||||
ctx := context.Background()
|
||||
|
||||
client, err := core.NewToolboxClient(URL)
|
||||
if err != nil {
|
||||
log.Fatalf("Failed to create Toolbox client: %v", err)
|
||||
}
|
||||
|
||||
// Framework agnostic tool
|
||||
tool, err := client.LoadTool("toolName", ctx)
|
||||
if err != nil {
|
||||
log.Fatalf("Failed to load tools: %v", err)
|
||||
}
|
||||
|
||||
// Fetch the tool's input schema
|
||||
inputschema, err := tool.InputSchema()
|
||||
if err != nil {
|
||||
log.Fatalf("Failed to fetch inputSchema: %v", err)
|
||||
}
|
||||
|
||||
var paramsSchema openai.FunctionParameters
|
||||
_ = json.Unmarshal(inputschema, ¶msSchema)
|
||||
|
||||
// Use this tool with OpenAI Go
|
||||
openAITool := openai.ChatCompletionToolParam{
|
||||
Function: openai.FunctionDefinitionParam{
|
||||
Name: tool.Name(),
|
||||
Description: openai.String(tool.Description()),
|
||||
Parameters: paramsSchema,
|
||||
},
|
||||
}
|
||||
}
|
||||
{{< /highlight >}}
|
||||
|
||||
{{% /tab %}}
|
||||
{{< /tabpane >}}
|
||||
|
||||
For more detailed instructions on using the Toolbox Go SDK, see the
|
||||
[project's README](https://github.com/googleapis/mcp-toolbox-sdk-go/blob/main/core/README.md).
|
||||
|
||||
For end-to-end samples on using the Toolbox Go SDK with orchestration frameworks, see the [project's samples](https://github.com/googleapis/mcp-toolbox-sdk-go/tree/main/core/samples)
|
||||
|
||||
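Each Go tab above round-trips the tool's JSON input schema through `encoding/json` before handing it to a framework. That schema is plain JSON-Schema-shaped data, so the same plumbing works in any language; here is a small framework-free sketch (the schema literal is a made-up placeholder, not output from a real tool):

```python
import json

# Hypothetical input schema, standing in for the bytes returned by
# tool.InputSchema() in the Go examples (real output may differ).
raw_schema = b"""
{
  "type": "object",
  "properties": {
    "name": {"type": "string", "description": "The name of the hotel."}
  },
  "required": ["name"]
}
"""

# Decode into a generic mapping, the same role paramsSchema plays above.
params_schema = json.loads(raw_schema)

print(params_schema["type"])              # object
print(list(params_schema["properties"]))  # ['name']
print(params_schema["required"])          # ['name']
```

The decoded mapping is what a function-calling framework consumes as the tool's parameter declaration.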
Binary file not shown.
Before Width: | Height: | Size: 143 KiB After Width: | Height: | Size: 154 KiB
@@ -1,11 +1,10 @@
---
title: "Quickstart (Local)"
title: "Python Quickstart (Local)"
type: docs
weight: 2
description: >
  How to get started running Toolbox locally with Python, PostgreSQL, and
  [GoogleGenAI](https://pypi.org/project/google-genai/),
  [LangGraph](https://www.langchain.com/langgraph), [LlamaIndex](https://www.llamaindex.ai/) or [Agent Development Kit](https://google.github.io/adk-docs/).
  How to get started running Toolbox locally with [Python](https://github.com/googleapis/mcp-toolbox-sdk-python), PostgreSQL, and [Agent Development Kit](https://google.github.io/adk-docs/),
  [LangGraph](https://www.langchain.com/langgraph), [LlamaIndex](https://www.llamaindex.ai/) or [GoogleGenAI](https://pypi.org/project/google-genai/).
---

[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/googleapis/genai-toolbox/blob/main/docs/en/getting-started/colab_quickstart.ipynb)
@@ -15,8 +14,21 @@ description: >
This guide assumes you have already done the following:

1. Installed [Python 3.9+][install-python] (including [pip][install-pip] and
   your preferred virtual environment tool for managing dependencies e.g. [venv][install-venv])
1. Installed [PostgreSQL 16+ and the `psql` client][install-postgres]
   your preferred virtual environment tool for managing dependencies e.g. [venv][install-venv]).
1. Installed [PostgreSQL 16+ and the `psql` client][install-postgres].

### Cloud Setup (Optional)

If you plan to use **Google Cloud’s Vertex AI** with your agent (e.g., using `vertexai=True` or a Google GenAI model), follow these one-time setup steps for local development:

1. [Install the Google Cloud CLI](https://cloud.google.com/sdk/docs/install)
1. [Set up Application Default Credentials (ADC)](https://cloud.google.com/docs/authentication/set-up-adc-local-dev-environment)
1. Set your project and enable Vertex AI

   ```bash
   gcloud config set project YOUR_PROJECT_ID
   gcloud services enable aiplatform.googleapis.com
   ```

[install-python]: https://wiki.python.org/moin/BeginnersGuide/Download
[install-pip]: https://pip.pypa.io/en/stable/installation/
@@ -26,7 +38,7 @@ This guide assumes you have already done the following:
## Step 1: Set up your database

In this section, we will create a database, insert some data that needs to be
access by our agent, and create a database user for Toolbox to connect with.
accessed by our agent, and create a database user for Toolbox to connect with.

1. Connect to postgres using the `psql` command:

@@ -36,6 +48,43 @@ access by our agent, and create a database user for Toolbox to connect with.

   Here, `postgres` denotes the default postgres superuser.

   {{< notice info >}}

   #### **Having trouble connecting?**

   * **Password Prompt:** If you are prompted for a password for the `postgres`
     user and do not know it (or a blank password doesn't work), your PostgreSQL
     installation might require a password or a different authentication method.
   * **`FATAL: role "postgres" does not exist`:** This error means the default
     `postgres` superuser role isn't available under that name on your system.
   * **`Connection refused`:** Ensure your PostgreSQL server is actually running.
     You can typically check with `sudo systemctl status postgresql` and start it
     with `sudo systemctl start postgresql` on Linux systems.

   <br/>

   #### **Common Solution**

   For password issues or if the `postgres` role seems inaccessible directly, try
   switching to the `postgres` operating system user first. This user often has
   permission to connect without a password for local connections (this is called
   peer authentication).

   ```bash
   sudo -i -u postgres
   psql -h 127.0.0.1
   ```

   Once you are in the `psql` shell using this method, you can proceed with the
   database creation steps below. Afterwards, type `\q` to exit `psql`, and then
   `exit` to return to your normal user shell.

   If desired, once connected to `psql` as the `postgres` OS user, you can set a
   password for the `postgres` *database* user using: `ALTER USER postgres WITH
   PASSWORD 'your_chosen_password';`. This would allow direct connection with `-U
   postgres` and a password next time.
   {{< /notice >}}

1. Create a new database and a new user:

   {{< notice tip >}}
@@ -58,6 +107,10 @@ access by our agent, and create a database user for Toolbox to connect with.
   \q
   ```

   (If you used `sudo -i -u postgres` and then `psql`, remember you might also
   need to type `exit` after `\q` to leave the `postgres` user's shell
   session.)

1. Connect to your database with your new user:

   ```bash
@@ -101,6 +154,7 @@ access by our agent, and create a database user for Toolbox to connect with.
   \q
   ```


## Step 2: Install and configure Toolbox

In this section, we will download Toolbox, configure our tools in a
@@ -116,7 +170,7 @@ In this section, we will download Toolbox, configure our tools in a
<!-- {x-release-please-start-version} -->
```bash
export OS="linux/amd64" # one of linux/amd64, darwin/arm64, darwin/amd64, or windows/amd64
curl -O https://storage.googleapis.com/genai-toolbox/v0.4.0/$OS/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.9.0/$OS/toolbox
```
<!-- {x-release-please-end} -->

@@ -215,8 +269,11 @@ In this section, we will download Toolbox, configure our tools in a
1. Run the Toolbox server, pointing to the `tools.yaml` file created earlier:

   ```bash
   ./toolbox --tools_file "tools.yaml"
   ./toolbox --tools-file "tools.yaml"
   ```
   {{< notice note >}}
   Toolbox enables dynamic reloading by default. To disable, use the `--disable-reload` flag.
   {{< /notice >}}

## Step 3: Connect your agent to Toolbox

@@ -231,13 +288,9 @@ you can connect to a
1. In a new terminal, install the SDK package.

   {{< tabpane persist=header >}}
   {{< tab header="Core" lang="bash" >}}

pip install toolbox-core
   {{< /tab >}}
   {{< tab header="ADK" lang="bash" >}}

pip install toolbox-langchain
pip install toolbox-core
   {{< /tab >}}
   {{< tab header="Langchain" lang="bash" >}}

@@ -247,18 +300,18 @@ pip install toolbox-langchain

pip install toolbox-llamaindex
   {{< /tab >}}
   {{< tab header="Core" lang="bash" >}}

pip install toolbox-core
   {{< /tab >}}
   {{< /tabpane >}}

1. Install other required dependencies:

   {{< tabpane persist=header >}}
   {{< tab header="Core" lang="bash" >}}

pip install google-genai
   {{< /tab >}}
   {{< tab header="ADK" lang="bash" >}}

pip install google-adk langchain
pip install google-adk
   {{< /tab >}}
   {{< tab header="Langchain" lang="bash" >}}

@@ -279,192 +332,90 @@ pip install llama-index-llms-google-genai

# pip install llama-index-llms-anthropic

   {{< /tab >}}
   {{< tab header="Core" lang="bash" >}}

pip install google-genai
   {{< /tab >}}
   {{< /tabpane >}}

1. Create a new file named `hotel_agent.py` and copy the following
|
||||
code to create an agent:
|
||||
{{< tabpane persist=header >}}
|
||||
{{< tab header="Core" lang="python" >}}
|
||||
import asyncio
|
||||
|
||||
from google import genai
|
||||
from google.genai.types import (
|
||||
Content,
|
||||
FunctionDeclaration,
|
||||
GenerateContentConfig,
|
||||
Part,
|
||||
Tool,
|
||||
)
|
||||
|
||||
from toolbox_core import ToolboxClient
|
||||
|
||||
prompt = """
|
||||
You're a helpful hotel assistant. You handle hotel searching, booking and
|
||||
cancellations. When the user searches for a hotel, mention it's name, id,
|
||||
location and price tier. Always mention hotel id while performing any
|
||||
searches. This is very important for any operations. For any bookings or
|
||||
cancellations, please provide the appropriate confirmation. Be sure to
|
||||
update checkin or checkout dates if mentioned by the user.
|
||||
Don't ask for confirmations from the user.
|
||||
"""
|
||||
|
||||
queries = [
|
||||
"Find hotels in Basel with Basel in it's name.",
|
||||
"Please book the hotel Hilton Basel for me.",
|
||||
"This is too expensive. Please cancel it.",
|
||||
"Please book Hyatt Regency for me",
|
||||
"My check in dates for my booking would be from April 10, 2024 to April 19, 2024.",
|
||||
]
|
||||
|
||||
async def run_application():
|
||||
toolbox_client = ToolboxClient("<http://127.0.0.1:5000>")
|
||||
|
||||
# The toolbox_tools list contains Python callables (functions/methods) designed for LLM tool-use
|
||||
# integration. While this example uses Google's genai client, these callables can be adapted for
|
||||
# various function-calling or agent frameworks. For easier integration with supported frameworks
|
||||
# (https://github.com/googleapis/mcp-toolbox-python-sdk/tree/main/packages), use the
|
||||
# provided wrapper packages, which handle framework-specific boilerplate.
|
||||
toolbox_tools = await toolbox_client.load_toolset("my-toolset")
|
||||
genai_client = genai.Client(
|
||||
vertexai=True, project="project-id", location="us-central1"
|
||||
)
|
||||
|
||||
genai_tools = [
|
||||
Tool(
|
||||
function_declarations=[
|
||||
FunctionDeclaration.from_callable_with_api_option(callable=tool)
|
||||
]
|
||||
)
|
||||
for tool in toolbox_tools
|
||||
]
|
||||
history = []
|
||||
for query in queries:
|
||||
user_prompt_content = Content(
|
||||
role="user",
|
||||
parts=[Part.from_text(text=query)],
|
||||
)
|
||||
history.append(user_prompt_content)
|
||||
|
||||
response = genai_client.models.generate_content(
|
||||
model="gemini-2.0-flash",
|
||||
contents=history,
|
||||
config=GenerateContentConfig(
|
||||
system_instruction=prompt,
|
||||
tools=genai_tools,
|
||||
),
|
||||
)
|
||||
history.append(response.candidates[0].content)
|
||||
function_response_parts = []
|
||||
for function_call in response.function_calls:
|
||||
fn_name = function_call.name
|
||||
# The tools are sorted alphabetically
|
||||
if fn_name == "search-hotels-by-name":
|
||||
function_result = await toolbox_tools[3](**function_call.args)
|
||||
elif fn_name == "search-hotels-by-location":
|
||||
function_result = await toolbox_tools[2](**function_call.args)
|
||||
elif fn_name == "book-hotel":
|
||||
function_result = await toolbox_tools[0](**function_call.args)
|
||||
elif fn_name == "update-hotel":
|
||||
function_result = await toolbox_tools[4](**function_call.args)
|
||||
elif fn_name == "cancel-hotel":
|
||||
function_result = await toolbox_tools[1](**function_call.args)
|
||||
else:
|
||||
raise ValueError("Function name not present.")
|
||||
function_response = {"result": function_result}
|
||||
function_response_part = Part.from_function_response(
|
||||
name=function_call.name,
|
||||
response=function_response,
|
||||
)
|
||||
function_response_parts.append(function_response_part)
|
||||
|
||||
if function_response_parts:
|
||||
tool_response_content = Content(role="tool", parts=function_response_parts)
|
||||
history.append(tool_response_content)
|
||||
|
||||
response2 = genai_client.models.generate_content(
|
||||
model="gemini-2.0-flash-001",
|
||||
contents=history,
|
||||
config=GenerateContentConfig(
|
||||
tools=genai_tools,
|
||||
),
|
||||
)
|
||||
final_model_response_content = response2.candidates[0].content
|
||||
history.append(final_model_response_content)
|
||||
print(response2.text)
|
||||
|
||||
asyncio.run(run_application())
|
||||
|
||||
{{< /tab >}}
|
||||
{{< tab header="ADK" lang="python" >}}
|
||||
from google.adk.agents import Agent
|
||||
from google.adk.tools.toolbox_tool import ToolboxTool
|
||||
from google.adk.runners import Runner
|
||||
from google.adk.sessions import InMemorySessionService
|
||||
from google.adk.artifacts.in_memory_artifact_service import InMemoryArtifactService
|
||||
from google.genai import types
|
||||
from toolbox_core import ToolboxSyncClient
|
||||
|
||||
import asyncio
|
||||
import os
|
||||
|
||||
# TODO(developer): replace this with your Google API key
|
||||
|
||||
os.environ['GOOGLE_API_KEY'] = 'your-api-key'
|
||||
|
||||
toolbox_tools = ToolboxTool("<http://127.0.0.1:5000>")
|
||||
async def main():
|
||||
with ToolboxSyncClient("<http://127.0.0.1:5000>") as toolbox_client:
|
||||
|
||||
prompt = """
|
||||
You're a helpful hotel assistant. You handle hotel searching, booking and
|
||||
cancellations. When the user searches for a hotel, mention it's name, id,
|
||||
location and price tier. Always mention hotel ids while performing any
|
||||
searches. This is very important for any operations. For any bookings or
|
||||
cancellations, please provide the appropriate confirmation. Be sure to
|
||||
update checkin or checkout dates if mentioned by the user.
|
||||
Don't ask for confirmations from the user.
|
||||
"""
|
||||
|
||||
root_agent = Agent(
|
||||
model='gemini-2.0-flash-001',
|
||||
name='hotel_agent',
|
||||
description='A helpful AI assistant.',
|
||||
instruction=prompt,
|
||||
tools=toolbox_client.load_toolset("my-toolset"),
|
||||
)
|
||||
|
||||
session_service = InMemorySessionService()
|
||||
artifacts_service = InMemoryArtifactService()
|
||||
session = await session_service.create_session(
|
||||
state={}, app_name='hotel_agent', user_id='123'
|
||||
)
|
||||
runner = Runner(
|
||||
app_name='hotel_agent',
|
||||
agent=root_agent,
|
||||
artifact_service=artifacts_service,
|
||||
session_service=session_service,
|
||||
)
|
||||
|
||||
queries = [
|
||||
"Find hotels in Basel with Basel in its name.",
|
||||
"Can you book the Hilton Basel for me?",
|
||||
"Oh wait, this is too expensive. Please cancel it and book the Hyatt Regency instead.",
|
||||
"My check-in dates would be from April 10, 2024 to April 19, 2024.",
|
||||
]
|
||||
|
||||
for query in queries:
|
||||
content = types.Content(role='user', parts=[types.Part(text=query)])
|
||||
events = runner.run(session_id=session.id,
|
||||
user_id='123', new_message=content)
|
||||
|
||||
responses = (
|
||||
part.text
|
||||
for event in events
|
||||
for part in event.content.parts
|
||||
if part.text is not None
|
||||
)
|
||||
|
||||
for text in responses:
|
||||
print(text)
|
||||
|
||||
asyncio.run(main())
|
||||
{{< /tab >}}
|
||||
{{< tab header="LangChain" lang="python" >}}
|
||||
import asyncio
|
||||
|
||||
from langgraph.prebuilt import create_react_agent
|
||||
|
||||
@@ -499,21 +450,21 @@ queries = [
|
||||
|
||||
async def run_application():
|
||||
# TODO(developer): replace this with another model if needed
|
||||
model = ChatVertexAI(model_name="gemini-2.0-flash-001")
|
||||
# model = ChatGoogleGenerativeAI(model="gemini-2.0-flash-001")
|
||||
# model = ChatAnthropic(model="claude-3-5-sonnet-20240620")
|
||||
|
||||
# Load the tools from the Toolbox server
|
||||
async with ToolboxClient("http://127.0.0.1:5000") as client:
|
||||
tools = await client.aload_toolset()
|
||||
|
||||
agent = create_react_agent(model, tools, checkpointer=MemorySaver())
|
||||
|
||||
config = {"configurable": {"thread_id": "thread-1"}}
|
||||
for query in queries:
|
||||
inputs = {"messages": [("user", prompt + query)]}
|
||||
response = agent.invoke(inputs, stream_mode="values", config=config)
|
||||
print(response["messages"][-1].content)
|
||||
|
||||
asyncio.run(run_application())
|
||||
{{< /tab >}}
|
||||
@@ -553,12 +504,12 @@ queries = [
|
||||
async def run_application():
|
||||
# TODO(developer): replace this with another model if needed
|
||||
llm = GoogleGenAI(
|
||||
model="gemini-2.0-flash-001",
|
||||
vertexai_config={"project": "project-id", "location": "us-central1"},
|
||||
)
|
||||
# llm = GoogleGenAI(
|
||||
# api_key=os.getenv("GOOGLE_API_KEY"),
|
||||
# model="gemini-2.0-flash-001",
|
||||
# )
|
||||
# llm = Anthropic(
|
||||
# model="claude-3-7-sonnet-latest",
|
||||
@@ -566,38 +517,153 @@ async def run_application():
|
||||
# )
|
||||
|
||||
# Load the tools from the Toolbox server
|
||||
async with ToolboxClient("http://127.0.0.1:5000") as client:
|
||||
tools = await client.aload_toolset()
|
||||
|
||||
agent = AgentWorkflow.from_tools_or_functions(
|
||||
tools,
|
||||
llm=llm,
|
||||
system_prompt=prompt,
|
||||
)
|
||||
ctx = Context(agent)
|
||||
for query in queries:
|
||||
response = await agent.run(user_msg=query, ctx=ctx)
|
||||
print(f"---- {query} ----")
|
||||
print(str(response))
|
||||
|
||||
asyncio.run(run_application())
|
||||
{{< /tab >}}
|
||||
{{< tab header="Core" lang="python" >}}
|
||||
import asyncio
|
||||
|
||||
from google import genai
|
||||
from google.genai.types import (
|
||||
Content,
|
||||
FunctionDeclaration,
|
||||
GenerateContentConfig,
|
||||
Part,
|
||||
Tool,
|
||||
)
|
||||
|
||||
from toolbox_core import ToolboxClient
|
||||
|
||||
prompt = """
|
||||
You're a helpful hotel assistant. You handle hotel searching, booking and
|
||||
cancellations. When the user searches for a hotel, mention its name, id,
|
||||
location and price tier. Always mention hotel id while performing any
|
||||
searches. This is very important for any operations. For any bookings or
|
||||
cancellations, please provide the appropriate confirmation. Be sure to
|
||||
update checkin or checkout dates if mentioned by the user.
|
||||
Don't ask for confirmations from the user.
|
||||
"""
|
||||
|
||||
queries = [
|
||||
"Find hotels in Basel with Basel in its name.",
|
||||
"Please book the hotel Hilton Basel for me.",
|
||||
"This is too expensive. Please cancel it.",
|
||||
"Please book Hyatt Regency for me",
|
||||
"My check-in dates for my booking would be from April 10, 2024 to April 19, 2024.",
|
||||
]
|
||||
|
||||
async def run_application():
|
||||
async with ToolboxClient("http://127.0.0.1:5000") as toolbox_client:
|
||||
|
||||
# The toolbox_tools list contains Python callables (functions/methods) designed for LLM tool-use
|
||||
# integration. While this example uses Google's genai client, these callables can be adapted for
|
||||
# various function-calling or agent frameworks. For easier integration with supported frameworks
|
||||
# (https://github.com/googleapis/mcp-toolbox-python-sdk/tree/main/packages), use the
|
||||
# provided wrapper packages, which handle framework-specific boilerplate.
|
||||
toolbox_tools = await toolbox_client.load_toolset("my-toolset")
|
||||
genai_client = genai.Client(
|
||||
vertexai=True, project="project-id", location="us-central1"
|
||||
)
|
||||
|
||||
genai_tools = [
|
||||
Tool(
|
||||
function_declarations=[
|
||||
FunctionDeclaration.from_callable_with_api_option(callable=tool)
|
||||
]
|
||||
)
|
||||
for tool in toolbox_tools
|
||||
]
|
||||
history = []
|
||||
for query in queries:
|
||||
user_prompt_content = Content(
|
||||
role="user",
|
||||
parts=[Part.from_text(text=query)],
|
||||
)
|
||||
history.append(user_prompt_content)
|
||||
|
||||
response = genai_client.models.generate_content(
|
||||
model="gemini-2.0-flash-001",
|
||||
contents=history,
|
||||
config=GenerateContentConfig(
|
||||
system_instruction=prompt,
|
||||
tools=genai_tools,
|
||||
),
|
||||
)
|
||||
history.append(response.candidates[0].content)
|
||||
function_response_parts = []
|
||||
for function_call in response.function_calls:
|
||||
fn_name = function_call.name
|
||||
# The tools are sorted alphabetically
|
||||
if fn_name == "search-hotels-by-name":
|
||||
function_result = await toolbox_tools[3](**function_call.args)
|
||||
elif fn_name == "search-hotels-by-location":
|
||||
function_result = await toolbox_tools[2](**function_call.args)
|
||||
elif fn_name == "book-hotel":
|
||||
function_result = await toolbox_tools[0](**function_call.args)
|
||||
elif fn_name == "update-hotel":
|
||||
function_result = await toolbox_tools[4](**function_call.args)
|
||||
elif fn_name == "cancel-hotel":
|
||||
function_result = await toolbox_tools[1](**function_call.args)
|
||||
else:
|
||||
raise ValueError("Function name not present.")
|
||||
function_response = {"result": function_result}
|
||||
function_response_part = Part.from_function_response(
|
||||
name=function_call.name,
|
||||
response=function_response,
|
||||
)
|
||||
function_response_parts.append(function_response_part)
|
||||
|
||||
if function_response_parts:
|
||||
tool_response_content = Content(role="tool", parts=function_response_parts)
|
||||
history.append(tool_response_content)
|
||||
|
||||
response2 = genai_client.models.generate_content(
|
||||
model="gemini-2.0-flash-001",
|
||||
contents=history,
|
||||
config=GenerateContentConfig(
|
||||
tools=genai_tools,
|
||||
),
|
||||
)
|
||||
final_model_response_content = response2.candidates[0].content
|
||||
history.append(final_model_response_content)
|
||||
print(response2.text)
|
||||
|
||||
asyncio.run(run_application())
|
||||
|
||||
{{< /tab >}}
|
||||
{{< /tabpane >}}
|
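In the Core example above, tool calls are dispatched by list index, with a comment noting the tools are sorted alphabetically. That breaks silently whenever the toolset changes. A name-keyed lookup is sturdier; the following is an illustrative sketch, not part of the SDK, and it assumes each loaded tool callable exposes its name via `__name__` (as plain Python functions do):

```python
def build_dispatch(tools):
    """Map tool name -> callable so lookups don't depend on list order."""
    return {tool.__name__: tool for tool in tools}

async def call_tool(dispatch, function_call):
    """Look up and await the tool named by a model function call."""
    tool = dispatch.get(function_call.name)
    if tool is None:
        raise ValueError(f"Unknown tool: {function_call.name}")
    return await tool(**function_call.args)
```

With a dispatch table like this, the chain of `if fn_name == ...` branches collapses to a single `await call_tool(dispatch, function_call)`.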
||||
|
||||
{{< tabpane text=true persist=header >}}
|
||||
{{% tab header="ADK" lang="en" %}}
|
||||
To learn more about Agent Development Kit, check out the [ADK documentation.](https://google.github.io/adk-docs/)
|
||||
{{% /tab %}}
|
||||
{{% tab header="Langchain" lang="en" %}}
|
||||
To learn more about Agents in LangChain, check out the [LangGraph Agent documentation.](https://langchain-ai.github.io/langgraph/reference/prebuilt/#langgraph.prebuilt.chat_agent_executor.create_react_agent)
|
||||
{{% /tab %}}
|
||||
{{% tab header="LlamaIndex" lang="en" %}}
|
||||
To learn more about Agents in LlamaIndex, check out the
|
||||
[LlamaIndex AgentWorkflow documentation.](https://docs.llamaindex.ai/en/stable/examples/agent/agent_workflow_basic/)
|
||||
{{% /tab %}}
|
||||
{{% tab header="Core" lang="en" %}}
|
||||
To learn more about tool calling with Google GenAI, check out the
|
||||
[Google GenAI
|
||||
Documentation](https://github.com/googleapis/python-genai?tab=readme-ov-file#manually-declare-and-invoke-a-function-for-function-calling).
|
||||
{{% /tab %}}
|
||||
{{< /tabpane >}}
|
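Each query in the Core example runs the same two-phase turn: one model call that may request tools, tool execution, then a follow-up model call so the model can phrase the final answer. Stripped of the GenAI types, the shape of that loop is roughly the sketch below, where `generate` and `run_tool` are hypothetical stand-ins for your model client and tool runner:

```python
def run_turn(history, query, generate, run_tool):
    """One conversational turn: model call, optional tool round-trip, final answer."""
    history.append({"role": "user", "text": query})
    first = generate(history)          # model may request tool calls here
    history.append(first)
    tool_parts = [run_tool(call) for call in first.get("function_calls", [])]
    if not tool_parts:
        return first["text"]           # no tools requested; answer directly
    history.append({"role": "tool", "parts": tool_parts})
    final = generate(history)          # model phrases the final answer
    history.append(final)
    return final["text"]
```

Keeping every user, model, and tool message in `history` is what lets follow-up queries such as "cancel it" resolve correctly.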
||||
|
||||
@@ -606,3 +672,7 @@ To learn more about Agents in LlamaIndex, check out the
|
||||
```sh
|
||||
python hotel_agent.py
|
||||
```
|
||||
|
||||
{{< notice info >}}
|
||||
For more information, visit the [Python SDK repo](https://github.com/googleapis/mcp-toolbox-sdk-python).
|
||||
{{</ notice >}}
|
||||
|
||||
921
docs/en/getting-started/local_quickstart_go.md
Normal file
@@ -0,0 +1,921 @@
|
||||
---
|
||||
title: "Go Quickstart (Local)"
|
||||
type: docs
|
||||
weight: 4
|
||||
description: >
|
||||
How to get started running Toolbox locally with [Go](https://github.com/googleapis/mcp-toolbox-sdk-go), PostgreSQL, and orchestration frameworks such as [LangChain Go](https://tmc.github.io/langchaingo/docs/), [GenkitGo](https://genkit.dev/go/docs/get-started-go/), [Go GenAI](https://github.com/googleapis/go-genai) and [OpenAI Go](https://github.com/openai/openai-go).
|
||||
---
|
||||
|
||||
## Before you begin
|
||||
|
||||
This guide assumes you have already done the following:
|
||||
|
||||
1. Installed [Go (v1.24.2 or higher)].
|
||||
1. Installed [PostgreSQL 16+ and the `psql` client][install-postgres].
|
||||
|
||||
### Cloud Setup (Optional)
|
||||
|
||||
If you plan to use **Google Cloud’s Vertex AI** with your agent (e.g., using Gemini or PaLM models), follow these one-time setup steps:
|
||||
|
||||
1. [Install the Google Cloud CLI]
|
||||
1. [Set up Application Default Credentials (ADC)]
|
||||
1. Set your project and enable Vertex AI
|
||||
|
||||
```bash
|
||||
gcloud config set project YOUR_PROJECT_ID
|
||||
gcloud services enable aiplatform.googleapis.com
|
||||
```
|
||||
|
||||
[Go (v1.24.2 or higher)]: https://go.dev/doc/install
|
||||
[install-postgres]: https://www.postgresql.org/download/
|
||||
[Install the Google Cloud CLI]: https://cloud.google.com/sdk/docs/install
|
||||
[Set up Application Default Credentials (ADC)]: https://cloud.google.com/docs/authentication/set-up-adc-local-dev-environment
|
||||
|
||||
|
||||
## Step 1: Set up your database
|
||||
|
||||
In this section, we will create a database, insert some data that needs to be
|
||||
accessed by our agent, and create a database user for Toolbox to connect with.
|
||||
|
||||
1. Connect to postgres using the `psql` command:
|
||||
|
||||
```bash
|
||||
psql -h 127.0.0.1 -U postgres
|
||||
```
|
||||
|
||||
Here, `postgres` denotes the default postgres superuser.
|
||||
|
||||
{{< notice info >}}
|
||||
|
||||
#### **Having trouble connecting?**
|
||||
|
||||
* **Password Prompt:** If you are prompted for a password for the `postgres`
|
||||
user and do not know it (or a blank password doesn't work), your PostgreSQL
|
||||
installation might require a password or a different authentication method.
|
||||
* **`FATAL: role "postgres" does not exist`:** This error means the default
|
||||
`postgres` superuser role isn't available under that name on your system.
|
||||
* **`Connection refused`:** Ensure your PostgreSQL server is actually running.
|
||||
You can typically check with `sudo systemctl status postgresql` and start it
|
||||
with `sudo systemctl start postgresql` on Linux systems.
|
||||
|
||||
<br/>
|
||||
|
||||
#### **Common Solution**
|
||||
|
||||
For password issues or if the `postgres` role seems inaccessible directly, try
|
||||
switching to the `postgres` operating system user first. This user often has
|
||||
permission to connect without a password for local connections (this is called
|
||||
peer authentication).
|
||||
|
||||
```bash
|
||||
sudo -i -u postgres
|
||||
psql -h 127.0.0.1
|
||||
```
|
||||
|
||||
Once you are in the `psql` shell using this method, you can proceed with the
|
||||
database creation steps below. Afterwards, type `\q` to exit `psql`, and then
|
||||
`exit` to return to your normal user shell.
|
||||
|
||||
If desired, once connected to `psql` as the `postgres` OS user, you can set a
|
||||
password for the `postgres` *database* user using: `ALTER USER postgres WITH
|
||||
PASSWORD 'your_chosen_password';`. This would allow direct connection with `-U
|
||||
postgres` and a password next time.
|
||||
{{< /notice >}}
|
||||
|
||||
1. Create a new database and a new user:
|
||||
|
||||
{{< notice tip >}}
|
||||
For a real application, it's best to follow the principle of least privilege
|
||||
and only grant the privileges your application needs.
|
||||
{{< /notice >}}
|
||||
|
||||
```sql
|
||||
CREATE USER toolbox_user WITH PASSWORD 'my-password';
|
||||
|
||||
CREATE DATABASE toolbox_db;
|
||||
GRANT ALL PRIVILEGES ON DATABASE toolbox_db TO toolbox_user;
|
||||
|
||||
ALTER DATABASE toolbox_db OWNER TO toolbox_user;
|
||||
```
|
||||
|
||||
1. End the database session:
|
||||
|
||||
```bash
|
||||
\q
|
||||
```
|
||||
|
||||
(If you used `sudo -i -u postgres` and then `psql`, remember you might also
|
||||
need to type `exit` after `\q` to leave the `postgres` user's shell
|
||||
session.)
|
||||
|
||||
1. Connect to your database with your new user:
|
||||
|
||||
```bash
|
||||
psql -h 127.0.0.1 -U toolbox_user -d toolbox_db
|
||||
```
|
||||
|
||||
1. Create a table using the following command:
|
||||
|
||||
```sql
|
||||
CREATE TABLE hotels(
|
||||
id INTEGER NOT NULL PRIMARY KEY,
|
||||
name VARCHAR NOT NULL,
|
||||
location VARCHAR NOT NULL,
|
||||
price_tier VARCHAR NOT NULL,
|
||||
checkin_date DATE NOT NULL,
|
||||
checkout_date DATE NOT NULL,
|
||||
booked BIT NOT NULL
|
||||
);
|
||||
```
|
||||
|
||||
1. Insert data into the table.
|
||||
|
||||
```sql
|
||||
INSERT INTO hotels(id, name, location, price_tier, checkin_date, checkout_date, booked)
|
||||
VALUES
|
||||
(1, 'Hilton Basel', 'Basel', 'Luxury', '2024-04-22', '2024-04-20', B'0'),
|
||||
(2, 'Marriott Zurich', 'Zurich', 'Upscale', '2024-04-14', '2024-04-21', B'0'),
|
||||
(3, 'Hyatt Regency Basel', 'Basel', 'Upper Upscale', '2024-04-02', '2024-04-20', B'0'),
|
||||
(4, 'Radisson Blu Lucerne', 'Lucerne', 'Midscale', '2024-04-24', '2024-04-05', B'0'),
|
||||
(5, 'Best Western Bern', 'Bern', 'Upper Midscale', '2024-04-23', '2024-04-01', B'0'),
|
||||
(6, 'InterContinental Geneva', 'Geneva', 'Luxury', '2024-04-23', '2024-04-28', B'0'),
|
||||
(7, 'Sheraton Zurich', 'Zurich', 'Upper Upscale', '2024-04-27', '2024-04-02', B'0'),
|
||||
(8, 'Holiday Inn Basel', 'Basel', 'Upper Midscale', '2024-04-24', '2024-04-09', B'0'),
|
||||
(9, 'Courtyard Zurich', 'Zurich', 'Upscale', '2024-04-03', '2024-04-13', B'0'),
|
||||
(10, 'Comfort Inn Bern', 'Bern', 'Midscale', '2024-04-04', '2024-04-16', B'0');
|
||||
```
|
||||
|
||||
1. End the database session:
|
||||
|
||||
```bash
|
||||
\q
|
||||
```
|
||||
|
||||
## Step 2: Install and configure Toolbox
|
||||
|
||||
In this section, we will download Toolbox, configure our tools in a
|
||||
`tools.yaml`, and then run the Toolbox server.
|
||||
|
||||
1. Download the latest version of Toolbox as a binary:
|
||||
|
||||
{{< notice tip >}}
|
||||
Select the
|
||||
[correct binary](https://github.com/googleapis/genai-toolbox/releases)
|
||||
corresponding to your OS and CPU architecture.
|
||||
{{< /notice >}}
|
||||
<!-- {x-release-please-start-version} -->
|
||||
```bash
|
||||
export OS="linux/amd64" # one of linux/amd64, darwin/arm64, darwin/amd64, or windows/amd64
|
||||
curl -O https://storage.googleapis.com/genai-toolbox/v0.9.0/$OS/toolbox
|
||||
```
|
||||
<!-- {x-release-please-end} -->
|
||||
|
||||
1. Make the binary executable:
|
||||
|
||||
```bash
|
||||
chmod +x toolbox
|
||||
```
|
||||
|
||||
1. Write the following into a `tools.yaml` file. Be sure to update any fields
|
||||
such as `user`, `password`, or `database` that you may have customized in the
|
||||
previous step.
|
||||
|
||||
{{< notice tip >}}
|
||||
In practice, use environment variable replacement with the format ${ENV_NAME}
|
||||
instead of hardcoding your secrets into the configuration file.
|
||||
{{< /notice >}}
|
||||
|
||||
```yaml
|
||||
sources:
|
||||
my-pg-source:
|
||||
kind: postgres
|
||||
host: 127.0.0.1
|
||||
port: 5432
|
||||
database: toolbox_db
|
||||
user: ${USER_NAME}
|
||||
password: ${PASSWORD}
|
||||
tools:
|
||||
search-hotels-by-name:
|
||||
kind: postgres-sql
|
||||
source: my-pg-source
|
||||
description: Search for hotels based on name.
|
||||
parameters:
|
||||
- name: name
|
||||
type: string
|
||||
description: The name of the hotel.
|
||||
statement: SELECT * FROM hotels WHERE name ILIKE '%' || $1 || '%';
|
||||
search-hotels-by-location:
|
||||
kind: postgres-sql
|
||||
source: my-pg-source
|
||||
description: Search for hotels based on location.
|
||||
parameters:
|
||||
- name: location
|
||||
type: string
|
||||
description: The location of the hotel.
|
||||
statement: SELECT * FROM hotels WHERE location ILIKE '%' || $1 || '%';
|
||||
book-hotel:
|
||||
kind: postgres-sql
|
||||
source: my-pg-source
|
||||
description: >-
|
||||
Book a hotel by its ID. Returns NULL if the hotel is successfully booked and raises an error otherwise.
|
||||
parameters:
|
||||
- name: hotel_id
|
||||
type: string
|
||||
description: The ID of the hotel to book.
|
||||
statement: UPDATE hotels SET booked = B'1' WHERE id = $1;
|
||||
update-hotel:
|
||||
kind: postgres-sql
|
||||
source: my-pg-source
|
||||
description: >-
|
||||
Update a hotel's check-in and check-out dates by its ID. Returns a message
|
||||
indicating whether the hotel was successfully updated or not.
|
||||
parameters:
|
||||
- name: hotel_id
|
||||
type: string
|
||||
description: The ID of the hotel to update.
|
||||
- name: checkin_date
|
||||
type: string
|
||||
description: The new check-in date of the hotel.
|
||||
- name: checkout_date
|
||||
type: string
|
||||
description: The new check-out date of the hotel.
|
||||
statement: >-
|
||||
UPDATE hotels SET checkin_date = CAST($2 as date), checkout_date = CAST($3
|
||||
as date) WHERE id = $1;
|
||||
cancel-hotel:
|
||||
kind: postgres-sql
|
||||
source: my-pg-source
|
||||
description: Cancel a hotel by its ID.
|
||||
parameters:
|
||||
- name: hotel_id
|
||||
type: string
|
||||
description: The ID of the hotel to cancel.
|
||||
statement: UPDATE hotels SET booked = B'0' WHERE id = $1;
|
||||
toolsets:
|
||||
my-toolset:
|
||||
- search-hotels-by-name
|
||||
- search-hotels-by-location
|
||||
- book-hotel
|
||||
- update-hotel
|
||||
- cancel-hotel
|
||||
```
|
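The `${ENV_NAME}` placeholders recommended in the tip above are resolved from the server's environment when the file is loaded, so secrets never live in the YAML itself. As a rough illustration of the substitution rule (not Toolbox's actual implementation):

```python
import os
import re

def substitute_env(text):
    """Replace ${NAME} placeholders with values from the environment."""
    return re.sub(r"\$\{(\w+)\}", lambda m: os.environ.get(m.group(1), ""), text)
```

Export `USER_NAME` and `PASSWORD` in the shell that starts `./toolbox` so the `sources` block resolves to real credentials.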
||||
|
||||
For more info on tools, check out the `Resources` section of the docs.
|
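Both search tools match with `ILIKE '%' || $1 || '%'`, which is a case-insensitive substring test against the column value. In Python terms, the matching rule is:

```python
def ilike_contains(column_value, search_term):
    """Mirror of PostgreSQL's ILIKE '%' || $1 || '%': case-insensitive substring match."""
    return search_term.lower() in column_value.lower()
```

So a query for "basel" matches both "Hilton Basel" and "Hyatt Regency Basel".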
||||
|
||||
1. Run the Toolbox server, pointing to the `tools.yaml` file created earlier:
|
||||
|
||||
```bash
|
||||
./toolbox --tools-file "tools.yaml"
|
||||
```
|
||||
{{< notice note >}}
|
||||
Toolbox enables dynamic reloading by default. To disable, use the `--disable-reload` flag.
|
||||
{{< /notice >}}
|
||||
|
||||
## Step 3: Connect your agent to Toolbox
|
||||
|
||||
In this section, we will write and run an agent that will load the Tools
|
||||
from Toolbox.
|
||||
|
||||
1. Initialize a go module:
|
||||
|
||||
```bash
|
||||
go mod init main
|
||||
```
|
||||
|
||||
1. In a new terminal, install the [SDK](https://pkg.go.dev/github.com/googleapis/mcp-toolbox-sdk-go).
|
||||
|
||||
```bash
|
||||
go get github.com/googleapis/mcp-toolbox-sdk-go
|
||||
```
|
||||
|
||||
1. Create a new file named `hotelagent.go` and copy the following code to create an agent:
|
||||
|
||||
{{< tabpane persist=header >}}
|
||||
{{< tab header="LangChain Go" lang="go" >}}
|
||||
|
||||
package main
|
||||
|
||||
import (
|
||||
"context"
|
||||
"encoding/json"
|
||||
"fmt"
|
||||
"log"
|
||||
"os"
|
||||
|
||||
"github.com/googleapis/mcp-toolbox-sdk-go/core"
|
||||
"github.com/tmc/langchaingo/llms"
|
||||
"github.com/tmc/langchaingo/llms/googleai"
|
||||
)
|
||||
|
||||
// ConvertToLangchainTool converts a generic core.ToolboxTool into a LangChainGo llms.Tool.
|
||||
func ConvertToLangchainTool(toolboxTool *core.ToolboxTool) llms.Tool {
|
||||
|
||||
// Fetch the tool's input schema
|
||||
inputschema, err := toolboxTool.InputSchema()
|
||||
if err != nil {
|
||||
return llms.Tool{}
|
||||
}
|
||||
|
||||
var paramsSchema map[string]any
|
||||
_ = json.Unmarshal(inputschema, &paramsSchema)
|
||||
|
||||
// Convert into LangChain's llms.Tool
|
||||
return llms.Tool{
|
||||
Type: "function",
|
||||
Function: &llms.FunctionDefinition{
|
||||
Name: toolboxTool.Name(),
|
||||
Description: toolboxTool.Description(),
|
||||
Parameters: paramsSchema,
|
||||
},
|
||||
}
|
||||
}
|
||||
|
||||
const systemPrompt = `
|
||||
You're a helpful hotel assistant. You handle hotel searching, booking, and
|
||||
cancellations. When the user searches for a hotel, mention its name, id,
|
||||
location and price tier. Always mention hotel ids while performing any
|
||||
searches. This is very important for any operations. For any bookings or
|
||||
cancellations, please provide the appropriate confirmation. Be sure to
|
||||
update checkin or checkout dates if mentioned by the user.
|
||||
Don't ask for confirmations from the user.
|
||||
`
|
||||
|
||||
var queries = []string{
|
||||
"Find hotels in Basel with Basel in its name.",
|
||||
"Can you book the hotel Hilton Basel for me?",
|
||||
"Oh wait, this is too expensive. Please cancel it.",
|
||||
"Please book the Hyatt Regency instead.",
|
||||
"My check-in dates would be from April 10, 2024 to April 19, 2024.",
|
||||
}
|
||||
|
||||
func main() {
|
||||
genaiKey := os.Getenv("GOOGLE_API_KEY")
|
||||
toolboxURL := "http://localhost:5000"
|
||||
ctx := context.Background()
|
||||
|
||||
// Initialize the Google AI client (LLM).
|
||||
llm, err := googleai.New(ctx, googleai.WithAPIKey(genaiKey), googleai.WithDefaultModel("gemini-1.5-flash"))
|
||||
if err != nil {
|
||||
log.Fatalf("Failed to create Google AI client: %v", err)
|
||||
}
|
||||
|
||||
// Initialize the MCP Toolbox client.
|
||||
toolboxClient, err := core.NewToolboxClient(toolboxURL)
|
||||
if err != nil {
|
||||
log.Fatalf("Failed to create Toolbox client: %v", err)
|
||||
}
|
||||
|
||||
// Load the tool using the MCP Toolbox SDK.
|
||||
tools, err := toolboxClient.LoadToolset("my-toolset", ctx)
|
||||
if err != nil {
|
||||
log.Fatalf("Failed to load tools: %v\nMake sure your Toolbox server is running and the tool is configured.", err)
|
||||
}
|
||||
|
||||
toolsMap := make(map[string]*core.ToolboxTool, len(tools))
|
||||
|
||||
langchainTools := make([]llms.Tool, len(tools))
|
||||
// Convert the loaded ToolboxTools into the format LangChainGo requires.
|
||||
for i, tool := range tools {
|
||||
langchainTools[i] = ConvertToLangchainTool(tool)
|
||||
toolsMap[tool.Name()] = tool
|
||||
}
|
||||
|
||||
// Start the conversation history.
|
||||
messageHistory := []llms.MessageContent{
|
||||
llms.TextParts(llms.ChatMessageTypeSystem, systemPrompt),
|
||||
}
|
||||
|
||||
for _, query := range queries {
|
||||
messageHistory = append(messageHistory, llms.TextParts(llms.ChatMessageTypeHuman, query))
|
||||
|
||||
// Make the first call to the LLM, making it aware of the tool.
|
||||
resp, err := llm.GenerateContent(ctx, messageHistory, llms.WithTools(langchainTools))
|
||||
if err != nil {
|
||||
log.Fatalf("LLM call failed: %v", err)
|
||||
}
|
||||
respChoice := resp.Choices[0]
|
||||
|
||||
assistantResponse := llms.TextParts(llms.ChatMessageTypeAI, respChoice.Content)
|
||||
for _, tc := range respChoice.ToolCalls {
|
||||
assistantResponse.Parts = append(assistantResponse.Parts, tc)
|
||||
}
|
||||
messageHistory = append(messageHistory, assistantResponse)
|
||||
|
||||
// Process each tool call requested by the model.
|
||||
for _, tc := range respChoice.ToolCalls {
|
||||
toolName := tc.FunctionCall.Name
|
||||
tool := toolsMap[toolName]
|
||||
var args map[string]any
|
||||
if err := json.Unmarshal([]byte(tc.FunctionCall.Arguments), &args); err != nil {
|
||||
log.Fatalf("Failed to unmarshal arguments for tool '%s': %v", toolName, err)
|
||||
}
|
||||
toolResult, err := tool.Invoke(ctx, args)
|
||||
if err != nil {
|
||||
log.Fatalf("Failed to execute tool '%s': %v", toolName, err)
|
||||
}
|
||||
if toolResult == "" || toolResult == nil {
|
||||
toolResult = "Operation completed successfully with no specific return value."
|
||||
}
|
||||
|
||||
// Create the tool call response message and add it to the history.
|
||||
toolResponse := llms.MessageContent{
|
||||
Role: llms.ChatMessageTypeTool,
|
||||
Parts: []llms.ContentPart{
|
||||
llms.ToolCallResponse{
|
||||
Name: toolName,
|
||||
Content: fmt.Sprintf("%v", toolResult),
|
||||
},
|
||||
},
|
||||
}
|
||||
messageHistory = append(messageHistory, toolResponse)
|
||||
}
|
||||
finalResp, err := llm.GenerateContent(ctx, messageHistory)
|
||||
if err != nil {
|
||||
log.Fatalf("Final LLM call failed after tool execution: %v", err)
|
||||
}
|
||||
|
||||
// Add the final textual response from the LLM to the history
|
||||
messageHistory = append(messageHistory, llms.TextParts(llms.ChatMessageTypeAI, finalResp.Choices[0].Content))
|
||||
|
||||
fmt.Println(finalResp.Choices[0].Content)
|
||||
|
||||
}
|
||||
|
||||
}
|
||||
|
||||
{{< /tab >}}
|
||||
|
||||
{{< tab header="Genkit Go" lang="go" >}}
|
||||
|
||||
package main
|
||||
|
||||
import (
|
||||
"context"
|
||||
"fmt"
|
||||
"log"
|
||||
|
||||
"github.com/googleapis/mcp-toolbox-sdk-go/core"
|
||||
"github.com/googleapis/mcp-toolbox-sdk-go/tbgenkit"
|
||||
|
||||
"github.com/firebase/genkit/go/ai"
|
||||
"github.com/firebase/genkit/go/genkit"
|
||||
"github.com/firebase/genkit/go/plugins/googlegenai"
|
||||
)
|
||||
|
||||
const systemPrompt = `
|
||||
You're a helpful hotel assistant. You handle hotel searching, booking, and
|
||||
cancellations. When the user searches for a hotel, mention its name, id,
|
||||
location and price tier. Always mention hotel ids while performing any
|
||||
searches. This is very important for any operations. For any bookings or
|
||||
cancellations, please provide the appropriate confirmation. Be sure to
|
||||
update checkin or checkout dates if mentioned by the user.
|
||||
Don't ask for confirmations from the user.
|
||||
`
|
||||
|
||||
var queries = []string{
|
||||
"Find hotels in Basel with Basel in its name.",
|
||||
"Can you book the hotel Hilton Basel for me?",
|
||||
"Oh wait, this is too expensive. Please cancel it and book the Hyatt Regency instead.",
|
||||
"My check-in dates would be from April 10, 2024 to April 19, 2024.",
|
||||
}
|
||||
|
||||
func main() {
|
||||
ctx := context.Background()
|
||||
|
||||
// Create Toolbox Client
|
||||
toolboxClient, err := core.NewToolboxClient("http://127.0.0.1:5000")
|
||||
if err != nil {
|
||||
log.Fatalf("Failed to create Toolbox client: %v", err)
|
||||
}
|
||||
|
||||
// Load the tools using the MCP Toolbox SDK.
|
||||
tools, err := toolboxClient.LoadToolset("my-toolset", ctx)
|
||||
if err != nil {
|
||||
log.Fatalf("Failed to load tools: %v\nMake sure your Toolbox server is running and the tool is configured.", err)
|
||||
}
|
||||
|
||||
// Initialize Genkit
|
||||
g, err := genkit.Init(ctx,
|
||||
genkit.WithPlugins(&googlegenai.GoogleAI{}),
|
||||
genkit.WithDefaultModel("googleai/gemini-1.5-flash"),
|
||||
)
|
||||
if err != nil {
|
||||
log.Fatalf("Failed to init genkit: %v\n", err)
|
||||
}
|
||||
|
||||
// Create a conversation history
|
||||
conversationHistory := []*ai.Message{
|
||||
ai.NewSystemTextMessage(systemPrompt),
|
||||
}
|
||||
|
||||
// Convert your tool to a Genkit tool.
|
||||
genkitTools := make([]ai.Tool, len(tools))
|
||||
for i, tool := range tools {
|
||||
newTool, err := tbgenkit.ToGenkitTool(tool, g)
|
||||
if err != nil {
|
||||
log.Fatalf("Failed to convert tool: %v\n", err)
|
||||
}
|
||||
genkitTools[i] = newTool
|
||||
}
|
||||
|
||||
toolRefs := make([]ai.ToolRef, len(genkitTools))
|
||||
|
||||
for i, tool := range genkitTools {
|
||||
toolRefs[i] = tool
|
||||
}
|
||||
|
||||
for _, query := range queries {
|
||||
conversationHistory = append(conversationHistory, ai.NewUserTextMessage(query))
|
||||
response, err := genkit.Generate(ctx, g,
|
||||
ai.WithMessages(conversationHistory...),
|
||||
ai.WithTools(toolRefs...),
|
||||
ai.WithReturnToolRequests(true),
|
||||
)
|
||||
|
||||
if err != nil {
|
||||
log.Fatalf("%v\n", err)
|
||||
}
|
||||
conversationHistory = append(conversationHistory, response.Message)
|
||||
|
||||
parts := []*ai.Part{}
|
||||
|
||||
for _, req := range response.ToolRequests() {
|
||||
tool := genkit.LookupTool(g, req.Name)
|
||||
if tool == nil {
|
||||
log.Fatalf("tool %q not found", req.Name)
|
||||
}
|
||||
|
||||
output, err := tool.RunRaw(ctx, req.Input)
|
||||
if err != nil {
|
||||
log.Fatalf("tool %q execution failed: %v", tool.Name(), err)
|
||||
}
|
||||
|
||||
parts = append(parts,
|
||||
ai.NewToolResponsePart(&ai.ToolResponse{
|
||||
Name: req.Name,
|
||||
Ref: req.Ref,
|
||||
Output: output,
|
||||
}))
|
||||
|
||||
}
|
||||
|
||||
if len(parts) > 0 {
|
||||
resp, err := genkit.Generate(ctx, g,
|
||||
ai.WithMessages(append(response.History(), ai.NewMessage(ai.RoleTool, nil, parts...))...),
|
||||
ai.WithTools(toolRefs...),
|
||||
)
|
||||
if err != nil {
|
||||
log.Fatal(err)
|
||||
}
|
||||
fmt.Println("\n", resp.Text())
|
||||
conversationHistory = append(conversationHistory, resp.Message)
|
||||
} else {
|
||||
fmt.Println("\n", response.Text())
|
||||
}
|
||||
|
||||
}
|
||||
|
||||
}
|
||||
|
||||
{{< /tab >}}
|
||||
|
||||
{{< tab header="Go GenAI" lang="go" >}}

package main

import (
	"context"
	"encoding/json"
	"fmt"
	"log"
	"os"

	"github.com/googleapis/mcp-toolbox-sdk-go/core"
	"google.golang.org/genai"
)

// ConvertToGenaiTool translates a ToolboxTool into the genai.FunctionDeclaration format.
func ConvertToGenaiTool(toolboxTool *core.ToolboxTool) *genai.Tool {
	inputschema, err := toolboxTool.InputSchema()
	if err != nil {
		return &genai.Tool{}
	}

	var paramsSchema *genai.Schema
	_ = json.Unmarshal(inputschema, &paramsSchema)
	// First, create the function declaration.
	funcDeclaration := &genai.FunctionDeclaration{
		Name:        toolboxTool.Name(),
		Description: toolboxTool.Description(),
		Parameters:  paramsSchema,
	}

	// Then, wrap the function declaration in a genai.Tool struct.
	return &genai.Tool{
		FunctionDeclarations: []*genai.FunctionDeclaration{funcDeclaration},
	}
}

func printResponse(resp *genai.GenerateContentResponse) {
	for _, cand := range resp.Candidates {
		if cand.Content != nil {
			for _, part := range cand.Content.Parts {
				fmt.Println(part.Text)
			}
		}
	}
}

const systemPrompt = `
You're a helpful hotel assistant. You handle hotel searching, booking, and
cancellations. When the user searches for a hotel, mention its name, id,
location and price tier. Always mention hotel ids while performing any
searches. This is very important for any operations. For any bookings or
cancellations, please provide the appropriate confirmation. Be sure to
update checkin or checkout dates if mentioned by the user.
Don't ask for confirmations from the user.
`

var queries = []string{
	"Find hotels in Basel with Basel in its name.",
	"Can you book the hotel Hilton Basel for me?",
	"Oh wait, this is too expensive. Please cancel it.",
	"Please book the Hyatt Regency instead.",
	"My check in dates would be from April 10, 2024 to April 19, 2024.",
}

func main() {
	// Setup
	ctx := context.Background()
	apiKey := os.Getenv("GOOGLE_API_KEY")
	toolboxURL := "http://localhost:5000"

	// Initialize the Google GenAI client using the explicit ClientConfig.
	client, err := genai.NewClient(ctx, &genai.ClientConfig{
		APIKey: apiKey,
	})
	if err != nil {
		log.Fatalf("Failed to create Google GenAI client: %v", err)
	}

	// Initialize the MCP Toolbox client.
	toolboxClient, err := core.NewToolboxClient(toolboxURL)
	if err != nil {
		log.Fatalf("Failed to create Toolbox client: %v", err)
	}

	// Load the tools using the MCP Toolbox SDK.
	tools, err := toolboxClient.LoadToolset("my-toolset", ctx)
	if err != nil {
		log.Fatalf("Failed to load tools: %v\nMake sure your Toolbox server is running and the tool is configured.", err)
	}

	genAITools := make([]*genai.Tool, len(tools))
	toolsMap := make(map[string]*core.ToolboxTool, len(tools))

	for i, tool := range tools {
		genAITools[i] = ConvertToGenaiTool(tool)
		toolsMap[tool.Name()] = tool
	}

	// Set up the generative model with the available tools.
	modelName := "gemini-2.0-flash"

	// Create the initial content prompt for the model.
	messageHistory := []*genai.Content{
		genai.NewContentFromText(systemPrompt, genai.RoleUser),
	}
	config := &genai.GenerateContentConfig{
		Tools: genAITools,
		ToolConfig: &genai.ToolConfig{
			FunctionCallingConfig: &genai.FunctionCallingConfig{
				Mode: genai.FunctionCallingConfigModeAny,
			},
		},
	}

	for _, query := range queries {
		messageHistory = append(messageHistory, genai.NewContentFromText(query, genai.RoleUser))

		genContentResp, err := client.Models.GenerateContent(ctx, modelName, messageHistory, config)
		if err != nil {
			log.Fatalf("LLM call failed for query '%s': %v", query, err)
		}

		if len(genContentResp.Candidates) > 0 && genContentResp.Candidates[0].Content != nil {
			messageHistory = append(messageHistory, genContentResp.Candidates[0].Content)
		}

		functionCalls := genContentResp.FunctionCalls()

		toolResponseParts := []*genai.Part{}

		for _, fc := range functionCalls {
			toolToInvoke, found := toolsMap[fc.Name]
			if !found {
				log.Fatalf("Tool '%s' not found in loaded tools map. Check toolset configuration.", fc.Name)
			}

			toolResult, invokeErr := toolToInvoke.Invoke(ctx, fc.Args)
			if invokeErr != nil {
				log.Fatalf("Failed to execute tool '%s': %v", fc.Name, invokeErr)
			}

			// Enhanced tool result handling (retained to prevent nil issues).
			toolResultString := ""
			if toolResult != nil {
				jsonBytes, marshalErr := json.Marshal(toolResult)
				if marshalErr == nil {
					toolResultString = string(jsonBytes)
				} else {
					toolResultString = fmt.Sprintf("%v", toolResult)
				}
			}

			responseMap := map[string]any{"result": toolResultString}

			toolResponseParts = append(toolResponseParts, genai.NewPartFromFunctionResponse(fc.Name, responseMap))
		}
		// Add all accumulated tool responses for this turn to the message history.
		toolResponseContent := genai.NewContentFromParts(toolResponseParts, "function")
		messageHistory = append(messageHistory, toolResponseContent)

		finalResponse, err := client.Models.GenerateContent(ctx, modelName, messageHistory, &genai.GenerateContentConfig{})
		if err != nil {
			log.Fatalf("Error calling GenerateContent (with function result): %v", err)
		}

		printResponse(finalResponse)
		// Add the final textual response from the LLM to the history.
		if len(finalResponse.Candidates) > 0 && finalResponse.Candidates[0].Content != nil {
			messageHistory = append(messageHistory, finalResponse.Candidates[0].Content)
		}
	}
}

{{< /tab >}}

{{< tab header="OpenAI Go" lang="go" >}}

package main

import (
	"context"
	"encoding/json"
	"log"

	"github.com/googleapis/mcp-toolbox-sdk-go/core"
	openai "github.com/openai/openai-go"
)

// ConvertToOpenAITool converts a ToolboxTool into the go-openai library's Tool format.
func ConvertToOpenAITool(toolboxTool *core.ToolboxTool) openai.ChatCompletionToolParam {
	// Get the input schema
	jsonSchemaBytes, err := toolboxTool.InputSchema()
	if err != nil {
		return openai.ChatCompletionToolParam{}
	}

	// Unmarshal the JSON bytes into FunctionParameters
	var paramsSchema openai.FunctionParameters
	if err := json.Unmarshal(jsonSchemaBytes, &paramsSchema); err != nil {
		return openai.ChatCompletionToolParam{}
	}

	// Create and return the final tool parameter struct.
	return openai.ChatCompletionToolParam{
		Function: openai.FunctionDefinitionParam{
			Name:        toolboxTool.Name(),
			Description: openai.String(toolboxTool.Description()),
			Parameters:  paramsSchema,
		},
	}
}

const systemPrompt = `
You're a helpful hotel assistant. You handle hotel searching, booking, and
cancellations. When the user searches for a hotel, mention its name, id,
location and price tier. Always mention hotel ids while performing any
searches. This is very important for any operations. For any bookings or
cancellations, please provide the appropriate confirmation. Be sure to
update checkin or checkout dates if mentioned by the user.
Don't ask for confirmations from the user.
`

var queries = []string{
	"Find hotels in Basel with Basel in its name.",
	"Can you book the hotel Hilton Basel for me?",
	"Oh wait, this is too expensive. Please cancel it and book the Hyatt Regency instead.",
	"My check in dates would be from April 10, 2024 to April 19, 2024.",
}

func main() {
	// Setup
	ctx := context.Background()
	toolboxURL := "http://localhost:5000"
	openAIClient := openai.NewClient()

	// Initialize the MCP Toolbox client.
	toolboxClient, err := core.NewToolboxClient(toolboxURL)
	if err != nil {
		log.Fatalf("Failed to create Toolbox client: %v", err)
	}

	// Load the tools using the MCP Toolbox SDK.
	tools, err := toolboxClient.LoadToolset("my-toolset", ctx)
	if err != nil {
		log.Fatalf("Failed to load tools: %v\nMake sure your Toolbox server is running and the tool is configured.", err)
	}

	openAITools := make([]openai.ChatCompletionToolParam, len(tools))
	toolsMap := make(map[string]*core.ToolboxTool, len(tools))

	for i, tool := range tools {
		// Convert the Toolbox tool into the OpenAI function format.
		openAITools[i] = ConvertToOpenAITool(tool)
		// Add the tool to a map for lookup later.
		toolsMap[tool.Name()] = tool
	}

	params := openai.ChatCompletionNewParams{
		Messages: []openai.ChatCompletionMessageParamUnion{
			openai.SystemMessage(systemPrompt),
		},
		Tools: openAITools,
		Seed:  openai.Int(0),
		Model: openai.ChatModelGPT4o,
	}

	for _, query := range queries {
		params.Messages = append(params.Messages, openai.UserMessage(query))

		// Make the initial chat completion request.
		completion, err := openAIClient.Chat.Completions.New(ctx, params)
		if err != nil {
			panic(err)
		}

		toolCalls := completion.Choices[0].Message.ToolCalls

		// Keep the assistant's message in the history.
		params.Messages = append(params.Messages, completion.Choices[0].Message.ToParam())

		// If there are no tool calls, print the response and move on.
		if len(toolCalls) == 0 {
			log.Println("No function call")
			println("\n", completion.Choices[0].Message.Content)
			continue
		}

		// If there was a function call, continue the conversation.
		for _, toolCall := range toolCalls {
			toolName := toolCall.Function.Name
			toolToInvoke := toolsMap[toolName]

			var args map[string]any
			err := json.Unmarshal([]byte(toolCall.Function.Arguments), &args)
			if err != nil {
				panic(err)
			}

			result, err := toolToInvoke.Invoke(ctx, args)
			if err != nil {
				log.Fatal("Could not invoke tool: ", err)
			}

			params.Messages = append(params.Messages, openai.ToolMessage(result.(string), toolCall.ID))
		}

		completion, err = openAIClient.Chat.Completions.New(ctx, params)
		if err != nil {
			panic(err)
		}

		// Add the final assistant response to the history.
		params.Messages = append(params.Messages, completion.Choices[0].Message.ToParam())

		println("\n", completion.Choices[0].Message.Content)
	}
}

{{< /tab >}}
{{< /tabpane >}}

1. Ensure all dependencies are installed:

    ```sh
    go mod tidy
    ```

1. Run your agent, and observe the results:

    ```sh
    go run hotelagent.go
    ```

{{< notice info >}}
For more information, visit the [Go SDK repo](https://github.com/googleapis/mcp-toolbox-sdk-go).
{{< /notice >}}
|
||||
---
|
||||
title: "JS Quickstart (Local)"
|
||||
type: docs
|
||||
weight: 3
|
||||
description: >
|
||||
How to get started running Toolbox locally with [JavaScript](https://github.com/googleapis/mcp-toolbox-sdk-js), PostgreSQL, and orchestration frameworks such as [LangChain](https://js.langchain.com/docs/introduction/), [GenkitJS](https://genkit.dev/docs/get-started/), and [LlamaIndex](https://ts.llamaindex.ai/).
|
||||
---
|
||||
|
||||
## Before you begin
|
||||
|
||||
This guide assumes you have already done the following:
|
||||
|
||||
1. Installed [Node.js (v18 or higher)].
|
||||
1. Installed [PostgreSQL 16+ and the `psql` client][install-postgres].
|
||||
|
||||
### Cloud Setup (Optional)
|
||||
|
||||
If you plan to use **Google Cloud’s Vertex AI** with your agent (e.g., using Gemini or PaLM models), follow these one-time setup steps:
|
||||
|
||||
1. [Install the Google Cloud CLI]
|
||||
1. [Set up Application Default Credentials (ADC)]
|
||||
1. Set your project and enable Vertex AI
|
||||
|
||||
```bash
|
||||
gcloud config set project YOUR_PROJECT_ID
|
||||
gcloud services enable aiplatform.googleapis.com
|
||||
```
|
||||
|
||||
[Node.js (v18 or higher)]: https://nodejs.org/
|
||||
[install-postgres]: https://www.postgresql.org/download/
|
||||
[Install the Google Cloud CLI]: https://cloud.google.com/sdk/docs/install
|
||||
[Set up Application Default Credentials (ADC)]: https://cloud.google.com/docs/authentication/set-up-adc-local-dev-environment
|
||||
|
||||
|
||||
## Step 1: Set up your database

In this section, we will create a database, insert some data that needs to be
accessed by our agent, and create a database user for Toolbox to connect with.

1. Connect to postgres using the `psql` command:

    ```bash
    psql -h 127.0.0.1 -U postgres
    ```

    Here, `postgres` denotes the default postgres superuser.

    {{< notice info >}}

    #### **Having trouble connecting?**

    * **Password Prompt:** If you are prompted for a password for the `postgres`
      user and do not know it (or a blank password doesn't work), your PostgreSQL
      installation might require a password or a different authentication method.
    * **`FATAL: role "postgres" does not exist`:** This error means the default
      `postgres` superuser role isn't available under that name on your system.
    * **`Connection refused`:** Ensure your PostgreSQL server is actually running.
      You can typically check with `sudo systemctl status postgresql` and start it
      with `sudo systemctl start postgresql` on Linux systems.

    <br/>

    #### **Common Solution**

    For password issues or if the `postgres` role seems inaccessible directly, try
    switching to the `postgres` operating system user first. This user often has
    permission to connect without a password for local connections (this is called
    peer authentication).

    ```bash
    sudo -i -u postgres
    psql -h 127.0.0.1
    ```

    Once you are in the `psql` shell using this method, you can proceed with the
    database creation steps below. Afterwards, type `\q` to exit `psql`, and then
    `exit` to return to your normal user shell.

    If desired, once connected to `psql` as the `postgres` OS user, you can set a
    password for the `postgres` *database* user using: `ALTER USER postgres WITH
    PASSWORD 'your_chosen_password';`. This would allow direct connection with `-U
    postgres` and a password next time.
    {{< /notice >}}

1. Create a new database and a new user:

    {{< notice tip >}}
    For a real application, it's best to follow the principle of least privilege
    and only grant the privileges your application needs.
    {{< /notice >}}

    ```sql
    CREATE USER toolbox_user WITH PASSWORD 'my-password';

    CREATE DATABASE toolbox_db;
    GRANT ALL PRIVILEGES ON DATABASE toolbox_db TO toolbox_user;

    ALTER DATABASE toolbox_db OWNER TO toolbox_user;
    ```

1. End the database session:

    ```bash
    \q
    ```

    (If you used `sudo -i -u postgres` and then `psql`, remember you might also
    need to type `exit` after `\q` to leave the `postgres` user's shell
    session.)

1. Connect to your database with your new user:

    ```bash
    psql -h 127.0.0.1 -U toolbox_user -d toolbox_db
    ```

1. Create a table using the following command:

    ```sql
    CREATE TABLE hotels(
      id            INTEGER NOT NULL PRIMARY KEY,
      name          VARCHAR NOT NULL,
      location      VARCHAR NOT NULL,
      price_tier    VARCHAR NOT NULL,
      checkin_date  DATE NOT NULL,
      checkout_date DATE NOT NULL,
      booked        BIT NOT NULL
    );
    ```

1. Insert data into the table:

    ```sql
    INSERT INTO hotels(id, name, location, price_tier, checkin_date, checkout_date, booked)
    VALUES
      (1, 'Hilton Basel', 'Basel', 'Luxury', '2024-04-22', '2024-04-20', B'0'),
      (2, 'Marriott Zurich', 'Zurich', 'Upscale', '2024-04-14', '2024-04-21', B'0'),
      (3, 'Hyatt Regency Basel', 'Basel', 'Upper Upscale', '2024-04-02', '2024-04-20', B'0'),
      (4, 'Radisson Blu Lucerne', 'Lucerne', 'Midscale', '2024-04-24', '2024-04-05', B'0'),
      (5, 'Best Western Bern', 'Bern', 'Upper Midscale', '2024-04-23', '2024-04-01', B'0'),
      (6, 'InterContinental Geneva', 'Geneva', 'Luxury', '2024-04-23', '2024-04-28', B'0'),
      (7, 'Sheraton Zurich', 'Zurich', 'Upper Upscale', '2024-04-27', '2024-04-02', B'0'),
      (8, 'Holiday Inn Basel', 'Basel', 'Upper Midscale', '2024-04-24', '2024-04-09', B'0'),
      (9, 'Courtyard Zurich', 'Zurich', 'Upscale', '2024-04-03', '2024-04-13', B'0'),
      (10, 'Comfort Inn Bern', 'Bern', 'Midscale', '2024-04-04', '2024-04-16', B'0');
    ```

1. End the database session:

    ```bash
    \q
    ```

## Step 2: Install and configure Toolbox

In this section, we will download Toolbox, configure our tools in a
`tools.yaml`, and then run the Toolbox server.

1. Download the latest version of Toolbox as a binary:

    {{< notice tip >}}
    Select the
    [correct binary](https://github.com/googleapis/genai-toolbox/releases)
    corresponding to your OS and CPU architecture.
    {{< /notice >}}
    <!-- {x-release-please-start-version} -->
    ```bash
    export OS="linux/amd64" # one of linux/amd64, darwin/arm64, darwin/amd64, or windows/amd64
    curl -O https://storage.googleapis.com/genai-toolbox/v0.9.0/$OS/toolbox
    ```
    <!-- {x-release-please-end} -->

1. Make the binary executable:

    ```bash
    chmod +x toolbox
    ```

1. Write the following into a `tools.yaml` file. Be sure to update any fields
   such as `user`, `password`, or `database` that you may have customized in the
   previous step.

    {{< notice tip >}}
    In practice, use environment variable replacement with the format ${ENV_NAME}
    instead of hardcoding your secrets into the configuration file.
    {{< /notice >}}

    ```yaml
    sources:
      my-pg-source:
        kind: postgres
        host: 127.0.0.1
        port: 5432
        database: toolbox_db
        user: ${USER_NAME}
        password: ${PASSWORD}
    tools:
      search-hotels-by-name:
        kind: postgres-sql
        source: my-pg-source
        description: Search for hotels based on name.
        parameters:
          - name: name
            type: string
            description: The name of the hotel.
        statement: SELECT * FROM hotels WHERE name ILIKE '%' || $1 || '%';
      search-hotels-by-location:
        kind: postgres-sql
        source: my-pg-source
        description: Search for hotels based on location.
        parameters:
          - name: location
            type: string
            description: The location of the hotel.
        statement: SELECT * FROM hotels WHERE location ILIKE '%' || $1 || '%';
      book-hotel:
        kind: postgres-sql
        source: my-pg-source
        description: >-
          Book a hotel by its ID. If the hotel is successfully booked, returns a
          NULL, raises an error if not.
        parameters:
          - name: hotel_id
            type: string
            description: The ID of the hotel to book.
        statement: UPDATE hotels SET booked = B'1' WHERE id = $1;
      update-hotel:
        kind: postgres-sql
        source: my-pg-source
        description: >-
          Update a hotel's check-in and check-out dates by its ID. Returns a message
          indicating whether the hotel was successfully updated or not.
        parameters:
          - name: hotel_id
            type: string
            description: The ID of the hotel to update.
          - name: checkin_date
            type: string
            description: The new check-in date of the hotel.
          - name: checkout_date
            type: string
            description: The new check-out date of the hotel.
        statement: >-
          UPDATE hotels SET checkin_date = CAST($2 as date), checkout_date = CAST($3
          as date) WHERE id = $1;
      cancel-hotel:
        kind: postgres-sql
        source: my-pg-source
        description: Cancel a hotel by its ID.
        parameters:
          - name: hotel_id
            type: string
            description: The ID of the hotel to cancel.
        statement: UPDATE hotels SET booked = B'0' WHERE id = $1;
    toolsets:
      my-toolset:
        - search-hotels-by-name
        - search-hotels-by-location
        - book-hotel
        - update-hotel
        - cancel-hotel
    ```

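    The `${USER_NAME}` and `${PASSWORD}` placeholders in the config are expanded
    from the server's environment when Toolbox starts. A minimal sketch of
    exporting matching variables in the shell that launches the server (the
    values here assume the `toolbox_user` role created in Step 1; substitute
    your own):

    ```shell
    # Names must match the ${USER_NAME} and ${PASSWORD} placeholders in
    # tools.yaml exactly; values are the example credentials from Step 1.
    export USER_NAME=toolbox_user
    export PASSWORD=my-password

    echo "Toolbox will connect as: $USER_NAME"
    ```

    Run the exports in the same shell session (or set them via your process
    manager) before starting the server in the next step.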
    For more info on tools, check out the `Resources` section of the docs.

1. Run the Toolbox server, pointing to the `tools.yaml` file created earlier:

    ```bash
    ./toolbox --tools-file "tools.yaml"
    ```

    {{< notice note >}}
    Toolbox enables dynamic reloading by default. To disable, use the
    `--disable-reload` flag.
    {{< /notice >}}

## Step 3: Connect your agent to Toolbox

In this section, we will write and run an agent that will load the Tools
from Toolbox.

1. (Optional) Initialize a Node.js project:

    ```bash
    npm init -y
    ```

1. In a new terminal, install the [SDK](https://www.npmjs.com/package/@toolbox-sdk/core):

    ```bash
    npm install @toolbox-sdk/core
    ```

1. Install other required dependencies:

    {{< tabpane persist=header >}}
    {{< tab header="LangChain" lang="bash" >}}
    npm install langchain @langchain/google-vertexai
    {{< /tab >}}
    {{< tab header="GenkitJS" lang="bash" >}}
    npm install genkit @genkit-ai/vertexai
    {{< /tab >}}
    {{< tab header="LlamaIndex" lang="bash" >}}
    npm install llamaindex @llamaindex/google @llamaindex/workflow
    {{< /tab >}}
    {{< /tabpane >}}

1. Create a new file named `hotelAgent.js` and copy the following code to create an agent:

{{< tabpane persist=header >}}
{{< tab header="LangChain" lang="js" >}}

import { ChatVertexAI } from "@langchain/google-vertexai";
import { ToolboxClient } from "@toolbox-sdk/core";
import { tool } from "@langchain/core/tools";
import { createReactAgent } from "@langchain/langgraph/prebuilt";
import { MemorySaver } from "@langchain/langgraph";

// Replace this with your API key
process.env.GOOGLE_API_KEY = 'your-api-key';

const prompt = `
You're a helpful hotel assistant. You handle hotel searching, booking, and
cancellations. When the user searches for a hotel, mention its name, id,
location and price tier. Always mention hotel ids while performing any
searches. This is very important for any operations. For any bookings or
cancellations, please provide the appropriate confirmation. Be sure to
update checkin or checkout dates if mentioned by the user.
Don't ask for confirmations from the user.
`;

const queries = [
  "Find hotels in Basel with Basel in its name.",
  "Can you book the Hilton Basel for me?",
  "Oh wait, this is too expensive. Please cancel it and book the Hyatt Regency instead.",
  "My check in dates would be from April 10, 2024 to April 19, 2024.",
];

async function runApplication() {
  const model = new ChatVertexAI({
    model: "gemini-2.0-flash",
  });

  const client = new ToolboxClient("http://127.0.0.1:5000");
  const toolboxTools = await client.loadToolset("my-toolset");

  // Define the basics of the tool: name, description, schema and core logic
  const getTool = (toolboxTool) => tool(toolboxTool, {
    name: toolboxTool.getName(),
    description: toolboxTool.getDescription(),
    schema: toolboxTool.getParamSchema()
  });
  const tools = toolboxTools.map(getTool);

  const agent = createReactAgent({
    llm: model,
    tools: tools,
    checkpointer: new MemorySaver(),
    systemPrompt: prompt,
  });

  const langGraphConfig = {
    configurable: {
      thread_id: "test-thread",
    },
  };

  for (const query of queries) {
    const agentOutput = await agent.invoke(
      {
        messages: [
          {
            role: "user",
            content: query,
          },
        ],
        verbose: true,
      },
      langGraphConfig
    );
    const response = agentOutput.messages[agentOutput.messages.length - 1].content;
    console.log(response);
  }
}

runApplication()
  .catch(console.error)
  .finally(() => console.log("\nApplication finished."));

{{< /tab >}}

{{< tab header="GenkitJS" lang="js" >}}

import { ToolboxClient } from "@toolbox-sdk/core";
import { genkit } from "genkit";
import { googleAI } from '@genkit-ai/googleai';

// Replace this with your API key
process.env.GOOGLE_API_KEY = 'your-api-key';

const systemPrompt = `
You're a helpful hotel assistant. You handle hotel searching, booking, and
cancellations. When the user searches for a hotel, mention its name, id,
location and price tier. Always mention hotel ids while performing any
searches. This is very important for any operations. For any bookings or
cancellations, please provide the appropriate confirmation. Be sure to
update checkin or checkout dates if mentioned by the user.
Don't ask for confirmations from the user.
`;

const queries = [
  "Find hotels in Basel with Basel in its name.",
  "Can you book the Hilton Basel for me?",
  "Oh wait, this is too expensive. Please cancel it and book the Hyatt Regency instead.",
  "My check in dates would be from April 10, 2024 to April 19, 2024.",
];

async function run() {
  const toolboxClient = new ToolboxClient("http://127.0.0.1:5000");

  const ai = genkit({
    plugins: [
      googleAI({
        apiKey: process.env.GEMINI_API_KEY || process.env.GOOGLE_API_KEY
      })
    ],
    model: googleAI.model('gemini-2.0-flash'),
  });

  const toolboxTools = await toolboxClient.loadToolset("my-toolset");
  const toolMap = Object.fromEntries(
    toolboxTools.map((tool) => {
      const definedTool = ai.defineTool(
        {
          name: tool.getName(),
          description: tool.getDescription(),
          inputSchema: tool.getParamSchema(),
        },
        tool
      );
      return [tool.getName(), definedTool];
    })
  );
  const tools = Object.values(toolMap);

  let conversationHistory = [{ role: "system", content: [{ text: systemPrompt }] }];

  for (const query of queries) {
    conversationHistory.push({ role: "user", content: [{ text: query }] });
    // `response` is reassigned after tool execution, so declare it with `let`.
    let response = await ai.generate({
      messages: conversationHistory,
      tools: tools,
    });
    conversationHistory.push(response.message);

    const toolRequests = response.toolRequests;
    if (toolRequests?.length > 0) {
      // Execute tools concurrently and collect their responses.
      const toolResponses = await Promise.all(
        toolRequests.map(async (call) => {
          try {
            const toolOutput = await toolMap[call.name].invoke(call.input);
            return { role: "tool", content: [{ toolResponse: { name: call.name, output: toolOutput } }] };
          } catch (e) {
            console.error(`Error executing tool ${call.name}:`, e);
            return { role: "tool", content: [{ toolResponse: { name: call.name, output: { error: e.message } } }] };
          }
        })
      );

      conversationHistory.push(...toolResponses);

      // Call the AI again with the tool results.
      response = await ai.generate({ messages: conversationHistory, tools });
      conversationHistory.push(response.message);
    }

    console.log(response.text);
  }
}

run();
{{< /tab >}}

{{< tab header="LlamaIndex" lang="js" >}}

import { gemini, GEMINI_MODEL } from "@llamaindex/google";
import { agent } from "@llamaindex/workflow";
import { createMemory, staticBlock, tool } from "llamaindex";
import { ToolboxClient } from "@toolbox-sdk/core";

const TOOLBOX_URL = "http://127.0.0.1:5000"; // Update if needed
process.env.GOOGLE_API_KEY = 'your-api-key'; // Replace this with your API key

const prompt = `
You're a helpful hotel assistant. You handle hotel searching, booking and cancellations.
When the user searches for a hotel, mention its name, id, location and price tier.
Always mention hotel ids while performing any searches; this is very important for operations.
For any bookings or cancellations, please provide the appropriate confirmation.
Update check-in or check-out dates if mentioned by the user.
Don't ask for confirmations from the user.
`;

const queries = [
  "Find hotels in Basel with Basel in its name.",
  "Can you book the Hilton Basel for me?",
  "Oh wait, this is too expensive. Please cancel it and book the Hyatt Regency instead.",
  "My check in dates would be from April 10, 2024 to April 19, 2024.",
];

async function main() {
  // Connect to MCP Toolbox
  const client = new ToolboxClient(TOOLBOX_URL);
  const toolboxTools = await client.loadToolset("my-toolset");
  const tools = toolboxTools.map((toolboxTool) => {
    return tool({
      name: toolboxTool.getName(),
      description: toolboxTool.getDescription(),
      parameters: toolboxTool.getParamSchema(),
      execute: toolboxTool,
    });
  });

  // Initialize the LLM
  const llm = gemini({
    model: GEMINI_MODEL.GEMINI_2_0_FLASH,
    apiKey: process.env.GOOGLE_API_KEY,
  });

  const memory = createMemory({
    memoryBlocks: [
      staticBlock({
        content: prompt,
      }),
    ],
  });

  // Create the Agent
  const myAgent = agent({
    tools: tools,
    llm,
    memory,
    systemPrompt: prompt,
  });

  for (const query of queries) {
    const result = await myAgent.run(query);
    const output = result.data.result;

    console.log(`\nUser: ${query}`);
    if (typeof output === "string") {
      console.log(output.trim());
    } else if (typeof output === "object" && "text" in output) {
      console.log(output.text.trim());
    } else {
      console.log(JSON.stringify(output));
    }
  }
||||
//You may observe some extra logs during execution due to the run method provided by Llama.
|
||||
console.log("Agent run finished.");
|
||||
}
|
||||
|
||||
main();
|
||||
|
||||
{{< /tab >}}
|
||||
|
||||
{{< /tabpane >}}
|
||||
|
||||
1. Run your agent, and observe the results:
|
||||
|
||||
```sh
|
||||
node hotelAgent.js
|
||||
```
|
||||
|
||||
{{< notice info >}}
|
||||
For more information, visit the [JS SDK repo](https://github.com/googleapis/mcp-toolbox-sdk-js).
|
||||
{{</ notice >}}
|
||||
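The tool-dispatch step in the Genkit example above can be exercised on its own, without a running Toolbox server, by substituting a mocked tool map. The sketch below is illustrative: the tool name and its output shape are assumptions, not part of the Toolbox SDK.

```javascript
// Standalone sketch of the tool-dispatch pattern from the Genkit example.
// The tool map is mocked here; in the quickstart it is built from the Toolbox SDK.
const toolMap = {
  "search-hotels-by-name": {
    invoke: async (input) => [{ id: 1, name: `Hilton ${input.name}` }],
  },
};

async function dispatch(toolRequests) {
  // Execute tools concurrently; turn failures into error payloads instead of throwing,
  // so a single bad tool call does not abort the whole turn.
  return Promise.all(
    toolRequests.map(async (call) => {
      try {
        const toolOutput = await toolMap[call.name].invoke(call.input);
        return { role: "tool", content: [{ toolResponse: { name: call.name, output: toolOutput } }] };
      } catch (e) {
        return { role: "tool", content: [{ toolResponse: { name: call.name, output: { error: e.message } } }] };
      }
    })
  );
}

dispatch([{ name: "search-hotels-by-name", input: { name: "Basel" } }]).then((responses) =>
  console.log(JSON.stringify(responses, null, 2))
);
```

An unknown tool name flows through the same path: the lookup throws, and the catch branch returns an `{ error: ... }` payload the model can react to on the next turn.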
@@ -1,9 +1,9 @@
---
title: "Quickstart (MCP)"
type: docs
-weight: 3
+weight: 5
description: >
-  How to get started running Toolbox locally with MCP Inspector.
+  How to get started running Toolbox locally with MCP Inspector.
---

## Overview
@@ -105,7 +105,7 @@ In this section, we will download Toolbox, configure our tools in a
<!-- {x-release-please-start-version} -->
```bash
export OS="linux/amd64" # one of linux/amd64, darwin/arm64, darwin/amd64, or windows/amd64
-curl -O https://storage.googleapis.com/genai-toolbox/v0.4.0/$OS/toolbox
+curl -O https://storage.googleapis.com/genai-toolbox/v0.9.0/$OS/toolbox
```
<!-- {x-release-please-end} -->

@@ -199,12 +199,13 @@ In this section, we will download Toolbox, configure our tools in a
      - cancel-hotel
   ```

-   For more info on tools, check out the [Tools](../../resources/tools/_index.md) section.
+   For more info on tools, check out the
+   [Tools](../../resources/tools/_index.md) section.

1. Run the Toolbox server, pointing to the `tools.yaml` file created earlier:

   ```bash
-   ./toolbox --tools_file "tools.yaml"
+   ./toolbox --tools-file "tools.yaml"
   ```

## Step 3: Connect to MCP Inspector
@@ -217,17 +218,25 @@ In this section, we will download Toolbox, configure our tools in a

1. Type `y` when it asks to install the inspector package.

-1. It should show the following when the MCP Inspector is up and running:
+1. It should show the following when the MCP Inspector is up and running (please take note of `<YOUR_SESSION_TOKEN>`):

   ```bash
-  🔍 MCP Inspector is up and running at http://127.0.0.1:5173 🚀
+  Starting MCP inspector...
+  ⚙️ Proxy server listening on localhost:6277
+  🔑 Session token: <YOUR_SESSION_TOKEN>
+  Use this token to authenticate requests or set DANGEROUSLY_OMIT_AUTH=true to disable auth
+
+  🚀 MCP Inspector is up and running at:
+  http://localhost:6274/?MCP_PROXY_AUTH_TOKEN=<YOUR_SESSION_TOKEN>
   ```

1. Open the above link in your browser.

-1. For `Transport Type`, select `SSE`.
+1. For `Transport Type`, select `Streamable HTTP`.

-1. For `URL`, type in `http://127.0.0.1:5000/mcp/sse`.
+1. For `URL`, type in `http://127.0.0.1:5000/mcp`.

+1. For `Configuration` -> `Proxy Session Token`, make sure `<YOUR_SESSION_TOKEN>` is present.
+
1. Click Connect.

@@ -237,4 +246,4 @@ In this section, we will download Toolbox, configure our tools in a

![inspector](./inspector.png)

-1. Test out your tools here!
+1. Test out your tools here!
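Outside the Inspector, the Streamable HTTP endpoint can also be exercised directly: MCP clients POST JSON-RPC 2.0 messages to the `/mcp` URL configured above. As a hedged illustration (the method names follow the MCP specification, which is versioned and may change; this is not Toolbox-specific documentation), a `tools/list` request body sent after the standard `initialize` handshake would look like:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/list",
  "params": {}
}
```

A conforming server responds with a `result.tools` array describing each tool's name, description, and input schema, which is what the Inspector renders in its tool panel.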
docs/en/getting-started/mcp_quickstart/inspector.png (binary file, not shown: 22 KiB before, 32 KiB after)

docs/en/how-to/connect-ide/_index.md (new file)
@@ -0,0 +1,9 @@
---
title: "Connect from your IDE"
type: docs
weight: 1
description: >
  List of guides detailing how to connect your AI tools (IDEs) to Toolbox using MCP.
aliases:
- /how-to/connect_tools_using_mcp
---
docs/en/how-to/connect-ide/alloydb_pg_admin_mcp.md (new file)
@@ -0,0 +1,306 @@
---
title: "AlloyDB Admin API using MCP"
type: docs
weight: 2
description: >
  Create your AlloyDB database with MCP Toolbox.
---

This guide covers how to use [MCP Toolbox for Databases][toolbox] to create AlloyDB clusters and instances directly from your IDE, enabling an end-to-end workflow:

- [Cursor][cursor]
- [Windsurf][windsurf] (Codium)
- [Visual Studio Code][vscode] (Copilot)
- [Cline][cline] (VS Code extension)
- [Claude desktop][claudedesktop]
- [Claude code][claudecode]
- [Gemini CLI][geminicli]
- [Gemini Code Assist][geminicodeassist]

[toolbox]: https://github.com/googleapis/genai-toolbox
[cursor]: #configure-your-mcp-client
[windsurf]: #configure-your-mcp-client
[vscode]: #configure-your-mcp-client
[cline]: #configure-your-mcp-client
[claudedesktop]: #configure-your-mcp-client
[claudecode]: #configure-your-mcp-client
[geminicli]: #configure-your-mcp-client
[geminicodeassist]: #configure-your-mcp-client

## Before you begin

1. In the Google Cloud console, on the [project selector page](https://console.cloud.google.com/projectselector2/home/dashboard), select or create a Google Cloud project.

1. [Make sure that billing is enabled for your Google Cloud project](https://cloud.google.com/billing/docs/how-to/verify-billing-enabled#confirm_billing_is_enabled_on_a_project).

## Install MCP Toolbox

1. Download the latest version of Toolbox as a binary. Select the [correct binary](https://github.com/googleapis/genai-toolbox/releases) corresponding to your OS and CPU architecture. You must use Toolbox version v0.10.0 or later:

   <!-- {x-release-please-start-version} -->

   {{< tabpane persist=header >}}
{{< tab header="linux/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.10.0/linux/amd64/toolbox
{{< /tab >}}

{{< tab header="darwin/arm64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.10.0/darwin/arm64/toolbox
{{< /tab >}}

{{< tab header="darwin/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.10.0/darwin/amd64/toolbox
{{< /tab >}}

{{< tab header="windows/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.10.0/windows/amd64/toolbox
{{< /tab >}}
{{< /tabpane >}}

   <!-- {x-release-please-end} -->

1. Make the binary executable:

   ```bash
   chmod +x toolbox
   ```

1. Verify the installation:

   ```bash
   ./toolbox --version
   ```

## Configure your MCP Client

{{< tabpane text=true >}}
{{% tab header="Claude code" lang="en" %}}

1. Install [Claude Code](https://docs.anthropic.com/en/docs/agents-and-tools/claude-code/overview).
1. Create a `.mcp.json` file in your project root if it doesn't exist.
1. Generate an access token to use as the `API_KEY` by running `gcloud auth print-access-token`.

   > **Note:** The token expires after 1 hour.

1. Add the following configuration, replace the environment variables with your values, and save:

   ```json
   {
     "mcpServers": {
       "alloydb-admin": {
         "command": "./PATH/TO/toolbox",
         "args": ["--prebuilt", "alloydb-postgres-admin", "--stdio"],
         "env": {
           "API_KEY": "your-api-key"
         }
       }
     }
   }
   ```

1. Restart Claude code to apply the new configuration.
{{% /tab %}}

{{% tab header="Claude desktop" lang="en" %}}

1. Open [Claude desktop](https://claude.ai/download) and navigate to Settings.
1. Under the Developer tab, tap Edit Config to open the configuration file.
1. Generate an access token to use as the `API_KEY` by running `gcloud auth print-access-token`.

   > **Note:** The token expires after 1 hour.

1. Add the following configuration, replace the environment variables with your values, and save:

   ```json
   {
     "mcpServers": {
       "alloydb-admin": {
         "command": "./PATH/TO/toolbox",
         "args": ["--prebuilt", "alloydb-postgres-admin", "--stdio"],
         "env": {
           "API_KEY": "your-api-key"
         }
       }
     }
   }
   ```

1. Restart Claude desktop.
1. From the new chat screen, you should see a hammer (MCP) icon appear with the new MCP server available.
{{% /tab %}}

{{% tab header="Cline" lang="en" %}}

1. Open the [Cline](https://github.com/cline/cline) extension in VS Code and tap the **MCP Servers** icon.
1. Tap Configure MCP Servers to open the configuration file.
1. Generate an access token to use as the `API_KEY` by running `gcloud auth print-access-token`.

   > **Note:** The token expires after 1 hour.

1. Add the following configuration, replace the environment variables with your values, and save:

   ```json
   {
     "mcpServers": {
       "alloydb-admin": {
         "command": "./PATH/TO/toolbox",
         "args": ["--prebuilt", "alloydb-postgres-admin", "--stdio"],
         "env": {
           "API_KEY": "your-api-key"
         }
       }
     }
   }
   ```

1. You should see a green active status after the server is successfully connected.
{{% /tab %}}

{{% tab header="Cursor" lang="en" %}}

1. Create a `.cursor` directory in your project root if it doesn't exist.
1. Create a `.cursor/mcp.json` file if it doesn't exist and open it.
1. Generate an access token to use as the `API_KEY` by running `gcloud auth print-access-token`.

   > **Note:** The token expires after 1 hour.

1. Add the following configuration, replace the environment variables with your values, and save:

   ```json
   {
     "mcpServers": {
       "alloydb-admin": {
         "command": "./PATH/TO/toolbox",
         "args": ["--prebuilt", "alloydb-postgres-admin", "--stdio"],
         "env": {
           "API_KEY": "your-api-key"
         }
       }
     }
   }
   ```

1. Open [Cursor](https://www.cursor.com/) and navigate to **Settings > Cursor Settings > MCP**. You should see a green active status after the server is successfully connected.
{{% /tab %}}

{{% tab header="Visual Studio Code (Copilot)" lang="en" %}}

1. Open [VS Code](https://code.visualstudio.com/docs/copilot/overview) and create a `.vscode` directory in your project root if it doesn't exist.
1. Create a `.vscode/mcp.json` file if it doesn't exist and open it.
1. Generate an access token to use as the `API_KEY` by running `gcloud auth print-access-token`.

   > **Note:** The token expires after 1 hour.

1. Add the following configuration, replace the environment variables with your values, and save:

   ```json
   {
     "mcpServers": {
       "alloydb-admin": {
         "command": "./PATH/TO/toolbox",
         "args": ["--prebuilt", "alloydb-postgres-admin", "--stdio"],
         "env": {
           "API_KEY": "your-api-key"
         }
       }
     }
   }
   ```

{{% /tab %}}

{{% tab header="Windsurf" lang="en" %}}

1. Open [Windsurf](https://docs.codeium.com/windsurf) and navigate to the Cascade assistant.
1. Tap on the hammer (MCP) icon, then Configure to open the configuration file.
1. Generate an access token to use as the `API_KEY` by running `gcloud auth print-access-token`.

   > **Note:** The token expires after 1 hour.

1. Add the following configuration, replace the environment variables with your values, and save:

   ```json
   {
     "mcpServers": {
       "alloydb-admin": {
         "command": "./PATH/TO/toolbox",
         "args": ["--prebuilt", "alloydb-postgres-admin", "--stdio"],
         "env": {
           "API_KEY": "your-api-key"
         }
       }
     }
   }
   ```

{{% /tab %}}
{{% tab header="Gemini CLI" lang="en" %}}

1. Install the [Gemini CLI](https://github.com/google-gemini/gemini-cli?tab=readme-ov-file#quickstart).
1. In your working directory, create a folder named `.gemini`. Within it, create a `settings.json` file.
1. Generate an access token to use as the `API_KEY` by running `gcloud auth print-access-token`.

   > **Note:** The token expires after 1 hour.

1. Add the following configuration, replace the environment variables with your values, and save:

   ```json
   {
     "mcpServers": {
       "alloydb-admin": {
         "command": "./PATH/TO/toolbox",
         "args": ["--prebuilt", "alloydb-postgres-admin", "--stdio"],
         "env": {
           "API_KEY": "your-api-key"
         }
       }
     }
   }
   ```

{{% /tab %}}
{{% tab header="Gemini Code Assist" lang="en" %}}

1. Install the [Gemini Code Assist](https://marketplace.visualstudio.com/items?itemName=Google.geminicodeassist) extension in Visual Studio Code.
1. Enable Agent Mode in Gemini Code Assist chat.
1. In your working directory, create a folder named `.gemini`. Within it, create a `settings.json` file.
1. Generate an access token to use as the `API_KEY` by running `gcloud auth print-access-token`.

   > **Note:** The token expires after 1 hour.

1. Add the following configuration, replace the environment variables with your values, and save:

   ```json
   {
     "mcpServers": {
       "alloydb-admin": {
         "command": "./PATH/TO/toolbox",
         "args": ["--prebuilt", "alloydb-postgres-admin", "--stdio"],
         "env": {
           "API_KEY": "your-api-key"
         }
       }
     }
   }
   ```

{{% /tab %}}
{{< /tabpane >}}

## Use Tools

Your AI tool is now connected to AlloyDB using MCP. Try asking your AI assistant to create a database, cluster, or instance.

The following tools are available to the LLM:

1. **alloydb-create-cluster**: creates an AlloyDB cluster
1. **alloydb-create-instance**: creates an AlloyDB instance (PRIMARY, READ_POOL, or SECONDARY)
1. **alloydb-get-operation**: polls the Operations API until the operation is done

{{< notice note >}}
Prebuilt tools are pre-1.0, so expect some tool changes between versions. LLMs will adapt to the tools available, so this shouldn't affect most users.
{{< /notice >}}

## Connect to your Data

After setting up an AlloyDB cluster and instance, you can [connect your IDE to the database](https://cloud.google.com/alloydb/docs/pre-built-tools-with-mcp-toolbox).
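The `alloydb-get-operation` tool above polls the Operations API until a long-running operation completes. The underlying pattern, polling with capped exponential backoff, can be sketched standalone. The mock below stands in for the real Operations API and is not part of Toolbox; the field names (`done`, `response.state`) mirror the common long-running-operation shape but are assumptions here.

```javascript
// Poll an operation getter until it reports done, doubling the delay
// between attempts up to a cap so we neither hammer the API nor stall.
async function waitForOperation(getOp, { initialDelayMs = 100, maxDelayMs = 2000 } = {}) {
  let delay = initialDelayMs;
  for (;;) {
    const op = await getOp();
    if (op.done) return op;
    await new Promise((resolve) => setTimeout(resolve, delay));
    delay = Math.min(delay * 2, maxDelayMs);
  }
}

// Mock operation getter: reports done on the third call.
let calls = 0;
const mockGetOp = async () =>
  ++calls < 3 ? { done: false } : { done: true, response: { state: "READY" } };

waitForOperation(mockGetOp, { initialDelayMs: 10 }).then((op) =>
  console.log(op.response.state) // prints "READY"
);
```

A production version would also enforce an overall deadline and surface `op.error` when the operation finishes unsuccessfully.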
docs/en/how-to/connect-ide/alloydb_pg_mcp.md (new file)
@@ -0,0 +1,13 @@
---
title: "AlloyDB using MCP"
type: docs
weight: 2
description: >
  Connect your IDE to AlloyDB using Toolbox.
---
<html>
  <head>
    <link rel="canonical" href="https://cloud.google.com/alloydb/docs/pre-built-tools-with-mcp-toolbox"/>
    <meta http-equiv="refresh" content="0;url=https://cloud.google.com/alloydb/docs/pre-built-tools-with-mcp-toolbox"/>
  </head>
</html>

docs/en/how-to/connect-ide/bigquery_mcp.md (new file)
@@ -0,0 +1,13 @@
---
title: "BigQuery using MCP"
type: docs
weight: 2
description: >
  Connect your IDE to BigQuery using Toolbox.
---
<html>
  <head>
    <link rel="canonical" href="https://cloud.google.com/bigquery/docs/pre-built-tools-with-mcp-toolbox"/>
    <meta http-equiv="refresh" content="0;url=https://cloud.google.com/bigquery/docs/pre-built-tools-with-mcp-toolbox"/>
  </head>
</html>

docs/en/how-to/connect-ide/cloud_sql_mssql_mcp.md (new file)
@@ -0,0 +1,13 @@
---
title: "Cloud SQL for SQL Server using MCP"
type: docs
weight: 2
description: >
  Connect your IDE to Cloud SQL for SQL Server using Toolbox.
---
<html>
  <head>
    <link rel="canonical" href="https://cloud.google.com/sql/docs/sqlserver/pre-built-tools-with-mcp-toolbox"/>
    <meta http-equiv="refresh" content="0;url=https://cloud.google.com/sql/docs/sqlserver/pre-built-tools-with-mcp-toolbox"/>
  </head>
</html>

docs/en/how-to/connect-ide/cloud_sql_mysql_mcp.md (new file)
@@ -0,0 +1,13 @@
---
title: "Cloud SQL for MySQL using MCP"
type: docs
weight: 2
description: >
  Connect your IDE to Cloud SQL for MySQL using Toolbox.
---
<html>
  <head>
    <link rel="canonical" href="https://cloud.google.com/sql/docs/mysql/pre-built-tools-with-mcp-toolbox"/>
    <meta http-equiv="refresh" content="0;url=https://cloud.google.com/sql/docs/mysql/pre-built-tools-with-mcp-toolbox"/>
  </head>
</html>

docs/en/how-to/connect-ide/cloud_sql_pg_mcp.md (new file)
@@ -0,0 +1,13 @@
---
title: "Cloud SQL for Postgres using MCP"
type: docs
weight: 2
description: >
  Connect your IDE to Cloud SQL for Postgres using Toolbox.
---
<html>
  <head>
    <link rel="canonical" href="https://cloud.google.com/sql/docs/postgres/pre-built-tools-with-mcp-toolbox"/>
    <meta http-equiv="refresh" content="0;url=https://cloud.google.com/sql/docs/postgres/pre-built-tools-with-mcp-toolbox"/>
  </head>
</html>
docs/en/how-to/connect-ide/firestore_mcp.md (new file)
@@ -0,0 +1,313 @@
---
title: "Firestore using MCP"
type: docs
weight: 2
description: >
  Connect your IDE to Firestore using Toolbox.
---

[Model Context Protocol (MCP)](https://modelcontextprotocol.io/introduction) is an open protocol for connecting Large Language Models (LLMs) to data sources like Firestore. This guide covers how to use [MCP Toolbox for Databases][toolbox] to expose your developer assistant tools to a Firestore instance:

* [Cursor][cursor]
* [Windsurf][windsurf] (Codium)
* [Visual Studio Code][vscode] (Copilot)
* [Cline][cline] (VS Code extension)
* [Claude desktop][claudedesktop]
* [Claude code][claudecode]
* [Gemini CLI][geminicli]
* [Gemini Code Assist][geminicodeassist]

[toolbox]: https://github.com/googleapis/genai-toolbox
[cursor]: #configure-your-mcp-client
[windsurf]: #configure-your-mcp-client
[vscode]: #configure-your-mcp-client
[cline]: #configure-your-mcp-client
[claudedesktop]: #configure-your-mcp-client
[claudecode]: #configure-your-mcp-client
[geminicli]: #configure-your-mcp-client
[geminicodeassist]: #configure-your-mcp-client

## Set up Firestore

1. Create or select a Google Cloud project.

   * [Create a new project](https://cloud.google.com/resource-manager/docs/creating-managing-projects)
   * [Select an existing project](https://cloud.google.com/resource-manager/docs/creating-managing-projects#identifying_projects)

1. [Enable the Firestore API](https://console.cloud.google.com/apis/library/firestore.googleapis.com) for your project.

1. [Create a Firestore database](https://cloud.google.com/firestore/docs/create-database-web-mobile-client-library) if you haven't already.

1. Set up authentication for your local environment.

   * [Install gcloud CLI](https://cloud.google.com/sdk/docs/install)
   * Run `gcloud auth application-default login` to authenticate

## Install MCP Toolbox

1. Download the latest version of Toolbox as a binary. Select the [correct binary](https://github.com/googleapis/genai-toolbox/releases) corresponding to your OS and CPU architecture. You must use Toolbox version v0.10.0 or later:

   <!-- {x-release-please-start-version} -->
   {{< tabpane persist=header >}}
{{< tab header="linux/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.10.0/linux/amd64/toolbox
{{< /tab >}}

{{< tab header="darwin/arm64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.10.0/darwin/arm64/toolbox
{{< /tab >}}

{{< tab header="darwin/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.10.0/darwin/amd64/toolbox
{{< /tab >}}

{{< tab header="windows/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.10.0/windows/amd64/toolbox
{{< /tab >}}
{{< /tabpane >}}
   <!-- {x-release-please-end} -->

1. Make the binary executable:

   ```bash
   chmod +x toolbox
   ```

1. Verify the installation:

   ```bash
   ./toolbox --version
   ```

## Configure your MCP Client

{{< tabpane text=true >}}
{{% tab header="Claude code" lang="en" %}}

1. Install [Claude Code](https://docs.anthropic.com/en/docs/agents-and-tools/claude-code/overview).
1. Create a `.mcp.json` file in your project root if it doesn't exist.
1. Add the following configuration, replace the environment variables with your values, and save:

   ```json
   {
     "mcpServers": {
       "firestore": {
         "command": "./PATH/TO/toolbox",
         "args": ["--prebuilt", "firestore", "--stdio"],
         "env": {
           "FIRESTORE_PROJECT": "your-project-id",
           "FIRESTORE_DATABASE": "(default)"
         }
       }
     }
   }
   ```

1. Restart Claude code to apply the new configuration.
{{% /tab %}}

{{% tab header="Claude desktop" lang="en" %}}

1. Open [Claude desktop](https://claude.ai/download) and navigate to Settings.
1. Under the Developer tab, tap Edit Config to open the configuration file.
1. Add the following configuration, replace the environment variables with your values, and save:

   ```json
   {
     "mcpServers": {
       "firestore": {
         "command": "./PATH/TO/toolbox",
         "args": ["--prebuilt", "firestore", "--stdio"],
         "env": {
           "FIRESTORE_PROJECT": "your-project-id",
           "FIRESTORE_DATABASE": "(default)"
         }
       }
     }
   }
   ```

1. Restart Claude desktop.
1. From the new chat screen, you should see a hammer (MCP) icon appear with the new MCP server available.
{{% /tab %}}

{{% tab header="Cline" lang="en" %}}

1. Open the [Cline](https://github.com/cline/cline) extension in VS Code and tap the **MCP Servers** icon.
1. Tap Configure MCP Servers to open the configuration file.
1. Add the following configuration, replace the environment variables with your values, and save:

   ```json
   {
     "mcpServers": {
       "firestore": {
         "command": "./PATH/TO/toolbox",
         "args": ["--prebuilt", "firestore", "--stdio"],
         "env": {
           "FIRESTORE_PROJECT": "your-project-id",
           "FIRESTORE_DATABASE": "(default)"
         }
       }
     }
   }
   ```

1. You should see a green active status after the server is successfully connected.
{{% /tab %}}

{{% tab header="Cursor" lang="en" %}}

1. Create a `.cursor` directory in your project root if it doesn't exist.
1. Create a `.cursor/mcp.json` file if it doesn't exist and open it.
1. Add the following configuration, replace the environment variables with your values, and save:

   ```json
   {
     "mcpServers": {
       "firestore": {
         "command": "./PATH/TO/toolbox",
         "args": ["--prebuilt", "firestore", "--stdio"],
         "env": {
           "FIRESTORE_PROJECT": "your-project-id",
           "FIRESTORE_DATABASE": "(default)"
         }
       }
     }
   }
   ```

1. Open [Cursor](https://www.cursor.com/) and navigate to **Settings > Cursor Settings > MCP**. You should see a green active status after the server is successfully connected.
{{% /tab %}}

{{% tab header="Visual Studio Code (Copilot)" lang="en" %}}

1. Open [VS Code](https://code.visualstudio.com/docs/copilot/overview) and create a `.vscode` directory in your project root if it doesn't exist.
1. Create a `.vscode/mcp.json` file if it doesn't exist and open it.
1. Add the following configuration, replace the environment variables with your values, and save:

   ```json
   {
     "mcpServers": {
       "firestore": {
         "command": "./PATH/TO/toolbox",
         "args": ["--prebuilt", "firestore", "--stdio"],
         "env": {
           "FIRESTORE_PROJECT": "your-project-id",
           "FIRESTORE_DATABASE": "(default)"
         }
       }
     }
   }
   ```

{{% /tab %}}

{{% tab header="Windsurf" lang="en" %}}

1. Open [Windsurf](https://docs.codeium.com/windsurf) and navigate to the Cascade assistant.
1. Tap on the hammer (MCP) icon, then Configure to open the configuration file.
1. Add the following configuration, replace the environment variables with your values, and save:

   ```json
   {
     "mcpServers": {
       "firestore": {
         "command": "./PATH/TO/toolbox",
         "args": ["--prebuilt", "firestore", "--stdio"],
         "env": {
           "FIRESTORE_PROJECT": "your-project-id",
           "FIRESTORE_DATABASE": "(default)"
         }
       }
     }
   }
   ```

{{% /tab %}}
{{% tab header="Gemini CLI" lang="en" %}}

1. Install the [Gemini CLI](https://github.com/google-gemini/gemini-cli?tab=readme-ov-file#quickstart).
1. In your working directory, create a folder named `.gemini`. Within it, create a `settings.json` file.
1. Add the following configuration, replace the environment variables with your values, and then save:

   ```json
   {
     "mcpServers": {
       "firestore": {
         "command": "./PATH/TO/toolbox",
         "args": ["--prebuilt", "firestore", "--stdio"],
         "env": {
           "FIRESTORE_PROJECT": "your-project-id",
           "FIRESTORE_DATABASE": "(default)"
         }
       }
     }
   }
   ```

{{% /tab %}}
{{% tab header="Gemini Code Assist" lang="en" %}}

1. Install the [Gemini Code Assist](https://marketplace.visualstudio.com/items?itemName=Google.geminicodeassist) extension in Visual Studio Code.
1. Enable Agent Mode in Gemini Code Assist chat.
1. In your working directory, create a folder named `.gemini`. Within it, create a `settings.json` file.
1. Add the following configuration, replace the environment variables with your values, and then save:

   ```json
   {
     "mcpServers": {
       "firestore": {
         "command": "./PATH/TO/toolbox",
         "args": ["--prebuilt", "firestore", "--stdio"],
         "env": {
           "FIRESTORE_PROJECT": "your-project-id",
           "FIRESTORE_DATABASE": "(default)"
         }
       }
     }
   }
   ```

{{% /tab %}}
{{< /tabpane >}}

## Use Tools

Your AI tool is now connected to Firestore using MCP. Try asking your AI assistant to list collections, get documents, query collections, or manage security rules.

The following tools are available to the LLM:

1. **firestore-get-documents**: gets multiple documents from Firestore by their paths
1. **firestore-list-collections**: lists Firestore collections for a given parent path
1. **firestore-delete-documents**: deletes multiple documents from Firestore
1. **firestore-query-collection**: queries documents from a collection with filtering, ordering, and limit options
1. **firestore-get-rules**: retrieves the active Firestore security rules for the current project
1. **firestore-validate-rules**: validates Firestore security rules and reports syntax errors

{{< notice note >}}
Prebuilt tools are pre-1.0, so expect some tool changes between versions. LLMs will adapt to the tools available, so this shouldn't affect most users.
{{< /notice >}}
298
docs/en/how-to/connect-ide/looker_mcp.md
Normal file
298
docs/en/how-to/connect-ide/looker_mcp.md
Normal file
@@ -0,0 +1,298 @@
---
title: "Looker using MCP"
type: docs
weight: 2
description: >
  Connect your IDE to Looker using Toolbox.
---

[Model Context Protocol (MCP)](https://modelcontextprotocol.io/introduction) is
an open protocol for connecting Large Language Models (LLMs) to data sources
like Looker. This guide covers how to use [MCP Toolbox for Databases][toolbox]
to connect the following developer assistant tools to a Looker instance:

* [Cursor][cursor]
* [Windsurf][windsurf] (Codium)
* [Visual Studio Code][vscode] (Copilot)
* [Cline][cline] (VS Code extension)
* [Claude desktop][claudedesktop]
* [Claude code][claudecode]

[toolbox]: https://github.com/googleapis/genai-toolbox
[cursor]: #configure-your-mcp-client
[windsurf]: #configure-your-mcp-client
[vscode]: #configure-your-mcp-client
[cline]: #configure-your-mcp-client
[claudedesktop]: #configure-your-mcp-client
[claudecode]: #configure-your-mcp-client

## Set up Looker

1. Get a Looker Client ID and Client Secret. Follow the directions
   [here](https://cloud.google.com/looker/docs/api-auth#authentication_with_an_sdk).

1. Have the base URL of your Looker instance available. It is likely
   something like `https://looker.example.com`. In some cases the API is
   listening at a different port, and you will need to use
   `https://looker.example.com:19999` instead.

## Install MCP Toolbox

1. Download the latest version of Toolbox as a binary. Select the [correct
   binary](https://github.com/googleapis/genai-toolbox/releases) corresponding
   to your OS and CPU architecture. You must use Toolbox version
   v0.10.0 or later:

   <!-- {x-release-please-start-version} -->
   {{< tabpane persist=header >}}
   {{< tab header="linux/amd64" lang="bash" >}}
   curl -O https://storage.googleapis.com/genai-toolbox/v0.10.0/linux/amd64/toolbox
   {{< /tab >}}

   {{< tab header="darwin/arm64" lang="bash" >}}
   curl -O https://storage.googleapis.com/genai-toolbox/v0.10.0/darwin/arm64/toolbox
   {{< /tab >}}

   {{< tab header="darwin/amd64" lang="bash" >}}
   curl -O https://storage.googleapis.com/genai-toolbox/v0.10.0/darwin/amd64/toolbox
   {{< /tab >}}

   {{< tab header="windows/amd64" lang="bash" >}}
   curl -O https://storage.googleapis.com/genai-toolbox/v0.10.0/windows/amd64/toolbox.exe
   {{< /tab >}}
   {{< /tabpane >}}
   <!-- {x-release-please-end} -->

1. Make the binary executable:

   ```bash
   chmod +x toolbox
   ```

1. Verify the installation:

   ```bash
   ./toolbox --version
   ```

## Configure your MCP Client

{{< tabpane text=true >}}
{{% tab header="Claude code" lang="en" %}}

1. Install [Claude
   Code](https://docs.anthropic.com/en/docs/agents-and-tools/claude-code/overview).
1. Create a `.mcp.json` file in your project root if it doesn't exist.
1. Add the following configuration, replace the environment variables with your
   values, and save:

   ```json
   {
     "mcpServers": {
       "looker-toolbox": {
         "command": "/PATH/TO/toolbox",
         "args": [
           "--stdio",
           "--prebuilt",
           "looker"
         ],
         "env": {
           "LOOKER_BASE_URL": "https://looker.example.com",
           "LOOKER_CLIENT_ID": "",
           "LOOKER_CLIENT_SECRET": "",
           "LOOKER_VERIFY_SSL": "true"
         }
       }
     }
   }
   ```

1. Restart Claude Code to apply the new configuration.
{{% /tab %}}

{{% tab header="Claude desktop" lang="en" %}}

1. Open [Claude desktop](https://claude.ai/download) and navigate to Settings.
1. Under the Developer tab, tap Edit Config to open the configuration file.
1. Add the following configuration, replace the environment variables with your
   values, and save:

   ```json
   {
     "mcpServers": {
       "looker-toolbox": {
         "command": "/PATH/TO/toolbox",
         "args": [
           "--stdio",
           "--prebuilt",
           "looker"
         ],
         "env": {
           "LOOKER_BASE_URL": "https://looker.example.com",
           "LOOKER_CLIENT_ID": "",
           "LOOKER_CLIENT_SECRET": "",
           "LOOKER_VERIFY_SSL": "true"
         }
       }
     }
   }
   ```

1. Restart Claude desktop.
1. From the new chat screen, you should see a hammer (MCP) icon appear with the
   new MCP server available.
{{% /tab %}}

{{% tab header="Cline" lang="en" %}}

1. Open the [Cline](https://github.com/cline/cline) extension in VS Code and tap
   the **MCP Servers** icon.
1. Tap Configure MCP Servers to open the configuration file.
1. Add the following configuration, replace the environment variables with your
   values, and save:

   ```json
   {
     "mcpServers": {
       "looker-toolbox": {
         "command": "/PATH/TO/toolbox",
         "args": [
           "--stdio",
           "--prebuilt",
           "looker"
         ],
         "env": {
           "LOOKER_BASE_URL": "https://looker.example.com",
           "LOOKER_CLIENT_ID": "",
           "LOOKER_CLIENT_SECRET": "",
           "LOOKER_VERIFY_SSL": "true"
         }
       }
     }
   }
   ```

1. You should see a green active status after the server is successfully
   connected.
{{% /tab %}}

{{% tab header="Cursor" lang="en" %}}

1. Create a `.cursor` directory in your project root if it doesn't exist.
1. Create a `.cursor/mcp.json` file if it doesn't exist and open it.
1. Add the following configuration, replace the environment variables with your
   values, and save:

   ```json
   {
     "mcpServers": {
       "looker-toolbox": {
         "command": "/PATH/TO/toolbox",
         "args": [
           "--stdio",
           "--prebuilt",
           "looker"
         ],
         "env": {
           "LOOKER_BASE_URL": "https://looker.example.com",
           "LOOKER_CLIENT_ID": "",
           "LOOKER_CLIENT_SECRET": "",
           "LOOKER_VERIFY_SSL": "true"
         }
       }
     }
   }
   ```

1. Open [Cursor](https://www.cursor.com/) and navigate to **Settings > Cursor
   Settings > MCP**. You should see a green active status after the server is
   successfully connected.
{{% /tab %}}

{{% tab header="Visual Studio Code (Copilot)" lang="en" %}}

1. Open [VS Code](https://code.visualstudio.com/docs/copilot/overview) and
   create a `.vscode` directory in your project root if it doesn't exist.
1. Create a `.vscode/mcp.json` file if it doesn't exist and open it.
1. Add the following configuration, replace the environment variables with your
   values, and save:

   ```json
   {
     "mcpServers": {
       "looker-toolbox": {
         "command": "/PATH/TO/toolbox",
         "args": [
           "--stdio",
           "--prebuilt",
           "looker"
         ],
         "env": {
           "LOOKER_BASE_URL": "https://looker.example.com",
           "LOOKER_CLIENT_ID": "",
           "LOOKER_CLIENT_SECRET": "",
           "LOOKER_VERIFY_SSL": "true"
         }
       }
     }
   }
   ```

{{% /tab %}}

{{% tab header="Windsurf" lang="en" %}}

1. Open [Windsurf](https://docs.codeium.com/windsurf) and navigate to the
   Cascade assistant.
1. Tap on the hammer (MCP) icon, then Configure to open the configuration file.
1. Add the following configuration, replace the environment variables with your
   values, and save:

   ```json
   {
     "mcpServers": {
       "looker-toolbox": {
         "command": "/PATH/TO/toolbox",
         "args": [
           "--stdio",
           "--prebuilt",
           "looker"
         ],
         "env": {
           "LOOKER_BASE_URL": "https://looker.example.com",
           "LOOKER_CLIENT_ID": "",
           "LOOKER_CLIENT_SECRET": "",
           "LOOKER_VERIFY_SSL": "true"
         }
       }
     }
   }
   ```

{{% /tab %}}
{{< /tabpane >}}

## Use Tools

Your AI tool is now connected to Looker using MCP. Try asking your AI
assistant to list models, explores, dimensions, and measures; run a
query; retrieve the SQL for a query; or run a saved Look.

The following tools are available to the LLM:

1. **get_models**: lists the LookML models in Looker
1. **get_explores**: lists the explores in a given model
1. **get_dimensions**: lists the dimensions in a given explore
1. **get_measures**: lists the measures in a given explore
1. **get_filters**: lists the filters in a given explore
1. **get_parameters**: lists the parameters in a given explore
1. **query**: runs a query
1. **query_sql**: returns the SQL generated by Looker for a query
1. **get_looks**: returns the saved Looks that match a title or description
1. **run_look**: runs a saved Look and returns the data

{{< notice note >}}
Prebuilt tools are pre-1.0, so expect some tool changes between versions. LLMs
will adapt to the tools available, so this shouldn't affect most users.
{{< /notice >}}
278	docs/en/how-to/connect-ide/postgres_mcp.md	Normal file
@@ -0,0 +1,278 @@
---
title: "PostgreSQL using MCP"
type: docs
weight: 2
description: >
  Connect your IDE to PostgreSQL using Toolbox.
---

[Model Context Protocol (MCP)](https://modelcontextprotocol.io/introduction) is
an open protocol for connecting Large Language Models (LLMs) to data sources
like Postgres. This guide covers how to use [MCP Toolbox for Databases][toolbox]
to connect the following developer assistant tools to a Postgres instance:

* [Cursor][cursor]
* [Windsurf][windsurf] (Codium)
* [Visual Studio Code][vscode] (Copilot)
* [Cline][cline] (VS Code extension)
* [Claude desktop][claudedesktop]
* [Claude code][claudecode]

[toolbox]: https://github.com/googleapis/genai-toolbox
[cursor]: #configure-your-mcp-client
[windsurf]: #configure-your-mcp-client
[vscode]: #configure-your-mcp-client
[cline]: #configure-your-mcp-client
[claudedesktop]: #configure-your-mcp-client
[claudecode]: #configure-your-mcp-client

{{< notice tip >}}
This guide can be used with [AlloyDB
Omni](https://cloud.google.com/alloydb/omni/current/docs/overview).
{{< /notice >}}

## Set up the database

1. Create or select a PostgreSQL instance.

   * [Install PostgreSQL locally](https://www.postgresql.org/download/)
   * [Install AlloyDB Omni](https://cloud.google.com/alloydb/omni/current/docs/quickstart)

1. Create or reuse [a database
   user](https://cloud.google.com/alloydb/omni/current/docs/database-users/manage-users)
   and have the username and password ready.

## Install MCP Toolbox

1. Download the latest version of Toolbox as a binary. Select the [correct
   binary](https://github.com/googleapis/genai-toolbox/releases) corresponding
   to your OS and CPU architecture. You must use Toolbox version
   v0.6.0 or later:

   <!-- {x-release-please-start-version} -->
   {{< tabpane persist=header >}}
   {{< tab header="linux/amd64" lang="bash" >}}
   curl -O https://storage.googleapis.com/genai-toolbox/v0.9.0/linux/amd64/toolbox
   {{< /tab >}}

   {{< tab header="darwin/arm64" lang="bash" >}}
   curl -O https://storage.googleapis.com/genai-toolbox/v0.9.0/darwin/arm64/toolbox
   {{< /tab >}}

   {{< tab header="darwin/amd64" lang="bash" >}}
   curl -O https://storage.googleapis.com/genai-toolbox/v0.9.0/darwin/amd64/toolbox
   {{< /tab >}}

   {{< tab header="windows/amd64" lang="bash" >}}
   curl -O https://storage.googleapis.com/genai-toolbox/v0.9.0/windows/amd64/toolbox.exe
   {{< /tab >}}
   {{< /tabpane >}}
   <!-- {x-release-please-end} -->

1. Make the binary executable:

   ```bash
   chmod +x toolbox
   ```

1. Verify the installation:

   ```bash
   ./toolbox --version
   ```

## Configure your MCP Client

{{< tabpane text=true >}}
{{% tab header="Claude code" lang="en" %}}

1. Install [Claude
   Code](https://docs.anthropic.com/en/docs/agents-and-tools/claude-code/overview).
1. Create a `.mcp.json` file in your project root if it doesn't exist.
1. Add the following configuration, replace the environment variables with your
   values, and save:

   ```json
   {
     "mcpServers": {
       "postgres": {
         "command": "./PATH/TO/toolbox",
         "args": ["--prebuilt","postgres","--stdio"],
         "env": {
           "POSTGRES_HOST": "",
           "POSTGRES_PORT": "",
           "POSTGRES_DATABASE": "",
           "POSTGRES_USER": "",
           "POSTGRES_PASSWORD": ""
         }
       }
     }
   }
   ```

1. Restart Claude Code to apply the new configuration.
{{% /tab %}}

{{% tab header="Claude desktop" lang="en" %}}

1. Open [Claude desktop](https://claude.ai/download) and navigate to Settings.
1. Under the Developer tab, tap Edit Config to open the configuration file.
1. Add the following configuration, replace the environment variables with your
   values, and save:

   ```json
   {
     "mcpServers": {
       "postgres": {
         "command": "./PATH/TO/toolbox",
         "args": ["--prebuilt","postgres","--stdio"],
         "env": {
           "POSTGRES_HOST": "",
           "POSTGRES_PORT": "",
           "POSTGRES_DATABASE": "",
           "POSTGRES_USER": "",
           "POSTGRES_PASSWORD": ""
         }
       }
     }
   }
   ```

1. Restart Claude desktop.
1. From the new chat screen, you should see a hammer (MCP) icon appear with the
   new MCP server available.
{{% /tab %}}

{{% tab header="Cline" lang="en" %}}

1. Open the [Cline](https://github.com/cline/cline) extension in VS Code and tap
   the **MCP Servers** icon.
1. Tap Configure MCP Servers to open the configuration file.
1. Add the following configuration, replace the environment variables with your
   values, and save:

   ```json
   {
     "mcpServers": {
       "postgres": {
         "command": "./PATH/TO/toolbox",
         "args": ["--prebuilt","postgres","--stdio"],
         "env": {
           "POSTGRES_HOST": "",
           "POSTGRES_PORT": "",
           "POSTGRES_DATABASE": "",
           "POSTGRES_USER": "",
           "POSTGRES_PASSWORD": ""
         }
       }
     }
   }
   ```

1. You should see a green active status after the server is successfully
   connected.
{{% /tab %}}

{{% tab header="Cursor" lang="en" %}}

1. Create a `.cursor` directory in your project root if it doesn't exist.
1. Create a `.cursor/mcp.json` file if it doesn't exist and open it.
1. Add the following configuration, replace the environment variables with your
   values, and save:

   ```json
   {
     "mcpServers": {
       "postgres": {
         "command": "./PATH/TO/toolbox",
         "args": ["--prebuilt","postgres","--stdio"],
         "env": {
           "POSTGRES_HOST": "",
           "POSTGRES_PORT": "",
           "POSTGRES_DATABASE": "",
           "POSTGRES_USER": "",
           "POSTGRES_PASSWORD": ""
         }
       }
     }
   }
   ```

1. Open [Cursor](https://www.cursor.com/) and navigate to **Settings > Cursor
   Settings > MCP**. You should see a green active status after the server is
   successfully connected.
{{% /tab %}}

{{% tab header="Visual Studio Code (Copilot)" lang="en" %}}

1. Open [VS Code](https://code.visualstudio.com/docs/copilot/overview) and
   create a `.vscode` directory in your project root if it doesn't exist.
1. Create a `.vscode/mcp.json` file if it doesn't exist and open it.
1. Add the following configuration, replace the environment variables with your
   values, and save:

   ```json
   {
     "mcpServers": {
       "postgres": {
         "command": "./PATH/TO/toolbox",
         "args": ["--prebuilt","postgres","--stdio"],
         "env": {
           "POSTGRES_HOST": "",
           "POSTGRES_PORT": "",
           "POSTGRES_DATABASE": "",
           "POSTGRES_USER": "",
           "POSTGRES_PASSWORD": ""
         }
       }
     }
   }
   ```

{{% /tab %}}

{{% tab header="Windsurf" lang="en" %}}

1. Open [Windsurf](https://docs.codeium.com/windsurf) and navigate to the
   Cascade assistant.
1. Tap on the hammer (MCP) icon, then Configure to open the configuration file.
1. Add the following configuration, replace the environment variables with your
   values, and save:

   ```json
   {
     "mcpServers": {
       "postgres": {
         "command": "./PATH/TO/toolbox",
         "args": ["--prebuilt","postgres","--stdio"],
         "env": {
           "POSTGRES_HOST": "",
           "POSTGRES_PORT": "",
           "POSTGRES_DATABASE": "",
           "POSTGRES_USER": "",
           "POSTGRES_PASSWORD": ""
         }
       }
     }
   }
   ```

{{% /tab %}}
{{< /tabpane >}}

## Use Tools

Your AI tool is now connected to Postgres using MCP. Try asking your AI
assistant to list tables, create a table, or define and execute other SQL
statements.

The following tools are available to the LLM:

1. **list_tables**: lists tables and their descriptions
1. **execute_sql**: executes any SQL statement

{{< notice note >}}
Prebuilt tools are pre-1.0, so expect some tool changes between versions. LLMs
will adapt to the tools available, so this shouldn't affect most users.
{{< /notice >}}
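When the assistant uses one of these tools, the client sends an MCP `tools/call` request over the transport. A hedged sketch of such a message for `execute_sql` (the `sql` argument name is an assumption for illustration; check the schema the server returns from `tools/list` for the real parameter names):

```python
import json

# Illustrative JSON-RPC 2.0 tools/call message for the prebuilt execute_sql
# tool. The "sql" argument name is assumed, not taken from this guide.
request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "execute_sql",
        "arguments": {
            "sql": "SELECT table_name FROM information_schema.tables LIMIT 5"
        },
    },
}

# Serialized, this is the newline-delimited JSON the stdio transport carries.
print(json.dumps(request))
```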
13	docs/en/how-to/connect-ide/spanner_mcp.md	Normal file
@@ -0,0 +1,13 @@
---
title: "Spanner using MCP"
type: docs
weight: 2
description: >
  Connect your IDE to Spanner using Toolbox.
---
<html>
<head>
  <link rel="canonical" href="https://cloud.google.com/spanner/docs/pre-built-tools-with-mcp-toolbox"/>
  <meta http-equiv="refresh" content="0;url=https://cloud.google.com/spanner/docs/pre-built-tools-with-mcp-toolbox"/>
</head>
</html>
@@ -7,40 +7,71 @@ description: >
---

## Toolbox SDKs vs Model Context Protocol (MCP)

Toolbox now supports connections via both the native Toolbox SDKs and via [Model
Context Protocol (MCP)](https://modelcontextprotocol.io/). However, Toolbox has
several features which are not supported in the MCP specification (such as
Authenticated Parameters and Authorized Invocations).

We recommend using the native SDKs over MCP clients to leverage these features.
The native SDKs can be combined with MCP clients in many cases.

### Protocol Versions

Toolbox currently supports the following versions of the MCP specification:

* [2025-06-18](https://modelcontextprotocol.io/specification/2025-06-18)
* [2025-03-26](https://modelcontextprotocol.io/specification/2025-03-26)
* [2024-11-05](https://modelcontextprotocol.io/specification/2024-11-05)

### Toolbox AuthZ/AuthN Not Supported by MCP

The auth implementation in Toolbox is not supported in MCP's auth specification.
This includes:

* [Authenticated Parameters](../resources/tools/_index.md#authenticated-parameters)
* [Authorized Invocations](../resources/tools/_index.md#authorized-invocations)

## Connecting to Toolbox with an MCP client

### Before you begin

{{< notice note >}}
MCP is only compatible with Toolbox version 0.3.0 and above.
{{< /notice >}}

1. [Install](../getting-started/introduction/_index.md#installing-the-server)
   Toolbox version 0.3.0+.

1. Make sure you've set up and initialized your database.

1. [Set up](../getting-started/configure.md) your `tools.yaml` file.

### Connecting via Standard Input/Output (stdio)

Toolbox supports the
[stdio](https://modelcontextprotocol.io/docs/concepts/transports#standard-input%2Foutput-stdio)
transport protocol. Users who wish to use stdio must include the
`--stdio` flag when running Toolbox:

```bash
./toolbox --stdio
```

When running with stdio, Toolbox listens on stdio instead of acting as a
remote HTTP server. Logs are set to the `warn` level by default; `debug` and
`info` logs are not supported with stdio.

{{< notice note >}}
Toolbox enables dynamic reloading by default. To disable it, use the `--disable-reload` flag.
{{< /notice >}}
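For a quick manual smoke test of the stdio transport, you can generate the two messages that open an MCP session and pipe them into the binary. A sketch under stated assumptions (the script name `handshake.py` and the minimal capability fields are illustrative; a real client sends a richer `initialize` request):

```python
import json

# Minimal MCP session-opening messages. The stdio transport expects
# newline-delimited JSON, so each message is printed on its own line.
# Pipe the output into the server, e.g.:
#   python3 handshake.py | ./toolbox --stdio
initialize = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",
        "capabilities": {},
        "clientInfo": {"name": "smoke-test", "version": "0.0.1"},
    },
}
# Notifications carry no "id"; the client sends this after the
# server's initialize response.
initialized = {"jsonrpc": "2.0", "method": "notifications/initialized"}

for message in (initialize, initialized):
    print(json.dumps(message))
```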

### Connecting via HTTP

Toolbox supports the HTTP transport protocol with and without SSE.

{{< tabpane text=true >}} {{% tab header="HTTP with SSE (deprecated)" lang="en" %}}
Add the following configuration to your MCP client configuration:

```bash
{
  "mcpServers": {
@@ -52,17 +83,54 @@ Add the following configuration to your MCP client configuration:
  }
}
```

If you would like to connect to a specific toolset, replace `url` with
`"http://127.0.0.1:5000/mcp/{toolset_name}/sse"`.

HTTP with SSE is only supported in version `2024-11-05` and is currently
deprecated.
{{% /tab %}} {{% tab header="Streamable HTTP" lang="en" %}}
Add the following configuration to your MCP client configuration:

```bash
{
  "mcpServers": {
    "toolbox": {
      "type": "http",
      "url": "http://127.0.0.1:5000/mcp"
    }
  }
}
```

If you would like to connect to a specific toolset, replace `url` with
`"http://127.0.0.1:5000/mcp/{toolset_name}"`.
{{% /tab %}} {{< /tabpane >}}
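To sanity-check the Streamable HTTP endpoint outside of an IDE, you can hand-build the same JSON-RPC request a client would send. A sketch using only the Python standard library (it assumes Toolbox is already running locally on its default port; sending the request fails until it is):

```python
import json
import urllib.request

# Build a tools/list call against the Streamable HTTP endpoint.
body = json.dumps({"jsonrpc": "2.0", "id": 1, "method": "tools/list"}).encode()
req = urllib.request.Request(
    "http://127.0.0.1:5000/mcp",
    data=body,
    headers={
        "Content-Type": "application/json",
        "Accept": "application/json, text/event-stream",
    },
    method="POST",
)

# Uncomment with a running Toolbox server to see the advertised tools:
# with urllib.request.urlopen(req) as resp:
#     print(resp.read().decode())
```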

### Using the MCP Inspector with Toolbox

Use the MCP [Inspector](https://github.com/modelcontextprotocol/inspector) for
testing and debugging the Toolbox server.

{{< tabpane text=true >}}
{{% tab header="STDIO" lang="en" %}}

1. Run Inspector with Toolbox as a subprocess:

   ```bash
   npx @modelcontextprotocol/inspector ./toolbox --stdio
   ```

1. For the `Transport Type` dropdown menu, select `STDIO`.

1. In `Command`, make sure that it is set to `./toolbox` (or the correct path
   to where the Toolbox binary is installed).

1. In `Arguments`, make sure that it's filled with `--stdio`.

1. Click the `Connect` button. It might take a while to spin up Toolbox. Voila!
   You should be able to inspect your toolbox tools!
{{% /tab %}}
{{% tab header="HTTP with SSE (deprecated)" lang="en" %}}
1. [Run Toolbox](../getting-started/introduction/_index.md#running-the-server).

1. In a separate terminal, run Inspector directly through `npx`:
@@ -78,13 +146,31 @@ Use MCP [Inspector](https://github.com/modelcontextprotocol/inspector) for testi

1. Click the `Connect` button. Voila! You should be able to inspect your toolbox
   tools!
{{% /tab %}}
{{% tab header="Streamable HTTP" lang="en" %}}
1. [Run Toolbox](../getting-started/introduction/_index.md#running-the-server).

1. In a separate terminal, run Inspector directly through `npx`:

   ```bash
   npx @modelcontextprotocol/inspector
   ```

1. For the `Transport Type` dropdown menu, select `Streamable HTTP`.

1. For `URL`, type in `http://127.0.0.1:5000/mcp` to use all tools or
   `http://127.0.0.1:5000/mcp/{toolset_name}` to use a specific toolset.

1. Click the `Connect` button. Voila! You should be able to inspect your toolbox
   tools!
{{% /tab %}} {{< /tabpane >}}

### Tested Clients

| Client | SSE Works | MCP Config Docs |
|--------|--------|--------|
| Claude Desktop | ✅ | <https://modelcontextprotocol.io/quickstart/user#1-download-claude-for-desktop> |
| MCP Inspector | ✅ | <https://github.com/modelcontextprotocol/inspector> |
| Cursor | ✅ | <https://docs.cursor.com/context/model-context-protocol> |
| Windsurf | ✅ | <https://docs.windsurf.com/windsurf/mcp> |
| VS Code (Insiders) | ✅ | <https://code.visualstudio.com/docs/copilot/chat/mcp-servers> |
@@ -1,14 +1,13 @@
---
title: "Deploy using Docker Compose"
type: docs
weight: 4
description: >
  How to deploy Toolbox using Docker Compose.
---

<!-- Contributor: Sujith R Pillai <sujithrpillai@gmail.com> -->

## Before you begin

1. [Install Docker Compose.](https://docs.docker.com/compose/install/)
@@ -35,7 +34,7 @@ services:
      - "5000:5000"
    volumes:
      - ./config:/config
    command: [ "toolbox", "--tools-file", "/config/tools.yaml", "--address", "0.0.0.0"]
    depends_on:
      db:
        condition: service_healthy
@@ -74,19 +73,16 @@ networks:
docker-compose up -d
```

{{< notice tip >}}
You can use this setup to quickly set up Toolbox + Postgres to follow along in
our [Quickstart](../getting-started/local_quickstart.md).
{{< /notice >}}

## Connecting with Toolbox Client SDK

Next, we will use Toolbox with the Client SDKs:

1. The URL for the Toolbox server running using docker-compose will be:

@@ -101,14 +97,14 @@ Next, we will use Toolbox with the Client SDKs:
   from toolbox_langchain import ToolboxClient

   # Replace with the cloud run service URL generated above
   async with ToolboxClient("http://$YOUR_URL") as toolbox:
   {{< /tab >}}
   {{< tab header="Llamaindex" lang="Python" >}}
   from toolbox_llamaindex import ToolboxClient

   # Replace with the cloud run service URL generated above
   async with ToolboxClient("http://$YOUR_URL") as toolbox:
   {{< /tab >}}
   {{< /tabpane >}}
@@ -1,7 +1,7 @@
---
title: "Deploy to Kubernetes"
type: docs
weight: 4
description: >
  How to set up and configure Toolbox to deploy on Kubernetes with Google Kubernetes Engine (GKE).
---
@@ -9,7 +9,6 @@ description: >

## Before you begin

1. Set the PROJECT_ID environment variable:

   ```bash
@@ -40,7 +39,6 @@ description: >
   ```bash
   kubectl version --client
   ```

1. If needed, install the `kubectl` component using the Google Cloud CLI:

@@ -62,7 +60,7 @@ description: >
   gcloud iam service-accounts create $SA_NAME
   ```

1. Grant any IAM roles necessary to the IAM service account. Each source has a
   list of necessary IAM permissions listed on its page. The example below is
   for the Cloud SQL for PostgreSQL source:

@@ -76,7 +74,7 @@ description: >
   - [CloudSQL IAM Identity](../resources/sources/cloud-sql-pg.md#iam-permissions)
   - [Spanner IAM Identity](../resources/sources/spanner.md#iam-permissions)

## Deploy to Kubernetes

1. Set environment variables:

@@ -94,7 +92,7 @@ description: >

   ```bash
   gcloud container clusters create-auto $CLUSTER_NAME \
     --location=us-central1
   ```

1. Get authentication credentials to interact with the cluster. This also
@@ -254,6 +252,7 @@ description: >
   ```

## Clean up resources

1. Delete the secret.

   ```bash
@@ -1,7 +1,7 @@
 ---
 title: "Deploy to Cloud Run"
 type: docs
-weight: 1
+weight: 3
 description: >
   How to set up and configure Toolbox to run on Cloud Run.
 ---
@@ -33,7 +33,7 @@ description: >
     cloudbuild.googleapis.com \
     artifactregistry.googleapis.com \
     iam.googleapis.com \
     secretmanager.googleapis.com
   ```

@@ -48,21 +48,12 @@ description: >
 - Cloud Run Developer (roles/run.developer)
 - Service Account User role (roles/iam.serviceAccountUser)

-{{< notice note >}}
-If you are under a domain restriction organization policy
-[restricting](https://cloud.google.com/run/docs/authenticating/public#domain-restricted-sharing)
-unauthenticated invocations for your project, you will need to access your
-deployed service as described under [Testing private
-services](https://cloud.google.com/run/docs/triggering/https-request#testing-private).
-{{< /notice >}}
-
 {{< notice note >}}
 If you are using sources that require VPC-access (such as
 AlloyDB or Cloud SQL over private IP), make sure your Cloud Run service and the
 database are in the same VPC network.
 {{< /notice >}}

## Create a service account

1. Create a backend service account if you don't already have one:

@@ -71,7 +62,7 @@ database are in the same VPC network.
   gcloud iam service-accounts create toolbox-identity
   ```

1. Grant permissions to use secret manager:

   ```bash
   gcloud projects add-iam-policy-binding $PROJECT_ID \
@@ -79,7 +70,8 @@ database are in the same VPC network.
     --role roles/secretmanager.secretAccessor
   ```

-1. Grant additional permissions to the service account that are specific to the source, e.g.:
+1. Grant additional permissions to the service account that are specific to the
+   source, e.g.:
   - [AlloyDB for PostgreSQL](../resources/sources/alloydb-pg.md#iam-permissions)
   - [Cloud SQL for PostgreSQL](../resources/sources/cloud-sql-pg.md#iam-permissions)

@@ -87,7 +79,7 @@ database are in the same VPC network.

Create a `tools.yaml` file that contains your configuration for Toolbox. For
details, see the
-[configuration](https://github.com/googleapis/genai-toolbox/blob/main/README.md#configuration)
+[configuration](https://googleapis.github.io/genai-toolbox/resources/sources/)
section.

## Deploy to Cloud Run

@@ -105,7 +97,8 @@ section.
   gcloud secrets versions add tools --data-file=tools.yaml
   ```

-1. Set an environment variable to the container image that you want to use for cloud run:
+1. Set an environment variable to the container image that you want to use for
+   Cloud Run:

   ```bash
   export IMAGE=us-central1-docker.pkg.dev/database-toolbox/toolbox/toolbox:latest
@@ -119,7 +112,7 @@ section.
     --service-account toolbox-identity \
     --region us-central1 \
     --set-secrets "/app/tools.yaml=tools:latest" \
-    --args="--tools_file=/app/tools.yaml","--address=0.0.0.0","--port=8080"
+    --args="--tools-file=/app/tools.yaml","--address=0.0.0.0","--port=8080"
     # --allow-unauthenticated # https://cloud.google.com/run/docs/authenticating/public#gcloud
   ```

@@ -131,34 +124,24 @@ section.
     --service-account toolbox-identity \
     --region us-central1 \
     --set-secrets "/app/tools.yaml=tools:latest" \
-    --args="--tools_file=/app/tools.yaml","--address=0.0.0.0","--port=8080" \
+    --args="--tools-file=/app/tools.yaml","--address=0.0.0.0","--port=8080" \
     # TODO(dev): update the following to match your VPC if necessary
     --network default \
     --subnet default
     # --allow-unauthenticated # https://cloud.google.com/run/docs/authenticating/public#gcloud
   ```

## Connecting to Cloud Run

Next, we will use `gcloud` to authenticate requests to our Cloud Run instance:

1. Run `gcloud run services proxy` to proxy connections to Cloud Run:

   ```bash
   gcloud run services proxy toolbox --port=8080 --region=us-central1
   ```

   If you are prompted to install the proxy, reply *Y* to install.

1. Finally, use `curl` to verify the endpoint works:

   ```bash
   curl http://127.0.0.1:8080
   ```

## Connecting with Toolbox Client SDK

-Next, we will use Toolbox with client SDK:
+You can connect to Toolbox Cloud Run instances directly through the SDK.

1. [Set up `Cloud Run Invoker` role
   access](https://cloud.google.com/run/docs/securing/managing-access#service-add-principals)
   to your Cloud Run service.

1. (Only for local runs) Set up [Application Default
   Credentials](https://cloud.google.com/docs/authentication/set-up-adc-local-dev-environment)
   for the principal you granted the `Cloud Run Invoker` role access to.

1. Run the following to retrieve a non-deterministic URL for the Cloud Run service:

@@ -168,18 +151,18 @@ Next, we will use Toolbox with client SDK:

1. Import and initialize the toolbox client with the URL retrieved above:

-   {{< tabpane persist=header >}}
-   {{< tab header="LangChain" lang="Python" >}}
-   from toolbox_langchain import ToolboxClient
-
-   # Replace with the cloud run service URL generated above
-   toolbox = ToolboxClient("http://$YOUR_URL")
-   {{< /tab >}}
-   {{< tab header="Llamaindex" lang="Python" >}}
-   from toolbox_llamaindex import ToolboxClient
-
-   # Replace with the cloud run service URL generated above
-   toolbox = ToolboxClient("http://$YOUR_URL")
-   {{< /tab >}}
-   {{< /tabpane >}}
+   ```python
+   from toolbox_core import ToolboxClient, auth_methods
+
+   # Replace with the Cloud Run service URL generated in the previous step.
+   URL = "https://cloud-run-url.app"
+
+   auth_token_provider = auth_methods.aget_google_id_token(URL)  # can also use sync method
+
+   async with ToolboxClient(
+       URL,
+       client_headers={"Authorization": auth_token_provider},
+   ) as toolbox:
+   ```

Now, you can use this client to connect to the deployed Cloud Run instance!
@@ -1,13 +1,13 @@
 ---
 title: "Export Telemetry"
 type: docs
-weight: 4
+weight: 5
 description: >
   How to set up and configure Toolbox to use the Otel Collector.
 ---

## About

The [OpenTelemetry Collector][about-collector] offers a vendor-agnostic
implementation of how to receive, process and export telemetry data. It removes
@@ -20,6 +20,7 @@ the need to run, operate, and maintain multiple agents/collectors.

To configure the collector, you will have to provide a configuration file. The
configuration file consists of four classes of pipeline components that access
telemetry data:

- `Receivers`
- `Processors`
- `Exporters`
@@ -56,7 +57,7 @@ service:
      exporters: ["googlecloud"]
```
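The pipeline component classes above have to line up: a pipeline under `service` can only reference receivers, processors, and exporters that are defined in the corresponding top-level sections. A minimal sketch of that cross-check in plain Python (not part of the Collector itself, shown here only to illustrate the config structure):

```python
# Check that every component referenced in a collector-style config's
# service pipelines is defined in the matching top-level section.

def undefined_components(config: dict) -> list[str]:
    """Return pipeline references that have no matching definition."""
    missing = []
    pipelines = config.get("service", {}).get("pipelines", {})
    for name, pipeline in pipelines.items():
        for section in ("receivers", "processors", "exporters"):
            for component in pipeline.get(section, []):
                if component not in config.get(section, {}):
                    missing.append(f"{name}.{section}.{component}")
    return missing

config = {
    "receivers": {"otlp": {}},
    "exporters": {"googlecloud": {}},
    "service": {
        "pipelines": {
            "traces": {"receivers": ["otlp"], "exporters": ["googlecloud"]},
            "metrics": {"receivers": ["otlp"], "exporters": ["prometheus"]},
        }
    },
}
print(undefined_components(config))  # the metrics exporter is undefined
```

Running this against a parsed YAML config surfaces dangling references before the Collector rejects them at startup.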

-## Running the Connector
+## Running the Collector

There are a couple of steps to run and use a Collector.
@@ -2,5 +2,6 @@
 title: "Resources"
 type: docs
 weight: 4
-description: List of reference documentation for resources in Toolbox.
+description: >
+  List of reference documentation for resources in Toolbox.
 ---

@@ -10,8 +10,8 @@ AuthServices represent services that handle authentication and authorization. It
can primarily be used by [Tools](../tools) in two different ways:

 - [**Authorized Invocation**][auth-invoke] is when a tool
-  to be validate by the auth service before the call can be invoked. Toolbox
-  will rejected an calls that fail to validate or have an invalid token.
+  is validated by the auth service before the call can be invoked. Toolbox
+  will reject any calls that fail to validate or have an invalid token.
 - [**Authenticated Parameters**][auth-params] replace the value of a parameter
   with a field from an [OIDC][openid-claims] claim. Toolbox will automatically
   resolve the ID token provided by the client and replace the parameter in the
@@ -50,8 +50,8 @@ After you've configured an `authService` you'll, need to reference it in the
configuration for each tool that should use it:

 - **Authorized Invocations** for authorizing a tool call, [use the
   `requiredAuth` field in a tool config][auth-invoke]
-- **Authenticated Parameters** for using the value from a ODIC claim, [use the
-  `authRequired` field in a tool config][auth-invoke]
+- **Authenticated Parameters** for using the value from an OIDC claim, [use the
+  `authServices` field in a parameter config][auth-params]

## Specifying ID Tokens from Clients

@@ -62,52 +62,106 @@ token you will provide a function (that returns an id). This function is called
when the tool is invoked. This allows you to cache and refresh the ID token as
needed.

The primary method for providing these getters is via the `auth_token_getters`
parameter when loading tools, or the `add_auth_token_getter()` /
`add_auth_token_getters()` methods on a loaded tool object.

### Specifying tokens during load

{{< tabpane persist=header >}}
-{{< tab header="LangChain" lang="Python" >}}
-from toolbox_langchain import ToolboxClient
+{{< tab header="Core" lang="Python" >}}
+import asyncio
+from toolbox_core import ToolboxClient
+
+async def get_auth_token():
+    # ... Logic to retrieve ID token (e.g., from local storage, OAuth flow)
+    # This example just returns a placeholder. Replace with your actual token retrieval.
+    return "YOUR_ID_TOKEN"  # Placeholder

-# for a single tool use
-authorized_tool = toolbox.load_tool("my-tool-name", auth_tokens={"my_auth": get_auth_token})
+async def main():
+    async with ToolboxClient("http://127.0.0.1:5000") as toolbox:
+        auth_tool = await toolbox.load_tool(
+            "get_sensitive_data",
+            auth_token_getters={"my_auth_app_1": get_auth_token}
+        )
+        result = await auth_tool(param="value")
+        print(result)

-# for a toolset use
-authorized_tools = toolbox.load_toolset("my-toolset-name", auth_tokens={"my_auth": get_auth_token})
+if __name__ == "__main__":
+    asyncio.run(main())
{{< /tab >}}
+{{< tab header="LangChain" lang="Python" >}}
+import asyncio
+from toolbox_langchain import ToolboxClient
+
+async def get_auth_token():
+    # ... Logic to retrieve ID token (e.g., from local storage, OAuth flow)
+    # This example just returns a placeholder. Replace with your actual token retrieval.
+    return "YOUR_ID_TOKEN"  # Placeholder
+
+async def main():
+    toolbox = ToolboxClient("http://127.0.0.1:5000")
+
+    auth_tool = await toolbox.aload_tool(
+        "get_sensitive_data",
+        auth_token_getters={"my_auth_app_1": get_auth_token}
+    )
+    result = await auth_tool.ainvoke({"param": "value"})
+    print(result)
+
+if __name__ == "__main__":
+    asyncio.run(main())
+{{< /tab >}}
{{< tab header="Llamaindex" lang="Python" >}}
+import asyncio
from toolbox_llamaindex import ToolboxClient

+async def get_auth_token():
+    # ... Logic to retrieve ID token (e.g., from local storage, OAuth flow)
+    # This example just returns a placeholder. Replace with your actual token retrieval.
+    return "YOUR_ID_TOKEN"  # Placeholder
+
-# for a single tool use
-authorized_tool = toolbox.load_tool("my-tool-name", auth_tokens={"my_auth": get_auth_token})
+async def main():
+    toolbox = ToolboxClient("http://127.0.0.1:5000")
+
+    auth_tool = await toolbox.aload_tool(
+        "get_sensitive_data",
+        auth_token_getters={"my_auth_app_1": get_auth_token}
+    )
+    # result = await auth_tool.acall(param="value")
+    # print(result.content)

-# for a toolset use
-authorized_tools = toolbox.load_toolset("my-toolset-name", auth_tokens={"my_auth": get_auth_token})
-{{< /tab >}}
+if __name__ == "__main__":
+    asyncio.run(main())
+{{< /tab >}}
{{< /tabpane >}}

### Specifying tokens for existing tools

{{< tabpane persist=header >}}
+{{< tab header="Core" lang="Python" >}}
+tools = await toolbox.load_toolset()
+
+# for a single token
+authorized_tool = tools[0].add_auth_token_getter("my_auth", get_auth_token)
+
+# OR, if multiple tokens are needed
+authorized_tool = tools[0].add_auth_token_getters({
+    "my_auth1": get_auth1_token,
+    "my_auth2": get_auth2_token,
+})
+{{< /tab >}}
{{< tab header="LangChain" lang="Python" >}}
tools = toolbox.load_toolset()

# for a single token
-auth_tools = [tool.add_auth_token("my_auth", get_auth_token) for tool in tools]
+authorized_tool = tools[0].add_auth_token_getter("my_auth", get_auth_token)

# OR, if multiple tokens are needed
-authorized_tool = tools[0].add_auth_tokens({
+authorized_tool = tools[0].add_auth_token_getters({
    "my_auth1": get_auth1_token,
    "my_auth2": get_auth2_token,
})
@@ -117,11 +171,11 @@ tools = toolbox.load_toolset()

# for a single token
-auth_tools = [tool.add_auth_token("my_auth", get_auth_token) for tool in tools]
+authorized_tool = tools[0].add_auth_token_getter("my_auth", get_auth_token)

# OR, if multiple tokens are needed
-authorized_tool = tools[0].add_auth_tokens({
+authorized_tool = tools[0].add_auth_token_getters({
    "my_auth1": get_auth1_token,
    "my_auth2": get_auth2_token,
})
@@ -22,6 +22,17 @@ cluster][alloydb-free-trial].
 [alloydb-docs]: https://cloud.google.com/alloydb/docs
 [alloydb-free-trial]: https://cloud.google.com/alloydb/docs/create-free-trial-cluster

+## Available Tools
+
+- [`alloydb-ai-nl`](../tools/alloydbainl/alloydb-ai-nl.md)
+  Use natural language queries on AlloyDB, powered by AlloyDB AI.
+
+- [`postgres-sql`](../tools/postgres/postgres-sql.md)
+  Execute SQL queries as prepared statements in AlloyDB Postgres.
+
+- [`postgres-execute-sql`](../tools/postgres/postgres-execute-sql.md)
+  Run parameterized SQL statements in AlloyDB Postgres.
+
 ## Requirements

 ### IAM Permissions
@@ -38,11 +49,6 @@ permissions):
 - `roles/alloydb.client`
 - `roles/serviceusage.serviceUsageConsumer`

-To connect to your AlloyDB Source using IAM authentication:
-
-1. Specify your IAM email as the `user` or leave it blank for Toolbox to fetch from ADC.
-2. Leave the `password` field blank.
-
 [alloydb-go-conn]: https://github.com/GoogleCloudPlatform/alloydb-go-connector
 [adc]: https://cloud.google.com/docs/authentication#adc
 [set-adc]: https://cloud.google.com/docs/authentication/provide-credentials-adc
@@ -63,11 +69,34 @@ mTLS.
 [public-ip]: https://cloud.google.com/alloydb/docs/connect-public-ip
 [conn-overview]: https://cloud.google.com/alloydb/docs/connection-overview

-### Database User
+### Authentication

-Currently, this source only uses standard authentication. You will need to [create
-a PostgreSQL user][alloydb-users] to log in to the database with.
+This source supports both password-based authentication and IAM
+authentication (using your [Application Default Credentials][adc]).

+#### Standard Authentication
+
+To connect using user/password, [create
+a PostgreSQL user][alloydb-users] and input your credentials in the `user` and
+`password` fields.
+
+```yaml
+user: ${USER_NAME}
+password: ${PASSWORD}
+```
+
+#### IAM Authentication
+
+To connect using IAM authentication:
+
+1. Prepare your database instance and user following this [guide][iam-guide].
+2. You can choose one of two ways to log in:
+   - Specify your IAM email as the `user`.
+   - Leave your `user` field blank. Toolbox will fetch the [ADC][adc]
+     automatically and log in using the email associated with it.
+3. Leave the `password` field blank.
+
+[iam-guide]: https://cloud.google.com/alloydb/docs/database-users/manage-iam-auth
 [alloydb-users]: https://cloud.google.com/alloydb/docs/database-users/about

## Example
@@ -34,6 +34,26 @@ avoiding full table scans or complex filters.
 [bigquery-quickstart-cli]: https://cloud.google.com/bigquery/docs/quickstarts/quickstart-command-line
 [bigquery-googlesql]: https://cloud.google.com/bigquery/docs/reference/standard-sql/

+## Available Tools
+
+- [`bigquery-sql`](../tools/bigquery/bigquery-sql.md)
+  Run SQL queries directly against BigQuery datasets.
+
+- [`bigquery-execute-sql`](../tools/bigquery/bigquery-execute-sql.md)
+  Execute structured queries using parameters.
+
+- [`bigquery-get-dataset-info`](../tools/bigquery/bigquery-get-dataset-info.md)
+  Retrieve metadata for a specific dataset.
+
+- [`bigquery-get-table-info`](../tools/bigquery/bigquery-get-table-info.md)
+  Retrieve metadata for a specific table.
+
+- [`bigquery-list-dataset-ids`](../tools/bigquery/bigquery-list-dataset-ids.md)
+  List available dataset IDs.
+
+- [`bigquery-list-table-ids`](../tools/bigquery/bigquery-list-table-ids.md)
+  List tables in a given dataset.
+
 ## Requirements

 ### IAM Permissions
@@ -32,6 +32,11 @@ such as avoiding full table scans or complex filters.
 [bigtable-googlesql]:
 https://cloud.google.com/bigtable/docs/googlesql-overview

+## Available Tools
+
+- [`bigtable-sql`](../tools/bigtable/bigtable-sql.md)
+  Run SQL-like queries over Bigtable rows.
+
 ## Requirements

 ### IAM Permissions
@@ -19,6 +19,14 @@ to a database by following these instructions][csql-mssql-connect].
 [csql-mssql-docs]: https://cloud.google.com/sql/docs/sqlserver
 [csql-mssql-connect]: https://cloud.google.com/sql/docs/sqlserver/connect-overview

+## Available Tools
+
+- [`mssql-sql`](../tools/mssql/mssql-sql.md)
+  Execute pre-defined SQL Server queries with placeholder parameters.
+
+- [`mssql-execute-sql`](../tools/mssql/mssql-execute-sql.md)
+  Run parameterized SQL Server queries in Cloud SQL for SQL Server.
+
 ## Requirements

 ### IAM Permissions
@@ -63,8 +71,8 @@ mTLS.

### Database User

-Currently, this source only uses standard authentication. You will need to [create a
-SQL Server user][cloud-sql-users] to login to the database with.
+Currently, this source only uses standard authentication. You will need to
+[create a SQL Server user][cloud-sql-users] to log in to the database with.

[cloud-sql-users]: https://cloud.google.com/sql/docs/sqlserver/create-manage-users

@@ -96,7 +104,7 @@ instead of hardcoding your secrets into the configuration file.
| kind      | string | true | Must be "cloud-sql-mssql". |
| project   | string | true | Id of the GCP project that the cluster was created in (e.g. "my-project-id"). |
| region    | string | true | Name of the GCP region that the cluster was created in (e.g. "us-central1"). |
| instance  | string | true | Name of the Cloud SQL instance within the cluster (e.g. "my-instance"). |
| database  | string | true | Name of the Cloud SQL database to connect to (e.g. "my_db"). |
| ipAddress | string | true | IP address of the Cloud SQL instance to connect to. |
| user      | string | true | Name of the SQL Server user to connect as (e.g. "my-user"). |
@@ -20,6 +20,14 @@ to a database by following these instructions][csql-mysql-quickstart].
 [csql-mysql-docs]: https://cloud.google.com/sql/docs/mysql
 [csql-mysql-quickstart]: https://cloud.google.com/sql/docs/mysql/connect-instance-local-computer

+## Available Tools
+
+- [`mysql-sql`](../tools/mysql/mysql-sql.md)
+  Execute pre-defined prepared SQL queries in MySQL.
+
+- [`mysql-execute-sql`](../tools/mysql/mysql-execute-sql.md)
+  Run parameterized SQL queries in Cloud SQL for MySQL.
+
 ## Requirements

 ### IAM Permissions
@@ -20,6 +20,14 @@ to a database by following these instructions][csql-pg-quickstart].
 [csql-pg-docs]: https://cloud.google.com/sql/docs/postgres
 [csql-pg-quickstart]: https://cloud.google.com/sql/docs/postgres/connect-instance-local-computer

+## Available Tools
+
+- [`postgres-sql`](../tools/postgres/postgres-sql.md)
+  Execute SQL queries as prepared statements in PostgreSQL.
+
+- [`postgres-execute-sql`](../tools/postgres/postgres-execute-sql.md)
+  Run parameterized SQL statements in PostgreSQL.
+
 ## Requirements

 ### IAM Permissions
@@ -42,14 +50,9 @@ scope](https://cloud.google.com/compute/docs/access/service-accounts#accesscopes
to connect using the Cloud SQL Admin API.
{{< /notice >}}

-To connect to your Cloud SQL Source using IAM authentication:
-
-1. Specify your IAM email as the `user` or leave it blank for Toolbox to fetch from ADC.
-2. Leave the `password` field blank.
-
-[csql-go-conn]: https://github.com/GoogleCloudPlatform/cloud-sql-go-connector
-[adc]: https://cloud.google.com/docs/authentication#adc
-[set-adc]: https://cloud.google.com/docs/authentication/provide-credentials-adc
+[csql-go-conn]: <https://github.com/GoogleCloudPlatform/cloud-sql-go-connector>
+[adc]: <https://cloud.google.com/docs/authentication#adc>
+[set-adc]: <https://cloud.google.com/docs/authentication/provide-credentials-adc>

### Networking

@@ -67,12 +70,36 @@ mTLS.
 [public-ip]: https://cloud.google.com/sql/docs/postgres/configure-ip
 [conn-overview]: https://cloud.google.com/sql/docs/postgres/connect-overview

-### Database User
+### Authentication

-Currently, this source only uses standard authentication. You will need to [create
-a PostgreSQL user][cloud-sql-users] to log in to the database with.
+This source supports both password-based authentication and IAM
+authentication (using your [Application Default Credentials][adc]).

-[cloud-sql-users]: https://cloud.google.com/sql/docs/postgres/create-manage-users
+#### Standard Authentication
+
+To connect using user/password, [create
+a PostgreSQL user][cloudsql-users] and input your credentials in the `user` and
+`password` fields.
+
+```yaml
+user: ${USER_NAME}
+password: ${PASSWORD}
+```
+
+#### IAM Authentication
+
+To connect using IAM authentication:
+
+1. Prepare your database instance and user following this [guide][iam-guide].
+2. You can choose one of two ways to log in:
+   - Specify your IAM email as the `user`.
+   - Leave your `user` field blank. Toolbox will fetch the [ADC][adc]
+     automatically and log in using the email associated with it.
+3. Leave the `password` field blank.
+
+[iam-guide]: https://cloud.google.com/sql/docs/postgres/iam-logins
+[cloudsql-users]: https://cloud.google.com/sql/docs/postgres/create-manage-users

## Example

@@ -96,13 +123,13 @@ instead of hardcoding your secrets into the configuration file.

## Reference

| **field** | **type** | **required** | **description** |
|-----------|:--------:|:------------:|--------------------------------------------------------------------------------------------------------------------------|
| kind      | string   | true         | Must be "cloud-sql-postgres". |
| project   | string   | true         | Id of the GCP project that the cluster was created in (e.g. "my-project-id"). |
| region    | string   | true         | Name of the GCP region that the cluster was created in (e.g. "us-central1"). |
| instance  | string   | true         | Name of the Cloud SQL instance within the cluster (e.g. "my-instance"). |
| database  | string   | true         | Name of the Postgres database to connect to (e.g. "my_db"). |
| user      | string   | false        | Name of the Postgres user to connect as (e.g. "my-pg-user"). Defaults to IAM auth using [ADC][adc] email if unspecified. |
| password  | string   | false        | Password of the Postgres user (e.g. "my-password"). Defaults to attempting IAM authentication if unspecified. |
| ipType    | string   | false        | IP Type of the Cloud SQL instance; must be one of `public` or `private`. Default: `public`. |
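Before deploying, the required fields in the reference table can be checked programmatically. This sketch (not part of Toolbox; the helper name is hypothetical) assumes the source entry has already been parsed from `tools.yaml` into a dict:

```python
# Validate a cloud-sql-postgres source entry against the required
# fields in the reference table: kind, project, region, instance, database.
REQUIRED = ("kind", "project", "region", "instance", "database")

def missing_fields(source: dict) -> list[str]:
    """Return required fields that are absent or inconsistent."""
    missing = [f for f in REQUIRED if f not in source]
    if source.get("kind") != "cloud-sql-postgres":
        missing.append("kind=cloud-sql-postgres")
    return missing

source = {
    "kind": "cloud-sql-postgres",
    "project": "my-project-id",
    "region": "us-central1",
    "instance": "my-instance",
    "database": "my_db",
    # user/password omitted: Toolbox falls back to IAM auth via ADC
}
print(missing_fields(source))  # []
```

Omitting `user` and `password` is valid here precisely because both are optional and default to IAM authentication.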

docs/en/resources/sources/couchbase.md (new file)
@@ -0,0 +1,49 @@
---
title: "couchbase"
type: docs
weight: 1
description: >
  A "couchbase" source connects to a Couchbase database.
---

## About

A `couchbase` source establishes a connection to a Couchbase database cluster,
allowing tools to execute SQL queries against it.

## Available Tools

- [`couchbase-sql`](../tools/couchbase/couchbase-sql.md)
  Run SQL++ statements on Couchbase with parameterized input.

## Example

```yaml
sources:
  my-couchbase-instance:
    kind: couchbase
    connectionString: couchbase://localhost:8091
    bucket: travel-sample
    scope: inventory
    username: Administrator
    password: password
```

## Reference

| **field**            | **type** | **required** | **description** |
|----------------------|:--------:|:------------:|---------------------------------------------------------|
| kind                 | string   | true         | Must be "couchbase". |
| connectionString     | string   | true         | Connection string for the Couchbase cluster. |
| bucket               | string   | true         | Name of the bucket to connect to. |
| scope                | string   | true         | Name of the scope within the bucket. |
| username             | string   | false        | Username for authentication. |
| password             | string   | false        | Password for authentication. |
| clientCert           | string   | false        | Path to client certificate file for TLS authentication. |
| clientCertPassword   | string   | false        | Password for the client certificate. |
| clientKey            | string   | false        | Path to client key file for TLS authentication. |
| clientKeyPassword    | string   | false        | Password for the client key. |
| caCert               | string   | false        | Path to CA certificate file. |
| noSslVerify          | boolean  | false        | If true, skip server certificate verification. **Warning:** This option should only be used in development or testing environments. Disabling SSL verification poses significant security risks in production as it makes your connection vulnerable to man-in-the-middle attacks. |
| profile              | string   | false        | Name of the connection profile to apply. |
| queryScanConsistency | integer  | false        | Query scan consistency. Controls the consistency guarantee for index scanning. Values: 1 for "not_bounded" (fastest option, but results may not include the most recent operations), 2 for "request_plus" (highest consistency level, includes all operations up until the query started, but incurs a performance penalty). If not specified, defaults to the Couchbase Go SDK default. |
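Because `queryScanConsistency` takes bare integers, it can help to keep the two values behind the readable names the table uses. A small sketch (the names mirror the table above; the helper itself is hypothetical, not part of Toolbox or the Couchbase SDK):

```python
# Map the human-readable queryScanConsistency names from the reference
# table to the integer values the couchbase source config expects.
SCAN_CONSISTENCY = {
    "not_bounded": 1,   # fastest; may miss the most recent operations
    "request_plus": 2,  # strongest; includes all operations before the query started
}

def scan_consistency_value(name: str) -> int:
    try:
        return SCAN_CONSISTENCY[name]
    except KeyError:
        raise ValueError(f"unknown queryScanConsistency: {name!r}") from None

print(scan_consistency_value("request_plus"))  # 2
```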

@@ -9,7 +9,10 @@ description: >

## About

-[Dgraph][dgraph-docs] is an open-source graph database. It is designed for real-time workloads, horizontal scalability, and data flexibility. Implemented as a distributed system, Dgraph processes queries in parallel to deliver the fastest result.
+[Dgraph][dgraph-docs] is an open-source graph database. It is designed for
+real-time workloads, horizontal scalability, and data flexibility. Implemented
+as a distributed system, Dgraph processes queries in parallel to deliver the
+fastest result.

This source can connect to either a self-managed Dgraph cluster or one hosted on
Dgraph Cloud. If you're new to Dgraph, the fastest way to get started is to
@@ -18,6 +21,11 @@ Dgraph Cloud. If you're new to Dgraph, the fastest way to get started is to
 [dgraph-docs]: https://dgraph.io/docs
 [dgraph-login]: https://cloud.dgraph.io/login

+## Available Tools
+
+- [`dgraph-dql`](../tools/dgraph/dgraph-dql.md)
+  Run DQL (Dgraph Query Language) queries.
+
 ## Requirements

 ### Database User
@@ -52,7 +60,7 @@ instead of hardcoding your secrets into the configuration file.
| **Field**   | **Type** | **Required** | **Description** |
|-------------|:--------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind        | string   | true         | Must be "dgraph". |
| dgraphUrl   | string   | true         | Connection URI (e.g. "<https://xxx.cloud.dgraph.io>", "<https://localhost:8080>"). |
| user        | string   | false        | Name of the Dgraph user to connect as (e.g., "groot"). |
| password    | string   | false        | Password of the Dgraph user (e.g., "password"). |
| apiKey      | string   | false        | API key to connect to a Dgraph Cloud instance. |
docs/en/resources/sources/firestore.md (new file)
@@ -0,0 +1,72 @@
---
title: "Firestore"
type: docs
weight: 1
description: >
  Firestore is a NoSQL document database built for automatic scaling, high performance, and ease of application development. It's a fully managed, serverless database that supports mobile, web, and server development.
---

# Firestore Source

[Firestore][firestore-docs] is a NoSQL document database built for automatic
scaling, high performance, and ease of application development. While the
Firestore interface has many of the same features as traditional databases,
as a NoSQL database it differs from them in the way it describes relationships
between data objects.

If you are new to Firestore, you can [create a database and learn the
basics][firestore-quickstart].

[firestore-docs]: https://cloud.google.com/firestore/docs
[firestore-quickstart]: https://cloud.google.com/firestore/docs/quickstart-servers

## Requirements

### IAM Permissions

Firestore uses [Identity and Access Management (IAM)][iam-overview] to control
user and group access to Firestore resources. Toolbox will use your [Application
Default Credentials (ADC)][adc] to authorize and authenticate when interacting
with [Firestore][firestore-docs].

In addition to [setting the ADC for your server][set-adc], you need to ensure
the IAM identity has been given the correct IAM permissions for accessing
Firestore. Common roles include:

- `roles/datastore.user` - Read and write access to Firestore
- `roles/datastore.viewer` - Read-only access to Firestore
- `roles/firebaserules.admin` - Full management of Firebase Security Rules for
  Firestore. This role is required for operations that involve creating,
  updating, or managing Firestore security rules (see [Firebase Security Rules
  roles][firebaserules-roles])

See [Firestore access control][firestore-iam] for more information on
applying IAM permissions and roles to an identity.

[iam-overview]: https://cloud.google.com/firestore/docs/security/iam
[adc]: https://cloud.google.com/docs/authentication#adc
[set-adc]: https://cloud.google.com/docs/authentication/provide-credentials-adc
[firestore-iam]: https://cloud.google.com/firestore/docs/security/iam
|
||||
[firebaserules-roles]: https://cloud.google.com/iam/docs/roles-permissions/firebaserules
|
||||
|
||||
### Database Selection
|
||||
|
||||
Firestore allows you to create multiple databases within a single project. Each
|
||||
database is isolated from the others and has its own set of documents and
|
||||
collections. If you don't specify a database in your configuration, the default
|
||||
database named `(default)` will be used.
|
||||
|
||||
## Example
|
||||
|
||||
```yaml
|
||||
sources:
|
||||
my-firestore-source:
|
||||
kind: "firestore"
|
||||
project: "my-project-id"
|
||||
# database: "my-database" # Optional, defaults to "(default)"
|
||||
```
|
||||
|
||||
## Reference
|
||||
|
||||
| **field** | **type** | **required** | **description** |
|
||||
|-----------|:--------:|:------------:|----------------------------------------------------------------------------------------------------------|
|
||||
| kind | string | true | Must be "firestore". |
|
||||
| project | string | true | Id of the GCP project that contains the Firestore database (e.g. "my-project-id"). |
|
||||
| database | string | false | Name of the Firestore database to connect to. Defaults to "(default)" if not specified. |
|
||||
@@ -13,6 +13,11 @@ The HTTP Source allows Toolbox to retrieve data from arbitrary HTTP
endpoints. This enables Generative AI applications to access data from web APIs
and other HTTP-accessible resources.

## Available Tools

- [`http`](../tools/http/http.md)
  Make HTTP requests to REST APIs or other web services.

## Example

```yaml
@@ -27,6 +32,7 @@ sources:
    queryParams:
      param1: value1
      param2: value2
    # disableSslVerification: false
```

{{< notice tip >}}
@@ -36,12 +42,13 @@ instead of hardcoding your secrets into the configuration file.

## Reference

| **field**              | **type**          | **required** | **description** |
|------------------------|:-----------------:|:------------:|-----------------|
| kind                   | string            | true         | Must be "http". |
| baseUrl                | string            | true         | The base URL for the HTTP requests (e.g., `https://api.example.com`). |
| timeout                | string            | false        | The timeout for HTTP requests (e.g., "5s", "1m", refer to [ParseDuration][parse-duration-doc] for more examples). Defaults to 30s. |
| headers                | map[string]string | false        | Default headers to include in the HTTP requests. |
| queryParams            | map[string]string | false        | Default query parameters to include in the HTTP requests. |
| disableSslVerification | bool              | false        | Disable SSL certificate verification. This should only be used for local development. Defaults to `false`. |

[parse-duration-doc]: https://pkg.go.dev/time#ParseDuration
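For orientation, a complete source entry combining all of the fields above might look like the following sketch. The URL, header names, and token value are placeholders:

```yaml
sources:
  my-http-source:
    kind: http
    baseUrl: https://api.example.com
    timeout: 10s
    headers:
      Authorization: Bearer ${API_TOKEN}
      Content-Type: application/json
    queryParams:
      param1: value1
      param2: value2
    # disableSslVerification: true  # local development only
```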
64
docs/en/resources/sources/looker.md
Normal file
@@ -0,0 +1,64 @@
---
title: "Looker"
type: docs
weight: 1
description: >
  Looker is a business intelligence tool that also provides a semantic layer.
---

## About

[Looker][looker-docs] is a web based business intelligence and data management
tool that provides a semantic layer to facilitate querying. It can be deployed
in the cloud, on GCP, or on premises.

[looker-docs]: https://cloud.google.com/looker/docs

## Requirements

### Database User

This source only uses API authentication. You will need to
[create an API user][looker-user] to log in to Looker.

[looker-user]: https://cloud.google.com/looker/docs/api-auth#authentication_with_an_sdk

## Example

```yaml
sources:
  my-looker-source:
    kind: looker
    base_url: http://looker.example.com
    client_id: ${LOOKER_CLIENT_ID}
    client_secret: ${LOOKER_CLIENT_SECRET}
    verify_ssl: true
    timeout: 600s
```

The Looker base URL will look like "https://looker.example.com"; don't include a
trailing "/". In some cases, especially if your Looker instance is deployed
on-premises, you may need to add the API port number, like
"https://looker.example.com:19999".

`verify_ssl` should almost always be "true" (all lower case) unless you are
using a self-signed SSL certificate for the Looker server. Anything other than
"true" will be interpreted as false.

The client ID and client secret are seemingly random character sequences
assigned by the Looker server.

{{< notice tip >}}
Use environment variable replacement with the format ${ENV_NAME}
instead of hardcoding your secrets into the configuration file.
{{< /notice >}}

## Reference

| **field**     | **type** | **required** | **description** |
| ------------- | :------: | :----------: | ---------------------------------------------------------------------------------------- |
| kind          | string   | true         | Must be "looker". |
| base_url      | string   | true         | The URL of your Looker server, with no trailing `/`. |
| client_id     | string   | true         | The client ID assigned by Looker. |
| client_secret | string   | true         | The client secret assigned by Looker. |
| verify_ssl    | string   | true         | Whether to check the SSL certificate of the server. |
| timeout       | string   | false        | Maximum time to wait for query execution (e.g. "30s", "2m"). By default, 120s is applied. |
33
docs/en/resources/sources/mongodb.md
Normal file
@@ -0,0 +1,33 @@
---
title: "MongoDB"
type: docs
weight: 1
description: >
  MongoDB is a NoSQL data platform that can serve general-purpose data requirements and also perform vector search, where both operational data and the embeddings used for search can reside in the same document.
---

## About

[MongoDB][mongodb-docs] is a popular NoSQL database that stores data in flexible, JSON-like documents, making it easy to develop and scale applications.

[mongodb-docs]: https://www.mongodb.com/docs/atlas/getting-started/

## Example

```yaml
sources:
  my-mongodb:
    kind: mongodb
    uri: "mongodb+srv://username:password@host.mongodb.net"
    database: sample_mflix
```

## Reference

| **field** | **type** | **required** | **description** |
|-----------|:--------:|:------------:|-------------------------------------------------------------------|
| kind      | string   | true         | Must be "mongodb". |
| uri       | string   | true         | Connection string used to connect to MongoDB. |
| database  | string   | true         | Name of the MongoDB database to connect to (e.g. "sample_mflix"). |
@@ -15,6 +15,14 @@ amount of data through a structured format.

[mssql-docs]: https://www.microsoft.com/en-us/sql-server

## Available Tools

- [`mssql-sql`](../tools/mssql/mssql-sql.md)
  Execute pre-defined SQL Server queries with placeholder parameters.

- [`mssql-execute-sql`](../tools/mssql/mssql-execute-sql.md)
  Run parameterized SQL queries in SQL Server.

## Requirements

### Database User
@@ -4,7 +4,6 @@ type: docs
weight: 1
description: >
  MySQL is a relational database management system that stores and manages data.
---

## About
@@ -15,6 +14,14 @@ reliability, performance, and ease of use.

[mysql-docs]: https://www.mysql.com/

## Available Tools

- [`mysql-sql`](../tools/mysql/mysql-sql.md)
  Execute pre-defined prepared SQL queries in MySQL.

- [`mysql-execute-sql`](../tools/mysql/mysql-execute-sql.md)
  Run parameterized SQL queries in MySQL.

## Requirements

### Database User
@@ -35,6 +42,7 @@ sources:
    database: my_db
    user: ${USER_NAME}
    password: ${PASSWORD}
    queryTimeout: 30s # Optional: query timeout duration
```

{{< notice tip >}}
@@ -44,11 +52,12 @@ instead of hardcoding your secrets into the configuration file.

## Reference

| **field**    | **type** | **required** | **description** |
| ------------ | :------: | :----------: | ----------------------------------------------------------------------------------------------- |
| kind         | string   | true         | Must be "mysql". |
| host         | string   | true         | IP address to connect to (e.g. "127.0.0.1"). |
| port         | string   | true         | Port to connect to (e.g. "3306"). |
| database     | string   | true         | Name of the MySQL database to connect to (e.g. "my_db"). |
| user         | string   | true         | Name of the MySQL user to connect as (e.g. "my-mysql-user"). |
| password     | string   | true         | Password of the MySQL user (e.g. "my-password"). |
| queryTimeout | string   | false        | Maximum time to wait for query execution (e.g. "30s", "2m"). By default, no timeout is applied. |
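Since the diff above only shows part of the example, here is a full source entry assembled from the reference table. The host, database, and credential values are placeholders:

```yaml
sources:
  my-mysql-source:
    kind: mysql
    host: 127.0.0.1
    port: "3306"
    database: my_db
    user: ${USER_NAME}
    password: ${PASSWORD}
    queryTimeout: 30s # Optional: query timeout duration
```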
@@ -7,12 +7,19 @@ description:

---

## About

[Neo4j][neo4j-docs] is a powerful, open source graph database system with over
15 years of active development that has earned it a strong reputation for
reliability, feature robustness, and performance.

[neo4j-docs]: https://neo4j.com/docs

## Available Tools

- [`neo4j-cypher`](../tools/neo4j/neo4j-cypher.md)
  Run Cypher queries against your Neo4j graph database.

## Requirements

### Database User
@@ -15,6 +15,14 @@ reputation for reliability, feature robustness, and performance.

[pg-docs]: https://www.postgresql.org/

## Available Tools

- [`postgres-sql`](../tools/postgres/postgres-sql.md)
  Execute SQL queries as prepared statements in PostgreSQL.

- [`postgres-execute-sql`](../tools/postgres/postgres-execute-sql.md)
  Run parameterized SQL statements in PostgreSQL.

## Requirements

### Database User
102
docs/en/resources/sources/redis.md
Normal file
@@ -0,0 +1,102 @@
---
title: "Redis"
linkTitle: "Redis"
type: docs
weight: 1
description: >
  Redis is an open-source, in-memory data structure store.
---

## About

Redis is an open-source, in-memory data structure store, used as a database,
cache, and message broker. It supports data structures such as strings, hashes,
lists, sets, sorted sets with range queries, bitmaps, hyperloglogs, and
geospatial indexes with radius queries.

If you are new to Redis, you can find installation and getting started guides on
the [official Redis website](https://redis.io/docs/getting-started/).

## Available Tools

- [`redis`](../tools/redis/redis.md)
  Run Redis commands and interact with key-value pairs.

## Requirements

### Redis

The [AUTH string][auth] is a password for connecting to Redis. If you have the
`requirepass` directive set in your Redis configuration, incoming client
connections must authenticate in order to connect.

Specify your AUTH string in the password field:

```yaml
sources:
  my-redis-instance:
    kind: redis
    address:
      - 127.0.0.1:6379
    username: ${MY_USER_NAME}
    password: ${MY_AUTH_STRING} # Omit this field if you don't have a password.
    # database: 0
    # clusterEnabled: false
    # useGCPIAM: false
```

{{< notice tip >}}
Use environment variable replacement with the format ${ENV_NAME}
instead of hardcoding your secrets into the configuration file.
{{< /notice >}}

### Memorystore For Redis

Memorystore standalone instances support authentication using an [AUTH][auth]
string.

Here is an example tools.yaml config with [AUTH][auth] enabled:

```yaml
sources:
  my-redis-cluster-instance:
    kind: memorystore-redis
    address:
      - 127.0.0.1:6379
    password: ${MY_AUTH_STRING}
    # useGCPIAM: false
    # clusterEnabled: false
```

Memorystore Redis Cluster supports IAM authentication instead. Grant your
account the required [IAM role][iam] and make sure to set `useGCPIAM` to `true`.

Here is an example tools.yaml config for Memorystore Redis Cluster instances
using IAM authentication:

```yaml
sources:
  my-redis-cluster-instance:
    kind: memorystore-redis
    address:
      - 127.0.0.1:6379
    useGCPIAM: true
    clusterEnabled: true
```

[iam]: https://cloud.google.com/memorystore/docs/cluster/about-iam-auth
## Reference

| **field**      | **type** | **required** | **description** |
|----------------|:--------:|:------------:|---------------------------------------------------------------------------------------------------------------------------------|
| kind           | string   | true         | Must be "redis" (or "memorystore-redis" for Memorystore instances). |
| address        | []string | true         | Endpoints for the Redis instance to connect to (e.g. the primary endpoint of a Memorystore Redis instance). |
| username       | string   | false        | If you are using a non-default user, specify the user name here. If you are using Memorystore for Redis, leave this field blank. |
| password       | string   | false        | If you have [Redis AUTH][auth] enabled, specify the AUTH string here. |
| database       | int      | false        | The Redis database to connect to. Not applicable for cluster enabled instances. The default database is `0`. |
| clusterEnabled | bool     | false        | Set it to `true` if using a Redis Cluster instance. Defaults to `false`. |
| useGCPIAM      | bool     | false        | Set it to `true` if you are using GCP's IAM authentication. Defaults to `false`. |

[auth]: https://cloud.google.com/memorystore/docs/redis/about-redis-auth
@@ -23,6 +23,14 @@ the Google Cloud console][spanner-quickstart].
[spanner-quickstart]:
https://cloud.google.com/spanner/docs/create-query-database-console

## Available Tools

- [`spanner-sql`](../tools/spanner/spanner-sql.md)
  Execute SQL on Google Cloud Spanner.

- [`spanner-execute-sql`](../tools/spanner/spanner-execute-sql.md)
  Run structured and parameterized queries on Spanner.

## Requirements

### IAM Permissions
@@ -15,17 +15,24 @@ database management system. The lite in SQLite means lightweight in terms of
setup, database administration, and required resources.

SQLite has the following notable characteristics:

- Self-contained with no external dependencies
- Serverless - the SQLite library accesses its storage files directly
- Single database file that can be easily copied or moved
- Zero-configuration - no setup or administration needed
- Transactional with ACID properties

## Available Tools

- [`sqlite-sql`](../tools/sqlite/sqlite-sql.md)
  Run SQL queries against a local SQLite database.

## Requirements

### Database File

You need a SQLite database file. This can be:

- An existing database file
- A path where a new database file should be created
- `:memory:` for an in-memory database
@@ -40,6 +47,7 @@ sources:
```

For an in-memory database:

```yaml
sources:
  my-sqlite-memory-db:
@@ -51,13 +59,14 @@ sources:

### Configuration Fields

| **field** | **type** | **required** | **description** |
|-----------|:--------:|:------------:|------------------------------------------------------------------------|
| kind      | string   | true         | Must be "sqlite". |
| database  | string   | true         | Path to SQLite database file, or ":memory:" for an in-memory database. |

### Connection Properties

SQLite connections are configured with these defaults for optimal performance:

- `MaxOpenConns`: 1 (SQLite only supports one writer at a time)
- `MaxIdleConns`: 1
74
docs/en/resources/sources/valkey.md
Normal file
@@ -0,0 +1,74 @@
---
title: "Valkey"
linkTitle: "Valkey"
type: docs
weight: 1
description: >
  Valkey is an open-source, in-memory data structure store, forked from Redis.
---

## About

Valkey is an open-source, in-memory data structure store that originated as a
fork of Redis. It's designed to be used as a database, cache, and message
broker, supporting a wide range of data structures like strings, hashes, lists,
sets, sorted sets with range queries, bitmaps, hyperloglogs, and geospatial
indexes with radius queries.

If you're new to Valkey, you can find installation and getting started guides on
the [official Valkey website](https://valkey.io/topics/quickstart/).

## Available Tools

- [`valkey`](../tools/valkey/valkey.md)
  Issue Valkey (Redis-compatible) commands.

## Example

```yaml
sources:
  my-valkey-instance:
    kind: valkey
    address:
      - 127.0.0.1:6379
    username: ${YOUR_USERNAME}
    password: ${YOUR_PASSWORD}
    # database: 0
    # useGCPIAM: false
    # disableCache: false
```

{{< notice tip >}}
Use environment variable replacement with the format ${ENV_NAME}
instead of hardcoding your secrets into the configuration file.
{{< /notice >}}

### IAM Authentication

If you are using GCP's Memorystore for Valkey, you can connect using IAM
authentication. Grant your account the required [IAM role][iam] and set
`useGCPIAM` to `true`:

```yaml
sources:
  my-valkey-instance:
    kind: valkey
    address:
      - 127.0.0.1:6379
    useGCPIAM: true
```

[iam]: https://cloud.google.com/memorystore/docs/valkey/about-iam-auth
## Reference

| **field**    | **type** | **required** | **description** |
|--------------|:--------:|:------------:|----------------------------------------------------------------------------------------------------------------------------------|
| kind         | string   | true         | Must be "valkey". |
| address      | []string | true         | Endpoints for the Valkey instance to connect to. |
| username     | string   | false        | If you are using a non-default user, specify the user name here. If you are using Memorystore for Valkey, leave this field blank. |
| password     | string   | false        | Password for the Valkey instance. |
| database     | int      | false        | The Valkey database to connect to. Not applicable for cluster enabled instances. The default database is `0`. |
| useGCPIAM    | bool     | false        | Set it to `true` if you are using GCP's IAM authentication. Defaults to `false`. |
| disableCache | bool     | false        | Set it to `true` if you want to disable client-side caching. Defaults to `false`. |
@@ -11,7 +11,6 @@ A tool represents an action your agent can take, such as running a SQL
statement. You can define Tools as a map in the `tools` section of your
`tools.yaml` file. Typically, a tool will require a source to act on:

```yaml
tools:
  search_flights_by_number:
@@ -50,7 +49,6 @@ tools:
        description: 1 to 4 digit number
```

## Specifying Parameters

Parameters for each Tool will define what inputs the agent will need to provide
@@ -79,44 +77,87 @@ the parameter.
        description: Airline unique 2 letter identifier
```
| **field**   | **type**       | **required** | **description** |
|-------------|:--------------:|:------------:|-----------------------------------------------------------------------------|
| name        | string         | true         | Name of the parameter. |
| type        | string         | true         | Must be one of "string", "integer", "float", "boolean", "array". |
| description | string         | true         | Natural language description of the parameter to describe it to the agent. |
| default     | parameter type | false        | Default value of the parameter. If provided, `required` will be `false`. |
| required    | bool           | false        | Indicates if the parameter is required. Defaults to `true`. |
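As an illustration of the `default` and `required` fields described above (the parameter itself is hypothetical):

```yaml
parameters:
  - name: airline_code
    type: string
    description: Airline unique 2 letter identifier.
    default: "AA"  # Providing a default makes the parameter optional.
```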
### Array Parameters

The `array` type is a list of items passed in as a single parameter.
To use the `array` type, you must also specify what kind of items are
in the list using the `items` field:

```yaml
parameters:
  - name: preferred_airlines
    type: array
    description: A list of airlines, ordered by preference.
    items:
      name: name
      type: string
      description: Name of the airline.
statement: |
  SELECT * FROM airlines WHERE preferred_airlines = ANY($1);
```

| **field**   | **type**         | **required** | **description** |
|-------------|:----------------:|:------------:|-----------------------------------------------------------------------------|
| name        | string           | true         | Name of the parameter. |
| type        | string           | true         | Must be "array". |
| description | string           | true         | Natural language description of the parameter to describe it to the agent. |
| default     | parameter type   | false        | Default value of the parameter. If provided, `required` will be `false`. |
| required    | bool             | false        | Indicates if the parameter is required. Defaults to `true`. |
| items       | parameter object | true         | Specify a Parameter object for the type of the values in the array. |
{{< notice note >}}
Items in an array should not have a `default` or `required` value. If provided, it will be ignored.
{{< /notice >}}
### Map Parameters

The `map` type is a collection of key-value pairs. It can be configured in two ways:

- Generic Map: By default, it accepts values of any primitive type (string, integer, float, boolean), allowing for mixed data.
- Typed Map: By setting the `valueType` field, you can enforce that all values
  within the map must be of the same specified type.

#### Generic Map (Mixed Value Types)

This is the default behavior when `valueType` is omitted. It's useful for passing a flexible group of settings.

```yaml
parameters:
  - name: execution_context
    type: map
    description: A flexible set of key-value pairs for the execution environment.
```

#### Typed Map

Specify `valueType` to ensure all values in the map are of the same type. An
error will be thrown in case of a value type mismatch.

```yaml
parameters:
  - name: user_scores
    type: map
    description: A map of user IDs to their scores. All scores must be integers.
    valueType: integer # This enforces the value type for all entries.
```
### Authenticated Parameters

Authenticated parameters are automatically populated with user
information decoded from [ID
tokens](../authsources/#specifying-id-tokens-from-clients) that are passed in
request headers. They do not take input values in request bodies like other
parameters. To use authenticated parameters, you must configure the tool to map
the required [authServices](../authservices) to specific claims within the
user's ID token.

```yaml
tools:
@@ -141,6 +182,60 @@ specific claims within the user's ID token.
| name  | string | true | Name of the [authServices](../authservices) used to verify the OIDC auth token. |
| field | string | true | Claim field decoded from the OIDC token used to auto-populate this parameter. |
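To make the `name`/`field` mapping concrete, here is a sketch of a tool that auto-populates an `email` parameter from an ID-token claim. The tool, source, and auth service names are all illustrative placeholders:

```yaml
tools:
  search_notes:
    kind: postgres-sql
    source: my-pg-instance
    statement: SELECT * FROM notes WHERE user_email = $1;
    parameters:
      - name: email
        type: string
        description: Email address of the authenticated user.
        authServices:
          - name: my-google-auth  # must match an entry in authServices
            field: email          # claim from the decoded ID token
```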
### Template Parameters

Template parameter types include the `string`, `integer`, `float`, and
`boolean` types. In most cases, the description will be provided to the LLM as
context on specifying the parameter. Template parameters will be inserted into
the SQL statement before executing the prepared statement. They will be inserted
without quotes, so to insert a string using template parameters, quotes must be
explicitly added within the string.

Template parameter arrays can also be used similarly to basic parameters, and array
items must be strings. Once inserted into the SQL statement, the outer layer of
quotes will be removed. Therefore, to insert strings into the SQL statement, a
set of quotes must be explicitly added within the string.

{{< notice warning >}}
Because template parameters can directly replace identifiers, column names, and
table names, they are prone to SQL injection. Basic parameters are preferred
for performance and safety reasons.
{{< /notice >}}
```yaml
tools:
  select_columns_from_table:
    kind: postgres-sql
    source: my-pg-instance
    statement: |
      SELECT {{array .columnNames}} FROM {{.tableName}}
    description: |
      Use this tool to list all information from a specific table.
      Example:
      {{
        "tableName": "flights",
        "columnNames": ["id", "name"]
      }}
    templateParameters:
      - name: tableName
        type: string
        description: Table to select from
      - name: columnNames
        type: array
        description: The columns to select
        items:
          name: column
          type: string
          description: Name of a column to select
```

| **field**   | **type**         | **required**    | **description** |
|-------------|:----------------:|:---------------:|--------------------------------------------------------------------------------------|
| name        | string           | true            | Name of the template parameter. |
| type        | string           | true            | Must be one of "string", "integer", "float", "boolean", "array". |
| description | string           | true            | Natural language description of the template parameter to describe it to the agent. |
| items       | parameter object | true (if array) | Specify a Parameter object for the type of the values in the array (string only). |

## Authorized Invocations

You can require an authorization check for any Tool invocation request by
7
docs/en/resources/tools/alloydbainl/_index.md
Normal file
@@ -0,0 +1,7 @@
---
title: "AlloyDB AI NL"
type: docs
weight: 1
description: >
  AlloyDB AI NL Tool.
---
@@ -7,6 +7,8 @@ description: >
[AlloyDB AI](https://cloud.google.com/alloydb/ai) next-generation Natural
Language support to provide the ability to query the database directly using
natural language.
aliases:
- /resources/tools/alloydb-ai-nl
---

## About
@@ -16,22 +18,23 @@ Language][alloydb-ai-nl-overview] support to allow an Agent the ability to query
the database directly using natural language. Natural language streamlines the
development of generative AI applications by transferring the complexity of
converting natural language to SQL from the application layer to the database
layer.

This tool is compatible with the following sources:

- [alloydb-postgres](../sources/alloydb-pg.md)

AlloyDB AI Natural Language delivers secure and accurate responses for
application end user natural language questions. Natural language streamlines
the development of generative AI applications by transferring the complexity
of converting natural language to SQL from the application layer to the
database layer.

## Requirements

{{< notice tip >}} AlloyDB AI natural language is currently in gated public
preview. For more information on availability and limitations, please see the
[AlloyDB AI natural language overview](https://cloud.google.com/alloydb/docs/ai/natural-language-overview).
{{< /notice >}}

To enable AlloyDB AI natural language for your AlloyDB cluster, please follow
@@ -39,19 +42,19 @@ the steps listed in the [Generate SQL queries that answer natural language
questions][alloydb-ai-gen-nl], including enabling the extension and configuring
context for your application.

[alloydb-ai-nl-overview]: https://cloud.google.com/alloydb/docs/ai/natural-language-overview
[alloydb-ai-gen-nl]: https://cloud.google.com/alloydb/docs/ai/generate-sql-queries-natural-language

## Configuration

### Specifying an `nl_config`

An `nl_config` is a configuration that associates an application with the schema
objects, examples, and other context that can be used. A large application can
also use different configurations for different parts of the app, as long as the
correct configuration can be specified when a question is sent from that part of
the application.

Once you've followed the steps for configuring context, you can use the
`context` field when configuring an `alloydb-ai-nl` tool. When this tool is
invoked, the SQL will be generated and executed using this context.
@@ -59,9 +62,9 @@ invoked, the SQL will be generated and executed using this context.
### Specifying Parameters to PSVs

[Parameterized Secure Views (PSVs)][alloydb-psv] are a feature unique to AlloyDB
that allows you to require one or more named parameter values to be passed
to the view when querying it, somewhat like bind variables in ordinary
database queries.

You can use the `nlConfigParameters` to list the parameters required for your
`nl_config`. You **must** supply all parameters required for all PSVs in the
@@ -70,7 +73,7 @@ Parameters](../tools/#array-parameters) or Bound Parameters to provide secure
access to queries generated using natural language, as these parameters are not
visible to the LLM.

[alloydb-psv]: https://cloud.google.com/alloydb/docs/parameterized-secure-views-overview

## Example
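A configuration might look like the following (a minimal sketch assembled from the fields described above; the tool, source, config, and parameter names are illustrative, and field names may differ across Toolbox versions):

```yaml
tools:
  ask_flights_questions:
    kind: alloydb-ai-nl
    source: my-alloydb-source
    description: Ask natural language questions about flights.
    context: flights_nl_config
    nlConfigParameters:
      - name: user_email
        type: string
        description: The current end user's email address.
```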
@@ -1,68 +0,0 @@
---
title: "BigQuery-sql"
type: docs
weight: 1
description: >
  A "bigquery-sql" tool executes a pre-defined SQL statement.
---

## About

A `bigquery-sql` tool executes a pre-defined SQL statement. It's compatible with
the following sources:

- [bigquery](../sources/bigquery.md)

### GoogleSQL

BigQuery uses [GoogleSQL][bigquery-googlesql] for querying data. The integration
with Toolbox supports this dialect. The specified SQL statement is executed, and
parameters can be inserted into the query. BigQuery supports both named parameters
(e.g., `@name`) and positional parameters (`?`), but they cannot be mixed in the
same query.

[bigquery-googlesql]: https://cloud.google.com/bigquery/docs/reference/standard-sql/

## Example

```yaml
tools:
  # Example: Querying a user table in BigQuery
  search_users_bq:
    kind: bigquery-sql
    source: my-bigquery-source
    statement: |
      SELECT
        id,
        name,
        email
      FROM
        `my-project.my-dataset.users`
      WHERE
        id = @id OR email = @email;
    description: |
      Use this tool to get information for a specific user.
      Takes an id number or a name and returns info on the user.

      Example:
      {{
        "id": 123,
        "name": "Alice",
      }}
    parameters:
      - name: id
        type: integer
        description: User ID
      - name: email
        type: string
        description: Email address of the user
```

## Reference

| **field**   |                  **type**                  | **required** | **description**                                                                                  |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind        | string                                     | true         | Must be "bigquery-sql".                                                                          |
| source      | string                                     | true         | Name of the source the GoogleSQL should execute on.                                              |
| description | string                                     | true         | Description of the tool that is passed to the LLM.                                               |
| statement   | string                                     | true         | The GoogleSQL statement to execute.                                                              |
| parameters  | [parameters](_index#specifying-parameters) | false        | List of [parameters](_index#specifying-parameters) that will be inserted into the SQL statement. |
7
docs/en/resources/tools/bigquery/_index.md
Normal file
@@ -0,0 +1,7 @@
---
title: "BigQuery"
type: docs
weight: 1
description: >
  Tools that work with BigQuery Sources.
---
37
docs/en/resources/tools/bigquery/bigquery-execute-sql.md
Normal file
@@ -0,0 +1,37 @@
---
title: "bigquery-execute-sql"
type: docs
weight: 1
description: >
  A "bigquery-execute-sql" tool executes a SQL statement against BigQuery.
aliases:
- /resources/tools/bigquery-execute-sql
---

## About

A `bigquery-execute-sql` tool executes a SQL statement against BigQuery.
It's compatible with the following sources:

- [bigquery](../sources/bigquery.md)

`bigquery-execute-sql` takes one input parameter, `sql`, and runs the SQL
statement against the `source`.

## Example

```yaml
tools:
  execute_sql_tool:
    kind: bigquery-execute-sql
    source: my-bigquery-source
    description: Use this tool to execute a SQL statement.
```

## Reference

| **field**   | **type** | **required** | **description**                                    |
|-------------|:--------:|:------------:|----------------------------------------------------|
| kind        | string   | true         | Must be "bigquery-execute-sql".                    |
| source      | string   | true         | Name of the source the SQL should execute on.      |
| description | string   | true         | Description of the tool that is passed to the LLM. |
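Because the statement is supplied at call time rather than in the configuration, the tool's single `sql` argument carries the full query. A request payload for the tool above might look like the following (illustrative only; the exact invocation transport depends on how Toolbox is deployed):

```json
{
  "sql": "SELECT id, email FROM `my-project.my-dataset.users` LIMIT 10"
}
```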
@@ -0,0 +1,39 @@
---
title: "bigquery-get-dataset-info"
type: docs
weight: 1
description: >
  A "bigquery-get-dataset-info" tool retrieves metadata for a BigQuery dataset.
aliases:
- /resources/tools/bigquery-get-dataset-info
---

## About

A `bigquery-get-dataset-info` tool retrieves metadata for a BigQuery dataset.
It's compatible with the following sources:

- [bigquery](../sources/bigquery.md)

`bigquery-get-dataset-info` takes a `dataset` parameter to specify the dataset
on the given source. It also optionally accepts a `project` parameter to
define the Google Cloud project ID. If the `project` parameter is not provided,
the tool defaults to using the project defined in the source configuration.

## Example

```yaml
tools:
  bigquery_get_dataset_info:
    kind: bigquery-get-dataset-info
    source: my-bigquery-source
    description: Use this tool to get dataset metadata.
```

## Reference

| **field**   | **type** | **required** | **description**                                    |
|-------------|:--------:|:------------:|----------------------------------------------------|
| kind        | string   | true         | Must be "bigquery-get-dataset-info".               |
| source      | string   | true         | Name of the source the SQL should execute on.      |
| description | string   | true         | Description of the tool that is passed to the LLM. |
39
docs/en/resources/tools/bigquery/bigquery-get-table-info.md
Normal file
@@ -0,0 +1,39 @@
---
title: "bigquery-get-table-info"
type: docs
weight: 1
description: >
  A "bigquery-get-table-info" tool retrieves metadata for a BigQuery table.
aliases:
- /resources/tools/bigquery-get-table-info
---

## About

A `bigquery-get-table-info` tool retrieves metadata for a BigQuery table.
It's compatible with the following sources:

- [bigquery](../sources/bigquery.md)

`bigquery-get-table-info` takes `dataset` and `table` parameters to specify
the target table. It also optionally accepts a `project` parameter to define
the Google Cloud project ID. If the `project` parameter is not provided, the
tool defaults to using the project defined in the source configuration.

## Example

```yaml
tools:
  bigquery_get_table_info:
    kind: bigquery-get-table-info
    source: my-bigquery-source
    description: Use this tool to get table metadata.
```

## Reference

| **field**   | **type** | **required** | **description**                                    |
|-------------|:--------:|:------------:|----------------------------------------------------|
| kind        | string   | true         | Must be "bigquery-get-table-info".                 |
| source      | string   | true         | Name of the source the SQL should execute on.      |
| description | string   | true         | Description of the tool that is passed to the LLM. |
@@ -0,0 +1,38 @@
---
title: "bigquery-list-dataset-ids"
type: docs
weight: 1
description: >
  A "bigquery-list-dataset-ids" tool returns all dataset IDs from the source.
aliases:
- /resources/tools/bigquery-list-dataset-ids
---

## About

A `bigquery-list-dataset-ids` tool returns all dataset IDs from the source.
It's compatible with the following sources:

- [bigquery](../sources/bigquery.md)

`bigquery-list-dataset-ids` optionally accepts a `project` parameter to define
the Google Cloud project ID. If the `project` parameter is not provided, the
tool defaults to using the project defined in the source configuration.

## Example

```yaml
tools:
  bigquery_list_dataset_ids:
    kind: bigquery-list-dataset-ids
    source: my-bigquery-source
    description: Use this tool to list dataset IDs.
```

## Reference

| **field**   | **type** | **required** | **description**                                    |
|-------------|:--------:|:------------:|----------------------------------------------------|
| kind        | string   | true         | Must be "bigquery-list-dataset-ids".               |
| source      | string   | true         | Name of the source the SQL should execute on.      |
| description | string   | true         | Description of the tool that is passed to the LLM. |
39
docs/en/resources/tools/bigquery/bigquery-list-table-ids.md
Normal file
@@ -0,0 +1,39 @@
---
title: "bigquery-list-table-ids"
type: docs
weight: 1
description: >
  A "bigquery-list-table-ids" tool returns table IDs in a given BigQuery dataset.
aliases:
- /resources/tools/bigquery-list-table-ids
---

## About

A `bigquery-list-table-ids` tool returns table IDs in a given BigQuery dataset.
It's compatible with the following sources:

- [bigquery](../sources/bigquery.md)

`bigquery-list-table-ids` takes a required `dataset` parameter to specify the dataset
from which to list table IDs. It also optionally accepts a `project` parameter to
define the Google Cloud project ID. If the `project` parameter is not provided, the
tool defaults to using the project defined in the source configuration.

## Example

```yaml
tools:
  bigquery_list_table_ids:
    kind: bigquery-list-table-ids
    source: my-bigquery-source
    description: Use this tool to list table IDs in a dataset.
```

## Reference

| **field**   | **type** | **required** | **description**                                    |
|-------------|:--------:|:------------:|----------------------------------------------------|
| kind        | string   | true         | Must be "bigquery-list-table-ids".                 |
| source      | string   | true         | Name of the source the SQL should execute on.      |
| description | string   | true         | Description of the tool that is passed to the LLM. |
105
docs/en/resources/tools/bigquery/bigquery-sql.md
Normal file
@@ -0,0 +1,105 @@
---
title: "bigquery-sql"
type: docs
weight: 1
description: >
  A "bigquery-sql" tool executes a pre-defined SQL statement.
aliases:
- /resources/tools/bigquery-sql
---

## About

A `bigquery-sql` tool executes a pre-defined SQL statement. It's compatible with
the following sources:

- [bigquery](../sources/bigquery.md)

### GoogleSQL

BigQuery uses [GoogleSQL][bigquery-googlesql] for querying data. The integration
with Toolbox supports this dialect. The specified SQL statement is executed, and
parameters can be inserted into the query. BigQuery supports both named parameters
(e.g., `@name`) and positional parameters (`?`), but they cannot be mixed in the
same query.

[bigquery-googlesql]: https://cloud.google.com/bigquery/docs/reference/standard-sql/

## Example

> **Note:** This tool uses [parameterized
> queries](https://cloud.google.com/bigquery/docs/parameterized-queries) to
> prevent SQL injections. Query parameters can be used as substitutes for
> arbitrary expressions. Parameters cannot be used as substitutes for
> identifiers, column names, table names, or other parts of the query.

```yaml
tools:
  # Example: Querying a user table in BigQuery
  search_users_bq:
    kind: bigquery-sql
    source: my-bigquery-source
    statement: |
      SELECT
        id,
        name,
        email
      FROM
        `my-project.my-dataset.users`
      WHERE
        id = @id OR email = @email;
    description: |
      Use this tool to get information for a specific user.
      Takes an id number or an email and returns info on the user.

      Example:
      {{
        "id": 123,
        "email": "alice@example.com"
      }}
    parameters:
      - name: id
        type: integer
        description: User ID
      - name: email
        type: string
        description: Email address of the user
```

### Example with Template Parameters

> **Note:** This tool allows direct modifications to the SQL statement,
> including identifiers, column names, and table names. **This makes it more
> vulnerable to SQL injections**. Using basic parameters only (see above) is
> recommended for performance and safety reasons. For more details, please check
> [templateParameters](_index#template-parameters).

```yaml
tools:
  list_table:
    kind: bigquery-sql
    source: my-bigquery-source
    statement: |
      SELECT * FROM {{.tableName}};
    description: |
      Use this tool to list all information from a specific table.
      Example:
      {{
        "tableName": "flights"
      }}
    templateParameters:
      - name: tableName
        type: string
        description: Table to select from
```

## Reference

| **field**          |                     **type**                     | **required** | **description**                                                                                                                            |
|--------------------|:------------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------------------------------------------------|
| kind               | string                                           | true         | Must be "bigquery-sql".                                                                                                                    |
| source             | string                                           | true         | Name of the source the GoogleSQL should execute on.                                                                                        |
| description        | string                                           | true         | Description of the tool that is passed to the LLM.                                                                                         |
| statement          | string                                           | true         | The GoogleSQL statement to execute.                                                                                                        |
| parameters         | [parameters](_index#specifying-parameters)       | false        | List of [parameters](_index#specifying-parameters) that will be inserted into the SQL statement.                                           |
| templateParameters | [templateParameters](_index#template-parameters) | false        | List of [templateParameters](_index#template-parameters) that will be inserted into the SQL statement before executing prepared statement. |
@@ -1,82 +0,0 @@
---
title: "bigtable-sql"
type: docs
weight: 1
description: >
  A "bigtable-sql" tool executes a pre-defined SQL statement against a Google
  Cloud Bigtable instance.
---

## About

A `bigtable-sql` tool executes a pre-defined SQL statement against a Bigtable
instance. It's compatible with any of the following sources:

- [bigtable](../sources/bigtable.md)

### GoogleSQL

Bigtable supports SQL queries. The integration with Toolbox supports `googlesql`
dialect, the specified SQL statement is executed as a [data manipulation
language (DML)][bigtable-googlesql] statements, and specified parameters will
inserted according to their name: e.g. `@name`.

[bigtable-googlesql]: https://cloud.google.com/bigtable/docs/googlesql-overview

## Example

```yaml
tools:
  search_user_by_id_or_name:
    kind: bigtable-sql
    source: my-bigtable-instance
    statement: |
      SELECT
        TO_INT64(cf[ 'id' ]) as id,
        CAST(cf[ 'name' ] AS string) as name,
      FROM
        % s
      WHERE
        TO_INT64(cf[ 'id' ]) = @id
        OR CAST(cf[ 'name' ] AS string) = @name;
    description: |
      Use this tool to get information for a specific user.
      Takes an id number or a name and returns info on the user.

      Example:
      {{
        "id": 123,
        "name": "Alice",
      }}
    parameters:
      - name: id
        type: integer
        description: User ID
      - name: name
        type: string
        description: Name of the user
```

## Reference

| **field**   |                  **type**                  | **required** | **description**                                                                                  |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind        | string                                     | true         | Must be "bigtable-sql".                                                                          |
| source      | string                                     | true         | Name of the source the SQL should execute on.                                                    |
| description | string                                     | true         | Description of the tool that is passed to the LLM.                                               |
| statement   | string                                     | true         | SQL statement to execute on.                                                                     |
| parameters  | [parameters](_index#specifying-parameters) | false        | List of [parameters](_index#specifying-parameters) that will be inserted into the SQL statement. |

## Tips

- [Bigtable Studio][bigtable-studio] is a useful to explore and manage your
  Bigtable data. If you're unfamiliar with the query syntax, [Query
  Builder][bigtable-querybuilder] lets you build a query, run it against a
  table, and then view the results in the console.
- Some Python libraries limit the use of underscore columns such as `_key`. A
  workaround would be to leverage Bigtable [Logical
  Views][bigtable-logical-view] to rename the columns.

[bigtable-studio]: https://cloud.google.com/bigtable/docs/manage-data-using-console
[bigtable-logical-view]: https://cloud.google.com/bigtable/docs/create-manage-logical-views
[bigtable-querybuilder]: https://cloud.google.com/bigtable/docs/query-builder
7
docs/en/resources/tools/bigtable/_index.md
Normal file
@@ -0,0 +1,7 @@
---
title: "Bigtable"
type: docs
weight: 1
description: >
  Tools that work with Bigtable Sources.
---
120
docs/en/resources/tools/bigtable/bigtable-sql.md
Normal file
@@ -0,0 +1,120 @@
---
title: "bigtable-sql"
type: docs
weight: 1
description: >
  A "bigtable-sql" tool executes a pre-defined SQL statement against a Google
  Cloud Bigtable instance.
aliases:
- /resources/tools/bigtable-sql
---

## About

A `bigtable-sql` tool executes a pre-defined SQL statement against a Bigtable
instance. It's compatible with any of the following sources:

- [bigtable](../sources/bigtable.md)

### GoogleSQL

Bigtable supports SQL queries. The integration with Toolbox supports the
`googlesql` dialect: the specified SQL statement is executed as a [data
manipulation language (DML)][bigtable-googlesql] statement, and the specified
parameters are inserted according to their name, e.g. `@name`.

{{<notice note>}}
Bigtable's GoogleSQL support for DML statements might be limited to certain query types. For detailed information on supported DML statements and use cases, refer to the [Bigtable GoogleSQL use cases](https://cloud.google.com/bigtable/docs/googlesql-overview#use-cases).
{{</notice>}}

[bigtable-googlesql]: https://cloud.google.com/bigtable/docs/googlesql-overview

## Example

> **Note:** This tool uses parameterized queries to prevent SQL injections.
> Query parameters can be used as substitutes for arbitrary expressions.
> Parameters cannot be used as substitutes for identifiers, column names, table
> names, or other parts of the query.

```yaml
tools:
  search_user_by_id_or_name:
    kind: bigtable-sql
    source: my-bigtable-instance
    statement: |
      SELECT
        TO_INT64(cf[ 'id' ]) as id,
        CAST(cf[ 'name' ] AS string) as name,
      FROM
        mytable
      WHERE
        TO_INT64(cf[ 'id' ]) = @id
        OR CAST(cf[ 'name' ] AS string) = @name;
    description: |
      Use this tool to get information for a specific user.
      Takes an id number or a name and returns info on the user.

      Example:
      {{
        "id": 123,
        "name": "Alice"
      }}
    parameters:
      - name: id
        type: integer
        description: User ID
      - name: name
        type: string
        description: Name of the user
```

### Example with Template Parameters

> **Note:** This tool allows direct modifications to the SQL statement,
> including identifiers, column names, and table names. **This makes it more
> vulnerable to SQL injections**. Using basic parameters only (see above) is
> recommended for performance and safety reasons. For more details, please check
> [templateParameters](_index#template-parameters).

```yaml
tools:
  list_table:
    kind: bigtable-sql
    source: my-bigtable-instance
    statement: |
      SELECT * FROM {{.tableName}};
    description: |
      Use this tool to list all information from a specific table.
      Example:
      {{
        "tableName": "flights"
      }}
    templateParameters:
      - name: tableName
        type: string
        description: Table to select from
```

## Reference

| **field**          |                     **type**                     | **required** | **description**                                                                                                                            |
|--------------------|:------------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------------------------------------------------|
| kind               | string                                           | true         | Must be "bigtable-sql".                                                                                                                    |
| source             | string                                           | true         | Name of the source the SQL should execute on.                                                                                              |
| description        | string                                           | true         | Description of the tool that is passed to the LLM.                                                                                         |
| statement          | string                                           | true         | SQL statement to execute on.                                                                                                               |
| parameters         | [parameters](_index#specifying-parameters)       | false        | List of [parameters](_index#specifying-parameters) that will be inserted into the SQL statement.                                           |
| templateParameters | [templateParameters](_index#template-parameters) | false        | List of [templateParameters](_index#template-parameters) that will be inserted into the SQL statement before executing prepared statement. |

## Tips

- [Bigtable Studio][bigtable-studio] is a useful tool for exploring and managing
  your Bigtable data. If you're unfamiliar with the query syntax, [Query
  Builder][bigtable-querybuilder] lets you build a query, run it against a
  table, and then view the results in the console.
- Some Python libraries limit the use of underscore columns such as `_key`. A
  workaround would be to leverage Bigtable [Logical
  Views][bigtable-logical-view] to rename the columns.

[bigtable-studio]: https://cloud.google.com/bigtable/docs/manage-data-using-console
[bigtable-logical-view]: https://cloud.google.com/bigtable/docs/create-manage-logical-views
[bigtable-querybuilder]: https://cloud.google.com/bigtable/docs/query-builder