Mirror of https://github.com/googleapis/genai-toolbox.git (synced 2026-01-13 17:38:10 -05:00)

Compare commits: spanner-cr...guide (13 commits)

Commits in this comparison:
92a22de07c, 9a515a8792, e255808714, d69792d843, 647b04d3a7, 030df9766f, 5dbf207162, 9c3720e31d, 3cd3c39d66, 0691a6f715, 467b96a23b, 4abf0c39e7, dd7b9de623
@@ -59,6 +59,13 @@ You can manually trigger the bot by commenting on your Pull Request:

* `/gemini summary`: Posts a summary of the changes in the pull request.
* `/gemini help`: Overview of the available commands.

## Guidelines for Pull Requests

1. Please keep your PR small for a more thorough review and easier updates. In case of a regression, it also allows us to roll back a single feature instead of multiple ones.
1. For non-trivial changes, consider opening an issue and discussing it with the code owners first.
1. Provide a good PR description as a record of what change is being made and why it was made. Link to a GitHub issue if one exists.
1. Make sure your code is thoroughly tested with unit tests and integration tests. Remember to clean up the test instances properly in your code to avoid memory leaks.

## Adding a New Database Source or Tool

Please create an
@@ -110,6 +117,8 @@ implementation](https://github.com/googleapis/genai-toolbox/blob/main/internal/s

We recommend looking at an [example tool
implementation](https://github.com/googleapis/genai-toolbox/tree/main/internal/tools/postgres/postgressql).

Remember to keep your PRs small. For example, if you are contributing a new Source, only include one or two core Tools within the same PR; the rest of the Tools can come in subsequent PRs.

* **Create a new directory** under `internal/tools` for your tool type (e.g., `internal/tools/newdb/newdbtool`).
* **Define a configuration struct** for your tool in a file named `newdbtool.go`.
  Create a `Config` struct and a `Tool` struct to store necessary parameters for
@@ -163,6 +172,8 @@ tools.

  parameters][temp-param-doc]. Only run this test if template
  parameters apply to your tool.

* **Add additional tests** for the tools that are not covered by the predefined tests. Every tool must be tested!

* **Add the new database to the integration test workflow** in
  [integration.cloudbuild.yaml](.ci/integration.cloudbuild.yaml).
@@ -179,6 +190,7 @@ tools.

[temp-param-doc]:
https://googleapis.github.io/genai-toolbox/resources/tools/#template-parameters

### Adding Documentation

* **Update the documentation** to include information about your new data source
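The steps above call for a `Config` struct and a `Tool` struct when adding a new tool. As a rough, hypothetical sketch of that shape (the package name, fields, and tags below are illustrative assumptions, not the repository's actual definitions for any existing tool):

```go
package newdbtool

// Config mirrors the YAML block a user writes under `tools:` for this kind.
// Field names here are assumptions for illustration only.
type Config struct {
    Name        string `yaml:"name"`
    Kind        string `yaml:"kind"`
    Source      string `yaml:"source"`
    Description string `yaml:"description"`
    Statement   string `yaml:"statement"`
}

// Tool holds the resolved, ready-to-invoke state built from Config,
// e.g. the statement to run plus a handle to the initialized source.
type Tool struct {
    Name      string
    Kind      string
    Statement string
}
```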
DEVELOPER.md (17 lines changed)
@@ -379,6 +379,23 @@ to approve PRs for main. TeamSync is used to create this team from the MDB

Group `toolbox-contributors`. Googlers who are developing for MCP-Toolbox
but aren't part of the core team should join this group.

### Issue/PR Triage and SLO

After an issue is created, maintainers will assign the following labels:

* `Priority` (defaults to P0)
* `Type` (if applicable)
* `Product` (if applicable)

All incoming issues and PRs are subject to the following SLO:

| Type            | Priority | Objective                                                                  |
|-----------------|----------|----------------------------------------------------------------------------|
| Feature Request | P0       | Must respond within **5 days**                                             |
| Process         | P0       | Must respond within **5 days**                                             |
| Bugs            | P0       | Must respond within **5 days**, and resolution/closure within **14 days**  |
| Bugs            | P1       | Must respond within **7 days**, and resolution/closure within **90 days**  |
| Bugs            | P2       | Must respond within **30 days**                                            |

_Types that are not listed in the table do not adhere to any SLO._

### Releasing

Toolbox has two types of releases: versioned and continuous. It uses Google
@@ -272,7 +272,7 @@ To run Toolbox from binary:

To run the server after pulling the [container image](#installing-the-server):

```sh
-export VERSION=0.11.0 # Use the version you pulled
+export VERSION=0.24.0 # Use the version you pulled
docker run -p 5000:5000 \
  -v $(pwd)/tools.yaml:/app/tools.yaml \
  us-central1-docker.pkg.dev/database-toolbox/toolbox/toolbox:$VERSION \
@@ -134,6 +134,7 @@ sources:

    # scopes: # Optional: List of OAuth scopes to request.
    #  - "https://www.googleapis.com/auth/bigquery"
    #  - "https://www.googleapis.com/auth/drive.readonly"
    # maxQueryResultRows: 50 # Optional: Limits the number of rows returned by queries. Defaults to 50.
```

Initialize a BigQuery source that uses the client's access token:
@@ -153,6 +154,7 @@ sources:

    # scopes: # Optional: List of OAuth scopes to request.
    #  - "https://www.googleapis.com/auth/bigquery"
    #  - "https://www.googleapis.com/auth/drive.readonly"
    # maxQueryResultRows: 50 # Optional: Limits the number of rows returned by queries. Defaults to 50.
```

## Reference
@@ -167,3 +169,4 @@ sources:

| useClientOAuth | bool | false | If true, forwards the client's OAuth access token from the "Authorization" header to downstream queries. **Note:** This cannot be used with `writeMode: protected`. |
| scopes | []string | false | A list of OAuth 2.0 scopes to use for the credentials. If not provided, default scopes are used. |
| impersonateServiceAccount | string | false | Service account email to impersonate when making BigQuery and Dataplex API calls. The authenticated principal must have the `roles/iam.serviceAccountTokenCreator` role on the target service account. [Learn More](https://cloud.google.com/iam/docs/service-account-impersonation) |
| maxQueryResultRows | int | false | The maximum number of rows to return from a query. Defaults to 50. |
@@ -91,8 +91,8 @@ visible to the LLM.

https://cloud.google.com/alloydb/docs/parameterized-secure-views-overview

{{< notice tip >}} Make sure to enable the `parameterized_views` extension
-before running this tool. You can do so by running this command in the AlloyDB
-studio:
+to utilize the PSV feature (`nlConfigParameters`) with this tool. You can do so by
+running this command in the AlloyDB studio:

```sql
CREATE EXTENSION IF NOT EXISTS parameterized_views;
@@ -19,6 +19,7 @@ sources:

    location: ${BIGQUERY_LOCATION:}
    useClientOAuth: ${BIGQUERY_USE_CLIENT_OAUTH:false}
    scopes: ${BIGQUERY_SCOPES:}
    maxQueryResultRows: ${BIGQUERY_MAX_QUERY_RESULT_ROWS:50}

tools:
  analyze_contribution:
@@ -89,6 +89,7 @@ type Config struct {

    UseClientOAuth            bool                `yaml:"useClientOAuth"`
    ImpersonateServiceAccount string              `yaml:"impersonateServiceAccount"`
    Scopes                    StringOrStringSlice `yaml:"scopes"`
    MaxQueryResultRows        int                 `yaml:"maxQueryResultRows"`
}

// StringOrStringSlice is a custom type that can unmarshal both a single string
@@ -127,6 +128,10 @@ func (r Config) Initialize(ctx context.Context, tracer trace.Tracer) (sources.So

        r.WriteMode = WriteModeAllowed
    }

    if r.MaxQueryResultRows == 0 {
        r.MaxQueryResultRows = 50
    }

    if r.WriteMode == WriteModeProtected && r.UseClientOAuth {
        // The protected mode only allows write operations to the session's temporary datasets.
        // When using client OAuth, a new session is created every
@@ -150,7 +155,7 @@ func (r Config) Initialize(ctx context.Context, tracer trace.Tracer) (sources.So

        Client:             client,
        RestService:        restService,
        TokenSource:        tokenSource,
-       MaxQueryResultRows: 50,
+       MaxQueryResultRows: r.MaxQueryResultRows,
        ClientCreator:      clientCreator,
    }
@@ -567,7 +572,7 @@ func (s *Source) RunSQL(ctx context.Context, bqClient *bigqueryapi.Client, state

    }

    var out []any
-   for {
+   for s.MaxQueryResultRows <= 0 || len(out) < s.MaxQueryResultRows {
        var val []bigqueryapi.Value
        err = it.Next(&val)
        if err == iterator.Done {
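The new loop condition above is what caps how many rows RunSQL accumulates. As a standalone sketch of the same pattern (the iterator below is a stand-in, not the BigQuery client):

```go
package main

import "fmt"

// nextRow stands in for it.Next on a real result iterator; done reports
// when the result set is exhausted.
func nextRow(i int) (row int, done bool) {
    const totalRows = 200
    return i, i >= totalRows
}

func main() {
    maxQueryResultRows := 50 // a value <= 0 would mean "no cap", matching the condition above
    var out []int
    for i := 0; maxQueryResultRows <= 0 || len(out) < maxQueryResultRows; i++ {
        row, done := nextRow(i)
        if done {
            break
        }
        out = append(out, row)
    }
    fmt.Println("rows returned:", len(out)) // rows returned: 50
}
```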
@@ -21,9 +21,12 @@ import (

    yaml "github.com/goccy/go-yaml"
    "github.com/google/go-cmp/cmp"
    "go.opentelemetry.io/otel/trace/noop"

    "github.com/googleapis/genai-toolbox/internal/server"
    "github.com/googleapis/genai-toolbox/internal/sources/bigquery"
    "github.com/googleapis/genai-toolbox/internal/testutils"
    "github.com/googleapis/genai-toolbox/internal/util"
)

func TestParseFromYamlBigQuery(t *testing.T) {
@@ -154,6 +157,26 @@ func TestParseFromYamlBigQuery(t *testing.T) {

                },
            },
        },
        {
            desc: "with max query result rows example",
            in: `
            sources:
                my-instance:
                    kind: bigquery
                    project: my-project
                    location: us
                    maxQueryResultRows: 10
            `,
            want: server.SourceConfigs{
                "my-instance": bigquery.Config{
                    Name:               "my-instance",
                    Kind:               bigquery.SourceKind,
                    Project:            "my-project",
                    Location:           "us",
                    MaxQueryResultRows: 10,
                },
            },
        },
    }
    for _, tc := range tcs {
        t.Run(tc.desc, func(t *testing.T) {
@@ -220,6 +243,59 @@ func TestFailParseFromYaml(t *testing.T) {

    }
}

func TestInitialize_MaxQueryResultRows(t *testing.T) {
    ctx, err := testutils.ContextWithNewLogger()
    if err != nil {
        t.Fatalf("unexpected error: %s", err)
    }
    ctx = util.WithUserAgent(ctx, "test-agent")
    tracer := noop.NewTracerProvider().Tracer("")

    tcs := []struct {
        desc string
        cfg  bigquery.Config
        want int
    }{
        {
            desc: "default value",
            cfg: bigquery.Config{
                Name:           "test-default",
                Kind:           bigquery.SourceKind,
                Project:        "test-project",
                UseClientOAuth: true,
            },
            want: 50,
        },
        {
            desc: "configured value",
            cfg: bigquery.Config{
                Name:               "test-configured",
                Kind:               bigquery.SourceKind,
                Project:            "test-project",
                UseClientOAuth:     true,
                MaxQueryResultRows: 100,
            },
            want: 100,
        },
    }

    for _, tc := range tcs {
        t.Run(tc.desc, func(t *testing.T) {
            src, err := tc.cfg.Initialize(ctx, tracer)
            if err != nil {
                t.Fatalf("Initialize failed: %v", err)
            }
            bqSrc, ok := src.(*bigquery.Source)
            if !ok {
                t.Fatalf("Expected *bigquery.Source, got %T", src)
            }
            if bqSrc.MaxQueryResultRows != tc.want {
                t.Errorf("MaxQueryResultRows = %d, want %d", bqSrc.MaxQueryResultRows, tc.want)
            }
        })
    }
}

func TestNormalizeValue(t *testing.T) {
    tests := []struct {
        name string
@@ -16,8 +16,12 @@ package cloudhealthcare

import (
    "context"
    "encoding/base64"
    "encoding/json"
    "fmt"
    "io"
    "net/http"
    "strings"

    "github.com/goccy/go-yaml"
    "github.com/googleapis/genai-toolbox/internal/sources"
@@ -255,3 +259,299 @@ func (s *Source) IsDICOMStoreAllowed(storeID string) bool {

func (s *Source) UseClientAuthorization() bool {
    return s.UseClientOAuth
}

func parseResults(resp *http.Response) (any, error) {
    respBytes, err := io.ReadAll(resp.Body)
    if err != nil {
        return nil, fmt.Errorf("could not read response: %w", err)
    }
    if resp.StatusCode > 299 {
        return nil, fmt.Errorf("status %d %s: %s", resp.StatusCode, resp.Status, respBytes)
    }
    var jsonMap map[string]interface{}
    if err := json.Unmarshal(respBytes, &jsonMap); err != nil {
        return nil, fmt.Errorf("could not unmarshal response as json: %w", err)
    }
    return jsonMap, nil
}

func (s *Source) getService(tokenStr string) (*healthcare.Service, error) {
    svc := s.Service()
    var err error
    // Initialize new service if using user OAuth token
    if s.UseClientAuthorization() {
        svc, err = s.ServiceCreator()(tokenStr)
        if err != nil {
            return nil, fmt.Errorf("error creating service from OAuth access token: %w", err)
        }
    }
    return svc, nil
}

func (s *Source) FHIRFetchPage(ctx context.Context, url, tokenStr string) (any, error) {
    var httpClient *http.Client
    if s.UseClientAuthorization() {
        ts := oauth2.StaticTokenSource(&oauth2.Token{AccessToken: tokenStr})
        httpClient = oauth2.NewClient(ctx, ts)
    } else {
        // The source.Service() object holds a client with the default credentials.
        // However, the client is not exported, so we have to create a new one.
        var err error
        httpClient, err = google.DefaultClient(ctx, healthcare.CloudHealthcareScope)
        if err != nil {
            return nil, fmt.Errorf("failed to create default http client: %w", err)
        }
    }

    req, err := http.NewRequestWithContext(ctx, "GET", url, nil)
    if err != nil {
        return nil, fmt.Errorf("failed to create http request: %w", err)
    }
    req.Header.Set("Accept", "application/fhir+json;charset=utf-8")

    resp, err := httpClient.Do(req)
    if err != nil {
        return nil, fmt.Errorf("failed to get fhir page from %q: %w", url, err)
    }
    defer resp.Body.Close()
    return parseResults(resp)
}

func (s *Source) FHIRPatientEverything(storeID, patientID, tokenStr string, opts []googleapi.CallOption) (any, error) {
    svc, err := s.getService(tokenStr)
    if err != nil {
        return nil, err
    }

    name := fmt.Sprintf("projects/%s/locations/%s/datasets/%s/fhirStores/%s/fhir/Patient/%s", s.Project(), s.Region(), s.DatasetID(), storeID, patientID)
    resp, err := svc.Projects.Locations.Datasets.FhirStores.Fhir.PatientEverything(name).Do(opts...)
    if err != nil {
        return nil, fmt.Errorf("failed to call patient everything for %q: %w", name, err)
    }
    defer resp.Body.Close()
    return parseResults(resp)
}

func (s *Source) FHIRPatientSearch(storeID, tokenStr string, opts []googleapi.CallOption) (any, error) {
    svc, err := s.getService(tokenStr)
    if err != nil {
        return nil, err
    }

    name := fmt.Sprintf("projects/%s/locations/%s/datasets/%s/fhirStores/%s", s.Project(), s.Region(), s.DatasetID(), storeID)
    resp, err := svc.Projects.Locations.Datasets.FhirStores.Fhir.SearchType(name, "Patient", &healthcare.SearchResourcesRequest{ResourceType: "Patient"}).Do(opts...)
    if err != nil {
        return nil, fmt.Errorf("failed to search patient resources: %w", err)
    }
    defer resp.Body.Close()
    return parseResults(resp)
}

func (s *Source) GetDataset(tokenStr string) (*healthcare.Dataset, error) {
    svc, err := s.getService(tokenStr)
    if err != nil {
        return nil, err
    }

    datasetName := fmt.Sprintf("projects/%s/locations/%s/datasets/%s", s.Project(), s.Region(), s.DatasetID())
    dataset, err := svc.Projects.Locations.Datasets.Get(datasetName).Do()
    if err != nil {
        return nil, fmt.Errorf("failed to get dataset %q: %w", datasetName, err)
    }
    return dataset, nil
}

func (s *Source) GetFHIRResource(storeID, resType, resID, tokenStr string) (any, error) {
    svc, err := s.getService(tokenStr)
    if err != nil {
        return nil, err
    }

    name := fmt.Sprintf("projects/%s/locations/%s/datasets/%s/fhirStores/%s/fhir/%s/%s", s.Project(), s.Region(), s.DatasetID(), storeID, resType, resID)
    call := svc.Projects.Locations.Datasets.FhirStores.Fhir.Read(name)
    call.Header().Set("Content-Type", "application/fhir+json;charset=utf-8")
    resp, err := call.Do()
    if err != nil {
        return nil, fmt.Errorf("failed to get fhir resource %q: %w", name, err)
    }
    defer resp.Body.Close()
    return parseResults(resp)
}

func (s *Source) GetDICOMStore(storeID, tokenStr string) (*healthcare.DicomStore, error) {
    svc, err := s.getService(tokenStr)
    if err != nil {
        return nil, err
    }

    storeName := fmt.Sprintf("projects/%s/locations/%s/datasets/%s/dicomStores/%s", s.Project(), s.Region(), s.DatasetID(), storeID)
    store, err := svc.Projects.Locations.Datasets.DicomStores.Get(storeName).Do()
    if err != nil {
        return nil, fmt.Errorf("failed to get DICOM store %q: %w", storeName, err)
    }
    return store, nil
}

func (s *Source) GetFHIRStore(storeID, tokenStr string) (*healthcare.FhirStore, error) {
    svc, err := s.getService(tokenStr)
    if err != nil {
        return nil, err
    }

    storeName := fmt.Sprintf("projects/%s/locations/%s/datasets/%s/fhirStores/%s", s.Project(), s.Region(), s.DatasetID(), storeID)
    store, err := svc.Projects.Locations.Datasets.FhirStores.Get(storeName).Do()
    if err != nil {
        return nil, fmt.Errorf("failed to get FHIR store %q: %w", storeName, err)
    }
    return store, nil
}

func (s *Source) GetDICOMStoreMetrics(storeID, tokenStr string) (*healthcare.DicomStoreMetrics, error) {
    svc, err := s.getService(tokenStr)
    if err != nil {
        return nil, err
    }

    storeName := fmt.Sprintf("projects/%s/locations/%s/datasets/%s/dicomStores/%s", s.Project(), s.Region(), s.DatasetID(), storeID)
    store, err := svc.Projects.Locations.Datasets.DicomStores.GetDICOMStoreMetrics(storeName).Do()
    if err != nil {
        return nil, fmt.Errorf("failed to get metrics for DICOM store %q: %w", storeName, err)
    }
    return store, nil
}

func (s *Source) GetFHIRStoreMetrics(storeID, tokenStr string) (*healthcare.FhirStoreMetrics, error) {
    svc, err := s.getService(tokenStr)
    if err != nil {
        return nil, err
    }

    storeName := fmt.Sprintf("projects/%s/locations/%s/datasets/%s/fhirStores/%s", s.Project(), s.Region(), s.DatasetID(), storeID)
    store, err := svc.Projects.Locations.Datasets.FhirStores.GetFHIRStoreMetrics(storeName).Do()
    if err != nil {
        return nil, fmt.Errorf("failed to get metrics for FHIR store %q: %w", storeName, err)
    }
    return store, nil
}

func (s *Source) ListDICOMStores(tokenStr string) ([]*healthcare.DicomStore, error) {
    svc, err := s.getService(tokenStr)
    if err != nil {
        return nil, err
    }

    datasetName := fmt.Sprintf("projects/%s/locations/%s/datasets/%s", s.Project(), s.Region(), s.DatasetID())
    stores, err := svc.Projects.Locations.Datasets.DicomStores.List(datasetName).Do()
    if err != nil {
        return nil, fmt.Errorf("failed to get dataset %q: %w", datasetName, err)
    }
    var filtered []*healthcare.DicomStore
    for _, store := range stores.DicomStores {
        if len(s.AllowedDICOMStores()) == 0 {
            filtered = append(filtered, store)
            continue
        }
        if len(store.Name) == 0 {
            continue
        }
        parts := strings.Split(store.Name, "/")
        if _, ok := s.AllowedDICOMStores()[parts[len(parts)-1]]; ok {
            filtered = append(filtered, store)
        }
    }
    return filtered, nil
}

func (s *Source) ListFHIRStores(tokenStr string) ([]*healthcare.FhirStore, error) {
    svc, err := s.getService(tokenStr)
    if err != nil {
        return nil, err
    }

    datasetName := fmt.Sprintf("projects/%s/locations/%s/datasets/%s", s.Project(), s.Region(), s.DatasetID())
    stores, err := svc.Projects.Locations.Datasets.FhirStores.List(datasetName).Do()
    if err != nil {
        return nil, fmt.Errorf("failed to get dataset %q: %w", datasetName, err)
    }
    var filtered []*healthcare.FhirStore
    for _, store := range stores.FhirStores {
        if len(s.AllowedFHIRStores()) == 0 {
            filtered = append(filtered, store)
            continue
        }
        if len(store.Name) == 0 {
            continue
        }
        parts := strings.Split(store.Name, "/")
        if _, ok := s.AllowedFHIRStores()[parts[len(parts)-1]]; ok {
            filtered = append(filtered, store)
        }
    }
    return filtered, nil
}

func (s *Source) RetrieveRenderedDICOMInstance(storeID, study, series, sop string, frame int, tokenStr string) (any, error) {
    svc, err := s.getService(tokenStr)
    if err != nil {
        return nil, err
    }

    name := fmt.Sprintf("projects/%s/locations/%s/datasets/%s/dicomStores/%s", s.Project(), s.Region(), s.DatasetID(), storeID)
    dicomWebPath := fmt.Sprintf("studies/%s/series/%s/instances/%s/frames/%d/rendered", study, series, sop, frame)
    call := svc.Projects.Locations.Datasets.DicomStores.Studies.Series.Instances.Frames.RetrieveRendered(name, dicomWebPath)
    call.Header().Set("Accept", "image/jpeg")
    resp, err := call.Do()
    if err != nil {
        return nil, fmt.Errorf("unable to retrieve dicom instance rendered image: %w", err)
    }
    defer resp.Body.Close()

    respBytes, err := io.ReadAll(resp.Body)
    if err != nil {
        return nil, fmt.Errorf("could not read response: %w", err)
    }
    if resp.StatusCode > 299 {
        return nil, fmt.Errorf("RetrieveRendered: status %d %s: %s", resp.StatusCode, resp.Status, respBytes)
    }
    base64String := base64.StdEncoding.EncodeToString(respBytes)
    return base64String, nil
}

func (s *Source) SearchDICOM(toolKind, storeID, dicomWebPath, tokenStr string, opts []googleapi.CallOption) (any, error) {
    svc, err := s.getService(tokenStr)
    if err != nil {
        return nil, err
    }
    name := fmt.Sprintf("projects/%s/locations/%s/datasets/%s/dicomStores/%s", s.Project(), s.Region(), s.DatasetID(), storeID)
    var resp *http.Response
    switch toolKind {
    case "cloud-healthcare-search-dicom-instances":
        resp, err = svc.Projects.Locations.Datasets.DicomStores.SearchForInstances(name, dicomWebPath).Do(opts...)
    case "cloud-healthcare-search-dicom-series":
        resp, err = svc.Projects.Locations.Datasets.DicomStores.SearchForSeries(name, dicomWebPath).Do(opts...)
    case "cloud-healthcare-search-dicom-studies":
        resp, err = svc.Projects.Locations.Datasets.DicomStores.SearchForStudies(name, dicomWebPath).Do(opts...)
    default:
        return nil, fmt.Errorf("incompatible tool kind: %s", toolKind)
    }
    if err != nil {
        return nil, fmt.Errorf("failed to search dicom series: %w", err)
    }
    defer resp.Body.Close()

    respBytes, err := io.ReadAll(resp.Body)
    if err != nil {
        return nil, fmt.Errorf("could not read response: %w", err)
    }
    if resp.StatusCode > 299 {
        return nil, fmt.Errorf("search: status %d %s: %s", resp.StatusCode, resp.Status, respBytes)
    }
    if len(respBytes) == 0 {
        return []interface{}{}, nil
    }
    var result []interface{}
    if err := json.Unmarshal(respBytes, &result); err != nil {
        return nil, fmt.Errorf("could not unmarshal response as list: %w", err)
    }
    return result, nil
}
@@ -19,6 +19,8 @@ import (

    "fmt"

    dataplexapi "cloud.google.com/go/dataplex/apiv1"
    "cloud.google.com/go/dataplex/apiv1/dataplexpb"
    "github.com/cenkalti/backoff/v5"
    "github.com/goccy/go-yaml"
    "github.com/googleapis/genai-toolbox/internal/sources"
    "github.com/googleapis/genai-toolbox/internal/util"
@@ -121,3 +123,101 @@ func initDataplexConnection(

    }
    return client, nil
}

func (s *Source) LookupEntry(ctx context.Context, name string, view int, aspectTypes []string, entry string) (*dataplexpb.Entry, error) {
    viewMap := map[int]dataplexpb.EntryView{
        1: dataplexpb.EntryView_BASIC,
        2: dataplexpb.EntryView_FULL,
        3: dataplexpb.EntryView_CUSTOM,
        4: dataplexpb.EntryView_ALL,
    }
    req := &dataplexpb.LookupEntryRequest{
        Name:        name,
        View:        viewMap[view],
        AspectTypes: aspectTypes,
        Entry:       entry,
    }
    result, err := s.CatalogClient().LookupEntry(ctx, req)
    if err != nil {
        return nil, err
    }
    return result, nil
}

func (s *Source) searchRequest(ctx context.Context, query string, pageSize int, orderBy string) (*dataplexapi.SearchEntriesResultIterator, error) {
    // Create SearchEntriesRequest with the provided parameters
    req := &dataplexpb.SearchEntriesRequest{
        Query:          query,
        Name:           fmt.Sprintf("projects/%s/locations/global", s.ProjectID()),
        PageSize:       int32(pageSize),
        OrderBy:        orderBy,
        SemanticSearch: true,
    }

    // Perform the search using the CatalogClient - this will return an iterator
    it := s.CatalogClient().SearchEntries(ctx, req)
    if it == nil {
        return nil, fmt.Errorf("failed to create search entries iterator for project %q", s.ProjectID())
    }
    return it, nil
}

func (s *Source) SearchAspectTypes(ctx context.Context, query string, pageSize int, orderBy string) ([]*dataplexpb.AspectType, error) {
    q := query + " type=projects/dataplex-types/locations/global/entryTypes/aspecttype"
    it, err := s.searchRequest(ctx, q, pageSize, orderBy)
    if err != nil {
        return nil, err
    }

    // Iterate through the search results and call GetAspectType for each result using the resource name
    var results []*dataplexpb.AspectType
    for {
        entry, err := it.Next()
        if err != nil {
            break
        }

        // Create an instance of exponential backoff with default values for retrying GetAspectType calls
        // InitialInterval, RandomizationFactor, Multiplier, MaxInterval = 500 ms, 0.5, 1.5, 60 s
        getAspectBackOff := backoff.NewExponentialBackOff()

        resourceName := entry.DataplexEntry.GetEntrySource().Resource
        getAspectTypeReq := &dataplexpb.GetAspectTypeRequest{
            Name: resourceName,
        }

        operation := func() (*dataplexpb.AspectType, error) {
            aspectType, err := s.CatalogClient().GetAspectType(ctx, getAspectTypeReq)
            if err != nil {
                return nil, fmt.Errorf("failed to get aspect type for entry %q: %w", resourceName, err)
            }
            return aspectType, nil
        }

        // Retry the GetAspectType operation with exponential backoff
        aspectType, err := backoff.Retry(ctx, operation, backoff.WithBackOff(getAspectBackOff))
        if err != nil {
            return nil, fmt.Errorf("failed to get aspect type after retries for entry %q: %w", resourceName, err)
        }

        results = append(results, aspectType)
    }
    return results, nil
}

func (s *Source) SearchEntries(ctx context.Context, query string, pageSize int, orderBy string) ([]*dataplexpb.SearchEntriesResult, error) {
    it, err := s.searchRequest(ctx, query, pageSize, orderBy)
    if err != nil {
        return nil, err
    }

    var results []*dataplexpb.SearchEntriesResult
    for {
        entry, err := it.Next()
        if err != nil {
            break
        }
        results = append(results, entry)
    }
    return results, nil
}
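SearchAspectTypes above wraps each GetAspectType call in backoff/v5's generic Retry. As an isolated sketch of that retry pattern (the flaky operation below is a stand-in for the real API call):

```go
package main

import (
    "context"
    "errors"
    "fmt"

    "github.com/cenkalti/backoff/v5"
)

func main() {
    ctx := context.Background()
    attempts := 0

    // operation returns a value and an error; Retry re-invokes it with
    // exponential backoff until it succeeds or retries are exhausted.
    operation := func() (string, error) {
        attempts++
        if attempts < 3 {
            return "", errors.New("transient failure")
        }
        return "ok", nil
    }

    result, err := backoff.Retry(ctx, operation, backoff.WithBackOff(backoff.NewExponentialBackOff()))
    if err != nil {
        fmt.Println("gave up:", err)
        return
    }
    fmt.Println(result, "after", attempts, "attempts") // ok after 3 attempts
}
```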
@@ -16,7 +16,10 @@ package firestore

import (
    "context"
    "encoding/base64"
    "fmt"
    "strings"
    "time"

    "cloud.google.com/go/firestore"
    "github.com/goccy/go-yaml"
@@ -25,6 +28,7 @@ import (

    "go.opentelemetry.io/otel/trace"
    "google.golang.org/api/firebaserules/v1"
    "google.golang.org/api/option"
    "google.golang.org/genproto/googleapis/type/latlng"
)

const SourceKind string = "firestore"
@@ -113,6 +117,476 @@ func (s *Source) GetDatabaseId() string {

    return s.Database
}

// FirestoreValueToJSON converts a Firestore value to a simplified JSON representation.
// This removes type information and returns plain values.
func FirestoreValueToJSON(value any) any {
    if value == nil {
        return nil
    }

    switch v := value.(type) {
    case time.Time:
        return v.Format(time.RFC3339Nano)
    case *latlng.LatLng:
        return map[string]any{
            "latitude":  v.Latitude,
            "longitude": v.Longitude,
        }
    case []byte:
        return base64.StdEncoding.EncodeToString(v)
    case []any:
        result := make([]any, len(v))
        for i, item := range v {
            result[i] = FirestoreValueToJSON(item)
        }
        return result
    case map[string]any:
        result := make(map[string]any)
        for k, val := range v {
            result[k] = FirestoreValueToJSON(val)
        }
        return result
    case *firestore.DocumentRef:
        return v.Path
    default:
        return value
    }
}

// BuildQuery constructs the Firestore query from parameters
func (s *Source) BuildQuery(collectionPath string, filter firestore.EntityFilter, selectFields []string, field string, direction firestore.Direction, limit int, analyzeQuery bool) (*firestore.Query, error) {
    collection := s.FirestoreClient().Collection(collectionPath)
    query := collection.Query

    // Process and apply filters if template is provided
    if filter != nil {
        query = query.WhereEntity(filter)
    }
    if len(selectFields) > 0 {
        query = query.Select(selectFields...)
    }
    if field != "" {
        query = query.OrderBy(field, direction)
    }
    query = query.Limit(limit)

    // Apply analyze options if enabled
    if analyzeQuery {
        query = query.WithRunOptions(firestore.ExplainOptions{
            Analyze: true,
        })
    }

    return &query, nil
}

// QueryResult represents a document result from the query
type QueryResult struct {
    ID         string         `json:"id"`
    Path       string         `json:"path"`
    Data       map[string]any `json:"data"`
    CreateTime any            `json:"createTime,omitempty"`
    UpdateTime any            `json:"updateTime,omitempty"`
    ReadTime   any            `json:"readTime,omitempty"`
}

// QueryResponse represents the full response including optional metrics
type QueryResponse struct {
    Documents      []QueryResult  `json:"documents"`
    ExplainMetrics map[string]any `json:"explainMetrics,omitempty"`
}

// ExecuteQuery runs the query and formats the results
func (s *Source) ExecuteQuery(ctx context.Context, query *firestore.Query, analyzeQuery bool) (any, error) {
    docIterator := query.Documents(ctx)
    docs, err := docIterator.GetAll()
    if err != nil {
        return nil, fmt.Errorf("failed to execute query: %w", err)
    }
    // Convert results to structured format
    results := make([]QueryResult, len(docs))
    for i, doc := range docs {
        results[i] = QueryResult{
            ID:         doc.Ref.ID,
            Path:       doc.Ref.Path,
            Data:       doc.Data(),
            CreateTime: doc.CreateTime,
            UpdateTime: doc.UpdateTime,
            ReadTime:   doc.ReadTime,
        }
    }

    // Return with explain metrics if requested
    if analyzeQuery {
        explainMetrics, err := getExplainMetrics(docIterator)
        if err == nil && explainMetrics != nil {
            response := QueryResponse{
                Documents:      results,
                ExplainMetrics: explainMetrics,
            }
            return response, nil
        }
    }
    return results, nil
}

// getExplainMetrics extracts explain metrics from the query iterator
func getExplainMetrics(docIterator *firestore.DocumentIterator) (map[string]any, error) {
    explainMetrics, err := docIterator.ExplainMetrics()
    if err != nil || explainMetrics == nil {
        return nil, err
    }

    metricsData := make(map[string]any)

    // Add plan summary if available
    if explainMetrics.PlanSummary != nil {
        planSummary := make(map[string]any)
        planSummary["indexesUsed"] = explainMetrics.PlanSummary.IndexesUsed
        metricsData["planSummary"] = planSummary
    }

    // Add execution stats if available
    if explainMetrics.ExecutionStats != nil {
        executionStats := make(map[string]any)
        executionStats["resultsReturned"] = explainMetrics.ExecutionStats.ResultsReturned
        executionStats["readOperations"] = explainMetrics.ExecutionStats.ReadOperations

        if explainMetrics.ExecutionStats.ExecutionDuration != nil {
            executionStats["executionDuration"] = explainMetrics.ExecutionStats.ExecutionDuration.String()
        }

        if explainMetrics.ExecutionStats.DebugStats != nil {
            executionStats["debugStats"] = *explainMetrics.ExecutionStats.DebugStats
        }

        metricsData["executionStats"] = executionStats
    }

    return metricsData, nil
}

func (s *Source) GetDocuments(ctx context.Context, documentPaths []string) ([]any, error) {
    // Create document references from paths
    docRefs := make([]*firestore.DocumentRef, len(documentPaths))
    for i, path := range documentPaths {
        docRefs[i] = s.FirestoreClient().Doc(path)
    }

    // Get all documents
    snapshots, err := s.FirestoreClient().GetAll(ctx, docRefs)
    if err != nil {
        return nil, fmt.Errorf("failed to get documents: %w", err)
    }

    // Convert snapshots to response data
    results := make([]any, len(snapshots))
    for i, snapshot := range snapshots {
        docData := make(map[string]any)
        docData["path"] = documentPaths[i]
        docData["exists"] = snapshot.Exists()

        if snapshot.Exists() {
            docData["data"] = snapshot.Data()
            docData["createTime"] = snapshot.CreateTime
            docData["updateTime"] = snapshot.UpdateTime
            docData["readTime"] = snapshot.ReadTime
        }

        results[i] = docData
    }

    return results, nil
}

func (s *Source) AddDocuments(ctx context.Context, collectionPath string, documentData any, returnData bool) (map[string]any, error) {
    // Get the collection reference
    collection := s.FirestoreClient().Collection(collectionPath)

    // Add the document to the collection
    docRef, writeResult, err := collection.Add(ctx, documentData)
    if err != nil {
        return nil, fmt.Errorf("failed to add document: %w", err)
    }
    // Build the response
    response := map[string]any{
        "documentPath": docRef.Path,
        "createTime":   writeResult.UpdateTime.Format("2006-01-02T15:04:05.999999999Z"),
    }
    // Add document data if requested
    if returnData {
        // Fetch the updated document to return the current state
        snapshot, err := docRef.Get(ctx)
        if err != nil {
            return nil, fmt.Errorf("failed to retrieve updated document: %w", err)
        }
        // Convert the document data back to simple JSON format
        simplifiedData := FirestoreValueToJSON(snapshot.Data())
        response["documentData"] = simplifiedData
    }
    return response, nil
}

func (s *Source) UpdateDocument(ctx context.Context, documentPath string, updates []firestore.Update, documentData any, returnData bool) (map[string]any, error) {
    // Get the document reference
    docRef := s.FirestoreClient().Doc(documentPath)

    // Prepare update data
    var writeResult *firestore.WriteResult
    var writeErr error

    if len(updates) > 0 {
        writeResult, writeErr = docRef.Update(ctx, updates)
    } else {
        writeResult, writeErr = docRef.Set(ctx, documentData, firestore.MergeAll)
    }

    if writeErr != nil {
        return nil, fmt.Errorf("failed to update document: %w", writeErr)
    }

    // Build the response
    response := map[string]any{
        "documentPath": docRef.Path,
        "updateTime":   writeResult.UpdateTime.Format("2006-01-02T15:04:05.999999999Z"),
    }

    // Add document data if requested
    if returnData {
        // Fetch the updated document to return the current state
        snapshot, err := docRef.Get(ctx)
        if err != nil {
            return nil, fmt.Errorf("failed to retrieve updated document: %w", err)
        }
        // Convert the document data to simple JSON format
        simplifiedData := FirestoreValueToJSON(snapshot.Data())
        response["documentData"] = simplifiedData
    }

    return response, nil
}

func (s *Source) DeleteDocuments(ctx context.Context, documentPaths []string) ([]any, error) {
    // Create a BulkWriter to handle multiple deletions efficiently
    bulkWriter := s.FirestoreClient().BulkWriter(ctx)

    // Keep track of jobs for each document
    jobs := make([]*firestore.BulkWriterJob, len(documentPaths))

    // Add all delete operations to the BulkWriter
    for i, path := range documentPaths {
        docRef := s.FirestoreClient().Doc(path)
        job, err := bulkWriter.Delete(docRef)
        if err != nil {
            return nil, fmt.Errorf("failed to add delete operation for document %q: %w", path, err)
        }
        jobs[i] = job
    }

    // End the BulkWriter to execute all operations
    bulkWriter.End()

    // Collect results
    results := make([]any, len(documentPaths))
    for i, job := range jobs {
        docData := make(map[string]any)
        docData["path"] = documentPaths[i]

        // Wait for the job to complete and get the result
        _, err := job.Results()
        if err != nil {
            docData["success"] = false
            docData["error"] = err.Error()
        } else {
            docData["success"] = true
        }

        results[i] = docData
    }
    return results, nil
}

func (s *Source) ListCollections(ctx context.Context, parentPath string) ([]any, error) {
    var collectionRefs []*firestore.CollectionRef
    var err error
    if parentPath != "" {
        // List subcollections of the specified document
        docRef := s.FirestoreClient().Doc(parentPath)
        collectionRefs, err = docRef.Collections(ctx).GetAll()
        if err != nil {
            return nil, fmt.Errorf("failed to list subcollections of document %q: %w", parentPath, err)
        }
    } else {
        // List root collections
        collectionRefs, err = s.FirestoreClient().Collections(ctx).GetAll()
        if err != nil {
            return nil, fmt.Errorf("failed to list root collections: %w", err)
        }
    }

    // Convert collection references to response data
    results := make([]any, len(collectionRefs))
    for i, collRef := range collectionRefs {
        collData := make(map[string]any)
        collData["id"] = collRef.ID
        collData["path"] = collRef.Path

        // If this is a subcollection, include parent information
        if collRef.Parent != nil {
            collData["parent"] = collRef.Parent.Path
        }
        results[i] = collData
    }
    return results, nil
}

func (s *Source) GetRules(ctx context.Context) (any, error) {
    // Get the latest release for Firestore
    releaseName := fmt.Sprintf("projects/%s/releases/cloud.firestore/%s", s.GetProjectId(), s.GetDatabaseId())
    release, err := s.FirebaseRulesClient().Projects.Releases.Get(releaseName).Context(ctx).Do()
    if err != nil {
        return nil, fmt.Errorf("failed to get latest Firestore release: %w", err)
    }

    if release.RulesetName == "" {
        return nil, fmt.Errorf("no active Firestore rules were found in project '%s' and database '%s'", s.GetProjectId(), s.GetDatabaseId())
    }

    // Get the ruleset content
    ruleset, err := s.FirebaseRulesClient().Projects.Rulesets.Get(release.RulesetName).Context(ctx).Do()
    if err != nil {
        return nil, fmt.Errorf("failed to get ruleset content: %w", err)
    }

    if ruleset.Source == nil || len(ruleset.Source.Files) == 0 {
        return nil, fmt.Errorf("no rules files found in ruleset")
    }

    return ruleset, nil
}

// SourcePosition represents the location of an issue in the source
type SourcePosition struct {
    FileName      string `json:"fileName,omitempty"`
    Line          int64  `json:"line"`          // 1-based
    Column        int64  `json:"column"`        // 1-based
    CurrentOffset int64  `json:"currentOffset"` // 0-based, inclusive start
    EndOffset     int64  `json:"endOffset"`     // 0-based, exclusive end
}

// Issue represents a validation issue in the rules
type Issue struct {
    SourcePosition SourcePosition `json:"sourcePosition"`
    Description    string         `json:"description"`
    Severity       string         `json:"severity"`
}

// ValidationResult represents the result of rules validation
type ValidationResult struct {
    Valid           bool    `json:"valid"`
    IssueCount      int     `json:"issueCount"`
    FormattedIssues string  `json:"formattedIssues,omitempty"`
    RawIssues       []Issue `json:"rawIssues,omitempty"`
}

func (s *Source) ValidateRules(ctx context.Context, sourceParam string) (any, error) {
    // Create test request
    testRequest := &firebaserules.TestRulesetRequest{
        Source: &firebaserules.Source{
            Files: []*firebaserules.File{
                {
                    Name:    "firestore.rules",
                    Content: sourceParam,
                },
            },
        },
        // We don't need test cases for validation only
        TestSuite: &firebaserules.TestSuite{
            TestCases: []*firebaserules.TestCase{},
        },
    }
    // Call the test API
    projectName := fmt.Sprintf("projects/%s", s.GetProjectId())
    response, err := s.FirebaseRulesClient().Projects.Test(projectName, testRequest).Context(ctx).Do()
    if err != nil {
        return nil, fmt.Errorf("failed to validate rules: %w", err)
    }

    // Process the response
    if len(response.Issues) == 0 {
        return ValidationResult{
            Valid:           true,
            IssueCount:      0,
            FormattedIssues: "✓ No errors detected. Rules are valid.",
        }, nil
    }

    // Convert issues to our format
    issues := make([]Issue, len(response.Issues))
    for i, issue := range response.Issues {
        issues[i] = Issue{
            Description: issue.Description,
            Severity:    issue.Severity,
            SourcePosition: SourcePosition{
                FileName:      issue.SourcePosition.FileName,
                Line:          issue.SourcePosition.Line,
                Column:        issue.SourcePosition.Column,
                CurrentOffset: issue.SourcePosition.CurrentOffset,
                EndOffset:     issue.SourcePosition.EndOffset,
            },
        }
    }

    // Format issues
    sourceLines := strings.Split(sourceParam, "\n")
    var formattedOutput []string

    formattedOutput = append(formattedOutput, fmt.Sprintf("Found %d issue(s) in rules source:\n", len(issues)))

    for _, issue := range issues {
        issueString := fmt.Sprintf("%s: %s [Ln %d, Col %d]",
            issue.Severity,
            issue.Description,
            issue.SourcePosition.Line,
            issue.SourcePosition.Column)

        if issue.SourcePosition.Line > 0 {
            lineIndex := int(issue.SourcePosition.Line - 1) // 0-based index
            if lineIndex >= 0 && lineIndex < len(sourceLines) {
                errorLine := sourceLines[lineIndex]
                issueString += fmt.Sprintf("\n```\n%s", errorLine)

                // Add carets if we have column and offset information
                if issue.SourcePosition.Column > 0 &&
                    issue.SourcePosition.CurrentOffset >= 0 &&
                    issue.SourcePosition.EndOffset > issue.SourcePosition.CurrentOffset {

                    startColumn := int(issue.SourcePosition.Column - 1) // 0-based
                    errorTokenLength := int(issue.SourcePosition.EndOffset - issue.SourcePosition.CurrentOffset)

                    if startColumn >= 0 && errorTokenLength > 0 && startColumn <= len(errorLine) {
                        padding := strings.Repeat(" ", startColumn)
                        carets := strings.Repeat("^", errorTokenLength)
                        issueString += fmt.Sprintf("\n%s%s", padding, carets)
                    }
                }
                issueString += "\n```"
            }
        }

        formattedOutput = append(formattedOutput, issueString)
    }

    formattedIssues := strings.Join(formattedOutput, "\n\n")

    return ValidationResult{
        Valid:           false,
        IssueCount:      len(issues),
        FormattedIssues: formattedIssues,
        RawIssues:       issues,
    }, nil
}

func initFirestoreConnection(
    ctx context.Context,
    tracer trace.Tracer,
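For orientation, a hedged sketch of how a caller might chain the BuildQuery and ExecuteQuery helpers added above. The collection name, filter, field list, and limit are illustrative, and the import path is assumed by analogy with the other source packages shown in this diff:

```go
package example

import (
    "context"

    "cloud.google.com/go/firestore"

    fssource "github.com/googleapis/genai-toolbox/internal/sources/firestore"
)

// queryActiveUsers is a hypothetical caller; analyzeQuery=true asks for
// explain metrics alongside the matching documents.
func queryActiveUsers(ctx context.Context, src *fssource.Source) (any, error) {
    filter := firestore.PropertyFilter{Path: "active", Operator: "==", Value: true}
    q, err := src.BuildQuery("users", filter, []string{"name", "active"}, "name", firestore.Asc, 25, true)
    if err != nil {
        return nil, err
    }
    return src.ExecuteQuery(ctx, q, true)
}
```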
@@ -16,6 +16,7 @@ package firestore_test

import (
    "testing"
    "time"

    yaml "github.com/goccy/go-yaml"
    "github.com/google/go-cmp/cmp"
@@ -128,3 +129,37 @@ func TestFailParseFromYamlFirestore(t *testing.T) {

        })
    }
}

func TestFirestoreValueToJSON_RoundTrip(t *testing.T) {
    // Test round-trip conversion
    original := map[string]any{
        "name":   "Test",
        "count":  int64(42),
        "price":  19.99,
        "active": true,
        "tags":   []any{"tag1", "tag2"},
        "metadata": map[string]any{
            "created": time.Now(),
        },
        "nullField": nil,
    }

    // Convert to JSON representation
    jsonRepresentation := firestore.FirestoreValueToJSON(original)

    // Verify types are simplified
    jsonMap, ok := jsonRepresentation.(map[string]any)
    if !ok {
        t.Fatalf("Expected map, got %T", jsonRepresentation)
    }

    // Time should be converted to string
    metadata, ok := jsonMap["metadata"].(map[string]any)
    if !ok {
        t.Fatalf("metadata should be a map, got %T", jsonMap["metadata"])
    }
    _, ok = metadata["created"].(string)
    if !ok {
        t.Errorf("created should be a string, got %T", metadata["created"])
    }
}
@@ -16,7 +16,9 @@ package http
 import (
 	"context"
 	"crypto/tls"
+	"encoding/json"
 	"fmt"
+	"io"
 	"net/http"
 	"net/url"
 	"time"
@@ -143,3 +145,28 @@ func (s *Source) HttpQueryParams() map[string]string {
 func (s *Source) Client() *http.Client {
 	return s.client
 }
+
+func (s *Source) RunRequest(req *http.Request) (any, error) {
+	// Make request and fetch response
+	resp, err := s.Client().Do(req)
+	if err != nil {
+		return nil, fmt.Errorf("error making HTTP request: %s", err)
+	}
+	defer resp.Body.Close()
+
+	var body []byte
+	body, err = io.ReadAll(resp.Body)
+	if err != nil {
+		return nil, err
+	}
+	if resp.StatusCode < 200 || resp.StatusCode > 299 {
+		return nil, fmt.Errorf("unexpected status code: %d, response body: %s", resp.StatusCode, string(body))
+	}
+
+	var data any
+	if err = json.Unmarshal(body, &data); err != nil {
+		// if unable to unmarshal data, return result as string.
+		return string(body), nil
+	}
+	return data, nil
+}
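The new RunRequest helper centralizes response handling for HTTP tools: it executes the request, rejects non-2xx statuses, and returns the raw body as a string when it is not JSON. A minimal sketch of a caller, assuming an already-initialized HTTP Source; the function name and URL below are illustrative, not part of this change:

// Sketch only: a hypothetical caller of the new RunRequest helper.
package http

import (
	"context"
	"net/http"
)

// fetchJSON is illustrative; it builds a GET request and lets RunRequest
// decode the response body (parsed JSON when possible, raw string otherwise).
func fetchJSON(ctx context.Context, src *Source, url string) (any, error) {
	req, err := http.NewRequestWithContext(ctx, http.MethodGet, url, nil)
	if err != nil {
		return nil, err
	}
	return src.RunRequest(req)
}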
@@ -15,7 +15,9 @@ package looker

 import (
 	"context"
+	"crypto/tls"
 	"fmt"
+	"net/http"
 	"strings"
 	"time"

@@ -208,6 +210,49 @@ func (s *Source) LookerSessionLength() int64 {
 	return s.SessionLength
 }
+
+// Make types for RoundTripper
+type transportWithAuthHeader struct {
+	Base      http.RoundTripper
+	AuthToken string
+}
+
+func (t *transportWithAuthHeader) RoundTrip(req *http.Request) (*http.Response, error) {
+	req.Header.Set("x-looker-appid", "go-sdk")
+	req.Header.Set("Authorization", t.AuthToken)
+	return t.Base.RoundTrip(req)
+}
+
+func (s *Source) GetLookerSDK(accessToken string) (*v4.LookerSDK, error) {
+	if s.UseClientAuthorization() {
+		if accessToken == "" {
+			return nil, fmt.Errorf("no access token supplied with request")
+		}
+
+		session := rtl.NewAuthSession(*s.LookerApiSettings())
+		// Configure base transport with TLS
+		transport := &http.Transport{
+			TLSClientConfig: &tls.Config{
+				InsecureSkipVerify: !s.LookerApiSettings().VerifySsl,
+			},
+		}
+
+		// Build transport for end user token
+		session.Client = http.Client{
+			Transport: &transportWithAuthHeader{
+				Base:      transport,
+				AuthToken: accessToken,
+			},
+		}
+		// return SDK with new Transport
+		return v4.NewLookerSDK(session), nil
+	}
+
+	if s.LookerClient() == nil {
+		return nil, fmt.Errorf("client id or client secret not valid")
+	}
+	return s.LookerClient(), nil
+}

 func initGoogleCloudConnection(ctx context.Context) (oauth2.TokenSource, error) {
 	cred, err := google.FindDefaultCredentials(ctx, geminidataanalytics.DefaultAuthScopes()...)
 	if err != nil {
@@ -16,11 +16,14 @@ package mongodb

 import (
 	"context"
+	"encoding/json"
+	"errors"
 	"fmt"

 	"github.com/goccy/go-yaml"
 	"github.com/googleapis/genai-toolbox/internal/sources"
 	"github.com/googleapis/genai-toolbox/internal/util"
+	"go.mongodb.org/mongo-driver/bson"
 	"go.mongodb.org/mongo-driver/mongo"
 	"go.mongodb.org/mongo-driver/mongo/options"
 	"go.opentelemetry.io/otel/trace"
@@ -93,6 +96,201 @@ func (s *Source) MongoClient() *mongo.Client {
 	return s.Client
 }
+
+func parseData(ctx context.Context, cur *mongo.Cursor) ([]any, error) {
+	var data = []any{}
+	err := cur.All(ctx, &data)
+	if err != nil {
+		return nil, err
+	}
+	var final []any
+	for _, item := range data {
+		tmp, _ := bson.MarshalExtJSON(item, false, false)
+		var tmp2 any
+		err = json.Unmarshal(tmp, &tmp2)
+		if err != nil {
+			return nil, err
+		}
+		final = append(final, tmp2)
+	}
+	return final, err
+}
+
+func (s *Source) Aggregate(ctx context.Context, pipelineString string, canonical, readOnly bool, database, collection string) ([]any, error) {
+	var pipeline = []bson.M{}
+	err := bson.UnmarshalExtJSON([]byte(pipelineString), canonical, &pipeline)
+	if err != nil {
+		return nil, err
+	}
+
+	if readOnly {
+		//fail if we do a merge or an out
+		for _, stage := range pipeline {
+			for key := range stage {
+				if key == "$merge" || key == "$out" {
+					return nil, fmt.Errorf("this is not a read-only pipeline: %+v", stage)
+				}
+			}
+		}
+	}
+
+	cur, err := s.MongoClient().Database(database).Collection(collection).Aggregate(ctx, pipeline)
+	if err != nil {
+		return nil, err
+	}
+	defer cur.Close(ctx)
+	res, err := parseData(ctx, cur)
+	if err != nil {
+		return nil, err
+	}
+	if res == nil {
+		return []any{}, nil
+	}
+	return res, err
+}
+
+func (s *Source) Find(ctx context.Context, filterString, database, collection string, opts *options.FindOptions) ([]any, error) {
+	var filter = bson.D{}
+	err := bson.UnmarshalExtJSON([]byte(filterString), false, &filter)
+	if err != nil {
+		return nil, err
+	}
+
+	cur, err := s.MongoClient().Database(database).Collection(collection).Find(ctx, filter, opts)
+	if err != nil {
+		return nil, err
+	}
+	defer cur.Close(ctx)
+	return parseData(ctx, cur)
+}
+
+func (s *Source) FindOne(ctx context.Context, filterString, database, collection string, opts *options.FindOneOptions) ([]any, error) {
+	var filter = bson.D{}
+	err := bson.UnmarshalExtJSON([]byte(filterString), false, &filter)
+	if err != nil {
+		return nil, err
+	}
+
+	res := s.MongoClient().Database(database).Collection(collection).FindOne(ctx, filter, opts)
+	if res.Err() != nil {
+		return nil, res.Err()
+	}
+
+	var data any
+	err = res.Decode(&data)
+	if err != nil {
+		return nil, err
+	}
+
+	var final []any
+	tmp, _ := bson.MarshalExtJSON(data, false, false)
+	var tmp2 any
+	err = json.Unmarshal(tmp, &tmp2)
+	if err != nil {
+		return nil, err
+	}
+	final = append(final, tmp2)
+
+	return final, err
+}
+
+func (s *Source) InsertMany(ctx context.Context, jsonData string, canonical bool, database, collection string) ([]any, error) {
+	var data = []any{}
+	err := bson.UnmarshalExtJSON([]byte(jsonData), canonical, &data)
+	if err != nil {
+		return nil, err
+	}
+
+	res, err := s.MongoClient().Database(database).Collection(collection).InsertMany(ctx, data, options.InsertMany())
+	if err != nil {
+		return nil, err
+	}
+	return res.InsertedIDs, nil
+}
+
+func (s *Source) InsertOne(ctx context.Context, jsonData string, canonical bool, database, collection string) (any, error) {
+	var data any
+	err := bson.UnmarshalExtJSON([]byte(jsonData), canonical, &data)
+	if err != nil {
+		return nil, err
+	}
+
+	res, err := s.MongoClient().Database(database).Collection(collection).InsertOne(ctx, data, options.InsertOne())
+	if err != nil {
+		return nil, err
+	}
+	return res.InsertedID, nil
+}
+
+func (s *Source) UpdateMany(ctx context.Context, filterString string, canonical bool, updateString, database, collection string, upsert bool) ([]any, error) {
+	var filter = bson.D{}
+	err := bson.UnmarshalExtJSON([]byte(filterString), canonical, &filter)
+	if err != nil {
+		return nil, fmt.Errorf("unable to unmarshal filter string: %w", err)
+	}
+	var update = bson.D{}
+	err = bson.UnmarshalExtJSON([]byte(updateString), false, &update)
+	if err != nil {
+		return nil, fmt.Errorf("unable to unmarshal update string: %w", err)
+	}
+
+	res, err := s.MongoClient().Database(database).Collection(collection).UpdateMany(ctx, filter, update, options.Update().SetUpsert(upsert))
+	if err != nil {
+		return nil, fmt.Errorf("error updating collection: %w", err)
+	}
+	return []any{res.ModifiedCount, res.UpsertedCount, res.MatchedCount}, nil
+}
+
+func (s *Source) UpdateOne(ctx context.Context, filterString string, canonical bool, updateString, database, collection string, upsert bool) (any, error) {
+	var filter = bson.D{}
+	err := bson.UnmarshalExtJSON([]byte(filterString), false, &filter)
+	if err != nil {
+		return nil, fmt.Errorf("unable to unmarshal filter string: %w", err)
+	}
+	var update = bson.D{}
+	err = bson.UnmarshalExtJSON([]byte(updateString), canonical, &update)
+	if err != nil {
+		return nil, fmt.Errorf("unable to unmarshal update string: %w", err)
+	}
+
+	res, err := s.MongoClient().Database(database).Collection(collection).UpdateOne(ctx, filter, update, options.Update().SetUpsert(upsert))
+	if err != nil {
+		return nil, fmt.Errorf("error updating collection: %w", err)
+	}
+	return res.ModifiedCount, nil
+}
+
+func (s *Source) DeleteMany(ctx context.Context, filterString, database, collection string) (any, error) {
+	var filter = bson.D{}
+	err := bson.UnmarshalExtJSON([]byte(filterString), false, &filter)
+	if err != nil {
+		return nil, err
+	}
+
+	res, err := s.MongoClient().Database(database).Collection(collection).DeleteMany(ctx, filter, options.Delete())
+	if err != nil {
+		return nil, err
+	}
+
+	if res.DeletedCount == 0 {
+		return nil, errors.New("no document found")
+	}
+	return res.DeletedCount, nil
+}
+
+func (s *Source) DeleteOne(ctx context.Context, filterString, database, collection string) (any, error) {
+	var filter = bson.D{}
+	err := bson.UnmarshalExtJSON([]byte(filterString), false, &filter)
+	if err != nil {
+		return nil, err
+	}
+
+	res, err := s.MongoClient().Database(database).Collection(collection).DeleteOne(ctx, filter, options.Delete())
+	if err != nil {
+		return nil, err
+	}
+	return res.DeletedCount, nil
+}
+
 func initMongoDBClient(ctx context.Context, tracer trace.Tracer, name, uri string) (*mongo.Client, error) {
 	// Start a tracing span
 	ctx, span := sources.InitConnectionSpan(ctx, tracer, SourceKind, name)
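The MongoDB source now exposes query and mutation helpers (Aggregate, Find, FindOne, Insert*, Update*, Delete*) that accept extended-JSON strings and return plain Go values. A minimal sketch of how a tool might call them, assuming an initialized MongoDB Source; the database, collection, and filter below are placeholders:

// Sketch only: hypothetical caller of the new MongoDB source helpers.
package mongodb

import (
	"context"

	"go.mongodb.org/mongo-driver/mongo/options"
)

func exampleQueries(ctx context.Context, src *Source) ([]any, error) {
	// Filters and pipelines are passed as (extended) JSON strings.
	docs, err := src.Find(ctx, `{"status": "active"}`, "exampleDB", "exampleCollection", options.Find().SetLimit(10))
	if err != nil {
		return nil, err
	}
	if len(docs) > 0 {
		return docs, nil
	}
	// Read-only aggregation: $merge and $out stages are rejected when readOnly is true.
	return src.Aggregate(ctx, `[{"$match": {"status": "active"}}]`, false, true, "exampleDB", "exampleCollection")
}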
@@ -16,15 +16,21 @@ package serverlessspark

 import (
 	"context"
+	"encoding/json"
 	"fmt"
+	"time"

 	dataproc "cloud.google.com/go/dataproc/v2/apiv1"
+	"cloud.google.com/go/dataproc/v2/apiv1/dataprocpb"
 	longrunning "cloud.google.com/go/longrunning/autogen"
+	"cloud.google.com/go/longrunning/autogen/longrunningpb"
 	"github.com/goccy/go-yaml"
 	"github.com/googleapis/genai-toolbox/internal/sources"
 	"github.com/googleapis/genai-toolbox/internal/util"
 	"go.opentelemetry.io/otel/trace"
+	"google.golang.org/api/iterator"
 	"google.golang.org/api/option"
+	"google.golang.org/protobuf/encoding/protojson"
 )

 const SourceKind string = "serverless-spark"
@@ -121,3 +127,168 @@ func (s *Source) Close() error {
 	}
 	return nil
 }
+
+func (s *Source) CancelOperation(ctx context.Context, operation string) (any, error) {
+	req := &longrunningpb.CancelOperationRequest{
+		Name: fmt.Sprintf("projects/%s/locations/%s/operations/%s", s.GetProject(), s.GetLocation(), operation),
+	}
+	client, err := s.GetOperationsClient(ctx)
+	if err != nil {
+		return nil, fmt.Errorf("failed to get operations client: %w", err)
+	}
+	err = client.CancelOperation(ctx, req)
+	if err != nil {
+		return nil, fmt.Errorf("failed to cancel operation: %w", err)
+	}
+	return fmt.Sprintf("Cancelled [%s].", operation), nil
+}
+
+func (s *Source) CreateBatch(ctx context.Context, batch *dataprocpb.Batch) (map[string]any, error) {
+	req := &dataprocpb.CreateBatchRequest{
+		Parent: fmt.Sprintf("projects/%s/locations/%s", s.GetProject(), s.GetLocation()),
+		Batch:  batch,
+	}
+
+	client := s.GetBatchControllerClient()
+	op, err := client.CreateBatch(ctx, req)
+	if err != nil {
+		return nil, fmt.Errorf("failed to create batch: %w", err)
+	}
+	meta, err := op.Metadata()
+	if err != nil {
+		return nil, fmt.Errorf("failed to get create batch op metadata: %w", err)
+	}
+
+	projectID, location, batchID, err := ExtractBatchDetails(meta.GetBatch())
+	if err != nil {
+		return nil, fmt.Errorf("error extracting batch details from name %q: %v", meta.GetBatch(), err)
+	}
+	consoleUrl := BatchConsoleURL(projectID, location, batchID)
+	logsUrl := BatchLogsURL(projectID, location, batchID, meta.GetCreateTime().AsTime(), time.Time{})
+
+	wrappedResult := map[string]any{
+		"opMetadata": meta,
+		"consoleUrl": consoleUrl,
+		"logsUrl":    logsUrl,
+	}
+	return wrappedResult, nil
+}
+
+// ListBatchesResponse is the response from the list batches API.
+type ListBatchesResponse struct {
+	Batches       []Batch `json:"batches"`
+	NextPageToken string  `json:"nextPageToken"`
+}
+
+// Batch represents a single batch job.
+type Batch struct {
+	Name       string `json:"name"`
+	UUID       string `json:"uuid"`
+	State      string `json:"state"`
+	Creator    string `json:"creator"`
+	CreateTime string `json:"createTime"`
+	Operation  string `json:"operation"`
+	ConsoleURL string `json:"consoleUrl"`
+	LogsURL    string `json:"logsUrl"`
+}
+
+func (s *Source) ListBatches(ctx context.Context, ps *int, pt, filter string) (any, error) {
+	client := s.GetBatchControllerClient()
+	parent := fmt.Sprintf("projects/%s/locations/%s", s.GetProject(), s.GetLocation())
+	req := &dataprocpb.ListBatchesRequest{
+		Parent:  parent,
+		OrderBy: "create_time desc",
+	}

+	if ps != nil {
+		req.PageSize = int32(*ps)
+	}
+	if pt != "" {
+		req.PageToken = pt
+	}
+	if filter != "" {
+		req.Filter = filter
+	}
+
+	it := client.ListBatches(ctx, req)
+	pager := iterator.NewPager(it, int(req.PageSize), req.PageToken)
+
+	var batchPbs []*dataprocpb.Batch
+	nextPageToken, err := pager.NextPage(&batchPbs)
+	if err != nil {
+		return nil, fmt.Errorf("failed to list batches: %w", err)
+	}
+
+	batches, err := ToBatches(batchPbs)
+	if err != nil {
+		return nil, err
+	}
+
+	return ListBatchesResponse{Batches: batches, NextPageToken: nextPageToken}, nil
+}
+
+// ToBatches converts a slice of protobuf Batch messages to a slice of Batch structs.
+func ToBatches(batchPbs []*dataprocpb.Batch) ([]Batch, error) {
+	batches := make([]Batch, 0, len(batchPbs))
+	for _, batchPb := range batchPbs {
+		consoleUrl, err := BatchConsoleURLFromProto(batchPb)
+		if err != nil {
+			return nil, fmt.Errorf("error generating console url: %v", err)
+		}
+		logsUrl, err := BatchLogsURLFromProto(batchPb)
+		if err != nil {
+			return nil, fmt.Errorf("error generating logs url: %v", err)
+		}
+		batch := Batch{
+			Name:       batchPb.Name,
+			UUID:       batchPb.Uuid,
+			State:      batchPb.State.Enum().String(),
+			Creator:    batchPb.Creator,
+			CreateTime: batchPb.CreateTime.AsTime().Format(time.RFC3339),
+			Operation:  batchPb.Operation,
+			ConsoleURL: consoleUrl,
+			LogsURL:    logsUrl,
+		}
+		batches = append(batches, batch)
+	}
+	return batches, nil
+}
+
+func (s *Source) GetBatch(ctx context.Context, name string) (map[string]any, error) {
+	client := s.GetBatchControllerClient()
+	req := &dataprocpb.GetBatchRequest{
+		Name: fmt.Sprintf("projects/%s/locations/%s/batches/%s", s.GetProject(), s.GetLocation(), name),
+	}
+
+	batchPb, err := client.GetBatch(ctx, req)
+	if err != nil {
+		return nil, fmt.Errorf("failed to get batch: %w", err)
+	}
+
+	jsonBytes, err := protojson.Marshal(batchPb)
+	if err != nil {
+		return nil, fmt.Errorf("failed to marshal batch to JSON: %w", err)
+	}
+
+	var result map[string]any
+	if err := json.Unmarshal(jsonBytes, &result); err != nil {
+		return nil, fmt.Errorf("failed to unmarshal batch JSON: %w", err)
+	}
+
+	consoleUrl, err := BatchConsoleURLFromProto(batchPb)
+	if err != nil {
+		return nil, fmt.Errorf("error generating console url: %v", err)
+	}
+	logsUrl, err := BatchLogsURLFromProto(batchPb)
+	if err != nil {
+		return nil, fmt.Errorf("error generating logs url: %v", err)
+	}
+
+	wrappedResult := map[string]any{
+		"consoleUrl": consoleUrl,
+		"logsUrl":    logsUrl,
+		"batch":      result,
+	}
+
+	return wrappedResult, nil
+}
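For orientation, a sketch of driving the new batch helpers on the serverless-spark source; the page size is illustrative, and an empty token and filter simply return the first page:

// Sketch only: hypothetical caller of the new ListBatches helper.
package serverlessspark

import "context"

func listRecentBatches(ctx context.Context, src *Source) (any, error) {
	pageSize := 10
	// Batches come back ordered by create_time descending, each wrapped
	// with its console and logs URLs.
	return src.ListBatches(ctx, &pageSize, "", "")
}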
@@ -1,10 +1,10 @@
-// Copyright 2025 Google LLC
+// Copyright 2026 Google LLC
 //
 // Licensed under the Apache License, Version 2.0 (the "License");
 // you may not use this file except in compliance with the License.
 // You may obtain a copy of the License at
 //
 //     http://www.apache.org/licenses/LICENSE-2.0
 //
 // Unless required by applicable law or agreed to in writing, software
 // distributed under the License is distributed on an "AS IS" BASIS,
@@ -12,7 +12,7 @@
 // See the License for the specific language governing permissions and
 // limitations under the License.

-package common
+package serverlessspark

 import (
 	"fmt"
@@ -23,13 +23,13 @@ import (
 	"cloud.google.com/go/dataproc/v2/apiv1/dataprocpb"
 )

+var batchFullNameRegex = regexp.MustCompile(`projects/(?P<project>[^/]+)/locations/(?P<location>[^/]+)/batches/(?P<batch_id>[^/]+)`)
+
 const (
 	logTimeBufferBefore = 1 * time.Minute
 	logTimeBufferAfter  = 10 * time.Minute
 )

-var batchFullNameRegex = regexp.MustCompile(`projects/(?P<project>[^/]+)/locations/(?P<location>[^/]+)/batches/(?P<batch_id>[^/]+)`)
-
 // Extract BatchDetails extracts the project ID, location, and batch ID from a fully qualified batch name.
 func ExtractBatchDetails(batchName string) (projectID, location, batchID string, err error) {
 	matches := batchFullNameRegex.FindStringSubmatch(batchName)
@@ -39,26 +39,6 @@ func ExtractBatchDetails(batchName string) (projectID, location, batchID string,
 	return matches[1], matches[2], matches[3], nil
 }

-// BatchConsoleURLFromProto builds a URL to the Google Cloud Console linking to the batch summary page.
-func BatchConsoleURLFromProto(batchPb *dataprocpb.Batch) (string, error) {
-	projectID, location, batchID, err := ExtractBatchDetails(batchPb.GetName())
-	if err != nil {
-		return "", err
-	}
-	return BatchConsoleURL(projectID, location, batchID), nil
-}
-
-// BatchLogsURLFromProto builds a URL to the Google Cloud Console showing Cloud Logging for the given batch and time range.
-func BatchLogsURLFromProto(batchPb *dataprocpb.Batch) (string, error) {
-	projectID, location, batchID, err := ExtractBatchDetails(batchPb.GetName())
-	if err != nil {
-		return "", err
-	}
-	createTime := batchPb.GetCreateTime().AsTime()
-	stateTime := batchPb.GetStateTime().AsTime()
-	return BatchLogsURL(projectID, location, batchID, createTime, stateTime), nil
-}
-
 // BatchConsoleURL builds a URL to the Google Cloud Console linking to the batch summary page.
 func BatchConsoleURL(projectID, location, batchID string) string {
 	return fmt.Sprintf("https://console.cloud.google.com/dataproc/batches/%s/%s/summary?project=%s", location, batchID, projectID)
@@ -89,3 +69,23 @@ resource.labels.batch_id="%s"`

 	return "https://console.cloud.google.com/logs/viewer?" + v.Encode()
 }
+
+// BatchConsoleURLFromProto builds a URL to the Google Cloud Console linking to the batch summary page.
+func BatchConsoleURLFromProto(batchPb *dataprocpb.Batch) (string, error) {
+	projectID, location, batchID, err := ExtractBatchDetails(batchPb.GetName())
+	if err != nil {
+		return "", err
+	}
+	return BatchConsoleURL(projectID, location, batchID), nil
+}
+
+// BatchLogsURLFromProto builds a URL to the Google Cloud Console showing Cloud Logging for the given batch and time range.
+func BatchLogsURLFromProto(batchPb *dataprocpb.Batch) (string, error) {
+	projectID, location, batchID, err := ExtractBatchDetails(batchPb.GetName())
+	if err != nil {
+		return "", err
+	}
+	createTime := batchPb.GetCreateTime().AsTime()
+	stateTime := batchPb.GetStateTime().AsTime()
+	return BatchLogsURL(projectID, location, batchID, createTime, stateTime), nil
+}
@@ -1,10 +1,10 @@
-// Copyright 2025 Google LLC
+// Copyright 2026 Google LLC
 //
 // Licensed under the Apache License, Version 2.0 (the "License");
 // you may not use this file except in compliance with the License.
 // You may obtain a copy of the License at
 //
 //     http://www.apache.org/licenses/LICENSE-2.0
 //
 // Unless required by applicable law or agreed to in writing, software
 // distributed under the License is distributed on an "AS IS" BASIS,
@@ -12,19 +12,20 @@
 // See the License for the specific language governing permissions and
 // limitations under the License.

-package common
+package serverlessspark_test

 import (
 	"testing"
 	"time"

 	"cloud.google.com/go/dataproc/v2/apiv1/dataprocpb"
+	"github.com/googleapis/genai-toolbox/internal/sources/serverlessspark"
 	"google.golang.org/protobuf/types/known/timestamppb"
 )

 func TestExtractBatchDetails_Success(t *testing.T) {
 	batchName := "projects/my-project/locations/us-central1/batches/my-batch"
-	projectID, location, batchID, err := ExtractBatchDetails(batchName)
+	projectID, location, batchID, err := serverlessspark.ExtractBatchDetails(batchName)
 	if err != nil {
 		t.Errorf("ExtractBatchDetails() error = %v, want no error", err)
 		return
@@ -45,7 +46,7 @@ func TestExtractBatchDetails_Success(t *testing.T) {

 func TestExtractBatchDetails_Failure(t *testing.T) {
 	batchName := "invalid-name"
-	_, _, _, err := ExtractBatchDetails(batchName)
+	_, _, _, err := serverlessspark.ExtractBatchDetails(batchName)
 	wantErr := "failed to parse batch name: invalid-name"
 	if err == nil || err.Error() != wantErr {
 		t.Errorf("ExtractBatchDetails() error = %v, want %v", err, wantErr)
@@ -53,7 +54,7 @@ func TestExtractBatchDetails_Failure(t *testing.T) {
 }

 func TestBatchConsoleURL(t *testing.T) {
-	got := BatchConsoleURL("my-project", "us-central1", "my-batch")
+	got := serverlessspark.BatchConsoleURL("my-project", "us-central1", "my-batch")
 	want := "https://console.cloud.google.com/dataproc/batches/us-central1/my-batch/summary?project=my-project"
 	if got != want {
 		t.Errorf("BatchConsoleURL() = %v, want %v", got, want)
@@ -63,7 +64,7 @@ func TestBatchConsoleURL(t *testing.T) {
 func TestBatchLogsURL(t *testing.T) {
 	startTime := time.Date(2025, 10, 1, 5, 0, 0, 0, time.UTC)
 	endTime := time.Date(2025, 10, 1, 6, 0, 0, 0, time.UTC)
-	got := BatchLogsURL("my-project", "us-central1", "my-batch", startTime, endTime)
+	got := serverlessspark.BatchLogsURL("my-project", "us-central1", "my-batch", startTime, endTime)
 	want := "https://console.cloud.google.com/logs/viewer?advancedFilter=" +
 		"resource.type%3D%22cloud_dataproc_batch%22" +
 		"%0Aresource.labels.project_id%3D%22my-project%22" +
@@ -82,7 +83,7 @@ func TestBatchConsoleURLFromProto(t *testing.T) {
 	batchPb := &dataprocpb.Batch{
 		Name: "projects/my-project/locations/us-central1/batches/my-batch",
 	}
-	got, err := BatchConsoleURLFromProto(batchPb)
+	got, err := serverlessspark.BatchConsoleURLFromProto(batchPb)
 	if err != nil {
 		t.Fatalf("BatchConsoleURLFromProto() error = %v", err)
 	}
@@ -100,7 +101,7 @@ func TestBatchLogsURLFromProto(t *testing.T) {
 		CreateTime: timestamppb.New(createTime),
 		StateTime:  timestamppb.New(stateTime),
 	}
-	got, err := BatchLogsURLFromProto(batchPb)
+	got, err := serverlessspark.BatchLogsURLFromProto(batchPb)
 	if err != nil {
 		t.Fatalf("BatchLogsURLFromProto() error = %v", err)
 	}
@@ -16,22 +16,13 @@ package fhirfetchpage

 import (
 	"context"
-	"encoding/json"
 	"fmt"
-	"io"

 	"github.com/goccy/go-yaml"
 	"github.com/googleapis/genai-toolbox/internal/embeddingmodels"
 	"github.com/googleapis/genai-toolbox/internal/sources"
-	healthcareds "github.com/googleapis/genai-toolbox/internal/sources/cloudhealthcare"
 	"github.com/googleapis/genai-toolbox/internal/tools"
 	"github.com/googleapis/genai-toolbox/internal/util/parameters"
-	"google.golang.org/api/healthcare/v1"
-
-	"net/http"
-
-	"golang.org/x/oauth2"
-	"golang.org/x/oauth2/google"
 )

 const kind string = "cloud-healthcare-fhir-fetch-page"
@@ -54,13 +45,8 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T
 }

 type compatibleSource interface {
-	Project() string
-	Region() string
-	DatasetID() string
-	AllowedFHIRStores() map[string]struct{}
-	Service() *healthcare.Service
-	ServiceCreator() healthcareds.HealthcareServiceCreator
 	UseClientAuthorization() bool
+	FHIRFetchPage(context.Context, string, string) (any, error)
 }

 type Config struct {
@@ -118,48 +104,11 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 		return nil, fmt.Errorf("invalid or missing '%s' parameter; expected a string", pageURLKey)
 	}

-	var httpClient *http.Client
-	if source.UseClientAuthorization() {
-		tokenStr, err := accessToken.ParseBearerToken()
-		if err != nil {
-			return nil, fmt.Errorf("error parsing access token: %w", err)
-		}
-		ts := oauth2.StaticTokenSource(&oauth2.Token{AccessToken: tokenStr})
-		httpClient = oauth2.NewClient(ctx, ts)
-	} else {
-		// The source.Service() object holds a client with the default credentials.
-		// However, the client is not exported, so we have to create a new one.
-		var err error
-		httpClient, err = google.DefaultClient(ctx, healthcare.CloudHealthcareScope)
-		if err != nil {
-			return nil, fmt.Errorf("failed to create default http client: %w", err)
-		}
-	}
-
-	req, err := http.NewRequestWithContext(ctx, "GET", url, nil)
+	tokenStr, err := accessToken.ParseBearerToken()
 	if err != nil {
-		return nil, fmt.Errorf("failed to create http request: %w", err)
+		return nil, fmt.Errorf("error parsing access token: %w", err)
 	}
-	req.Header.Set("Accept", "application/fhir+json;charset=utf-8")
-
-	resp, err := httpClient.Do(req)
-	if err != nil {
-		return nil, fmt.Errorf("failed to get fhir page from %q: %w", url, err)
-	}
-	defer resp.Body.Close()
-
-	respBytes, err := io.ReadAll(resp.Body)
-	if err != nil {
-		return nil, fmt.Errorf("could not read response: %w", err)
-	}
-	if resp.StatusCode > 299 {
-		return nil, fmt.Errorf("read: status %d %s: %s", resp.StatusCode, resp.Status, respBytes)
-	}
-	var jsonMap map[string]interface{}
-	if err := json.Unmarshal([]byte(string(respBytes)), &jsonMap); err != nil {
-		return nil, fmt.Errorf("could not unmarshal response as json: %w", err)
-	}
-	return jsonMap, nil
+	return source.FHIRFetchPage(ctx, url, tokenStr)
 }

 func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (parameters.ParamValues, error) {
@@ -16,20 +16,16 @@ package fhirpatienteverything

 import (
 	"context"
-	"encoding/json"
 	"fmt"
-	"io"
 	"strings"

 	"github.com/goccy/go-yaml"
 	"github.com/googleapis/genai-toolbox/internal/embeddingmodels"
 	"github.com/googleapis/genai-toolbox/internal/sources"
-	healthcareds "github.com/googleapis/genai-toolbox/internal/sources/cloudhealthcare"
 	"github.com/googleapis/genai-toolbox/internal/tools"
 	"github.com/googleapis/genai-toolbox/internal/tools/cloudhealthcare/common"
 	"github.com/googleapis/genai-toolbox/internal/util/parameters"
 	"google.golang.org/api/googleapi"
-	"google.golang.org/api/healthcare/v1"
 )

 const kind string = "cloud-healthcare-fhir-patient-everything"
@@ -54,13 +50,9 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T
 }

 type compatibleSource interface {
-	Project() string
-	Region() string
-	DatasetID() string
 	AllowedFHIRStores() map[string]struct{}
-	Service() *healthcare.Service
-	ServiceCreator() healthcareds.HealthcareServiceCreator
 	UseClientAuthorization() bool
+	FHIRPatientEverything(string, string, string, []googleapi.CallOption) (any, error)
 }

 type Config struct {
@@ -139,20 +131,11 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 		return nil, fmt.Errorf("invalid or missing '%s' parameter; expected a string", patientIDKey)
 	}

-	svc := source.Service()
-	// Initialize new service if using user OAuth token
-	if source.UseClientAuthorization() {
-		tokenStr, err := accessToken.ParseBearerToken()
-		if err != nil {
-			return nil, fmt.Errorf("error parsing access token: %w", err)
-		}
-		svc, err = source.ServiceCreator()(tokenStr)
-		if err != nil {
-			return nil, fmt.Errorf("error creating service from OAuth access token: %w", err)
-		}
+	tokenStr, err := accessToken.ParseBearerToken()
+	if err != nil {
+		return nil, fmt.Errorf("error parsing access token: %w", err)
 	}

-	name := fmt.Sprintf("projects/%s/locations/%s/datasets/%s/fhirStores/%s/fhir/Patient/%s", source.Project(), source.Region(), source.DatasetID(), storeID, patientID)
 	var opts []googleapi.CallOption
 	if val, ok := params.AsMap()[typeFilterKey]; ok {
 		types, ok := val.([]any)
@@ -176,25 +159,7 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 			opts = append(opts, googleapi.QueryParameter("_since", sinceStr))
 		}
 	}
-
-	resp, err := svc.Projects.Locations.Datasets.FhirStores.Fhir.PatientEverything(name).Do(opts...)
-	if err != nil {
-		return nil, fmt.Errorf("failed to call patient everything for %q: %w", name, err)
-	}
-	defer resp.Body.Close()
-
-	respBytes, err := io.ReadAll(resp.Body)
-	if err != nil {
-		return nil, fmt.Errorf("could not read response: %w", err)
-	}
-	if resp.StatusCode > 299 {
-		return nil, fmt.Errorf("patient-everything: status %d %s: %s", resp.StatusCode, resp.Status, respBytes)
-	}
-	var jsonMap map[string]interface{}
-	if err := json.Unmarshal([]byte(string(respBytes)), &jsonMap); err != nil {
-		return nil, fmt.Errorf("could not unmarshal response as json: %w", err)
-	}
-	return jsonMap, nil
+	return source.FHIRPatientEverything(storeID, patientID, tokenStr, opts)
 }

 func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (parameters.ParamValues, error) {
@@ -16,20 +16,16 @@ package fhirpatientsearch

 import (
 	"context"
-	"encoding/json"
 	"fmt"
-	"io"
 	"strings"

 	"github.com/goccy/go-yaml"
 	"github.com/googleapis/genai-toolbox/internal/embeddingmodels"
 	"github.com/googleapis/genai-toolbox/internal/sources"
-	healthcareds "github.com/googleapis/genai-toolbox/internal/sources/cloudhealthcare"
 	"github.com/googleapis/genai-toolbox/internal/tools"
 	"github.com/googleapis/genai-toolbox/internal/tools/cloudhealthcare/common"
 	"github.com/googleapis/genai-toolbox/internal/util/parameters"
 	"google.golang.org/api/googleapi"
-	"google.golang.org/api/healthcare/v1"
 )

 const kind string = "cloud-healthcare-fhir-patient-search"
@@ -70,13 +66,9 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T
 }

 type compatibleSource interface {
-	Project() string
-	Region() string
-	DatasetID() string
 	AllowedFHIRStores() map[string]struct{}
-	Service() *healthcare.Service
-	ServiceCreator() healthcareds.HealthcareServiceCreator
 	UseClientAuthorization() bool
+	FHIRPatientSearch(string, string, []googleapi.CallOption) (any, error)
 }

 type Config struct {
@@ -169,17 +161,9 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 		return nil, err
 	}

-	svc := source.Service()
-	// Initialize new service if using user OAuth token
-	if source.UseClientAuthorization() {
-		tokenStr, err := accessToken.ParseBearerToken()
-		if err != nil {
-			return nil, fmt.Errorf("error parsing access token: %w", err)
-		}
-		svc, err = source.ServiceCreator()(tokenStr)
-		if err != nil {
-			return nil, fmt.Errorf("error creating service from OAuth access token: %w", err)
-		}
+	tokenStr, err := accessToken.ParseBearerToken()
+	if err != nil {
+		return nil, fmt.Errorf("error parsing access token: %w", err)
 	}

 	var summary bool
@@ -248,26 +232,7 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 	if summary {
 		opts = append(opts, googleapi.QueryParameter("_summary", "text"))
 	}
-
-	name := fmt.Sprintf("projects/%s/locations/%s/datasets/%s/fhirStores/%s", source.Project(), source.Region(), source.DatasetID(), storeID)
-	resp, err := svc.Projects.Locations.Datasets.FhirStores.Fhir.SearchType(name, "Patient", &healthcare.SearchResourcesRequest{ResourceType: "Patient"}).Do(opts...)
-	if err != nil {
-		return nil, fmt.Errorf("failed to search patient resources: %w", err)
-	}
-	defer resp.Body.Close()
-
-	respBytes, err := io.ReadAll(resp.Body)
-	if err != nil {
-		return nil, fmt.Errorf("could not read response: %w", err)
-	}
-	if resp.StatusCode > 299 {
-		return nil, fmt.Errorf("search: status %d %s: %s", resp.StatusCode, resp.Status, respBytes)
-	}
-	var jsonMap map[string]interface{}
-	if err := json.Unmarshal([]byte(string(respBytes)), &jsonMap); err != nil {
-		return nil, fmt.Errorf("could not unmarshal response as json: %w", err)
-	}
-	return jsonMap, nil
+	return source.FHIRPatientSearch(storeID, tokenStr, opts)
 }

 func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (parameters.ParamValues, error) {
@@ -21,7 +21,6 @@ import (
 	"github.com/goccy/go-yaml"
 	"github.com/googleapis/genai-toolbox/internal/embeddingmodels"
 	"github.com/googleapis/genai-toolbox/internal/sources"
-	healthcareds "github.com/googleapis/genai-toolbox/internal/sources/cloudhealthcare"
 	"github.com/googleapis/genai-toolbox/internal/tools"
 	"github.com/googleapis/genai-toolbox/internal/util/parameters"
 	"google.golang.org/api/healthcare/v1"
@@ -44,12 +43,8 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T
 }

 type compatibleSource interface {
-	Project() string
-	Region() string
-	DatasetID() string
-	Service() *healthcare.Service
-	ServiceCreator() healthcareds.HealthcareServiceCreator
 	UseClientAuthorization() bool
+	GetDataset(string) (*healthcare.Dataset, error)
 }

 type Config struct {
@@ -100,27 +95,11 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 	if err != nil {
 		return nil, err
 	}
-
-	svc := source.Service()
-
-	// Initialize new service if using user OAuth token
-	if source.UseClientAuthorization() {
-		tokenStr, err := accessToken.ParseBearerToken()
-		if err != nil {
-			return nil, fmt.Errorf("error parsing access token: %w", err)
-		}
-		svc, err = source.ServiceCreator()(tokenStr)
-		if err != nil {
-			return nil, fmt.Errorf("error creating service from OAuth access token: %w", err)
-		}
-	}
-
-	datasetName := fmt.Sprintf("projects/%s/locations/%s/datasets/%s", source.Project(), source.Region(), source.DatasetID())
-	dataset, err := svc.Projects.Locations.Datasets.Get(datasetName).Do()
+	tokenStr, err := accessToken.ParseBearerToken()
 	if err != nil {
-		return nil, fmt.Errorf("failed to get dataset %q: %w", datasetName, err)
+		return nil, fmt.Errorf("error parsing access token: %w", err)
 	}
-	return dataset, nil
+	return source.GetDataset(tokenStr)
 }

 func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (parameters.ParamValues, error) {
@@ -21,7 +21,6 @@ import (
 	"github.com/goccy/go-yaml"
 	"github.com/googleapis/genai-toolbox/internal/embeddingmodels"
 	"github.com/googleapis/genai-toolbox/internal/sources"
-	healthcareds "github.com/googleapis/genai-toolbox/internal/sources/cloudhealthcare"
 	"github.com/googleapis/genai-toolbox/internal/tools"
 	"github.com/googleapis/genai-toolbox/internal/tools/cloudhealthcare/common"
 	"github.com/googleapis/genai-toolbox/internal/util/parameters"
@@ -45,13 +44,9 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T
 }

 type compatibleSource interface {
-	Project() string
-	Region() string
-	DatasetID() string
 	AllowedDICOMStores() map[string]struct{}
-	Service() *healthcare.Service
-	ServiceCreator() healthcareds.HealthcareServiceCreator
 	UseClientAuthorization() bool
+	GetDICOMStore(string, string) (*healthcare.DicomStore, error)
 }

 type Config struct {
@@ -117,31 +112,15 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 	if err != nil {
 		return nil, err
 	}

 	storeID, err := common.ValidateAndFetchStoreID(params, source.AllowedDICOMStores())
 	if err != nil {
 		return nil, err
 	}
-
-	svc := source.Service()
-	// Initialize new service if using user OAuth token
-	if source.UseClientAuthorization() {
-		tokenStr, err := accessToken.ParseBearerToken()
-		if err != nil {
-			return nil, fmt.Errorf("error parsing access token: %w", err)
-		}
-		svc, err = source.ServiceCreator()(tokenStr)
-		if err != nil {
-			return nil, fmt.Errorf("error creating service from OAuth access token: %w", err)
-		}
-	}
-
-	storeName := fmt.Sprintf("projects/%s/locations/%s/datasets/%s/dicomStores/%s", source.Project(), source.Region(), source.DatasetID(), storeID)
-	store, err := svc.Projects.Locations.Datasets.DicomStores.Get(storeName).Do()
+	tokenStr, err := accessToken.ParseBearerToken()
 	if err != nil {
-		return nil, fmt.Errorf("failed to get DICOM store %q: %w", storeName, err)
+		return nil, fmt.Errorf("error parsing access token: %w", err)
 	}
-	return store, nil
+	return source.GetDICOMStore(storeID, tokenStr)
 }

 func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (parameters.ParamValues, error) {
@@ -21,7 +21,6 @@ import (
 	"github.com/goccy/go-yaml"
 	"github.com/googleapis/genai-toolbox/internal/embeddingmodels"
 	"github.com/googleapis/genai-toolbox/internal/sources"
-	healthcareds "github.com/googleapis/genai-toolbox/internal/sources/cloudhealthcare"
 	"github.com/googleapis/genai-toolbox/internal/tools"
 	"github.com/googleapis/genai-toolbox/internal/tools/cloudhealthcare/common"
 	"github.com/googleapis/genai-toolbox/internal/util/parameters"
@@ -45,13 +44,9 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T
 }

 type compatibleSource interface {
-	Project() string
-	Region() string
-	DatasetID() string
 	AllowedDICOMStores() map[string]struct{}
-	Service() *healthcare.Service
-	ServiceCreator() healthcareds.HealthcareServiceCreator
 	UseClientAuthorization() bool
+	GetDICOMStoreMetrics(string, string) (*healthcare.DicomStoreMetrics, error)
 }

 type Config struct {
@@ -117,31 +112,15 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 	if err != nil {
 		return nil, err
 	}

 	storeID, err := common.ValidateAndFetchStoreID(params, source.AllowedDICOMStores())
 	if err != nil {
 		return nil, err
 	}
-
-	svc := source.Service()
-	// Initialize new service if using user OAuth token
-	if source.UseClientAuthorization() {
-		tokenStr, err := accessToken.ParseBearerToken()
-		if err != nil {
-			return nil, fmt.Errorf("error parsing access token: %w", err)
-		}
-		svc, err = source.ServiceCreator()(tokenStr)
-		if err != nil {
-			return nil, fmt.Errorf("error creating service from OAuth access token: %w", err)
-		}
-	}
-
-	storeName := fmt.Sprintf("projects/%s/locations/%s/datasets/%s/dicomStores/%s", source.Project(), source.Region(), source.DatasetID(), storeID)
-	store, err := svc.Projects.Locations.Datasets.DicomStores.GetDICOMStoreMetrics(storeName).Do()
+	tokenStr, err := accessToken.ParseBearerToken()
 	if err != nil {
-		return nil, fmt.Errorf("failed to get metrics for DICOM store %q: %w", storeName, err)
+		return nil, fmt.Errorf("error parsing access token: %w", err)
 	}
-	return store, nil
+	return source.GetDICOMStoreMetrics(storeID, tokenStr)
 }

 func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (parameters.ParamValues, error) {
@@ -16,18 +16,14 @@ package getfhirresource
|
|||||||
|
|
||||||
import (
|
import (
|
||||||
"context"
|
"context"
|
||||||
"encoding/json"
|
|
||||||
"fmt"
|
"fmt"
|
||||||
"io"
|
|
||||||
|
|
||||||
"github.com/goccy/go-yaml"
|
"github.com/goccy/go-yaml"
|
||||||
"github.com/googleapis/genai-toolbox/internal/embeddingmodels"
|
"github.com/googleapis/genai-toolbox/internal/embeddingmodels"
|
||||||
"github.com/googleapis/genai-toolbox/internal/sources"
|
"github.com/googleapis/genai-toolbox/internal/sources"
|
||||||
healthcareds "github.com/googleapis/genai-toolbox/internal/sources/cloudhealthcare"
|
|
||||||
"github.com/googleapis/genai-toolbox/internal/tools"
|
"github.com/googleapis/genai-toolbox/internal/tools"
|
||||||
"github.com/googleapis/genai-toolbox/internal/tools/cloudhealthcare/common"
|
"github.com/googleapis/genai-toolbox/internal/tools/cloudhealthcare/common"
|
||||||
"github.com/googleapis/genai-toolbox/internal/util/parameters"
|
"github.com/googleapis/genai-toolbox/internal/util/parameters"
|
||||||
"google.golang.org/api/healthcare/v1"
|
|
||||||
)
|
)
|
||||||
|
|
||||||
const kind string = "cloud-healthcare-get-fhir-resource"
|
const kind string = "cloud-healthcare-get-fhir-resource"
|
||||||
@@ -51,13 +47,9 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T
|
|||||||
}
|
}
|
||||||
|
|
||||||
type compatibleSource interface {
|
type compatibleSource interface {
|
||||||
Project() string
|
|
||||||
Region() string
|
|
||||||
DatasetID() string
|
|
||||||
AllowedFHIRStores() map[string]struct{}
|
AllowedFHIRStores() map[string]struct{}
|
||||||
Service() *healthcare.Service
|
|
||||||
ServiceCreator() healthcareds.HealthcareServiceCreator
|
|
||||||
UseClientAuthorization() bool
|
UseClientAuthorization() bool
|
||||||
|
GetFHIRResource(string, string, string, string) (any, error)
|
||||||
}
|
}
|
||||||
|
|
||||||
type Config struct {
|
type Config struct {
|
||||||
@@ -134,46 +126,15 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
|
|||||||
if !ok {
|
if !ok {
|
||||||
return nil, fmt.Errorf("invalid or missing '%s' parameter; expected a string", typeKey)
|
return nil, fmt.Errorf("invalid or missing '%s' parameter; expected a string", typeKey)
|
||||||
}
|
}
|
||||||
|
|
||||||
resID, ok := params.AsMap()[idKey].(string)
|
resID, ok := params.AsMap()[idKey].(string)
|
||||||
if !ok {
|
if !ok {
|
||||||
return nil, fmt.Errorf("invalid or missing '%s' parameter; expected a string", idKey)
|
return nil, fmt.Errorf("invalid or missing '%s' parameter; expected a string", idKey)
|
||||||
}
|
}
|
||||||
|
tokenStr, err := accessToken.ParseBearerToken()
|
||||||
svc := source.Service()
|
|
||||||
// Initialize new service if using user OAuth token
|
|
||||||
if source.UseClientAuthorization() {
|
|
||||||
tokenStr, err := accessToken.ParseBearerToken()
|
|
||||||
if err != nil {
|
|
||||||
return nil, fmt.Errorf("error parsing access token: %w", err)
|
|
||||||
}
|
|
||||||
svc, err = source.ServiceCreator()(tokenStr)
|
|
||||||
if err != nil {
|
|
||||||
return nil, fmt.Errorf("error creating service from OAuth access token: %w", err)
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
name := fmt.Sprintf("projects/%s/locations/%s/datasets/%s/fhirStores/%s/fhir/%s/%s", source.Project(), source.Region(), source.DatasetID(), storeID, resType, resID)
|
|
||||||
call := svc.Projects.Locations.Datasets.FhirStores.Fhir.Read(name)
|
|
||||||
call.Header().Set("Content-Type", "application/fhir+json;charset=utf-8")
|
|
||||||
resp, err := call.Do()
|
|
||||||
if err != nil {
|
if err != nil {
|
||||||
return nil, fmt.Errorf("failed to get fhir resource %q: %w", name, err)
|
return nil, fmt.Errorf("error parsing access token: %w", err)
|
||||||
}
|
}
|
||||||
defer resp.Body.Close()
|
return source.GetFHIRResource(storeID, resType, resID, tokenStr)
|
||||||
|
|
||||||
respBytes, err := io.ReadAll(resp.Body)
|
|
||||||
if err != nil {
|
|
||||||
return nil, fmt.Errorf("could not read response: %w", err)
|
|
||||||
}
|
|
||||||
if resp.StatusCode > 299 {
|
|
||||||
return nil, fmt.Errorf("read: status %d %s: %s", resp.StatusCode, resp.Status, respBytes)
|
|
||||||
}
|
|
||||||
var jsonMap map[string]interface{}
|
|
||||||
if err := json.Unmarshal([]byte(string(respBytes)), &jsonMap); err != nil {
|
|
||||||
return nil, fmt.Errorf("could not unmarshal response as json: %w", err)
|
|
||||||
}
|
|
||||||
return jsonMap, nil
|
|
||||||
}
|
}
|
||||||
|
|
||||||
func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (parameters.ParamValues, error) {
|
func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (parameters.ParamValues, error) {
|
||||||
|
@@ -21,7 +21,6 @@ import (
 	"github.com/goccy/go-yaml"
 	"github.com/googleapis/genai-toolbox/internal/embeddingmodels"
 	"github.com/googleapis/genai-toolbox/internal/sources"
-	healthcareds "github.com/googleapis/genai-toolbox/internal/sources/cloudhealthcare"
 	"github.com/googleapis/genai-toolbox/internal/tools"
 	"github.com/googleapis/genai-toolbox/internal/tools/cloudhealthcare/common"
 	"github.com/googleapis/genai-toolbox/internal/util/parameters"
@@ -45,13 +44,9 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T
 }

 type compatibleSource interface {
-	Project() string
-	Region() string
-	DatasetID() string
 	AllowedFHIRStores() map[string]struct{}
-	Service() *healthcare.Service
-	ServiceCreator() healthcareds.HealthcareServiceCreator
 	UseClientAuthorization() bool
+	GetFHIRStore(string, string) (*healthcare.FhirStore, error)
 }

 type Config struct {
@@ -117,31 +112,15 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 	if err != nil {
 		return nil, err
 	}

 	storeID, err := common.ValidateAndFetchStoreID(params, source.AllowedFHIRStores())
 	if err != nil {
 		return nil, err
 	}
-	svc := source.Service()
-	// Initialize new service if using user OAuth token
-	if source.UseClientAuthorization() {
-		tokenStr, err := accessToken.ParseBearerToken()
-		if err != nil {
-			return nil, fmt.Errorf("error parsing access token: %w", err)
-		}
-		svc, err = source.ServiceCreator()(tokenStr)
-		if err != nil {
-			return nil, fmt.Errorf("error creating service from OAuth access token: %w", err)
-		}
-	}
-
-	storeName := fmt.Sprintf("projects/%s/locations/%s/datasets/%s/fhirStores/%s", source.Project(), source.Region(), source.DatasetID(), storeID)
-	store, err := svc.Projects.Locations.Datasets.FhirStores.Get(storeName).Do()
+	tokenStr, err := accessToken.ParseBearerToken()
 	if err != nil {
-		return nil, fmt.Errorf("failed to get FHIR store %q: %w", storeName, err)
+		return nil, fmt.Errorf("error parsing access token: %w", err)
 	}
-	return store, nil
+	return source.GetFHIRStore(storeID, tokenStr)
 }

 func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (parameters.ParamValues, error) {
@@ -21,7 +21,6 @@ import (
 	"github.com/goccy/go-yaml"
 	"github.com/googleapis/genai-toolbox/internal/embeddingmodels"
 	"github.com/googleapis/genai-toolbox/internal/sources"
-	healthcareds "github.com/googleapis/genai-toolbox/internal/sources/cloudhealthcare"
 	"github.com/googleapis/genai-toolbox/internal/tools"
 	"github.com/googleapis/genai-toolbox/internal/tools/cloudhealthcare/common"
 	"github.com/googleapis/genai-toolbox/internal/util/parameters"
@@ -45,13 +44,9 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T
 }

 type compatibleSource interface {
-	Project() string
-	Region() string
-	DatasetID() string
 	AllowedFHIRStores() map[string]struct{}
-	Service() *healthcare.Service
-	ServiceCreator() healthcareds.HealthcareServiceCreator
 	UseClientAuthorization() bool
+	GetFHIRStoreMetrics(string, string) (*healthcare.FhirStoreMetrics, error)
 }

 type Config struct {
@@ -117,31 +112,15 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 	if err != nil {
 		return nil, err
 	}

 	storeID, err := common.ValidateAndFetchStoreID(params, source.AllowedFHIRStores())
 	if err != nil {
 		return nil, err
 	}
-	svc := source.Service()
-	// Initialize new service if using user OAuth token
-	if source.UseClientAuthorization() {
-		tokenStr, err := accessToken.ParseBearerToken()
-		if err != nil {
-			return nil, fmt.Errorf("error parsing access token: %w", err)
-		}
-		svc, err = source.ServiceCreator()(tokenStr)
-		if err != nil {
-			return nil, fmt.Errorf("error creating service from OAuth access token: %w", err)
-		}
-	}
-
-	storeName := fmt.Sprintf("projects/%s/locations/%s/datasets/%s/fhirStores/%s", source.Project(), source.Region(), source.DatasetID(), storeID)
-	store, err := svc.Projects.Locations.Datasets.FhirStores.GetFHIRStoreMetrics(storeName).Do()
+	tokenStr, err := accessToken.ParseBearerToken()
 	if err != nil {
-		return nil, fmt.Errorf("failed to get metrics for FHIR store %q: %w", storeName, err)
+		return nil, fmt.Errorf("error parsing access token: %w", err)
 	}
-	return store, nil
+	return source.GetFHIRStoreMetrics(storeID, tokenStr)
 }

 func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (parameters.ParamValues, error) {
@@ -17,12 +17,10 @@ package listdicomstores
 import (
 	"context"
 	"fmt"
-	"strings"

 	"github.com/goccy/go-yaml"
 	"github.com/googleapis/genai-toolbox/internal/embeddingmodels"
 	"github.com/googleapis/genai-toolbox/internal/sources"
-	healthcareds "github.com/googleapis/genai-toolbox/internal/sources/cloudhealthcare"
 	"github.com/googleapis/genai-toolbox/internal/tools"
 	"github.com/googleapis/genai-toolbox/internal/util/parameters"
 	"google.golang.org/api/healthcare/v1"
@@ -45,13 +43,8 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T
 }

 type compatibleSource interface {
-	Project() string
-	Region() string
-	DatasetID() string
-	AllowedDICOMStores() map[string]struct{}
-	Service() *healthcare.Service
-	ServiceCreator() healthcareds.HealthcareServiceCreator
 	UseClientAuthorization() bool
+	ListDICOMStores(tokenStr string) ([]*healthcare.DicomStore, error)
 }

 type Config struct {
@@ -102,41 +95,11 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 	if err != nil {
 		return nil, err
 	}
-	svc := source.Service()
-
-	// Initialize new service if using user OAuth token
-	if source.UseClientAuthorization() {
-		tokenStr, err := accessToken.ParseBearerToken()
-		if err != nil {
-			return nil, fmt.Errorf("error parsing access token: %w", err)
-		}
-		svc, err = source.ServiceCreator()(tokenStr)
-		if err != nil {
-			return nil, fmt.Errorf("error creating service from OAuth access token: %w", err)
-		}
-	}
-
-	datasetName := fmt.Sprintf("projects/%s/locations/%s/datasets/%s", source.Project(), source.Region(), source.DatasetID())
-	stores, err := svc.Projects.Locations.Datasets.DicomStores.List(datasetName).Do()
+	tokenStr, err := accessToken.ParseBearerToken()
 	if err != nil {
-		return nil, fmt.Errorf("failed to get dataset %q: %w", datasetName, err)
+		return nil, fmt.Errorf("error parsing access token: %w", err)
 	}
-	var filtered []*healthcare.DicomStore
-	for _, store := range stores.DicomStores {
-		if len(source.AllowedDICOMStores()) == 0 {
-			filtered = append(filtered, store)
-			continue
-		}
-		if len(store.Name) == 0 {
-			continue
-		}
-		parts := strings.Split(store.Name, "/")
-		if _, ok := source.AllowedDICOMStores()[parts[len(parts)-1]]; ok {
-			filtered = append(filtered, store)
-		}
-	}
-	return filtered, nil
+	return source.ListDICOMStores(tokenStr)
 }

 func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (parameters.ParamValues, error) {
@@ -17,12 +17,10 @@ package listfhirstores
 import (
 	"context"
 	"fmt"
-	"strings"

 	"github.com/goccy/go-yaml"
 	"github.com/googleapis/genai-toolbox/internal/embeddingmodels"
 	"github.com/googleapis/genai-toolbox/internal/sources"
-	healthcareds "github.com/googleapis/genai-toolbox/internal/sources/cloudhealthcare"
 	"github.com/googleapis/genai-toolbox/internal/tools"
 	"github.com/googleapis/genai-toolbox/internal/util/parameters"
 	"google.golang.org/api/healthcare/v1"
@@ -45,13 +43,8 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T
 }

 type compatibleSource interface {
-	Project() string
-	Region() string
-	DatasetID() string
-	AllowedFHIRStores() map[string]struct{}
-	Service() *healthcare.Service
-	ServiceCreator() healthcareds.HealthcareServiceCreator
 	UseClientAuthorization() bool
+	ListFHIRStores(string) ([]*healthcare.FhirStore, error)
 }

 type Config struct {
@@ -102,41 +95,11 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 	if err != nil {
 		return nil, err
 	}
-	svc := source.Service()
-
-	// Initialize new service if using user OAuth token
-	if source.UseClientAuthorization() {
-		tokenStr, err := accessToken.ParseBearerToken()
-		if err != nil {
-			return nil, fmt.Errorf("error parsing access token: %w", err)
-		}
-		svc, err = source.ServiceCreator()(tokenStr)
-		if err != nil {
-			return nil, fmt.Errorf("error creating service from OAuth access token: %w", err)
-		}
-	}
-
-	datasetName := fmt.Sprintf("projects/%s/locations/%s/datasets/%s", source.Project(), source.Region(), source.DatasetID())
-	stores, err := svc.Projects.Locations.Datasets.FhirStores.List(datasetName).Do()
+	tokenStr, err := accessToken.ParseBearerToken()
 	if err != nil {
-		return nil, fmt.Errorf("failed to get dataset %q: %w", datasetName, err)
+		return nil, fmt.Errorf("error parsing access token: %w", err)
 	}
-	var filtered []*healthcare.FhirStore
-	for _, store := range stores.FhirStores {
-		if len(source.AllowedFHIRStores()) == 0 {
-			filtered = append(filtered, store)
-			continue
-		}
-		if len(store.Name) == 0 {
-			continue
-		}
-		parts := strings.Split(store.Name, "/")
-		if _, ok := source.AllowedFHIRStores()[parts[len(parts)-1]]; ok {
-			filtered = append(filtered, store)
-		}
-	}
-	return filtered, nil
+	return source.ListFHIRStores(tokenStr)
 }

 func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (parameters.ParamValues, error) {
@@ -16,18 +16,14 @@ package retrieverendereddicominstance

 import (
 	"context"
-	"encoding/base64"
 	"fmt"
-	"io"

 	"github.com/goccy/go-yaml"
 	"github.com/googleapis/genai-toolbox/internal/embeddingmodels"
 	"github.com/googleapis/genai-toolbox/internal/sources"
-	healthcareds "github.com/googleapis/genai-toolbox/internal/sources/cloudhealthcare"
 	"github.com/googleapis/genai-toolbox/internal/tools"
 	"github.com/googleapis/genai-toolbox/internal/tools/cloudhealthcare/common"
 	"github.com/googleapis/genai-toolbox/internal/util/parameters"
-	"google.golang.org/api/healthcare/v1"
 )

 const kind string = "cloud-healthcare-retrieve-rendered-dicom-instance"
@@ -53,13 +49,9 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T
 }

 type compatibleSource interface {
-	Project() string
-	Region() string
-	DatasetID() string
 	AllowedDICOMStores() map[string]struct{}
-	Service() *healthcare.Service
-	ServiceCreator() healthcareds.HealthcareServiceCreator
 	UseClientAuthorization() bool
+	RetrieveRenderedDICOMInstance(string, string, string, string, int, string) (any, error)
 }

 type Config struct {
@@ -135,20 +127,10 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 	if err != nil {
 		return nil, err
 	}
-	svc := source.Service()
-	// Initialize new service if using user OAuth token
-	if source.UseClientAuthorization() {
-		tokenStr, err := accessToken.ParseBearerToken()
-		if err != nil {
-			return nil, fmt.Errorf("error parsing access token: %w", err)
-		}
-		svc, err = source.ServiceCreator()(tokenStr)
-		if err != nil {
-			return nil, fmt.Errorf("error creating service from OAuth access token: %w", err)
-		}
+	tokenStr, err := accessToken.ParseBearerToken()
+	if err != nil {
+		return nil, fmt.Errorf("error parsing access token: %w", err)
 	}

 	study, ok := params.AsMap()[studyInstanceUIDKey].(string)
 	if !ok {
 		return nil, fmt.Errorf("invalid '%s' parameter; expected a string", studyInstanceUIDKey)
@@ -165,25 +147,7 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 	if !ok {
 		return nil, fmt.Errorf("invalid '%s' parameter; expected an integer", frameNumberKey)
 	}
-	name := fmt.Sprintf("projects/%s/locations/%s/datasets/%s/dicomStores/%s", source.Project(), source.Region(), source.DatasetID(), storeID)
-	dicomWebPath := fmt.Sprintf("studies/%s/series/%s/instances/%s/frames/%d/rendered", study, series, sop, frame)
-	call := svc.Projects.Locations.Datasets.DicomStores.Studies.Series.Instances.Frames.RetrieveRendered(name, dicomWebPath)
-	call.Header().Set("Accept", "image/jpeg")
-	resp, err := call.Do()
-	if err != nil {
-		return nil, fmt.Errorf("unable to retrieve dicom instance rendered image: %w", err)
-	}
-	defer resp.Body.Close()
-
-	respBytes, err := io.ReadAll(resp.Body)
-	if err != nil {
-		return nil, fmt.Errorf("could not read response: %w", err)
-	}
-	if resp.StatusCode > 299 {
-		return nil, fmt.Errorf("RetrieveRendered: status %d %s: %s", resp.StatusCode, resp.Status, respBytes)
-	}
-	base64String := base64.StdEncoding.EncodeToString(respBytes)
-	return base64String, nil
+	return source.RetrieveRenderedDICOMInstance(storeID, study, series, sop, frame, tokenStr)
 }

 func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (parameters.ParamValues, error) {
@@ -16,20 +16,16 @@ package searchdicominstances

 import (
 	"context"
-	"encoding/json"
 	"fmt"
-	"io"
 	"strings"

 	"github.com/goccy/go-yaml"
 	"github.com/googleapis/genai-toolbox/internal/embeddingmodels"
 	"github.com/googleapis/genai-toolbox/internal/sources"
-	healthcareds "github.com/googleapis/genai-toolbox/internal/sources/cloudhealthcare"
 	"github.com/googleapis/genai-toolbox/internal/tools"
 	"github.com/googleapis/genai-toolbox/internal/tools/cloudhealthcare/common"
 	"github.com/googleapis/genai-toolbox/internal/util/parameters"
 	"google.golang.org/api/googleapi"
-	"google.golang.org/api/healthcare/v1"
 )

 const kind string = "cloud-healthcare-search-dicom-instances"
@@ -60,13 +56,9 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T
 }

 type compatibleSource interface {
-	Project() string
-	Region() string
-	DatasetID() string
 	AllowedDICOMStores() map[string]struct{}
-	Service() *healthcare.Service
-	ServiceCreator() healthcareds.HealthcareServiceCreator
 	UseClientAuthorization() bool
+	SearchDICOM(string, string, string, string, []googleapi.CallOption) (any, error)
 }

 type Config struct {
@@ -144,23 +136,13 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 	if err != nil {
 		return nil, err
 	}

 	storeID, err := common.ValidateAndFetchStoreID(params, source.AllowedDICOMStores())
 	if err != nil {
 		return nil, err
 	}
-	svc := source.Service()
-	// Initialize new service if using user OAuth token
-	if source.UseClientAuthorization() {
-		tokenStr, err := accessToken.ParseBearerToken()
-		if err != nil {
-			return nil, fmt.Errorf("error parsing access token: %w", err)
-		}
-		svc, err = source.ServiceCreator()(tokenStr)
-		if err != nil {
-			return nil, fmt.Errorf("error creating service from OAuth access token: %w", err)
-		}
+	tokenStr, err := accessToken.ParseBearerToken()
+	if err != nil {
+		return nil, fmt.Errorf("error parsing access token: %w", err)
 	}

 	opts, err := common.ParseDICOMSearchParameters(params, []string{sopInstanceUIDKey, patientNameKey, patientIDKey, accessionNumberKey, referringPhysicianNameKey, studyDateKey, modalityKey})
@@ -191,29 +173,7 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 			}
 		}
 	}
-	name := fmt.Sprintf("projects/%s/locations/%s/datasets/%s/dicomStores/%s", source.Project(), source.Region(), source.DatasetID(), storeID)
-	resp, err := svc.Projects.Locations.Datasets.DicomStores.SearchForInstances(name, dicomWebPath).Do(opts...)
-	if err != nil {
-		return nil, fmt.Errorf("failed to search dicom instances: %w", err)
-	}
-	defer resp.Body.Close()
-
-	respBytes, err := io.ReadAll(resp.Body)
-	if err != nil {
-		return nil, fmt.Errorf("could not read response: %w", err)
-	}
-	if resp.StatusCode > 299 {
-		return nil, fmt.Errorf("search: status %d %s: %s", resp.StatusCode, resp.Status, respBytes)
-	}
-	if len(respBytes) == 0 {
-		return []interface{}{}, nil
-	}
-	var result []interface{}
-	if err := json.Unmarshal([]byte(string(respBytes)), &result); err != nil {
-		return nil, fmt.Errorf("could not unmarshal response as list: %w", err)
-	}
-	return result, nil
+	return source.SearchDICOM(t.Kind, storeID, dicomWebPath, tokenStr, opts)
 }

 func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (parameters.ParamValues, error) {
@@ -16,18 +16,15 @@ package searchdicomseries

 import (
 	"context"
-	"encoding/json"
 	"fmt"
-	"io"

 	"github.com/goccy/go-yaml"
 	"github.com/googleapis/genai-toolbox/internal/embeddingmodels"
 	"github.com/googleapis/genai-toolbox/internal/sources"
-	healthcareds "github.com/googleapis/genai-toolbox/internal/sources/cloudhealthcare"
 	"github.com/googleapis/genai-toolbox/internal/tools"
 	"github.com/googleapis/genai-toolbox/internal/tools/cloudhealthcare/common"
 	"github.com/googleapis/genai-toolbox/internal/util/parameters"
-	"google.golang.org/api/healthcare/v1"
+	"google.golang.org/api/googleapi"
 )

 const kind string = "cloud-healthcare-search-dicom-series"
@@ -57,13 +54,9 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T
 }

 type compatibleSource interface {
-	Project() string
-	Region() string
-	DatasetID() string
 	AllowedDICOMStores() map[string]struct{}
-	Service() *healthcare.Service
-	ServiceCreator() healthcareds.HealthcareServiceCreator
 	UseClientAuthorization() bool
+	SearchDICOM(string, string, string, string, []googleapi.CallOption) (any, error)
 }

 type Config struct {
@@ -145,18 +138,9 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 	if err != nil {
 		return nil, err
 	}
-	svc := source.Service()
-	// Initialize new service if using user OAuth token
-	if source.UseClientAuthorization() {
-		tokenStr, err := accessToken.ParseBearerToken()
-		if err != nil {
-			return nil, fmt.Errorf("error parsing access token: %w", err)
-		}
-		svc, err = source.ServiceCreator()(tokenStr)
-		if err != nil {
-			return nil, fmt.Errorf("error creating service from OAuth access token: %w", err)
-		}
+	tokenStr, err := accessToken.ParseBearerToken()
+	if err != nil {
+		return nil, fmt.Errorf("error parsing access token: %w", err)
 	}

 	opts, err := common.ParseDICOMSearchParameters(params, []string{seriesInstanceUIDKey, patientNameKey, patientIDKey, accessionNumberKey, referringPhysicianNameKey, studyDateKey, modalityKey})
@@ -174,29 +158,7 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 			dicomWebPath = fmt.Sprintf("studies/%s/series", id)
 		}
 	}
-	name := fmt.Sprintf("projects/%s/locations/%s/datasets/%s/dicomStores/%s", source.Project(), source.Region(), source.DatasetID(), storeID)
-	resp, err := svc.Projects.Locations.Datasets.DicomStores.SearchForSeries(name, dicomWebPath).Do(opts...)
-	if err != nil {
-		return nil, fmt.Errorf("failed to search dicom series: %w", err)
-	}
-	defer resp.Body.Close()
-
-	respBytes, err := io.ReadAll(resp.Body)
-	if err != nil {
-		return nil, fmt.Errorf("could not read response: %w", err)
-	}
-	if resp.StatusCode > 299 {
-		return nil, fmt.Errorf("search: status %d %s: %s", resp.StatusCode, resp.Status, respBytes)
-	}
-	if len(respBytes) == 0 {
-		return []interface{}{}, nil
-	}
-	var result []interface{}
-	if err := json.Unmarshal([]byte(string(respBytes)), &result); err != nil {
-		return nil, fmt.Errorf("could not unmarshal response as list: %w", err)
-	}
-	return result, nil
+	return source.SearchDICOM(t.Kind, storeID, dicomWebPath, tokenStr, opts)
 }

 func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (parameters.ParamValues, error) {
@@ -16,18 +16,15 @@ package searchdicomstudies

 import (
 	"context"
-	"encoding/json"
 	"fmt"
-	"io"

 	"github.com/goccy/go-yaml"
 	"github.com/googleapis/genai-toolbox/internal/embeddingmodels"
 	"github.com/googleapis/genai-toolbox/internal/sources"
-	healthcareds "github.com/googleapis/genai-toolbox/internal/sources/cloudhealthcare"
 	"github.com/googleapis/genai-toolbox/internal/tools"
 	"github.com/googleapis/genai-toolbox/internal/tools/cloudhealthcare/common"
 	"github.com/googleapis/genai-toolbox/internal/util/parameters"
-	"google.golang.org/api/healthcare/v1"
+	"google.golang.org/api/googleapi"
 )

 const kind string = "cloud-healthcare-search-dicom-studies"
@@ -55,13 +52,9 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T
 }

 type compatibleSource interface {
-	Project() string
-	Region() string
-	DatasetID() string
 	AllowedDICOMStores() map[string]struct{}
-	Service() *healthcare.Service
-	ServiceCreator() healthcareds.HealthcareServiceCreator
 	UseClientAuthorization() bool
+	SearchDICOM(string, string, string, string, []googleapi.CallOption) (any, error)
 }

 type Config struct {
@@ -136,51 +129,20 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 	if err != nil {
 		return nil, err
 	}

 	storeID, err := common.ValidateAndFetchStoreID(params, source.AllowedDICOMStores())
 	if err != nil {
 		return nil, err
 	}
-	svc := source.Service()
-	// Initialize new service if using user OAuth token
-	if source.UseClientAuthorization() {
-		tokenStr, err := accessToken.ParseBearerToken()
-		if err != nil {
-			return nil, fmt.Errorf("error parsing access token: %w", err)
-		}
-		svc, err = source.ServiceCreator()(tokenStr)
-		if err != nil {
-			return nil, fmt.Errorf("error creating service from OAuth access token: %w", err)
-		}
+	tokenStr, err := accessToken.ParseBearerToken()
+	if err != nil {
+		return nil, fmt.Errorf("error parsing access token: %w", err)
 	}

 	opts, err := common.ParseDICOMSearchParameters(params, []string{studyInstanceUIDKey, patientNameKey, patientIDKey, accessionNumberKey, referringPhysicianNameKey, studyDateKey})
 	if err != nil {
 		return nil, err
 	}
-	name := fmt.Sprintf("projects/%s/locations/%s/datasets/%s/dicomStores/%s", source.Project(), source.Region(), source.DatasetID(), storeID)
-	resp, err := svc.Projects.Locations.Datasets.DicomStores.SearchForStudies(name, "studies").Do(opts...)
-	if err != nil {
-		return nil, fmt.Errorf("failed to search dicom studies: %w", err)
-	}
-	defer resp.Body.Close()
-
-	respBytes, err := io.ReadAll(resp.Body)
-	if err != nil {
-		return nil, fmt.Errorf("could not read response: %w", err)
-	}
-	if resp.StatusCode > 299 {
-		return nil, fmt.Errorf("search: status %d %s: %s", resp.StatusCode, resp.Status, respBytes)
-	}
-	if len(respBytes) == 0 {
-		return []interface{}{}, nil
-	}
-	var result []interface{}
-	if err := json.Unmarshal([]byte(string(respBytes)), &result); err != nil {
-		return nil, fmt.Errorf("could not unmarshal response as list: %w", err)
-	}
-	return result, nil
+	dicomWebPath := "studies"
+	return source.SearchDICOM(t.Kind, storeID, dicomWebPath, tokenStr, opts)
 }

 func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (parameters.ParamValues, error) {
@@ -18,7 +18,6 @@ import (
 	"context"
 	"fmt"

-	dataplexapi "cloud.google.com/go/dataplex/apiv1"
 	dataplexpb "cloud.google.com/go/dataplex/apiv1/dataplexpb"
 	"github.com/goccy/go-yaml"
 	"github.com/googleapis/genai-toolbox/internal/embeddingmodels"
@@ -44,7 +43,7 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T
 }

 type compatibleSource interface {
-	CatalogClient() *dataplexapi.CatalogClient
+	LookupEntry(context.Context, string, int, []string, string) (*dataplexpb.Entry, error)
 }

 type Config struct {
@@ -118,12 +117,6 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 	}

 	paramsMap := params.AsMap()
-	viewMap := map[int]dataplexpb.EntryView{
-		1: dataplexpb.EntryView_BASIC,
-		2: dataplexpb.EntryView_FULL,
-		3: dataplexpb.EntryView_CUSTOM,
-		4: dataplexpb.EntryView_ALL,
-	}
 	name, _ := paramsMap["name"].(string)
 	entry, _ := paramsMap["entry"].(string)
 	view, _ := paramsMap["view"].(int)
@@ -132,19 +125,7 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 		return nil, fmt.Errorf("can't convert aspectTypes to array of strings: %s", err)
 	}
 	aspectTypes := aspectTypeSlice.([]string)
-	req := &dataplexpb.LookupEntryRequest{
-		Name: name,
-		View: viewMap[view],
-		AspectTypes: aspectTypes,
-		Entry: entry,
-	}
-
-	result, err := source.CatalogClient().LookupEntry(ctx, req)
-	if err != nil {
-		return nil, err
-	}
-	return result, nil
+	return source.LookupEntry(ctx, name, view, aspectTypes, entry)
 }

 func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (parameters.ParamValues, error) {
@@ -18,9 +18,7 @@ import (
 	"context"
 	"fmt"

-	dataplexapi "cloud.google.com/go/dataplex/apiv1"
-	dataplexpb "cloud.google.com/go/dataplex/apiv1/dataplexpb"
-	"github.com/cenkalti/backoff/v5"
+	"cloud.google.com/go/dataplex/apiv1/dataplexpb"
 	"github.com/goccy/go-yaml"
 	"github.com/googleapis/genai-toolbox/internal/embeddingmodels"
 	"github.com/googleapis/genai-toolbox/internal/sources"
@@ -45,8 +43,7 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T
 }

 type compatibleSource interface {
-	CatalogClient() *dataplexapi.CatalogClient
-	ProjectID() string
+	SearchAspectTypes(context.Context, string, int, string) ([]*dataplexpb.AspectType, error)
 }

 type Config struct {
@@ -101,61 +98,11 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 	if err != nil {
 		return nil, err
 	}

-	// Invoke the tool with the provided parameters
 	paramsMap := params.AsMap()
 	query, _ := paramsMap["query"].(string)
-	pageSize := int32(paramsMap["pageSize"].(int))
+	pageSize, _ := paramsMap["pageSize"].(int)
 	orderBy, _ := paramsMap["orderBy"].(string)
-	// Create SearchEntriesRequest with the provided parameters
-	req := &dataplexpb.SearchEntriesRequest{
-		Query: query + " type=projects/dataplex-types/locations/global/entryTypes/aspecttype",
-		Name: fmt.Sprintf("projects/%s/locations/global", source.ProjectID()),
-		PageSize: pageSize,
-		OrderBy: orderBy,
-		SemanticSearch: true,
-	}
-
-	// Perform the search using the CatalogClient - this will return an iterator
-	it := source.CatalogClient().SearchEntries(ctx, req)
-	if it == nil {
-		return nil, fmt.Errorf("failed to create search entries iterator for project %q", source.ProjectID())
-	}
-
-	// Create an instance of exponential backoff with default values for retrying GetAspectType calls
-	// InitialInterval, RandomizationFactor, Multiplier, MaxInterval = 500 ms, 0.5, 1.5, 60 s
-	getAspectBackOff := backoff.NewExponentialBackOff()
-
-	// Iterate through the search results and call GetAspectType for each result using the resource name
-	var results []*dataplexpb.AspectType
-	for {
-		entry, err := it.Next()
-		if err != nil {
-			break
-		}
-		resourceName := entry.DataplexEntry.GetEntrySource().Resource
-		getAspectTypeReq := &dataplexpb.GetAspectTypeRequest{
-			Name: resourceName,
-		}
-
-		operation := func() (*dataplexpb.AspectType, error) {
-			aspectType, err := source.CatalogClient().GetAspectType(ctx, getAspectTypeReq)
-			if err != nil {
-				return nil, fmt.Errorf("failed to get aspect type for entry %q: %w", resourceName, err)
-			}
-			return aspectType, nil
-		}
-
-		// Retry the GetAspectType operation with exponential backoff
-		aspectType, err := backoff.Retry(ctx, operation, backoff.WithBackOff(getAspectBackOff))
-		if err != nil {
-			return nil, fmt.Errorf("failed to get aspect type after retries for entry %q: %w", resourceName, err)
-		}
-
-		results = append(results, aspectType)
-	}
-	return results, nil
+	return source.SearchAspectTypes(ctx, query, pageSize, orderBy)
 }

 func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (parameters.ParamValues, error) {
@@ -18,8 +18,7 @@ import (
 	"context"
 	"fmt"

-	dataplexapi "cloud.google.com/go/dataplex/apiv1"
-	dataplexpb "cloud.google.com/go/dataplex/apiv1/dataplexpb"
+	"cloud.google.com/go/dataplex/apiv1/dataplexpb"
 	"github.com/goccy/go-yaml"
 	"github.com/googleapis/genai-toolbox/internal/embeddingmodels"
 	"github.com/googleapis/genai-toolbox/internal/sources"
@@ -44,8 +43,7 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T
 }

 type compatibleSource interface {
-	CatalogClient() *dataplexapi.CatalogClient
-	ProjectID() string
+	SearchEntries(context.Context, string, int, string) ([]*dataplexpb.SearchEntriesResult, error)
 }

 type Config struct {
@@ -100,34 +98,11 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 	if err != nil {
 		return nil, err
 	}

 	paramsMap := params.AsMap()
 	query, _ := paramsMap["query"].(string)
-	pageSize := int32(paramsMap["pageSize"].(int))
+	pageSize, _ := paramsMap["pageSize"].(int)
 	orderBy, _ := paramsMap["orderBy"].(string)
-	req := &dataplexpb.SearchEntriesRequest{
-		Query: query,
-		Name: fmt.Sprintf("projects/%s/locations/global", source.ProjectID()),
-		PageSize: pageSize,
-		OrderBy: orderBy,
-		SemanticSearch: true,
-	}
-
-	it := source.CatalogClient().SearchEntries(ctx, req)
-	if it == nil {
-		return nil, fmt.Errorf("failed to create search entries iterator for project %q", source.ProjectID())
-	}
-
-	var results []*dataplexpb.SearchEntriesResult
-	for {
-		entry, err := it.Next()
-		if err != nil {
-			break
-		}
-		results = append(results, entry)
-	}
-	return results, nil
+	return source.SearchEntries(ctx, query, pageSize, orderBy)
 }

 func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (parameters.ParamValues, error) {
@@ -48,6 +48,7 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T

 type compatibleSource interface {
 	FirestoreClient() *firestoreapi.Client
+	AddDocuments(context.Context, string, any, bool) (map[string]any, error)
 }

 type Config struct {
@@ -134,24 +135,20 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 	}

 	mapParams := params.AsMap()

 	// Get collection path
 	collectionPath, ok := mapParams[collectionPathKey].(string)
 	if !ok || collectionPath == "" {
 		return nil, fmt.Errorf("invalid or missing '%s' parameter", collectionPathKey)
 	}

 	// Validate collection path
 	if err := util.ValidateCollectionPath(collectionPath); err != nil {
 		return nil, fmt.Errorf("invalid collection path: %w", err)
 	}

 	// Get document data
 	documentDataRaw, ok := mapParams[documentDataKey]
 	if !ok {
 		return nil, fmt.Errorf("invalid or missing '%s' parameter", documentDataKey)
 	}

 	// Convert the document data from JSON format to Firestore format
 	// The client is passed to handle referenceValue types
 	documentData, err := util.JSONToFirestoreValue(documentDataRaw, source.FirestoreClient())
@@ -164,30 +161,7 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 	if val, ok := mapParams[returnDocumentDataKey].(bool); ok {
 		returnData = val
 	}
-	// Get the collection reference
-	collection := source.FirestoreClient().Collection(collectionPath)
-
-	// Add the document to the collection
-	docRef, writeResult, err := collection.Add(ctx, documentData)
-	if err != nil {
-		return nil, fmt.Errorf("failed to add document: %w", err)
-	}
-
-	// Build the response
-	response := map[string]any{
-		"documentPath": docRef.Path,
-		"createTime": writeResult.UpdateTime.Format("2006-01-02T15:04:05.999999999Z"),
-	}
-
-	// Add document data if requested
-	if returnData {
-		// Convert the document data back to simple JSON format
-		simplifiedData := util.FirestoreValueToJSON(documentData)
-		response["documentData"] = simplifiedData
-	}
-
-	return response, nil
+	return source.AddDocuments(ctx, collectionPath, documentData, returnData)
 }

 func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (parameters.ParamValues, error) {
@@ -46,6 +46,7 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T

 type compatibleSource interface {
 	FirestoreClient() *firestoreapi.Client
+	DeleteDocuments(context.Context, []string) ([]any, error)
 }

 type Config struct {
@@ -104,7 +105,6 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 	if !ok {
 		return nil, fmt.Errorf("invalid or missing '%s' parameter; expected an array", documentPathsKey)
 	}
-
 	if len(documentPathsRaw) == 0 {
 		return nil, fmt.Errorf("'%s' parameter cannot be empty", documentPathsKey)
 	}
@@ -126,45 +126,7 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 			return nil, fmt.Errorf("invalid document path at index %d: %w", i, err)
 		}
 	}
-	// Create a BulkWriter to handle multiple deletions efficiently
-	bulkWriter := source.FirestoreClient().BulkWriter(ctx)
-
-	// Keep track of jobs for each document
-	jobs := make([]*firestoreapi.BulkWriterJob, len(documentPaths))
-
-	// Add all delete operations to the BulkWriter
-	for i, path := range documentPaths {
-		docRef := source.FirestoreClient().Doc(path)
-		job, err := bulkWriter.Delete(docRef)
-		if err != nil {
-			return nil, fmt.Errorf("failed to add delete operation for document %q: %w", path, err)
-		}
-		jobs[i] = job
-	}
-
-	// End the BulkWriter to execute all operations
-	bulkWriter.End()
-
-	// Collect results
-	results := make([]any, len(documentPaths))
-	for i, job := range jobs {
-		docData := make(map[string]any)
-		docData["path"] = documentPaths[i]
-
-		// Wait for the job to complete and get the result
-		_, err := job.Results()
-		if err != nil {
-			docData["success"] = false
-			docData["error"] = err.Error()
-		} else {
-			docData["success"] = true
-		}
-
-		results[i] = docData
-	}
-
-	return results, nil
+	return source.DeleteDocuments(ctx, documentPaths)
 }

 func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (parameters.ParamValues, error) {
@@ -46,6 +46,7 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T

 type compatibleSource interface {
 	FirestoreClient() *firestoreapi.Client
+	GetDocuments(context.Context, []string) ([]any, error)
 }

 type Config struct {
@@ -126,37 +127,7 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 			return nil, fmt.Errorf("invalid document path at index %d: %w", i, err)
 		}
 	}
-	// Create document references from paths
-	docRefs := make([]*firestoreapi.DocumentRef, len(documentPaths))
-	for i, path := range documentPaths {
-		docRefs[i] = source.FirestoreClient().Doc(path)
-	}
-
-	// Get all documents
-	snapshots, err := source.FirestoreClient().GetAll(ctx, docRefs)
-	if err != nil {
-		return nil, fmt.Errorf("failed to get documents: %w", err)
-	}
-
-	// Convert snapshots to response data
-	results := make([]any, len(snapshots))
-	for i, snapshot := range snapshots {
-		docData := make(map[string]any)
-		docData["path"] = documentPaths[i]
-		docData["exists"] = snapshot.Exists()
-
-		if snapshot.Exists() {
-			docData["data"] = snapshot.Data()
-			docData["createTime"] = snapshot.CreateTime
-			docData["updateTime"] = snapshot.UpdateTime
-			docData["readTime"] = snapshot.ReadTime
-		}
-
-		results[i] = docData
-	}
-
-	return results, nil
+	return source.GetDocuments(ctx, documentPaths)
 }

 func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (parameters.ParamValues, error) {
|||||||

@@ -44,8 +44,7 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T
 
 type compatibleSource interface {
 	FirebaseRulesClient() *firebaserules.Service
-	GetProjectId() string
-	GetDatabaseId() string
+	GetRules(context.Context) (any, error)
 }
 
 type Config struct {
@@ -98,29 +97,7 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 	if err != nil {
 		return nil, err
 	}
+	return source.GetRules(ctx)
-	// Get the latest release for Firestore
-	releaseName := fmt.Sprintf("projects/%s/releases/cloud.firestore/%s", source.GetProjectId(), source.GetDatabaseId())
-	release, err := source.FirebaseRulesClient().Projects.Releases.Get(releaseName).Context(ctx).Do()
-	if err != nil {
-		return nil, fmt.Errorf("failed to get latest Firestore release: %w", err)
-	}
-
-	if release.RulesetName == "" {
-		return nil, fmt.Errorf("no active Firestore rules were found in project '%s' and database '%s'", source.GetProjectId(), source.GetDatabaseId())
-	}
-
-	// Get the ruleset content
-	ruleset, err := source.FirebaseRulesClient().Projects.Rulesets.Get(release.RulesetName).Context(ctx).Do()
-	if err != nil {
-		return nil, fmt.Errorf("failed to get ruleset content: %w", err)
-	}
-
-	if ruleset.Source == nil || len(ruleset.Source.Files) == 0 {
-		return nil, fmt.Errorf("no rules files found in ruleset")
-	}
-
-	return ruleset, nil
 }
 
 func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (parameters.ParamValues, error) {

@@ -46,6 +46,7 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T
 
 type compatibleSource interface {
 	FirestoreClient() *firestoreapi.Client
+	ListCollections(context.Context, string) ([]any, error)
 }
 
 type Config struct {
@@ -102,47 +103,15 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 
 	mapParams := params.AsMap()
 
-	var collectionRefs []*firestoreapi.CollectionRef
 
 	// Check if parentPath is provided
-	parentPath, hasParent := mapParams[parentPathKey].(string)
+	parentPath, _ := mapParams[parentPathKey].(string)
-	if hasParent && parentPath != "" {
+	if parentPath != "" {
 		// Validate parent document path
 		if err := util.ValidateDocumentPath(parentPath); err != nil {
 			return nil, fmt.Errorf("invalid parent document path: %w", err)
 		}
 
-		// List subcollections of the specified document
-		docRef := source.FirestoreClient().Doc(parentPath)
-		collectionRefs, err = docRef.Collections(ctx).GetAll()
-		if err != nil {
-			return nil, fmt.Errorf("failed to list subcollections of document %q: %w", parentPath, err)
-		}
-	} else {
-		// List root collections
-		collectionRefs, err = source.FirestoreClient().Collections(ctx).GetAll()
-		if err != nil {
-			return nil, fmt.Errorf("failed to list root collections: %w", err)
-		}
 	}
+	return source.ListCollections(ctx, parentPath)
-	// Convert collection references to response data
-	results := make([]any, len(collectionRefs))
-	for i, collRef := range collectionRefs {
-		collData := make(map[string]any)
-		collData["id"] = collRef.ID
-		collData["path"] = collRef.Path
-
-		// If this is a subcollection, include parent information
-		if collRef.Parent != nil {
-			collData["parent"] = collRef.Parent.Path
-		}
-
-		results[i] = collData
-	}
-
-	return results, nil
 }
 
 func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (parameters.ParamValues, error) {

@@ -36,27 +36,6 @@ const (
 	defaultLimit = 100
 )
 
-// Firestore operators
-var validOperators = map[string]bool{
-	"<":                  true,
-	"<=":                 true,
-	">":                  true,
-	">=":                 true,
-	"==":                 true,
-	"!=":                 true,
-	"array-contains":     true,
-	"array-contains-any": true,
-	"in":                 true,
-	"not-in":             true,
-}
-
-// Error messages
-const (
-	errFilterParseFailed    = "failed to parse filters: %w"
-	errQueryExecutionFailed = "failed to execute query: %w"
-	errLimitParseFailed     = "failed to parse limit value '%s': %w"
-)
-
 func init() {
 	if !tools.Register(kind, newConfig) {
 		panic(fmt.Sprintf("tool kind %q already registered", kind))
@@ -74,6 +53,8 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T
 // compatibleSource defines the interface for sources that can provide a Firestore client
 type compatibleSource interface {
 	FirestoreClient() *firestoreapi.Client
+	BuildQuery(string, firestoreapi.EntityFilter, []string, string, firestoreapi.Direction, int, bool) (*firestoreapi.Query, error)
+	ExecuteQuery(context.Context, *firestoreapi.Query, bool) (any, error)
 }
 
 // Config represents the configuration for the Firestore query tool
@@ -139,15 +120,6 @@ func (t Tool) ToConfig() tools.ToolConfig {
 	return t.Config
 }
 
-// SimplifiedFilter represents the simplified filter format
-type SimplifiedFilter struct {
-	And   []SimplifiedFilter `json:"and,omitempty"`
-	Or    []SimplifiedFilter `json:"or,omitempty"`
-	Field string             `json:"field,omitempty"`
-	Op    string             `json:"op,omitempty"`
-	Value interface{}        `json:"value,omitempty"`
-}
-
 // OrderByConfig represents ordering configuration
 type OrderByConfig struct {
 	Field string `json:"field"`
@@ -162,20 +134,27 @@ func (o *OrderByConfig) GetDirection() firestoreapi.Direction {
 	return firestoreapi.Asc
 }
 
-// QueryResult represents a document result from the query
+// SimplifiedFilter represents the simplified filter format
-type QueryResult struct {
+type SimplifiedFilter struct {
-	ID         string         `json:"id"`
+	And   []SimplifiedFilter `json:"and,omitempty"`
-	Path       string         `json:"path"`
+	Or    []SimplifiedFilter `json:"or,omitempty"`
-	Data       map[string]any `json:"data"`
+	Field string             `json:"field,omitempty"`
-	CreateTime interface{}    `json:"createTime,omitempty"`
+	Op    string             `json:"op,omitempty"`
-	UpdateTime interface{}    `json:"updateTime,omitempty"`
+	Value interface{}        `json:"value,omitempty"`
-	ReadTime   interface{}    `json:"readTime,omitempty"`
 }
 
-// QueryResponse represents the full response including optional metrics
+// Firestore operators
-type QueryResponse struct {
+var validOperators = map[string]bool{
-	Documents      []QueryResult  `json:"documents"`
+	"<":                  true,
-	ExplainMetrics map[string]any `json:"explainMetrics,omitempty"`
+	"<=":                 true,
+	">":                  true,
+	">=":                 true,
+	"==":                 true,
+	"!=":                 true,
+	"array-contains":     true,
+	"array-contains-any": true,
+	"in":                 true,
+	"not-in":             true,
 }
 
 // Invoke executes the Firestore query based on the provided parameters
@@ -184,34 +163,18 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 	if err != nil {
 		return nil, err
 	}
 
 	paramsMap := params.AsMap()
 
 	// Process collection path with template substitution
 	collectionPath, err := parameters.PopulateTemplate("collectionPath", t.CollectionPath, paramsMap)
 	if err != nil {
 		return nil, fmt.Errorf("failed to process collection path: %w", err)
 	}
 
-	// Build the query
+	var filter firestoreapi.EntityFilter
-	query, err := t.buildQuery(source, collectionPath, paramsMap)
-	if err != nil {
-		return nil, err
-	}
-
-	// Execute the query and return results
-	return t.executeQuery(ctx, query)
-}
-
-// buildQuery constructs the Firestore query from parameters
-func (t Tool) buildQuery(source compatibleSource, collectionPath string, params map[string]any) (*firestoreapi.Query, error) {
-	collection := source.FirestoreClient().Collection(collectionPath)
-	query := collection.Query
-
 	// Process and apply filters if template is provided
 	if t.Filters != "" {
 		// Apply template substitution to filters
-		filtersJSON, err := parameters.PopulateTemplateWithJSON("filters", t.Filters, params)
+		filtersJSON, err := parameters.PopulateTemplateWithJSON("filters", t.Filters, paramsMap)
 		if err != nil {
 			return nil, fmt.Errorf("failed to process filters template: %w", err)
 		}
@@ -219,48 +182,43 @@ func (t Tool) buildQuery(source compatibleSource, collectionPath string, params
 		// Parse the simplified filter format
 		var simplifiedFilter SimplifiedFilter
 		if err := json.Unmarshal([]byte(filtersJSON), &simplifiedFilter); err != nil {
-			return nil, fmt.Errorf(errFilterParseFailed, err)
+			return nil, fmt.Errorf("failed to parse filters: %w", err)
 		}
 
 		// Convert simplified filter to Firestore filter
-		if filter := t.convertToFirestoreFilter(source, simplifiedFilter); filter != nil {
+		filter = t.convertToFirestoreFilter(source, simplifiedFilter)
-			query = query.WhereEntity(filter)
-		}
 	}
 
-	// Process select fields
-	selectFields, err := t.processSelectFields(params)
-	if err != nil {
-		return nil, err
-	}
-	if len(selectFields) > 0 {
-		query = query.Select(selectFields...)
-	}
-
 	// Process and apply ordering
-	orderBy, err := t.getOrderBy(params)
+	orderBy, err := t.getOrderBy(paramsMap)
 	if err != nil {
 		return nil, err
 	}
-	if orderBy != nil {
+	// Process select fields
-		query = query.OrderBy(orderBy.Field, orderBy.GetDirection())
+	selectFields, err := t.processSelectFields(paramsMap)
+	if err != nil {
+		return nil, err
 	}
 
 	// Process and apply limit
-	limit, err := t.getLimit(params)
+	limit, err := t.getLimit(paramsMap)
 	if err != nil {
 		return nil, err
 	}
-	query = query.Limit(limit)
 
-	// Apply analyze options if enabled
+	// prevent panic when accessing orderBy incase it is nil
-	if t.AnalyzeQuery {
+	var orderByField string
-		query = query.WithRunOptions(firestoreapi.ExplainOptions{
+	var orderByDirection firestoreapi.Direction
-			Analyze: true,
+	if orderBy != nil {
-		})
+		orderByField = orderBy.Field
+		orderByDirection = orderBy.GetDirection()
 	}
 
-	return &query, nil
+	// Build the query
+	query, err := source.BuildQuery(collectionPath, filter, selectFields, orderByField, orderByDirection, limit, t.AnalyzeQuery)
+	if err != nil {
+		return nil, err
+	}
+	// Execute the query and return results
+	return source.ExecuteQuery(ctx, query, t.AnalyzeQuery)
 }
 
 // convertToFirestoreFilter converts simplified filter format to Firestore EntityFilter
@@ -409,7 +367,7 @@ func (t Tool) getLimit(params map[string]any) (int, error) {
 	if processedValue != "" {
 		parsedLimit, err := strconv.Atoi(processedValue)
 		if err != nil {
-			return 0, fmt.Errorf(errLimitParseFailed, processedValue, err)
+			return 0, fmt.Errorf("failed to parse limit value '%s': %w", processedValue, err)
 		}
 		limit = parsedLimit
 	}
@@ -417,78 +375,6 @@ func (t Tool) getLimit(params map[string]any) (int, error) {
 	return limit, nil
 }
 
-// executeQuery runs the query and formats the results
-func (t Tool) executeQuery(ctx context.Context, query *firestoreapi.Query) (any, error) {
-	docIterator := query.Documents(ctx)
-	docs, err := docIterator.GetAll()
-	if err != nil {
-		return nil, fmt.Errorf(errQueryExecutionFailed, err)
-	}
-
-	// Convert results to structured format
-	results := make([]QueryResult, len(docs))
-	for i, doc := range docs {
-		results[i] = QueryResult{
-			ID:         doc.Ref.ID,
-			Path:       doc.Ref.Path,
-			Data:       doc.Data(),
-			CreateTime: doc.CreateTime,
-			UpdateTime: doc.UpdateTime,
-			ReadTime:   doc.ReadTime,
-		}
-	}
-
-	// Return with explain metrics if requested
-	if t.AnalyzeQuery {
-		explainMetrics, err := t.getExplainMetrics(docIterator)
-		if err == nil && explainMetrics != nil {
-			response := QueryResponse{
-				Documents:      results,
-				ExplainMetrics: explainMetrics,
-			}
-			return response, nil
-		}
-	}
-
-	return results, nil
-}
-
-// getExplainMetrics extracts explain metrics from the query iterator
-func (t Tool) getExplainMetrics(docIterator *firestoreapi.DocumentIterator) (map[string]any, error) {
-	explainMetrics, err := docIterator.ExplainMetrics()
-	if err != nil || explainMetrics == nil {
-		return nil, err
-	}
-
-	metricsData := make(map[string]any)
-
-	// Add plan summary if available
-	if explainMetrics.PlanSummary != nil {
-		planSummary := make(map[string]any)
-		planSummary["indexesUsed"] = explainMetrics.PlanSummary.IndexesUsed
-		metricsData["planSummary"] = planSummary
-	}
-
-	// Add execution stats if available
-	if explainMetrics.ExecutionStats != nil {
-		executionStats := make(map[string]any)
-		executionStats["resultsReturned"] = explainMetrics.ExecutionStats.ResultsReturned
-		executionStats["readOperations"] = explainMetrics.ExecutionStats.ReadOperations
-
-		if explainMetrics.ExecutionStats.ExecutionDuration != nil {
-			executionStats["executionDuration"] = explainMetrics.ExecutionStats.ExecutionDuration.String()
-		}
-
-		if explainMetrics.ExecutionStats.DebugStats != nil {
-			executionStats["debugStats"] = *explainMetrics.ExecutionStats.DebugStats
-		}
-
-		metricsData["executionStats"] = executionStats
-	}
-
-	return metricsData, nil
-}
-
 // ParseParams parses and validates input parameters
 func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (parameters.ParamValues, error) {
 	return parameters.ParseParams(t.Parameters, data, claims)

@@ -69,7 +69,6 @@ const (
 	errInvalidOperator      = "unsupported operator: %s. Valid operators are: %v"
 	errMissingFilterValue   = "no value specified for filter on field '%s'"
 	errOrderByParseFailed   = "failed to parse orderBy: %w"
-	errQueryExecutionFailed = "failed to execute query: %w"
 	errTooManyFilters       = "too many filters provided: %d (maximum: %d)"
 )
 
@@ -90,6 +89,8 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T
 // compatibleSource defines the interface for sources that can provide a Firestore client
 type compatibleSource interface {
 	FirestoreClient() *firestoreapi.Client
+	BuildQuery(string, firestoreapi.EntityFilter, []string, string, firestoreapi.Direction, int, bool) (*firestoreapi.Query, error)
+	ExecuteQuery(context.Context, *firestoreapi.Query, bool) (any, error)
 }
 
 // Config represents the configuration for the Firestore query collection tool
@@ -228,22 +229,6 @@ func (o *OrderByConfig) GetDirection() firestoreapi.Direction {
 	return firestoreapi.Asc
 }
 
-// QueryResult represents a document result from the query
-type QueryResult struct {
-	ID         string         `json:"id"`
-	Path       string         `json:"path"`
-	Data       map[string]any `json:"data"`
-	CreateTime interface{}    `json:"createTime,omitempty"`
-	UpdateTime interface{}    `json:"updateTime,omitempty"`
-	ReadTime   interface{}    `json:"readTime,omitempty"`
-}
-
-// QueryResponse represents the full response including optional metrics
-type QueryResponse struct {
-	Documents      []QueryResult  `json:"documents"`
-	ExplainMetrics map[string]any `json:"explainMetrics,omitempty"`
-}
-
 // Invoke executes the Firestore query based on the provided parameters
 func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, params parameters.ParamValues, accessToken tools.AccessToken) (any, error) {
 	source, err := tools.GetCompatibleSource[compatibleSource](resourceMgr, t.Source, t.Name, t.Kind)
@@ -257,14 +242,37 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 		return nil, err
 	}
 
+	var filter firestoreapi.EntityFilter
+	// Apply filters
+	if len(queryParams.Filters) > 0 {
+		filterConditions := make([]firestoreapi.EntityFilter, 0, len(queryParams.Filters))
+		for _, filter := range queryParams.Filters {
+			filterConditions = append(filterConditions, firestoreapi.PropertyFilter{
+				Path:     filter.Field,
+				Operator: filter.Op,
+				Value:    filter.Value,
+			})
+		}
+
+		filter = firestoreapi.AndFilter{
+			Filters: filterConditions,
+		}
+	}
+
+	// prevent panic incase queryParams.OrderBy is nil
+	var orderByField string
+	var orderByDirection firestoreapi.Direction
+	if queryParams.OrderBy != nil {
+		orderByField = queryParams.OrderBy.Field
+		orderByDirection = queryParams.OrderBy.GetDirection()
+	}
+
 	// Build the query
-	query, err := t.buildQuery(source, queryParams)
+	query, err := source.BuildQuery(queryParams.CollectionPath, filter, nil, orderByField, orderByDirection, queryParams.Limit, queryParams.AnalyzeQuery)
 	if err != nil {
 		return nil, err
 	}
+	return source.ExecuteQuery(ctx, query, queryParams.AnalyzeQuery)
-	// Execute the query and return results
-	return t.executeQuery(ctx, query, queryParams.AnalyzeQuery)
 }
 
 // queryParameters holds all parsed query parameters
@@ -380,122 +388,6 @@ func (t Tool) parseOrderBy(orderByRaw interface{}) (*OrderByConfig, error) {
 	return &orderBy, nil
 }
 
-// buildQuery constructs the Firestore query from parameters
-func (t Tool) buildQuery(source compatibleSource, params *queryParameters) (*firestoreapi.Query, error) {
-	collection := source.FirestoreClient().Collection(params.CollectionPath)
-	query := collection.Query
-
-	// Apply filters
-	if len(params.Filters) > 0 {
-		filterConditions := make([]firestoreapi.EntityFilter, 0, len(params.Filters))
-		for _, filter := range params.Filters {
-			filterConditions = append(filterConditions, firestoreapi.PropertyFilter{
-				Path:     filter.Field,
-				Operator: filter.Op,
-				Value:    filter.Value,
-			})
-		}
-
-		query = query.WhereEntity(firestoreapi.AndFilter{
-			Filters: filterConditions,
-		})
-	}
-
-	// Apply ordering
-	if params.OrderBy != nil {
-		query = query.OrderBy(params.OrderBy.Field, params.OrderBy.GetDirection())
-	}
-
-	// Apply limit
-	query = query.Limit(params.Limit)
-
-	// Apply analyze options
-	if params.AnalyzeQuery {
-		query = query.WithRunOptions(firestoreapi.ExplainOptions{
-			Analyze: true,
-		})
-	}
-
-	return &query, nil
-}
-
-// executeQuery runs the query and formats the results
-func (t Tool) executeQuery(ctx context.Context, query *firestoreapi.Query, analyzeQuery bool) (any, error) {
-	docIterator := query.Documents(ctx)
-	docs, err := docIterator.GetAll()
-	if err != nil {
-		return nil, fmt.Errorf(errQueryExecutionFailed, err)
-	}
-
-	// Convert results to structured format
-	results := make([]QueryResult, len(docs))
-	for i, doc := range docs {
-		results[i] = QueryResult{
-			ID:         doc.Ref.ID,
-			Path:       doc.Ref.Path,
-			Data:       doc.Data(),
-			CreateTime: doc.CreateTime,
-			UpdateTime: doc.UpdateTime,
-			ReadTime:   doc.ReadTime,
-		}
-	}
-
-	// Return with explain metrics if requested
-	if analyzeQuery {
-		explainMetrics, err := t.getExplainMetrics(docIterator)
-		if err == nil && explainMetrics != nil {
-			response := QueryResponse{
-				Documents:      results,
-				ExplainMetrics: explainMetrics,
-			}
-			return response, nil
-		}
-	}
-
-	// Return just the documents
-	resultsAny := make([]any, len(results))
-	for i, r := range results {
-		resultsAny[i] = r
-	}
-	return resultsAny, nil
-}
-
-// getExplainMetrics extracts explain metrics from the query iterator
-func (t Tool) getExplainMetrics(docIterator *firestoreapi.DocumentIterator) (map[string]any, error) {
-	explainMetrics, err := docIterator.ExplainMetrics()
-	if err != nil || explainMetrics == nil {
-		return nil, err
-	}
-
-	metricsData := make(map[string]any)
-
-	// Add plan summary if available
-	if explainMetrics.PlanSummary != nil {
-		planSummary := make(map[string]any)
-		planSummary["indexesUsed"] = explainMetrics.PlanSummary.IndexesUsed
-		metricsData["planSummary"] = planSummary
-	}
-
-	// Add execution stats if available
-	if explainMetrics.ExecutionStats != nil {
-		executionStats := make(map[string]any)
-		executionStats["resultsReturned"] = explainMetrics.ExecutionStats.ResultsReturned
-		executionStats["readOperations"] = explainMetrics.ExecutionStats.ReadOperations
-
-		if explainMetrics.ExecutionStats.ExecutionDuration != nil {
-			executionStats["executionDuration"] = explainMetrics.ExecutionStats.ExecutionDuration.String()
-		}
-
-		if explainMetrics.ExecutionStats.DebugStats != nil {
-			executionStats["debugStats"] = *explainMetrics.ExecutionStats.DebugStats
-		}
-
-		metricsData["executionStats"] = executionStats
-	}
-
-	return metricsData, nil
-}
-
 // ParseParams parses and validates input parameters
 func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (parameters.ParamValues, error) {
 	return parameters.ParseParams(t.Parameters, data, claims)

@@ -50,6 +50,7 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T
 
 type compatibleSource interface {
 	FirestoreClient() *firestoreapi.Client
+	UpdateDocument(context.Context, string, []firestoreapi.Update, any, bool) (map[string]any, error)
 }
 
 type Config struct {
@@ -177,23 +178,10 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 			}
 		}
 	}
+	// Use selective field update with update mask
-	// Get return document data flag
+	updates := make([]firestoreapi.Update, 0, len(updatePaths))
-	returnData := false
+	var documentData any
-	if val, ok := mapParams[returnDocumentDataKey].(bool); ok {
-		returnData = val
-	}
-
-	// Get the document reference
-	docRef := source.FirestoreClient().Doc(documentPath)
-
-	// Prepare update data
-	var writeResult *firestoreapi.WriteResult
-	var writeErr error
-
 	if len(updatePaths) > 0 {
-		// Use selective field update with update mask
-		updates := make([]firestoreapi.Update, 0, len(updatePaths))
-
 		// Convert document data without delete markers
 		dataMap, err := util.JSONToFirestoreValue(documentDataRaw, source.FirestoreClient())
@@ -220,41 +208,20 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 				Value: value,
 			})
 		}
-
-		writeResult, writeErr = docRef.Update(ctx, updates)
 	} else {
 		// Update all fields in the document data (merge)
-		documentData, err := util.JSONToFirestoreValue(documentDataRaw, source.FirestoreClient())
+		documentData, err = util.JSONToFirestoreValue(documentDataRaw, source.FirestoreClient())
 		if err != nil {
 			return nil, fmt.Errorf("failed to convert document data: %w", err)
 		}
-		writeResult, writeErr = docRef.Set(ctx, documentData, firestoreapi.MergeAll)
 	}
 
-	if writeErr != nil {
+	// Get return document data flag
-		return nil, fmt.Errorf("failed to update document: %w", writeErr)
+	returnData := false
+	if val, ok := mapParams[returnDocumentDataKey].(bool); ok {
+		returnData = val
 	}
+	return source.UpdateDocument(ctx, documentPath, updates, documentData, returnData)
-	// Build the response
-	response := map[string]any{
-		"documentPath": docRef.Path,
-		"updateTime":   writeResult.UpdateTime.Format("2006-01-02T15:04:05.999999999Z"),
-	}
-
-	// Add document data if requested
-	if returnData {
-		// Fetch the updated document to return the current state
-		snapshot, err := docRef.Get(ctx)
-		if err != nil {
-			return nil, fmt.Errorf("failed to retrieve updated document: %w", err)
-		}
-
-		// Convert the document data to simple JSON format
-		simplifiedData := util.FirestoreValueToJSON(snapshot.Data())
-		response["documentData"] = simplifiedData
-	}
-
-	return response, nil
 }
 
 // getFieldValue retrieves a value from a nested map using a dot-separated path

@@ -17,7 +17,6 @@ package firestorevalidaterules
 import (
 	"context"
 	"fmt"
-	"strings"
 
 	yaml "github.com/goccy/go-yaml"
 	"github.com/googleapis/genai-toolbox/internal/embeddingmodels"
@@ -50,7 +49,7 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T
 
 type compatibleSource interface {
 	FirebaseRulesClient() *firebaserules.Service
-	GetProjectId() string
+	ValidateRules(context.Context, string) (any, error)
 }
 
 type Config struct {
@@ -107,30 +106,6 @@ func (t Tool) ToConfig() tools.ToolConfig {
 	return t.Config
 }
 
-// Issue represents a validation issue in the rules
-type Issue struct {
-	SourcePosition SourcePosition `json:"sourcePosition"`
-	Description    string         `json:"description"`
-	Severity       string         `json:"severity"`
-}
-
-// SourcePosition represents the location of an issue in the source
-type SourcePosition struct {
-	FileName      string `json:"fileName,omitempty"`
-	Line          int64  `json:"line"`          // 1-based
-	Column        int64  `json:"column"`        // 1-based
-	CurrentOffset int64  `json:"currentOffset"` // 0-based, inclusive start
-	EndOffset     int64  `json:"endOffset"`     // 0-based, exclusive end
-}
-
-// ValidationResult represents the result of rules validation
-type ValidationResult struct {
-	Valid           bool    `json:"valid"`
-	IssueCount      int     `json:"issueCount"`
-	FormattedIssues string  `json:"formattedIssues,omitempty"`
-	RawIssues       []Issue `json:"rawIssues,omitempty"`
-}
-
 func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, params parameters.ParamValues, accessToken tools.AccessToken) (any, error) {
 	source, err := tools.GetCompatibleSource[compatibleSource](resourceMgr, t.Source, t.Name, t.Kind)
 	if err != nil {
@@ -144,114 +119,7 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 	if !ok || sourceParam == "" {
 		return nil, fmt.Errorf("invalid or missing '%s' parameter", sourceKey)
 	}
+	return source.ValidateRules(ctx, sourceParam)
-	// Create test request
-	testRequest := &firebaserules.TestRulesetRequest{
-		Source: &firebaserules.Source{
-			Files: []*firebaserules.File{
-				{
-					Name:    "firestore.rules",
-					Content: sourceParam,
-				},
-			},
-		},
-		// We don't need test cases for validation only
-		TestSuite: &firebaserules.TestSuite{
-			TestCases: []*firebaserules.TestCase{},
-		},
-	}
-
-	// Call the test API
-	projectName := fmt.Sprintf("projects/%s", source.GetProjectId())
-	response, err := source.FirebaseRulesClient().Projects.Test(projectName, testRequest).Context(ctx).Do()
-	if err != nil {
-		return nil, fmt.Errorf("failed to validate rules: %w", err)
-	}
-
-	// Process the response
-	result := t.processValidationResponse(response, sourceParam)
-
-	return result, nil
-}
-
-func (t Tool) processValidationResponse(response *firebaserules.TestRulesetResponse, source string) ValidationResult {
-	if len(response.Issues) == 0 {
-		return ValidationResult{
-			Valid:           true,
-			IssueCount:      0,
-			FormattedIssues: "✓ No errors detected. Rules are valid.",
-		}
-	}
-
-	// Convert issues to our format
-	issues := make([]Issue, len(response.Issues))
-	for i, issue := range response.Issues {
-		issues[i] = Issue{
-			Description: issue.Description,
-			Severity:    issue.Severity,
-			SourcePosition: SourcePosition{
-				FileName:      issue.SourcePosition.FileName,
-				Line:          issue.SourcePosition.Line,
-				Column:        issue.SourcePosition.Column,
-				CurrentOffset: issue.SourcePosition.CurrentOffset,
-				EndOffset:     issue.SourcePosition.EndOffset,
-			},
-		}
-	}
-
-	// Format issues
-	formattedIssues := t.formatRulesetIssues(issues, source)
-
-	return ValidationResult{
-		Valid:           false,
-		IssueCount:      len(issues),
-		FormattedIssues: formattedIssues,
-		RawIssues:       issues,
-	}
-}
-
-// formatRulesetIssues formats validation issues into a human-readable string with code snippets
-func (t Tool) formatRulesetIssues(issues []Issue, rulesSource string) string {
-	sourceLines := strings.Split(rulesSource, "\n")
-	var formattedOutput []string
-
-	formattedOutput = append(formattedOutput, fmt.Sprintf("Found %d issue(s) in rules source:\n", len(issues)))
-
-	for _, issue := range issues {
-		issueString := fmt.Sprintf("%s: %s [Ln %d, Col %d]",
-			issue.Severity,
-			issue.Description,
-			issue.SourcePosition.Line,
-			issue.SourcePosition.Column)
-
-		if issue.SourcePosition.Line > 0 {
-			lineIndex := int(issue.SourcePosition.Line - 1) // 0-based index
-			if lineIndex >= 0 && lineIndex < len(sourceLines) {
-				errorLine := sourceLines[lineIndex]
-				issueString += fmt.Sprintf("\n```\n%s", errorLine)
-
-				// Add carets if we have column and offset information
-				if issue.SourcePosition.Column > 0 &&
-					issue.SourcePosition.CurrentOffset >= 0 &&
-					issue.SourcePosition.EndOffset > issue.SourcePosition.CurrentOffset {
-
-					startColumn := int(issue.SourcePosition.Column - 1) // 0-based
-					errorTokenLength := int(issue.SourcePosition.EndOffset - issue.SourcePosition.CurrentOffset)
-
-					if startColumn >= 0 && errorTokenLength > 0 && startColumn <= len(errorLine) {
-						padding := strings.Repeat(" ", startColumn)
-						carets := strings.Repeat("^", errorTokenLength)
-						issueString += fmt.Sprintf("\n%s%s", padding, carets)
-					}
-				}
-				issueString += "\n```"
-			}
-		}
-
-		formattedOutput = append(formattedOutput, issueString)
-	}
-
-	return strings.Join(formattedOutput, "\n\n")
 }
 
 func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (parameters.ParamValues, error) {

@@ -28,13 +28,13 @@ import (
 // JSONToFirestoreValue converts a JSON value with type information to a Firestore-compatible value
 // The input should be a map with a single key indicating the type (e.g., "stringValue", "integerValue")
 // If a client is provided, referenceValue types will be converted to *firestore.DocumentRef
-func JSONToFirestoreValue(value interface{}, client *firestore.Client) (interface{}, error) {
+func JSONToFirestoreValue(value any, client *firestore.Client) (any, error) {
 	if value == nil {
 		return nil, nil
 	}
 
 	switch v := value.(type) {
-	case map[string]interface{}:
+	case map[string]any:
 		// Check for typed values
 		if len(v) == 1 {
 			for key, val := range v {
@@ -92,7 +92,7 @@ func JSONToFirestoreValue(value interface{}, client *firestore.Client) (interfac
 				return nil, fmt.Errorf("timestamp value must be a string")
 			case "geoPointValue":
 				// Convert to LatLng
-				if geoMap, ok := val.(map[string]interface{}); ok {
+				if geoMap, ok := val.(map[string]any); ok {
 					lat, latOk := geoMap["latitude"].(float64)
 					lng, lngOk := geoMap["longitude"].(float64)
 					if latOk && lngOk {
@@ -105,9 +105,9 @@ func JSONToFirestoreValue(value interface{}, client *firestore.Client) (interfac
 				return nil, fmt.Errorf("invalid geopoint value format")
 			case "arrayValue":
 				// Convert array
-				if arrayMap, ok := val.(map[string]interface{}); ok {
+				if arrayMap, ok := val.(map[string]any); ok {
-					if values, ok := arrayMap["values"].([]interface{}); ok {
+					if values, ok := arrayMap["values"].([]any); ok {
-						result := make([]interface{}, len(values))
+						result := make([]any, len(values))
 						for i, item := range values {
 							converted, err := JSONToFirestoreValue(item, client)
 							if err != nil {
@@ -121,9 +121,9 @@ func JSONToFirestoreValue(value interface{}, client *firestore.Client) (interfac
 				return nil, fmt.Errorf("invalid array value format")
 			case "mapValue":
 				// Convert map
-				if mapMap, ok := val.(map[string]interface{}); ok {
+				if mapMap, ok := val.(map[string]any); ok {
-					if fields, ok := mapMap["fields"].(map[string]interface{}); ok {
+					if fields, ok := mapMap["fields"].(map[string]any); ok {
-						result := make(map[string]interface{})
+						result := make(map[string]any)
 						for k, v := range fields {
 							converted, err := JSONToFirestoreValue(v, client)
 							if err != nil {
@@ -160,8 +160,8 @@ func JSONToFirestoreValue(value interface{}, client *firestore.Client) (interfac
 	}
 }
 
 // convertPlainMap converts a plain map to Firestore format
-func convertPlainMap(m map[string]interface{}, client *firestore.Client) (map[string]interface{}, error) {
+func convertPlainMap(m map[string]any, client *firestore.Client) (map[string]any, error) {
-	result := make(map[string]interface{})
+	result := make(map[string]any)
 	for k, v := range m {
 		converted, err := JSONToFirestoreValue(v, client)
 		if err != nil {
@@ -172,42 +172,6 @@ func convertPlainMap(m map[string]interface{}, client *firestore.Client) (map[st
 	return result, nil
 }
 
-// FirestoreValueToJSON converts a Firestore value to a simplified JSON representation
-// This removes type information and returns plain values
-func FirestoreValueToJSON(value interface{}) interface{} {
-	if value == nil {
-		return nil
-	}
-
-	switch v := value.(type) {
-	case time.Time:
-		return v.Format(time.RFC3339Nano)
-	case *latlng.LatLng:
-		return map[string]interface{}{
-			"latitude":  v.Latitude,
-			"longitude": v.Longitude,
-		}
-	case []byte:
-		return base64.StdEncoding.EncodeToString(v)
-	case []interface{}:
-		result := make([]interface{}, len(v))
-		for i, item := range v {
-			result[i] = FirestoreValueToJSON(item)
-		}
-		return result
-	case map[string]interface{}:
-		result := make(map[string]interface{})
-		for k, val := range v {
-			result[k] = FirestoreValueToJSON(val)
-		}
-		return result
-	case *firestore.DocumentRef:
-		return v.Path
-	default:
-		return value
-	}
-}
-
 // isValidDocumentPath checks if a string is a valid Firestore document path
 // Valid paths have an even number of segments (collection/doc/collection/doc...)
 func isValidDocumentPath(path string) bool {

@@ -312,40 +312,6 @@ func TestJSONToFirestoreValue_IntegerFromString(t *testing.T) {
 	}
 }
 
-func TestFirestoreValueToJSON_RoundTrip(t *testing.T) {
-	// Test round-trip conversion
-	original := map[string]interface{}{
-		"name":   "Test",
-		"count":  int64(42),
-		"price":  19.99,
-		"active": true,
-		"tags":   []interface{}{"tag1", "tag2"},
-		"metadata": map[string]interface{}{
-			"created": time.Now(),
-		},
-		"nullField": nil,
-	}
-
-	// Convert to JSON representation
-	jsonRepresentation := FirestoreValueToJSON(original)
-
-	// Verify types are simplified
-	jsonMap, ok := jsonRepresentation.(map[string]interface{})
-	if !ok {
-		t.Fatalf("Expected map, got %T", jsonRepresentation)
-	}
-
-	// Time should be converted to string
-	metadata, ok := jsonMap["metadata"].(map[string]interface{})
-	if !ok {
-		t.Fatalf("metadata should be a map, got %T", jsonMap["metadata"])
-	}
-	_, ok = metadata["created"].(string)
-	if !ok {
-		t.Errorf("created should be a string, got %T", metadata["created"])
-	}
-}
-
 func TestJSONToFirestoreValue_InvalidFormats(t *testing.T) {
 	tests := []struct {
 		name string

@@ -16,9 +16,7 @@ package http
 import (
 	"bytes"
 	"context"
-	"encoding/json"
 	"fmt"
-	"io"
 	"net/http"
 	"net/url"
 	"slices"
@@ -54,7 +52,7 @@ type compatibleSource interface {
 	HttpDefaultHeaders() map[string]string
 	HttpBaseURL() string
 	HttpQueryParams() map[string]string
-	Client() *http.Client
+	RunRequest(*http.Request) (any, error)
 }
 
 type Config struct {
@@ -259,29 +257,7 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 	for k, v := range allHeaders {
 		req.Header.Set(k, v)
 	}
+	return source.RunRequest(req)
-	// Make request and fetch response
-	resp, err := source.Client().Do(req)
-	if err != nil {
-		return nil, fmt.Errorf("error making HTTP request: %s", err)
-	}
-	defer resp.Body.Close()
-
-	var body []byte
-	body, err = io.ReadAll(resp.Body)
-	if err != nil {
-		return nil, err
-	}
-	if resp.StatusCode < 200 || resp.StatusCode > 299 {
-		return nil, fmt.Errorf("unexpected status code: %d, response body: %s", resp.StatusCode, string(body))
-	}
-
-	var data any
-	if err = json.Unmarshal(body, &data); err != nil {
-		// if unable to unmarshal data, return result as string.
-		return string(body), nil
-	}
-	return data, nil
 }
 
 func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (parameters.ParamValues, error) {
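In the same pattern, the HTTP tool above now delegates to a single `RunRequest(*http.Request) (any, error)` method on its source instead of calling `source.Client().Do(req)` and decoding the response inline. A hypothetical source-side sketch (the real HTTP source type and its fields on this branch may differ) that reuses the removed response handling:

```go
package httpsource // hypothetical package name, for illustration only

import (
	"encoding/json"
	"fmt"
	"io"
	"net/http"
)

// Source stands in for the HTTP source that now satisfies the tool's
// compatibleSource interface by exposing RunRequest directly.
type Source struct {
	Client *http.Client
}

// RunRequest performs the request and decodes the response, mirroring the
// body that was removed from the tool's Invoke above.
func (s *Source) RunRequest(req *http.Request) (any, error) {
	resp, err := s.Client.Do(req)
	if err != nil {
		return nil, fmt.Errorf("error making HTTP request: %s", err)
	}
	defer resp.Body.Close()

	body, err := io.ReadAll(resp.Body)
	if err != nil {
		return nil, err
	}
	if resp.StatusCode < 200 || resp.StatusCode > 299 {
		return nil, fmt.Errorf("unexpected status code: %d, response body: %s", resp.StatusCode, string(body))
	}

	var data any
	if err := json.Unmarshal(body, &data); err != nil {
		// Not valid JSON; fall back to returning the raw body as a string.
		return string(body), nil
	}
	return data, nil
}
```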

@@ -48,8 +48,8 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T
 type compatibleSource interface {
 	UseClientAuthorization() bool
 	GetAuthTokenHeaderName() string
-	LookerClient() *v4.LookerSDK
 	LookerApiSettings() *rtl.ApiSettings
+	GetLookerSDK(string) (*v4.LookerSDK, error)
 }
 
 type Config struct {
@@ -159,7 +159,7 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 	visConfig := paramsMap["vis_config"].(map[string]any)
 	wq.VisConfig = &visConfig
 
-	sdk, err := lookercommon.GetLookerSDK(source.UseClientAuthorization(), source.LookerApiSettings(), source.LookerClient(), accessToken)
+	sdk, err := source.GetLookerSDK(string(accessToken))
 	if err != nil {
 		return nil, fmt.Errorf("error getting sdk: %w", err)
 	}

@@ -21,7 +21,6 @@ import (
 	"github.com/googleapis/genai-toolbox/internal/embeddingmodels"
 	"github.com/googleapis/genai-toolbox/internal/sources"
 	"github.com/googleapis/genai-toolbox/internal/tools"
-	"github.com/googleapis/genai-toolbox/internal/tools/looker/lookercommon"
 	"github.com/googleapis/genai-toolbox/internal/util"
 	"github.com/googleapis/genai-toolbox/internal/util/parameters"
 
@@ -48,8 +47,8 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T
 type compatibleSource interface {
 	UseClientAuthorization() bool
 	GetAuthTokenHeaderName() string
-	LookerClient() *v4.LookerSDK
 	LookerApiSettings() *rtl.ApiSettings
+	GetLookerSDK(string) (*v4.LookerSDK, error)
 }
 
 type Config struct {
@@ -192,7 +191,7 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 		req.Dimension = &dimension
 	}
 
-	sdk, err := lookercommon.GetLookerSDK(source.UseClientAuthorization(), source.LookerApiSettings(), source.LookerClient(), accessToken)
+	sdk, err := source.GetLookerSDK(string(accessToken))
 	if err != nil {
 		return nil, fmt.Errorf("error getting sdk: %w", err)
 	}

@@ -15,65 +15,17 @@ package lookercommon
 
 import (
 	"context"
-	"crypto/tls"
 	"fmt"
-	"net/http"
 	"net/url"
 	"strings"
 
-	"github.com/googleapis/genai-toolbox/internal/tools"
 	"github.com/googleapis/genai-toolbox/internal/util"
 	"github.com/googleapis/genai-toolbox/internal/util/parameters"
-	rtl "github.com/looker-open-source/sdk-codegen/go/rtl"
+	"github.com/looker-open-source/sdk-codegen/go/rtl"
 	v4 "github.com/looker-open-source/sdk-codegen/go/sdk/v4"
 	"github.com/thlib/go-timezone-local/tzlocal"
 )
 
-// Make types for RoundTripper
-type transportWithAuthHeader struct {
-	Base      http.RoundTripper
-	AuthToken tools.AccessToken
-}
-
-func (t *transportWithAuthHeader) RoundTrip(req *http.Request) (*http.Response, error) {
-	req.Header.Set("x-looker-appid", "go-sdk")
-	req.Header.Set("Authorization", string(t.AuthToken))
-	return t.Base.RoundTrip(req)
-}
-
-func GetLookerSDK(useClientOAuth bool, config *rtl.ApiSettings, client *v4.LookerSDK, accessToken tools.AccessToken) (*v4.LookerSDK, error) {
-
-	if useClientOAuth {
-		if accessToken == "" {
-			return nil, fmt.Errorf("no access token supplied with request")
-		}
-
-		session := rtl.NewAuthSession(*config)
-		// Configure base transport with TLS
-		transport := &http.Transport{
-			TLSClientConfig: &tls.Config{
-				InsecureSkipVerify: !config.VerifySsl,
-			},
-		}
-
-		// Build transport for end user token
-		session.Client = http.Client{
-			Transport: &transportWithAuthHeader{
-				Base:      transport,
-				AuthToken: accessToken,
-			},
-		}
-
-		// return SDK with new Transport
-		return v4.NewLookerSDK(session), nil
-	}
-
-	if client == nil {
-		return nil, fmt.Errorf("client id or client secret not valid")
-	}
-	return client, nil
-}
-
 const (
 	DimensionsFields = "fields(dimensions(name,type,label,label_short,description,synonyms,tags,hidden,suggestable,suggestions,suggest_dimension,suggest_explore))"
 	FiltersFields    = "fields(filters(name,type,label,label_short,description,synonyms,tags,hidden,suggestable,suggestions,suggest_dimension,suggest_explore))"
@@ -47,8 +47,8 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T
 type compatibleSource interface {
 	UseClientAuthorization() bool
 	GetAuthTokenHeaderName() string
-	LookerClient() *v4.LookerSDK
 	LookerApiSettings() *rtl.ApiSettings
+	GetLookerSDK(string) (*v4.LookerSDK, error)
 }
 
 type Config struct {
@@ -116,7 +116,7 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 		return nil, err
 	}
 
-	sdk, err := lookercommon.GetLookerSDK(source.UseClientAuthorization(), source.LookerApiSettings(), source.LookerClient(), accessToken)
+	sdk, err := source.GetLookerSDK(string(accessToken))
 	if err != nil {
 		return nil, fmt.Errorf("error getting sdk: %w", err)
 	}
@@ -47,8 +47,8 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T
 type compatibleSource interface {
 	UseClientAuthorization() bool
 	GetAuthTokenHeaderName() string
-	LookerClient() *v4.LookerSDK
 	LookerApiSettings() *rtl.ApiSettings
+	GetLookerSDK(string) (*v4.LookerSDK, error)
 }
 
 type Config struct {
@@ -117,7 +117,7 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 		return nil, err
 	}
 
-	sdk, err := lookercommon.GetLookerSDK(source.UseClientAuthorization(), source.LookerApiSettings(), source.LookerClient(), accessToken)
+	sdk, err := source.GetLookerSDK(string(accessToken))
 	if err != nil {
 		return nil, fmt.Errorf("error getting sdk: %w", err)
 	}
@@ -21,7 +21,6 @@ import (
 	"github.com/googleapis/genai-toolbox/internal/embeddingmodels"
 	"github.com/googleapis/genai-toolbox/internal/sources"
 	"github.com/googleapis/genai-toolbox/internal/tools"
-	"github.com/googleapis/genai-toolbox/internal/tools/looker/lookercommon"
 	"github.com/googleapis/genai-toolbox/internal/util"
 	"github.com/googleapis/genai-toolbox/internal/util/parameters"
 
@@ -48,8 +47,8 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T
 type compatibleSource interface {
 	UseClientAuthorization() bool
 	GetAuthTokenHeaderName() string
-	LookerClient() *v4.LookerSDK
 	LookerApiSettings() *rtl.ApiSettings
+	GetLookerSDK(string) (*v4.LookerSDK, error)
 }
 
 type Config struct {
@@ -125,7 +124,7 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 		return nil, fmt.Errorf("'devMode' must be a boolean, got %T", mapParams["devMode"])
 	}
 
-	sdk, err := lookercommon.GetLookerSDK(source.UseClientAuthorization(), source.LookerApiSettings(), source.LookerClient(), accessToken)
+	sdk, err := source.GetLookerSDK(string(accessToken))
 	if err != nil {
 		return nil, fmt.Errorf("error getting sdk: %w", err)
 	}
@@ -22,7 +22,6 @@ import (
 	"github.com/googleapis/genai-toolbox/internal/embeddingmodels"
 	"github.com/googleapis/genai-toolbox/internal/sources"
 	"github.com/googleapis/genai-toolbox/internal/tools"
-	"github.com/googleapis/genai-toolbox/internal/tools/looker/lookercommon"
 	"github.com/googleapis/genai-toolbox/internal/util"
 	"github.com/googleapis/genai-toolbox/internal/util/parameters"
 
@@ -49,8 +48,8 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T
 type compatibleSource interface {
 	UseClientAuthorization() bool
 	GetAuthTokenHeaderName() string
-	LookerClient() *v4.LookerSDK
 	LookerApiSettings() *rtl.ApiSettings
+	GetLookerSDK(string) (*v4.LookerSDK, error)
 	LookerSessionLength() int64
 }
 
@@ -137,7 +136,7 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 		contentId_ptr = nil
 	}
 
-	sdk, err := lookercommon.GetLookerSDK(source.UseClientAuthorization(), source.LookerApiSettings(), source.LookerClient(), accessToken)
+	sdk, err := source.GetLookerSDK(string(accessToken))
 	if err != nil {
 		return nil, fmt.Errorf("error getting sdk: %w", err)
 	}
@@ -21,7 +21,6 @@ import (
 	"github.com/googleapis/genai-toolbox/internal/embeddingmodels"
 	"github.com/googleapis/genai-toolbox/internal/sources"
 	"github.com/googleapis/genai-toolbox/internal/tools"
-	"github.com/googleapis/genai-toolbox/internal/tools/looker/lookercommon"
 	"github.com/googleapis/genai-toolbox/internal/util/parameters"
 
 	"github.com/looker-open-source/sdk-codegen/go/rtl"
@@ -47,8 +46,8 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T
 type compatibleSource interface {
 	UseClientAuthorization() bool
 	GetAuthTokenHeaderName() string
-	LookerClient() *v4.LookerSDK
 	LookerApiSettings() *rtl.ApiSettings
+	GetLookerSDK(string) (*v4.LookerSDK, error)
 }
 
 type Config struct {
@@ -120,7 +119,7 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 		return nil, fmt.Errorf("'conn' must be a string, got %T", mapParams["conn"])
 	}
 
-	sdk, err := lookercommon.GetLookerSDK(source.UseClientAuthorization(), source.LookerApiSettings(), source.LookerClient(), accessToken)
+	sdk, err := source.GetLookerSDK(string(accessToken))
 	if err != nil {
 		return nil, fmt.Errorf("error getting sdk: %w", err)
 	}
@@ -21,7 +21,6 @@ import (
 	"github.com/googleapis/genai-toolbox/internal/embeddingmodels"
 	"github.com/googleapis/genai-toolbox/internal/sources"
 	"github.com/googleapis/genai-toolbox/internal/tools"
-	"github.com/googleapis/genai-toolbox/internal/tools/looker/lookercommon"
 	"github.com/googleapis/genai-toolbox/internal/util"
 	"github.com/googleapis/genai-toolbox/internal/util/parameters"
 
@@ -48,8 +47,8 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T
 type compatibleSource interface {
 	UseClientAuthorization() bool
 	GetAuthTokenHeaderName() string
-	LookerClient() *v4.LookerSDK
 	LookerApiSettings() *rtl.ApiSettings
+	GetLookerSDK(string) (*v4.LookerSDK, error)
 }
 
 type Config struct {
@@ -119,7 +118,7 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 		return nil, fmt.Errorf("unable to get logger from ctx: %s", err)
 	}
 
-	sdk, err := lookercommon.GetLookerSDK(source.UseClientAuthorization(), source.LookerApiSettings(), source.LookerClient(), accessToken)
+	sdk, err := source.GetLookerSDK(string(accessToken))
 	if err != nil {
 		return nil, fmt.Errorf("error getting sdk: %w", err)
 	}
@@ -21,7 +21,6 @@ import (
 	"github.com/googleapis/genai-toolbox/internal/embeddingmodels"
 	"github.com/googleapis/genai-toolbox/internal/sources"
 	"github.com/googleapis/genai-toolbox/internal/tools"
-	"github.com/googleapis/genai-toolbox/internal/tools/looker/lookercommon"
 	"github.com/googleapis/genai-toolbox/internal/util/parameters"
 
 	"github.com/looker-open-source/sdk-codegen/go/rtl"
@@ -47,8 +46,8 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T
 type compatibleSource interface {
 	UseClientAuthorization() bool
 	GetAuthTokenHeaderName() string
-	LookerClient() *v4.LookerSDK
 	LookerApiSettings() *rtl.ApiSettings
+	GetLookerSDK(string) (*v4.LookerSDK, error)
 }
 
 type Config struct {
@@ -122,7 +121,7 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 	}
 	db, _ := mapParams["db"].(string)
 
-	sdk, err := lookercommon.GetLookerSDK(source.UseClientAuthorization(), source.LookerApiSettings(), source.LookerClient(), accessToken)
+	sdk, err := source.GetLookerSDK(string(accessToken))
 	if err != nil {
 		return nil, fmt.Errorf("error getting sdk: %w", err)
 	}
@@ -21,7 +21,6 @@ import (
 	"github.com/googleapis/genai-toolbox/internal/embeddingmodels"
 	"github.com/googleapis/genai-toolbox/internal/sources"
 	"github.com/googleapis/genai-toolbox/internal/tools"
-	"github.com/googleapis/genai-toolbox/internal/tools/looker/lookercommon"
 	"github.com/googleapis/genai-toolbox/internal/util"
 	"github.com/googleapis/genai-toolbox/internal/util/parameters"
 
@@ -48,8 +47,8 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T
 type compatibleSource interface {
 	UseClientAuthorization() bool
 	GetAuthTokenHeaderName() string
-	LookerClient() *v4.LookerSDK
 	LookerApiSettings() *rtl.ApiSettings
+	GetLookerSDK(string) (*v4.LookerSDK, error)
 }
 
 type Config struct {
@@ -137,7 +136,7 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 		return nil, fmt.Errorf("'tables' must be a string, got %T", mapParams["tables"])
 	}
 
-	sdk, err := lookercommon.GetLookerSDK(source.UseClientAuthorization(), source.LookerApiSettings(), source.LookerClient(), accessToken)
+	sdk, err := source.GetLookerSDK(string(accessToken))
 	if err != nil {
 		return nil, fmt.Errorf("error getting sdk: %w", err)
 	}
@@ -21,7 +21,6 @@ import (
 	"github.com/googleapis/genai-toolbox/internal/embeddingmodels"
 	"github.com/googleapis/genai-toolbox/internal/sources"
 	"github.com/googleapis/genai-toolbox/internal/tools"
-	"github.com/googleapis/genai-toolbox/internal/tools/looker/lookercommon"
 	"github.com/googleapis/genai-toolbox/internal/util"
 	"github.com/googleapis/genai-toolbox/internal/util/parameters"
 
@@ -48,8 +47,8 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T
 type compatibleSource interface {
 	UseClientAuthorization() bool
 	GetAuthTokenHeaderName() string
-	LookerClient() *v4.LookerSDK
 	LookerApiSettings() *rtl.ApiSettings
+	GetLookerSDK(string) (*v4.LookerSDK, error)
 }
 
 type Config struct {
@@ -132,7 +131,7 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 		return nil, fmt.Errorf("'schema' must be a string, got %T", mapParams["schema"])
 	}
 
-	sdk, err := lookercommon.GetLookerSDK(source.UseClientAuthorization(), source.LookerApiSettings(), source.LookerClient(), accessToken)
+	sdk, err := source.GetLookerSDK(string(accessToken))
 	if err != nil {
 		return nil, fmt.Errorf("error getting sdk: %w", err)
 	}
@@ -21,7 +21,6 @@ import (
 	"github.com/googleapis/genai-toolbox/internal/embeddingmodels"
 	"github.com/googleapis/genai-toolbox/internal/sources"
 	"github.com/googleapis/genai-toolbox/internal/tools"
-	"github.com/googleapis/genai-toolbox/internal/tools/looker/lookercommon"
 	"github.com/googleapis/genai-toolbox/internal/util"
 	"github.com/googleapis/genai-toolbox/internal/util/parameters"
 
@@ -48,8 +47,8 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T
 type compatibleSource interface {
 	UseClientAuthorization() bool
 	GetAuthTokenHeaderName() string
-	LookerClient() *v4.LookerSDK
 	LookerApiSettings() *rtl.ApiSettings
+	GetLookerSDK(string) (*v4.LookerSDK, error)
 }
 
 type Config struct {
@@ -141,7 +140,7 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 	limit := int64(paramsMap["limit"].(int))
 	offset := int64(paramsMap["offset"].(int))
 
-	sdk, err := lookercommon.GetLookerSDK(source.UseClientAuthorization(), source.LookerApiSettings(), source.LookerClient(), accessToken)
+	sdk, err := source.GetLookerSDK(string(accessToken))
 	if err != nil {
 		return nil, fmt.Errorf("error getting sdk: %w", err)
 	}
@@ -48,8 +48,8 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T
 type compatibleSource interface {
 	UseClientAuthorization() bool
 	GetAuthTokenHeaderName() string
-	LookerClient() *v4.LookerSDK
 	LookerApiSettings() *rtl.ApiSettings
+	GetLookerSDK(string) (*v4.LookerSDK, error)
 	LookerShowHiddenFields() bool
 }
 
@@ -124,7 +124,7 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 		return nil, fmt.Errorf("error processing model or explore: %w", err)
 	}
 
-	sdk, err := lookercommon.GetLookerSDK(source.UseClientAuthorization(), source.LookerApiSettings(), source.LookerClient(), accessToken)
+	sdk, err := source.GetLookerSDK(string(accessToken))
 	if err != nil {
 		return nil, fmt.Errorf("error getting sdk: %w", err)
 	}
@@ -21,7 +21,6 @@ import (
 	"github.com/googleapis/genai-toolbox/internal/embeddingmodels"
 	"github.com/googleapis/genai-toolbox/internal/sources"
 	"github.com/googleapis/genai-toolbox/internal/tools"
-	"github.com/googleapis/genai-toolbox/internal/tools/looker/lookercommon"
 	"github.com/googleapis/genai-toolbox/internal/util"
 	"github.com/googleapis/genai-toolbox/internal/util/parameters"
 
@@ -48,8 +47,8 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T
 type compatibleSource interface {
 	UseClientAuthorization() bool
 	GetAuthTokenHeaderName() string
-	LookerClient() *v4.LookerSDK
 	LookerApiSettings() *rtl.ApiSettings
+	GetLookerSDK(string) (*v4.LookerSDK, error)
 	LookerShowHiddenExplores() bool
 }
 
@@ -126,7 +125,7 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 		return nil, fmt.Errorf("'model' must be a string, got %T", mapParams["model"])
 	}
 
-	sdk, err := lookercommon.GetLookerSDK(source.UseClientAuthorization(), source.LookerApiSettings(), source.LookerClient(), accessToken)
+	sdk, err := source.GetLookerSDK(string(accessToken))
 	if err != nil {
 		return nil, fmt.Errorf("error getting sdk: %w", err)
 	}
@@ -48,8 +48,8 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T
 type compatibleSource interface {
 	UseClientAuthorization() bool
 	GetAuthTokenHeaderName() string
-	LookerClient() *v4.LookerSDK
 	LookerApiSettings() *rtl.ApiSettings
+	GetLookerSDK(string) (*v4.LookerSDK, error)
 	LookerShowHiddenFields() bool
 }
 
@@ -125,7 +125,7 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 	}
 
 	fields := lookercommon.FiltersFields
-	sdk, err := lookercommon.GetLookerSDK(source.UseClientAuthorization(), source.LookerApiSettings(), source.LookerClient(), accessToken)
+	sdk, err := source.GetLookerSDK(string(accessToken))
 	if err != nil {
 		return nil, fmt.Errorf("error getting sdk: %w", err)
 	}
@@ -21,7 +21,6 @@ import (
 	"github.com/googleapis/genai-toolbox/internal/embeddingmodels"
 	"github.com/googleapis/genai-toolbox/internal/sources"
 	"github.com/googleapis/genai-toolbox/internal/tools"
-	"github.com/googleapis/genai-toolbox/internal/tools/looker/lookercommon"
 	"github.com/googleapis/genai-toolbox/internal/util"
 	"github.com/googleapis/genai-toolbox/internal/util/parameters"
 
@@ -48,8 +47,8 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T
 type compatibleSource interface {
 	UseClientAuthorization() bool
 	GetAuthTokenHeaderName() string
-	LookerClient() *v4.LookerSDK
 	LookerApiSettings() *rtl.ApiSettings
+	GetLookerSDK(string) (*v4.LookerSDK, error)
 }
 
 type Config struct {
@@ -141,7 +140,7 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 	limit := int64(paramsMap["limit"].(int))
 	offset := int64(paramsMap["offset"].(int))
 
-	sdk, err := lookercommon.GetLookerSDK(source.UseClientAuthorization(), source.LookerApiSettings(), source.LookerClient(), accessToken)
+	sdk, err := source.GetLookerSDK(string(accessToken))
 	if err != nil {
 		return nil, fmt.Errorf("error getting sdk: %w", err)
 	}
@@ -48,8 +48,8 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T
 type compatibleSource interface {
 	UseClientAuthorization() bool
 	GetAuthTokenHeaderName() string
-	LookerClient() *v4.LookerSDK
 	LookerApiSettings() *rtl.ApiSettings
+	GetLookerSDK(string) (*v4.LookerSDK, error)
 	LookerShowHiddenFields() bool
 }
 
@@ -125,7 +125,7 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 	}
 
 	fields := lookercommon.MeasuresFields
-	sdk, err := lookercommon.GetLookerSDK(source.UseClientAuthorization(), source.LookerApiSettings(), source.LookerClient(), accessToken)
+	sdk, err := source.GetLookerSDK(string(accessToken))
 	if err != nil {
 		return nil, fmt.Errorf("error getting sdk: %w", err)
 	}
@@ -21,7 +21,6 @@ import (
 	"github.com/googleapis/genai-toolbox/internal/embeddingmodels"
 	"github.com/googleapis/genai-toolbox/internal/sources"
 	"github.com/googleapis/genai-toolbox/internal/tools"
-	"github.com/googleapis/genai-toolbox/internal/tools/looker/lookercommon"
 	"github.com/googleapis/genai-toolbox/internal/util"
 	"github.com/googleapis/genai-toolbox/internal/util/parameters"
 
@@ -48,8 +47,8 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T
 type compatibleSource interface {
 	UseClientAuthorization() bool
 	GetAuthTokenHeaderName() string
-	LookerClient() *v4.LookerSDK
 	LookerApiSettings() *rtl.ApiSettings
+	GetLookerSDK(string) (*v4.LookerSDK, error)
 	LookerShowHiddenModels() bool
 }
 
@@ -124,7 +123,7 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 	excludeHidden := !source.LookerShowHiddenModels()
 	includeInternal := true
 
-	sdk, err := lookercommon.GetLookerSDK(source.UseClientAuthorization(), source.LookerApiSettings(), source.LookerClient(), accessToken)
+	sdk, err := source.GetLookerSDK(string(accessToken))
 	if err != nil {
 		return nil, fmt.Errorf("error getting sdk: %w", err)
 	}
@@ -48,8 +48,8 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T
 type compatibleSource interface {
 	UseClientAuthorization() bool
 	GetAuthTokenHeaderName() string
-	LookerClient() *v4.LookerSDK
 	LookerApiSettings() *rtl.ApiSettings
+	GetLookerSDK(string) (*v4.LookerSDK, error)
 	LookerShowHiddenFields() bool
 }
 
@@ -125,7 +125,7 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 	}
 
 	fields := lookercommon.ParametersFields
-	sdk, err := lookercommon.GetLookerSDK(source.UseClientAuthorization(), source.LookerApiSettings(), source.LookerClient(), accessToken)
+	sdk, err := source.GetLookerSDK(string(accessToken))
 	if err != nil {
 		return nil, fmt.Errorf("error getting sdk: %w", err)
 	}
@@ -48,8 +48,8 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T
 type compatibleSource interface {
 	UseClientAuthorization() bool
 	GetAuthTokenHeaderName() string
-	LookerClient() *v4.LookerSDK
 	LookerApiSettings() *rtl.ApiSettings
+	GetLookerSDK(string) (*v4.LookerSDK, error)
 }
 
 type Config struct {
@@ -121,7 +121,7 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 		return nil, fmt.Errorf("unable to get logger from ctx: %s", err)
 	}
 
-	sdk, err := lookercommon.GetLookerSDK(source.UseClientAuthorization(), source.LookerApiSettings(), source.LookerClient(), accessToken)
+	sdk, err := source.GetLookerSDK(string(accessToken))
 	if err != nil {
 		return nil, fmt.Errorf("error getting sdk: %w", err)
 	}
@@ -21,7 +21,6 @@ import (
 	"github.com/googleapis/genai-toolbox/internal/embeddingmodels"
 	"github.com/googleapis/genai-toolbox/internal/sources"
 	"github.com/googleapis/genai-toolbox/internal/tools"
-	"github.com/googleapis/genai-toolbox/internal/tools/looker/lookercommon"
 	"github.com/googleapis/genai-toolbox/internal/util"
 	"github.com/googleapis/genai-toolbox/internal/util/parameters"
 
@@ -48,8 +47,8 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T
 type compatibleSource interface {
 	UseClientAuthorization() bool
 	GetAuthTokenHeaderName() string
-	LookerClient() *v4.LookerSDK
 	LookerApiSettings() *rtl.ApiSettings
+	GetLookerSDK(string) (*v4.LookerSDK, error)
 }
 
 type Config struct {
@@ -120,7 +119,7 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 		return nil, fmt.Errorf("unable to get logger from ctx: %s", err)
 	}
 
-	sdk, err := lookercommon.GetLookerSDK(source.UseClientAuthorization(), source.LookerApiSettings(), source.LookerClient(), accessToken)
+	sdk, err := source.GetLookerSDK(string(accessToken))
 	if err != nil {
 		return nil, fmt.Errorf("error getting sdk: %w", err)
 	}
@@ -21,7 +21,6 @@ import (
 	"github.com/googleapis/genai-toolbox/internal/embeddingmodels"
 	"github.com/googleapis/genai-toolbox/internal/sources"
 	"github.com/googleapis/genai-toolbox/internal/tools"
-	"github.com/googleapis/genai-toolbox/internal/tools/looker/lookercommon"
 	"github.com/googleapis/genai-toolbox/internal/util"
 	"github.com/googleapis/genai-toolbox/internal/util/parameters"
 
@@ -48,8 +47,8 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T
 type compatibleSource interface {
 	UseClientAuthorization() bool
 	GetAuthTokenHeaderName() string
-	LookerClient() *v4.LookerSDK
 	LookerApiSettings() *rtl.ApiSettings
+	GetLookerSDK(string) (*v4.LookerSDK, error)
 }
 
 type Config struct {
@@ -119,7 +118,7 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 		return nil, fmt.Errorf("unable to get logger from ctx: %s", err)
 	}
 
-	sdk, err := lookercommon.GetLookerSDK(source.UseClientAuthorization(), source.LookerApiSettings(), source.LookerClient(), accessToken)
+	sdk, err := source.GetLookerSDK(string(accessToken))
 	if err != nil {
 		return nil, fmt.Errorf("error getting sdk: %w", err)
 	}
@@ -53,8 +53,8 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T
 type compatibleSource interface {
 	UseClientAuthorization() bool
 	GetAuthTokenHeaderName() string
-	LookerClient() *v4.LookerSDK
 	LookerApiSettings() *rtl.ApiSettings
+	GetLookerSDK(string) (*v4.LookerSDK, error)
 }
 
 type Config struct {
@@ -136,7 +136,7 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 		return nil, fmt.Errorf("unable to get logger from ctx: %s", err)
 	}
 
-	sdk, err := lookercommon.GetLookerSDK(source.UseClientAuthorization(), source.LookerApiSettings(), source.LookerClient(), accessToken)
+	sdk, err := source.GetLookerSDK(string(accessToken))
 	if err != nil {
 		return nil, fmt.Errorf("error getting sdk: %w", err)
 	}
@@ -53,8 +53,8 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T
 type compatibleSource interface {
 	UseClientAuthorization() bool
 	GetAuthTokenHeaderName() string
-	LookerClient() *v4.LookerSDK
 	LookerApiSettings() *rtl.ApiSettings
+	GetLookerSDK(string) (*v4.LookerSDK, error)
 }
 
 type Config struct {
@@ -127,7 +127,7 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 		return nil, fmt.Errorf("unable to get logger from ctx: %s", err)
 	}
 
-	sdk, err := lookercommon.GetLookerSDK(source.UseClientAuthorization(), source.LookerApiSettings(), source.LookerClient(), accessToken)
+	sdk, err := source.GetLookerSDK(string(accessToken))
 	if err != nil {
 		return nil, fmt.Errorf("error getting sdk: %w", err)
 	}
@@ -53,8 +53,8 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T
 type compatibleSource interface {
 	UseClientAuthorization() bool
 	GetAuthTokenHeaderName() string
-	LookerClient() *v4.LookerSDK
 	LookerApiSettings() *rtl.ApiSettings
+	GetLookerSDK(string) (*v4.LookerSDK, error)
 }
 
 type Config struct {
@@ -131,7 +131,7 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 		return nil, err
 	}
 
-	sdk, err := lookercommon.GetLookerSDK(source.UseClientAuthorization(), source.LookerApiSettings(), source.LookerClient(), accessToken)
+	sdk, err := source.GetLookerSDK(string(accessToken))
 	if err != nil {
 		return nil, fmt.Errorf("error getting sdk: %w", err)
 	}
@@ -23,7 +23,6 @@ import (
 	"github.com/googleapis/genai-toolbox/internal/embeddingmodels"
 	"github.com/googleapis/genai-toolbox/internal/sources"
 	"github.com/googleapis/genai-toolbox/internal/tools"
-	"github.com/googleapis/genai-toolbox/internal/tools/looker/lookercommon"
 	"github.com/googleapis/genai-toolbox/internal/util"
 	"github.com/googleapis/genai-toolbox/internal/util/parameters"
 
@@ -50,8 +49,8 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T
 type compatibleSource interface {
 	UseClientAuthorization() bool
 	GetAuthTokenHeaderName() string
-	LookerClient() *v4.LookerSDK
 	LookerApiSettings() *rtl.ApiSettings
+	GetLookerSDK(string) (*v4.LookerSDK, error)
 }
 
 type Config struct {
@@ -129,7 +128,7 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 	}
 	logger.DebugContext(ctx, "params = ", params)
 
-	sdk, err := lookercommon.GetLookerSDK(source.UseClientAuthorization(), source.LookerApiSettings(), source.LookerClient(), accessToken)
+	sdk, err := source.GetLookerSDK(string(accessToken))
 	if err != nil {
 		return nil, fmt.Errorf("error getting sdk: %w", err)
 	}
@@ -50,8 +50,8 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T
 type compatibleSource interface {
 	UseClientAuthorization() bool
 	GetAuthTokenHeaderName() string
-	LookerClient() *v4.LookerSDK
 	LookerApiSettings() *rtl.ApiSettings
+	GetLookerSDK(string) (*v4.LookerSDK, error)
 }
 
 type Config struct {
@@ -139,7 +139,7 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 		return nil, fmt.Errorf("error building query request: %w", err)
 	}
 
-	sdk, err := lookercommon.GetLookerSDK(source.UseClientAuthorization(), source.LookerApiSettings(), source.LookerClient(), accessToken)
+	sdk, err := source.GetLookerSDK(string(accessToken))
 	if err != nil {
 		return nil, fmt.Errorf("error getting sdk: %w", err)
 	}
@@ -49,8 +49,8 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T
 type compatibleSource interface {
 	UseClientAuthorization() bool
 	GetAuthTokenHeaderName() string
-	LookerClient() *v4.LookerSDK
 	LookerApiSettings() *rtl.ApiSettings
+	GetLookerSDK(string) (*v4.LookerSDK, error)
 }
 
 type Config struct {
@@ -123,7 +123,7 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 	if err != nil {
 		return nil, fmt.Errorf("error building WriteQuery request: %w", err)
 	}
-	sdk, err := lookercommon.GetLookerSDK(source.UseClientAuthorization(), source.LookerApiSettings(), source.LookerClient(), accessToken)
+	sdk, err := source.GetLookerSDK(string(accessToken))
 	if err != nil {
 		return nil, fmt.Errorf("error getting sdk: %w", err)
 	}
@@ -48,8 +48,8 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T
 type compatibleSource interface {
 	UseClientAuthorization() bool
 	GetAuthTokenHeaderName() string
-	LookerClient() *v4.LookerSDK
 	LookerApiSettings() *rtl.ApiSettings
+	GetLookerSDK(string) (*v4.LookerSDK, error)
 }
 
 type Config struct {
@@ -122,7 +122,7 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 	if err != nil {
 		return nil, fmt.Errorf("error building query request: %w", err)
 	}
-	sdk, err := lookercommon.GetLookerSDK(source.UseClientAuthorization(), source.LookerApiSettings(), source.LookerClient(), accessToken)
+	sdk, err := source.GetLookerSDK(string(accessToken))
 	if err != nil {
 		return nil, fmt.Errorf("error getting sdk: %w", err)
 	}
@@ -48,8 +48,8 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T
 type compatibleSource interface {
 	UseClientAuthorization() bool
 	GetAuthTokenHeaderName() string
-	LookerClient() *v4.LookerSDK
 	LookerApiSettings() *rtl.ApiSettings
+	GetLookerSDK(string) (*v4.LookerSDK, error)
 }
 
 type Config struct {
@@ -135,7 +135,7 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 	visConfig := paramsMap["vis_config"].(map[string]any)
 	wq.VisConfig = &visConfig
 
-	sdk, err := lookercommon.GetLookerSDK(source.UseClientAuthorization(), source.LookerApiSettings(), source.LookerClient(), accessToken)
+	sdk, err := source.GetLookerSDK(string(accessToken))
 	if err != nil {
 		return nil, fmt.Errorf("error getting sdk: %w", err)
 	}
@@ -50,8 +50,8 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T
 type compatibleSource interface {
 	UseClientAuthorization() bool
 	GetAuthTokenHeaderName() string
-	LookerClient() *v4.LookerSDK
 	LookerApiSettings() *rtl.ApiSettings
+	GetLookerSDK(string) (*v4.LookerSDK, error)
 }
 
 type Config struct {
@@ -129,7 +129,7 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 
 	dashboard_id := paramsMap["dashboard_id"].(string)
 
-	sdk, err := lookercommon.GetLookerSDK(source.UseClientAuthorization(), source.LookerApiSettings(), source.LookerClient(), accessToken)
+	sdk, err := source.GetLookerSDK(string(accessToken))
 	if err != nil {
 		return nil, fmt.Errorf("error getting sdk: %w", err)
 	}
@@ -49,8 +49,8 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T
 type compatibleSource interface {
 	UseClientAuthorization() bool
 	GetAuthTokenHeaderName() string
-	LookerClient() *v4.LookerSDK
 	LookerApiSettings() *rtl.ApiSettings
+	GetLookerSDK(string) (*v4.LookerSDK, error)
 }
 
 type Config struct {
@@ -132,7 +132,7 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 	limit := int64(paramsMap["limit"].(int))
 	limitStr := fmt.Sprintf("%d", limit)
 
-	sdk, err := lookercommon.GetLookerSDK(source.UseClientAuthorization(), source.LookerApiSettings(), source.LookerClient(), accessToken)
+	sdk, err := source.GetLookerSDK(string(accessToken))
 	if err != nil {
 		return nil, fmt.Errorf("error getting sdk: %w", err)
 	}
@@ -47,8 +47,8 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T
 type compatibleSource interface {
 	UseClientAuthorization() bool
 	GetAuthTokenHeaderName() string
-	LookerClient() *v4.LookerSDK
 	LookerApiSettings() *rtl.ApiSettings
+	GetLookerSDK(string) (*v4.LookerSDK, error)
 }
 type Config struct {
 	Name string `yaml:"name" validate:"required"`
@@ -117,7 +117,7 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 		return nil, err
 	}
 
-	sdk, err := lookercommon.GetLookerSDK(source.UseClientAuthorization(), source.LookerApiSettings(), source.LookerClient(), accessToken)
+	sdk, err := source.GetLookerSDK(string(accessToken))
 	if err != nil {
 		return nil, fmt.Errorf("error getting sdk: %w", err)
 	}
@@ -15,14 +15,12 @@ package mongodbaggregate
 
 import (
 	"context"
-	"encoding/json"
 	"fmt"
 	"slices"
 
 	"github.com/goccy/go-yaml"
 	"github.com/googleapis/genai-toolbox/internal/embeddingmodels"
 	"github.com/googleapis/genai-toolbox/internal/util/parameters"
-	"go.mongodb.org/mongo-driver/bson"
 	"go.mongodb.org/mongo-driver/mongo"
 
 	"github.com/googleapis/genai-toolbox/internal/sources"
@@ -47,6 +45,7 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T
 
 type compatibleSource interface {
 	MongoClient() *mongo.Client
+	Aggregate(context.Context, string, bool, bool, string, string) ([]any, error)
 }
 
 type Config struct {
@@ -110,57 +109,11 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
 	}
 
 	paramsMap := params.AsMap()
 
 	pipelineString, err := parameters.PopulateTemplateWithJSON("MongoDBAggregatePipeline", t.PipelinePayload, paramsMap)
 	if err != nil {
 		return nil, fmt.Errorf("error populating pipeline: %s", err)
 	}
-
-	var pipeline = []bson.M{}
-	err = bson.UnmarshalExtJSON([]byte(pipelineString), t.Canonical, &pipeline)
-	if err != nil {
-		return nil, err
-	}
-
-	if t.ReadOnly {
-		//fail if we do a merge or an out
-		for _, stage := range pipeline {
-			for key := range stage {
-				if key == "$merge" || key == "$out" {
-					return nil, fmt.Errorf("this is not a read-only pipeline: %+v", stage)
-				}
-			}
-		}
-	}
-
-	cur, err := source.MongoClient().Database(t.Database).Collection(t.Collection).Aggregate(ctx, pipeline)
-	if err != nil {
-		return nil, err
-	}
-	defer cur.Close(ctx)
-
-	var data = []any{}
-	err = cur.All(ctx, &data)
-	if err != nil {
-		return nil, err
-	}
-
-	if len(data) == 0 {
-		return []any{}, nil
-	}
-
-	var final []any
-	for _, item := range data {
-		tmp, _ := bson.MarshalExtJSON(item, false, false)
-		var tmp2 any
-		err = json.Unmarshal(tmp, &tmp2)
-		if err != nil {
-			return nil, err
-		}
-		final = append(final, tmp2)
-	}
-
-	return final, err
+	return source.Aggregate(ctx, pipelineString, t.Canonical, t.ReadOnly, t.Database, t.Collection)
 }
 
 func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (parameters.ParamValues, error) {
@@ -15,16 +15,13 @@ package mongodbdeletemany
|
|||||||
|
|
||||||
import (
|
import (
|
||||||
"context"
|
"context"
|
||||||
"errors"
|
|
||||||
"fmt"
|
"fmt"
|
||||||
"slices"
|
"slices"
|
||||||
|
|
||||||
"github.com/goccy/go-yaml"
|
"github.com/goccy/go-yaml"
|
||||||
"github.com/googleapis/genai-toolbox/internal/embeddingmodels"
|
"github.com/googleapis/genai-toolbox/internal/embeddingmodels"
|
||||||
"github.com/googleapis/genai-toolbox/internal/util/parameters"
|
"github.com/googleapis/genai-toolbox/internal/util/parameters"
|
||||||
"go.mongodb.org/mongo-driver/bson"
|
|
||||||
"go.mongodb.org/mongo-driver/mongo"
|
"go.mongodb.org/mongo-driver/mongo"
|
||||||
"go.mongodb.org/mongo-driver/mongo/options"
|
|
||||||
|
|
||||||
"github.com/googleapis/genai-toolbox/internal/sources"
|
"github.com/googleapis/genai-toolbox/internal/sources"
|
||||||
"github.com/googleapis/genai-toolbox/internal/tools"
|
"github.com/googleapis/genai-toolbox/internal/tools"
|
||||||
@@ -48,6 +45,7 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T
|
|||||||
|
|
||||||
type compatibleSource interface {
|
type compatibleSource interface {
|
||||||
MongoClient() *mongo.Client
|
MongoClient() *mongo.Client
|
||||||
|
DeleteMany(context.Context, string, string, string) (any, error)
|
||||||
}
|
}
|
||||||
|
|
||||||
type Config struct {
|
type Config struct {
|
||||||
@@ -115,31 +113,11 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
|
|||||||
}
|
}
|
||||||
|
|
||||||
paramsMap := params.AsMap()
|
paramsMap := params.AsMap()
|
||||||
|
|
||||||
filterString, err := parameters.PopulateTemplateWithJSON("MongoDBDeleteManyFilter", t.FilterPayload, paramsMap)
|
filterString, err := parameters.PopulateTemplateWithJSON("MongoDBDeleteManyFilter", t.FilterPayload, paramsMap)
|
||||||
if err != nil {
|
if err != nil {
|
||||||
return nil, fmt.Errorf("error populating filter: %s", err)
|
return nil, fmt.Errorf("error populating filter: %s", err)
|
||||||
}
|
}
|
||||||
|
return source.DeleteMany(ctx, filterString, t.Database, t.Collection)
|
||||||
opts := options.Delete()
|
|
||||||
|
|
||||||
var filter = bson.D{}
|
|
||||||
err = bson.UnmarshalExtJSON([]byte(filterString), false, &filter)
|
|
||||||
if err != nil {
|
|
||||||
return nil, err
|
|
||||||
}
|
|
||||||
|
|
||||||
res, err := source.MongoClient().Database(t.Database).Collection(t.Collection).DeleteMany(ctx, filter, opts)
|
|
||||||
if err != nil {
|
|
||||||
return nil, err
|
|
||||||
}
|
|
||||||
|
|
||||||
if res.DeletedCount == 0 {
|
|
||||||
return nil, errors.New("no document found")
|
|
||||||
}
|
|
||||||
|
|
||||||
// not much to return actually
|
|
||||||
return res.DeletedCount, nil
|
|
||||||
}
|
}
|
||||||
|
|
||||||
func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (parameters.ParamValues, error) {
|
func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (parameters.ParamValues, error) {
|
||||||

@@ -21,9 +21,7 @@ import (
     "github.com/goccy/go-yaml"
     "github.com/googleapis/genai-toolbox/internal/embeddingmodels"
     "github.com/googleapis/genai-toolbox/internal/util/parameters"
-    "go.mongodb.org/mongo-driver/bson"
     "go.mongodb.org/mongo-driver/mongo"
-    "go.mongodb.org/mongo-driver/mongo/options"

     "github.com/googleapis/genai-toolbox/internal/sources"
     "github.com/googleapis/genai-toolbox/internal/tools"
@@ -47,6 +45,7 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T

 type compatibleSource interface {
     MongoClient() *mongo.Client
+    DeleteOne(context.Context, string, string, string) (any, error)
 }

 type Config struct {
@@ -119,22 +118,7 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
     if err != nil {
         return nil, fmt.Errorf("error populating filter: %s", err)
     }
-    opts := options.Delete()
-
-    var filter = bson.D{}
-    err = bson.UnmarshalExtJSON([]byte(filterString), false, &filter)
-    if err != nil {
-        return nil, err
-    }
-
-    res, err := source.MongoClient().Database(t.Database).Collection(t.Collection).DeleteOne(ctx, filter, opts)
-    if err != nil {
-        return nil, err
-    }
-
-    // do not return an error when the count is 0, to mirror the delete many call result
-    return res.DeletedCount, nil
+    return source.DeleteOne(ctx, filterString, t.Database, t.Collection)
 }

 func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (parameters.ParamValues, error) {

@@ -15,7 +15,6 @@ package mongodbfind

 import (
     "context"
-    "encoding/json"
     "fmt"
     "slices"

@@ -49,6 +48,7 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T

 type compatibleSource interface {
     MongoClient() *mongo.Client
+    Find(context.Context, string, string, string, *options.FindOptions) ([]any, error)
 }

 type Config struct {
@@ -164,48 +164,15 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
     }

     paramsMap := params.AsMap()

     filterString, err := parameters.PopulateTemplateWithJSON("MongoDBFindFilterString", t.FilterPayload, paramsMap)

     if err != nil {
         return nil, fmt.Errorf("error populating filter: %s", err)
     }

     opts, err := getOptions(ctx, t.SortParams, t.ProjectPayload, t.Limit, paramsMap)
     if err != nil {
         return nil, fmt.Errorf("error populating options: %s", err)
     }
-    var filter = bson.D{}
-    err = bson.UnmarshalExtJSON([]byte(filterString), false, &filter)
-    if err != nil {
-        return nil, err
-    }
-
-    cur, err := source.MongoClient().Database(t.Database).Collection(t.Collection).Find(ctx, filter, opts)
-    if err != nil {
-        return nil, err
-    }
-    defer cur.Close(ctx)
-
-    var data = []any{}
-    err = cur.All(context.TODO(), &data)
-    if err != nil {
-        return nil, err
-    }
-
-    var final []any
-    for _, item := range data {
-        tmp, _ := bson.MarshalExtJSON(item, false, false)
-        var tmp2 any
-        err = json.Unmarshal(tmp, &tmp2)
-        if err != nil {
-            return nil, err
-        }
-        final = append(final, tmp2)
-    }
-
-    return final, err
+    return source.Find(ctx, filterString, t.Database, t.Collection, opts)
 }

 func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (parameters.ParamValues, error) {

@@ -15,7 +15,6 @@ package mongodbfindone

 import (
     "context"
-    "encoding/json"
     "fmt"
     "slices"

@@ -48,6 +47,7 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T

 type compatibleSource interface {
     MongoClient() *mongo.Client
+    FindOne(context.Context, string, string, string, *options.FindOneOptions) ([]any, error)
 }

 type Config struct {
@@ -117,9 +117,7 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
     }

     paramsMap := params.AsMap()

     filterString, err := parameters.PopulateTemplateWithJSON("MongoDBFindOneFilterString", t.FilterPayload, paramsMap)

     if err != nil {
         return nil, fmt.Errorf("error populating filter: %s", err)
     }
@@ -137,34 +135,7 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
         }
         opts = opts.SetProjection(projection)
     }
-    var filter = bson.D{}
-    err = bson.UnmarshalExtJSON([]byte(filterString), false, &filter)
-    if err != nil {
-        return nil, err
-    }
-
-    res := source.MongoClient().Database(t.Database).Collection(t.Collection).FindOne(ctx, filter, opts)
-    if res.Err() != nil {
-        return nil, res.Err()
-    }
-
-    var data any
-    err = res.Decode(&data)
-    if err != nil {
-        return nil, err
-    }
-
-    var final []any
-    tmp, _ := bson.MarshalExtJSON(data, false, false)
-    var tmp2 any
-    err = json.Unmarshal(tmp, &tmp2)
-    if err != nil {
-        return nil, err
-    }
-    final = append(final, tmp2)
-
-    return final, err
+    return source.FindOne(ctx, filterString, t.Database, t.Collection, opts)
 }

 func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (parameters.ParamValues, error) {

@@ -23,9 +23,7 @@ import (
     "github.com/googleapis/genai-toolbox/internal/sources"
     "github.com/googleapis/genai-toolbox/internal/tools"
     "github.com/googleapis/genai-toolbox/internal/util/parameters"
-    "go.mongodb.org/mongo-driver/bson"
     "go.mongodb.org/mongo-driver/mongo"
-    "go.mongodb.org/mongo-driver/mongo/options"
 )

 const kind string = "mongodb-insert-many"
@@ -48,6 +46,7 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T

 type compatibleSource interface {
     MongoClient() *mongo.Client
+    InsertMany(context.Context, string, bool, string, string) ([]any, error)
 }

 type Config struct {
@@ -117,19 +116,7 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
     if !ok {
         return nil, errors.New("no input found")
     }
-    var data = []any{}
-    err = bson.UnmarshalExtJSON([]byte(jsonData), t.Canonical, &data)
-    if err != nil {
-        return nil, err
-    }
-
-    res, err := source.MongoClient().Database(t.Database).Collection(t.Collection).InsertMany(ctx, data, options.InsertMany())
-    if err != nil {
-        return nil, err
-    }
-
-    return res.InsertedIDs, nil
+    return source.InsertMany(ctx, jsonData, t.Canonical, t.Database, t.Collection)
 }

 func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (parameters.ParamValues, error) {

@@ -23,9 +23,7 @@ import (
     "github.com/googleapis/genai-toolbox/internal/sources"
     "github.com/googleapis/genai-toolbox/internal/tools"
     "github.com/googleapis/genai-toolbox/internal/util/parameters"
-    "go.mongodb.org/mongo-driver/bson"
     "go.mongodb.org/mongo-driver/mongo"
-    "go.mongodb.org/mongo-driver/mongo/options"
 )

 const kind string = "mongodb-insert-one"
@@ -48,6 +46,7 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T

 type compatibleSource interface {
     MongoClient() *mongo.Client
+    InsertOne(context.Context, string, bool, string, string) (any, error)
 }

 type Config struct {
@@ -107,7 +106,6 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
     if err != nil {
         return nil, err
     }
-
     if len(params) == 0 {
         return nil, errors.New("no input found")
     }
@@ -116,19 +114,7 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
     if !ok {
         return nil, errors.New("no input found")
     }
-    var data any
-    err = bson.UnmarshalExtJSON([]byte(jsonData), t.Canonical, &data)
-    if err != nil {
-        return nil, err
-    }
-
-    res, err := source.MongoClient().Database(t.Database).Collection(t.Collection).InsertOne(ctx, data, options.InsertOne())
-    if err != nil {
-        return nil, err
-    }
-
-    return res.InsertedID, nil
+    return source.InsertOne(ctx, jsonData, t.Canonical, t.Database, t.Collection)
 }

 func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (parameters.ParamValues, error) {

@@ -23,9 +23,7 @@ import (
     "github.com/googleapis/genai-toolbox/internal/sources"
     "github.com/googleapis/genai-toolbox/internal/tools"
     "github.com/googleapis/genai-toolbox/internal/util/parameters"
-    "go.mongodb.org/mongo-driver/bson"
     "go.mongodb.org/mongo-driver/mongo"
-    "go.mongodb.org/mongo-driver/mongo/options"
 )

 const kind string = "mongodb-update-many"
@@ -46,6 +44,7 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T

 type compatibleSource interface {
     MongoClient() *mongo.Client
+    UpdateMany(context.Context, string, bool, string, string, string, bool) ([]any, error)
 }

 type Config struct {
@@ -117,35 +116,15 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
     }

     paramsMap := params.AsMap()

     filterString, err := parameters.PopulateTemplateWithJSON("MongoDBUpdateManyFilter", t.FilterPayload, paramsMap)
     if err != nil {
         return nil, fmt.Errorf("error populating filter: %s", err)
     }

-    var filter = bson.D{}
-    err = bson.UnmarshalExtJSON([]byte(filterString), t.Canonical, &filter)
-    if err != nil {
-        return nil, fmt.Errorf("unable to unmarshal filter string: %w", err)
-    }
-
     updateString, err := parameters.PopulateTemplateWithJSON("MongoDBUpdateMany", t.UpdatePayload, paramsMap)
     if err != nil {
         return nil, fmt.Errorf("unable to get update: %w", err)
     }
-    var update = bson.D{}
-    err = bson.UnmarshalExtJSON([]byte(updateString), false, &update)
-    if err != nil {
-        return nil, fmt.Errorf("unable to unmarshal update string: %w", err)
-    }
-
-    res, err := source.MongoClient().Database(t.Database).Collection(t.Collection).UpdateMany(ctx, filter, update, options.Update().SetUpsert(t.Upsert))
-    if err != nil {
-        return nil, fmt.Errorf("error updating collection: %w", err)
-    }
-
-    return []any{res.ModifiedCount, res.UpsertedCount, res.MatchedCount}, nil
+    return source.UpdateMany(ctx, filterString, t.Canonical, updateString, t.Database, t.Collection, t.Upsert)
 }

 func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (parameters.ParamValues, error) {

@@ -23,9 +23,7 @@ import (
     "github.com/googleapis/genai-toolbox/internal/sources"
     "github.com/googleapis/genai-toolbox/internal/tools"
     "github.com/googleapis/genai-toolbox/internal/util/parameters"
-    "go.mongodb.org/mongo-driver/bson"
     "go.mongodb.org/mongo-driver/mongo"
-    "go.mongodb.org/mongo-driver/mongo/options"
 )

 const kind string = "mongodb-update-one"
@@ -46,6 +44,7 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T

 type compatibleSource interface {
     MongoClient() *mongo.Client
+    UpdateOne(context.Context, string, bool, string, string, string, bool) (any, error)
 }

 type Config struct {
@@ -118,35 +117,15 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
     }

     paramsMap := params.AsMap()

     filterString, err := parameters.PopulateTemplateWithJSON("MongoDBUpdateOneFilter", t.FilterPayload, paramsMap)
     if err != nil {
         return nil, fmt.Errorf("error populating filter: %s", err)
     }

-    var filter = bson.D{}
-    err = bson.UnmarshalExtJSON([]byte(filterString), false, &filter)
-    if err != nil {
-        return nil, fmt.Errorf("unable to unmarshal filter string: %w", err)
-    }
-
     updateString, err := parameters.PopulateTemplateWithJSON("MongoDBUpdateOne", t.UpdatePayload, paramsMap)
     if err != nil {
         return nil, fmt.Errorf("unable to get update: %w", err)
     }
-    var update = bson.D{}
-    err = bson.UnmarshalExtJSON([]byte(updateString), t.Canonical, &update)
-    if err != nil {
-        return nil, fmt.Errorf("unable to unmarshal update string: %w", err)
-    }
-
-    res, err := source.MongoClient().Database(t.Database).Collection(t.Collection).UpdateOne(ctx, filter, update, options.Update().SetUpsert(t.Upsert))
-    if err != nil {
-        return nil, fmt.Errorf("error updating collection: %w", err)
-    }
-
-    return res.ModifiedCount, nil
+    return source.UpdateOne(ctx, filterString, t.Canonical, updateString, t.Database, t.Collection, t.Upsert)
 }

 func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (parameters.ParamValues, error) {

@@ -19,7 +19,6 @@ import (
     "encoding/json"
     "fmt"

-    dataproc "cloud.google.com/go/dataproc/v2/apiv1"
     dataprocpb "cloud.google.com/go/dataproc/v2/apiv1/dataprocpb"
     "github.com/goccy/go-yaml"
     "google.golang.org/protobuf/encoding/protojson"
@@ -36,9 +35,7 @@ func unmarshalProto(data any, m proto.Message) error {
 }

 type compatibleSource interface {
-    GetBatchControllerClient() *dataproc.BatchControllerClient
-    GetProject() string
-    GetLocation() string
+    CreateBatch(context.Context, *dataprocpb.Batch) (map[string]any, error)
 }

 // Config is a common config that can be used with any type of create batch tool. However, each tool

@@ -16,23 +16,19 @@ package createbatch

 import (
     "context"
-    "encoding/json"
     "fmt"
-    "time"

     dataprocpb "cloud.google.com/go/dataproc/v2/apiv1/dataprocpb"
     "github.com/googleapis/genai-toolbox/internal/embeddingmodels"
     "github.com/googleapis/genai-toolbox/internal/sources"
     "github.com/googleapis/genai-toolbox/internal/tools"
-    "github.com/googleapis/genai-toolbox/internal/tools/serverlessspark/common"
     "github.com/googleapis/genai-toolbox/internal/util/parameters"
-    "google.golang.org/protobuf/encoding/protojson"
     "google.golang.org/protobuf/proto"
 )

 type BatchBuilder interface {
     Parameters() parameters.Parameters
-    BuildBatch(params parameters.ParamValues) (*dataprocpb.Batch, error)
+    BuildBatch(parameters.ParamValues) (*dataprocpb.Batch, error)
 }

 func NewTool(cfg Config, originalCfg tools.ToolConfig, srcs map[string]sources.Source, builder BatchBuilder) (*Tool, error) {
@@ -74,7 +70,6 @@ func (t *Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, par
     if err != nil {
         return nil, err
     }
-    client := source.GetBatchControllerClient()

     batch, err := t.Builder.BuildBatch(params)
     if err != nil {
@@ -97,46 +92,7 @@ func (t *Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, par
         }
         batch.RuntimeConfig.Version = version
     }
-    req := &dataprocpb.CreateBatchRequest{
-        Parent: fmt.Sprintf("projects/%s/locations/%s", source.GetProject(), source.GetLocation()),
-        Batch: batch,
-    }
-
-    op, err := client.CreateBatch(ctx, req)
-    if err != nil {
-        return nil, fmt.Errorf("failed to create batch: %w", err)
-    }
-
-    meta, err := op.Metadata()
-    if err != nil {
-        return nil, fmt.Errorf("failed to get create batch op metadata: %w", err)
-    }
-
-    jsonBytes, err := protojson.Marshal(meta)
-    if err != nil {
-        return nil, fmt.Errorf("failed to marshal create batch op metadata to JSON: %w", err)
-    }
-
-    var result map[string]any
-    if err := json.Unmarshal(jsonBytes, &result); err != nil {
-        return nil, fmt.Errorf("failed to unmarshal create batch op metadata JSON: %w", err)
-    }
-
-    projectID, location, batchID, err := common.ExtractBatchDetails(meta.GetBatch())
-    if err != nil {
-        return nil, fmt.Errorf("error extracting batch details from name %q: %v", meta.GetBatch(), err)
-    }
-    consoleUrl := common.BatchConsoleURL(projectID, location, batchID)
-    logsUrl := common.BatchLogsURL(projectID, location, batchID, meta.GetCreateTime().AsTime(), time.Time{})
-
-    wrappedResult := map[string]any{
-        "opMetadata": meta,
-        "consoleUrl": consoleUrl,
-        "logsUrl": logsUrl,
-    }
-
-    return wrappedResult, nil
+    return source.CreateBatch(ctx, batch)
 }

 func (t *Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (parameters.ParamValues, error) {

@@ -19,8 +19,7 @@ import (
     "fmt"
     "strings"

-    longrunning "cloud.google.com/go/longrunning/autogen"
-    "cloud.google.com/go/longrunning/autogen/longrunningpb"
+    dataproc "cloud.google.com/go/dataproc/v2/apiv1"
     "github.com/goccy/go-yaml"
     "github.com/googleapis/genai-toolbox/internal/embeddingmodels"
     "github.com/googleapis/genai-toolbox/internal/sources"
@@ -45,9 +44,8 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T
 }

 type compatibleSource interface {
-    GetOperationsClient(context.Context) (*longrunning.OperationsClient, error)
-    GetProject() string
-    GetLocation() string
+    GetBatchControllerClient() *dataproc.BatchControllerClient
+    CancelOperation(context.Context, string) (any, error)
 }

 type Config struct {
@@ -106,32 +104,15 @@ func (t *Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, par
     if err != nil {
         return nil, err
     }

-    client, err := source.GetOperationsClient(ctx)
-    if err != nil {
-        return nil, fmt.Errorf("failed to get operations client: %w", err)
-    }
-
     paramMap := params.AsMap()
     operation, ok := paramMap["operation"].(string)
     if !ok {
         return nil, fmt.Errorf("missing required parameter: operation")
     }

     if strings.Contains(operation, "/") {
         return nil, fmt.Errorf("operation must be a short operation name without '/': %s", operation)
     }
-    req := &longrunningpb.CancelOperationRequest{
-        Name: fmt.Sprintf("projects/%s/locations/%s/operations/%s", source.GetProject(), source.GetLocation(), operation),
-    }
-
-    err = client.CancelOperation(ctx, req)
-    if err != nil {
-        return nil, fmt.Errorf("failed to cancel operation: %w", err)
-    }
-
-    return fmt.Sprintf("Cancelled [%s].", operation), nil
+    return source.CancelOperation(ctx, operation)
 }

 func (t *Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (parameters.ParamValues, error) {

@@ -16,19 +16,15 @@ package serverlesssparkgetbatch

 import (
     "context"
-    "encoding/json"
     "fmt"
     "strings"

     dataproc "cloud.google.com/go/dataproc/v2/apiv1"
-    "cloud.google.com/go/dataproc/v2/apiv1/dataprocpb"
     "github.com/goccy/go-yaml"
     "github.com/googleapis/genai-toolbox/internal/embeddingmodels"
     "github.com/googleapis/genai-toolbox/internal/sources"
     "github.com/googleapis/genai-toolbox/internal/tools"
-    "github.com/googleapis/genai-toolbox/internal/tools/serverlessspark/common"
     "github.com/googleapis/genai-toolbox/internal/util/parameters"
-    "google.golang.org/protobuf/encoding/protojson"
 )

 const kind = "serverless-spark-get-batch"
@@ -49,8 +45,7 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T

 type compatibleSource interface {
     GetBatchControllerClient() *dataproc.BatchControllerClient
-    GetProject() string
-    GetLocation() string
+    GetBatch(context.Context, string) (map[string]any, error)
 }

 type Config struct {
@@ -109,54 +104,15 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
     if err != nil {
         return nil, err
     }

-    client := source.GetBatchControllerClient()
-
     paramMap := params.AsMap()
     name, ok := paramMap["name"].(string)
     if !ok {
         return nil, fmt.Errorf("missing required parameter: name")
     }

     if strings.Contains(name, "/") {
         return nil, fmt.Errorf("name must be a short batch name without '/': %s", name)
     }
-    req := &dataprocpb.GetBatchRequest{
-        Name: fmt.Sprintf("projects/%s/locations/%s/batches/%s", source.GetProject(), source.GetLocation(), name),
-    }
-
-    batchPb, err := client.GetBatch(ctx, req)
-    if err != nil {
-        return nil, fmt.Errorf("failed to get batch: %w", err)
-    }
-
-    jsonBytes, err := protojson.Marshal(batchPb)
-    if err != nil {
-        return nil, fmt.Errorf("failed to marshal batch to JSON: %w", err)
-    }
-
-    var result map[string]any
-    if err := json.Unmarshal(jsonBytes, &result); err != nil {
-        return nil, fmt.Errorf("failed to unmarshal batch JSON: %w", err)
-    }
-
-    consoleUrl, err := common.BatchConsoleURLFromProto(batchPb)
-    if err != nil {
-        return nil, fmt.Errorf("error generating console url: %v", err)
-    }
-    logsUrl, err := common.BatchLogsURLFromProto(batchPb)
-    if err != nil {
-        return nil, fmt.Errorf("error generating logs url: %v", err)
-    }
-
-    wrappedResult := map[string]any{
-        "consoleUrl": consoleUrl,
-        "logsUrl": logsUrl,
-        "batch": result,
-    }
-
-    return wrappedResult, nil
+    return source.GetBatch(ctx, name)
 }
 func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (parameters.ParamValues, error) {
     return parameters.ParseParams(t.Parameters, data, claims)

@@ -17,17 +17,13 @@ package serverlesssparklistbatches
 import (
     "context"
     "fmt"
-    "time"

     dataproc "cloud.google.com/go/dataproc/v2/apiv1"
-    "cloud.google.com/go/dataproc/v2/apiv1/dataprocpb"
     "github.com/goccy/go-yaml"
     "github.com/googleapis/genai-toolbox/internal/embeddingmodels"
     "github.com/googleapis/genai-toolbox/internal/sources"
     "github.com/googleapis/genai-toolbox/internal/tools"
-    "github.com/googleapis/genai-toolbox/internal/tools/serverlessspark/common"
     "github.com/googleapis/genai-toolbox/internal/util/parameters"
-    "google.golang.org/api/iterator"
 )

 const kind = "serverless-spark-list-batches"
@@ -48,8 +44,7 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.T

 type compatibleSource interface {
     GetBatchControllerClient() *dataproc.BatchControllerClient
-    GetProject() string
-    GetLocation() string
+    ListBatches(context.Context, *int, string, string) (any, error)
 }

 type Config struct {
@@ -104,95 +99,24 @@ type Tool struct {
     Parameters parameters.Parameters
 }

-// ListBatchesResponse is the response from the list batches API.
-type ListBatchesResponse struct {
-    Batches []Batch `json:"batches"`
-    NextPageToken string `json:"nextPageToken"`
-}
-
-// Batch represents a single batch job.
-type Batch struct {
-    Name string `json:"name"`
-    UUID string `json:"uuid"`
-    State string `json:"state"`
-    Creator string `json:"creator"`
-    CreateTime string `json:"createTime"`
-    Operation string `json:"operation"`
-    ConsoleURL string `json:"consoleUrl"`
-    LogsURL string `json:"logsUrl"`
-}
-
 // Invoke executes the tool's operation.
 func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, params parameters.ParamValues, accessToken tools.AccessToken) (any, error) {
     source, err := tools.GetCompatibleSource[compatibleSource](resourceMgr, t.Source, t.Name, t.Kind)
     if err != nil {
         return nil, err
     }

-    client := source.GetBatchControllerClient()
-
-    parent := fmt.Sprintf("projects/%s/locations/%s", source.GetProject(), source.GetLocation())
-    req := &dataprocpb.ListBatchesRequest{
-        Parent: parent,
-        OrderBy: "create_time desc",
-    }
-
     paramMap := params.AsMap()
+    var pageSize *int
     if ps, ok := paramMap["pageSize"]; ok && ps != nil {
-        req.PageSize = int32(ps.(int))
-        if (req.PageSize) <= 0 {
-            return nil, fmt.Errorf("pageSize must be positive: %d", req.PageSize)
+        pageSizeV := ps.(int)
+        if pageSizeV <= 0 {
+            return nil, fmt.Errorf("pageSize must be positive: %d", pageSizeV)
         }
+        pageSize = &pageSizeV
     }
-    if pt, ok := paramMap["pageToken"]; ok && pt != nil {
-        req.PageToken = pt.(string)
-    }
-    if filter, ok := paramMap["filter"]; ok && filter != nil {
-        req.Filter = filter.(string)
-    }
-
-    it := client.ListBatches(ctx, req)
-    pager := iterator.NewPager(it, int(req.PageSize), req.PageToken)
-
-    var batchPbs []*dataprocpb.Batch
-    nextPageToken, err := pager.NextPage(&batchPbs)
-    if err != nil {
-        return nil, fmt.Errorf("failed to list batches: %w", err)
-    }
-
-    batches, err := ToBatches(batchPbs)
-    if err != nil {
-        return nil, err
-    }
-
-    return ListBatchesResponse{Batches: batches, NextPageToken: nextPageToken}, nil
-}
-
-// ToBatches converts a slice of protobuf Batch messages to a slice of Batch structs.
-func ToBatches(batchPbs []*dataprocpb.Batch) ([]Batch, error) {
-    batches := make([]Batch, 0, len(batchPbs))
-    for _, batchPb := range batchPbs {
-        consoleUrl, err := common.BatchConsoleURLFromProto(batchPb)
-        if err != nil {
-            return nil, fmt.Errorf("error generating console url: %v", err)
-        }
-        logsUrl, err := common.BatchLogsURLFromProto(batchPb)
-        if err != nil {
-            return nil, fmt.Errorf("error generating logs url: %v", err)
-        }
-        batch := Batch{
-            Name: batchPb.Name,
-            UUID: batchPb.Uuid,
-            State: batchPb.State.Enum().String(),
-            Creator: batchPb.Creator,
-            CreateTime: batchPb.CreateTime.AsTime().Format(time.RFC3339),
-            Operation: batchPb.Operation,
-            ConsoleURL: consoleUrl,
-            LogsURL: logsUrl,
-        }
-        batches = append(batches, batch)
-    }
-    return batches, nil
+    pt, _ := paramMap["pageToken"].(string)
+    filter, _ := paramMap["filter"].(string)
+    return source.ListBatches(ctx, pageSize, pt, filter)
 }

 func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (parameters.ParamValues, error) {

@@ -85,13 +85,66 @@ func initDataplexConnection(ctx context.Context) (*dataplex.CatalogClient, error
     return client, nil
 }

+// cleanupOldAspectTypes Deletes AspectTypes older than the specified duration.
+func cleanupOldAspectTypes(t *testing.T, ctx context.Context, client *dataplex.CatalogClient, oldThreshold time.Duration) {
+    parent := fmt.Sprintf("projects/%s/locations/us", DataplexProject)
+    olderThanTime := time.Now().Add(-oldThreshold)
+
+    listReq := &dataplexpb.ListAspectTypesRequest{
+        Parent: parent,
+        PageSize: 100, // Fetch up to 100 items
+        OrderBy: "create_time asc", // Order by creation time
+    }
+
+    const maxDeletes = 8 // Explicitly limit the number of deletions
+    it := client.ListAspectTypes(ctx, listReq)
+    var aspectTypesToDelete []string
+    for len(aspectTypesToDelete) < maxDeletes {
+        aspectType, err := it.Next()
+        if err == iterator.Done {
+            break
+        }
+        if err != nil {
+            t.Logf("Warning: Failed to list aspect types during cleanup: %v", err)
+            return
+        }
+        // Perform time-based filtering in memory
+        if aspectType.CreateTime != nil {
+            createTime := aspectType.CreateTime.AsTime()
+            if createTime.Before(olderThanTime) {
+                aspectTypesToDelete = append(aspectTypesToDelete, aspectType.GetName())
+            }
+        } else {
+            t.Logf("Warning: AspectType %s has no CreateTime", aspectType.GetName())
+        }
+    }
+    if len(aspectTypesToDelete) == 0 {
+        t.Logf("cleanupOldAspectTypes: No aspect types found older than %s to delete.", oldThreshold.String())
+        return
+    }
+
+    for _, aspectTypeName := range aspectTypesToDelete {
+        deleteReq := &dataplexpb.DeleteAspectTypeRequest{Name: aspectTypeName}
+        op, err := client.DeleteAspectType(ctx, deleteReq)
+        if err != nil {
+            t.Logf("Warning: Failed to delete aspect type %s: %v", aspectTypeName, err)
+            continue // Skip to the next item if initiation fails
+        }
+
+        if err := op.Wait(ctx); err != nil {
+            t.Logf("Warning: Failed to delete aspect type %s, operation error: %v", aspectTypeName, err)
+        } else {
+            t.Logf("cleanupOldAspectTypes: Successfully deleted %s", aspectTypeName)
+        }
+    }
+}
+
 func TestDataplexToolEndpoints(t *testing.T) {
     sourceConfig := getDataplexVars(t)
     ctx, cancel := context.WithTimeout(context.Background(), 3*time.Minute)
     defer cancel()

     var args []string

     bigqueryClient, err := initBigQueryConnection(ctx, DataplexProject)
     if err != nil {
         t.Fatalf("unable to create Cloud SQL connection pool: %s", err)
@@ -102,6 +155,9 @@ func TestDataplexToolEndpoints(t *testing.T) {
         t.Fatalf("unable to create Dataplex connection: %s", err)
     }

+    // Cleanup older aspecttypes
+    cleanupOldAspectTypes(t, ctx, dataplexClient, 1*time.Hour)
+
     // create resources with UUID
     datasetName := fmt.Sprintf("temp_toolbox_test_%s", strings.ReplaceAll(uuid.New().String(), "-", ""))
     tableName := fmt.Sprintf("param_table_%s", strings.ReplaceAll(uuid.New().String(), "-", ""))

@@ -33,8 +33,8 @@ import (
     dataproc "cloud.google.com/go/dataproc/v2/apiv1"
     "cloud.google.com/go/dataproc/v2/apiv1/dataprocpb"
     "github.com/google/go-cmp/cmp"
+    "github.com/googleapis/genai-toolbox/internal/sources/serverlessspark"
     "github.com/googleapis/genai-toolbox/internal/testutils"
-    "github.com/googleapis/genai-toolbox/internal/tools/serverlessspark/serverlesssparklistbatches"
     "github.com/googleapis/genai-toolbox/tests"
     "google.golang.org/api/iterator"
     "google.golang.org/api/option"
@@ -676,7 +676,7 @@ func runListBatchesTest(t *testing.T, client *dataproc.BatchControllerClient, ct
         filter string
         pageSize int
         numPages int
-        want []serverlesssparklistbatches.Batch
+        want []serverlessspark.Batch
     }{
         {name: "one page", pageSize: 2, numPages: 1, want: batch2},
         {name: "two pages", pageSize: 1, numPages: 2, want: batch2},
@@ -701,7 +701,7 @@ func runListBatchesTest(t *testing.T, client *dataproc.BatchControllerClient, ct
     for _, tc := range tcs {
         t.Run(tc.name, func(t *testing.T) {
            t.Parallel()
-           var actual []serverlesssparklistbatches.Batch
+           var actual []serverlessspark.Batch
            var pageToken string
            for i := 0; i < tc.numPages; i++ {
                request := map[string]any{
@@ -733,7 +733,7 @@ func runListBatchesTest(t *testing.T, client *dataproc.BatchControllerClient, ct
                    t.Fatalf("unable to find result in response body")
                }

-               var listResponse serverlesssparklistbatches.ListBatchesResponse
+               var listResponse serverlessspark.ListBatchesResponse
                if err := json.Unmarshal([]byte(result), &listResponse); err != nil {
                    t.Fatalf("error unmarshalling result: %s", err)
                }
@@ -759,7 +759,7 @@ func runListBatchesTest(t *testing.T, client *dataproc.BatchControllerClient, ct
     }
 }

-func listBatchesRpc(t *testing.T, client *dataproc.BatchControllerClient, ctx context.Context, filter string, n int, exact bool) []serverlesssparklistbatches.Batch {
+func listBatchesRpc(t *testing.T, client *dataproc.BatchControllerClient, ctx context.Context, filter string, n int, exact bool) []serverlessspark.Batch {
     parent := fmt.Sprintf("projects/%s/locations/%s", serverlessSparkProject, serverlessSparkLocation)
     req := &dataprocpb.ListBatchesRequest{
         Parent: parent,
@@ -783,7 +783,7 @@ func listBatchesRpc(t *testing.T, client *dataproc.BatchControllerClient, ctx co
     if !exact && (len(batchPbs) == 0 || len(batchPbs) > n) {
         t.Fatalf("expected between 1 and %d batches, got %d", n, len(batchPbs))
     }
-    batches, err := serverlesssparklistbatches.ToBatches(batchPbs)
+    batches, err := serverlessspark.ToBatches(batchPbs)
     if err != nil {
         t.Fatalf("failed to convert batches to JSON: %v", err)
     }