Commit Graph

12 Commits

Author SHA1 Message Date
rahulpinto19
bb24e03475 test 2025-12-24 07:16:52 +00:00
rahulpinto19
f73d466f74 test: add integration tests for collection 2025-12-24 06:47:01 +00:00
rahulpinto19
f8f4c10d35 test: add integration tests for collections 2025-12-24 06:34:58 +00:00
Dave Borowitz
c6ccf4bd87 feat(serverless-spark)!: add URLs to create batch tool outputs 2025-12-10 15:10:40 -08:00
Dave Borowitz
5605eabd69 feat(serverless-spark)!: add URLs to list_batches output
Unlike get_batch, in this case we are not returning a JSON type directly
from the server, so we can add the new fields in our top-level object
rather than wrapping.
2025-12-10 15:10:40 -08:00
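
For illustration, a minimal sketch of what adding URL fields directly to a tool-assembled output object might look like; the type and field names here are hypothetical, not taken from the repository.

```
package spark

// Hypothetical output types for list_batches; the real names differ. Because
// this struct is built by the tool rather than echoed verbatim from the
// server, URL fields can sit alongside the existing ones instead of wrapping
// the server response in a new envelope.
type listBatchesOutput struct {
	Batches       []batchSummary `json:"batches"`
	NextPageToken string         `json:"nextPageToken,omitempty"`
}

type batchSummary struct {
	Name            string `json:"name"`
	State           string `json:"state"`
	CloudConsoleURL string `json:"cloudConsoleUrl"` // added field
	LoggingURL      string `json:"loggingUrl"`      // added field
}
```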
Dave Borowitz
e29c0616d6 feat(serverless-spark)!: add Cloud Console and Logging URLs to get_batch
These are useful links for humans to follow for more information
(output, metrics, logs) that's not readily available via MCP.
2025-12-10 15:10:40 -08:00
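
A rough sketch of how such links could be constructed; the exact Cloud Console and Cloud Logging URL formats below are assumptions for illustration, not copied from the commit.

```
package spark

import (
	"fmt"
	"net/url"
)

// batchURLs builds human-facing links for a batch. Both URL formats are
// assumptions; the actual tool may construct them differently.
func batchURLs(project, location, batchID string) (consoleURL, loggingURL string) {
	consoleURL = fmt.Sprintf(
		"https://console.cloud.google.com/dataproc/batches/%s/%s?project=%s",
		location, batchID, project)

	// Assumed Cloud Logging query scoped to this batch's log entries.
	filter := url.QueryEscape(fmt.Sprintf("resource.labels.batch_id=%q", batchID))
	loggingURL = fmt.Sprintf(
		"https://console.cloud.google.com/logs/query;query=%s?project=%s",
		filter, project)
	return consoleURL, loggingURL
}
```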
Dave Borowitz
17a979207d feat(serverless-spark): add create_spark_batch tool
This tool is almost identical to create_pyspark_batch, but for Java
Spark batches. There are some minor differences in how the main files
and args are provided.
2025-12-04 11:05:53 -08:00
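
To make the "minor differences" concrete, here is a hedged sketch using the Dataproc protobuf types from the Go client library: a Java Spark batch takes a main jar (or a main class plus jar files), where PySpark takes a single main Python file. The helper name and parameters are illustrative only.

```
package spark

import "cloud.google.com/go/dataproc/v2/apiv1/dataprocpb"

// sparkBatchConfig builds the Java Spark variant of the batch config. Unlike
// PySpark's MainPythonFileUri, the entry point is a main jar file (or a main
// class resolved from the provided jar files).
func sparkBatchConfig(mainJarURI string, jarURIs, args []string) *dataprocpb.Batch {
	return &dataprocpb.Batch{
		BatchConfig: &dataprocpb.Batch_SparkBatch{
			SparkBatch: &dataprocpb.SparkBatch{
				Driver:      &dataprocpb.SparkBatch_MainJarFileUri{MainJarFileUri: mainJarURI},
				JarFileUris: jarURIs,
				Args:        args,
			},
		},
	}
}
```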
Dave Borowitz
1bf0b51f03 feat(serverless-spark): add create_pyspark_batch tool
This tool creates a PySpark batch from a minimal set of parameters, to
keep things simple for the LLM. Advanced runtime and environment config
can be specified in tools.yaml.
2025-12-04 11:05:53 -08:00
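
A hedged sketch of assembling a create request from a minimal parameter set with the Dataproc Go protobuf types; the real tool's parameter names, and how it merges in runtime and environment config from tools.yaml, are not shown here.

```
package spark

import "cloud.google.com/go/dataproc/v2/apiv1/dataprocpb"

// pySparkCreateRequest builds a CreateBatchRequest from just the essentials.
// Advanced RuntimeConfig/EnvironmentConfig (supplied via tools.yaml in the
// real tool) is deliberately omitted to keep the request simple.
func pySparkCreateRequest(parent, batchID, mainPyURI string, args []string) *dataprocpb.CreateBatchRequest {
	return &dataprocpb.CreateBatchRequest{
		Parent:  parent, // e.g. "projects/PROJECT/locations/REGION"
		BatchId: batchID,
		Batch: &dataprocpb.Batch{
			BatchConfig: &dataprocpb.Batch_PysparkBatch{
				PysparkBatch: &dataprocpb.PySparkBatch{
					MainPythonFileUri: mainPyURI,
					Args:              args,
				},
			},
		},
	}
}
```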
Dave Borowitz
2881683226 feat(serverless-spark): add cancel-batch tool 2025-11-05 11:13:35 -08:00
Dave Borowitz
8ef0566e1e refactor(serverless-spark): rearrange and parallelize integration tests
In general, tests should be parallelizable since they interact only with a
deterministic set of batches. The exception is list-batches, especially
with pagination, so run that one sequentially.

This doesn't make much difference for the current set of tests, but in
the near future we will have tests that create batches, which take tens
of seconds to even start running.

Rearrange subtests to be primarily organized by tool, which is more
understandable and easier to filter with `-run`. Test helper methods can
still be called multiple times in different subtests for different
tools.

Sample test output showing the new structure:

```
--- PASS: TestServerlessSparkToolEndpoints (2.01s)
    --- PASS: TestServerlessSparkToolEndpoints/list-batches (1.78s)
        --- PASS: TestServerlessSparkToolEndpoints/list-batches/success (1.23s)
            --- PASS: TestServerlessSparkToolEndpoints/list-batches/success/filtered (0.34s)
            --- PASS: TestServerlessSparkToolEndpoints/list-batches/success/empty (0.40s)
            --- PASS: TestServerlessSparkToolEndpoints/list-batches/success/omit_page_size (0.42s)
            --- PASS: TestServerlessSparkToolEndpoints/list-batches/success/one_page (0.43s)
            --- PASS: TestServerlessSparkToolEndpoints/list-batches/success/20_batches (0.44s)
            --- PASS: TestServerlessSparkToolEndpoints/list-batches/success/two_pages (0.54s)
        --- PASS: TestServerlessSparkToolEndpoints/list-batches/errors (0.00s)
            --- PASS: TestServerlessSparkToolEndpoints/list-batches/errors/negative_page_size (0.01s)
            --- PASS: TestServerlessSparkToolEndpoints/list-batches/errors/zero_page_size (0.01s)
        --- PASS: TestServerlessSparkToolEndpoints/list-batches/auth (0.77s)
            --- PASS: TestServerlessSparkToolEndpoints/list-batches/auth/no_auth_token (0.00s)
            --- PASS: TestServerlessSparkToolEndpoints/list-batches/auth/invalid_auth_token (0.00s)
            --- PASS: TestServerlessSparkToolEndpoints/list-batches/auth/valid_auth_token (0.18s)
    --- PASS: TestServerlessSparkToolEndpoints/parallel-tool-tests (0.00s)
        --- PASS: TestServerlessSparkToolEndpoints/parallel-tool-tests/get-batch (0.09s)
            --- PASS: TestServerlessSparkToolEndpoints/parallel-tool-tests/get-batch/errors (0.00s)
                --- PASS: TestServerlessSparkToolEndpoints/parallel-tool-tests/get-batch/errors/full_batch_name (0.01s)
                --- PASS: TestServerlessSparkToolEndpoints/parallel-tool-tests/get-batch/errors/missing_batch (0.11s)
            --- PASS: TestServerlessSparkToolEndpoints/parallel-tool-tests/get-batch/success (0.21s)
                --- PASS: TestServerlessSparkToolEndpoints/parallel-tool-tests/get-batch/success/found_batch (0.11s)
            --- PASS: TestServerlessSparkToolEndpoints/parallel-tool-tests/get-batch/auth (0.60s)
                --- PASS: TestServerlessSparkToolEndpoints/parallel-tool-tests/get-batch/auth/invalid_auth_token (0.00s)
                --- PASS: TestServerlessSparkToolEndpoints/parallel-tool-tests/get-batch/auth/no_auth_token (0.00s)
                --- PASS: TestServerlessSparkToolEndpoints/parallel-tool-tests/get-batch/auth/valid_auth_token (0.11s)
```
2025-11-05 11:13:35 -08:00
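
A rough sketch of the structure described above, with names taken from the sample output; the subtest bodies are elided and the real test code differs.

```
package spark_test

import "testing"

func TestServerlessSparkToolEndpoints(t *testing.T) {
	// list-batches asserts on a deterministic global view (including
	// pagination), so it runs sequentially, before the parallel group.
	t.Run("list-batches", func(t *testing.T) {
		// ... success / errors / auth subtests ...
	})

	t.Run("parallel-tool-tests", func(t *testing.T) {
		t.Run("get-batch", func(t *testing.T) {
			t.Parallel() // safe: interacts only with its own deterministic batches
			// ... errors / success / auth subtests ...
		})
		t.Run("cancel-batch", func(t *testing.T) {
			t.Parallel()
			// ...
		})
	})
}
```

Organizing subtests by tool also keeps filtering straightforward, e.g. `go test -run 'TestServerlessSparkToolEndpoints/parallel-tool-tests/get-batch'`.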
Dave Borowitz
7ad10720b4 feat(serverless-spark): Add get_batch tool 2025-10-28 13:42:02 -07:00
Dave Borowitz
816dbce268 feat(serverless-spark): Add serverless-spark source with list_batches tool
Built as a thin wrapper over the official Google Cloud Dataproc Go
client library, with support for filtering and pagination.
2025-10-23 20:40:52 -07:00