Compare commits

...

5 Commits

Author SHA1 Message Date
Ho
cb0aba2072 purge files 2026-02-25 15:04:44 +09:00
Ho
4b63678984 purge files 2026-02-25 15:02:36 +09:00
Ho
262f3f37dd Update intros 2026-02-25 12:35:47 +09:00
Ho
9336141f8a Update stuffs 2026-02-25 12:27:28 +09:00
Ho
337d1b5695 AI helper: init 2026-02-25 12:25:48 +09:00
11 changed files with 152 additions and 9 deletions

16
.claude/settings.json Normal file
View File

@@ -0,0 +1,16 @@
{
"$schema": "https://json.schemastore.org/claude-code-settings.json",
"env": {},
"companyAnnouncements": [
"Welcome! Here is scroll-tech",
"Just ask me about what can help"
],
"permissions": {
"allow": [
"Bash(pwd)",
"Bash(ls *)",
"Bash(cat *)"
],
"deny": []
}
}

View File

@@ -0,0 +1,36 @@
---
name: db-query
description: Query the database for common tasks
model: sonnet
allowed-tools: Bash(psql *)
---
The user may want to know about the status of L2 data blocks and proving tasks; their request follows:
$ARGUMENTS
(If the request above is empty, just reply "nothing to do" and stop.)
You should already know the data schema of our database; if not, read it from the `.sql` files under `database/migrate/migrations`.
According to the user's request, generate the corresponding SQL statement and query the database. For example, if the user asks "list the assigned chunks", it means "query records from the `chunk` table with proving_status=2 (assigned)", i.e. the SQL statement `SELECT * FROM chunk WHERE proving_status = 2;`. If the request is unclear, ask the user which column they mean and list some possible options.
The generated SQL MUST obey the following rules:
+ Limit the number of records to 20, unless the user explicitly specifies otherwise, like "show me ALL chunks".
+ The following columns are not human-readable and contain very large text; they MUST be excluded from the SQL statement:
  + for every table, any column named "proof"
  + "header" and "transactions" in the `l2_block` table
  + "calldata" in the `l1_message` table
+ Always omit the `deleted_at` column; never select it or use it in a WHERE condition.
+ Unless explicitly specified, order the records by the `updated_at` column, most recent first.
Once you have decided on the SQL statement, always print it out.
Use the psql client to query our PostgreSQL db. Always launch psql with the "-w" option, and use "-o" to send all output to a `query_report.txt` file under the system's temporary dir, like /tmp. You MUST NOT read the generated report.
If psql fails due to authentication, remind the user to prepare their `.pgpass` file under their home dir.
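A minimal sketch of that file (standard PostgreSQL `.pgpass` format; the host, database, and credentials are placeholders):
```sh
# ~/.pgpass -- one "hostname:port:database:username:password" entry per server.
# psql only honors the file when its permissions are 0600.
echo 'localhost:5432:scroll:postgres:secret' >> ~/.pgpass
chmod 600 ~/.pgpass
```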
You should already know the endpoint of the database, in the form of a PostgreSQL DSN. If not, try to read it from the `db.dsn` field inside `coordinator/build/bin/conf/config.json`. If you still cannot get it, ask for the endpoint via Ask User Question.
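Putting the rules together, a minimal sketch of a full query (the `$DSN` placeholder and the selected column names are illustrative; take the real column names from the schema):
```sh
# -w: never prompt for a password; -o: divert all output to the report file.
# "proof" and "deleted_at" are excluded and results are capped per the rules above.
psql "$DSN" -w -o /tmp/query_report.txt -c "
  SELECT index, hash, proving_status, created_at, updated_at
  FROM chunk
  WHERE proving_status = 2
  ORDER BY updated_at DESC
  LIMIT 20;"
```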

View File

@@ -0,0 +1,7 @@
## Notes for handling ProverE2E
+ Ensure the `conf` dir is correctly linked, and remind the user which path it currently links to.
+ If some files are instructed to be generated but already exist, NEVER refer to their content before the generation. They may be left over from a different setup and contain wrong information for the current process.
+ In step 4, if `l2.validium_mode` is set to true, you MUST Ask User for a decryption key to fill the `sequencer.decryption_key` field. The key must be a hex string WITHOUT the "0x" prefix (a sketch follows this list).
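A minimal sketch of filling the field (assuming `jq` is available; the `DECRYPTION_KEY` variable is a placeholder for the value the user provides):
```sh
# Strip a possible "0x" prefix, then write the key into config.json.
KEY="${DECRYPTION_KEY#0x}"
jq --arg key "$KEY" '.sequencer.decryption_key = $key' config.json > config.json.tmp \
  && mv config.json.tmp config.json
```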

View File

@@ -0,0 +1,62 @@
---
name: integration-test-helper
description: Assist with the process described in the specified directory to prepare or advance integration tests. The target directory and instruction section can be specified, like "tests/prover-e2e test".
model: sonnet
allowed-tools: Bash(make *), Bash(tee *), Bash(jq *)
---
This skill helps launch the full process described in the instructions, then investigates and reports the results.
## Target directory
The **target directory** under which the setup process runs is: $ARGUMENTS[0].
The target dir contains the materials and instructions. If the target dir above is empty, just use !`pwd`.
## Instructions
First read `README.md` under the target directory; the instructions should be under the heading named ($ARGUMENTS[1]). If there is no such heading, just try the "Test" heading.
In addition, there are two optional places with more knowledge about the current instructions:
+ An .md file under the current skill dir, named after the top header of the `README.md` file or the name of the target directory.
For example, if the target dir is `tests/prover-e2e` and the top header in `README.md` contains "ProverE2E", there may be an .md file named `prover-e2e.md` or `ProverE2E.md`.
+ All files under the `experience` path of the target dir (if it exists) contain additional experience specialized for the current host.
## Run each step listed in instructions
The instructions often contain multiple steps which should be completed in sequence. The following rules MUST be obeyed while handling each step:
### "Must do" while executing commands in steps
Any command mentioned in the steps should be executed via the Bash tool, with the following MUST-DOs for handling the outputs (a sketch follows this list):
+ Use "| tee <log_file>" to capture the Bash tool's output into a local file for later investigation. The log file name should be in the format `<desc_of_command>_<day>_<time>.log`.
+ Do not read all of the output; after "| tee", add "| tail -n 50" to catch only the likely error message. That should be enough for the common case.
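A minimal sketch of the pattern (the command and timestamp are illustrative):
```sh
# Keep the full log on disk for later investigation, but surface only the tail.
make coordinator_setup 2>&1 | tee coordinator_setup_20260225_1504.log | tail -n 50
```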
A step may require jumping to other directories. We MUST go back to the target directory after every step completes. Also, DO NOT change anything outside of the target directory.
### When error raised
Command execution should return success. If an error is raised while executing, follow this process:
1. Try to analyze the reason for the error, first from the captured error message. If there is not enough data, grep useful information from the log file of the whole output just captured.
2. Ask User for the next action; the options are:
+ Retry with a resolution derived from the error analysis
+ Retry, with the user providing tips to resolve the issue
+ Just retry; the user has resolved the issue themselves
+ Stop here, discard the current and following steps, and go to the after-completion work
Errors are often caused by some configuration mismatch on the current host. Here are some tips which may help:
* Install the missing tools/libs via the package manager
* Fix typos, or fill in missing fields in configuration files
* Copy missing files; they may just need to be put somewhere in the project, or can be downloaded according to some documents.
## After completion
When every step is done, or the process is stopped by the user, produce the following materials before stopping (a packaging sketch follows this list):
+ Package all log files generated so far into a tarball and save it in a temporary path. Then clear all log files.
+ Generate a report file under the target directory, with a file name like `report_<day>_<time>.txt`.
+ For steps that failed once and were resolved later, record the resolution in a file under the `experience` path in the target dir.
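A minimal sketch of the packaging step (the timestamp is illustrative; it assumes `tar` is available and the logs sit in the target directory):
```sh
# Bundle every step log into one tarball under /tmp, then remove the originals.
tar czf /tmp/integration_logs_20260225_1504.tar.gz ./*.log && rm -f ./*.log
```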

3
.gitignore vendored
View File

@@ -23,5 +23,8 @@ coverage.txt
sftp-config.json
*~
# AI skills
**/experience
target
zkvm-prover/config.json

7
CLAUDE.md Normal file
View File

@@ -0,0 +1,7 @@
The monorepo for scroll-tech's services. See @README.md to learn about the project.
Skills have been set up to make some processes easier to handle. When asked "what can you help with", list the following skills, along with each skill's description and the cost estimate given here:
1. `db-query`: ~$0.1 per query
2. `integration-test-helper`: now ready for the following target:
+ `tests/prover-e2e`: ~$1.0 per process

View File

@@ -34,7 +34,7 @@ coordinator_cron:
coordinator_tool:
go build -ldflags "-X scroll-tech/common/version.ZkVersion=${ZK_VERSION}" -o $(PWD)/build/bin/coordinator_tool ./cmd/tool
localsetup: coordinator_api ## Local setup: build coordinator_api, copy config, and setup releases
localsetup: libzkp coordinator_api ## Local setup: build coordinator_api, copy config, and setup releases
mkdir -p build/bin/conf
@echo "Copying configuration files..."
@if [ -f "$(PWD)/conf/config.template.json" ]; then \

View File

@@ -1,3 +1,5 @@
build/*
testset.json
conf
conf
*.log
*.txt

View File

@@ -1,16 +1,24 @@
## A new e2e test tool to setup a local environment for testing coordinator and prover.
# ProverE2E: A new e2e test tool to set up a local environment for testing coordinator and prover.
It contains data from some blocks of a specified testnet, and helps to generate a series of chunks/batches/bundles from these blocks, filling the DB for the coordinator, so that an e2e test (from chunk to bundle) can be run completely locally.
Prepare:
## Prepare
link one of the dirs with a data set as "conf"; currently we have the following data sets:
+ sepolia: with blocks from scroll sepolia
+ cloak-xen: with blocks from xen sepolia, which is a cloak network
Steps:
## Test
1. run `make all` under `tests/prover-e2e`; it launches a PostgreSQL db in a local docker container, ready to be used by the coordinator (including some chunks/batches/bundles waiting to be proven)
2. set up assets by running `make coordinator_setup`
3. in `coordinator/build/bin/conf`, update necessary items in `config.template.json` and rename it as `config.json`
4. build and launch `coordinator_api` service locally
5. setup the `config.json` for zkvm prover to connect with the locally launched coordinator api
6. in `zkvm-prover`, launch `make test_e2e_run`, which would specific prover run locally, connect to the local coordinator api service according to the `config.json`, and prove all tasks being injected to db in step 1.
3. go into `coordinator/build/bin` for the following steps:
+ rename `conf/config.template.json` to `conf/config.json`
+ if `l2.validium_mode` is set to true in `config.json`, `sequencer.decryption_key` must be set
+ launch the `coordinator_api` service by executing the binary
4. go into `zkvm-prover` for the following steps (a sketch follows this list):
+ copy `config.template.json` to `config.json`
+ set the `sdk_config.coordinator.base_url` field in `config.json` so the zkvm prover connects to the locally launched coordinator api;
in the common case the url is `http://localhost:8390` (the default listening port of the coordinator api)
+ launch `make test_e2e_run`, which runs the specified prover locally, connects to the local coordinator api service according to `config.json`, and proves all tasks injected into the db in step 1.
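A minimal sketch of step 4 (assuming `jq` is available; the url matches the default above):
```sh
cd zkvm-prover
# Copy the template while pointing the prover at the locally launched coordinator api.
jq '.sdk_config.coordinator.base_url = "http://localhost:8390"' config.template.json > config.json
make test_e2e_run
```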
## AI Helper
The test process can be run with the help of the `integration-test-helper` skill (~$1.0 per full process).

View File

@@ -0,0 +1 @@
Let the coordinator api listen on port 18390 to avoid security restrictions or port conflicts. Also change the corresponding field in `config.json`.

View File

@@ -1,4 +1,5 @@
*.vmexe
*.elf
openvm.toml
*.bin
*.sol