mirror of
https://github.com/scroll-tech/scroll.git
synced 2026-04-23 03:00:50 -04:00
Compare commits: develop...feat/zkvm_ (42 Commits)
| Author | SHA1 | Date |
|---|---|---|
| | b19dac8c32 | |
| | 00c407c872 | |
| | 949271c233 | |
| | 76435619f9 | |
| | ac40de832c | |
| | 343ed78c07 | |
| | 1408430326 | |
| | b4ccd70dff | |
| | 766aa2582b | |
| | bbe6c41ac5 | |
| | fafcea49c4 | |
| | 09d83e1d65 | |
| | 649c4416b0 | |
| | f2a9ba5fa3 | |
| | e1aa90b45c | |
| | a35834d763 | |
| | 252eea951c | |
| | 93f755c97e | |
| | b76539b736 | |
| | 58297d38ce | |
| | 8955d4f60f | |
| | 6a57a2e353 | |
| | 492563f3f3 | |
| | 03b992f904 | |
| | d306b38405 | |
| | 79d79ed70d | |
| | b270d96d59 | |
| | ede29c7f4a | |
| | 74a3d7a756 | |
| | 3b174f8e38 | |
| | 930a12a884 | |
| | b833468d02 | |
| | e852915843 | |
| | b244fa8d28 | |
| | 98be0a0ca4 | |
| | 5ae31bc203 | |
| | 5c2803c9e3 | |
| | af63bc005b | |
| | dc29c8c796 | |
| | d262a63564 | |
| | 9839bf72ca | |
| | bdf782674f | |
27
.claude/agents/local-e2e-tester.md
Normal file
@@ -0,0 +1,27 @@
---
name: "local-e2e-tester"
description: "do local e2e test for prover and coordinator"
tools: Bash, Edit, EnterWorktree, ExitWorktree, Glob, Grep, Monitor, NotebookEdit, Read, RemoteTrigger, ScheduleWakeup, Skill, TaskCreate, TaskGet, TaskList, TaskUpdate, ToolSearch, WebFetch, WebSearch, Write
model: sonnet
color: green
memory: project
skills: agent-memory, integration-test-helper
---

Set your current directory to `tests/prover-e2e` and prepare to run the local e2e test there. Check that the preparation is ready, output the plan for the user to confirm, then proceed.

## Notes while handling the e2e test

+ If some files are instructed to be generated but they already exist, NEVER refer to their content from before the generation. They may be left over from a different setup and contain wrong information for the current process.

+ In step 4, if `l2.validium_mode` is set to true, you MUST ask the user for a decryption key to fill the `sequencer.decryption_key` field. The key must be a hex string WITHOUT the "0x" prefix.

+ Since you are a subagent, **never put any task into the background**; keep watching everything until it completes.

# Memory

Your memory directory is `.claude/agent-memory/local-e2e-tester` under your primary working directory. Use it for your memory during the process.

# MEMORY.md

Your MEMORY.md is currently empty. When you save new memories, they will appear here.
44
.claude/agents/skill-tester.md
Normal file
@@ -0,0 +1,44 @@
---
name: "skill-tester"
description: "run unit test process for other agents"
model: opus
color: red
---

Evaluate the behavior of other agents in this project: we ask each agent to generate its working plan and check whether the plan matches our expectations.

# Preparing phase

+ Remove the following files under the root dir (if they exist):
  * local_e2e_plan.md
  * zkvm_prover_runner.md

+ In `tests/prover-e2e`, check whether a symbolic link `conf` exists; if not, create one linking to `tests/prover-e2e/sepolia-galileoV2`.

# Testing phase

Launch the following processes:

+ @"local-e2e-tester" Make a plan for the local e2e test, save it as `local_e2e_plan.md`, do not execute.
+ @"zkvm-prover-runner" Make a plan to use the zkvm prover to handle two batch tasks on the sepolia testnet (id: 0x69454fcc6798d181580431c360c054031fad69da5542ee772e386bf3ec2edf37 and 0x2a98b353ef1c40887c6ae1b11db3f1a9dd99aaf9dc4573c3de21056863ffc1a4), save it as `zkvm_prover_runner.md`, do not execute.

# Verify phase

Read the plans generated in the testing phase; check and report any behavior that is not consistent with the following checklist:

## local_e2e_plan.md

+ The first step is run under `tests/prover-e2e`
+ `coordinator_api` will be launched as a service (that is, in the background)
+ `make test_e2e_run` is called

## zkvm_prover_runner.md

+ All steps **must be** run under `zkvm-prover`
+ A `config.json` file is created
+ A `workset.json` file is created
+ The prover is called (either by directly calling `prover` or via `cargo run`), and is **not** put into the background

# Final phase

Clean up all .md files that have been verified.
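The preparation-phase symlink check can be sketched in shell. The sketch below runs against a scratch directory so it is self-contained; in the repo the link lives in `tests/prover-e2e` and points at `sepolia-galileoV2`:

```shell
# Self-contained sketch of the "create conf symlink if missing" step.
# The scratch path /tmp/prover-e2e-demo is illustrative only.
mkdir -p /tmp/prover-e2e-demo/sepolia-galileoV2
cd /tmp/prover-e2e-demo
if [ ! -e conf ]; then
    ln -s sepolia-galileoV2 conf   # create the link only when absent
fi
readlink conf                      # prints "sepolia-galileoV2"
```

The `[ ! -e conf ]` guard makes the step idempotent, so re-running the preparation phase leaves an existing link untouched.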
25
.claude/agents/zkvm-prover-runner.md
Normal file
@@ -0,0 +1,25 @@
---
name: "zkvm-prover-runner"
description: "Run zkvm prover"
tools: Bash, Edit, EnterWorktree, ExitWorktree, Glob, Grep, Monitor, NotebookEdit, Read, RemoteTrigger, ScheduleWakeup, Skill, TaskCreate, TaskGet, TaskList, TaskUpdate, ToolSearch, WebFetch, WebSearch, Write
model: sonnet
color: green
memory: project
skills: agent-memory, integration-test-helper
---

Set your current directory to `zkvm-prover` and use the integration-test skill to run a prover that handles the tasks specified by the user.

## Notes while handling the e2e test

+ Test the URL of the coordinator first. If it is not accessible, remind the user that the VPN may not be connected correctly.

+ Since you are a subagent, **never put any task into the background**; keep watching everything until it completes.

# Memory directory

Your memory directory is `.claude/agent-memory/zkvm-prover-runner` under your primary working directory. Use it for your memory during the process.

# MEMORY.md

Your MEMORY.md is currently empty. When you save new memories, they will appear here.
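The reachability check in the first note might look like the sketch below. The URL is an assumption (port 9, the discard port, stands in for a real endpoint), not the coordinator's actual address:

```shell
# Hypothetical endpoint: substitute the coordinator URL from your config.
COORDINATOR_URL="http://127.0.0.1:9"
if ! curl -sf --max-time 5 "$COORDINATOR_URL" >/dev/null 2>&1; then
    echo "coordinator unreachable - check that the VPN is connected"
fi
```

Using `-f` makes curl treat HTTP error status as failure, and `--max-time` keeps the check from hanging when the VPN silently drops packets.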
16
.claude/settings.json
Normal file
@@ -0,0 +1,16 @@
{
  "$schema": "https://json.schemastore.org/claude-code-settings.json",
  "env": {},
  "companyAnnouncements": [
    "Welcome to scroll-tech",
    "Just ask me what I can help with"
  ],
  "permissions": {
    "allow": [
      "Bash(pwd)",
      "Bash(ls *)",
      "Bash(cat *)"
    ],
    "deny": []
  }
}
135
.claude/skills/agent-memory/SKILL.md
Normal file
@@ -0,0 +1,135 @@
---
name: agent-memory
description: Instructions about how to build up the memory system for agent
---

# Persistent Agent Memory

You have been given a `memory directory` for a persistent, file-based memory system. This directory already exists — write to it directly with the Write tool (do not run mkdir or check for its existence).

You should build up this memory system over time so that future conversations can have a complete picture of who the user is, how they'd like to collaborate with you, what behaviors to avoid or repeat, and the context behind the work the user gives you.

If the user explicitly asks you to remember something, save it immediately as whichever type fits best. If they ask you to forget something, find and remove the relevant entry.

## Types of memory

There are several discrete types of memory that you can store in your memory system:

<types>
<type>
<name>user</name>
<description>Contains information about the user's role, goals, responsibilities, and knowledge. Great user memories help you tailor your future behavior to the user's preferences and perspective. Your goal in reading and writing these memories is to build up an understanding of who the user is and how you can be most helpful to them specifically. For example, you should collaborate with a senior software engineer differently than a student who is coding for the very first time. Keep in mind that the aim here is to be helpful to the user. Avoid writing memories about the user that could be viewed as a negative judgement or that are not relevant to the work you're trying to accomplish together.</description>
<when_to_save>When you learn any details about the user's role, preferences, responsibilities, or knowledge</when_to_save>
<how_to_use>When your work should be informed by the user's profile or perspective. For example, if the user is asking you to explain a part of the code, you should answer that question in a way that is tailored to the specific details that they will find most valuable or that helps them build their mental model in relation to domain knowledge they already have.</how_to_use>
<examples>
user: I'm a data scientist investigating what logging we have in place
assistant: [saves user memory: user is a data scientist, currently focused on observability/logging]

user: I've been writing Go for ten years but this is my first time touching the React side of this repo
assistant: [saves user memory: deep Go expertise, new to React and this project's frontend — frame frontend explanations in terms of backend analogues]
</examples>
</type>
<type>
<name>feedback</name>
<description>Guidance the user has given you about how to approach work — both what to avoid and what to keep doing. These are a very important type of memory to read and write as they allow you to remain coherent and responsive to the way you should approach work in the project. Record from failure AND success: if you only save corrections, you will avoid past mistakes but drift away from approaches the user has already validated, and may grow overly cautious.</description>
<when_to_save>Any time the user corrects your approach ("no not that", "don't", "stop doing X") OR confirms a non-obvious approach worked ("yes exactly", "perfect, keep doing that", accepting an unusual choice without pushback). Corrections are easy to notice; confirmations are quieter — watch for them. In both cases, save what is applicable to future conversations, especially if surprising or not obvious from the code. Include *why* so you can judge edge cases later.</when_to_save>
<how_to_use>Let these memories guide your behavior so that the user does not need to offer the same guidance twice.</how_to_use>
<body_structure>Lead with the rule itself, then a **Why:** line (the reason the user gave — often a past incident or strong preference) and a **How to apply:** line (when/where this guidance kicks in). Knowing *why* lets you judge edge cases instead of blindly following the rule.</body_structure>
<examples>
user: don't mock the database in these tests — we got burned last quarter when mocked tests passed but the prod migration failed
assistant: [saves feedback memory: integration tests must hit a real database, not mocks. Reason: prior incident where mock/prod divergence masked a broken migration]

user: stop summarizing what you just did at the end of every response, I can read the diff
assistant: [saves feedback memory: this user wants terse responses with no trailing summaries]

user: yeah the single bundled PR was the right call here, splitting this one would've just been churn
assistant: [saves feedback memory: for refactors in this area, user prefers one bundled PR over many small ones. Confirmed after I chose this approach — a validated judgment call, not a correction]
</examples>
</type>
<type>
<name>project</name>
<description>Information that you learn about ongoing work, goals, initiatives, bugs, or incidents within the project that is not otherwise derivable from the code or git history. Project memories help you understand the broader context and motivation behind the work the user is doing within this working directory.</description>
<when_to_save>When you learn who is doing what, why, or by when. These states change relatively quickly so try to keep your understanding of this up to date. Always convert relative dates in user messages to absolute dates when saving (e.g., "Thursday" → "2026-03-05"), so the memory remains interpretable after time passes.</when_to_save>
<how_to_use>Use these memories to more fully understand the details and nuance behind the user's request and make better informed suggestions.</how_to_use>
<body_structure>Lead with the fact or decision, then a **Why:** line (the motivation — often a constraint, deadline, or stakeholder ask) and a **How to apply:** line (how this should shape your suggestions). Project memories decay fast, so the why helps future-you judge whether the memory is still load-bearing.</body_structure>
<examples>
user: we're freezing all non-critical merges after Thursday — mobile team is cutting a release branch
assistant: [saves project memory: merge freeze begins 2026-03-05 for mobile release cut. Flag any non-critical PR work scheduled after that date]

user: the reason we're ripping out the old auth middleware is that legal flagged it for storing session tokens in a way that doesn't meet the new compliance requirements
assistant: [saves project memory: auth middleware rewrite is driven by legal/compliance requirements around session token storage, not tech-debt cleanup — scope decisions should favor compliance over ergonomics]
</examples>
</type>
<type>
<name>reference</name>
<description>Stores pointers to where information can be found in external systems. These memories allow you to remember where to look to find up-to-date information outside of the project directory.</description>
<when_to_save>When you learn about resources in external systems and their purpose. For example, that bugs are tracked in a specific project in Linear or that feedback can be found in a specific Slack channel.</when_to_save>
<how_to_use>When the user references an external system or information that may be in an external system.</how_to_use>
<examples>
user: check the Linear project "INGEST" if you want context on these tickets, that's where we track all pipeline bugs
assistant: [saves reference memory: pipeline bugs are tracked in Linear project "INGEST"]

user: the Grafana board at grafana.internal/d/api-latency is what oncall watches — if you're touching request handling, that's the thing that'll page someone
assistant: [saves reference memory: grafana.internal/d/api-latency is the oncall latency dashboard — check it when editing request-path code]
</examples>
</type>
</types>

## What NOT to save in memory

- Code patterns, conventions, architecture, file paths, or project structure — these can be derived by reading the current project state.
- Git history, recent changes, or who-changed-what — `git log` / `git blame` are authoritative.
- Debugging solutions or fix recipes — the fix is in the code; the commit message has the context.
- Anything already documented in CLAUDE.md files.
- Ephemeral task details: in-progress work, temporary state, current conversation context.

These exclusions apply even when the user explicitly asks you to save. If they ask you to save a PR list or activity summary, ask what was *surprising* or *non-obvious* about it — that is the part worth keeping.

## How to save memories

Saving a memory is a two-step process:

**Step 1** — write the memory to its own file (e.g., `user_role.md`, `feedback_testing.md`) using this frontmatter format:

```markdown
---
name: {{memory name}}
description: {{one-line description — used to decide relevance in future conversations, so be specific}}
type: {{user, feedback, project, reference}}
---

{{memory content — for feedback/project types, structure as: rule/fact, then **Why:** and **How to apply:** lines}}
```

**Step 2** — add a pointer to that file in `MEMORY.md`. `MEMORY.md` is an index, not a memory — each entry should be one line, under ~150 characters: `- [Title](file.md) — one-line hook`. It has no frontmatter. Never write memory content directly into `MEMORY.md`.

- `MEMORY.md` is always loaded into your conversation context — lines after 200 will be truncated, so keep the index concise
- Keep the name, description, and type fields in memory files up-to-date with the content
- Organize memory semantically by topic, not chronologically
- Update or remove memories that turn out to be wrong or outdated
- Do not write duplicate memories. First check whether there is an existing memory you can update before writing a new one.

## When to access memories

- When memories seem relevant, or the user references prior-conversation work.
- You MUST access memory when the user explicitly asks you to check, recall, or remember.
- If the user says to *ignore* or *not use* memory: do not apply remembered facts, cite, compare against, or mention memory content.
- Memory records can become stale over time. Use memory as context for what was true at a given point in time. Before answering the user or building assumptions based solely on information in memory records, verify that the memory is still correct and up-to-date by reading the current state of the files or resources. If a recalled memory conflicts with current information, trust what you observe now — and update or remove the stale memory rather than acting on it.

## Before recommending from memory

A memory that names a specific function, file, or flag is a claim that it existed *when the memory was written*. It may have been renamed, removed, or never merged. Before recommending it:

- If the memory names a file path: check the file exists.
- If the memory names a function or flag: grep for it.
- If the user is about to act on your recommendation (not just asking about history), verify first.

"The memory says X exists" is not the same as "X exists now."

A memory that summarizes repo state (activity logs, architecture snapshots) is frozen in time. If the user asks about *recent* or *current* state, prefer `git log` or reading the code over recalling the snapshot.

## Memory and other forms of persistence

Memory is one of several persistence mechanisms available to you as you assist the user in a given conversation. The distinction is that memory can be recalled in future conversations and should not be used for persisting information that is only useful within the scope of the current conversation.

- When to use or update a plan instead of memory: if you are about to start a non-trivial implementation task and would like to reach alignment with the user on your approach, use a Plan rather than saving this information to memory. Similarly, if you already have a plan within the conversation and you have changed your approach, persist that change by updating the plan rather than saving a memory.
- When to use or update tasks instead of memory: when you need to break the work in the current conversation into discrete steps or keep track of your progress, use tasks instead of saving to memory. Tasks are great for persisting information about the work that needs to be done in the current conversation, but memory should be reserved for information that will be useful in future conversations.

- Since this memory is project-scoped and shared with your team via version control, tailor your memories to this project
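Following the two-step format above, a saved memory file and its index entry might look like this; the file name, memory name, and content are illustrative, not taken from a real session:

```markdown
<!-- .claude/agent-memory/<agent>/user_role.md -->
---
name: user-role
description: User is a data scientist focused on observability/logging
type: user
---

User is a data scientist, currently investigating the project's logging coverage.

<!-- .claude/agent-memory/<agent>/MEMORY.md — index only, one line per memory -->
- [User role](user_role.md) — data scientist focused on observability/logging
```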
36
.claude/skills/db-query/SKILL.md
Normal file
@@ -0,0 +1,36 @@
---
name: db-query
description: Query the database for common tasks
model: sonnet
allowed-tools: Bash(psql *)
---

The user may want to know about the status of L2 data blocks and proving tasks; the following is their request:

$ARGUMENTS

(If you find there is nothing in the request above, just say "nothing to do" and stop.)

You should already know the data schema of our database; if not, read it from the `.sql` files under `database/migrate/migrations`.

According to the user's request, generate the corresponding SQL expression and query the database. For example, if the user asks "list the assigned chunks", it means "query records from the `chunk` table with proving_status=2 (assigned)", i.e. the SQL expression 'SELECT * from chunk where proving_status=2;'. If the request is not clear, you can ask the user which column they mean, and list some possible options.

The generated SQL MUST obey the following rules:

+ Limit the number of records to 20, unless the user explicitly specifies otherwise, like "show me ALL chunks".
+ The following columns cannot be read by humans and contain very large texts; they MUST be excluded from the SQL expression:
  + For all tables, any column named "proof"
  + "header" and "transactions" in the `l2_block` table
  + "calldata" in `l1_message`
+ Always omit the `deleted_at` column; never include it in a query or use it in a where condition.
+ Without explicit specification, the records should be ordered by the `updated_at` column, most recent first.

Once you have decided on the SQL expression, always print it out.

Use the psql client to query our PostgreSQL db. When launching psql, always pass the "-w" option, and use "-o" to send all output to a `query_report.txt` file under the system's temporary dir, like /tmp. You MUST NOT read the generated report.

If psql fails due to authentication, guide the user to prepare their `.pgpass` file under their home dir.

You should already know the endpoint of the database, in the form of a PostgreSQL DSN. If not, try to read it from the `db.dsn` field inside `coordinator/build/bin/conf/config.json`. If you still cannot get it, ask via Ask User Question to get the endpoint.
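Under the rules above, the "list the assigned chunks" example would be built and run roughly as in this sketch. The selected column names and the DSN are assumptions, not taken from the real schema:

```shell
# Hypothetical sketch: the column list and the DSN are assumptions.
SQL='SELECT hash, proving_status, updated_at FROM chunk WHERE proving_status = 2 ORDER BY updated_at DESC LIMIT 20;'
echo "$SQL"   # rule: always print the SQL expression first
# Then run it unattended (-w) with all output sent to the temp-dir report,
# which must not be read afterwards:
#   psql -w -o /tmp/query_report.txt "$DSN" -c "$SQL"
```

Note how the expression bakes in the rules: an explicit column list (no "proof" or `deleted_at`), `ORDER BY updated_at DESC`, and `LIMIT 20`.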
51
.claude/skills/integration-test-helper/SKILL.md
Normal file
@@ -0,0 +1,51 @@
---
name: integration-test-helper
description: Helps launch the full process of an integration test, and also investigates and reports the results.
---

## Target directory

The whole process should be run under the current directory, unless one is specified via ($ARGUMENTS[0]).
Under the target dir you will find the materials and instructions.

## Instructions

First read `README.md` under the target directory; the instructions should be under the heading named ($ARGUMENTS[1]). If there is no such heading, just try the "Test" heading.

## Run each step listed in instructions

The instructions often contain multiple steps which should be completed in sequence. The following rules MUST be obeyed while handling each step:

### "Must do" while executing commands in steps

Any command mentioned in the steps should be executed via the Bash tool, with the following MUST DOs for handling the outputs:

+ Redirect the output of the Bash tool, from both stdout and stderr, into a local log file for later investigation. The file name should be in the format `<desc_of_command>_<day>_<time>.log`.
+ Do not read the whole log file. Just investigate the last 50 lines (use "tail -n 50") for possible error messages.

It may be necessary to jump to other directories to execute a step. We MUST go back to the target directory after every step has completed. Also, DO NOT change anything outside of the target directory.
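The per-step output rules amount to a pattern like the sketch below, where `step_command` is a hypothetical stand-in for whatever command a README step specifies:

```shell
# "step_command" is a placeholder for the actual step's command.
step_command() { echo "step output"; }

log="step_command_$(date +%Y%m%d)_$(date +%H%M%S).log"
step_command >"$log" 2>&1      # capture stdout and stderr together
tail -n 50 "$log"              # inspect only the tail, never the whole file
```

Keeping the full output on disk while reading only the tail preserves the evidence for later error analysis without flooding the conversation context.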
### When an error is raised

Command execution should return success. If an error is raised during execution, follow this process:

1. Try to analyze the reason for the error, first from the caught error message. If there is not enough data, grep useful information from the log file of the whole output just captured.

2. You MUST ASK the USER for the next action; the options are:
   + Retry with a resolution derived from the error analysis
   + Retry, with the user providing tips to resolve the issue
   + Just retry; the user has resolved the issue themselves
   + Stop here, discard the current and following steps, and go to "After completion"

Errors are often caused by some mismatch of configuration on the current host. Here are some tips which may help:

* Install the missing tools / libs via the package manager
* Fix the typo, or fill in missing fields in configuration files
* Copy missing files; they may simply be somewhere in the project, or can be downloaded according to some documents.

## After completion

When every step is done, or the process is stopped by the user, produce the following materials before stopping:

+ Package all log files generated so far into a tarball and save it in a temporary path. Then clear all log files.
+ Generate a report file under the target directory, with a file name like `report_<day>_<time>.txt`.
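The packaging step can be sketched as follows; it is shown in a scratch directory with a stand-in log file so it is self-contained, whereas in practice it runs over the logs accumulated in the target directory:

```shell
# Self-contained demo of "package logs into a tarball, then clear them".
mkdir -p /tmp/e2e-report-demo && cd /tmp/e2e-report-demo
echo "demo log line" > step_demo.log            # stand-in log file
tarball="/tmp/e2e_logs_$(date +%Y%m%d)_$(date +%H%M%S).tar.gz"
tar czf "$tarball" ./*.log                      # package all logs
rm -f ./*.log                                   # then clear them
ls "$tarball"
```

Deleting the logs only after `tar` succeeds (the `rm` runs on the next line, so a failed `tar` would be visible first) avoids losing evidence if packaging fails.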
8
.gitignore
vendored
@@ -23,5 +23,11 @@ coverage.txt
 sftp-config.json
 *~
+
+# AI skills
+**/experience
+
+# AI skill generate report
+report_*.txt
+
 target
-zkvm-prover/config.json
+zkvm-prover/*.json
1
CLAUDE.md
Normal file
@@ -0,0 +1 @@
The mono repo for scroll-tech's services. See @README.md to know about the project.
1433
Cargo.lock
generated
File diff suppressed because it is too large
10
Cargo.toml
@@ -14,12 +14,13 @@ edition = "2021"
 homepage = "https://scroll.io"
 readme = "README.md"
 repository = "https://github.com/scroll-tech/scroll"
-version = "4.7.1"
+version = "4.7.12"
 
 [workspace.dependencies]
-scroll-zkvm-prover = { git = "https://github.com/scroll-tech/zkvm-prover", tag = "v0.7.1" }
-scroll-zkvm-verifier = { git = "https://github.com/scroll-tech/zkvm-prover", tag = "v0.7.1" }
-scroll-zkvm-types = { git = "https://github.com/scroll-tech/zkvm-prover", tag = "v0.7.1" }
+# with patched openvm 1.4.3 (zkvm-prover/feat/v0.7.3)
+scroll-zkvm-prover = { git = "https://github.com/scroll-tech/zkvm-prover", rev = "9e38e2e" }
+scroll-zkvm-verifier = { git = "https://github.com/scroll-tech/zkvm-prover", rev = "9e38e2e" }
+scroll-zkvm-types = { git = "https://github.com/scroll-tech/zkvm-prover", rev = "9e38e2e" }
 
 sbv-primitives = { git = "https://github.com/scroll-tech/stateless-block-verifier", tag = "scroll-v91.2", features = ["scroll", "rkyv"] }
 sbv-utils = { git = "https://github.com/scroll-tech/stateless-block-verifier", tag = "scroll-v91.2" }
@@ -46,7 +47,6 @@ eyre = "0.6"
 once_cell = "1.20"
 base64 = "0.22"
-
 
 [patch.crates-io]
 revm = { git = "https://github.com/scroll-tech/revm", tag = "scroll-v91" }
 revm-bytecode = { git = "https://github.com/scroll-tech/revm", tag = "scroll-v91" }
@@ -34,7 +34,7 @@ coordinator_cron:
 coordinator_tool:
 	go build -ldflags "-X scroll-tech/common/version.ZkVersion=${ZK_VERSION}" -o $(PWD)/build/bin/coordinator_tool ./cmd/tool
 
-localsetup: coordinator_api ## Local setup: build coordinator_api, copy config, and setup releases
+localsetup: libzkp coordinator_api ## Local setup: build coordinator_api, copy config, and setup releases
 	mkdir -p build/bin/conf
 	@echo "Copying configuration files..."
 	@if [ -f "$(PWD)/conf/config.template.json" ]; then \
|||||||
@@ -13,22 +13,8 @@ use serde_json::value::RawValue;
 use std::{collections::HashMap, path::Path, sync::OnceLock};
 use tasks::chunk_interpreter::{ChunkInterpreter, TryFromWithInterpreter};
 
-pub(crate) fn witness_use_legacy_mode(fork_name: &str) -> eyre::Result<bool> {
-    ADDITIONAL_FEATURES
-        .get()
-        .and_then(|features| features.get(fork_name))
-        .map(|cfg| cfg.legacy_witness_encoding)
-        .ok_or_else(|| {
-            eyre::eyre!(
-                "can not find features setting for unrecognized fork {}",
-                fork_name
-            )
-        })
-}
-
 #[derive(Debug, Default, Clone)]
 struct FeatureOptions {
-    legacy_witness_encoding: bool,
     for_openvm_13_prover: bool,
 }
 
@@ -41,11 +27,10 @@ impl FeatureOptions {
         for feat_s in feats.split(':') {
             match feat_s.trim().to_lowercase().as_str() {
                 "legacy_witness" => {
-                    tracing::info!("set witness encoding for legacy mode");
-                    ret.legacy_witness_encoding = true;
+                    tracing::warn!("legacy witness is no longer supported");
                 }
                 "openvm_13" => {
-                    tracing::info!("set prover should use openvm 13");
+                    tracing::warn!("set prover should use openvm 13");
                     ret.for_openvm_13_prover = true;
                 }
                 s => tracing::warn!("unrecognized dynamic feature: {s}"),
@@ -4,11 +4,9 @@ use crate::utils::short_git_version;
 use eyre::Result;
 use sbv_primitives::B256;
 use scroll_zkvm_types::{
-    batch::BatchInfo,
-    bundle::BundleInfo,
-    chunk::ChunkInfo,
     proof::{EvmProof, OpenVmEvmProof, ProofEnum, StarkProof},
     public_inputs::MultiVersionPublicInputs,
+    scroll::{batch::BatchInfo, bundle::BundleInfo, chunk::ChunkInfo},
     types_agg::AggregationInput,
     utils::{serialize_vk, vec_as_base64},
     version,
@@ -215,7 +213,7 @@ impl<Metadata: ProofMetadata> PersistableProof for WrappedProof<Metadata> {
 mod tests {
     use base64::{prelude::BASE64_STANDARD, Engine};
     use sbv_primitives::B256;
-    use scroll_zkvm_types::{bundle::BundleInfo, proof::EvmProof};
+    use scroll_zkvm_types::{proof::EvmProof, scroll::bundle::BundleInfo};
 
     use super::*;
 
@@ -253,7 +251,7 @@ mod tests {
             msg_queue_hash: B256::repeat_byte(6),
             encryption_key: None,
         };
-        let bundle_pi_hash = bundle_info.pi_hash_euclidv1();
+        let bundle_pi_hash = bundle_info.pi_hash_by_version(version::Version::euclid_v1());
         BundleProofMetadata {
             bundle_info,
             bundle_pi_hash,
@@ -2,15 +2,16 @@ use c_kzg::Bytes48;
 use eyre::Result;
 use sbv_primitives::{B256, U256};
 use scroll_zkvm_types::{
-    batch::{
-        build_point_eval_witness, BatchHeader, BatchHeaderV6, BatchHeaderV7, BatchHeaderValidium,
-        BatchInfo, BatchWitness, Envelope, EnvelopeV6, EnvelopeV7, LegacyBatchWitness,
-        ReferenceHeader, N_BLOB_BYTES,
-    },
-    chunk::ChunkInfo,
     public_inputs::{ForkName, MultiVersionPublicInputs, Version},
+    scroll::{
+        batch::{
+            build_point_eval_witness, BatchHeader, BatchHeaderV6, BatchHeaderV7,
+            BatchHeaderValidium, BatchInfo, BatchWitness, Envelope, EnvelopeV6, EnvelopeV7,
+            ReferenceHeader, N_BLOB_BYTES,
+        },
+        chunk::ChunkInfo,
+    },
     task::ProvingTask,
-    utils::{to_rkyv_bytes, RancorError},
     version::{Codec, Domain, STFVersion},
 };
 
@@ -118,12 +119,7 @@ pub struct BatchProvingTask {
 impl BatchProvingTask {
     pub fn into_proving_task_with_precheck(self) -> Result<(ProvingTask, BatchInfo, B256)> {
         let (witness, metadata, batch_pi_hash) = self.precheck()?;
-        let serialized_witness = if crate::witness_use_legacy_mode(&self.fork_name)? {
-            let legacy_witness = LegacyBatchWitness::from(witness);
-            to_rkyv_bytes::<RancorError>(&legacy_witness)?.into_vec()
-        } else {
-            super::encode_task_to_witness(&witness)?
-        };
+        let serialized_witness = super::encode_task_to_witness(&witness)?;
 
         let proving_task = ProvingTask {
             identifier: self.batch_header.batch_hash().to_string(),
@@ -1,10 +1,9 @@
 use eyre::Result;
 use sbv_primitives::B256;
 use scroll_zkvm_types::{
-    bundle::{BundleInfo, BundleWitness, LegacyBundleWitness},
     public_inputs::{MultiVersionPublicInputs, Version},
+    scroll::bundle::{BundleInfo, BundleWitness},
     task::ProvingTask,
-    utils::{to_rkyv_bytes, RancorError},
 };
 
 use crate::proofs::BatchProof;
@@ -27,12 +26,7 @@ pub struct BundleProvingTask {
 impl BundleProvingTask {
     pub fn into_proving_task_with_precheck(self) -> Result<(ProvingTask, BundleInfo, B256)> {
         let (witness, bundle_info, bundle_pi_hash) = self.precheck()?;
-        let serialized_witness = if crate::witness_use_legacy_mode(&self.fork_name)? {
-            let legacy = LegacyBundleWitness::from(witness);
-            to_rkyv_bytes::<RancorError>(&legacy)?.into_vec()
-        } else {
-            super::encode_task_to_witness(&witness)?
-        };
+        let serialized_witness = super::encode_task_to_witness(&witness)?;
 
         let proving_task = ProvingTask {
             identifier: self.identifier(),
@@ -2,10 +2,9 @@ use eyre::Result;
 use sbv_core::BlockWitness;
 use sbv_primitives::{types::consensus::BlockHeader, B256};
 use scroll_zkvm_types::{
-    chunk::{execute, ChunkInfo, ChunkWitness, LegacyChunkWitness, ValidiumInputs},
     public_inputs::{MultiVersionPublicInputs, Version},
+    scroll::chunk::{execute, ChunkInfo, ChunkWitness, ValidiumInputs},
     task::ProvingTask,
-    utils::{to_rkyv_bytes, RancorError},
 };
 
 use super::chunk_interpreter::*;
@@ -117,12 +116,7 @@ impl ChunkProvingTask {
 
     pub fn into_proving_task_with_precheck(self) -> Result<(ProvingTask, ChunkInfo, B256)> {
         let (witness, chunk_info, chunk_pi_hash) = self.precheck()?;
-        let serialized_witness = if crate::witness_use_legacy_mode(&self.fork_name)? {
-            let legacy_witness = LegacyChunkWitness::from(witness);
-            to_rkyv_bytes::<RancorError>(&legacy_witness)?.into_vec()
-        } else {
-            super::encode_task_to_witness(&witness)?
-        };
+        let serialized_witness = super::encode_task_to_witness(&witness)?;
 
         let proving_task = ProvingTask {
             identifier: self.identifier(),
@@ -9,7 +9,7 @@ edition.workspace = true
 scroll-zkvm-types.workspace = true
 scroll-zkvm-prover.workspace = true
 libzkp = { path = "../libzkp"}
-scroll-proving-sdk = { git = "https://github.com/scroll-tech/scroll-proving-sdk.git", rev = "05648db" }
+scroll-proving-sdk = { git = "https://github.com/scroll-tech/scroll-proving-sdk.git", rev = "4266daa" }
 serde.workspace = true
 serde_json.workspace = true
 once_cell.workspace =true
@@ -20,20 +20,19 @@ eyre.workspace = true
 futures = "0.3.30"
 futures-util = "0.3"
 
-reqwest = { version = "0.12.4", features = ["gzip", "stream"] }
-reqwest-middleware = "0.3"
-reqwest-retry = "0.5"
+reqwest = { version = "0.12", features = ["gzip", "stream"] }
 hex = "0.4.3"
 
 rand = "0.8.5"
 tokio = "1.37.0"
 async-trait = "0.1"
 sled = "0.34.7"
-http = "1.1.0"
+http = "1.4"
 clap = { version = "4.5", features = ["derive"] }
 ctor = "0.2.8"
 url = { version = "2.5.4", features = ["serde",] }
 serde_bytes = "0.11.15"
+bincode = { version = "2.0", features = ["serde",] }
 
 [features]
 default = []
crates/prover-bin/src/dumper.rs (new file)
@@ -0,0 +1,88 @@
+use async_trait::async_trait;
+use libzkp::ProvingTaskExt;
+use scroll_proving_sdk::prover::{
+    proving_service::{
+        GetVkRequest, GetVkResponse, ProveRequest, ProveResponse, QueryTaskRequest,
+        QueryTaskResponse, TaskStatus,
+    },
+    ProvingService,
+};
+use scroll_zkvm_types::ProvingTask;
+
+#[derive(Default)]
+pub struct Dumper {
+    #[allow(dead_code)]
+    pub target_path: String,
+    pub json_mode: bool,
+}
+
+impl Dumper {
+    fn dump(&self, input_string: &str) -> eyre::Result<()> {
+        let task: ProvingTaskExt = serde_json::from_str(input_string)?;
+        let task = ProvingTask::from(task);
+
+        if self.json_mode {
+            let file = std::fs::File::create("input_task.json")?;
+            serde_json::to_writer(std::io::BufWriter::new(file), &task)?;
+        } else {
+            // stream-encode serialized_witness to input_task.bin using bincode 2.0
+            let input_file = std::fs::File::create("input_task.bin")?;
+            let mut input_writer = std::io::BufWriter::new(input_file);
+            bincode::encode_into_std_write(
+                &task.serialized_witness,
+                &mut input_writer,
+                bincode::config::standard(),
+            )?;
+
+            // stream-encode aggregated_proofs to agg_proofs.bin using bincode 2.0
+            let agg_file = std::fs::File::create("agg_proofs.bin")?;
+            let mut agg_writer = std::io::BufWriter::new(agg_file);
+            for proof in &task.aggregated_proofs {
+                let sz = bincode::serde::encode_into_std_write(
+                    &proof.proofs,
+                    &mut agg_writer,
+                    bincode::config::standard(),
+                )?;
+                println!("written {sz} bytes for proof");
+            }
+        }
+
+        Ok(())
+    }
+}
+
+#[async_trait]
+impl ProvingService for Dumper {
+    fn is_local(&self) -> bool {
+        true
+    }
+
+    async fn get_vks(&self, _: GetVkRequest) -> GetVkResponse {
+        // get vk has been deprecated in new prover with dynamic asset loading scheme
+        GetVkResponse {
+            vks: vec![],
+            error: None,
+        }
+    }
+
+    async fn prove(&mut self, req: ProveRequest) -> ProveResponse {
+        let error = if let Err(e) = self.dump(&req.input) {
+            Some(format!("failed to dump: {}", e))
+        } else {
+            None
+        };
+
+        ProveResponse {
+            status: TaskStatus::Failed,
+            error,
+            ..Default::default()
+        }
+    }
+
+    async fn query_task(&mut self, req: QueryTaskRequest) -> QueryTaskResponse {
+        QueryTaskResponse {
+            task_id: req.task_id,
+            status: TaskStatus::Failed,
+            error: Some("dump file finished but need a fail return to exit".to_string()),
+            ..Default::default()
+        }
+    }
+}
@@ -1,12 +1,13 @@
+mod dumper;
 mod prover;
 mod types;
 mod zk_circuits_handler;
 
-use clap::{ArgAction, Parser, Subcommand};
+use clap::{ArgAction, Parser, Subcommand, ValueEnum};
 use prover::{LocalProver, LocalProverConfig};
 use scroll_proving_sdk::{
     prover::{types::ProofType, ProverBuilder},
-    utils::{get_version, init_tracing},
+    utils::{VERSION, init_tracing},
 };
 use std::{fs::File, io::BufReader, path::Path};
 
@@ -32,12 +33,35 @@ struct Args {
     command: Option<Commands>,
 }
 
+#[derive(Clone, Debug, PartialEq, Eq, ValueEnum)]
+enum TaskType {
+    Chunk,
+    Batch,
+    Bundle,
+}
+
+impl From<TaskType> for ProofType {
+    fn from(value: TaskType) -> Self {
+        match value {
+            TaskType::Chunk => ProofType::Chunk,
+            TaskType::Batch => ProofType::Batch,
+            TaskType::Bundle => ProofType::Bundle,
+        }
+    }
+}
+
 #[derive(Subcommand, Debug)]
 enum Commands {
     Handle {
         /// path to save the verifier's asset
         task_path: String,
     },
+    Dump {
+        #[arg(long = "json", default_value = "false")]
+        json_mode: bool,
+        task_type: TaskType,
+        task_id: String,
+    },
 }
 
 #[derive(Debug, serde::Deserialize)]
@@ -47,14 +71,14 @@ struct HandleSet {
     bundles: Vec<String>,
 }
 
-#[tokio::main]
+#[tokio::main(flavor = "current_thread")]
 async fn main() -> eyre::Result<()> {
     init_tracing();
 
     let args = Args::parse();
 
     if args.version {
-        println!("version is {}", get_version());
+        println!("version is {}", VERSION);
         std::process::exit(0);
     }
 
@@ -63,6 +87,26 @@ async fn main() -> eyre::Result<()> {
     let local_prover = LocalProver::new(cfg.clone());
 
     match args.command {
+        Some(Commands::Dump {
+            json_mode,
+            task_type,
+            task_id,
+        }) => {
+            let prover = ProverBuilder::new(
+                sdk_config,
+                dumper::Dumper {
+                    json_mode,
+                    ..Default::default()
+                },
+            )
+            .build()
+            .await
+            .map_err(|e| eyre::eyre!("build prover fail: {e}"))?;
+
+            std::sync::Arc::new(prover)
+                .one_shot(&[task_id], task_type.into())
+                .await;
+        }
         Some(Commands::Handle { task_path }) => {
            let file = File::open(Path::new(&task_path))?;
            let reader = BufReader::new(file);
@@ -30,6 +30,9 @@ pub struct AssetsLocationData {
     #[serde(default)]
     /// a altered url for specififed vk
     pub asset_detours: HashMap<String, url::Url>,
+    /// when asset file existed, do not verify from network, help for debugging stuffs
+    #[serde(default)]
+    pub debug_mode: bool,
 }
 
 impl AssetsLocationData {
@@ -79,6 +82,13 @@ impl AssetsLocationData {
         // Get file metadata to check size
         if let Ok(metadata) = std::fs::metadata(&local_file_path) {
             // Make a HEAD request to get remote file size
+            if self.debug_mode {
+                println!(
+                    "File {} already exists, skipping download under debugmode",
+                    filename
+                );
+                continue;
+            }
+
             if let Ok(head_resp) = client.head(download_url.clone()).send().await {
                 if let Some(content_length) = head_resp.headers().get("content-length") {
@@ -201,12 +211,20 @@ impl ProvingService for LocalProver {
                 error: Some(format!("proving task failed: {}", e)),
                 ..Default::default()
             },
-            Err(e) => QueryTaskResponse {
-                task_id: req.task_id,
-                status: TaskStatus::Failed,
-                error: Some(format!("proving task panicked: {}", e)),
-                ..Default::default()
-            },
+            Err(e) => {
+                if e.is_panic() {
+                    // simply re-throw panic for any panicking in proving process,
+                    // cause worker loop and the whole prover exit
+                    std::panic::resume_unwind(e.into_panic());
+                }
+
+                QueryTaskResponse {
+                    task_id: req.task_id,
+                    status: TaskStatus::Failed,
+                    error: Some(format!("proving task failed: {}", e)),
+                    ..Default::default()
+                }
+            }
         };
     } else {
         return QueryTaskResponse {
@@ -273,7 +291,9 @@ impl LocalProver {
         let created_at = duration.as_secs() as f64 + duration.subsec_nanos() as f64 * 1e-9;
 
         let prover_task = UniversalHandler::get_task_from_input(&req.input)?;
-        let is_openvm_13 = prover_task.use_openvm_13;
+        if prover_task.use_openvm_13 {
+            eyre::bail!("prover do not support snark params base on openvm 13");
+        }
         let prover_task: ProvingTask = prover_task.into();
         let vk = hex::encode(&prover_task.vk);
         let handler = if let Some(handler) = self.handlers.get(&vk) {
@@ -300,10 +320,7 @@ impl LocalProver {
                 .location_data
                 .get_asset(&vk, &url_base, &base_config.workspace_path)
                 .await?;
-            let circuits_handler = Arc::new(Mutex::new(UniversalHandler::new(
-                &asset_path,
-                is_openvm_13,
-            )?));
+            let circuits_handler = Arc::new(Mutex::new(UniversalHandler::new(&asset_path)?));
             self.handlers.insert(vk, circuits_handler.clone());
             circuits_handler
         };
@@ -16,15 +16,14 @@ pub struct UniversalHandler {
 unsafe impl Send for UniversalHandler {}
 
 impl UniversalHandler {
-    pub fn new(workspace_path: impl AsRef<Path>, is_openvm_v13: bool) -> Result<Self> {
+    pub fn new(workspace_path: impl AsRef<Path>) -> Result<Self> {
         let path_app_exe = workspace_path.as_ref().join("app.vmexe");
         let path_app_config = workspace_path.as_ref().join("openvm.toml");
-        let segment_len = Some((1 << 21) - 100);
+        let segment_len = Some((1 << 22) - 100);
         let config = ProverConfig {
             path_app_config,
             path_app_exe,
             segment_len,
-            is_openvm_v13,
         };
 
         let prover = Prover::setup(config, None)?;
@@ -139,23 +139,24 @@ func initLeadingChunk(ctx context.Context, db *gorm.DB, beginBlk, endBlk uint64,
 		return nil
 	}
 
-	var l1MsgPoppedBefore uint64
 	blks, err := blockOrm.GetL2BlocksGEHeight(ctx, beginBlk, int(endBlk-beginBlk+1))
 	if err != nil {
 		return err
 	}
-	for i, block := range blks {
-		for _, tx := range block.Transactions {
-			if tx.Type == types.L1MessageTxType {
-				l1MsgPoppedBefore = tx.Nonce
-				log.Info("search first l1 nonce", "index", l1MsgPoppedBefore, "blk", beginBlk+uint64(i))
-				break
-			}
-		}
-		if l1MsgPoppedBefore != 0 {
-			break
-		}
-	}
+	// search l1 message and derive the popped l1 msg from nonce
+	// if the nonce of first l1 msg is 0, or no l1 msg raised in target blocks, we do not need the leading chunk
+	l1MsgPoppedBefore := func() uint64 {
+		for i, block := range blks {
+			for _, tx := range block.Transactions {
+				if tx.Type == types.L1MessageTxType {
+					log.Info("search first l1 nonce", "index", tx.Nonce, "blk", beginBlk+uint64(i))
+					return tx.Nonce
+				}
+			}
+		}
+		return 0
+	}()
 
 	if l1MsgPoppedBefore == 0 {
 		log.Info("no l1 message in target blks, no need for leading chunk")
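The refactor above replaces a mutable accumulator plus `break` with an immediately-invoked closure that returns the first matching nonce. A minimal, self-contained sketch of the same pattern (the `Transaction` type and tx-type constant here are assumptions for illustration, not the repo's actual types):

```go
package main

import "fmt"

// Transaction is a hypothetical stand-in for the block transaction type.
type Transaction struct {
	Type  uint8
	Nonce uint64
}

// l1MessageTxType mirrors types.L1MessageTxType; the value is illustrative.
const l1MessageTxType uint8 = 0x7e

// firstL1Nonce returns the nonce of the first L1 message found across the
// blocks, or 0 when none exists, using an immediately-invoked closure so
// the early return replaces nested break/flag bookkeeping.
func firstL1Nonce(blocks [][]Transaction) uint64 {
	return func() uint64 {
		for _, txs := range blocks {
			for _, tx := range txs {
				if tx.Type == l1MessageTxType {
					return tx.Nonce
				}
			}
		}
		return 0
	}()
}

func main() {
	blocks := [][]Transaction{
		{{Type: 0x02, Nonce: 9}},
		{{Type: l1MessageTxType, Nonce: 42}},
	}
	fmt.Println(firstL1Nonce(blocks)) // prints 42
	fmt.Println(firstL1Nonce(nil))    // prints 0
}
```

The closure form also lets `l1MsgPoppedBefore` stay a single-assignment variable, which is why the diff drops the earlier `var` declaration.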
tests/prover-e2e/.gitignore (vendored)
@@ -1,3 +1,5 @@
 build/*
 testset.json
 conf
+*.log
+*.txt
@@ -66,5 +66,5 @@ import_data: build/bin/e2e_tool check_vars migration_blocks
 reimport_data: reset_db import_data
 
 coordinator_setup:
-	SCROLL_FORK_NAME=${SCROLL_FORK_NAME} $(MAKE) -C ../../coordinator localsetup
+	SCROLL_ZKVM_VERSION=${SCROLL_ZKVM_VERSION} SCROLL_FORK_NAME=${SCROLL_FORK_NAME} $(MAKE) -C ../../coordinator localsetup
 	cp -f conf/genesis.json ../../coordinator/build/bin/conf
@@ -1,16 +1,24 @@
-## A new e2e test tool to setup a local environment for testing coordinator and prover.
+# ProverE2E: A new e2e test tool to setup a local environment for testing coordinator and prover.
 
 It contains data from some blocks in a specified testnet, and helps to generate a series of chunks/batches/bundles from these blocks, filling the DB for the coordinator, so an e2e test (from chunk to bundle) can be run completely local
 
-Prepare:
+## Prepare
 link the staff dir as "conf" from one of the dir with staff set, currently we have following staff sets:
 + sepolia: with blocks from scroll sepolia
 + cloak-xen: with blocks from xen sepolia, which is a cloak network
 
-Steps:
+## Test
 1. run `make all` under `tests/prover-e2e`, it would launch a postgreSql db in local docker container, which is ready to be used by coordinator (include some chunks/batches/bundles waiting to be proven)
 2. setup assets by run `make coordinator_setup`
-3. in `coordinator/build/bin/conf`, update necessary items in `config.template.json` and rename it as `config.json`
-4. build and launch `coordinator_api` service locally
-5. setup the `config.json` for zkvm prover to connect with the locally launched coordinator api
-6. in `zkvm-prover`, launch `make test_e2e_run`, which would specific prover run locally, connect to the local coordinator api service according to the `config.json`, and prove all tasks being injected to db in step 1.
+3. come into `coordinator/build/bin` for following steps:
+   + rename `conf/config.template.json` as `conf/config.json`
+   + if the `l2.validium_mode` is set to true in `config.json`, the `sequencer.decryption_key` must be set
+   + launch `coordinator_api` service by executing the file
+4. come into `zkvm-prover` for following steps:
+   + copy `config.template.json` to `config.json`,
+   + set the `sdk_config.coordinator.base_url` field in `config.json`, so zkvm prover would connect with the locally launched coordinator api,
+     for common case the url is `http://localhost:8390` (the default listening port of coordinator api)
+   + launch `make test_e2e_run`, which would specific prover run locally, connect to the local coordinator api service according to the `config.json`, and prove all tasks being injected to db in step 1.
+
+## AI Helper
+The test process can be run with the help of `integration-test-helper` skill (~$1.0 for each full process)
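The updated README steps could be sketched as a single shell session; this is an illustrative walkthrough under the README's stated assumptions (paths relative to the repo root, coordinator listening on its default port 8390), not a verified script:

```shell
# 1-2. import test data and build the coordinator, from tests/prover-e2e
cd tests/prover-e2e
ln -s sepolia conf        # pick a staff set: sepolia or cloak-xen
make all                  # launches local postgres in docker, fills chunks/batches/bundles
make coordinator_setup    # builds coordinator_api and copies assets + genesis.json

# 3. configure and launch the coordinator API
cd ../../coordinator/build/bin
mv conf/config.template.json conf/config.json
# (validium mode: also set sequencer.decryption_key in conf/config.json)
./coordinator_api &

# 4. point the prover at the local coordinator and run the e2e proving loop
cd ../../../zkvm-prover
cp config.template.json config.json
# edit sdk_config.coordinator.base_url to http://localhost:8390, then:
make test_e2e_run
```

Each step depends on the previous one completing (the prover can only drain tasks that step 1 injected into the DB and that the coordinator from step 3 is serving).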
tests/prover-e2e/cloak-galileoV2/.make.env (new file)
@@ -0,0 +1,4 @@
+BEGIN_BLOCK?=2
+END_BLOCK?=15
+SCROLL_FORK_NAME=galileoV2
+SCROLL_ZKVM_VERSION?=v0.7.3-candidate-3
tests/prover-e2e/cloak-galileoV2/config.json (new file)
@@ -0,0 +1,15 @@
+{
+    "db_config": {
+        "driver_name": "postgres",
+        "dsn": "postgres://dev:dev@localhost:5432/scroll?sslmode=disable",
+        "maxOpenNum": 5,
+        "maxIdleNum": 1
+    },
+    "fetch_config": {
+        "endpoint": "http://cloak-usx-sequencer.mainnet.scroll.tech:8545",
+        "l2_message_queue_address": "0x5300000000000000000000000000000000000000"
+    },
+    "validium_mode": true,
+    "codec_version": 10
+
+}
@@ -11,7 +11,7 @@
     "verifiers": [
         {
             "assets_path": "assets",
-            "fork_name": "feynman"
+            "fork_name": "galileoV2"
         }
     ]
 }
@@ -24,9 +24,9 @@
     },
     "l2": {
         "validium_mode": true,
-        "chain_id": 5343513301,
+        "chain_id": 5343523305,
         "l2geth": {
-            "endpoint": "http://cloak-xen-sequencer.sepolia.scroll.tech:8545/"
+            "endpoint": "http://cloak-usx-sequencer.mainnet.scroll.tech:8545"
        }
    },
    "auth": {
File diff suppressed because one or more lines are too long
@@ -1,3 +0,0 @@
-BEGIN_BLOCK?=35
-END_BLOCK?=49
-SCROLL_FORK_NAME=feynman
File diff suppressed because one or more lines are too long
@@ -1,10 +0,0 @@
-{
-    "db_config": {
-        "driver_name": "postgres",
-        "dsn": "postgres://dev:dev@localhost:5432/scroll?sslmode=disable",
-        "maxOpenNum": 5,
-        "maxIdleNum": 1
-    },
-    "validium_mode": true,
-    "codec_version": 8
-}
tests/prover-e2e/mainnet-galileo/.make.env (new file)
@@ -0,0 +1,4 @@
+BEGIN_BLOCK?=26653680
+END_BLOCK?=26653686
+SCROLL_FORK_NAME=galileo
+SCROLL_ZKVM_VERSION=?v0.7.1
@@ -5,6 +5,10 @@
         "maxOpenNum": 5,
         "maxIdleNum": 1
     },
+    "fetch_config": {
+        "endpoint": "https://mainnet-rpc.scroll.io",
+        "l2_message_queue_address": "0x5300000000000000000000000000000000000000"
+    },
     "validium_mode": false,
-    "codec_version": 8
+    "codec_version": 9
 }
@@ -10,9 +10,8 @@
     "min_prover_version": "v4.4.33",
     "verifiers": [
         {
-            "features": "legacy_witness:openvm_13",
-            "assets_path": "assets_feynman",
-            "fork_name": "feynman"
+            "assets_path": "assets_galileo",
+            "fork_name": "galileo"
         }
    ]
 }
@@ -25,7 +24,7 @@
     },
     "l2": {
         "validium_mode": false,
-        "chain_id": 534351,
+        "chain_id": 534352,
         "l2geth": {
             "endpoint": "<serach a public rpc endpoint like alchemy>"
         }
tests/prover-e2e/mainnet-galileo/genesis.json (new file, 111 lines)
File diff suppressed because one or more lines are too long

@@ -1,3 +0,0 @@
-BEGIN_BLOCK?=10973711
-END_BLOCK?=10973721
-SCROLL_FORK_NAME=feynman

@@ -1,132 +0,0 @@
--- +goose Up
--- +goose StatementBegin
INSERT INTO l2_block (number, hash, parent_hash, header, withdraw_root,
|
|
||||||
state_root, tx_num, gas_used, block_timestamp, row_consumption,
|
|
||||||
chunk_hash, transactions
|
|
||||||
) VALUES ('10973700', '0x29c84f0df09fda2c6c63d314bb6714dbbdeca3b85c91743f9e25d3f81c28b986', '0x01aabb5d1d7edadd10011b4099de7ed703b9ce495717cd48a304ff4db3710d8a', '{"parentHash":"0x01aabb5d1d7edadd10011b4099de7ed703b9ce495717cd48a304ff4db3710d8a","sha3Uncles":"0x1dcc4de8dec75d7aab85b567b6ccd41ad312451b948a7413f0a142fd40d49347","miner":"0x0000000000000000000000000000000000000000","stateRoot":"0x347733a157bc7045f6a1d5bfd37d51763f3503b63290576a65b3b83265add2cf","transactionsRoot":"0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421","receiptsRoot":"0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421","logsBloom":"0x00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000","difficulty":"0x1","number":"0xa77204","gasLimit":"0x1312d00","gasUsed":"0x0","timestamp":"0x687f36e5","extraData":"0x","mixHash":"0x0000000000000000000000000000000000000000000000000000000000000000","nonce":"0x0000000000000000","baseFeePerGas":"0xef4209","withdrawalsRoot":null,"blobGasUsed":null,"excessBlobGas":null,"parentBeaconBlockRoot":null,"requestsHash":null,"hash":"0x29c84f0df09fda2c6c63d314bb6714dbbdeca3b85c91743f9e25d3f81c28b986"}', '0x5a9bd7f5f6723ce51c03beffa310a5bf79c2cf261ddb8622cf407b41d968ef91', '0x347733a157bc7045f6a1d5bfd37d51763f3503b63290576a65b3b83265add2cf', '0', '0', '1753167589', '', '0x206c062cf0991353ba5ebc9888ca224f470ad3edf8e8e01125726a3858ebdd73', '[]');
INSERT INTO l2_block (number, hash, parent_hash, header, withdraw_root,
state_root, tx_num, gas_used, block_timestamp, row_consumption,
chunk_hash, transactions
) VALUES ('10973701', '0x21ad5215b5b51cb5eb5ea6a19444804ca1628cbf8ef8cf1977660d8c468c0151', '0x29c84f0df09fda2c6c63d314bb6714dbbdeca3b85c91743f9e25d3f81c28b986', '{"parentHash":"0x29c84f0df09fda2c6c63d314bb6714dbbdeca3b85c91743f9e25d3f81c28b986","sha3Uncles":"0x1dcc4de8dec75d7aab85b567b6ccd41ad312451b948a7413f0a142fd40d49347","miner":"0x0000000000000000000000000000000000000000","stateRoot":"0x347733a157bc7045f6a1d5bfd37d51763f3503b63290576a65b3b83265add2cf","transactionsRoot":"0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421","receiptsRoot":"0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421","logsBloom":"0x00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000","difficulty":"0x1","number":"0xa77205","gasLimit":"0x1312d00","gasUsed":"0x0","timestamp":"0x687f36e6","extraData":"0x","mixHash":"0x0000000000000000000000000000000000000000000000000000000000000000","nonce":"0x0000000000000000","baseFeePerGas":"0xef4209","withdrawalsRoot":null,"blobGasUsed":null,"excessBlobGas":null,"parentBeaconBlockRoot":null,"requestsHash":null,"hash":"0x21ad5215b5b51cb5eb5ea6a19444804ca1628cbf8ef8cf1977660d8c468c0151"}', '0x5a9bd7f5f6723ce51c03beffa310a5bf79c2cf261ddb8622cf407b41d968ef91', '0x347733a157bc7045f6a1d5bfd37d51763f3503b63290576a65b3b83265add2cf', '0', '0', '1753167590', '', '0x206c062cf0991353ba5ebc9888ca224f470ad3edf8e8e01125726a3858ebdd73', '[]');
INSERT INTO l2_block (number, hash, parent_hash, header, withdraw_root,
state_root, tx_num, gas_used, block_timestamp, row_consumption,
chunk_hash, transactions
) VALUES ('10973702', '0x8c9ce33a9b62060b193c01f31518f3bfc5d8132c11569a5b4db03a5b0611f30e', '0x21ad5215b5b51cb5eb5ea6a19444804ca1628cbf8ef8cf1977660d8c468c0151', '{"parentHash":"0x21ad5215b5b51cb5eb5ea6a19444804ca1628cbf8ef8cf1977660d8c468c0151","sha3Uncles":"0x1dcc4de8dec75d7aab85b567b6ccd41ad312451b948a7413f0a142fd40d49347","miner":"0x0000000000000000000000000000000000000000","stateRoot":"0x347733a157bc7045f6a1d5bfd37d51763f3503b63290576a65b3b83265add2cf","transactionsRoot":"0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421","receiptsRoot":"0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421","logsBloom":"0x00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000","difficulty":"0x1","number":"0xa77206","gasLimit":"0x1312d00","gasUsed":"0x0","timestamp":"0x687f36e7","extraData":"0x","mixHash":"0x0000000000000000000000000000000000000000000000000000000000000000","nonce":"0x0000000000000000","baseFeePerGas":"0xef4209","withdrawalsRoot":null,"blobGasUsed":null,"excessBlobGas":null,"parentBeaconBlockRoot":null,"requestsHash":null,"hash":"0x8c9ce33a9b62060b193c01f31518f3bfc5d8132c11569a5b4db03a5b0611f30e"}', '0x5a9bd7f5f6723ce51c03beffa310a5bf79c2cf261ddb8622cf407b41d968ef91', '0x347733a157bc7045f6a1d5bfd37d51763f3503b63290576a65b3b83265add2cf', '0', '0', '1753167591', '', '0x206c062cf0991353ba5ebc9888ca224f470ad3edf8e8e01125726a3858ebdd73', '[]');
INSERT INTO l2_block (number, hash, parent_hash, header, withdraw_root,
state_root, tx_num, gas_used, block_timestamp, row_consumption,
chunk_hash, transactions
) VALUES ('10973703', '0x73eed8060ca9a36fe8bf8c981f0e44425cd69ade00ff986452f1c02d462194fe', '0x8c9ce33a9b62060b193c01f31518f3bfc5d8132c11569a5b4db03a5b0611f30e', '{"parentHash":"0x8c9ce33a9b62060b193c01f31518f3bfc5d8132c11569a5b4db03a5b0611f30e","sha3Uncles":"0x1dcc4de8dec75d7aab85b567b6ccd41ad312451b948a7413f0a142fd40d49347","miner":"0x0000000000000000000000000000000000000000","stateRoot":"0x347733a157bc7045f6a1d5bfd37d51763f3503b63290576a65b3b83265add2cf","transactionsRoot":"0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421","receiptsRoot":"0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421","logsBloom":"0x00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000","difficulty":"0x1","number":"0xa77207","gasLimit":"0x1312d00","gasUsed":"0x0","timestamp":"0x687f36e8","extraData":"0x","mixHash":"0x0000000000000000000000000000000000000000000000000000000000000000","nonce":"0x0000000000000000","baseFeePerGas":"0xef4209","withdrawalsRoot":null,"blobGasUsed":null,"excessBlobGas":null,"parentBeaconBlockRoot":null,"requestsHash":null,"hash":"0x73eed8060ca9a36fe8bf8c981f0e44425cd69ade00ff986452f1c02d462194fe"}', '0x5a9bd7f5f6723ce51c03beffa310a5bf79c2cf261ddb8622cf407b41d968ef91', '0x347733a157bc7045f6a1d5bfd37d51763f3503b63290576a65b3b83265add2cf', '0', '0', '1753167592', '', '0x206c062cf0991353ba5ebc9888ca224f470ad3edf8e8e01125726a3858ebdd73', '[]');
INSERT INTO l2_block (number, hash, parent_hash, header, withdraw_root,
state_root, tx_num, gas_used, block_timestamp, row_consumption,
chunk_hash, transactions
) VALUES ('10973704', '0x7a6a1bede8936cfbd677cf38c43e399c8a2d0b62700caf05513bad540541b1b5', '0x73eed8060ca9a36fe8bf8c981f0e44425cd69ade00ff986452f1c02d462194fe', '{"parentHash":"0x73eed8060ca9a36fe8bf8c981f0e44425cd69ade00ff986452f1c02d462194fe","sha3Uncles":"0x1dcc4de8dec75d7aab85b567b6ccd41ad312451b948a7413f0a142fd40d49347","miner":"0x0000000000000000000000000000000000000000","stateRoot":"0x347733a157bc7045f6a1d5bfd37d51763f3503b63290576a65b3b83265add2cf","transactionsRoot":"0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421","receiptsRoot":"0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421","logsBloom":"0x00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000","difficulty":"0x1","number":"0xa77208","gasLimit":"0x1312d00","gasUsed":"0x0","timestamp":"0x687f36e9","extraData":"0x","mixHash":"0x0000000000000000000000000000000000000000000000000000000000000000","nonce":"0x0000000000000000","baseFeePerGas":"0xef4209","withdrawalsRoot":null,"blobGasUsed":null,"excessBlobGas":null,"parentBeaconBlockRoot":null,"requestsHash":null,"hash":"0x7a6a1bede8936cfbd677cf38c43e399c8a2d0b62700caf05513bad540541b1b5"}', '0x5a9bd7f5f6723ce51c03beffa310a5bf79c2cf261ddb8622cf407b41d968ef91', '0x347733a157bc7045f6a1d5bfd37d51763f3503b63290576a65b3b83265add2cf', '0', '0', '1753167593', '', '0x206c062cf0991353ba5ebc9888ca224f470ad3edf8e8e01125726a3858ebdd73', '[]');
INSERT INTO l2_block (number, hash, parent_hash, header, withdraw_root,
state_root, tx_num, gas_used, block_timestamp, row_consumption,
chunk_hash, transactions
) VALUES ('10973705', '0x1a5306293b7801a42c4402f9d32e7a45d640c49506ee61452da0120f3e242424', '0x7a6a1bede8936cfbd677cf38c43e399c8a2d0b62700caf05513bad540541b1b5', '{"parentHash":"0x7a6a1bede8936cfbd677cf38c43e399c8a2d0b62700caf05513bad540541b1b5","sha3Uncles":"0x1dcc4de8dec75d7aab85b567b6ccd41ad312451b948a7413f0a142fd40d49347","miner":"0x0000000000000000000000000000000000000000","stateRoot":"0x347733a157bc7045f6a1d5bfd37d51763f3503b63290576a65b3b83265add2cf","transactionsRoot":"0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421","receiptsRoot":"0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421","logsBloom":"0x00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000","difficulty":"0x1","number":"0xa77209","gasLimit":"0x1312d00","gasUsed":"0x0","timestamp":"0x687f36ea","extraData":"0x","mixHash":"0x0000000000000000000000000000000000000000000000000000000000000000","nonce":"0x0000000000000000","baseFeePerGas":"0xef4209","withdrawalsRoot":null,"blobGasUsed":null,"excessBlobGas":null,"parentBeaconBlockRoot":null,"requestsHash":null,"hash":"0x1a5306293b7801a42c4402f9d32e7a45d640c49506ee61452da0120f3e242424"}', '0x5a9bd7f5f6723ce51c03beffa310a5bf79c2cf261ddb8622cf407b41d968ef91', '0x347733a157bc7045f6a1d5bfd37d51763f3503b63290576a65b3b83265add2cf', '0', '0', '1753167594', '', '0x206c062cf0991353ba5ebc9888ca224f470ad3edf8e8e01125726a3858ebdd73', '[]');
INSERT INTO l2_block (number, hash, parent_hash, header, withdraw_root,
state_root, tx_num, gas_used, block_timestamp, row_consumption,
chunk_hash, transactions
) VALUES ('10973706', '0xc89f391a03c7138f676bd8babed6589035b2b7d8c8b99071cb90661f7f996386', '0x1a5306293b7801a42c4402f9d32e7a45d640c49506ee61452da0120f3e242424', '{"parentHash":"0x1a5306293b7801a42c4402f9d32e7a45d640c49506ee61452da0120f3e242424","sha3Uncles":"0x1dcc4de8dec75d7aab85b567b6ccd41ad312451b948a7413f0a142fd40d49347","miner":"0x0000000000000000000000000000000000000000","stateRoot":"0x347733a157bc7045f6a1d5bfd37d51763f3503b63290576a65b3b83265add2cf","transactionsRoot":"0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421","receiptsRoot":"0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421","logsBloom":"0x00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000","difficulty":"0x1","number":"0xa7720a","gasLimit":"0x1312d00","gasUsed":"0x0","timestamp":"0x687f36eb","extraData":"0x","mixHash":"0x0000000000000000000000000000000000000000000000000000000000000000","nonce":"0x0000000000000000","baseFeePerGas":"0xef4209","withdrawalsRoot":null,"blobGasUsed":null,"excessBlobGas":null,"parentBeaconBlockRoot":null,"requestsHash":null,"hash":"0xc89f391a03c7138f676bd8babed6589035b2b7d8c8b99071cb90661f7f996386"}', '0x5a9bd7f5f6723ce51c03beffa310a5bf79c2cf261ddb8622cf407b41d968ef91', '0x347733a157bc7045f6a1d5bfd37d51763f3503b63290576a65b3b83265add2cf', '0', '0', '1753167595', '', '0x206c062cf0991353ba5ebc9888ca224f470ad3edf8e8e01125726a3858ebdd73', '[]');
INSERT INTO l2_block (number, hash, parent_hash, header, withdraw_root,
state_root, tx_num, gas_used, block_timestamp, row_consumption,
chunk_hash, transactions
) VALUES ('10973707', '0x38d72b14ef43e548ab0ffd84892e7c3accdd11e1121c2c7a94b953b4e896eb41', '0xc89f391a03c7138f676bd8babed6589035b2b7d8c8b99071cb90661f7f996386', '{"parentHash":"0xc89f391a03c7138f676bd8babed6589035b2b7d8c8b99071cb90661f7f996386","sha3Uncles":"0x1dcc4de8dec75d7aab85b567b6ccd41ad312451b948a7413f0a142fd40d49347","miner":"0x0000000000000000000000000000000000000000","stateRoot":"0xb920df561f210a617a0c1567cb2f65350818b96b159f5aa4b9ac7915b7af4946","transactionsRoot":"0xf14cf5134833ddb4f42a017e92af371f0a71eaf5d84cb6e681c81fa023662c5d","receiptsRoot":"0x4008fb883088f1ba377310e15221fffc8e5446faf420d6a28e061e9341beb056","logsBloom":"0x00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000080000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000020000000000900000000000000000000000000000000000000000000000000000000000000000000000001000000008000000000000000000000000000000010000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000010000000000000000000000000020000000000000000000","difficulty":"0x1","number":"0xa7720b","gasLimit":"0x1312d00","gasUsed":"0x9642","timestamp":"0x687f36ec","extraData":"0x","mixHash":"0x0000000000000000000000000000000000000000000000000000000000000000","nonce":"0x0000000000000000","baseFeePerGas":"0xef4209","withdrawalsRoot":null,"blobGasUsed":null,"excessBlobGas":null,"parentBeaconBlockRoot":null,"requestsHash":null,"hash":"0x38d72b14ef43e548ab0ffd84892e7c3accdd11e1121c2c7a94b953b4e896eb41"}', '0x5a9bd7f5f6723ce51c03beffa310a5bf79c2cf261ddb8622cf407b41d968ef91', '0xb920df561f210a617a0c1567cb2f65350818b96b159f5aa4b9ac7915b7af4946', '1', '38466', '1753167596', '', '0x206c062cf0991353ba5ebc9888ca224f470ad3edf8e8e01125726a3858ebdd73', 
'[{"type":2,"nonce":3392500,"txHash":"0xc15b615906602154131a6c42d7603def4bd2a769881292d831140b0b9b8f8850","gas":45919,"gasPrice":"0x1de8476","gasTipCap":"0x64","gasFeeCap":"0x1de8476","from":"0x0000000000000000000000000000000000000000","to":"0x5300000000000000000000000000000000000002","chainId":"0x8274f","value":"0x0","data":"0x39455d3a0000000000000000000000000000000000000000000000000000000000045b840000000000000000000000000000000000000000000000000000000000000001","isCreate":false,"accessList":[{"address":"0x5300000000000000000000000000000000000003","storageKeys":["0x297c59f20c6b2556a4ed35dccabbdeb8b1cf950f62aefb86b98d19b5a4aff2a2"]}],"authorizationList":null,"v":"0x1","r":"0xa1b888cc9be7990c4f6bd8a9d0d5fa743ea8173196c7ca871464becd133ba0de","s":"0x6bacc3e1a244c62eff3008795e010598d07b95a8bad7a5592ec941e121294885"}]');
INSERT INTO l2_block (number, hash, parent_hash, header, withdraw_root,
state_root, tx_num, gas_used, block_timestamp, row_consumption,
chunk_hash, transactions
) VALUES ('10973708', '0x230863f98595ba0c83785caf618072ce2a876307102adbeba11b9de9c4af8a08', '0x38d72b14ef43e548ab0ffd84892e7c3accdd11e1121c2c7a94b953b4e896eb41', '{"parentHash":"0x38d72b14ef43e548ab0ffd84892e7c3accdd11e1121c2c7a94b953b4e896eb41","sha3Uncles":"0x1dcc4de8dec75d7aab85b567b6ccd41ad312451b948a7413f0a142fd40d49347","miner":"0x0000000000000000000000000000000000000000","stateRoot":"0xb920df561f210a617a0c1567cb2f65350818b96b159f5aa4b9ac7915b7af4946","transactionsRoot":"0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421","receiptsRoot":"0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421","logsBloom":"0x00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000","difficulty":"0x1","number":"0xa7720c","gasLimit":"0x1312d00","gasUsed":"0x0","timestamp":"0x687f36ed","extraData":"0x","mixHash":"0x0000000000000000000000000000000000000000000000000000000000000000","nonce":"0x0000000000000000","baseFeePerGas":"0xef4209","withdrawalsRoot":null,"blobGasUsed":null,"excessBlobGas":null,"parentBeaconBlockRoot":null,"requestsHash":null,"hash":"0x230863f98595ba0c83785caf618072ce2a876307102adbeba11b9de9c4af8a08"}', '0x5a9bd7f5f6723ce51c03beffa310a5bf79c2cf261ddb8622cf407b41d968ef91', '0xb920df561f210a617a0c1567cb2f65350818b96b159f5aa4b9ac7915b7af4946', '0', '0', '1753167597', '', '0x206c062cf0991353ba5ebc9888ca224f470ad3edf8e8e01125726a3858ebdd73', '[]');
INSERT INTO l2_block (number, hash, parent_hash, header, withdraw_root,
state_root, tx_num, gas_used, block_timestamp, row_consumption,
chunk_hash, transactions
) VALUES ('10973709', '0xae4af19a7697c2bb10a641f07269ed7df66775d56414567245adc98befdae557', '0x230863f98595ba0c83785caf618072ce2a876307102adbeba11b9de9c4af8a08', '{"parentHash":"0x230863f98595ba0c83785caf618072ce2a876307102adbeba11b9de9c4af8a08","sha3Uncles":"0x1dcc4de8dec75d7aab85b567b6ccd41ad312451b948a7413f0a142fd40d49347","miner":"0x0000000000000000000000000000000000000000","stateRoot":"0xb920df561f210a617a0c1567cb2f65350818b96b159f5aa4b9ac7915b7af4946","transactionsRoot":"0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421","receiptsRoot":"0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421","logsBloom":"0x00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000","difficulty":"0x1","number":"0xa7720d","gasLimit":"0x1312d00","gasUsed":"0x0","timestamp":"0x687f36ee","extraData":"0x","mixHash":"0x0000000000000000000000000000000000000000000000000000000000000000","nonce":"0x0000000000000000","baseFeePerGas":"0xef4209","withdrawalsRoot":null,"blobGasUsed":null,"excessBlobGas":null,"parentBeaconBlockRoot":null,"requestsHash":null,"hash":"0xae4af19a7697c2bb10a641f07269ed7df66775d56414567245adc98befdae557"}', '0x5a9bd7f5f6723ce51c03beffa310a5bf79c2cf261ddb8622cf407b41d968ef91', '0xb920df561f210a617a0c1567cb2f65350818b96b159f5aa4b9ac7915b7af4946', '0', '0', '1753167598', '', '0x206c062cf0991353ba5ebc9888ca224f470ad3edf8e8e01125726a3858ebdd73', '[]');
INSERT INTO l2_block (number, hash, parent_hash, header, withdraw_root,
state_root, tx_num, gas_used, block_timestamp, row_consumption,
chunk_hash, transactions
) VALUES ('10973710', '0xd2bc7e24b66940a767abaee485870e9ebd24ff28e72a3096cdccc55b85f84182', '0xae4af19a7697c2bb10a641f07269ed7df66775d56414567245adc98befdae557', '{"parentHash":"0xae4af19a7697c2bb10a641f07269ed7df66775d56414567245adc98befdae557","sha3Uncles":"0x1dcc4de8dec75d7aab85b567b6ccd41ad312451b948a7413f0a142fd40d49347","miner":"0x0000000000000000000000000000000000000000","stateRoot":"0xb920df561f210a617a0c1567cb2f65350818b96b159f5aa4b9ac7915b7af4946","transactionsRoot":"0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421","receiptsRoot":"0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421","logsBloom":"0x00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000","difficulty":"0x1","number":"0xa7720e","gasLimit":"0x1312d00","gasUsed":"0x0","timestamp":"0x687f36ef","extraData":"0x","mixHash":"0x0000000000000000000000000000000000000000000000000000000000000000","nonce":"0x0000000000000000","baseFeePerGas":"0xef4209","withdrawalsRoot":null,"blobGasUsed":null,"excessBlobGas":null,"parentBeaconBlockRoot":null,"requestsHash":null,"hash":"0xd2bc7e24b66940a767abaee485870e9ebd24ff28e72a3096cdccc55b85f84182"}', '0x5a9bd7f5f6723ce51c03beffa310a5bf79c2cf261ddb8622cf407b41d968ef91', '0xb920df561f210a617a0c1567cb2f65350818b96b159f5aa4b9ac7915b7af4946', '0', '0', '1753167599', '', '0x206c062cf0991353ba5ebc9888ca224f470ad3edf8e8e01125726a3858ebdd73', '[]');
INSERT INTO l2_block (number, hash, parent_hash, header, withdraw_root,
state_root, tx_num, gas_used, block_timestamp, row_consumption,
chunk_hash, transactions
) VALUES ('10973711', '0x1870b94320db154de4702fe6bfb69ebc98fb531b78bf81a69b8ab658ba9d9af5', '0xd2bc7e24b66940a767abaee485870e9ebd24ff28e72a3096cdccc55b85f84182', '{"parentHash":"0xd2bc7e24b66940a767abaee485870e9ebd24ff28e72a3096cdccc55b85f84182","sha3Uncles":"0x1dcc4de8dec75d7aab85b567b6ccd41ad312451b948a7413f0a142fd40d49347","miner":"0x0000000000000000000000000000000000000000","stateRoot":"0x2bdf2906a7bbb398419246c3c77804a204641259b2aeb4f4a806eb772d31c480","transactionsRoot":"0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421","receiptsRoot":"0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421","logsBloom":"0x00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000","difficulty":"0x1","number":"0xa7720f","gasLimit":"0x1312d00","gasUsed":"0x0","timestamp":"0x687f36f0","extraData":"0x","mixHash":"0x0000000000000000000000000000000000000000000000000000000000000000","nonce":"0x0000000000000000","baseFeePerGas":"0xef4209","withdrawalsRoot":null,"blobGasUsed":null,"excessBlobGas":null,"parentBeaconBlockRoot":null,"requestsHash":null,"hash":"0x1870b94320db154de4702fe6bfb69ebc98fb531b78bf81a69b8ab658ba9d9af5"}', '0x5a9bd7f5f6723ce51c03beffa310a5bf79c2cf261ddb8622cf407b41d968ef91', '0x2bdf2906a7bbb398419246c3c77804a204641259b2aeb4f4a806eb772d31c480', '0', '0', '1753167600', '', '0x2f73e96335a43b678e107b2ef57c7ec0297d88d4a9986c1d6f4e31f1d11fb4f4', '[]');
INSERT INTO l2_block (number, hash, parent_hash, header, withdraw_root,
state_root, tx_num, gas_used, block_timestamp, row_consumption,
chunk_hash, transactions
) VALUES ('10973712', '0x271745e26e3222352fce7052edef9adecba921f4315e48a9d55e46640ac324ce', '0x1870b94320db154de4702fe6bfb69ebc98fb531b78bf81a69b8ab658ba9d9af5', '{"parentHash":"0x1870b94320db154de4702fe6bfb69ebc98fb531b78bf81a69b8ab658ba9d9af5","sha3Uncles":"0x1dcc4de8dec75d7aab85b567b6ccd41ad312451b948a7413f0a142fd40d49347","miner":"0x0000000000000000000000000000000000000000","stateRoot":"0x1ec0026bd12fe29d710e5f04e605cdb715d68a2e5bac57416066a7bc6b298762","transactionsRoot":"0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421","receiptsRoot":"0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421","logsBloom":"0x00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000","difficulty":"0x1","number":"0xa77210","gasLimit":"0x1312d00","gasUsed":"0x0","timestamp":"0x687f36f1","extraData":"0x","mixHash":"0x0000000000000000000000000000000000000000000000000000000000000000","nonce":"0x0000000000000000","baseFeePerGas":"0xef4208","withdrawalsRoot":null,"blobGasUsed":null,"excessBlobGas":null,"parentBeaconBlockRoot":null,"requestsHash":null,"hash":"0x271745e26e3222352fce7052edef9adecba921f4315e48a9d55e46640ac324ce"}', '0x5a9bd7f5f6723ce51c03beffa310a5bf79c2cf261ddb8622cf407b41d968ef91', '0x1ec0026bd12fe29d710e5f04e605cdb715d68a2e5bac57416066a7bc6b298762', '0', '0', '1753167601', '', '0x2f73e96335a43b678e107b2ef57c7ec0297d88d4a9986c1d6f4e31f1d11fb4f4', '[]');
INSERT INTO l2_block (number, hash, parent_hash, header, withdraw_root,
state_root, tx_num, gas_used, block_timestamp, row_consumption,
chunk_hash, transactions
) VALUES ('10973713', '0xe193fc4298a6beebbc59b83cc7e2bbdace76c24fe9b7bd76aa415159ecb60914', '0x271745e26e3222352fce7052edef9adecba921f4315e48a9d55e46640ac324ce', '{"parentHash":"0x271745e26e3222352fce7052edef9adecba921f4315e48a9d55e46640ac324ce","sha3Uncles":"0x1dcc4de8dec75d7aab85b567b6ccd41ad312451b948a7413f0a142fd40d49347","miner":"0x0000000000000000000000000000000000000000","stateRoot":"0x7e29363a63f54a0e03e08cb515a98f3c416a5ade3ec15d29eddd262baf67a2a1","transactionsRoot":"0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421","receiptsRoot":"0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421","logsBloom":"0x00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000","difficulty":"0x1","number":"0xa77211","gasLimit":"0x1312d00","gasUsed":"0x0","timestamp":"0x687f36f2","extraData":"0x","mixHash":"0x0000000000000000000000000000000000000000000000000000000000000000","nonce":"0x0000000000000000","baseFeePerGas":"0xef4207","withdrawalsRoot":null,"blobGasUsed":null,"excessBlobGas":null,"parentBeaconBlockRoot":null,"requestsHash":null,"hash":"0xe193fc4298a6beebbc59b83cc7e2bbdace76c24fe9b7bd76aa415159ecb60914"}', '0x5a9bd7f5f6723ce51c03beffa310a5bf79c2cf261ddb8622cf407b41d968ef91', '0x7e29363a63f54a0e03e08cb515a98f3c416a5ade3ec15d29eddd262baf67a2a1', '0', '0', '1753167602', '', '0x2f73e96335a43b678e107b2ef57c7ec0297d88d4a9986c1d6f4e31f1d11fb4f4', '[]');
INSERT INTO l2_block (number, hash, parent_hash, header, withdraw_root,
state_root, tx_num, gas_used, block_timestamp, row_consumption,
chunk_hash, transactions
) VALUES ('10973714', '0x8e99d9242315a107492520e9255dd77798adbff81393d3de83960dac361bd838', '0xe193fc4298a6beebbc59b83cc7e2bbdace76c24fe9b7bd76aa415159ecb60914', '{"parentHash":"0xe193fc4298a6beebbc59b83cc7e2bbdace76c24fe9b7bd76aa415159ecb60914","sha3Uncles":"0x1dcc4de8dec75d7aab85b567b6ccd41ad312451b948a7413f0a142fd40d49347","miner":"0x0000000000000000000000000000000000000000","stateRoot":"0xd818ac5028fe5aa1abd2d3ffe4693b4e96eabad35e49011e2ce920bcd76d061a","transactionsRoot":"0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421","receiptsRoot":"0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421","logsBloom":"0x00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000","difficulty":"0x1","number":"0xa77212","gasLimit":"0x1312d00","gasUsed":"0x0","timestamp":"0x687f36f3","extraData":"0x","mixHash":"0x0000000000000000000000000000000000000000000000000000000000000000","nonce":"0x0000000000000000","baseFeePerGas":"0xef4207","withdrawalsRoot":null,"blobGasUsed":null,"excessBlobGas":null,"parentBeaconBlockRoot":null,"requestsHash":null,"hash":"0x8e99d9242315a107492520e9255dd77798adbff81393d3de83960dac361bd838"}', '0x5a9bd7f5f6723ce51c03beffa310a5bf79c2cf261ddb8622cf407b41d968ef91', '0xd818ac5028fe5aa1abd2d3ffe4693b4e96eabad35e49011e2ce920bcd76d061a', '0', '0', '1753167603', '', '0x2f73e96335a43b678e107b2ef57c7ec0297d88d4a9986c1d6f4e31f1d11fb4f4', '[]');
INSERT INTO l2_block (number, hash, parent_hash, header, withdraw_root,
state_root, tx_num, gas_used, block_timestamp, row_consumption,
chunk_hash, transactions
) VALUES ('10973715', '0x9ee9d75a25a912e7d3907679a1e588021f4d2040c71d2e1c958538466a4fbbd6', '0x8e99d9242315a107492520e9255dd77798adbff81393d3de83960dac361bd838', '{"parentHash":"0x8e99d9242315a107492520e9255dd77798adbff81393d3de83960dac361bd838","sha3Uncles":"0x1dcc4de8dec75d7aab85b567b6ccd41ad312451b948a7413f0a142fd40d49347","miner":"0x0000000000000000000000000000000000000000","stateRoot":"0x233fd85bd753e126e4df23a05c56ccde3eb6ec06ce2565a990af3347dc95b0c5","transactionsRoot":"0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421","receiptsRoot":"0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421","logsBloom":"0x00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000","difficulty":"0x1","number":"0xa77213","gasLimit":"0x1312d00","gasUsed":"0x0","timestamp":"0x687f36f4","extraData":"0x","mixHash":"0x0000000000000000000000000000000000000000000000000000000000000000","nonce":"0x0000000000000000","baseFeePerGas":"0xef4207","withdrawalsRoot":null,"blobGasUsed":null,"excessBlobGas":null,"parentBeaconBlockRoot":null,"requestsHash":null,"hash":"0x9ee9d75a25a912e7d3907679a1e588021f4d2040c71d2e1c958538466a4fbbd6"}', '0x5a9bd7f5f6723ce51c03beffa310a5bf79c2cf261ddb8622cf407b41d968ef91', '0x233fd85bd753e126e4df23a05c56ccde3eb6ec06ce2565a990af3347dc95b0c5', '0', '0', '1753167604', '', '0x2f73e96335a43b678e107b2ef57c7ec0297d88d4a9986c1d6f4e31f1d11fb4f4', '[]');
INSERT INTO l2_block (number, hash, parent_hash, header, withdraw_root,
state_root, tx_num, gas_used, block_timestamp, row_consumption,
chunk_hash, transactions
) VALUES ('10973716', '0x9b25094db21166930008728c487ca9dbbc1e842701c8573eaa6bea4d41c10a7e', '0x9ee9d75a25a912e7d3907679a1e588021f4d2040c71d2e1c958538466a4fbbd6', '{"parentHash":"0x9ee9d75a25a912e7d3907679a1e588021f4d2040c71d2e1c958538466a4fbbd6","sha3Uncles":"0x1dcc4de8dec75d7aab85b567b6ccd41ad312451b948a7413f0a142fd40d49347","miner":"0x0000000000000000000000000000000000000000","stateRoot":"0x12be357fcc1fc28e574a7f95a5f9b3aae7e18d8ab8829c676478b4e8953a8502","transactionsRoot":"0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421","receiptsRoot":"0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421","logsBloom":"0x00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000","difficulty":"0x1","number":"0xa77214","gasLimit":"0x1312d00","gasUsed":"0x0","timestamp":"0x687f36f5","extraData":"0x","mixHash":"0x0000000000000000000000000000000000000000000000000000000000000000","nonce":"0x0000000000000000","baseFeePerGas":"0xef4207","withdrawalsRoot":null,"blobGasUsed":null,"excessBlobGas":null,"parentBeaconBlockRoot":null,"requestsHash":null,"hash":"0x9b25094db21166930008728c487ca9dbbc1e842701c8573eaa6bea4d41c10a7e"}', '0x5a9bd7f5f6723ce51c03beffa310a5bf79c2cf261ddb8622cf407b41d968ef91', '0x12be357fcc1fc28e574a7f95a5f9b3aae7e18d8ab8829c676478b4e8953a8502', '0', '0', '1753167605', '', '0x2f73e96335a43b678e107b2ef57c7ec0297d88d4a9986c1d6f4e31f1d11fb4f4', '[]');
INSERT INTO l2_block (number, hash, parent_hash, header, withdraw_root,
state_root, tx_num, gas_used, block_timestamp, row_consumption,
chunk_hash, transactions
) VALUES ('10973717', '0xeaab0e07b40720f8e961f28ef665f08e67797428dc1eaccba88d5d4f60341284', '0x9b25094db21166930008728c487ca9dbbc1e842701c8573eaa6bea4d41c10a7e', '{"parentHash":"0x9b25094db21166930008728c487ca9dbbc1e842701c8573eaa6bea4d41c10a7e","sha3Uncles":"0x1dcc4de8dec75d7aab85b567b6ccd41ad312451b948a7413f0a142fd40d49347","miner":"0x0000000000000000000000000000000000000000","stateRoot":"0x7e49dc33343a54e9afc285155b8a35575e6924d465fe2dc543b5ea8915eb828a","transactionsRoot":"0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421","receiptsRoot":"0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421","logsBloom":"0x00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000","difficulty":"0x1","number":"0xa77215","gasLimit":"0x1312d00","gasUsed":"0x0","timestamp":"0x687f36f6","extraData":"0x","mixHash":"0x0000000000000000000000000000000000000000000000000000000000000000","nonce":"0x0000000000000000","baseFeePerGas":"0xef4207","withdrawalsRoot":null,"blobGasUsed":null,"excessBlobGas":null,"parentBeaconBlockRoot":null,"requestsHash":null,"hash":"0xeaab0e07b40720f8e961f28ef665f08e67797428dc1eaccba88d5d4f60341284"}', '0x5a9bd7f5f6723ce51c03beffa310a5bf79c2cf261ddb8622cf407b41d968ef91', '0x7e49dc33343a54e9afc285155b8a35575e6924d465fe2dc543b5ea8915eb828a', '0', '0', '1753167606', '', '0x2f73e96335a43b678e107b2ef57c7ec0297d88d4a9986c1d6f4e31f1d11fb4f4', '[]');
INSERT INTO l2_block (number, hash, parent_hash, header, withdraw_root,
state_root, tx_num, gas_used, block_timestamp, row_consumption,
chunk_hash, transactions
) VALUES ('10973718', '0xd6af3c7bf29f3689b516ed5c8fcf885a4e7eb2df751c35d4ebbb19fcc13628d4', '0xeaab0e07b40720f8e961f28ef665f08e67797428dc1eaccba88d5d4f60341284', '{"parentHash":"0xeaab0e07b40720f8e961f28ef665f08e67797428dc1eaccba88d5d4f60341284","sha3Uncles":"0x1dcc4de8dec75d7aab85b567b6ccd41ad312451b948a7413f0a142fd40d49347","miner":"0x0000000000000000000000000000000000000000","stateRoot":"0xf1a7db2e4f463fa87e3e65b73d2abc5374302855f6af9735d5a11c94c2d93975","transactionsRoot":"0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421","receiptsRoot":"0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421","logsBloom":"0x00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000","difficulty":"0x1","number":"0xa77216","gasLimit":"0x1312d00","gasUsed":"0x0","timestamp":"0x687f36f7","extraData":"0x","mixHash":"0x0000000000000000000000000000000000000000000000000000000000000000","nonce":"0x0000000000000000","baseFeePerGas":"0xef4207","withdrawalsRoot":null,"blobGasUsed":null,"excessBlobGas":null,"parentBeaconBlockRoot":null,"requestsHash":null,"hash":"0xd6af3c7bf29f3689b516ed5c8fcf885a4e7eb2df751c35d4ebbb19fcc13628d4"}', '0x5a9bd7f5f6723ce51c03beffa310a5bf79c2cf261ddb8622cf407b41d968ef91', '0xf1a7db2e4f463fa87e3e65b73d2abc5374302855f6af9735d5a11c94c2d93975', '0', '0', '1753167607', '', '0x2f73e96335a43b678e107b2ef57c7ec0297d88d4a9986c1d6f4e31f1d11fb4f4', '[]');
INSERT INTO l2_block (number, hash, parent_hash, header, withdraw_root,
state_root, tx_num, gas_used, block_timestamp, row_consumption,
chunk_hash, transactions
) VALUES ('10973719', '0x5815f5b91d53d0b5c7d423f06da7cad3d45edeab1c2b590af02ceebfd33b2ce1', '0xd6af3c7bf29f3689b516ed5c8fcf885a4e7eb2df751c35d4ebbb19fcc13628d4', '{"parentHash":"0xd6af3c7bf29f3689b516ed5c8fcf885a4e7eb2df751c35d4ebbb19fcc13628d4","sha3Uncles":"0x1dcc4de8dec75d7aab85b567b6ccd41ad312451b948a7413f0a142fd40d49347","miner":"0x0000000000000000000000000000000000000000","stateRoot":"0xb5d1f420ddc1edb60c7fc3a06929a2014c548d1ddd52a78ab6984faed53a09d1","transactionsRoot":"0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421","receiptsRoot":"0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421","logsBloom":"0x00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000","difficulty":"0x1","number":"0xa77217","gasLimit":"0x1312d00","gasUsed":"0x0","timestamp":"0x687f36f8","extraData":"0x","mixHash":"0x0000000000000000000000000000000000000000000000000000000000000000","nonce":"0x0000000000000000","baseFeePerGas":"0xef4207","withdrawalsRoot":null,"blobGasUsed":null,"excessBlobGas":null,"parentBeaconBlockRoot":null,"requestsHash":null,"hash":"0x5815f5b91d53d0b5c7d423f06da7cad3d45edeab1c2b590af02ceebfd33b2ce1"}', '0x5a9bd7f5f6723ce51c03beffa310a5bf79c2cf261ddb8622cf407b41d968ef91', '0xb5d1f420ddc1edb60c7fc3a06929a2014c548d1ddd52a78ab6984faed53a09d1', '0', '0', '1753167608', '', '0x2f73e96335a43b678e107b2ef57c7ec0297d88d4a9986c1d6f4e31f1d11fb4f4', '[]');
INSERT INTO l2_block (number, hash, parent_hash, header, withdraw_root,
state_root, tx_num, gas_used, block_timestamp, row_consumption,
chunk_hash, transactions
) VALUES ('10973720', '0x4d8bbd6a15515cacf18cf9810ca3867442cefc733e9cdeaa1527a008fbca3bd1', '0x5815f5b91d53d0b5c7d423f06da7cad3d45edeab1c2b590af02ceebfd33b2ce1', '{"parentHash":"0x5815f5b91d53d0b5c7d423f06da7cad3d45edeab1c2b590af02ceebfd33b2ce1","sha3Uncles":"0x1dcc4de8dec75d7aab85b567b6ccd41ad312451b948a7413f0a142fd40d49347","miner":"0x0000000000000000000000000000000000000000","stateRoot":"0xf4ca7c941e6ad6a780ad8422a817c6a7916f3f80b5f0d0f95cabcb17b0531299","transactionsRoot":"0x79c3ba4e0fe89ddea0ed8becdbfff86f18dab3ffd21eaf13744b86cb104d664e","receiptsRoot":"0xc8f88931c3c4ca18cb582e490d7acabfbe04fd6fa971549af6bf927aec7bfa1f","logsBloom":"0x00000000000000000000000000000000000000080000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000040000000000000000000000000000000000000000000000000000000000000000000000002000000000000000000000000000000000000200000000000000000000000000000000000004000000000000000000000000000000000000000000000000000000000000000000000000000","difficulty":"0x1","number":"0xa77218","gasLimit":"0x1312d00","gasUsed":"0x7623","timestamp":"0x687f36f9","extraData":"0x","mixHash":"0x0000000000000000000000000000000000000000000000000000000000000000","nonce":"0x0000000000000000","baseFeePerGas":"0xef4207","withdrawalsRoot":null,"blobGasUsed":null,"excessBlobGas":null,"parentBeaconBlockRoot":null,"requestsHash":null,"hash":"0x4d8bbd6a15515cacf18cf9810ca3867442cefc733e9cdeaa1527a008fbca3bd1"}', '0x5a9bd7f5f6723ce51c03beffa310a5bf79c2cf261ddb8622cf407b41d968ef91', '0xf4ca7c941e6ad6a780ad8422a817c6a7916f3f80b5f0d0f95cabcb17b0531299', '1', '30243', '1753167609', '', '0x2f73e96335a43b678e107b2ef57c7ec0297d88d4a9986c1d6f4e31f1d11fb4f4', 
'[{"type":0,"nonce":13627,"txHash":"0x539962b9584723f919b9f3a0b454622f5f51c195300564116d0cedfec17a1381","gas":30243,"gasPrice":"0xef426b","gasTipCap":"0xef426b","gasFeeCap":"0xef426b","from":"0x0000000000000000000000000000000000000000","to":"0xf07cc6482a24843efe7b42259acbaf8d0a2a6952","chainId":"0x8274f","value":"0x0","data":"0x91b7f5ed0000000000000000000000000000000000000000000018f4c5be1c1407000000","isCreate":false,"accessList":null,"authorizationList":null,"v":"0x104ec2","r":"0xaa309d7e218825160be9a87c9e50d3cbfead9c87e90e984ad0ea2441633092a2","s":"0x438f39c0af058794f320e5578720557af07c5397e363f9628a6c4ffee5bd2487"}]');
INSERT INTO l2_block (number, hash, parent_hash, header, withdraw_root,
state_root, tx_num, gas_used, block_timestamp, row_consumption,
chunk_hash, transactions
) VALUES ('10973721', '0x84c53d922cfe0558c4be02865af5ebd49efe44a458dabc16aea5584e5e06f346', '0x4d8bbd6a15515cacf18cf9810ca3867442cefc733e9cdeaa1527a008fbca3bd1', '{"parentHash":"0x4d8bbd6a15515cacf18cf9810ca3867442cefc733e9cdeaa1527a008fbca3bd1","sha3Uncles":"0x1dcc4de8dec75d7aab85b567b6ccd41ad312451b948a7413f0a142fd40d49347","miner":"0x0000000000000000000000000000000000000000","stateRoot":"0xd4838ba86f5a8e865a41ef7547148b6074235a658dd57ff2296c0badda4760d1","transactionsRoot":"0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421","receiptsRoot":"0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421","logsBloom":"0x00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000","difficulty":"0x1","number":"0xa77219","gasLimit":"0x1312d00","gasUsed":"0x0","timestamp":"0x687f36fa","extraData":"0x","mixHash":"0x0000000000000000000000000000000000000000000000000000000000000000","nonce":"0x0000000000000000","baseFeePerGas":"0xef4207","withdrawalsRoot":null,"blobGasUsed":null,"excessBlobGas":null,"parentBeaconBlockRoot":null,"requestsHash":null,"hash":"0x84c53d922cfe0558c4be02865af5ebd49efe44a458dabc16aea5584e5e06f346"}', '0x5a9bd7f5f6723ce51c03beffa310a5bf79c2cf261ddb8622cf407b41d968ef91', '0xd4838ba86f5a8e865a41ef7547148b6074235a658dd57ff2296c0badda4760d1', '0', '0', '1753167610', '', '0x2f73e96335a43b678e107b2ef57c7ec0297d88d4a9986c1d6f4e31f1d11fb4f4', '[]');
INSERT INTO l2_block (number, hash, parent_hash, header, withdraw_root,
state_root, tx_num, gas_used, block_timestamp, row_consumption,
chunk_hash, transactions
) VALUES ('10973722', '0xe1d601522b08d98852b4c7dc3584f292ac246a3dac3c600ba58bd6c20c97be5b', '0x84c53d922cfe0558c4be02865af5ebd49efe44a458dabc16aea5584e5e06f346', '{"parentHash":"0x84c53d922cfe0558c4be02865af5ebd49efe44a458dabc16aea5584e5e06f346","sha3Uncles":"0x1dcc4de8dec75d7aab85b567b6ccd41ad312451b948a7413f0a142fd40d49347","miner":"0x0000000000000000000000000000000000000000","stateRoot":"0x8deede75e20423d0495cbdb493d320dddde6df0459df998608a16f658eb7bec3","transactionsRoot":"0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421","receiptsRoot":"0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421","logsBloom":"0x00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000","difficulty":"0x1","number":"0xa7721a","gasLimit":"0x1312d00","gasUsed":"0x0","timestamp":"0x687f36fb","extraData":"0x","mixHash":"0x0000000000000000000000000000000000000000000000000000000000000000","nonce":"0x0000000000000000","baseFeePerGas":"0xef4207","withdrawalsRoot":null,"blobGasUsed":null,"excessBlobGas":null,"parentBeaconBlockRoot":null,"requestsHash":null,"hash":"0xe1d601522b08d98852b4c7dc3584f292ac246a3dac3c600ba58bd6c20c97be5b"}', '0x5a9bd7f5f6723ce51c03beffa310a5bf79c2cf261ddb8622cf407b41d968ef91', '0x8deede75e20423d0495cbdb493d320dddde6df0459df998608a16f658eb7bec3', '0', '0', '1753167611', '', '0x2f73e96335a43b678e107b2ef57c7ec0297d88d4a9986c1d6f4e31f1d11fb4f4', '[]');
INSERT INTO l2_block (number, hash, parent_hash, header, withdraw_root,
state_root, tx_num, gas_used, block_timestamp, row_consumption,
chunk_hash, transactions
) VALUES ('10973723', '0x8579712fc434b401f1ecfcf3ae22611be054480fa882e90f8eecb6c5e97534bd', '0xe1d601522b08d98852b4c7dc3584f292ac246a3dac3c600ba58bd6c20c97be5b', '{"parentHash":"0xe1d601522b08d98852b4c7dc3584f292ac246a3dac3c600ba58bd6c20c97be5b","sha3Uncles":"0x1dcc4de8dec75d7aab85b567b6ccd41ad312451b948a7413f0a142fd40d49347","miner":"0x0000000000000000000000000000000000000000","stateRoot":"0xb4fe51cda0401bb19e8448a2697a49e1fbc25398c2b18a9955d0a8e6f4b153a7","transactionsRoot":"0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421","receiptsRoot":"0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421","logsBloom":"0x00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000","difficulty":"0x1","number":"0xa7721b","gasLimit":"0x1312d00","gasUsed":"0x0","timestamp":"0x687f36fc","extraData":"0x","mixHash":"0x0000000000000000000000000000000000000000000000000000000000000000","nonce":"0x0000000000000000","baseFeePerGas":"0xef4207","withdrawalsRoot":null,"blobGasUsed":null,"excessBlobGas":null,"parentBeaconBlockRoot":null,"requestsHash":null,"hash":"0x8579712fc434b401f1ecfcf3ae22611be054480fa882e90f8eecb6c5e97534bd"}', '0x5a9bd7f5f6723ce51c03beffa310a5bf79c2cf261ddb8622cf407b41d968ef91', '0xb4fe51cda0401bb19e8448a2697a49e1fbc25398c2b18a9955d0a8e6f4b153a7', '0', '0', '1753167612', '', '0x2f73e96335a43b678e107b2ef57c7ec0297d88d4a9986c1d6f4e31f1d11fb4f4', '[]');
INSERT INTO l2_block (number, hash, parent_hash, header, withdraw_root,
state_root, tx_num, gas_used, block_timestamp, row_consumption,
chunk_hash, transactions
) VALUES ('10973724', '0xe13a0b907e044a9df1952acc31dc08a578fb910a0cc224e11692cb84c9c9a9f7', '0x8579712fc434b401f1ecfcf3ae22611be054480fa882e90f8eecb6c5e97534bd', '{"parentHash":"0x8579712fc434b401f1ecfcf3ae22611be054480fa882e90f8eecb6c5e97534bd","sha3Uncles":"0x1dcc4de8dec75d7aab85b567b6ccd41ad312451b948a7413f0a142fd40d49347","miner":"0x0000000000000000000000000000000000000000","stateRoot":"0xcd17c85290d8ec7473357ebe1605f766af6c1356732cc7ad11de0453baca05c6","transactionsRoot":"0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421","receiptsRoot":"0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421","logsBloom":"0x00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000","difficulty":"0x1","number":"0xa7721c","gasLimit":"0x1312d00","gasUsed":"0x0","timestamp":"0x687f36fd","extraData":"0x","mixHash":"0x0000000000000000000000000000000000000000000000000000000000000000","nonce":"0x0000000000000000","baseFeePerGas":"0xef4207","withdrawalsRoot":null,"blobGasUsed":null,"excessBlobGas":null,"parentBeaconBlockRoot":null,"requestsHash":null,"hash":"0xe13a0b907e044a9df1952acc31dc08a578fb910a0cc224e11692cb84c9c9a9f7"}', '0x5a9bd7f5f6723ce51c03beffa310a5bf79c2cf261ddb8622cf407b41d968ef91', '0xcd17c85290d8ec7473357ebe1605f766af6c1356732cc7ad11de0453baca05c6', '0', '0', '1753167613', '', '0x2f73e96335a43b678e107b2ef57c7ec0297d88d4a9986c1d6f4e31f1d11fb4f4', '[]');
INSERT INTO l2_block (number, hash, parent_hash, header, withdraw_root,
state_root, tx_num, gas_used, block_timestamp, row_consumption,
chunk_hash, transactions
) VALUES ('10973725', '0x2e26fb489f8644b3b5c44cd493ebc140ba3bc716588f37a71b8ba6dc504ccb5f', '0xe13a0b907e044a9df1952acc31dc08a578fb910a0cc224e11692cb84c9c9a9f7', '{"parentHash":"0xe13a0b907e044a9df1952acc31dc08a578fb910a0cc224e11692cb84c9c9a9f7","sha3Uncles":"0x1dcc4de8dec75d7aab85b567b6ccd41ad312451b948a7413f0a142fd40d49347","miner":"0x0000000000000000000000000000000000000000","stateRoot":"0xfd321f4a3e2bc757df89162f730a2e37519dcb29cdb63019665c1fe4dbceeb00","transactionsRoot":"0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421","receiptsRoot":"0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421","logsBloom":"0x00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000","difficulty":"0x1","number":"0xa7721d","gasLimit":"0x1312d00","gasUsed":"0x0","timestamp":"0x687f36fe","extraData":"0x","mixHash":"0x0000000000000000000000000000000000000000000000000000000000000000","nonce":"0x0000000000000000","baseFeePerGas":"0xef4207","withdrawalsRoot":null,"blobGasUsed":null,"excessBlobGas":null,"parentBeaconBlockRoot":null,"requestsHash":null,"hash":"0x2e26fb489f8644b3b5c44cd493ebc140ba3bc716588f37a71b8ba6dc504ccb5f"}', '0x5a9bd7f5f6723ce51c03beffa310a5bf79c2cf261ddb8622cf407b41d968ef91', '0xfd321f4a3e2bc757df89162f730a2e37519dcb29cdb63019665c1fe4dbceeb00', '0', '0', '1753167614', '', '0x2f73e96335a43b678e107b2ef57c7ec0297d88d4a9986c1d6f4e31f1d11fb4f4', '[]');
INSERT INTO l2_block (number, hash, parent_hash, header, withdraw_root,
state_root, tx_num, gas_used, block_timestamp, row_consumption,
chunk_hash, transactions
) VALUES ('10973726', '0x313b0fbb7cbb8bc1ba4fbc50684b516d31f9f7ee6f66d919da01328537a4b0a1', '0x2e26fb489f8644b3b5c44cd493ebc140ba3bc716588f37a71b8ba6dc504ccb5f', '{"parentHash":"0x2e26fb489f8644b3b5c44cd493ebc140ba3bc716588f37a71b8ba6dc504ccb5f","sha3Uncles":"0x1dcc4de8dec75d7aab85b567b6ccd41ad312451b948a7413f0a142fd40d49347","miner":"0x0000000000000000000000000000000000000000","stateRoot":"0x1a24ed5ee5e8ca354f583b28bd7f2c4c6fe4dca59fef476578eddab17b857471","transactionsRoot":"0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421","receiptsRoot":"0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421","logsBloom":"0x00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000","difficulty":"0x1","number":"0xa7721e","gasLimit":"0x1312d00","gasUsed":"0x0","timestamp":"0x687f36ff","extraData":"0x","mixHash":"0x0000000000000000000000000000000000000000000000000000000000000000","nonce":"0x0000000000000000","baseFeePerGas":"0xef4207","withdrawalsRoot":null,"blobGasUsed":null,"excessBlobGas":null,"parentBeaconBlockRoot":null,"requestsHash":null,"hash":"0x313b0fbb7cbb8bc1ba4fbc50684b516d31f9f7ee6f66d919da01328537a4b0a1"}', '0x5a9bd7f5f6723ce51c03beffa310a5bf79c2cf261ddb8622cf407b41d968ef91', '0x1a24ed5ee5e8ca354f583b28bd7f2c4c6fe4dca59fef476578eddab17b857471', '0', '0', '1753167615', '', '0x2f73e96335a43b678e107b2ef57c7ec0297d88d4a9986c1d6f4e31f1d11fb4f4', '[]');
INSERT INTO l2_block (number, hash, parent_hash, header, withdraw_root,
state_root, tx_num, gas_used, block_timestamp, row_consumption,
chunk_hash, transactions
) VALUES ('10973727', '0xf9039c9c24ab919066f2eb6f97360cfb727ed032c9e6142ea45e784b19894560', '0x313b0fbb7cbb8bc1ba4fbc50684b516d31f9f7ee6f66d919da01328537a4b0a1', '{"parentHash":"0x313b0fbb7cbb8bc1ba4fbc50684b516d31f9f7ee6f66d919da01328537a4b0a1","sha3Uncles":"0x1dcc4de8dec75d7aab85b567b6ccd41ad312451b948a7413f0a142fd40d49347","miner":"0x0000000000000000000000000000000000000000","stateRoot":"0x2927f53f1eaaeaa17a80f048f10474a7cc3b2c96547cc47caad33ff9e5b38da6","transactionsRoot":"0x80fd441b38b6ffb8f9369d8a5179356f9bf5ad332db0da99f7c6efdb90939cd2","receiptsRoot":"0xa262cee7ba62c004c6554e9cf378512a868346c24f8cafc1ac1954250339149e","logsBloom":"0x00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000080000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000020000000000900000000000000000000000000000000000000000000000000000000000000000000000001000000008000000000000000000000000000000010000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000010000000000000000000000000020000000000000000000","difficulty":"0x1","number":"0xa7721f","gasLimit":"0x1312d00","gasUsed":"0x9642","timestamp":"0x687f3700","extraData":"0x","mixHash":"0x0000000000000000000000000000000000000000000000000000000000000000","nonce":"0x0000000000000000","baseFeePerGas":"0xef4207","withdrawalsRoot":null,"blobGasUsed":null,"excessBlobGas":null,"parentBeaconBlockRoot":null,"requestsHash":null,"hash":"0xf9039c9c24ab919066f2eb6f97360cfb727ed032c9e6142ea45e784b19894560"}', '0x5a9bd7f5f6723ce51c03beffa310a5bf79c2cf261ddb8622cf407b41d968ef91', '0x2927f53f1eaaeaa17a80f048f10474a7cc3b2c96547cc47caad33ff9e5b38da6', '1', '38466', '1753167616', '', '0x2f73e96335a43b678e107b2ef57c7ec0297d88d4a9986c1d6f4e31f1d11fb4f4', 
'[{"type":2,"nonce":3392501,"txHash":"0xa5231ea1b94eb516575807531763b312d250ee5ad4dfbeea66beab5f448c32b6","gas":45919,"gasPrice":"0x1de8472","gasTipCap":"0x64","gasFeeCap":"0x1de8472","from":"0x0000000000000000000000000000000000000000","to":"0x5300000000000000000000000000000000000002","chainId":"0x8274f","value":"0x0","data":"0x39455d3a000000000000000000000000000000000000000000000000000000000004580f0000000000000000000000000000000000000000000000000000000000000001","isCreate":false,"accessList":[{"address":"0x5300000000000000000000000000000000000003","storageKeys":["0x297c59f20c6b2556a4ed35dccabbdeb8b1cf950f62aefb86b98d19b5a4aff2a2"]}],"authorizationList":null,"v":"0x1","r":"0xa09a97c38c7a58f40ff39ca74f938c63f1ef822cf91926d4fff96b7dc818d3f3","s":"0x77ee7453096794d9cbb206f26077f23b4cc88fe51893cb5eab46714e379ac833"}]');
INSERT INTO l2_block (number, hash, parent_hash, header, withdraw_root,
state_root, tx_num, gas_used, block_timestamp, row_consumption,
chunk_hash, transactions
) VALUES ('10973728', '0xf27ff9223f6bf9a964737d50cb7c005f049cf0f4edfd16d24178a798c21716d6', '0xf9039c9c24ab919066f2eb6f97360cfb727ed032c9e6142ea45e784b19894560', '{"parentHash":"0xf9039c9c24ab919066f2eb6f97360cfb727ed032c9e6142ea45e784b19894560","sha3Uncles":"0x1dcc4de8dec75d7aab85b567b6ccd41ad312451b948a7413f0a142fd40d49347","miner":"0x0000000000000000000000000000000000000000","stateRoot":"0x0dbe54818526afaabbce83765eabcd4ec4d437a3497e5d046d599af862ea9850","transactionsRoot":"0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421","receiptsRoot":"0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421","logsBloom":"0x00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000","difficulty":"0x1","number":"0xa77220","gasLimit":"0x1312d00","gasUsed":"0x0","timestamp":"0x687f3701","extraData":"0x","mixHash":"0x0000000000000000000000000000000000000000000000000000000000000000","nonce":"0x0000000000000000","baseFeePerGas":"0xef4207","withdrawalsRoot":null,"blobGasUsed":null,"excessBlobGas":null,"parentBeaconBlockRoot":null,"requestsHash":null,"hash":"0xf27ff9223f6bf9a964737d50cb7c005f049cf0f4edfd16d24178a798c21716d6"}', '0x5a9bd7f5f6723ce51c03beffa310a5bf79c2cf261ddb8622cf407b41d968ef91', '0x0dbe54818526afaabbce83765eabcd4ec4d437a3497e5d046d599af862ea9850', '0', '0', '1753167617', '', '0x2f73e96335a43b678e107b2ef57c7ec0297d88d4a9986c1d6f4e31f1d11fb4f4', '[]');
INSERT INTO l2_block (number, hash, parent_hash, header, withdraw_root,
state_root, tx_num, gas_used, block_timestamp, row_consumption,
chunk_hash, transactions
) VALUES ('10973729', '0x2b7777eb3ffe5939d6b70883cef69250ef5a2ed62a8b378973e0c3fe84707137', '0xf27ff9223f6bf9a964737d50cb7c005f049cf0f4edfd16d24178a798c21716d6', '{"parentHash":"0xf27ff9223f6bf9a964737d50cb7c005f049cf0f4edfd16d24178a798c21716d6","sha3Uncles":"0x1dcc4de8dec75d7aab85b567b6ccd41ad312451b948a7413f0a142fd40d49347","miner":"0x0000000000000000000000000000000000000000","stateRoot":"0xb89ed319fb9dcaed2df7e72223683cf255f6c1e45742e6caa810938871ce53bf","transactionsRoot":"0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421","receiptsRoot":"0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421","logsBloom":"0x00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000","difficulty":"0x1","number":"0xa77221","gasLimit":"0x1312d00","gasUsed":"0x0","timestamp":"0x687f3702","extraData":"0x","mixHash":"0x0000000000000000000000000000000000000000000000000000000000000000","nonce":"0x0000000000000000","baseFeePerGas":"0xef4207","withdrawalsRoot":null,"blobGasUsed":null,"excessBlobGas":null,"parentBeaconBlockRoot":null,"requestsHash":null,"hash":"0x2b7777eb3ffe5939d6b70883cef69250ef5a2ed62a8b378973e0c3fe84707137"}', '0x5a9bd7f5f6723ce51c03beffa310a5bf79c2cf261ddb8622cf407b41d968ef91', '0xb89ed319fb9dcaed2df7e72223683cf255f6c1e45742e6caa810938871ce53bf', '0', '0', '1753167618', '', '0x2f73e96335a43b678e107b2ef57c7ec0297d88d4a9986c1d6f4e31f1d11fb4f4', '[]');
INSERT INTO l2_block (number, hash, parent_hash, header, withdraw_root,
state_root, tx_num, gas_used, block_timestamp, row_consumption,
chunk_hash, transactions
) VALUES ('10973730', '0x56318f0a941611fc22640ea7f7d0308ab88a9e23059b5c6983bafc2402003d13', '0x2b7777eb3ffe5939d6b70883cef69250ef5a2ed62a8b378973e0c3fe84707137', '{"parentHash":"0x2b7777eb3ffe5939d6b70883cef69250ef5a2ed62a8b378973e0c3fe84707137","sha3Uncles":"0x1dcc4de8dec75d7aab85b567b6ccd41ad312451b948a7413f0a142fd40d49347","miner":"0x0000000000000000000000000000000000000000","stateRoot":"0xe603d341e958521d3f5df8f37b5144b3c003214c481716cffa4e8d6303d9734f","transactionsRoot":"0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421","receiptsRoot":"0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421","logsBloom":"0x00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000","difficulty":"0x1","number":"0xa77222","gasLimit":"0x1312d00","gasUsed":"0x0","timestamp":"0x687f3703","extraData":"0x","mixHash":"0x0000000000000000000000000000000000000000000000000000000000000000","nonce":"0x0000000000000000","baseFeePerGas":"0xef4207","withdrawalsRoot":null,"blobGasUsed":null,"excessBlobGas":null,"parentBeaconBlockRoot":null,"requestsHash":null,"hash":"0x56318f0a941611fc22640ea7f7d0308ab88a9e23059b5c6983bafc2402003d13"}', '0x5a9bd7f5f6723ce51c03beffa310a5bf79c2cf261ddb8622cf407b41d968ef91', '0xe603d341e958521d3f5df8f37b5144b3c003214c481716cffa4e8d6303d9734f', '0', '0', '1753167619', '', '0x2f73e96335a43b678e107b2ef57c7ec0297d88d4a9986c1d6f4e31f1d11fb4f4', '[]');
-- +goose StatementEnd

-- +goose Down
-- +goose StatementBegin
DELETE FROM l2_block;
-- +goose StatementEnd
@@ -1,3 +1,4 @@
 BEGIN_BLOCK?=15206785
 END_BLOCK?=15206794
 SCROLL_FORK_NAME=galileo
+SCROLL_ZKVM_VERSION?=v0.7.1
@@ -1,3 +1,4 @@
-BEGIN_BLOCK?=20239245
-END_BLOCK?=20239250
+BEGIN_BLOCK?=17086000
+END_BLOCK?=17086005
 SCROLL_FORK_NAME=galileoV2
+SCROLL_ZKVM_VERSION?=v0.7.3-candidate-6
@@ -6,7 +6,7 @@
 "maxIdleNum": 1
 },
 "fetch_config": {
-"endpoint": "http://l2-sequencer-galileo-6.devnet.scroll.tech:8545",
+"endpoint": "http://l2geth-rpc-0.sepolia.scroll.tech:8545",
 "l2_message_queue_address": "0x5300000000000000000000000000000000000000"
 },
 "validium_mode": false,
@@ -26,7 +26,7 @@
 "validium_mode": false,
 "chain_id": 534351,
 "l2geth": {
-"endpoint": "<serach for a public rpc endpoint like alchemy>"
+"endpoint": "http://l2geth-rpc-0.sepolia.scroll.tech:8545/"
 }
 },
 "auth": {
zkvm-prover/.work/.gitignore (vendored)
@@ -1,9 +1,11 @@
 *.vmexe
+*.elf
+*.hex
 openvm.toml
 *.bin
 *.sol
 cache
-db
+db*
 *.json
 ?
 root-verifier*
@@ -1,5 +1,6 @@
 .PHONY: prover prover_cpu lint tests_binary test_e2e_run test_run
 
+RUST_LOG ?= off,scroll_zkvm_integration=debug,scroll_zkvm_verifier=debug,scroll_zkvm_prover=debug,p3_fri=warn,p3_dft=warn,openvm_circuit=warn
 RUST_MIN_STACK ?= 16777216
 export RUST_MIN_STACK
 
@@ -35,18 +36,16 @@ ZK_VERSION=${ZKVM_COMMIT}-${PLONKY3_VERSION}
 E2E_HANDLE_SET ?= ../tests/prover-e2e/testset.json
 DUMP_DIR ?= .work
 
-prover:
-	GO_TAG=${GO_TAG} GIT_REV=${GIT_REV} ZK_VERSION=${ZK_VERSION} cargo build --locked --release --features cuda -p prover
-
 version:
 	echo ${GO_TAG}-${GIT_REV}-${ZK_VERSION}
+	cargo clean --release -p scroll-proving-sdk
 
-prover_cpu:
+prover: version
+	GO_TAG=${GO_TAG} GIT_REV=${GIT_REV} ZK_VERSION=${ZK_VERSION} cargo build --locked --release --features cuda -p prover
+
+prover_cpu: version
 	GO_TAG=${GO_TAG} GIT_REV=${GIT_REV} ZK_VERSION=${ZK_VERSION} cargo build --locked --release -p prover
 
-clean:
-	cargo clean -Z unstable-options --release -p prover --lockfile-path ../crates/gpu_override/Cargo.lock
-
 tests_binary:
 	cargo clean && cargo test --release --no-run
 	ls ../target/release/deps/prover* | grep -v "\.d" | xargs -I{} ln -sf {} ./prover.test
@@ -56,12 +55,12 @@ lint:
 	cargo clippy --all-features --all-targets -- -D warnings
 	cargo fmt --all
 
-test_run:
+test_run: version
 	GO_TAG=${GO_TAG} GIT_REV=${GIT_REV} ZK_VERSION=${ZK_VERSION} cargo run --release -p prover -- --config ./config.json
 
-test_e2e_run: ${E2E_HANDLE_SET}
+test_e2e_run: ${E2E_HANDLE_SET} version
 	GO_TAG=${GO_TAG} GIT_REV=${GIT_REV} ZK_VERSION=${ZK_VERSION} cargo run --release -p prover -- --config ./config.json handle ${E2E_HANDLE_SET}
 
-test_e2e_run_gpu: ${E2E_HANDLE_SET}
+test_e2e_run_gpu: ${E2E_HANDLE_SET} version
 	GO_TAG=${GO_TAG} GIT_REV=${GIT_REV} ZK_VERSION=${ZK_VERSION} cargo run --release --features cuda -p prover -- --config ./config.json handle ${E2E_HANDLE_SET}
 
zkvm-prover/README.md (new file, 55 lines)
@@ -0,0 +1,55 @@
# Prover: the Scroll prover that generates zk proofs for rollup data (chunks, batches and bundles)

## Prepare

Use `config.json.template` to generate `config.json` for configuring the prover. The template content can usually be used as-is, with the following tuning:

+ Update the `sdk_config.coordinator.base_url` field to the coordinator URL. The most commonly used URLs are:
  * For the Sepolia testnet: `https://sepolia-coordinator.scroll.io`
  * For mainnet: `https://coordinator.scroll.io`
  * For a local e2e test: `http://localhost:8390`

+ For a new task (i.e. not restarting from a previously interrupted task), create a new directory under `.work` for a clean database and update the `sdk_config.db_path` field accordingly.

+ To test only a specific type of task (chunk, batch or bundle), specify the corresponding number in `sdk_config.prover.supported_proof_types`:
  * `1` for chunk
  * `2` for batch
  * `3` for bundle
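Putting those tuning steps together, a `config.json` for a local e2e test that only proves chunk tasks might look like the sketch below. Only the dotted field paths named above come from this README; the surrounding nesting and the `db_path` value are illustrative assumptions, not the authoritative template:

```json
{
  "sdk_config": {
    "coordinator": {
      "base_url": "http://localhost:8390"
    },
    "db_path": ".work/local-e2e-run",
    "prover": {
      "supported_proof_types": [1]
    }
  }
}
```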

### Build prover using CPU or GPU

+ `make prover` builds the prover binary making use of CUDA
+ `make prover_cpu` builds the prover binary using only the CPU for proving

## Run as service

Call the prover binary with `--config <path of config file>`; the binary will run as a proving service, continuously pulling tasks from the coordinator, proving them, and submitting the results back to the coordinator.
## Run for specified tasks

A batch of task ids can be specified for the prover to handle (even if they have been handled before).

Prepare a json file like the following:

```json
{
  "chunks": [
    <task id>,
    <task id>,
    ...
  ],
  "batches": [
    <task id>,
    <task id>,
    ...
  ],
  "bundles": [
    <task id>,
    <task id>,
    ...
  ]
}
```

Run the prover with the `handle` command, specifying the json file (suppose it is `workset.json` under the current dir):

`../target/release/prover --config config.json handle workset.json`
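Since the workset file is plain JSON, it can be generated with a small script. A minimal sketch in Python — the task-id values here are placeholders, and whether real ids are strings or integers depends on how the coordinator names its tasks:

```python
import json

# Group task ids by task type. The ids below are placeholder values
# for illustration, not real coordinator task ids.
workset = {
    "chunks": ["chunk_task_1", "chunk_task_2"],
    "batches": ["batch_task_1"],
    "bundles": [],  # no bundle tasks in this run
}

# Write the file consumed by `prover --config config.json handle workset.json`.
with open("workset.json", "w") as f:
    json.dump(workset, f, indent=2)
```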

## Dumping task data

The prover can dump a task to the working dir via the `dump` command; the task id and task type (chunk, batch or bundle) must be specified:

`../target/release/prover --config config.json dump <task type> <task id>`