Compare commits


14 Commits

| Author | SHA1 | Message | Date |
|--------|------|---------|------|
| David Sanders | fa8685b03c | chore: protect against destroyed LanguageModel | 2026-04-16 19:44:39 -07:00 |
| David Sanders | 8ee63bff71 | Merge branch 'main' into feat/prompt-api | 2026-04-16 17:59:00 -07:00 |
| David Sanders | bda9d26258 | Merge branch 'main' into feat/prompt-api | 2026-04-15 17:09:13 -07:00 |
| David Sanders | cd35dceb7e | chore: remove localAIHandler.setPromptAPIHandler(null) | 2026-04-15 17:08:42 -07:00 |
| David Sanders | 926ab5e979 | chore: store pending BindAIManager calls until utility process calls setPromptAPIHandler (Assisted-by: Claude Opus 4.6) | 2026-04-15 17:08:21 -07:00 |
| David Sanders | 08404a162e | chore: use cppgc::WeakMember instead | 2026-04-15 13:53:31 -07:00 |
| David Sanders | 16c44dffde | Merge branch 'main' into feat/prompt-api | 2026-04-14 18:18:53 -07:00 |
| David Sanders | 2f1d41bd7b | fixup! chore: refactor for cppgc managing api::UtilityProcessWrapper | 2026-04-14 17:59:28 -07:00 |
| David Sanders | d1068736d2 | chore: let setPromptAPIHandler return null to reject | 2026-04-14 11:12:01 -07:00 |
| David Sanders | 1cad72b89c | fix: prevent UAF with PromptResponder | 2026-04-13 23:05:40 -07:00 |
| David Sanders | 0c6286798a | chore: refactor for cppgc managing api::UtilityProcessWrapper (Changed in https://github.com/electron/electron/pull/50955/) | 2026-04-13 22:31:36 -07:00 |
| David Sanders | 1ad7f97717 | docs: address review feedback | 2026-04-13 20:55:37 -07:00 |
| David Sanders | 129ec3a468 | Merge branch 'main' into feat/prompt-api | 2026-04-13 20:53:46 -07:00 |
| David Sanders | 39aed69a33 | feat: implement the Prompt API via localAIHandler (Assisted-by: Claude Opus 4.6) | 2026-04-03 23:15:18 -07:00 |
216 changed files with 5333 additions and 3552 deletions

View File

@@ -1,106 +0,0 @@
---
name: chrome-release-cls
description: Given a Chrome Releases blog post URL (chromereleases.googleblog.com), extract every CVE/bug and find the underlying Gerrit CL that fixed it by searching the local Chromium checkout and sub-repos. Use when asked to map Chrome security release notes to fixing CLs, or to find which commits correspond to CVEs in a Chrome stable update.
---
# Chrome Release → Fixing CL Mapper
Maps every security fix in a Chrome Releases blog post to the Gerrit CL(s) that fixed it.
## Input
`$ARGUMENTS` — a `https://chromereleases.googleblog.com/...` URL. If empty, ask the user for one.
## Procedure
### 1. Extract CVE → bug ID pairs from the blog post
The blog HTML buries bug IDs inside `<a>` tags, so strip tags first. Run:
```bash
curl -sL "$URL" | python3 -c '
import sys, re, html
t = re.sub(r"<[^>]+>", " ", sys.stdin.read())
t = re.sub(r"\s+", " ", html.unescape(t))
seen = set()
for m in re.finditer(r"\[\s*(\d{6,})\s*\]\s*(Critical|High|Medium|Low)\s*(CVE-\d{4}-\d+):\s*([^.]+?)\.", t):
if m.group(3) in seen: continue
seen.add(m.group(3))
print(f"{m.group(3)}|{m.group(1)}|{m.group(2)}|{m.group(4).strip()}")
' > /tmp/cve_bugs.txt
cat /tmp/cve_bugs.txt
```
If this yields nothing, the page may have changed format — fall back to `grep -oE 'CVE-[0-9]{4}-[0-9]+'` and `grep -oE 'crbug\.com/[0-9]+'` and pair them by order.
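A minimal sketch of that order-based fallback pairing, run here against a fabricated sample string in place of the stripped blog text:

```shell
# Fallback pairing sketch: extract CVE IDs and bracketed bug IDs separately,
# then zip them by order of appearance. $sample stands in for the stripped
# blog text; real input comes from the curl pipeline above.
sample='[1234567] High CVE-2026-0001: Use after free in Foo. [7654321] Medium CVE-2026-0002: Heap buffer overflow in Bar.'
cves=$(printf '%s\n' "$sample" | grep -oE 'CVE-[0-9]{4}-[0-9]+')
bugs=$(printf '%s\n' "$sample" | grep -oE '\[[0-9]{6,}\]' | tr -d '[]')
pairs=$(paste -d'|' <(printf '%s\n' "$cves") <(printf '%s\n' "$bugs"))
printf '%s\n' "$pairs"
```

Order-based pairing silently misaligns if the page lists a bug without a CVE (or vice versa), so sanity-check the counts before trusting it.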
### 2. Find the fixing CL for each bug
Search git history in the Chromium checkout and relevant sub-repos for commits whose `Bug:` or `Fixed:` footer references the bug ID, then extract the `Reviewed-on:` Gerrit URL.
Repo selection by component keyword:
- ANGLE → `third_party/angle`
- Skia, Graphite → `third_party/skia`
- PDFium → `third_party/pdfium`
- Dawn → `third_party/dawn`
- V8, Turbofan, Maglev, Turboshaft → `v8`
- everything else → `.` (chromium/src)
Always also fall back to `.` if the hinted repo has no match.
```bash
cd /root/src/electron/src # chromium root (parent of electron/)
lookup() {
local bug="$1" repos="$2"
for repo in $repos . v8 third_party/skia third_party/angle third_party/pdfium third_party/dawn; do
local hits
hits=$(git -C "$repo" log --all --since='6 months ago' -E \
--grep="(Bug|Fixed):.*\\b${bug}\\b" --format='%H' 2>/dev/null | sort -u)
[[ -z "$hits" ]] && continue
while read -r h; do
git -C "$repo" log -1 --format='%B' "$h" | grep '^Reviewed-on:' | sed 's/^/ /'
echo "$(git -C "$repo" log -1 --format='%s' "$h")"
done <<<"$hits"
return 0
done
echo " (not found locally)"
}
```
Drive it from `/tmp/cve_bugs.txt`. Prefer the **non-`[M1xx]`-prefixed** commit subject as the canonical main CL; the `[M1xx]` ones are branch cherry-picks.
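One way to drive it (a sketch; the stub `lookup` below just echoes so the loop runs standalone, whereas the real `lookup` is the function defined above):

```shell
# Driver sketch over CVE|bug|severity|desc records. The stub lookup stands in
# for the real lookup() defined above so this is runnable on its own.
lookup() { echo "  (would search repos for bug $1)"; }
printf 'CVE-2026-0001|1234567|High|Use after free in Foo\n' > /tmp/cve_bugs.txt
while IFS='|' read -r cve bug sev desc; do
  echo "## $cve ($sev): $desc [https://crbug.com/$bug]"
  lookup "$bug" .
done < /tmp/cve_bugs.txt | tee /tmp/cve_cls.txt
```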
### 3. Handle misses
For any bug with no local hit:
- `git -C <repo> fetch origin` then re-search `--remotes` (fix may be newer than the checkout).
- Query Gerrit directly: `curl -s "https://chromium-review.googlesource.com/changes/?q=bug:${BUG}&n=10" | tail -n +2 | python3 -m json.tool` (also try `skia-review`, `pdfium-review`, `dawn-review`, `aomedia-review`).
- **`b/` bug format (Skia, Graphite, Dawn):** These repos reference bugs as `b/<id>` in commit messages rather than `Bug: <id>` footers. The Gerrit `bug:` query will return nothing. Use `message:<id>` search instead:
```bash
curl -s "https://skia-review.googlesource.com/changes/?q=message:${BUG}&n=5" | tail -n +2
```
Apply the same pattern for `dawn-review.googlesource.com` when the component is Dawn.
- **Tracing main CLs from merges:** When only `[M1xx]` merge CLs are found, query the CL detail for `cherry_pick_of_change` to find the original main CL number:
```bash
curl -s "https://chromium-review.googlesource.com/changes/${CL_NUM}?o=CURRENT_REVISION" | tail -n +2 | python3 -c "
import sys, json
d = json.load(sys.stdin)
print(d.get('cherry_pick_of_change', 'none'))
"
```
- If still nothing and the bug was reported very recently (especially by "Google Threat Intelligence" or marked in-the-wild), the CL is likely still access-restricted — report it as such rather than guessing.
### 4. Special cases
- **Roll CLs — skip and find the upstream fix:** For components whose fixes land in upstream repos (PDFium, Dawn, Skia, Graphite, libaom, libvpx, ffmpeg), the chromium-review hit will be a `Roll src/third_party/...` commit. Do not report the roll CL as the fix. Instead, query the component's own Gerrit instance directly for the actual fixing CL:
- PDFium → `pdfium-review.googlesource.com` (use `bug:` or `message:` query)
- Dawn → `dawn-review.googlesource.com` (use `message:` query — uses `b/` format)
- Skia / Graphite → `skia-review.googlesource.com` (use `message:` query — uses `b/` format)
- libaom → `aomedia-review.googlesource.com`
Only if the upstream Gerrit instance returns no results should you fall back to reporting the roll CL — in that case, include the roll CL and note that the actual fix is upstream but the specific CL could not be identified.
- Multiple `Reviewed-on:` lines in one commit body: cherry-picks keep the original line plus a new one. The **first** `Reviewed-on:` is the original CL.
- A bug may have multiple distinct fix CLs (fix + follow-up hardening) — list all of them.
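For example, pulling the original CL out of a cherry-picked commit body (the body below is a fabricated sample):

```shell
# The first Reviewed-on: line is the original CL; the second was appended by
# the cherry-pick. Sample body is fabricated.
body='fix: harden Foo against UAF
Bug: 1234567
Reviewed-on: https://chromium-review.googlesource.com/c/chromium/src/+/111111
Reviewed-on: https://chromium-review.googlesource.com/c/chromium/src/+/222222'
orig_cl=$(printf '%s\n' "$body" | grep '^Reviewed-on:' | head -n1 | awk '{print $2}')
echo "$orig_cl"
```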
### 5. Output
Produce a markdown table per severity level: `CVE | Bug | Component | Fix CL (main)`. Link bugs as `https://crbug.com/<id>`. Save raw output (including all branch merges) to `/tmp/cve_cls.txt` and mention the path.
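A sketch of rendering one table row from a record (field values fabricated; the fix-CL cell comes from the step 2 lookup results):

```shell
# Render one markdown row from a CVE|bug|severity|desc record. $sev selects
# which per-severity table the row belongs to; <fix CL> is a placeholder.
row='CVE-2026-0001|1234567|High|Use after free in Foo'
IFS='|' read -r cve bug sev desc <<<"$row"
md_row=$(printf '| %s | [%s](https://crbug.com/%s) | %s | <fix CL> |' "$cve" "$bug" "$bug" "$desc")
echo "$md_row"
```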

View File

@@ -1,123 +0,0 @@
---
name: chrome-release-verify
description: End-to-end Chrome security backport for an Electron release branch. Given a Chrome Releases blog URL and a branch (e.g. 41-x-y), determines which CVE fixes are missing from the *actual synced source*, writes the cherry-pick patches locally, validates them with `e sync --3` + `lint --patches`, then pushes a single PR. Use when asked to backport a Chrome security release to N-x-y, "is CVE-X already in N-x-y?", or to produce/validate the cherry-pick set for a release branch.
---
# Chrome Release → Validated Backport PR
Input: `$ARGUMENTS` = `<release-branch> <chrome-releases-blog-url>` (e.g. `41-x-y https://chromereleases.googleblog.com/2026/04/stable-channel-update-for-desktop_15.html`). Ask if either is missing.
The flow is **local-first**: nothing is pushed until every patch applies via `e sync --3` and passes `lint --patches`.
## 1. Map CVE → bug → fix CL
Run `/chrome-release-cls <blog-url>` (or its inline procedure) to produce `/tmp/cve_bugs.txt` (`CVE|bug|severity|desc`) and a per-bug canonical fix CL. For each CL also note `repo` (path under `src/`: `.`, `v8`, `third_party/{skia,angle,pdfium,dawn}`, `third_party/libaom/source/libaom`) and `gerrit-host`.
**Prefer the target-milestone merge CL** if one exists (e.g. on `41-x-y` ≈ M146, prefer the `[M146]` cherry-pick over the main CL) — it's already rebased and far less likely to conflict. Find it via `git log --all --grep` on the Change-Id, or Gerrit `?q=bug:<n>`. If Chrome did *not* merge a fix to the target milestone, that's a strong signal the vulnerable code doesn't exist there — flag it for skip rather than forcing a port.
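Gerrit JSON responses begin with an XSSI guard line; a sketch of stripping it and pulling the `Change-Id` to grep for locally (the response below is a fabricated sample in place of the live query):

```shell
# Strip Gerrit's ")]}'" XSSI prefix, then read change_id from the first hit.
# $resp stands in for: curl -s ".../changes/?q=bug:${BUG}"
resp=')]}'\''
[{"change_id": "I0123abcdef", "subject": "[M146] fix: harden Foo"}]'
cid=$(printf '%s\n' "$resp" | tail -n +2 |
  python3 -c 'import sys, json; print(json.load(sys.stdin)[0]["change_id"])')
echo "$cid"
# then: git log --all -E --grep="^Change-Id: ${cid}$" --format='%h %s'
```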
## 2. Prepare a synced worktree
Reuse `bp-<NN>` from `e show configs` if present, else `e worktree add bp-<NN> ~/src/electron-bp-<NN> --source <current> --no-sync`.
```bash
cd <root>/src/electron
git fetch origin <branch>
git checkout -B security-backport/<branch>/<short-date> origin/<branch>
e use bp-<NN>
e sync 2>&1 | tee /tmp/bp_sync.log
```
If sync fails with `NotADirectoryError: '<root>/src/.git/objects/info/alternates'`, remove `GIT_CACHE_PATH` from the bp config's `env` and retry.
## 3. Verify IN-TREE vs NEEDS-BACKPORT
For each bug, three checks against the **synced** repo:
1. `git -C "$repo" log HEAD --since='1 year ago' -E --grep="\b${bug}\b" --format='%h %s'`
2. Fetch Change-Id from Gerrit, then `git log HEAD --grep="^Change-Id: ${cid}$"`
3. `grep -rlE "(\b${bug}\b|${cid})" <root>/src/electron/patches/`
Any hit ⇒ IN-TREE. All empty ⇒ NEEDS-BACKPORT.
For each NEEDS-BACKPORT CL, also fetch its file list (`/changes/<proj>~<cl>/revisions/current/files`) and **skip** if every file is under `chrome/browser/`, `chrome/android/`, `ios/`, or `components/**/android/` — Electron doesn't compile those.
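A sketch of that Chrome-only skip check, run against a fabricated file map in place of the Gerrit files response:

```shell
# Verdict is "skip" when every changed path (ignoring /COMMIT_MSG) falls
# under directories Electron does not compile. $files is a fabricated sample
# of the /revisions/current/files response after stripping the XSSI prefix.
files='{"/COMMIT_MSG": {}, "chrome/browser/foo.cc": {}, "chrome/android/Bar.java": {}}'
verdict=$(printf '%s' "$files" | python3 -c '
import json, re, sys
skip = re.compile(r"^(chrome/browser/|chrome/android/|ios/|components/.*/android/)")
paths = [p for p in json.load(sys.stdin) if p != "/COMMIT_MSG"]
print("skip" if paths and all(skip.match(p) for p in paths) else "needs-backport")
')
echo "$verdict"
```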
Report the table now (`CVE | Sev | Bug | Component | Verdict | CL`) and the proposed backport set; get user sign-off before continuing.
## 4. Write patches locally (no push yet)
For each backport CL, fetch the raw patch and write it into `patches/<dir>/`:
```bash
curl -s "https://${host}.googlesource.com/changes/${proj//\//%2F}~${cl}/revisions/current/patch" \
| base64 -d > "patches/${dir}/cherry-pick-${short}.patch"
echo "cherry-pick-${short}.patch" >> "patches/${dir}/.patches"
```
For repos with no Gerrit host `e cherry-pick` supports (e.g. **libaom** on aomedia), instead `git cherry-pick` the upstream commits onto the synced sub-repo HEAD and `git format-patch` the result.
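The same flow in miniature against a throwaway repo (the real version fetches the fix from the component's upstream remote, e.g. aomedia, rather than a local branch; names below are all placeholders):

```shell
# Stand-in demo: cherry-pick a fix commit onto another branch, then
# format-patch it. No network access; repo and file names are placeholders.
repo=$(mktemp -d)
cd "$repo"
git init -q
git -c user.name=bp -c user.email=bp@example.com commit -q --allow-empty -m 'base (synced sub-repo HEAD)'
git checkout -qb upstream-fix
echo 'bounds check' > fix.c
git add fix.c
git -c user.name=bp -c user.email=bp@example.com commit -qm 'fix: add bounds check'
fix_sha=$(git rev-parse HEAD)
git checkout -q -   # back to the synced-HEAD stand-in
git -c user.name=bp -c user.email=bp@example.com cherry-pick "$fix_sha" >/dev/null
git format-patch -1 --stdout > "cherry-pick-${fix_sha:0:8}.patch"
grep -m1 '^Subject:' "cherry-pick-${fix_sha:0:8}.patch"
```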
For any newly-created `patches/<dir>/`, append to `patches/config.json` **preserving the compact one-line-per-entry style**:
```json
{ "patch_dir": "src/electron/patches/<dir>", "repo": "src/third_party/<dir-or-nested-path>" }
```
## 5. Validate with `e sync --3`
```bash
e sync --3 2>&1 | tee /tmp/bp_sync3.log
```
On `Patch failed at NNNN <subject>`:
- `cd` into the failing repo, inspect `git diff` for conflict markers.
- **Test-only files** (e.g. `web_tests/VirtualTestSuites`, `*_unittest.cc` context drift): take ours (`git checkout --ours -- <file>`) if the security-relevant hunks merged cleanly.
- **Substantive code conflicts**: check whether a target-milestone merge CL exists and swap to it. If none exists upstream and the surrounding code is structurally different, **drop the patch** (delete the file, remove from `.patches` and `config.json`) and note it for a separate manual-port PR — do not improvise security-fix semantics.
- After resolving: `git add <files> && git -c commit.gpgsign=false am --continue`, then `e patches <repo>` to export the resolved patch, then re-run `e sync --3`. Repeat until clean.
## 6. Export → lint → re-apply loop
```bash
e patches all
node script/lint.js --patches # must exit 0
```
If lint reports findings (typically trailing whitespace on `+` content lines), fixing them **changes the bytes the patch writes**, which invalidates the `index <old>..<new>` blob hashes that `e patches` baked in. Hand-editing a `.patch` and pushing it as-is will pass lint locally but fail CI's Apply Patches re-export check with a one-line `index` hash diff.
So whenever lint (or you) modifies any `.patch` file after export, round-trip once more:
```bash
# fix the lint findings in patches/**/*.patch, then:
e sync # re-apply the edited patches (no --3 needed; they applied cleanly last time)
e patches all # re-export so index blob hashes match the edited content
node script/lint.js --patches # must now exit 0
git diff --quiet -- patches/ || { echo "patches changed again — repeat the loop"; }
```
Repeat until `lint --patches` exits 0 **and** `git diff -- patches/` is empty after the final `e patches all`. Only then is the patch set CI-stable.
## 7. Commit, push, PR
```bash
git add patches/
git commit -m "chore: cherry-pick <N> changes from <dirs>"
git push origin HEAD
gh pr create --repo electron/electron --base <branch> --head <this-branch> \
--title "chore: cherry-pick <N> changes from <dirs>" \
--label "<branch>" --label backport-check-skip --label semver/patch --label "security 🔒" \
--body-file /tmp/pr_body.md
```
PR body format:
```markdown
Backports the following changes:
* [`<shortCommit>`](<gerrit-CL-url>) from <patchDir> — <subject> ([<bug>](https://crbug.com/<bug>), CVE-YYYY-NNNN)
* ...
Notes: Security: backported fixes for CVE-YYYY-NNNN, CVE-YYYY-NNNN, ....
```
Short commit links to the **Gerrit CL**; bug links to `crbug.com`; CVE comes from the blog mapping (the patch's own `Bug:` footer may differ); `Notes:` is the last line. Mention any dropped patches (with reason) above the `Notes:` line.
Restore `e use <previous>` when done.

View File

@@ -1,6 +1,6 @@
---
name: electron-node-upgrade
description: Guide for performing Node.js version upgrades in the Electron project. Use when working on the roller/node/main branch to fix patch conflicts during `e sync --3`. Covers the patch application workflow, conflict resolution, analyzing upstream Node.js changes, building, running the Node.js test suite, and proper commit formatting for patch fixes.
description: Guide for performing Node.js version upgrades in the Electron project. Use when working on the roller/node/main branch to fix patch conflicts during `e sync --3`. Covers the patch application workflow, conflict resolution, analyzing upstream Node.js changes, and proper commit formatting for patch fixes.
---
# Electron Node.js Upgrade: Phase One
@@ -174,127 +174,10 @@ When the error is in Electron's own source code:
1. Edit files directly in the electron repo
2. Commit directly (no patch export needed)
# Electron Node.js Upgrade: Phase Three
## Summary
Run the Node.js test suite via `script/node-spec-runner.js`, fix failing tests, and commit fixes until all tests pass. Certain tests are permanently disabled (listed in `script/node-disabled-tests.json`) and should not be run.
Run Phase Three immediately after Phase Two is complete.
## Success Criteria
Phase Three is complete when:
- `node script/node-spec-runner.js --default` exits with zero failures
- All changes are committed per the commit guidelines
Do not stop until these criteria are met.
## Context
Electron runs a subset of Node.js's upstream test suite using a custom runner (`script/node-spec-runner.js`). Tests are executed with the built Electron binary via `ELECTRON_RUN_AS_NODE=true`. Many tests need adaptation because Electron uses BoringSSL (not OpenSSL) and Chromium's V8 (which may differ from Node.js's bundled V8).
**Key files:**
- `script/node-spec-runner.js` — Test runner script
- `script/node-disabled-tests.json` — Permanently disabled tests (do not try to fix these)
- `../third_party/electron_node/test/` — Node.js test files (where patches apply)
- `patches/node/fix_crypto_tests_to_run_with_bssl.patch` — BoringSSL crypto test adaptations
- `patches/node/test_formally_mark_some_tests_as_flaky.patch` — Flaky test list
## Workflow
1. Run `node script/node-spec-runner.js --default` from the electron repo
2. If all tests pass → Phase Three is complete
3. If tests fail:
- Identify the failing test file(s) from the output
- Analyze each failure (see "Common Failure Patterns" below)
- Fix the test in `../third_party/electron_node/test/...`
- Re-run the specific failing test to verify: `node script/node-spec-runner.js {test-path}`
- The test path is relative to the node `test/` directory, e.g. `test/parallel/test-crypto-key-objects-raw.js`
- Do NOT use `--default` when running specific tests — it adds the full suite flags
- Do NOT run tests directly with `ELECTRON_RUN_AS_NODE` — the runner handles environment setup (e.g. temporarily switching `package.json` from ESM to CommonJS)
- Commit the fix using the fixup workflow and commit guidelines
- Return to step 1
## Commands Reference
| Command | Purpose |
|---------|---------|
| `node script/node-spec-runner.js --default` | Run full Node.js test suite |
| `node script/node-spec-runner.js test/parallel/test-foo.js` | Run a single test |
| `NODE_REGENERATE_SNAPSHOTS=1 node script/node-spec-runner.js test/test-runner/test-foo.mjs` | Regenerate snapshot for a snapshot-based test |
## Common Failure Patterns
### BoringSSL incompatibilities
Electron uses BoringSSL (via Chromium) instead of OpenSSL. Many crypto features are missing or behave differently:
| Unsupported in BoringSSL | Guard pattern |
|--------------------------|---------------|
| ChaCha20-Poly1305 | `if (!process.features.openssl_is_boringssl)` |
| AES-CCM (aes-128-ccm, aes-256-ccm) | `if (ciphers.includes('aes-128-ccm'))` |
| AES-KW (key wrapping) | `if (!process.features.openssl_is_boringssl)` |
| DSA keys | `if (!process.features.openssl_is_boringssl)` |
| Ed448 / X448 curves | `if (!process.features.openssl_is_boringssl)` |
| DH key PEM loading | `if (!process.features.openssl_is_boringssl)` |
| PQC algorithms (ML-KEM, ML-DSA, SLH-DSA) | `if (hasOpenSSL(3, 5))` (already guards these) |
When guarding tests, prefer checking cipher availability (`ciphers.includes(algo)`) over blanket BoringSSL checks where possible, as it's more precise and self-documenting.
New upstream tests that exercise these features will need guards added to the `fix_crypto_tests_to_run_with_bssl` patch.
### Snapshot test mismatches
Some tests compare output against committed `.snapshot` files using `assert.strictEqual` — these are NOT wildcard comparisons. When Chromium's V8 produces different output (e.g. different stack traces due to V8 enhancements), the snapshot must be regenerated:
```bash
NODE_REGENERATE_SNAPSHOTS=1 node script/node-spec-runner.js test/test-runner/test-foo.mjs
```
Then inspect the diff to verify the changes are expected, and commit the updated snapshot into the appropriate patch.
### V8 behavioral differences
Chromium's V8 may be ahead of Node.js's bundled V8. This can cause:
- Different stack trace formats (e.g. thenable async stack frames)
- Different error messages
- Features available in Chromium V8 that aren't in stock Node.js V8 (or vice versa)
## Two Types of Test Fixes
### A. Patch Fixes (most common for test failures)
Most test fixes go into existing patches in `patches/node/`. Use the fixup workflow:
1. Edit the test file in `../third_party/electron_node/test/...`
2. Find the relevant patch commit: `git log --oneline | grep -i "keyword"`
- Crypto/BoringSSL tests → `fix crypto tests to run with bssl`
- Snapshot tests → the specific snapshot patch (e.g. `test: accomodate V8 thenable`)
- Flaky tests → `test: formally mark some tests as flaky`
3. Create a fixup commit:
```bash
cd ../third_party/electron_node
git add test/path/to/test.js
git commit --fixup=<patch-commit-hash>
GIT_SEQUENCE_EDITOR=: git rebase --autosquash --autostash -i <commit>^
```
4. Export: `e patches node`
5. **Read `references/phase-three-commit-guidelines.md` NOW**, then commit the updated patch file.
### B. New Patches (rare)
Only create a new patch when the fix doesn't belong in any existing patch. The new patch commit in `../third_party/electron_node` must include a description explaining why the patch exists and when it can be removed — the lint check enforces this.
## Adding to Disabled Tests
Only add a test to `script/node-disabled-tests.json` as a **last resort** — when the test is fundamentally incompatible with Electron's architecture (not just a BoringSSL difference that can be guarded). Tests disabled here are completely skipped and never run.
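If a test truly must be disabled, a sketch of keeping the list valid (assuming the file is a flat JSON array of test paths; the test name below is a placeholder, and this runs against a temp copy):

```shell
# Add a placeholder test to a copy of the disabled-tests list, keeping it
# sorted, de-duplicated, and valid JSON.
f=$(mktemp)
printf '[\n  "parallel/test-already-disabled"\n]\n' > "$f"
python3 - "$f" <<'PY'
import json, sys
path = sys.argv[1]
tests = json.load(open(path))
tests.append("parallel/test-hypothetically-incompatible")
with open(path, "w") as out:
    json.dump(sorted(set(tests)), out, indent=2)
    out.write("\n")
PY
cat "$f"
```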
# Critical: Read Before Committing
- Before ANY Phase One commits: Read `references/phase-one-commit-guidelines.md`
- Before ANY Phase Two commits: Read `references/phase-two-commit-guidelines.md`
- Before ANY Phase Three commits: Read `references/phase-three-commit-guidelines.md`
# High-Churn Patches
@@ -318,6 +201,5 @@ This skill has additional reference files in `references/`:
- patch-analysis.md - How to analyze patch failures
- phase-one-commit-guidelines.md - Commit format for Phase One
- phase-two-commit-guidelines.md - Commit format for Phase Two
- phase-three-commit-guidelines.md - Commit format for Phase Three
Read these when referenced in the workflow steps.

View File

@@ -1,80 +0,0 @@
# Phase Three Commit Guidelines
Only follow these instructions if there are uncommitted changes after fixing a test failure during Phase Three.
Ignore other instructions about writing commit messages; these guidelines are CRITICALLY IMPORTANT and must be followed.
## Commit Message Style
**Titles** follow the 60/80-character guideline: simple changes fit within 60 characters, otherwise the limit is 80 characters.
Always include a `Co-Authored-By` trailer identifying the AI model that assisted (e.g., `Co-Authored-By: <AI model attribution>`).
## Commit Types
### Patch updates (most test fixes)
Test fixes go into existing patches via the fixup workflow. Use `fix(patch):` prefix with a descriptive topic:
```
fix(patch): {topic headline}
Ref: {Node.js commit or issue link}
Co-Authored-By: <AI model attribution>
```
Examples:
- `fix(patch): guard DH key test for BoringSSL`
- `fix(patch): adapt new crypto tests for BoringSSL`
- `fix(patch): correct thenable snapshot for Chromium V8`
- `fix(patch): skip AES-KW tests with BoringSSL`
Group related test fixes into a single commit when they address the same root cause (e.g., multiple crypto tests all needing BoringSSL guards for the same missing cipher). Don't create one commit per test file if they share the same fix pattern.
### Snapshot regeneration
When a snapshot test fails because Chromium's V8 produces different output, regenerate it:
```bash
NODE_REGENERATE_SNAPSHOTS=1 node script/node-spec-runner.js test/test-runner/test-foo.mjs
```
Then commit the updated snapshot patch with a title describing what changed:
```
fix(patch): correct {name} snapshot for Chromium V8
Ref: {V8 CL or issue link if known}
Co-Authored-By: <AI model attribution>
```
### Trivial patch updates
After any patch modification, check for dependent patches that only have index/hunk header changes:
```bash
git status
# If other .patch files show as modified with only trivial changes:
git add patches/
git commit -m "chore: update patches (trivial only)"
```
## Finding References
For BoringSSL-related test fixes, the reference is typically the upstream Node.js PR that added the new test:
```bash
cd ../third_party/electron_node
git log --oneline -5 -- test/parallel/test-crypto-foo.js
git log -1 <commit> --format="%B" | grep "PR-URL"
```
For V8 behavioral differences, reference the Chromium CL:
```
Ref: https://chromium-review.googlesource.com/c/v8/v8/+/NNNNNNN
```
If no reference is found after searching: `Ref: Unable to locate reference`

View File

@@ -0,0 +1,14 @@
name: Maintainer Issue (not for public use)
description: Only to be created by Electron maintainers
body:
- type: checkboxes
attributes:
label: Confirmation
options:
- label: I am a [maintainer](https://github.com/orgs/electron/people) of the Electron project. (If not, please create a [different issue type](https://github.com/electron/electron/issues/new/).)
required: true
- type: textarea
attributes:
label: Description
validations:
required: true

View File

@@ -101,31 +101,12 @@ runs:
git pack-refs
cd ..
# Pre-create the ThinLTO cache directory so lld-link does not need to
# call CreateDirectoryW through the bindflt filter driver, which can
# return ERROR_INVALID_PARAMETER under concurrent I/O on ARC runners.
# Discover the path from GN instead of hardcoding it so we stay in
# sync with `cache_dir` in build/config/compiler/BUILD.gn; skip the
# pre-create when ThinLTO is disabled (non-official builds).
$env:ELECTRON_DEPOT_TOOLS_DISABLE_LOG = "1"
$ltoFlag = e d gn desc out/Default //electron:electron_app ldflags 2>$null |
Select-String -Pattern '^/lldltocache:(.+)$' |
Select-Object -First 1
if ($ltoFlag) {
$cachePath = Join-Path 'out\Default' $ltoFlag.Matches[0].Groups[1].Value
New-Item -ItemType Directory -Force -Path $cachePath | Out-Null
}
$env:NINJA_SUMMARIZE_BUILD = 1
if ("${{ inputs.is-release }}" -eq "true") {
e build --target electron:release_build
} else {
e build --target electron:testing_build
}
if ($LASTEXITCODE -ne 0) {
Write-Host "e build failed with exit code $LASTEXITCODE"
exit $LASTEXITCODE
}
Copy-Item out\Default\.ninja_log out\electron_ninja_log
node electron\script\check-symlinks.js

View File

@@ -1,24 +0,0 @@
name: 'Build Image SHA'
description: 'Single source of truth for the ghcr.io/electron/build image SHA'
inputs:
override:
description: 'Optional override SHA (e.g. from a workflow_dispatch input)'
required: false
default: ''
outputs:
build-image-sha:
description: 'The electron/build image SHA to use'
value: ${{ steps.set.outputs.build-image-sha }}
runs:
using: 'composite'
steps:
- id: set
shell: bash
env:
OVERRIDE: ${{ inputs.override }}
run: |
if [ -n "$OVERRIDE" ]; then
echo "build-image-sha=$OVERRIDE" >> "$GITHUB_OUTPUT"
else
echo "build-image-sha=daad061f4b99a0ae1c841be4aa09188280a9c8a4" >> "$GITHUB_OUTPUT"
fi

View File

@@ -133,7 +133,7 @@ runs:
run : |
cd src/third_party/angle
rm -f .git/objects/info/alternates
git remote set-url origin https://github.com/google/angle.git
git remote set-url origin https://chromium.googlesource.com/angle/angle.git
cp .git/config .git/config.backup
git remote remove origin
mv .git/config.backup .git/config

View File

@@ -21,28 +21,11 @@ runs:
if [ "$TARGET_ARCH" = "x86" ]; then
export npm_config_arch="ia32"
fi
ARCH=$(uname -m)
node script/yarn.js install --immutable --mode=skip-build
# if running on linux arm, skip yarn builds
ARCH=$(uname -m)
if [ "$ARCH" = "armv7l" ]; then
echo "Skipping yarn build on linux arm"
node script/yarn.js install --immutable --mode=skip-build
else
# Pre-seed the node-gyp header cache so the parallel native-addon
# builds below don't race on a cold cache. Linux build containers
# already ship a warm cache (electron/build-images#68), so only do
# this on macOS / Windows runners.
if [ "$(uname -s)" != "Linux" ]; then
for i in 1 2 3; do
if node node_modules/node-gyp/bin/node-gyp.js install; then
break
fi
if [ "$i" = "3" ]; then
echo "node-gyp header pre-seed failed after 3 attempts" >&2
exit 1
fi
echo "node-gyp header pre-seed failed (attempt $i), retrying in 5s..." >&2
sleep 5
done
fi
node script/yarn.js install --immutable
fi

View File

@@ -1,132 +0,0 @@
From a8afee1089ec2ae9ab5837b438d07338aefb3bc4 Mon Sep 17 00:00:00 2001
From: Samuel Attard <sam@electronjs.org>
Date: Wed, 22 Apr 2026 16:27:51 -0700
Subject: [PATCH] siso: retry transient ERROR_INVALID_PARAMETER when opening
ninja files on Windows
ManifestParser.Load fans out across all subninja files (~90k in a
Chromium build) at NumCPU parallelism. On Windows builders where out/
is served through a filesystem filter driver (e.g. bindflt/wcifs for
container bind mounts), CreateFileW can intermittently return
ERROR_INVALID_PARAMETER under this concurrent open burst. The previous
patch removes the redundant per-chunk re-open, but the single remaining
open per file can still hit the race; without a retry a single transient
failure aborts the entire manifest load.
Wrap the remaining os.Open call in readFile in a small Windows-only
retry for ERROR_INVALID_PARAMETER (5 attempts, 5-80ms backoff). Each
retry is logged via clog.Warningf and also written to stderr so it is
visible in CI step output where glog warnings are file-only by default.
Other platforms keep the direct os.Open path.
---
siso/toolsupport/ninjautil/file_parser.go | 3 +-
siso/toolsupport/ninjautil/openfile_other.go | 18 +++++++
.../toolsupport/ninjautil/openfile_windows.go | 50 +++++++++++++++++++
3 files changed, 69 insertions(+), 2 deletions(-)
create mode 100644 siso/toolsupport/ninjautil/openfile_other.go
create mode 100644 siso/toolsupport/ninjautil/openfile_windows.go
diff --git a/siso/toolsupport/ninjautil/file_parser.go b/siso/toolsupport/ninjautil/file_parser.go
index 6311666..324528d 100644
--- a/siso/toolsupport/ninjautil/file_parser.go
+++ b/siso/toolsupport/ninjautil/file_parser.go
@@ -7,7 +7,6 @@ package ninjautil
import (
"context"
"fmt"
- "os"
"runtime/trace"
"sync"
"time"
@@ -91,7 +90,7 @@ func (p *fileParser) parseFile(ctx context.Context, fname string) error {
// readFile reads a file of fname in parallel.
func (p *fileParser) readFile(ctx context.Context, fname string) ([]byte, error) {
defer trace.StartRegion(ctx, "ninja.read").End()
- f, err := os.Open(fname)
+ f, err := openFile(ctx, fname)
if err != nil {
return nil, err
}
diff --git a/siso/toolsupport/ninjautil/openfile_other.go b/siso/toolsupport/ninjautil/openfile_other.go
new file mode 100644
index 0000000..9fca690
--- /dev/null
+++ b/siso/toolsupport/ninjautil/openfile_other.go
@@ -0,0 +1,18 @@
+// Copyright 2026 The Chromium Authors
+// Use of this source code is governed by a BSD-style license that can be
+// found in the LICENSE file.
+
+//go:build !windows
+
+package ninjautil
+
+import (
+ "context"
+ "os"
+)
+
+// openFile opens fname for reading.
+// See openfile_windows.go for the Windows variant with transient-error retry.
+func openFile(ctx context.Context, fname string) (*os.File, error) {
+ return os.Open(fname)
+}
diff --git a/siso/toolsupport/ninjautil/openfile_windows.go b/siso/toolsupport/ninjautil/openfile_windows.go
new file mode 100644
index 0000000..f9d8e9d
--- /dev/null
+++ b/siso/toolsupport/ninjautil/openfile_windows.go
@@ -0,0 +1,50 @@
+// Copyright 2026 The Chromium Authors
+// Use of this source code is governed by a BSD-style license that can be
+// found in the LICENSE file.
+
+//go:build windows
+
+package ninjautil
+
+import (
+ "context"
+ "errors"
+ "fmt"
+ "os"
+ "time"
+
+ "golang.org/x/sys/windows"
+
+ "go.chromium.org/build/siso/o11y/clog"
+)
+
+// openFile opens fname for reading, retrying transient
+// ERROR_INVALID_PARAMETER failures.
+//
+// On Windows, CreateFileW can intermittently return
+// ERROR_INVALID_PARAMETER when the target lives behind a filesystem
+// filter driver (e.g. bindflt/wcifs for container bind mounts) under
+// highly concurrent opens. loadFile fans out across ~90k subninja
+// files at NumCPU parallelism, so a single transient failure would
+// otherwise abort the whole manifest load.
+func openFile(ctx context.Context, fname string) (*os.File, error) {
+ const maxAttempts = 5
+ delay := 5 * time.Millisecond
+ for i := 0; ; i++ {
+ f, err := os.Open(fname)
+ if err == nil {
+ return f, nil
+ }
+ if i+1 >= maxAttempts || !errors.Is(err, windows.ERROR_INVALID_PARAMETER) {
+ return nil, err
+ }
+ clog.Warningf(ctx, "open %s: %v; retrying (%d/%d) after %s", fname, err, i+1, maxAttempts, delay)
+ fmt.Fprintf(os.Stderr, "siso: open %s: %v; retrying (%d/%d) after %s\n", fname, err, i+1, maxAttempts, delay)
+ select {
+ case <-time.After(delay):
+ case <-ctx.Done():
+ return nil, context.Cause(ctx)
+ }
+ delay *= 2
+ }
+}
--
2.53.0

View File

@@ -18,7 +18,6 @@ jobs:
pull-requests: read
outputs:
has-patches: ${{ steps.filter.outputs.patches }}
build-image-sha: ${{ steps.build-image-sha.outputs.build-image-sha }}
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd
with:
@@ -34,9 +33,6 @@ jobs:
patches:
- DEPS
- 'patches/**'
- name: Set Build Image SHA
id: build-image-sha
uses: ./.github/actions/build-image-sha
apply-patches:
needs: setup
@@ -45,7 +41,7 @@ jobs:
permissions:
contents: read
container:
image: ghcr.io/electron/build:${{ needs.setup.outputs.build-image-sha }}
image: ghcr.io/electron/build:eac3529546ea8f3aa356d31e345715eef342233b
options: --user root
volumes:
- /mnt/cross-instance-cache:/mnt/cross-instance-cache


@@ -17,7 +17,7 @@ jobs:
with:
fetch-depth: 0
- name: Setup Node.js/npm
uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e
uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f
with:
node-version: 24.12.x
- name: Setting Up Dig Site


@@ -17,7 +17,7 @@ jobs:
contents: read
steps:
- name: Setup Node.js
uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v6.4.0
uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f # v6.3.0
with:
node-version: 22.17.x
- name: Sparse checkout repository


@@ -157,7 +157,7 @@ jobs:
}))
- name: Create Release Project Board
if: ${{ steps.check-major-version.outputs.MAJOR }}
uses: dsanders11/project-actions/copy-project@4b06452b0128cf601dac14399aa668a8eed2d684 # v2.0.1
uses: dsanders11/project-actions/copy-project@5767984408ccc6742f83acc8b8d8ea5e09f329af # v2.0.0
id: create-release-board
with:
drafts: true
@@ -170,60 +170,6 @@ jobs:
template-view: ${{ steps.generate-project-metadata.outputs.template-view }}
title: ${{ steps.generate-project-metadata.outputs.major }}-x-y
token: ${{ steps.generate-token.outputs.token }}
- name: Randomly Assign Draft Issues to Release WG Members
if: ${{ steps.check-major-version.outputs.MAJOR }}
uses: dsanders11/project-actions/github-script@4b06452b0128cf601dac14399aa668a8eed2d684 # v2.0.1
env:
PROJECT_ID: ${{ steps.create-release-board.outputs.id }}
with:
token: ${{ steps.generate-token.outputs.token }}
script: |
const { data: members } = await github.rest.teams.listMembersInOrg({
org: 'electron',
team_slug: 'wg-releases',
});
const excludedLogins = ['nikwen'];
const memberLogins = new Set(members.map(m => m.login));
for (const login of excludedLogins) {
if (!memberLogins.has(login)) {
core.warning(`Excluded member "${login}" is not in @electron/wg-releases`);
}
}
const eligible = members.filter(m => !excludedLogins.includes(m.login));
if (eligible.length === 0) {
core.warning('No eligible members found in @electron/wg-releases team');
return;
}
const projectId = process.env.PROJECT_ID;
const draftIssues = await actions.getDraftIssues(projectId);
if (draftIssues.length === 0) {
core.info('No draft issues found in the project');
return;
}
// Fisher-Yates shuffle for uniform random assignment
const shuffled = [...eligible];
for (let i = shuffled.length - 1; i > 0; i--) {
const j = Math.floor(Math.random() * (i + 1));
[shuffled[i], shuffled[j]] = [shuffled[j], shuffled[i]];
}
// Assign draft issues round-robin across team members
for (let i = 0; i < draftIssues.length; i++) {
const member = shuffled[i % shuffled.length];
const draftIssue = draftIssues[i];
core.info(`Assigning "${draftIssue.content.title}" to ${member.login}`);
await actions.editItem(projectId, draftIssue.content.id, {
assignees: [member.login],
});
}
core.info(`Assigned ${draftIssues.length} draft issues to ${eligible.length} team members`);
- name: Dump Release Project Board Contents
if: ${{ steps.check-major-version.outputs.MAJOR }}
run: gh project item-list ${{ steps.create-release-board.outputs.number }} --owner electron --format json | jq
@@ -231,7 +177,7 @@ jobs:
GITHUB_TOKEN: ${{ steps.generate-token.outputs.token }}
- name: Find Previous Release Project Board
if: ${{ steps.check-major-version.outputs.MAJOR }}
uses: dsanders11/project-actions/find-project@4b06452b0128cf601dac14399aa668a8eed2d684 # v2.0.1
uses: dsanders11/project-actions/find-project@5767984408ccc6742f83acc8b8d8ea5e09f329af # v2.0.0
id: find-prev-release-board
with:
fail-if-project-not-found: false
@@ -239,7 +185,7 @@ jobs:
token: ${{ steps.generate-token.outputs.token }}
- name: Close Previous Release Project Board
if: ${{ steps.find-prev-release-board.outputs.number }}
uses: dsanders11/project-actions/close-project@4b06452b0128cf601dac14399aa668a8eed2d684 # v2.0.1
uses: dsanders11/project-actions/close-project@5767984408ccc6742f83acc8b8d8ea5e09f329af # v2.0.0
with:
project-number: ${{ steps.find-prev-release-board.outputs.number }}
token: ${{ steps.generate-token.outputs.token }}


@@ -2,38 +2,25 @@ name: Build Git Cache
# This workflow updates git cache on the cross-instance cache volumes
# It runs daily at midnight.
on:
on:
schedule:
- cron: "0 0 * * *"
permissions: {}
jobs:
setup:
if: github.repository == 'electron/electron'
runs-on: ubuntu-slim
permissions:
contents: read
outputs:
build-image-sha: ${{ steps.build-image-sha.outputs.build-image-sha }}
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd
- name: Set Build Image SHA
id: build-image-sha
uses: ./.github/actions/build-image-sha
build-git-cache-linux:
needs: setup
if: github.repository == 'electron/electron'
runs-on: electron-arc-centralus-linux-amd64-32core
permissions:
contents: read
container:
image: ghcr.io/electron/build:${{ needs.setup.outputs.build-image-sha }}
image: ghcr.io/electron/build:eac3529546ea8f3aa356d31e345715eef342233b
options: --user root
volumes:
- /mnt/cross-instance-cache:/mnt/cross-instance-cache
env:
CHROMIUM_GIT_COOKIE: ${{ secrets.CHROMIUM_GIT_COOKIE }}
CHROMIUM_GIT_COOKIE: ${{ secrets.CHROMIUM_GIT_COOKIE }}
GCLIENT_EXTRA_ARGS: '--custom-var=checkout_arm=True --custom-var=checkout_arm64=True'
steps:
- name: Checkout Electron
@@ -47,12 +34,12 @@ jobs:
target-platform: linux
build-git-cache-windows:
needs: setup
if: github.repository == 'electron/electron'
runs-on: electron-arc-centralus-linux-amd64-32core
permissions:
contents: read
container:
image: ghcr.io/electron/build:${{ needs.setup.outputs.build-image-sha }}
image: ghcr.io/electron/build:eac3529546ea8f3aa356d31e345715eef342233b
options: --user root --device /dev/fuse --cap-add SYS_ADMIN
volumes:
- /mnt/win-cache:/mnt/win-cache
@@ -72,13 +59,14 @@ jobs:
target-platform: win
build-git-cache-macos:
# This job updates the same git cache as linux, so it needs to run after the linux one.
needs: [setup, build-git-cache-linux]
if: github.repository == 'electron/electron'
runs-on: electron-arc-centralus-linux-amd64-32core
permissions:
contents: read
# This job updates the same git cache as linux, so it needs to run after the linux one.
needs: build-git-cache-linux
container:
image: ghcr.io/electron/build:${{ needs.setup.outputs.build-image-sha }}
image: ghcr.io/electron/build:eac3529546ea8f3aa356d31e345715eef342233b
options: --user root
volumes:
- /mnt/cross-instance-cache:/mnt/cross-instance-cache
@@ -94,4 +82,4 @@ jobs:
- name: Build Git Cache
uses: ./src/electron/.github/actions/build-git-cache
with:
target-platform: macos
target-platform: macos


@@ -6,8 +6,8 @@ on:
build-image-sha:
type: string
description: 'SHA for electron/build image'
default: ''
required: false
default: 'eac3529546ea8f3aa356d31e345715eef342233b'
required: true
skip-macos:
type: boolean
description: 'Skip macOS builds'
@@ -48,14 +48,14 @@ permissions: {}
jobs:
setup:
if: github.repository == 'electron/electron'
runs-on: ubuntu-slim
runs-on: ubuntu-latest
permissions:
contents: read
pull-requests: read
outputs:
docs: ${{ steps.filter.outputs.docs }}
src: ${{ steps.filter.outputs.src }}
build-image-sha: ${{ steps.build-image-sha.outputs.build-image-sha }}
build-image-sha: ${{ steps.set-output.outputs.build-image-sha }}
docs-only: ${{ steps.set-output.outputs.docs-only }}
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd
@@ -67,21 +67,20 @@ jobs:
filters: |
docs:
- 'docs/**'
- '.claude/**'
- README.md
- SECURITY.md
- CONTRIBUTING.md
- CODE_OF_CONDUCT.md
src:
- '!{docs,.claude}/**'
- name: Set Build Image SHA
id: build-image-sha
uses: ./.github/actions/build-image-sha
with:
override: ${{ inputs.build-image-sha }}
- name: Set Docs Only
- '!docs/**'
- name: Set Outputs for Build Image SHA & Docs Only
id: set-output
run: |
if [ -z "${{ inputs.build-image-sha }}" ]; then
echo "build-image-sha=eac3529546ea8f3aa356d31e345715eef342233b" >> "$GITHUB_OUTPUT"
else
echo "build-image-sha=${{ inputs.build-image-sha }}" >> "$GITHUB_OUTPUT"
fi
echo "docs-only=${{ steps.filter.outputs.docs == 'true' && steps.filter.outputs.src == 'false' }}" >> "$GITHUB_OUTPUT"
# Lint Jobs
@@ -275,11 +274,10 @@ jobs:
contents: read
issues: read
pull-requests: read
uses: ./.github/workflows/pipeline-electron-build-and-tidy-and-test.yml
uses: ./.github/workflows/pipeline-electron-build-and-test.yml
needs: checkout-macos
with:
build-runs-on: macos-15-xlarge
clang-tidy-runs-on: macos-15-large
test-runs-on: macos-15
target-platform: macos
target-arch: arm64
@@ -394,14 +392,12 @@ jobs:
contents: read
issues: read
pull-requests: read
uses: ./.github/workflows/pipeline-electron-build-and-tidy-and-test.yml
uses: ./.github/workflows/pipeline-electron-build-and-test.yml
needs: [checkout-windows, build-siso-windows]
if: ${{ needs.setup.outputs.src == 'true' && !inputs.skip-windows }}
with:
build-runs-on: electron-arc-centralus-windows-amd64-32core
clang-tidy-runs-on: electron-arc-centralus-linux-amd64-8core
build-runs-on: electron-arc-centralus-windows-amd64-16core
test-runs-on: windows-latest
clang-tidy-container: '{"image":"ghcr.io/electron/build:${{ needs.checkout-windows.outputs.build-image-sha }}","options":"--user root --device /dev/fuse --cap-add SYS_ADMIN","volumes":["/mnt/win-cache:/mnt/win-cache"]}'
target-platform: win
target-arch: x64
is-release: false
@@ -419,7 +415,7 @@ jobs:
needs: [checkout-windows, build-siso-windows]
if: ${{ needs.setup.outputs.src == 'true' && !inputs.skip-windows }}
with:
build-runs-on: electron-arc-centralus-windows-amd64-32core
build-runs-on: electron-arc-centralus-windows-amd64-16core
test-runs-on: windows-latest
target-platform: win
target-arch: x86
@@ -438,7 +434,7 @@ jobs:
needs: [checkout-windows, build-siso-windows]
if: ${{ needs.setup.outputs.src == 'true' && !inputs.skip-windows }}
with:
build-runs-on: electron-arc-centralus-windows-amd64-32core
build-runs-on: electron-arc-centralus-windows-amd64-16core
test-runs-on: windows-11-arm
target-platform: win
target-arch: arm64


@@ -13,26 +13,13 @@ on:
permissions: {}
jobs:
setup:
if: github.repository == 'electron/electron'
runs-on: ubuntu-slim
permissions:
contents: read
outputs:
build-image-sha: ${{ steps.build-image-sha.outputs.build-image-sha }}
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd
- name: Set Build Image SHA
id: build-image-sha
uses: ./.github/actions/build-image-sha
clean-orphaned-uploads:
needs: setup
if: github.repository == 'electron/electron'
runs-on: electron-arc-centralus-linux-amd64-32core
permissions:
contents: read
container:
image: ghcr.io/electron/build:${{ needs.setup.outputs.build-image-sha }}
image: ghcr.io/electron/build:bc2f48b2415a670de18d13605b1cf0eb5fdbaae1
options: --user root
volumes:
- /mnt/cross-instance-cache:/mnt/cross-instance-cache


@@ -12,28 +12,15 @@ on:
permissions: {}
jobs:
setup:
if: github.repository == 'electron/electron'
runs-on: ubuntu-slim
permissions:
contents: read
outputs:
build-image-sha: ${{ steps.build-image-sha.outputs.build-image-sha }}
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd
- name: Set Build Image SHA
id: build-image-sha
uses: ./.github/actions/build-image-sha
clean-src-cache:
needs: setup
if: github.repository == 'electron/electron'
runs-on: electron-arc-centralus-linux-amd64-32core
permissions:
contents: read
env:
DD_API_KEY: ${{ secrets.DD_API_KEY }}
container:
image: ghcr.io/electron/build:${{ needs.setup.outputs.build-image-sha }}
image: ghcr.io/electron/build:bc2f48b2415a670de18d13605b1cf0eb5fdbaae1
options: --user root
volumes:
- /mnt/cross-instance-cache:/mnt/cross-instance-cache


@@ -21,7 +21,7 @@ jobs:
creds: ${{ secrets.ISSUE_TRIAGE_GH_APP_CREDS }}
org: electron
- name: Set status
uses: dsanders11/project-actions/edit-item@4b06452b0128cf601dac14399aa668a8eed2d684 # v2.0.1
uses: dsanders11/project-actions/edit-item@5767984408ccc6742f83acc8b8d8ea5e09f329af # v2.0.0
with:
token: ${{ steps.generate-token.outputs.token }}
project-number: 90
@@ -42,7 +42,7 @@ jobs:
creds: ${{ secrets.ISSUE_TRIAGE_GH_APP_CREDS }}
org: electron
- name: Set status
uses: dsanders11/project-actions/edit-item@4b06452b0128cf601dac14399aa668a8eed2d684 # v2.0.1
uses: dsanders11/project-actions/edit-item@5767984408ccc6742f83acc8b8d8ea5e09f329af # v2.0.0
with:
token: ${{ steps.generate-token.outputs.token }}
project-number: 90


@@ -20,7 +20,7 @@ jobs:
creds: ${{ secrets.ISSUE_TRIAGE_GH_APP_CREDS }}
org: electron
- name: Add to Issue Triage
uses: dsanders11/project-actions/add-item@4b06452b0128cf601dac14399aa668a8eed2d684 # v2.0.1
uses: dsanders11/project-actions/add-item@5767984408ccc6742f83acc8b8d8ea5e09f329af # v2.0.0
with:
field: Reporter
field-value: ${{ github.event.issue.user.login }}


@@ -20,7 +20,7 @@ jobs:
creds: ${{ secrets.ISSUE_TRIAGE_GH_APP_CREDS }}
org: electron
- name: Remove from issue triage
uses: dsanders11/project-actions/delete-item@4b06452b0128cf601dac14399aa668a8eed2d684 # v2.0.1
uses: dsanders11/project-actions/delete-item@5767984408ccc6742f83acc8b8d8ea5e09f329af # v2.0.0
with:
token: ${{ steps.generate-token.outputs.token }}
project-number: 90


@@ -33,7 +33,7 @@ jobs:
org: electron
- name: Set status
if: ${{ steps.check-for-blocked-labels.outputs.NOT_BLOCKED }}
uses: dsanders11/project-actions/edit-item@4b06452b0128cf601dac14399aa668a8eed2d684 # v2.0.1
uses: dsanders11/project-actions/edit-item@5767984408ccc6742f83acc8b8d8ea5e09f329af # v2.0.0
with:
token: ${{ steps.generate-token.outputs.token }}
project-number: 90


@@ -6,8 +6,7 @@ on:
build-image-sha:
type: string
description: 'SHA for electron/build image'
default: ''
required: false
default: 'eac3529546ea8f3aa356d31e345715eef342233b'
upload-to-storage:
description: 'Uploads to Azure storage'
required: false
@@ -21,28 +20,13 @@ on:
permissions: {}
jobs:
setup:
if: github.repository == 'electron/electron'
runs-on: ubuntu-slim
permissions:
contents: read
outputs:
build-image-sha: ${{ steps.build-image-sha.outputs.build-image-sha }}
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd
- name: Set Build Image SHA
id: build-image-sha
uses: ./.github/actions/build-image-sha
with:
override: ${{ inputs.build-image-sha }}
checkout-linux:
needs: setup
if: github.repository == 'electron/electron'
runs-on: electron-arc-centralus-linux-amd64-32core
permissions:
contents: read
container:
image: ghcr.io/electron/build:${{ needs.setup.outputs.build-image-sha }}
image: ghcr.io/electron/build:${{ inputs.build-image-sha }}
options: --user root
volumes:
- /mnt/cross-instance-cache:/mnt/cross-instance-cache
@@ -66,11 +50,11 @@ jobs:
attestations: write
contents: read
id-token: write
needs: [setup, checkout-linux]
needs: checkout-linux
with:
environment: production-release
build-runs-on: electron-arc-centralus-linux-amd64-32core
build-container: '{"image":"ghcr.io/electron/build:${{ needs.setup.outputs.build-image-sha }}","options":"--user root","volumes":["/mnt/cross-instance-cache:/mnt/cross-instance-cache"]}'
build-container: '{"image":"ghcr.io/electron/build:${{ inputs.build-image-sha }}","options":"--user root","volumes":["/mnt/cross-instance-cache:/mnt/cross-instance-cache"]}'
target-platform: linux
target-arch: x64
is-release: true
@@ -86,11 +70,11 @@ jobs:
attestations: write
contents: read
id-token: write
needs: [setup, checkout-linux]
needs: checkout-linux
with:
environment: production-release
build-runs-on: electron-arc-centralus-linux-amd64-32core
build-container: '{"image":"ghcr.io/electron/build:${{ needs.setup.outputs.build-image-sha }}","options":"--user root","volumes":["/mnt/cross-instance-cache:/mnt/cross-instance-cache"]}'
build-container: '{"image":"ghcr.io/electron/build:${{ inputs.build-image-sha }}","options":"--user root","volumes":["/mnt/cross-instance-cache:/mnt/cross-instance-cache"]}'
target-platform: linux
target-arch: arm
is-release: true
@@ -106,11 +90,11 @@ jobs:
attestations: write
contents: read
id-token: write
needs: [setup, checkout-linux]
needs: checkout-linux
with:
environment: production-release
build-runs-on: electron-arc-centralus-linux-amd64-32core
build-container: '{"image":"ghcr.io/electron/build:${{ needs.setup.outputs.build-image-sha }}","options":"--user root","volumes":["/mnt/cross-instance-cache:/mnt/cross-instance-cache"]}'
build-container: '{"image":"ghcr.io/electron/build:${{ inputs.build-image-sha }}","options":"--user root","volumes":["/mnt/cross-instance-cache:/mnt/cross-instance-cache"]}'
target-platform: linux
target-arch: arm64
is-release: true


@@ -6,8 +6,8 @@ on:
build-image-sha:
type: string
description: 'SHA for electron/build image'
default: ''
required: false
default: 'eac3529546ea8f3aa356d31e345715eef342233b'
required: true
upload-to-storage:
description: 'Uploads to Azure storage'
required: false
@@ -21,28 +21,13 @@ on:
permissions: {}
jobs:
setup:
if: github.repository == 'electron/electron'
runs-on: ubuntu-slim
permissions:
contents: read
outputs:
build-image-sha: ${{ steps.build-image-sha.outputs.build-image-sha }}
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd
- name: Set Build Image SHA
id: build-image-sha
uses: ./.github/actions/build-image-sha
with:
override: ${{ inputs.build-image-sha }}
checkout-macos:
needs: setup
if: github.repository == 'electron/electron'
runs-on: electron-arc-centralus-linux-amd64-32core
permissions:
contents: read
container:
image: ghcr.io/electron/build:${{ needs.setup.outputs.build-image-sha }}
image: ghcr.io/electron/build:${{ inputs.build-image-sha }}
options: --user root
volumes:
- /mnt/cross-instance-cache:/mnt/cross-instance-cache


@@ -123,7 +123,7 @@ jobs:
run: df -h
- name: Setup Node.js/npm
if: ${{ inputs.target-platform == 'macos' }}
uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e
uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f
with:
node-version: 22.21.x
cache: yarn


@@ -137,32 +137,13 @@ jobs:
run: |
e init -f --root=$(pwd) --out=${ELECTRON_OUT_DIR} testing --target-cpu ${TARGET_ARCH} --remote-build none
# For macOS use_remoteexec=false will cause GN errors, so even though we're doing no remote build, set it
export GN_EXTRA_ARGS="use_remoteexec=true target_cpu=\"${TARGET_ARCH}\""
export GN_EXTRA_ARGS="target_cpu=\"${TARGET_ARCH}\""
if [ "${{ inputs.target-platform }}" = "win" ]; then
export GN_EXTRA_ARGS="$GN_EXTRA_ARGS use_v8_context_snapshot=true target_os=\"win\""
fi
e build --only-gen
# Copy macOS framework headers so clang-tidy can find them via -F.
# This must happen after e build --only-gen since e init -f may
# recreate the output directory.
if [ "${{ inputs.target-platform }}" = "macos" ]; then
OUT=src/out/${ELECTRON_OUT_DIR}
SQRL=src/third_party/squirrel.mac
mkdir -p ${OUT}/{ReactiveObjC,Squirrel,Mantle}.framework/Headers
cp ${SQRL}/vendor/ReactiveObjC/ReactiveObjC/*.h ${OUT}/ReactiveObjC.framework/Headers/
cp ${SQRL}/vendor/ReactiveObjC/ReactiveObjC/extobjc/*.h ${OUT}/ReactiveObjC.framework/Headers/
cp ${SQRL}/Squirrel/*.h ${OUT}/Squirrel.framework/Headers/
cp ${SQRL}/vendor/Mantle/Mantle/include/*.h ${OUT}/Mantle.framework/Headers/
cp ${SQRL}/vendor/Mantle/Mantle/extobjc/include/*.h ${OUT}/Mantle.framework/Headers/
fi
cd src/electron
node script/yarn.js lint:clang-tidy --jobs 8 --out-dir ../out/${ELECTRON_OUT_DIR}
- name: Remove Clang problem matcher


@@ -131,7 +131,7 @@ jobs:
run: df -h
- name: Setup Node.js/npm
if: ${{ inputs.target-platform == 'macos' }}
uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e
uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f
with:
node-version: 22.21.x
cache: yarn


@@ -79,7 +79,7 @@ jobs:
cp $(which node) /mnt/runner-externals/node24/bin/
- name: Setup Node.js/npm
if: ${{ inputs.target-platform == 'win' }}
uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e
uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f
with:
node-version: 22.21.x
- name: Add TCC permissions on macOS


@@ -37,7 +37,7 @@ jobs:
creds: ${{ secrets.ISSUE_TRIAGE_GH_APP_CREDS }}
org: electron
- name: Get project item status
uses: dsanders11/project-actions/get-item@4b06452b0128cf601dac14399aa668a8eed2d684 # v2.0.1
uses: dsanders11/project-actions/get-item@5767984408ccc6742f83acc8b8d8ea5e09f329af # v2.0.0
id: get-item
with:
token: ${{ steps.generate-token.outputs.token }}
@@ -47,7 +47,7 @@ jobs:
if: >-
(steps.get-item.outputs.field-status == '🛑 Needs Submitter Response'
|| steps.get-item.outputs.field-status == '🟡 WIP')
uses: dsanders11/project-actions/edit-item@4b06452b0128cf601dac14399aa668a8eed2d684 # v2.0.1
uses: dsanders11/project-actions/edit-item@5767984408ccc6742f83acc8b8d8ea5e09f329af # v2.0.0
with:
token: ${{ steps.generate-token.outputs.token }}
project-number: 118


@@ -42,7 +42,7 @@ jobs:
creds: ${{ secrets.RELEASE_BOARD_GH_APP_CREDS }}
org: electron
- name: Set status
uses: dsanders11/project-actions/edit-item@4b06452b0128cf601dac14399aa668a8eed2d684 # v2.0.1
uses: dsanders11/project-actions/edit-item@5767984408ccc6742f83acc8b8d8ea5e09f329af # v2.0.0
with:
token: ${{ steps.generate-token.outputs.token }}
project-number: 94


@@ -51,6 +51,6 @@ jobs:
# Upload the results to GitHub's code scanning dashboard.
- name: "Upload to code-scanning"
uses: github/codeql-action/upload-sarif@95e58e9a2cdfd71adc6e0353d5c52f41a045d225 # v3.29.5
uses: github/codeql-action/upload-sarif@c10b8064de6f491fea524254123dbe5e09572f13 # v3.29.5
with:
sarif_file: results.sarif


@@ -29,7 +29,7 @@ jobs:
PROJECT_NUMBER=$(gh project list --owner electron --format json | jq -r '.projects | map(select(.title | test("^[0-9]+-x-y$"))) | max_by(.number) | .number')
echo "PROJECT_NUMBER=$PROJECT_NUMBER" >> "$GITHUB_OUTPUT"
- name: Update Completed Stable Prep Items
uses: dsanders11/project-actions/completed-by@4b06452b0128cf601dac14399aa668a8eed2d684 # v2.0.1
uses: dsanders11/project-actions/completed-by@5767984408ccc6742f83acc8b8d8ea5e09f329af # v2.0.0
with:
field: Prep Status
field-value: ✅ Complete


@@ -6,8 +6,8 @@ on:
build-image-sha:
type: string
description: 'SHA for electron/build image'
default: ''
required: false
default: 'eac3529546ea8f3aa356d31e345715eef342233b'
required: true
upload-to-storage:
description: 'Uploads to Azure storage'
required: false
@@ -21,28 +21,13 @@ on:
permissions: {}
jobs:
setup:
if: github.repository == 'electron/electron'
runs-on: ubuntu-slim
permissions:
contents: read
outputs:
build-image-sha: ${{ steps.build-image-sha.outputs.build-image-sha }}
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd
- name: Set Build Image SHA
id: build-image-sha
uses: ./.github/actions/build-image-sha
with:
override: ${{ inputs.build-image-sha }}
checkout-windows:
needs: setup
if: github.repository == 'electron/electron'
runs-on: electron-arc-centralus-linux-amd64-32core
permissions:
contents: read
container:
image: ghcr.io/electron/build:${{ needs.setup.outputs.build-image-sha }}
image: ghcr.io/electron/build:${{ inputs.build-image-sha }}
options: --user root --device /dev/fuse --cap-add SYS_ADMIN
volumes:
- /mnt/win-cache:/mnt/win-cache
@@ -52,6 +37,8 @@ jobs:
GCLIENT_EXTRA_ARGS: '--custom-var=checkout_win=True'
TARGET_OS: 'win'
ELECTRON_DEPOT_TOOLS_WIN_TOOLCHAIN: '1'
outputs:
build-image-sha: ${{ inputs.build-image-sha }}
steps:
- name: Checkout Electron
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd
@@ -67,7 +54,7 @@ jobs:
# Build the patched siso binary in parallel with checkout-windows; the
# publish-*-win jobs consume it via SISO_PATH.
build-siso-windows:
needs: setup
if: github.repository == 'electron/electron'
uses: ./.github/workflows/pipeline-segment-build-siso-windows.yml
permissions:
contents: read
@@ -82,7 +69,7 @@ jobs:
needs: [checkout-windows, build-siso-windows]
with:
environment: production-release
build-runs-on: electron-arc-centralus-windows-amd64-32core
build-runs-on: electron-arc-centralus-windows-amd64-16core
target-platform: win
target-arch: x64
is-release: true
@@ -101,7 +88,7 @@ jobs:
needs: [checkout-windows, build-siso-windows]
with:
environment: production-release
build-runs-on: electron-arc-centralus-windows-amd64-32core
build-runs-on: electron-arc-centralus-windows-amd64-16core
target-platform: win
target-arch: arm64
is-release: true
@@ -120,7 +107,7 @@ jobs:
needs: [checkout-windows, build-siso-windows]
with:
environment: production-release
build-runs-on: electron-arc-centralus-windows-amd64-32core
build-runs-on: electron-arc-centralus-windows-amd64-16core
target-platform: win
target-arch: x86
is-release: true


@@ -1,82 +0,0 @@
# Vendored Yarn release
This directory holds the Yarn release used by this repo (`yarnPath` in
`.yarnrc.yml`). The release file is checked in so every contributor and CI job
runs the exact same Yarn, and so we can carry small local patches when needed.
`releases/yarn-4.12.0.cjs` currently carries one such patch, described below.
If you bump the Yarn version, read the **Upgrading Yarn** section first.
## Patch: use `JsZipImpl` for the node-modules link step
### What changed
Two call sites in `releases/yarn-4.12.0.cjs` are modified so the
`node-modules` linker (and the `pnpm`-loose linker) construct their read-only
`ZipOpenFS` with `customZipImplementation: ST` — Yarn's pure-JS `JsZipImpl`
instead of falling through to the default WASM-backed `LibZipImpl`:
```text
new $f({maxOpenFiles:80,readOnlyArchives:!0})
→ new $f({maxOpenFiles:80,readOnlyArchives:!0,customZipImplementation:ST})
```
A comment block at the top of the `.cjs` file marks the file as patched and
points back here.
### Why
On the `linux-arm` CI test shards we run a 32-bit `arm32v7` container. During
`yarn install`'s **Link step**, Yarn opens up to 80 cache zips concurrently.
With `LibZipImpl`, each open zip is `readFileSync`'d into a Node `Buffer`
**and copied again into the WASM linear memory**, and every file read does a
WASM `_malloc(size)` for the entry. The WASM heap has to grow as a single
contiguous region of the 32-bit address space; once enough zips are resident,
the `_malloc` for a large entry — most often `typescript/lib/typescript.js`
(~9 MB inside a ~22 MB zip) — fails.
Yarn's cross-FS `copyFilePromise` swallows the underlying error and re-throws
a generic one, so CI shows:
```text
YN0001: While persisting .../typescript-patch-...zip/node_modules/typescript/
EINVAL: invalid argument, copyfile '/node_modules/typescript/lib/typescript.js' -> '...'
```
The unmasked form (occasionally seen on `pdfjs-dist`) is the WASM-heap failure
string `Couldn't allocate enough memory`. This started failing ~1-in-3
`linux-arm / test` shards at **Install Dependencies** on 2026-04-13, after
[#50692](https://github.com/electron/electron/pull/50692) grew the cache enough
to push the 32-bit process over the edge nondeterministically — e.g.
[run 24739817558](https://github.com/electron/electron/actions/runs/24739817558/job/72380803746).
`JsZipImpl` avoids the problem entirely: it opens the zip by file descriptor,
reads only the central directory into memory, and `readSync`s individual
entries into ordinary Node `Buffer`s — **no WASM heap involved**. It is
read-only and path-based, which is exactly how the linker uses these archives.
There is no `.yarnrc.yml` setting or environment variable to select the zip
implementation (verified against the bundle), so editing the vendored release
is the only way to switch it short of re-implementing the linker in a plugin.
Upstream references:
[yarnpkg/berry#3972](https://github.com/yarnpkg/berry/issues/3972),
[yarnpkg/berry#6722](https://github.com/yarnpkg/berry/issues/6722),
[yarnpkg/berry#6550](https://github.com/yarnpkg/berry/issues/6550).
### Upgrading Yarn
When bumping `releases/yarn-*.cjs`:
1. Check whether upstream now defaults `readOnlyArchives` opens to `JsZipImpl`,
or exposes a config knob for the zip implementation. If so, drop this patch.
2. Otherwise, re-apply: search the new bundle for
`maxOpenFiles:80,readOnlyArchives:!0` (the surrounding minified identifiers
will differ) and add `,customZipImplementation:<JsZipImpl symbol>` — that
symbol is whatever the new bundle exports as `JsZipImpl` from
`@yarnpkg/libzip`.
3. Re-add the header comment pointing back to this README.
4. Verify with
`rm -rf node_modules spec/node_modules && node script/yarn.js install --immutable --mode=skip-build`
and confirm `node_modules/typescript/lib/typescript.js` is byte-identical to
an unpatched install.

File diff suppressed because one or more lines are too long


@@ -105,25 +105,21 @@ electron_mac_bundle_id = branding.mac_bundle_id
if (override_electron_version != "") {
electron_version = override_electron_version
} else {
# When building from a source code tarball there is no git tag available and
# When building from source code tarball there is no git tag available and
# builders must explicitly pass override_electron_version in gn args.
#
# Resolve the real locations of packed-refs and HEAD via git so that this
# also works when electron/ is a `git worktree` (where .git is a file, not a
# directory, and GN's read_file cannot follow the gitdir indirection).
electron_git_ref_paths =
exec_script("script/get-git-ref-paths.py", [], "list lines")
# This read_file call will assert if there is no git information, without it
# gn will generate a malformed build configuration and ninja will get into
# infinite loop.
read_file(electron_git_ref_paths[0], "string")
read_file(".git/packed-refs", "string")
# Set electron version from git tag.
electron_version = exec_script("script/get-git-version.py",
[],
"trim string",
electron_git_ref_paths)
[
".git/packed-refs",
".git/HEAD",
])
}
if (is_mas_build) {
@@ -499,10 +495,8 @@ source_set("electron_lib") {
"//components/certificate_transparency",
"//components/compose:buildflags",
"//components/embedder_support:user_agent",
"//components/heap_profiling/multi_process",
"//components/input",
"//components/language/core/browser",
"//components/memory_system",
"//components/net_log",
"//components/network_hints/browser",
"//components/network_hints/common:mojo_bindings",
@@ -637,6 +631,17 @@ source_set("electron_lib") {
]
}
if (enable_prompt_api) {
sources += [
"shell/browser/ai/proxying_ai_manager.cc",
"shell/browser/ai/proxying_ai_manager.h",
"shell/utility/ai/utility_ai_language_model.cc",
"shell/utility/ai/utility_ai_language_model.h",
"shell/utility/ai/utility_ai_manager.cc",
"shell/utility/ai/utility_ai_manager.h",
]
}
if (is_mac) {
# Disable C++ modules to resolve linking error when including MacOS SDK
# headers from third_party/electron_node/deps/uv/include/uv/darwin.h

DEPS

@@ -4,7 +4,7 @@ vars = {
'chromium_version':
'148.0.7778.0',
'node_version':
'v24.15.0',
'v24.14.1',
'nan_version':
'675cefebca42410733da8a454c8d9391fcebfbc2',
'squirrel.mac_version':


@@ -12,6 +12,7 @@ buildflag_header("buildflags") {
"ENABLE_PDF_VIEWER=$enable_pdf_viewer",
"ENABLE_ELECTRON_EXTENSIONS=$enable_electron_extensions",
"ENABLE_BUILTIN_SPELLCHECKER=$enable_builtin_spellchecker",
"ENABLE_PROMPT_API=$enable_prompt_api",
"OVERRIDE_LOCATION_PROVIDER=$enable_fake_location_provider",
]


@@ -17,6 +17,9 @@ declare_args() {
# Enable Spellchecker support
enable_builtin_spellchecker = true
# Enable Prompt API support.
enable_prompt_api = true
# The version of Electron.
# Packagers and vendor builders should set this in gn args to avoid running
# the script that reads git tag.


@@ -124,65 +124,4 @@ Returns `Promise<Object>` - Resolves with an object containing the `value` and `
Get the maximum usage across processes of trace buffer as a percentage of the
full state.
### `contentTracing.enableHeapProfiling([options])` _Experimental_
<!--
```YAML history
added:
- pr-url: https://github.com/electron/electron/pull/50826
```
-->
* `options` ([EnableHeapProfilingOptions](structures/enable-heap-profiling-options.md)) (optional)
Returns `Promise<void>` - Resolves once heap profiling has been enabled.
Enable [heap profiling](https://chromium.googlesource.com/chromium/src/+/lkgr/docs/memory-infra/heap_profiler.md)
for MemoryInfra traces. Equivalent to the `--memlog` switch in Chrome.
Only takes effect if the `disabled-by-default-memory-infra` category is included.
Needs to be called before `contentTracing.startRecording()`.
Usage:
```js
const { contentTracing } = require('electron')
async function recordTrace () {
await contentTracing.enableHeapProfiling()
await contentTracing.startRecording({
included_categories: ['disabled-by-default-memory-infra'],
excluded_categories: ['*'],
memory_dump_config: {
triggers: [
{ mode: 'detailed', periodic_interval_ms: 1000 }
]
}
})
await new Promise(resolve => setTimeout(resolve, 5000))
const filePath = await contentTracing.stopRecording()
}
```
To view the recorded heap dumps:
1. Download the breakpad symbols for your Electron version from the Electron GitHub
[releases](https://github.com/electron/electron/releases)
2. Clone the [Electron source code](../development/build-instructions-gn.md)
3. In your Chromium checkout for Electron, run this command to symbolicate the heap dump:
```bash
python3 third_party/catapult/tracing/bin/symbolize_trace --use-breakpad-symbols --breakpad-symbols-directory /path/to/breakpad_symbols /path/to/trace.json
```
4. Open the symbolicated trace in `chrome://tracing` (the Perfetto UI does not support memory dumps
yet)
5. Click on one of the `M` symbols
6. Click on a `≡` triple bar icon (e.g., in the `malloc` column)
<img src="../images/viewing-heap-dumps.png" alt="Screenshot showing how to view a heapdump in Chromium's tracing view" />
[trace viewer]: https://chromium.googlesource.com/catapult/+/HEAD/tracing/README.md

View File

@@ -0,0 +1,85 @@
## Class: LanguageModel
> Implement local AI language models
Process: [Utility](../glossary.md#utility-process)
### `new LanguageModel(initialState)`
* `initialState` Object
* `contextUsage` number
* `contextWindow` number
> [!NOTE]
> Do not use this constructor directly outside of the class itself, as it will not be properly connected to the `localAIHandler`.
### Static Methods
The `LanguageModel` class has the following static methods:
#### `LanguageModel.create(options)` _Experimental_
* `options` [LanguageModelCreateOptions](structures/language-model-create-options.md)
Returns `Promise<LanguageModel>`. Creates a new `LanguageModel` with the provided `options`.
#### `LanguageModel.availability([options])` _Experimental_
* `options` [LanguageModelCreateCoreOptions](structures/language-model-create-core-options.md) (optional)
Returns `Promise<string>`
Determines the availability of the language model and returns one of the following strings:
* `available`
* `downloadable`
* `downloading`
* `unavailable`
### Instance Properties
The following properties are available on instances of `LanguageModel`:
#### `languageModel.contextUsage` _Experimental_
A `number` representing how many tokens are currently in the context window.
#### `languageModel.contextWindow` _Experimental_
A `number` representing the size of the context window, in tokens.
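A handler implementation is responsible for keeping these values meaningful. A minimal sketch of the bookkeeping, assuming a hypothetical whitespace tokenizer (`countTokens` — real models use their own tokenizers):

```js
// Sketch: naive token accounting behind contextUsage/contextWindow.
// countTokens is a hypothetical whitespace tokenizer, not a real model tokenizer.
function countTokens (text) {
  return text.trim() === '' ? 0 : text.trim().split(/\s+/).length
}

class ContextTracker {
  constructor (contextWindow) {
    this.contextWindow = contextWindow
    this.contextUsage = 0
  }

  append (text) {
    const tokens = countTokens(text)
    if (this.contextUsage + tokens > this.contextWindow) {
      throw new Error('Context window exceeded')
    }
    this.contextUsage += tokens
  }
}
```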
### Instance Methods
The following methods are available on instances of `LanguageModel`:
#### `languageModel.prompt(input, options)` _Experimental_
* `input` [LanguageModelMessage[]](structures/language-model-message.md)
* `options` [LanguageModelPromptOptions](structures/language-model-prompt-options.md)
Returns `Promise<string> | Promise<import('stream/web').ReadableStream<string>>`. Prompts the model for a response.
#### `languageModel.append(input, options)` _Experimental_
* `input` [LanguageModelMessage[]](structures/language-model-message.md)
* `options` [LanguageModelAppendOptions](structures/language-model-append-options.md)
Returns `Promise<undefined>`. Appends a message without prompting for a response.
#### `languageModel.measureContextUsage(input, options)` _Experimental_
* `input` [LanguageModelMessage[]](structures/language-model-message.md)
* `options` [LanguageModelPromptOptions](structures/language-model-prompt-options.md)
Returns `Promise<number>`. Measures how many tokens the input would use.
#### `languageModel.clone(options)` _Experimental_
* `options` [LanguageModelCloneOptions](structures/language-model-clone-options.md)
Returns `Promise<LanguageModel>`. Clones the `LanguageModel`, preserving its
context and initial prompts.
#### `languageModel.destroy()` _Experimental_
Destroys the model and aborts any ongoing executions.

View File

@@ -0,0 +1,32 @@
# localAIHandler
> Proxy built-in AI APIs to a local LLM implementation
Process: [Utility](../glossary.md#utility-process)
This module is intended to be used from a utility process script whose `UtilityProcess` has
been registered to a session via
[`ses.registerLocalAIHandler(handler)`](./session.md#sesregisterlocalaihandlerhandler-experimental)
## Methods
The `localAIHandler` module has the following methods:
#### `localAIHandler.setPromptAPIHandler(promptAPIHandler)` _Experimental_
* `promptAPIHandler` Function\<typeof [LanguageModel](language-model.md) | null\>
* `details` Object
* `webContentsId` Integer - The [unique id](web-contents.md#contentsid-readonly) of
the [WebContents](web-contents.md) calling the Prompt API.
* `securityOrigin` string - Origin of the page calling the Prompt API.
Sets the handler for new Prompt API binding requests from the renderer process. This happens
once per pair of `webContentsId` and `securityOrigin`. Returning `null` from the handler
will reject the creation of a new Prompt API session in the renderer. If you want to
invalidate existing Prompt API sessions, clear the local AI handler for the session using
`ses.registerLocalAIHandler(null)`.
> [!NOTE]
> If a renderer calls the Prompt API before `setPromptAPIHandler()` has been called,
> the request is queued. Once the handler is set, all queued requests are flushed. If too
> many requests are queued, the oldest pending request is dropped and pending promises in
> the renderer will be rejected. To avoid this, be sure to call `setPromptAPIHandler()` as
> early as possible.
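The queueing behavior described above can be pictured as a bounded FIFO. The sketch below is illustrative only — the queue limit (`MAX_PENDING`) and internals are not part of the public API:

```js
// Sketch: pending Prompt API bind requests are queued until a handler is set.
// MAX_PENDING is a hypothetical limit, not a documented Electron constant.
const MAX_PENDING = 3

class PendingBindQueue {
  constructor () {
    this.handler = null
    this.pending = []
  }

  request (details) {
    return new Promise((resolve, reject) => {
      if (this.handler) return resolve(this.handler(details))
      if (this.pending.length >= MAX_PENDING) {
        // The oldest pending request is dropped and its promise rejected.
        const oldest = this.pending.shift()
        oldest.reject(new Error('Prompt API request dropped'))
      }
      this.pending.push({ details, resolve, reject })
    })
  }

  setHandler (handler) {
    this.handler = handler
    // Flush all queued requests once the handler is set.
    for (const { details, resolve } of this.pending.splice(0)) {
      resolve(handler(details))
    }
  }
}
```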

View File

@@ -650,7 +650,7 @@ Clears the sessions HTTP cache.
`scheme://host:port`.
* `storages` string[] (optional) - The types of storages to clear, can be
`cookies`, `filesystem`, `indexdb`, `localstorage`,
`shadercache`, `serviceworkers`, `cachestorage`. If not
`shadercache`, `websql`, `serviceworkers`, `cachestorage`. If not
specified, clear all storage types.
Returns `Promise<void>` - resolves when the storage data has been cleared.
@@ -1632,6 +1632,12 @@ This method clears more types of data and is more thorough than the
For more information, refer to Chromium's [`BrowsingDataRemover` interface][browsing-data-remover].
#### `ses.registerLocalAIHandler(handler)` _Experimental_
* `handler` [UtilityProcess](utility-process.md#class-utilityprocess) | null
Registers a local AI handler `UtilityProcess`. To clear the handler, call `registerLocalAIHandler(null)`, which will disconnect any existing Prompt API sessions and destroy any `LanguageModel` instances.
### Instance Properties
The following properties are available on instances of `Session`:

View File

@@ -1,26 +0,0 @@
# EnableHeapProfilingOptions Object
* `mode` string (optional) - Controls which processes are profiled. Equivalent to `--memlog` in
Chrome. Default is `all`.
* `all` - Profile all processes.
* `browser` - Profile only the browser process.
* `gpu` - Profile only the GPU process.
* `minimal` - Profile only the browser and GPU processes.
* `renderer-sampling` - Profile at most 1 renderer process. Each renderer process has a fixed
probability of being profiled when the renderer process is started or, for existing processes,
when heap profiling is enabled.
* `all-renderers` - Profile all renderer processes.
* `utility-sampling` - Each utility process has a fixed probability of being profiled.
* `all-utilities` - Profile all utility processes.
* `utility-and-browser` - Profile all utility processes and the browser process.
* `samplingRate` number (optional) - Controls the sampling interval in bytes. The lower the
interval, the more precise the profile is. However it comes at the cost of performance. Default
is `100000` (100KB). That is enough to observe allocation sites that make allocations >500KB
total, where total equals a single allocation size times the number of such allocations at the
same call site. Equivalent to `--memlog-sampling-rate` in Chrome. Must be an integer between
`1000` and `10000000`.
* `stackMode` string (optional) - Controls the type of metadata recorded for each allocation.
Equivalent to `--memlog-stack-mode` in Chrome. Default is `native`.
* `native` - Instruction addresses from unwinding the stack.
* `native-with-thread-names` - Instruction addresses from unwinding the stack. Includes the thread
name as the first frame.

View File

@@ -0,0 +1,3 @@
# LanguageModelAppendOptions Object
* `signal` [AbortSignal](https://nodejs.org/api/globals.html#globals_class_abortsignal)

View File

@@ -0,0 +1,3 @@
# LanguageModelCloneOptions Object
* `signal` [AbortSignal](https://nodejs.org/api/globals.html#globals_class_abortsignal)

View File

@@ -0,0 +1,4 @@
# LanguageModelCreateCoreOptions Object
* `expectedInputs` [LanguageModelExpected[]](language-model-expected.md) (optional)
* `expectedOutputs` [LanguageModelExpected[]](language-model-expected.md) (optional)

View File

@@ -0,0 +1,4 @@
# LanguageModelCreateOptions Object extends `LanguageModelCreateCoreOptions`
* `signal` [AbortSignal](https://nodejs.org/api/globals.html#globals_class_abortsignal)
* `initialPrompts` [LanguageModelMessage[]](language-model-message.md) (optional)

View File

@@ -0,0 +1,7 @@
# LanguageModelExpected Object
* `type` string - Can be one of the following values:
* `text`
* `image`
* `audio`
* `languages` string[] (optional)

View File

@@ -0,0 +1,7 @@
# LanguageModelMessageContent Object
* `type` string - Can be one of the following values:
* `text`
* `image`
* `audio`
* `value` ArrayBuffer | string

View File

@@ -0,0 +1,8 @@
# LanguageModelMessage Object
* `role` string - Can be one of the following values:
* `system`
* `user`
* `assistant`
* `content` [LanguageModelMessageContent[]](language-model-message-content.md)
* `prefix` boolean (optional)
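For example, a conversation built from these structures might look like the following (illustrative values):

```js
// Sketch: a LanguageModelMessage[] array built from the structures above.
const messages = [
  {
    role: 'system',
    content: [{ type: 'text', value: 'You are a terse assistant.' }]
  },
  {
    role: 'user',
    content: [{ type: 'text', value: 'Summarize this page in one line.' }]
  }
]
```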

View File

@@ -0,0 +1,4 @@
# LanguageModelPromptOptions Object
* `responseConstraint` Object (optional)
* `signal` [AbortSignal](https://nodejs.org/api/globals.html#globals_class_abortsignal)
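A handler implementation can honor the `signal` option with standard `AbortSignal` plumbing. A minimal sketch, where `slowModelCall` is a hypothetical stand-in for real model work:

```js
// Sketch: how a prompt() implementation might honor options.signal.
function slowModelCall (ms, result) {
  return new Promise((resolve) => setTimeout(resolve, ms, result))
}

async function promptWithAbort (input, options = {}) {
  const { signal } = options
  if (signal?.aborted) throw new Error('Prompt aborted')
  return new Promise((resolve, reject) => {
    const onAbort = () => reject(new Error('Prompt aborted'))
    signal?.addEventListener('abort', onAbort, { once: true })
    slowModelCall(50, `echo: ${input}`).then((result) => {
      signal?.removeEventListener('abort', onAbort)
      resolve(result)
    })
  })
}
```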

View File

@@ -5,7 +5,6 @@
* `rgba` - 32bpp RGBA (byte-order), 1 plane.
* `rgbaf16` - Half float RGBA, 1 plane.
* `nv12` - 12bpp with Y plane followed by a 2x2 interleaved UV plane.
* `nv16` - 16bpp with Y plane followed by a 2x1 interleaved UV plane.
* `p010le` - 4:2:0 10-bit YUV (little-endian), Y plane followed by a 2x2 interleaved UV plane.
* `colorSpace` [ColorSpace](color-space.md) (optional) - The color space of the texture.
* `codedSize` [Size](size.md) - The full dimensions of the shared texture.

View File

@@ -2275,20 +2275,6 @@ Returns `Integer` - The Chromium internal `pid` of the associated renderer. Can
be compared to the `frameProcessId` passed by frame specific navigation events
(e.g. `did-frame-navigate`)
#### `contents.clone()`
Returns `WebContents` - A cloned WebContents instance. This method creates a copy
of the WebContents with the following attributes:
* **WebPreferences** - All preferences from the original WebContents are copied
* **SiteInstance** - Uses the same SiteInstance as the original. This means the cloned WebContents will reuse the same render process as the original when loading same-origin pages, and only spawn a new render process for cross-origin navigations. This process allocation behavior is consistent with window.open and tab duplication in Chromium. For more details, see [Chromium's Site Isolation](https://www.chromium.org/developers/design-documents/site-isolation/) design document.
* **Opener relationship** - Inherits the opener (window.opener) relationship
* **Navigation state** - Copies the navigation history and controller state
The cloned WebContents is an independent instance with its own lifecycle that can be destroyed separately and will not contain any open web pages.
This API is useful for use cases where you want to create a new WebContents that shares the same render process with the original for same-origin content, while maintaining full lifecycle independence. Additionally, reusing the existing render process can help optimize memory usage and page load speed to a certain extent, as it eliminates the overhead of spawning and initializing a new render process from scratch.
#### `contents.takeHeapSnapshot(filePath)`
* `filePath` string - Path to the output file.

View File

@@ -2,53 +2,28 @@
Electron frequently releases major versions alongside every other Chromium release.
This document focuses on the release cadence and version support policy.
> [!TIP]
> See the [Electron Versioning](./electron-versioning.md) document for more details
> on how Electron is versioned.
For a more in-depth guide on our git branches and how Electron uses semantic versions,
check out our [Electron Versioning](./electron-versioning.md) doc.
## Timeline
[Electron's Release Schedule](https://releases.electronjs.org/schedule) lists a schedule of Electron major releases showing key milestones including alpha, beta, and stable release dates, as well as end-of-life dates and dependency versions.
> [!IMPORTANT]
> Electron's official support policy is the latest 3 stable releases. Our stable
> release and end-of-life dates are determined by Chromium, and may be subject to
> change. While we try to keep our planned release and end-of-life dates frequently
> updated here, future dates may change if affected by upstream scheduling changes,
> and may not always be accurately reflected.
>
> See [Chromium's public release schedule](https://chromiumdash.appspot.com/schedule) for
> definitive information about Chromium's scheduled release dates.
:::info Official support dates may change
Electron's cadence between major version releases is 8 weeks long. Before each major
version hits stable, it goes through a four-week **alpha** phase and a four-week
**beta** phase.
Electron's official support policy is the latest 3 stable releases. Our stable
release and end-of-life dates are determined by Chromium, and may be subject to
change. While we try to keep our planned release and end-of-life dates frequently
updated here, future dates may change if affected by upstream scheduling changes,
and may not always be accurately reflected.
```mermaid
gantt
title Electron release cycle
dateFormat YYYY-MM-DD
axisFormat Week %W
todayMarker off
section v41
Alpha phase :a1, 2026-01-19, 4w
M146 enters Chrome beta :milestone, bm1, after a1, 0d
Beta phase :b1, after a1, 4w
M146 enters Chrome stable :milestone, s1, after b1, 0d
Supported until v44 release :active, after b1, 12w
section v42
Alpha phase :a2, after b1, 4w
M148 enters Chrome beta :milestone, bm2, after a2, 0d
Beta phase :b2, after a2, 4w
M148 enters Chrome stable :milestone, s2, after b2, 0d
Supported until v45 release :active, after b2, 4w
```
See [Chromium's public release schedule](https://chromiumdash.appspot.com/schedule) for
definitive information about Chromium's scheduled release dates.
:::
**Notes:**
* Alphas are generally less stable than beta releases. The cutoff between the two
corresponds to when the underlying Chromium version enters Chrome's Beta channel.
* The `-alpha.1`, `-beta.1`, and `stable` dates are our solid release dates.
* We strive for weekly alpha/beta releases, but we often release more than scheduled.
* All dates are our goals but there may be reasons for adjusting the stable deadline, such as security bugs.
@@ -63,11 +38,10 @@ gantt
## Version support policy
The latest three _stable_ major versions are supported by the Electron team.
For example, if the latest release is 42.1.x, then the 41.0.x as well
as the 40.2.x series are supported. We only support the latest minor release
For example, if the latest release is 6.1.x, then the 5.0.x as well
as the 4.2.x series are supported. We only support the latest minor release
for each stable release series. This means that in the case of a security fix,
42.1.x will receive the fix, but we will not release a new version of 42.0.x.
6.1.x will receive the fix, but we will not release a new version of 6.0.x.
The latest stable release unilaterally receives all fixes from `main`,
and the version prior to that receives the vast majority of those fixes
@@ -76,8 +50,11 @@ only security fixes directly.
### Chromium version support
> [!TIP]
> Chromium's public release schedule is [here](https://chromiumdash.appspot.com/schedule).
:::info Chromium release schedule
Chromium's public release schedule is [here](https://chromiumdash.appspot.com/schedule).
:::
Electron targets Chromium even-number versions, releasing every 8 weeks in concert
with Chromium's 4-week release schedule. For example, Electron 26 uses Chromium 116, while Electron 27 uses Chromium 118.
@@ -105,7 +82,3 @@ and that number is reduced to two in major version 10, the three-argument versio
continue to work until, at minimum, major version 12. Past the minimum two-version
threshold, we will attempt to support backwards compatibility beyond two versions
until the maintainers feel the maintenance burden is too high to continue doing so.
> [!TIP]
> For a canonical list of breaking changes, see the [Breaking Changes](../breaking-changes.md)
> document.

View File

@@ -14,6 +14,18 @@ To update an existing project to use the latest stable version:
npm install --save-dev electron@latest
```
## Versioning scheme
There are several major changes from our 1.x strategy outlined below. Each change is intended to satisfy the needs and priorities of developers/maintainers and app developers.
1. Strict use of the [SemVer](#semver) spec
2. Introduction of semver-compliant `-beta` tags
3. Introduction of [conventional commit messages](https://conventionalcommits.org/)
4. Well-defined stabilization branches
5. The `main` branch is versionless; only stabilization branches contain version information
We will cover in detail how git branching works, how npm tagging works, what developers should expect to see, and how one can backport changes.
## SemVer
Below is a table explicitly mapping types of changes to their corresponding category of SemVer (e.g. Major, Minor, Patch).
@@ -22,7 +34,7 @@ Below is a table explicitly mapping types of changes to their corresponding cate
| ------------------------------- | ---------------------------------- | ----------------------------- |
| Electron breaking API changes | Electron non-breaking API changes | Electron bug fixes |
| Node.js major version updates | Node.js minor version updates | Node.js patch version updates |
| Chromium version updates | | fix-related Chromium patches |
| Chromium version updates | | fix-related chromium patches |
For more information, see the [Semantic Versioning 2.0.0](https://semver.org/) spec.
@@ -32,189 +44,68 @@ Note that most Chromium updates will be considered breaking. Fixes that can be b
Stabilization branches are branches that run parallel to `main`, taking in only cherry-picked commits that are related to security or stability. These branches are never merged back to `main`.
```mermaid
gitGraph
commit
commit
branch N-x-y
checkout main
commit id:"fix-1"
checkout N-x-y
cherry-pick id:"fix-1"
checkout main
commit id:"fix-2"
checkout N-x-y
cherry-pick id:"fix-2"
checkout main
commit
commit
![Stabilization Branches](../images/versioning-sketch-1.png)
Since Electron 8, stabilization branches are always **major** version lines, and named against the following template `$MAJOR-x-y` e.g. `8-x-y`. Prior to that we used **minor** version lines and named them as `$MAJOR-$MINOR-x` e.g. `2-0-x`.
We allow for multiple stabilization branches to exist simultaneously, one for each supported version. For more details on which versions are supported, see our [Electron Releases](./electron-timelines.md) doc.
![Multiple Stability Branches](../images/versioning-sketch-2.png)
Older lines will not be supported by the Electron project, but other groups can take ownership and backport stability and security fixes on their own. We discourage this, but recognize that it makes life easier for many app developers.
## Beta releases and bug fixes
Developers want to know which releases are _safe_ to use. Even seemingly innocent features can introduce regressions in complex applications. At the same time, locking to a fixed version is dangerous because you're ignoring security patches and bug fixes that may have come out since your version. Our goal is to allow the following standard semver ranges in `package.json`:
* Use `~2.0.0` to admit only stability or security related fixes to your `2.0.0` release.
* Use `^2.0.0` to admit non-breaking _reasonably stable_ feature work as well as security and bug fixes.
What's important about the second point is that apps using `^` should still be able to expect a reasonable level of stability. To accomplish this, SemVer allows for a _pre-release identifier_ to indicate a particular version is not yet _safe_ or _stable_.
Whatever you choose, you will periodically have to bump the version in your `package.json` as breaking changes are a fact of Chromium life.
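With illustrative version numbers, the two ranges above admit:

```plaintext
~2.0.0 matches 2.0.1 and 2.0.2, but not 2.1.0
^2.0.0 matches 2.0.1 and 2.1.0, but not 3.0.0
```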
The process is as follows:
1. All new major and minor releases lines begin with a beta series indicated by SemVer prerelease tags of `beta.N`, e.g. `2.0.0-beta.1`. After the first beta, subsequent beta releases must meet all of the following conditions:
1. The change is backwards API-compatible (deprecations are allowed)
2. The risk to meeting our stability timeline must be low.
2. If allowed changes need to be made once a release is beta, they are applied and the prerelease tag is incremented, e.g. `2.0.0-beta.2`.
3. If a particular beta release is _generally regarded_ as stable, it will be re-released as a stable build, changing only the version information. e.g. `2.0.0`. After the first stable, all changes must be backwards-compatible bug or security fixes.
4. If future bug fixes or security patches need to be made once a release is stable, they are applied and the _patch_ version is incremented, e.g. `2.0.1`.
Specifically, the above means:
1. Admitting non-breaking-API changes before Week 3 in the beta cycle is okay, even if those changes have the potential to cause moderate side-effects.
2. Admitting feature-flagged changes, that do not otherwise alter existing code paths, at most points in the beta cycle is okay. Users can explicitly enable those flags in their apps.
3. Admitting features of any sort after Week 3 in the beta cycle is 👎 without a very good reason.
For each major and minor bump, you should expect to see something like the following:
```plaintext
2.0.0-beta.1
2.0.0-beta.2
2.0.0-beta.3
2.0.0
2.0.1
2.0.2
```
Since Electron 8, stabilization branches are always **major** version lines, and named against the following template `$MAJOR-x-y` e.g. `8-x-y`. (Prior to that, we used **minor** version lines and named them as `$MAJOR-$MINOR-x` e.g. `2-0-x`.)
An example lifecycle in pictures:
We allow for multiple stabilization branches to exist simultaneously, one for each supported version.
* A new release branch is created that includes the latest set of features. It is published as `2.0.0-beta.1`.
![New Release Branch](../images/versioning-sketch-3.png)
* A bug fix comes into master that can be backported to the release branch. The patch is applied, and a new beta is published as `2.0.0-beta.2`.
![Bugfix Backport to Beta](../images/versioning-sketch-4.png)
* The beta is considered _generally stable_ and it is published again as a non-beta under `2.0.0`.
![Beta to Stable](../images/versioning-sketch-5.png)
* Later, a zero-day exploit is revealed and a fix is applied to master. We backport the fix to the `2-0-x` line and release `2.0.1`.
![Security Backports](../images/versioning-sketch-6.png)
> [!TIP]
> For more details on which versions are supported, see our [Electron Releases](./electron-timelines.md) doc.
A few examples of how various SemVer ranges will pick up new releases:
```mermaid
gitGraph
commit
branch "41-x-y"
checkout main
commit
commit
commit id:"fix-a"
checkout "41-x-y"
cherry-pick id:"fix-a"
checkout main
commit
commit id:"fix-b"
checkout "41-x-y"
cherry-pick id:"fix-b"
checkout main
commit
branch "42-x-y"
checkout main
commit
commit id:"fix-c"
checkout "41-x-y"
cherry-pick id:"fix-c"
checkout "42-x-y"
cherry-pick id:"fix-c"
checkout main
commit
commit id:"fix-d"
checkout "41-x-y"
cherry-pick id:"fix-d"
checkout "42-x-y"
cherry-pick id:"fix-d"
checkout main
commit
```
Older lines will not be supported by the Electron project.
## Release cycle
Electron follows an **8-week regular release cycle** where key milestones correspond to
matching dates in the Chromium release cycle.
```mermaid
gantt
title Electron release cycle
dateFormat YYYY-MM-DD
axisFormat Week %W
todayMarker off
section v41
Alpha phase :a1, 2026-01-19, 4w
M146 enters Chrome beta :milestone, bm1, after a1, 0d
Beta phase :b1, after a1, 4w
M146 enters Chrome stable :milestone, s1, after b1, 0d
Supported until v44 release :active, after b1, 12w
section v42
Alpha phase :a2, after b1, 4w
M148 enters Chrome beta :milestone, bm2, after a2, 0d
Beta phase :b2, after a2, 4w
M148 enters Chrome stable :milestone, s2, after b2, 0d
Supported until v45 release :active, after b2, 4w
```
### Example
When Electron 41 hits its stable release, the release line for Electron 42 is branched off of `main`.
Its first alpha release is created with all the changes contained on `main`:
```mermaid
gitGraph
commit
commit
commit
branch "42-x-y"
checkout "42-x-y"
commit tag:"v42.0.0-alpha.1"
```
A bug fix comes into `main` that can be backported to the release branch. The patch is applied,
and it is published in the next `v42.0.0-alpha.2` release.
```mermaid
gitGraph
commit
commit
commit
branch "42-x-y"
checkout "42-x-y"
commit id:"42.0.0-alpha.1" tag:"v42.0.0-alpha.1"
checkout "main"
commit
commit id:"fix-1"
checkout "42-x-y"
cherry-pick id:"fix-1" tag:"v42.0.0-alpha.2"
```
The version of Chromium that powers Electron 42 hits Chrome's beta channel. The `alpha` line is
promoted to `beta`.
```mermaid
gitGraph
commit
commit
commit
branch "42-x-y"
checkout "42-x-y"
commit id:"42.0.0-alpha.1" tag:"v42.0.0-alpha.1"
checkout "main"
commit
commit id:"fix-1"
checkout "42-x-y"
cherry-pick id:"fix-1" tag:"v42.0.0-alpha.2"
checkout "main"
commit
commit
commit id:"fix-2"
checkout "42-x-y"
cherry-pick id:"fix-2" tag:"v42.0.0-beta.1"
```
Beta releases continue weekly until Electron 42 is promoted to stable and the same cycle starts again
with `43-x-y`. Later, a zero-day exploit is revealed and a fix is applied to `main`. We backport the
fix to the `42-x-y` line and release `42.0.1`.
```mermaid
gitGraph
commit
commit
commit
branch "42-x-y"
checkout "42-x-y"
commit id:"42.0.0-alpha.1" tag:"v42.0.0-alpha.1"
checkout "main"
commit
commit id:"fix-1"
checkout "42-x-y"
cherry-pick id:"fix-1" tag:"v42.0.0-alpha.2"
checkout "main"
commit
commit
commit id:"fix-2"
checkout "42-x-y"
cherry-pick id:"fix-2" tag:"v42.0.0-beta.1"
checkout "main"
commit id:"fix-3"
checkout "42-x-y"
cherry-pick id:"fix-3" tag:"v42.0.0"
checkout "main"
branch "43-x-y"
checkout "43-x-y"
commit id:"43.0.0-alpha.1" tag:"v43.0.0-alpha.1"
checkout "main"
commit id:"security-fix"
checkout "42-x-y"
cherry-pick id:"security-fix" tag:"v42.0.1"
checkout "43-x-y"
cherry-pick id:"security-fix" tag:"v43.0.0-alpha.2"
```
![Semvers and Releases](../images/versioning-sketch-7.png)
### Backport request process
@@ -245,11 +136,10 @@ The `electron/electron` repository also enforces squash merging, so you only nee
## Versioned `main` branch
* The `main` branch always corresponds to the major version above the current pre-release line.
* Unstable nightly releases of `main` are released under the [`electron-nightly`](https://www.npmjs.com/package/electron-nightly)
package on npm.
* The `main` branch will always contain the next major version `X.0.0-nightly.DATE` in its `package.json`.
* Release branches are never merged back to `main`.
* All `package.json` values are fixed at `0.0.0-development`.
* Release branches _do_ contain the correct version in their `package.json`.
* As soon as a release branch is cut for a major, `main` must be bumped to the next major (i.e. `main` is always versioned as the next theoretical release branch).
## Historical versioning (Electron 1.X)
@@ -257,29 +147,6 @@ Electron versions _< 2.0_ did not conform to the [SemVer](https://semver.org) sp
Here is an example of the 1.x strategy:
```mermaid
---
config:
gitGraph:
mainBranchName: 'master'
---
gitGraph
commit
branch "bugfix-1"
checkout "bugfix-1"
commit
checkout master
merge "bugfix-1" tag:"1.8.1"
branch "feature"
checkout "feature"
commit
checkout master
merge "feature" tag:"1.8.2"
branch "bugfix-2"
checkout "bugfix-2"
commit
checkout master
merge "bugfix-2" tag:"1.8.3"
```
![1.x Versioning](../images/versioning-sketch-0.png)
An app developed with `1.8.1` cannot take the `1.8.3` bug fix without either absorbing the `1.8.2` feature, or by backporting the fix and maintaining a new release line.

View File

@@ -0,0 +1,188 @@
---
title: Local AI Handler
description: Handle built-in AI APIs with a local LLM implementation
slug: local-ai-handler
hide_title: true
---
# Local AI Handler
> **This API is experimental.** It may change or be removed in future Electron releases.
Electron supports Chromium's experimental [Prompt API](https://github.com/webmachinelearning/prompt-api)
(`LanguageModel`) web API by letting you route calls to a local LLM running in a
[utility process](../api/utility-process.md). Web content calls
`LanguageModel.create()` and `LanguageModel.prompt()` like it would in any
browser, while your Electron app decides which model handles the request.
## How it works
The local AI handler architecture involves three processes:
1. **Main process** — creates `UtilityProcess`, and then registers it to handle
Prompt API calls for a given session via [`ses.registerLocalAIHandler()`](../api/session.md#sesregisterlocalaihandlerhandler-experimental).
2. **Utility process** — runs a script that calls
[`localAIHandler.setPromptAPIHandler()`](../api/local-ai-handler.md#localaihandlersetpromptapihandlerpromptapihandler-experimental)
to supply a `LanguageModel` subclass.
3. **Renderer process** — web content uses the `LanguageModel` API from Chromium's experimental Prompt API
(e.g. `LanguageModel.create()`, `model.prompt()`).
When a renderer calls the Prompt API, Electron proxies the request through the
main process to the registered utility process, which invokes your
`LanguageModel` implementation and sends the result back directly to the renderer.
## Prerequisites
The Prompt API must be enabled on any `BrowserWindow` that will use it by adding
the `AIPromptAPI` Blink feature. To enable multi-modal inputs, add
`AIPromptAPIMultimodalInput` as well.
```js
const win = new BrowserWindow({
webPreferences: {
enableBlinkFeatures: 'AIPromptAPI'
}
})
```
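`enableBlinkFeatures` accepts a comma-separated list of feature strings, so both features can be enabled together:

```js
const win = new BrowserWindow({
  webPreferences: {
    enableBlinkFeatures: 'AIPromptAPI,AIPromptAPIMultimodalInput'
  }
})
```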
## Quick start
### 1. Create the utility process script
The utility process script registers your `LanguageModel` subclass. The
handler function receives a `details` object with information about the
caller, and must return a class that extends `LanguageModel`.
```js title='ai-handler.js (Utility Process)'
const { localAIHandler, LanguageModel } = require('electron/utility')
localAIHandler.setPromptAPIHandler((details) => {
// details.webContentsId — ID of the calling WebContents
// details.securityOrigin — origin of the calling page
return class MyLanguageModel extends LanguageModel {
static async create (options) {
// options.signal - AbortSignal to cancel the creation of the model
// options.initialPrompts - initial prompts to pass to the language model
return new MyLanguageModel({
contextUsage: 0,
contextWindow: 4096
})
}
static async availability () {
// Return 'available', 'downloadable', 'downloading', or 'unavailable'
return 'available'
}
async prompt (input) {
// input is a string or LanguageModelMessage[]
// Return a string response from your model, or a ReadableStream
// to return a streaming response.
return 'This is a response from your local LLM!'
}
async clone () {
return new MyLanguageModel({
contextUsage: this.contextUsage,
contextWindow: this.contextWindow
})
}
destroy () {
// Clean up model resources
}
}
})
```
### 2. Register the handler in the main process
Fork the utility process and register it as the AI handler for a session:
```js title='main.js (Main Process)'
const { app, BrowserWindow, utilityProcess } = require('electron')
const path = require('node:path')
app.whenReady().then(() => {
// Fork the utility process running your AI handler script
const aiHandler = utilityProcess.fork(path.join(__dirname, 'ai-handler.js'))
// Create a window with the Prompt API enabled
const win = new BrowserWindow({
webPreferences: {
enableBlinkFeatures: 'AIPromptAPI'
}
})
// Connect the AI handler to this session
win.webContents.session.registerLocalAIHandler(aiHandler)
win.loadFile('index.html')
})
```
### 3. Use the Prompt API in your renderer
Your web content can now use the standard `LanguageModel` API, which is
available as a global in the renderer:
```html title='index.html (Renderer Process)'
<script>
async function askAI () {
const model = await LanguageModel.create()
const response = await model.prompt('What is Electron?')
document.getElementById('response').textContent = response
}
</script>
<button onclick="askAI()">Ask AI</button>
<p id="response"></p>
```
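For incremental output, the Prompt API specification also defines `promptStreaming()`, which returns a `ReadableStream` of text chunks. A sketch, assuming your handler returns a streaming response:

```js
async function askAIStreaming () {
  const model = await LanguageModel.create()
  // promptStreaming() returns a ReadableStream of text chunks
  const stream = model.promptStreaming('What is Electron?')
  for await (const chunk of stream) {
    document.getElementById('response').textContent += chunk
  }
}
```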
## Implementing a real model
The quick-start example returns a hardcoded string. A real implementation
would integrate with a local model. See [`electron/llm`](https://github.com/electron/llm)
for an example of using `node-llama-cpp` to wire up GGUF (GPT-Generated Unified Format) models.
## Clearing the handler
To disconnect the AI handler from a session, pass `null`:
```js @ts-type={win:Electron.BrowserWindow}
win.webContents.session.registerLocalAIHandler(null)
```
After clearing, any `LanguageModel.create()` calls from renderers using that
session will fail.
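Renderer code can guard against a cleared or missing handler (a sketch):

```js
async function askAISafely () {
  try {
    const model = await LanguageModel.create()
    return await model.prompt('What is Electron?')
  } catch {
    // With no handler registered for the session, create() rejects
    return 'Local AI is not available right now.'
  }
}
```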
## Late handler registration
If a renderer uses the Prompt API after `ses.registerLocalAIHandler()` has been
called, but before `localAIHandler.setPromptAPIHandler()` has been called in
the utility process, the request is not immediately rejected. Instead, Electron
queues pending requests. Once `setPromptAPIHandler()` is called, all queued
requests are flushed and handled normally.
If too many requests arrive before the handler is set, the **oldest** pending
request is dropped and its promise rejects in the renderer. Call
`setPromptAPIHandler()` as early as possible in your utility process script
to avoid hitting this limit.
## Security considerations
The `details` object passed to your handler includes `webContentsId` and
`securityOrigin`. Use these to decide whether to handle a request, and
whether to reuse a model instance or provide a fresh one, so that
different origins remain properly isolated.
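One way to apply this is to cache one `LanguageModel` subclass per origin so state is never shared across origins. A sketch (the caching strategy and `makeModelClass` factory are assumptions, not prescribed by the API):

```js
// Map each security origin to its own LanguageModel subclass so that
// no conversation state leaks between origins.
const classesByOrigin = new Map()

function handlerForOrigin (details, makeModelClass) {
  if (!classesByOrigin.has(details.securityOrigin)) {
    // makeModelClass is a hypothetical factory returning your
    // LanguageModel subclass
    classesByOrigin.set(details.securityOrigin, makeModelClass())
  }
  return classesByOrigin.get(details.securityOrigin)
}
```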
## Further reading
- [`localAIHandler` API reference](../api/local-ai-handler.md)
- [`LanguageModel` API reference](../api/language-model.md)
- [`ses.registerLocalAIHandler()`](../api/session.md#sesregisterlocalaihandlerhandler-experimental)
- [`utilityProcess.fork()`](../api/utility-process.md#utilityprocessforkmodulepath-args-options)
- [`electron/llm`](https://github.com/electron/llm)


@@ -30,6 +30,8 @@ auto_filenames = {
"docs/api/ipc-main-service-worker.md",
"docs/api/ipc-main.md",
"docs/api/ipc-renderer.md",
"docs/api/language-model.md",
"docs/api/local-ai-handler.md",
"docs/api/menu-item.md",
"docs/api/menu.md",
"docs/api/message-channel-main.md",
@@ -91,7 +93,6 @@ auto_filenames = {
"docs/api/structures/custom-scheme.md",
"docs/api/structures/desktop-capturer-source.md",
"docs/api/structures/display.md",
"docs/api/structures/enable-heap-profiling-options.md",
"docs/api/structures/extension-info.md",
"docs/api/structures/extension.md",
"docs/api/structures/file-filter.md",
@@ -109,6 +110,14 @@ auto_filenames = {
"docs/api/structures/jump-list-item.md",
"docs/api/structures/keyboard-event.md",
"docs/api/structures/keyboard-input-event.md",
"docs/api/structures/language-model-append-options.md",
"docs/api/structures/language-model-clone-options.md",
"docs/api/structures/language-model-create-core-options.md",
"docs/api/structures/language-model-create-options.md",
"docs/api/structures/language-model-expected.md",
"docs/api/structures/language-model-message-content.md",
"docs/api/structures/language-model-message.md",
"docs/api/structures/language-model-prompt-options.md",
"docs/api/structures/media-access-permission-request.md",
"docs/api/structures/memory-info.md",
"docs/api/structures/memory-usage-details.md",
@@ -401,6 +410,8 @@ auto_filenames = {
"lib/common/init.ts",
"lib/common/webpack-globals-provider.ts",
"lib/utility/api/exports/electron.ts",
"lib/utility/api/language-model.ts",
"lib/utility/api/local-ai-handler.ts",
"lib/utility/api/module-list.ts",
"lib/utility/api/net.ts",
"lib/utility/init.ts",


@@ -759,6 +759,8 @@ filenames = {
"shell/services/node/node_service.h",
"shell/services/node/parent_port.cc",
"shell/services/node/parent_port.h",
"shell/utility/api/electron_api_local_ai_handler.cc",
"shell/utility/api/electron_api_local_ai_handler.h",
"shell/utility/electron_content_utility_client.cc",
"shell/utility/electron_content_utility_client.h",
]


@@ -53,50 +53,45 @@ export async function getSources(args: Electron.SourcesOptions) {
}
}
let resolveGetSources!: (value: ElectronInternal.GetSourcesResult[]) => void;
let rejectGetSources!: (reason?: any) => void;
const getSources = new Promise<ElectronInternal.GetSourcesResult[]>((resolve, reject) => {
resolveGetSources = resolve;
rejectGetSources = reject;
});
let capturer: ElectronInternal.DesktopCapturer | null = createDesktopCapturer();
// Register in currentlyRunning BEFORE startHandling so that synchronous
// completion (e.g. when no capturers are created) can properly clean up.
currentlyRunning.push({ options, getSources });
const stopRunning = () => {
if (capturer) {
delete capturer._onerror;
delete capturer._onfinished;
capturer = null;
let capturer: ElectronInternal.DesktopCapturer | null = createDesktopCapturer();
const stopRunning = () => {
if (capturer) {
delete capturer._onerror;
delete capturer._onfinished;
capturer = null;
if (process.platform === 'darwin') {
for (const win of BrowserWindow.getAllWindows()) {
if (resizableValues.has(win.id)) {
win.resizable = resizableValues.get(win.id);
if (process.platform === 'darwin') {
for (const win of BrowserWindow.getAllWindows()) {
if (resizableValues.has(win.id)) {
win.resizable = resizableValues.get(win.id);
}
}
}
}
}
// Remove from currentlyRunning once we resolve or reject
currentlyRunning = currentlyRunning.filter((running) => running.options !== options);
};
// Remove from currentlyRunning once we resolve or reject
currentlyRunning = currentlyRunning.filter((running) => running.options !== options);
};
capturer._onerror = (error: string) => {
stopRunning();
// eslint-disable-next-line prefer-promise-reject-errors
rejectGetSources(error);
};
capturer._onerror = (error: string) => {
stopRunning();
// eslint-disable-next-line prefer-promise-reject-errors
reject(error);
};
capturer._onfinished = (sources: Electron.DesktopCapturerSource[]) => {
stopRunning();
resolveGetSources(sources);
};
capturer._onfinished = (sources: Electron.DesktopCapturerSource[]) => {
stopRunning();
resolve(sources);
};
capturer.startHandling(captureWindow, captureScreen, thumbnailSize, fetchWindowIcons);
capturer.startHandling(captureWindow, captureScreen, thumbnailSize, fetchWindowIcons);
});
currentlyRunning.push({
options,
getSources
});
return getSources;
}


@@ -2,7 +2,7 @@ import { fetchWithSession } from '@electron/internal/browser/api/net-fetch';
import { addIpcDispatchListeners } from '@electron/internal/browser/ipc-dispatch';
import * as deprecate from '@electron/internal/common/deprecate';
import { net } from 'electron/main';
import { net, type UtilityProcess } from 'electron/main';
const { fromPartition, fromPath, Session } = process._linkedBinding('electron_browser_session');
const { isDisplayMediaSystemPickerAvailable } = process._linkedBinding('electron_browser_desktop_capturer');
@@ -116,6 +116,12 @@ Session.prototype.removeExtension = deprecate.moveAPI(
'session.extensions.removeExtension'
);
Session.prototype.registerLocalAIHandler = function (handler: UtilityProcess | null) {
// We need to unwrap the userland `ForkUtilityProcess` object and get the underlying
// `ElectronInternal.UtilityProcessWrapper` before we call the C++ function
return this._registerLocalAIHandler(handler !== null ? (handler as any)._unwrapHandle() : null);
};
export default {
fromPartition,
fromPath,


@@ -131,6 +131,10 @@ class ForkUtilityProcess extends EventEmitter implements Electron.UtilityProcess
return this.#stderr;
}
_unwrapHandle() {
return this.#handle;
}
postMessage(message: any, transfer?: MessagePortMain[]) {
if (Array.isArray(transfer)) {
transfer = transfer.map((o: any) => (o instanceof MessagePortMain ? o._internalPort : o));


@@ -229,7 +229,7 @@ function parsePageSize(pageSize: string | ElectronInternal.PageSize) {
// Translate the options of printToPDF.
const printToPDFQueues = new WeakMap<Electron.WebContents, Promise<unknown>>();
let pendingPromise: Promise<any> | undefined;
WebContents.prototype.printToPDF = async function (options) {
const margins = checkType(options.margins ?? {}, 'object', 'margins');
const pageSize = parsePageSize(options.pageSize ?? 'letter');
@@ -261,19 +261,16 @@ WebContents.prototype.printToPDF = async function (options) {
...pageSize
};
if (!this._printToPDF) {
if (this._printToPDF) {
if (pendingPromise) {
pendingPromise = pendingPromise.then(() => this._printToPDF(printSettings));
} else {
pendingPromise = this._printToPDF(printSettings);
}
return pendingPromise;
} else {
throw new Error('Printing feature is disabled');
}
const prev = printToPDFQueues.get(this) ?? Promise.resolve();
const next = prev.catch(() => {}).then(() => this._printToPDF(printSettings));
printToPDFQueues.set(this, next);
next
.finally(() => {
if (printToPDFQueues.get(this) === next) printToPDFQueues.delete(this);
})
.catch(() => {});
return next;
};
// TODO(codebytere): deduplicate argument sanitization by moving rest of


@@ -56,7 +56,7 @@ export function removeFunction<T extends Function>(fn: T, removedName: string):
const warn = warnOnce(`${fn.name} function`);
return function (this: any) {
warn();
return fn.apply(this, arguments);
fn.apply(this, arguments);
} as unknown as typeof fn;
}


@@ -0,0 +1,44 @@
interface LanguageModelConstructorValues {
contextUsage: number;
contextWindow: number;
}
export default class LanguageModel implements Electron.LanguageModel {
contextUsage: number;
contextWindow: number;
constructor(values: LanguageModelConstructorValues) {
this.contextUsage = values.contextUsage;
this.contextWindow = values.contextWindow;
}
static async create(): Promise<LanguageModel> {
return new LanguageModel({
contextUsage: 0,
contextWindow: 0
});
}
static async availability() {
return 'available';
}
async prompt() {
return '';
}
async append(): Promise<undefined> {}
async measureContextUsage() {
return 0;
}
async clone() {
return new LanguageModel({
contextUsage: this.contextUsage,
contextWindow: this.contextWindow
});
}
destroy() {}
}


@@ -0,0 +1,7 @@
import { EventEmitter } from 'events';
const binding = process._linkedBinding('electron_utility_local_ai_handler');
Object.setPrototypeOf(binding, EventEmitter.prototype);
module.exports = binding;


@@ -1,5 +1,7 @@
// Utility side modules, please sort alphabetically.
export const utilityNodeModuleList: ElectronInternal.ModuleEntry[] = [
{ name: 'localAIHandler', loader: () => require('./local-ai-handler') },
{ name: 'LanguageModel', loader: () => require('./language-model') },
{ name: 'net', loader: () => require('./net') },
{ name: 'systemPreferences', loader: () => require('@electron/internal/browser/api/system-preferences') }
];


@@ -1,6 +1,8 @@
import LanguageModel from '@electron/internal/utility/api/language-model';
import { ParentPort } from '@electron/internal/utility/parent-port';
import { EventEmitter } from 'events';
import { ReadableStream } from 'stream/web';
import { pathToFileURL } from 'url';
const v8Util = process._linkedBinding('electron_common_v8_util');
@@ -10,6 +12,15 @@ const entryScript: string = v8Util.getHiddenValue(process, '_serviceStartupScrip
// we need to restore it here.
process.argv.splice(1, 1, entryScript);
// These are used by C++ to more easily identify these objects.
v8Util.setHiddenValue(global, 'isReadableStream', (val: unknown) => val instanceof ReadableStream);
v8Util.setHiddenValue(global, 'isLanguageModel', (val: unknown) => val instanceof LanguageModel);
v8Util.setHiddenValue(
global,
'isLanguageModelClass',
(val: any) => Object.is(val, LanguageModel) || val?.prototype instanceof LanguageModel || false
);
// Import common settings.
require('@electron/internal/common/init');


@@ -6,11 +6,11 @@
"install-electron": "install.js"
},
"dependencies": {
"@electron/get": "^5.0.0",
"@electron/get": "^4.0.3",
"@types/node": "^24.9.0",
"extract-zip": "^2.0.1"
},
"engines": {
"node": ">= 22.12.0"
"node": ">= 12.20.55"
}
}


@@ -78,7 +78,7 @@
"gn-typescript-definitions": "npm run create-typescript-definitions && node script/cp.mjs electron.d.ts",
"pre-flight": "pre-flight",
"gn-check": "node ./script/gn-check.js",
"gn-format": "node ./script/lint.js --gn --fix",
"gn-format": "python3 script/run-gn-format.py",
"precommit": "lint-staged",
"preinstall": "node -e 'process.exit(0)'",
"pretest": "npm run create-typescript-definitions",
@@ -117,7 +117,7 @@
],
"*.{gn,gni}": [
"npm run gn-check",
"node ./script/lint.js --gn --fix --only --"
"npm run gn-format"
],
"*.py": [
"node script/lint.js --py --fix --only --"


@@ -147,6 +147,7 @@ fix_use_fresh_lazynow_for_onendworkitemimpl_after_didruntask.patch
fix_pulseaudio_stream_and_icon_names.patch
fix_fire_menu_popup_start_for_dynamically_created_aria_menus.patch
feat_allow_enabling_extensions_on_custom_protocols.patch
reject_prompt_api_promises_on_mojo_connection_disconnect.patch
fix_initialize_com_on_desktopmedialistcapturethread_on_windows.patch
chore_register_node_as_a_dynamic_trace_category_prefix.patch
gin_mark_argumentholder_as_cppgc_stack_allocated.patch


@@ -0,0 +1,168 @@
From 0000000000000000000000000000000000000000 Mon Sep 17 00:00:00 2001
From: David Sanders <dsanders11@ucsbalum.com>
Date: Wed, 1 Apr 2026 21:14:38 -0700
Subject: Reject Prompt API promises on Mojo connection disconnect
Without these changes to reject promises when the Mojo connection
disconnects, these promises will hang indefinitely if the Prompt
API handler is killed or unregistered.
This will be upstreamed to Chromium.
Change-Id: I89a6a076ae35cbaf12a93c517223a524bab3dff0
diff --git a/third_party/blink/renderer/modules/ai/language_model.cc b/third_party/blink/renderer/modules/ai/language_model.cc
index d60d6fd742b870360e25190db15da9e94eccc5f1..3435c3a38a5e356845d43623832038473e9db66f 100644
--- a/third_party/blink/renderer/modules/ai/language_model.cc
+++ b/third_party/blink/renderer/modules/ai/language_model.cc
@@ -193,6 +193,9 @@ class CloneLanguageModelClient
client_remote;
receiver_.Bind(client_remote.InitWithNewPipeAndPassReceiver(),
language_model->GetTaskRunner());
+ receiver_.set_disconnect_handler(
+ BindOnce(&CloneLanguageModelClient::OnConnectionError,
+ WrapWeakPersistent(this)));
language_model_->GetAILanguageModelRemote()->Fork(std::move(client_remote));
}
~CloneLanguageModelClient() override = default;
@@ -235,6 +238,11 @@ class CloneLanguageModelClient
Cleanup();
}
+ void OnConnectionError() {
+ OnError(mojom::blink::AIManagerCreateClientError::kUnableToCreateSession,
+ /*quota_error_info=*/nullptr);
+ }
+
void ResetReceiver() override { receiver_.reset(); }
private:
@@ -265,6 +273,8 @@ class AppendClient : public GarbageCollected<AppendClient>,
mojo::PendingRemote<mojom::blink::ModelStreamingResponder> client_remote;
receiver_.Bind(client_remote.InitWithNewPipeAndPassReceiver(),
language_model->GetTaskRunner());
+ receiver_.set_disconnect_handler(
+ BindOnce(&AppendClient::OnConnectionError, WrapWeakPersistent(this)));
language_model_->GetAILanguageModelRemote()->Append(
std::move(prompts), std::move(client_remote));
}
@@ -320,6 +330,11 @@ class AppendClient : public GarbageCollected<AppendClient>,
Cleanup();
}
+ void OnConnectionError() {
+ OnError(ModelStreamingResponseStatus::kErrorSessionDestroyed,
+ /*quota_error_info=*/nullptr);
+ }
+
void OnStreaming(const String& text) override {
NOTREACHED() << "Append() should not invoke `OnStreaming()`";
}
@@ -761,6 +776,7 @@ void LanguageModel::ExecuteMeasureInputUsage(
ScriptPromiseResolver<IDLDouble>* resolver,
AbortSignal* signal,
Vector<mojom::blink::AILanguageModelPromptPtr> prompts) {
+ auto reject_fn = RejectOnDestruction(resolver, signal);
language_model_remote_->MeasureInputUsage(
std::move(prompts),
BindOnce(
@@ -783,7 +799,8 @@ void LanguageModel::ExecuteMeasureInputUsage(
}
resolver->Resolve(static_cast<double>(usage.value()));
},
- WrapPersistent(resolver), WrapPersistent(signal)));
+ WrapPersistent(resolver), WrapPersistent(signal))
+ .Then(std::move(reject_fn)));
}
bool LanguageModel::ValidateInput(ScriptState* script_state,
diff --git a/third_party/blink/renderer/modules/ai/language_model_create_client.cc b/third_party/blink/renderer/modules/ai/language_model_create_client.cc
index ddc6fcda3ffbdc271bcdebfbd85aa711c063fee2..2b63e9e77dc1ed4a0ed9a687527efdc104974e7a 100644
--- a/third_party/blink/renderer/modules/ai/language_model_create_client.cc
+++ b/third_party/blink/renderer/modules/ai/language_model_create_client.cc
@@ -509,6 +509,11 @@ void LanguageModelCreateClient::OnError(
Cleanup();
}
+void LanguageModelCreateClient::OnConnectionError() {
+ OnError(mojom::blink::AIManagerCreateClientError::kUnableToCreateSession,
+ /*quota_error_info=*/nullptr);
+}
+
void LanguageModelCreateClient::ResetReceiver() {
receiver_.reset();
}
@@ -524,6 +529,8 @@ void LanguageModelCreateClient::OnInitialPromptsResolved(
mojo::PendingRemote<mojom::blink::AIManagerCreateLanguageModelClient>
client_remote;
receiver_.Bind(client_remote.InitWithNewPipeAndPassReceiver(), task_runner_);
+ receiver_.set_disconnect_handler(
+ BindOnce(&LanguageModelCreateClient::OnConnectionError, WrapWeakPersistent(this)));
HeapMojoRemote<mojom::blink::AIManager>& ai_manager_remote =
AIInterfaceProxy::GetAIManagerRemote(GetExecutionContext());
diff --git a/third_party/blink/renderer/modules/ai/language_model_create_client.h b/third_party/blink/renderer/modules/ai/language_model_create_client.h
index 9ed8dfbefeccf1627d56f5ccc315f06071a63e25..7c8e823608883171c115676db151b70eb2fd055d 100644
--- a/third_party/blink/renderer/modules/ai/language_model_create_client.h
+++ b/third_party/blink/renderer/modules/ai/language_model_create_client.h
@@ -49,6 +49,8 @@ class LanguageModelCreateClient
// Process options and create, if the availability result is valid.
void Create(mojom::blink::ModelAvailabilityCheckResult result);
+ void OnConnectionError();
+
// Continue creation after any initial prompts were processed or rejected.
void OnInitialPromptsResolved(
Vector<mojom::blink::AILanguageModelExpectedPtr> expected_inputs,
diff --git a/third_party/blink/renderer/modules/ai/model_execution_responder.cc b/third_party/blink/renderer/modules/ai/model_execution_responder.cc
index 47b65b13adfab4b8f2597a23d38a386915643d1b..fa9b54e1069019a66b8dab6eb0efe5df8b34c11a 100644
--- a/third_party/blink/renderer/modules/ai/model_execution_responder.cc
+++ b/third_party/blink/renderer/modules/ai/model_execution_responder.cc
@@ -84,7 +84,10 @@ class Responder final : public GarbageCollected<Responder>,
mojo::PendingRemote<blink::mojom::blink::ModelStreamingResponder>
BindNewPipeAndPassRemote(
scoped_refptr<base::SequencedTaskRunner> task_runner) {
- return receiver_.BindNewPipeAndPassRemote(task_runner);
+ auto pending_remote = receiver_.BindNewPipeAndPassRemote(task_runner);
+ receiver_.set_disconnect_handler(
+ BindOnce(&Responder::OnConnectionError, WrapWeakPersistent(this)));
+ return pending_remote;
}
// `mojom::blink::ModelStreamingResponder` implementation.
@@ -144,6 +147,11 @@ class Responder final : public GarbageCollected<Responder>,
Cleanup();
}
+ void OnConnectionError() {
+ OnError(ModelStreamingResponseStatus::kErrorSessionDestroyed,
+ /*quota_error_info=*/nullptr);
+ }
+
void RecordResponseStatusMetrics(
mojom::blink::ModelStreamingResponseStatus status) {
base::UmaHistogramEnumeration(
@@ -235,7 +243,10 @@ class StreamingResponder final
mojo::PendingRemote<blink::mojom::blink::ModelStreamingResponder>
BindNewPipeAndPassRemote(
scoped_refptr<base::SequencedTaskRunner> task_runner) {
- return receiver_.BindNewPipeAndPassRemote(task_runner);
+ auto pending_remote = receiver_.BindNewPipeAndPassRemote(task_runner);
+ receiver_.set_disconnect_handler(BindOnce(
+ &StreamingResponder::OnConnectionError, WrapWeakPersistent(this)));
+ return pending_remote;
}
ReadableStream* CreateReadableStream() {
@@ -337,6 +348,11 @@ class StreamingResponder final
Cleanup();
}
+ void OnConnectionError() {
+ OnError(ModelStreamingResponseStatus::kErrorSessionDestroyed,
+ /*quota_error_info=*/nullptr);
+ }
+
void RecordResponseStatusMetrics(
mojom::blink::ModelStreamingResponseStatus status) {
base::UmaHistogramEnumeration(


@@ -44,4 +44,5 @@ src_refactor_module_wrap_cc_to_update_fixedarray_get_params.patch
src_refactor_wasmstreaming_finish_to_accept_a_callback.patch
src_stop_using_v8_propertycallbackinfo_t_this.patch
build_restore_macos_deployment_target_to_12_0.patch
fix_generate_config_gypi_needs_to_generate_valid_json.patch
fix_add_externalpointertypetag_to_v8_external_api_calls.patch


@@ -6,7 +6,7 @@ Subject: Delete deprecated fields on v8::Isolate
https://chromium-review.googlesource.com/c/v8/v8/+/7081397
diff --git a/src/api/environment.cc b/src/api/environment.cc
index 6873a9f203ccace7d5c501b62bed56732332d060..7ba7943c666d9cf3de47bf9a18b8e6e0e1196886 100644
index 2111ee63a6ace438c1a143c90a807ed9fc2bcc9d..ce6426a1bf2dadb1a642874a05718724ef0f3d7c 100644
--- a/src/api/environment.cc
+++ b/src/api/environment.cc
@@ -218,8 +218,6 @@ void SetIsolateCreateParamsForNode(Isolate::CreateParams* params) {


@@ -44,7 +44,7 @@ index 37d83e41b618a07aca98118260abe9618f11256d..26d5c1bd3c8191fce1d22b969996b6bf
template <typename T>
diff --git a/src/base_object.cc b/src/base_object.cc
index 8a14a8c32626f98c2afa402786e1b6b2fbbb0988..946e60ea2d0ffe984328185a0e7762603be8e0dd 100644
index 404e2aa8c88d0cc0e6717c01e0df68899c64cc32..16462f305a2ac6b6c3d7b85024f2e52648c4300c 100644
--- a/src/base_object.cc
+++ b/src/base_object.cc
@@ -45,7 +45,8 @@ BaseObject::~BaseObject() {
@@ -72,7 +72,7 @@ index 74bbb9fb83246a90bc425e259150f0868020ac9e..a4b3a1c0907c9d50baf6c8cd473cb4c7
inline Environment* Environment::GetCurrent(
diff --git a/src/histogram.cc b/src/histogram.cc
index 2bcfcbdd9547db136c5992335a76dd3190886d6d..c34a83c9ccc83ec2973228a32cf0a51d29b4d681 100644
index 836a51b0e5aa4b1910604537c8b380038c27a7db..c4634e42fd2e5a27b0139a9b1716bc04875be469 100644
--- a/src/histogram.cc
+++ b/src/histogram.cc
@@ -136,7 +136,8 @@ HistogramBase::HistogramBase(
@@ -116,7 +116,7 @@ index 2bcfcbdd9547db136c5992335a76dd3190886d6d..c34a83c9ccc83ec2973228a32cf0a51d
std::unique_ptr<worker::TransferData>
diff --git a/src/js_udp_wrap.cc b/src/js_udp_wrap.cc
index 6b35871d92ceb1b1eead21e328088cf87a10fa6e..57c96645fa0fdb7ed77c0a36c877b2628b7f0571 100644
index 51e4f8c45ffd38fcf925ab8d283b3b88f2a35832..0c30c8b4609e4870c0ccfc5e9e465248c20763b8 100644
--- a/src/js_udp_wrap.cc
+++ b/src/js_udp_wrap.cc
@@ -55,8 +55,9 @@ JSUDPWrap::JSUDPWrap(Environment* env, Local<Object> obj)
@@ -207,10 +207,10 @@ index 0af487a9abc9ee1783367ac86b0016ab89e02006..c01cef477aaba52f9894943acede7fb2
inline Realm* Realm::GetCurrent(
diff --git a/src/node_snapshotable.cc b/src/node_snapshotable.cc
index c3e995adf437413fe956e45c2ccc704d8737de59..8561933060ac30e3559d7ce2ac633d25f1e4d6ec 100644
index e34d24d51d5c090b560d06f727043f20924e6f46..07615933c858a17515836a29f7e27ace4b81e6ff 100644
--- a/src/node_snapshotable.cc
+++ b/src/node_snapshotable.cc
@@ -1461,7 +1461,8 @@ StartupData SerializeNodeContextInternalFields(Local<Object> holder,
@@ -1400,7 +1400,8 @@ StartupData SerializeNodeContextInternalFields(Local<Object> holder,
// For the moment we do not set any internal fields in ArrayBuffer
// or ArrayBufferViews, so just return nullptr.
if (holder->IsArrayBuffer() || holder->IsArrayBufferView()) {
@@ -220,7 +220,7 @@ index c3e995adf437413fe956e45c2ccc704d8737de59..8561933060ac30e3559d7ce2ac633d25
return StartupData{nullptr, 0};
}
@@ -1481,7 +1482,8 @@ StartupData SerializeNodeContextInternalFields(Local<Object> holder,
@@ -1420,7 +1421,8 @@ StartupData SerializeNodeContextInternalFields(Local<Object> holder,
*holder);
BaseObject* object_ptr = static_cast<BaseObject*>(
@@ -296,7 +296,7 @@ index 29a4c29f3d3822394d23c453899cdd6aae280f3f..2a953d6390d5e4e251e54c1e847d4e5e
} // namespace node
diff --git a/src/udp_wrap.cc b/src/udp_wrap.cc
index 6f3e68b79f1c7089b8c17a6ed5cd33eee4078b02..e09cd7a553464060619ecf4ac51e028382e2e148 100644
index 150ef0f0400bed9df4f4b1a4c20ec4045ef7a5f6..2ca2ac177c6b5edc3b40712a40ff4a36e96904dc 100644
--- a/src/udp_wrap.cc
+++ b/src/udp_wrap.cc
@@ -126,8 +126,8 @@ void UDPWrapBase::set_listener(UDPListener* listener) {


@@ -6,7 +6,7 @@ Subject: Remove deprecated `GetIsolate`
https://chromium-review.googlesource.com/c/v8/v8/+/6905244
diff --git a/src/api/environment.cc b/src/api/environment.cc
index ec1496467f5071a810a3d7a76d80f3d12a8582dc..6873a9f203ccace7d5c501b62bed56732332d060 100644
index 8974bac7dca43294cc5cc4570f8e2e78f42aefaa..2111ee63a6ace438c1a143c90a807ed9fc2bcc9d 100644
--- a/src/api/environment.cc
+++ b/src/api/environment.cc
@@ -795,7 +795,7 @@ std::unique_ptr<MultiIsolatePlatform> MultiIsolatePlatform::Create(
@@ -85,7 +85,7 @@ index cc60ddddb037e0279615bbe24821eb20fd8da677..37d83e41b618a07aca98118260abe961
return handle;
diff --git a/src/crypto/crypto_context.cc b/src/crypto/crypto_context.cc
index cb586936a904e7b9a017732e993a35ef1115ff9a..fd2dfa9fcf444fe705a2d42cd0963531cea9a74c 100644
index 2e3f31e1765024373c3fc2acd33fc3bfb352a906..ca62d3001bf51193d78caac0cccd93c188a8410c 100644
--- a/src/crypto/crypto_context.cc
+++ b/src/crypto/crypto_context.cc
@@ -1045,7 +1045,7 @@ bool ArrayOfStringsToX509s(Local<Context> context,
@@ -98,21 +98,21 @@ index cb586936a904e7b9a017732e993a35ef1115ff9a..fd2dfa9fcf444fe705a2d42cd0963531
uint32_t array_length = cert_array->Length();
diff --git a/src/crypto/crypto_x509.cc b/src/crypto/crypto_x509.cc
index 6b7e4211a8969351168fc982fe3466a2096bed3a..76c84dd516719849a44e7d67f42ea16dd315b190 100644
index 4c5427596d1c90d3a413cdd9ff4f1151e657073d..70135a6be65e41fcb3564ddf6d1e8083a59ef8bb 100644
--- a/src/crypto/crypto_x509.cc
+++ b/src/crypto/crypto_x509.cc
@@ -109,7 +109,7 @@ MaybeLocal<Value> ToV8Value(Local<Context> context, BIOPointer&& bio) {
if (!mem) [[unlikely]]
@@ -107,7 +107,7 @@ MaybeLocal<Value> ToV8Value(Local<Context> context, BIOPointer&& bio) {
return {};
BUF_MEM* mem = bio;
Local<Value> ret;
- if (!String::NewFromUtf8(context->GetIsolate(),
+ if (!String::NewFromUtf8(Isolate::GetCurrent(),
mem->data,
NewStringType::kNormal,
mem->length)
@@ -125,7 +125,7 @@ MaybeLocal<Value> ToV8Value(Local<Context> context, const BIOPointer& bio) {
if (!mem) [[unlikely]]
@@ -121,7 +121,7 @@ MaybeLocal<Value> ToV8Value(Local<Context> context, const BIOPointer& bio) {
return {};
BUF_MEM* mem = bio;
Local<Value> ret;
- if (!String::NewFromUtf8(context->GetIsolate(),
+ if (!String::NewFromUtf8(Isolate::GetCurrent(),
@@ -133,10 +133,10 @@ index 6fe4f0492dc1f3eaf576c8ff7866080a54cb81c1..41e8e052ff81df78ece87163b0499966
// Recreate the buffer in the constructor.
InternalFieldInfo* casted_info = static_cast<InternalFieldInfo*>(info);
diff --git a/src/env.cc b/src/env.cc
index dd8be35de7c447c9a77f8b9849a6a3de19bc55af..79a309661aff7ebb18f96e64d0d2b860ab80dc57 100644
index 57a46c8be2e052b298ed841eed6f291d62711750..e4ffaa465a4ffe21334496c52334fcb1404f67a9 100644
--- a/src/env.cc
+++ b/src/env.cc
@@ -1763,10 +1763,10 @@ void AsyncHooks::Deserialize(Local<Context> context) {
@@ -1764,10 +1764,10 @@ void AsyncHooks::Deserialize(Local<Context> context) {
context->GetDataFromSnapshotOnce<Array>(
info_->js_execution_async_resources).ToLocalChecked();
} else {
@@ -149,7 +149,7 @@ index dd8be35de7c447c9a77f8b9849a6a3de19bc55af..79a309661aff7ebb18f96e64d0d2b860
// The native_execution_async_resources_ field requires v8::Local<> instances
// for async calls whose resources were on the stack as JS objects when they
@@ -1806,7 +1806,7 @@ AsyncHooks::SerializeInfo AsyncHooks::Serialize(Local<Context> context,
@@ -1807,7 +1807,7 @@ AsyncHooks::SerializeInfo AsyncHooks::Serialize(Local<Context> context,
info.async_id_fields = async_id_fields_.Serialize(context, creator);
if (!js_execution_async_resources_.IsEmpty()) {
info.js_execution_async_resources = creator->AddData(
@@ -353,10 +353,10 @@ index 52483740bb377a2bc2a16af701615d9a4e448eae..84d17a46efe146c1794a43963c41a446
CHECK(!env->temporary_required_module_facade_original.IsEmpty());
return env->temporary_required_module_facade_original.Get(isolate);
diff --git a/src/node.h b/src/node.h
index 8aac774805a002f5af9e9aca62abc56e8f986bab..da87773ba7f0d38f04a7b3851d8a1a6df0eca489 100644
index bbe35c7a8f1bc0bcddf628af42b71efaef8a7759..102bcc0b3400fd334bdf259a076a3ac3b5d4a266 100644
--- a/src/node.h
+++ b/src/node.h
@@ -1144,7 +1144,7 @@ NODE_DEPRECATED("Use v8::Date::ValueOf() directly",
@@ -1142,7 +1142,7 @@ NODE_DEPRECATED("Use v8::Date::ValueOf() directly",
#define NODE_DEFINE_CONSTANT(target, constant) \
do { \
@@ -365,7 +365,7 @@ index 8aac774805a002f5af9e9aca62abc56e8f986bab..da87773ba7f0d38f04a7b3851d8a1a6d
v8::Local<v8::Context> context = isolate->GetCurrentContext(); \
v8::Local<v8::String> constant_name = v8::String::NewFromUtf8Literal( \
isolate, #constant, v8::NewStringType::kInternalized); \
@@ -1160,7 +1160,7 @@ NODE_DEPRECATED("Use v8::Date::ValueOf() directly",
@@ -1158,7 +1158,7 @@ NODE_DEPRECATED("Use v8::Date::ValueOf() directly",
#define NODE_DEFINE_HIDDEN_CONSTANT(target, constant) \
do { \
@@ -375,10 +375,10 @@ index 8aac774805a002f5af9e9aca62abc56e8f986bab..da87773ba7f0d38f04a7b3851d8a1a6d
v8::Local<v8::String> constant_name = v8::String::NewFromUtf8Literal( \
isolate, #constant, v8::NewStringType::kInternalized); \
diff --git a/src/node_blob.cc b/src/node_blob.cc
index 40407527800075b6afec5b6c7d98de2c6229e85a..6371aad07beb8514ca2a3acfd30d7c68969e8e20 100644
index 4311d71bb0526f9a83a16525243446a590092910..417cd8cbd307b9bfc498ad2df24ed193616ac512 100644
--- a/src/node_blob.cc
+++ b/src/node_blob.cc
@@ -561,7 +561,7 @@ void BlobBindingData::Deserialize(Local<Context> context,
@@ -562,7 +562,7 @@ void BlobBindingData::Deserialize(Local<Context> context,
int index,
InternalFieldInfoBase* info) {
DCHECK_IS_SNAPSHOT_SLOT(index);
@@ -388,37 +388,37 @@ index 40407527800075b6afec5b6c7d98de2c6229e85a..6371aad07beb8514ca2a3acfd30d7c68
BlobBindingData* binding = realm->AddBindingData<BlobBindingData>(holder);
CHECK_NOT_NULL(binding);
diff --git a/src/node_builtins.cc b/src/node_builtins.cc
index 7922f2f936f64cbb7bd08f0d367f66f0b9eb083b..04d76fdd3d170a7c501bd773b698d380b8bb2426 100644
index 3377d697615ee168e49e83c4202bc227581f1aaf..1a9a57b73e635ac61016598687167a08b073f84a 100644
--- a/src/node_builtins.cc
+++ b/src/node_builtins.cc
@@ -274,7 +274,7 @@ MaybeLocal<Data> BuiltinLoader::LookupAndCompile(
Local<Context> context,
const BuiltinSource* builtin_source,
@@ -260,7 +260,7 @@ MaybeLocal<Function> BuiltinLoader::LookupAndCompileInternal(
const char* id,
LocalVector<String>* parameters,
Realm* optional_realm) {
- Isolate* isolate = context->GetIsolate();
+ Isolate* isolate = Isolate::GetCurrent();
EscapableHandleScope scope(isolate);
BuiltinCodeCacheData cached_data{};
@@ -442,7 +442,7 @@ void BuiltinLoader::SaveCodeCache(const std::string& id, Local<Data> data) {
MaybeLocal<Function> BuiltinLoader::LookupAndCompileFunction(
Local<Context> context, const char* id, Realm* optional_realm) {
Local<String> source;
@@ -382,7 +382,7 @@ void BuiltinLoader::SaveCodeCache(const char* id, Local<Function> fun) {
MaybeLocal<Function> BuiltinLoader::LookupAndCompile(Local<Context> context,
const char* id,
Realm* optional_realm) {
- Isolate* isolate = context->GetIsolate();
+ Isolate* isolate = Isolate::GetCurrent();
LocalVector<String> parameters(isolate);
Local<Data> data;
@@ -483,7 +483,7 @@ MaybeLocal<Function> BuiltinLoader::LookupAndCompileFunction(
// Detects parameters of the scripts based on module ids.
// internal/bootstrap/realm: process, getLinkedBinding,
@@ -436,7 +436,7 @@ MaybeLocal<Function> BuiltinLoader::LookupAndCompile(Local<Context> context,
MaybeLocal<Value> BuiltinLoader::CompileAndCall(Local<Context> context,
const char* id,
Realm* realm) {
- Isolate* isolate = context->GetIsolate();
+ Isolate* isolate = Isolate::GetCurrent();
const BuiltinSource* builtin_source = LoadBuiltinSource(isolate, id);
if (builtin_source == nullptr) {
THROW_ERR_MODULE_NOT_FOUND(isolate, "Cannot find module %s", id);
@@ -555,7 +555,7 @@ MaybeLocal<Value> BuiltinLoader::CompileAndCallWith(Local<Context> context,
// Detects parameters of the scripts based on module ids.
// internal/bootstrap/realm: process, getLinkedBinding,
// getInternalBinding, primordials
@@ -492,7 +492,7 @@ MaybeLocal<Value> BuiltinLoader::CompileAndCall(Local<Context> context,
if (!maybe_fn.ToLocal(&fn)) {
return MaybeLocal<Value>();
}
@@ -427,17 +427,17 @@ index 7922f2f936f64cbb7bd08f0d367f66f0b9eb083b..04d76fdd3d170a7c501bd773b698d380
return fn->Call(context, undefined, argc, argv);
}
@@ -579,14 +579,14 @@ bool BuiltinLoader::CompileAllBuiltinsAndCopyCodeCache(
@@ -530,14 +530,14 @@ bool BuiltinLoader::CompileAllBuiltinsAndCopyCodeCache(
to_eager_compile_.emplace(id);
}
- TryCatch bootstrapCatch(context->GetIsolate());
+ TryCatch bootstrapCatch(Isolate::GetCurrent());
auto data = LookupAndCompile(context, id.data(), nullptr);
auto fn = LookupAndCompile(context, id.data(), nullptr);
if (bootstrapCatch.HasCaught()) {
per_process::Debug(DebugCategory::CODE_CACHE,
"Failed to compile code cache for %s\n",
id);
id.data());
all_succeeded = false;
- PrintCaughtException(context->GetIsolate(), context, bootstrapCatch);
+ PrintCaughtException(Isolate::GetCurrent(), context, bootstrapCatch);
@@ -458,19 +458,19 @@ index fea0426496978c0003fe1481afcf93fc9c23edca..c9588880d05435ab9f4e23fcff74c933
CHECK(
diff --git a/src/node_contextify.cc b/src/node_contextify.cc
index 0f5da3e48a2049a3e61fe81da94426a3b5928c7d..1561e5096f5d8274001429000aa26731bca1b34e 100644
index 986a2d8da7fd04b5d4060d9c8d44c61a231dcce6..9f11d32c70366524cf3b7c1cfdfd24f31e438e7b 100644
--- a/src/node_contextify.cc
+++ b/src/node_contextify.cc
@@ -108,6 +108,8 @@ using v8::Value;
// For every `set` of a global property, the interceptor callback defines or
// changes the property both on the sandbox and the global proxy.
@@ -113,7 +113,7 @@ namespace {
+
+
ContextifyContext* ContextifyContext::New(Environment* env,
Local<Object> sandbox_obj,
ContextOptions* options) {
@@ -667,7 +669,7 @@ Intercepted ContextifyContext::PropertyDefinerCallback(
// Convert an int to a V8 Name (String or Symbol).
MaybeLocal<String> Uint32ToName(Local<Context> context, uint32_t index) {
- return Uint32::New(context->GetIsolate(), index)->ToString(context);
+ return Uint32::New(Isolate::GetCurrent(), index)->ToString(context);
}
} // anonymous namespace
@@ -677,7 +677,7 @@ Intercepted ContextifyContext::PropertyDefinerCallback(
}
Local<Context> context = ctx->context();
@@ -479,7 +479,7 @@ index 0f5da3e48a2049a3e61fe81da94426a3b5928c7d..1561e5096f5d8274001429000aa26731
PropertyAttribute attributes = PropertyAttribute::None;
bool is_declared =
@@ -1640,7 +1642,7 @@ static MaybeLocal<Function> CompileFunctionForCJSLoader(
@@ -1665,7 +1665,7 @@ static MaybeLocal<Function> CompileFunctionForCJSLoader(
bool* cache_rejected,
bool is_cjs_scope,
ScriptCompiler::CachedData* cached_data) {
@@ -489,7 +489,7 @@ index 0f5da3e48a2049a3e61fe81da94426a3b5928c7d..1561e5096f5d8274001429000aa26731
Local<Symbol> symbol = env->vm_dynamic_import_default_internal();
diff --git a/src/node_env_var.cc b/src/node_env_var.cc
index 5550a4bee3ce9ec8759d216335a9b2b96e20c96e..b38c2f75a92f2d6c1b9c6e7b7aca924653f7494d 100644
index 6aad252eb5681bb9ab9890812602b43c418e7a7f..5f7ef8cc58f589ba30a44abaaaaaf1514458c3f0 100644
--- a/src/node_env_var.cc
+++ b/src/node_env_var.cc
@@ -311,7 +311,7 @@ std::shared_ptr<KVStore> KVStore::CreateMapKVStore() {
@@ -502,10 +502,10 @@ index 5550a4bee3ce9ec8759d216335a9b2b96e20c96e..b38c2f75a92f2d6c1b9c6e7b7aca9246
Local<Array> keys;
if (!entries->GetOwnPropertyNames(context).ToLocal(&keys))
diff --git a/src/node_errors.cc b/src/node_errors.cc
index c6404e00d04e61b675a8c4a02139b36da25bd2a8..ea90e6501bb58260f06d6720cc2fc4989752a347 100644
index 55a0c986c5b6989ee9ce277bb6a9778abb2ad2ee..809d88f21e5572807e38132d40ee75870ab8de07 100644
--- a/src/node_errors.cc
+++ b/src/node_errors.cc
@@ -629,7 +629,7 @@ v8::ModifyCodeGenerationFromStringsResult ModifyCodeGenerationFromStrings(
@@ -631,7 +631,7 @@ v8::ModifyCodeGenerationFromStringsResult ModifyCodeGenerationFromStrings(
v8::Local<v8::Context> context,
v8::Local<v8::Value> source,
bool is_code_like) {
@@ -514,7 +514,7 @@ index c6404e00d04e61b675a8c4a02139b36da25bd2a8..ea90e6501bb58260f06d6720cc2fc498
if (context->GetNumberOfEmbedderDataFields() <=
ContextEmbedderIndex::kAllowCodeGenerationFromStrings) {
@@ -1035,7 +1035,7 @@ const char* errno_string(int errorno) {
@@ -1037,7 +1037,7 @@ const char* errno_string(int errorno) {
}
void PerIsolateMessageListener(Local<Message> message, Local<Value> error) {
@@ -523,7 +523,7 @@ index c6404e00d04e61b675a8c4a02139b36da25bd2a8..ea90e6501bb58260f06d6720cc2fc498
switch (message->ErrorLevel()) {
case Isolate::MessageErrorLevel::kMessageWarning: {
Environment* env = Environment::GetCurrent(isolate);
@@ -1195,7 +1195,7 @@ void Initialize(Local<Object> target,
@@ -1197,7 +1197,7 @@ void Initialize(Local<Object> target,
SetMethod(
context, target, "getErrorSourcePositions", GetErrorSourcePositions);
@@ -533,10 +533,10 @@ index c6404e00d04e61b675a8c4a02139b36da25bd2a8..ea90e6501bb58260f06d6720cc2fc498
READONLY_PROPERTY(target, "exitCodes", exit_codes);
diff --git a/src/node_file.cc b/src/node_file.cc
index 4c0a128628e0184af53451d37149b33342f6e647..c775015350c8269fa3fcb99b8db70762c22b9fcf 100644
index c7a9648b0f83e910190dc620f4b72577ffde6c44..46cd16b535d9bd651ef733ca52ea58db7d39b09f 100644
--- a/src/node_file.cc
+++ b/src/node_file.cc
@@ -3898,7 +3898,7 @@ void BindingData::Deserialize(Local<Context> context,
@@ -3857,7 +3857,7 @@ void BindingData::Deserialize(Local<Context> context,
int index,
InternalFieldInfoBase* info) {
DCHECK_IS_SNAPSHOT_SLOT(index);
@@ -647,10 +647,10 @@ index ba2dd7e676bfdfe7da66a4a79db3c791a505c9a8..28e6cfac682e301b605c00c4ef2eaf01
if (!error_obj->GetOwnPropertyNames(context).ToLocal(&keys)) {
return writer->json_objectend(); // the end of 'errorProperties'
diff --git a/src/node_snapshotable.cc b/src/node_snapshotable.cc
index 41b0773f4c37a016cfa55aff6bb03baf50905b32..c3e995adf437413fe956e45c2ccc704d8737de59 100644
index c2e24b4645e7903e08c80aead1c18c7bcff1bd89..e34d24d51d5c090b560d06f727043f20924e6f46 100644
--- a/src/node_snapshotable.cc
+++ b/src/node_snapshotable.cc
@@ -1675,7 +1675,7 @@ void BindingData::Deserialize(Local<Context> context,
@@ -1614,7 +1614,7 @@ void BindingData::Deserialize(Local<Context> context,
int index,
InternalFieldInfoBase* info) {
DCHECK_IS_SNAPSHOT_SLOT(index);
@@ -660,10 +660,10 @@ index 41b0773f4c37a016cfa55aff6bb03baf50905b32..c3e995adf437413fe956e45c2ccc704d
// Recreate the buffer in the constructor.
InternalFieldInfo* casted_info = static_cast<InternalFieldInfo*>(info);
diff --git a/src/node_sqlite.cc b/src/node_sqlite.cc
index d7c5bc5514044aa1ed39dd4e1c0cef346498c96f..91b80b4fb44c26e95503556064e7429b8cbf4639 100644
index 050d779bdcd2b3129abddc3fefa1e852831df236..3f4749286406e03e77de6567b667c0098fbc2a18 100644
--- a/src/node_sqlite.cc
+++ b/src/node_sqlite.cc
@@ -2436,7 +2436,7 @@ bool StatementSync::BindParams(const FunctionCallbackInfo<Value>& args) {
@@ -2162,7 +2162,7 @@ bool StatementSync::BindParams(const FunctionCallbackInfo<Value>& args) {
if (args[0]->IsObject() && !args[0]->IsArrayBufferView()) {
Local<Object> obj = args[0].As<Object>();
@@ -699,7 +699,7 @@ index 9b676a0156ab8ef47f62627be953c23d4fcbf4f4..6294cd03667980e2ad23cae9e7961262
BindingData* binding = realm->AddBindingData<BindingData>(holder);
CHECK_NOT_NULL(binding);
diff --git a/src/node_v8.cc b/src/node_v8.cc
index 4ee452d5bc6b67da52e91d98531ac35a7af155c7..b226d6fe60f4fdf5a237c336b9101c5a974ce837 100644
index 8dd32dad262679444c10878299eb6bb8fb04e120..935ea2cf5157c3a2fbdf142fc7024ec6b6d5de26 100644
--- a/src/node_v8.cc
+++ b/src/node_v8.cc
@@ -163,7 +163,7 @@ void BindingData::Deserialize(Local<Context> context,
@@ -736,10 +736,10 @@ index 370221d3cddc201180260ecb3a222bc831c91093..f5aff2f65fe6b9f48cf970ab3e7c57cf
THROW_ERR_WASI_NOT_STARTED(isolate);
return EinvalError<R>();
diff --git a/src/node_webstorage.cc b/src/node_webstorage.cc
index 013322e8fb6cb76074326c2a45a04eb0f8e133f1..0a169a8dcf27eeb5b5b0c1b00ac8b79ed43d551b 100644
index 5c7d268d38ff55ce4db07463b1ea0bcb2f4e63ea..bd83654012442195866e57173b6e5d4d25fecf0f 100644
--- a/src/node_webstorage.cc
+++ b/src/node_webstorage.cc
@@ -56,7 +56,7 @@ using v8::Value;
@@ -57,7 +57,7 @@ using v8::Value;
} while (0)
static void ThrowQuotaExceededException(Local<Context> context) {
@@ -748,17 +748,17 @@ index 013322e8fb6cb76074326c2a45a04eb0f8e133f1..0a169a8dcf27eeb5b5b0c1b00ac8b79e
auto dom_exception_str = FIXED_ONE_BYTE_STRING(isolate, "DOMException");
auto err_name = FIXED_ONE_BYTE_STRING(isolate, "QuotaExceededError");
auto err_message =
@@ -435,6 +435,8 @@ Maybe<void> Storage::Store(Local<Name> key, Local<Value> value) {
return JustVoid();
@@ -437,7 +437,7 @@ Maybe<void> Storage::Store(Local<Name> key, Local<Value> value) {
}
static MaybeLocal<String> Uint32ToName(Local<Context> context, uint32_t index) {
- return Uint32::New(context->GetIsolate(), index)->ToString(context);
+ return Uint32::New(Isolate::GetCurrent(), index)->ToString(context);
}
+
+
static void Clear(const FunctionCallbackInfo<Value>& info) {
Storage* storage;
ASSIGN_OR_RETURN_UNWRAP(&storage, info.This());
diff --git a/src/node_worker.cc b/src/node_worker.cc
index a2631a96371becb0f4ea4f47a52313f4f02477da..4866c7ff589825d41fe84786ed8f9b3fccd3d1b7 100644
index 1acc61af0c995ddefbc00fe232b2454de77a84a3..3041746fc8a132f68cc1d801bb1700634699828d 100644
--- a/src/node_worker.cc
+++ b/src/node_worker.cc
@@ -1465,8 +1465,6 @@ void GetEnvMessagePort(const FunctionCallbackInfo<Value>& args) {
@@ -784,10 +784,10 @@ index da4206187f7c7d2becb8a101c1ff5346a10e13f4..03f0910926f3d403121e227cee32a546
// Recreate the buffer in the constructor.
BindingData* binding = realm->AddBindingData<BindingData>(holder);
diff --git a/src/util-inl.h b/src/util-inl.h
index f42c7b1250d1eb2e4d9f8e10c5ea9a9ca310924b..d59e30a635b08b97d255ed2e5540a66db54b068f 100644
index aae5956742f195279ab6af04029d76dee6af2e84..6898e8ea794675e903e13e2b45524d572a3f68bb 100644
--- a/src/util-inl.h
+++ b/src/util-inl.h
@@ -337,14 +337,14 @@ v8::Maybe<void> FromV8Array(v8::Local<v8::Context> context,
@@ -336,14 +336,14 @@ v8::Maybe<void> FromV8Array(v8::Local<v8::Context> context,
std::vector<v8::Global<v8::Value>>* out) {
uint32_t count = js_array->Length();
out->reserve(count);
@@ -804,7 +804,7 @@ index f42c7b1250d1eb2e4d9f8e10c5ea9a9ca310924b..d59e30a635b08b97d255ed2e5540a66d
if (str.size() >= static_cast<size_t>(v8::String::kMaxLength)) [[unlikely]] {
// V8 only has a TODO comment about adding an exception when the maximum
// string size is exceeded.
@@ -360,7 +360,7 @@ v8::MaybeLocal<v8::Value> ToV8Value(v8::Local<v8::Context> context,
@@ -359,7 +359,7 @@ v8::MaybeLocal<v8::Value> ToV8Value(v8::Local<v8::Context> context,
v8::MaybeLocal<v8::Value> ToV8Value(v8::Local<v8::Context> context,
std::u16string_view str,
v8::Isolate* isolate) {
@@ -813,7 +813,7 @@ index f42c7b1250d1eb2e4d9f8e10c5ea9a9ca310924b..d59e30a635b08b97d255ed2e5540a66d
if (str.length() >= static_cast<size_t>(v8::String::kMaxLength))
[[unlikely]] {
// V8 only has a TODO comment about adding an exception when the maximum
@@ -380,7 +380,7 @@ v8::MaybeLocal<v8::Value> ToV8Value(v8::Local<v8::Context> context,
@@ -379,7 +379,7 @@ v8::MaybeLocal<v8::Value> ToV8Value(v8::Local<v8::Context> context,
v8::MaybeLocal<v8::Value> ToV8Value(v8::Local<v8::Context> context,
v8_inspector::StringView str,
v8::Isolate* isolate) {
@@ -822,7 +822,7 @@ index f42c7b1250d1eb2e4d9f8e10c5ea9a9ca310924b..d59e30a635b08b97d255ed2e5540a66d
if (str.length() >= static_cast<size_t>(v8::String::kMaxLength))
[[unlikely]] {
// V8 only has a TODO comment about adding an exception when the maximum
@@ -407,7 +407,7 @@ template <typename T>
@@ -406,7 +406,7 @@ template <typename T>
v8::MaybeLocal<v8::Value> ToV8Value(v8::Local<v8::Context> context,
const std::vector<T>& vec,
v8::Isolate* isolate) {
@@ -831,7 +831,7 @@ index f42c7b1250d1eb2e4d9f8e10c5ea9a9ca310924b..d59e30a635b08b97d255ed2e5540a66d
v8::EscapableHandleScope handle_scope(isolate);
MaybeStackBuffer<v8::Local<v8::Value>, 128> arr(vec.size());
@@ -424,7 +424,7 @@ template <typename T>
@@ -423,7 +423,7 @@ template <typename T>
v8::MaybeLocal<v8::Value> ToV8Value(v8::Local<v8::Context> context,
const std::set<T>& set,
v8::Isolate* isolate) {
@@ -840,7 +840,7 @@ index f42c7b1250d1eb2e4d9f8e10c5ea9a9ca310924b..d59e30a635b08b97d255ed2e5540a66d
v8::Local<v8::Set> set_js = v8::Set::New(isolate);
v8::HandleScope handle_scope(isolate);
@@ -443,7 +443,7 @@ template <typename T, std::size_t U>
@@ -442,7 +442,7 @@ template <typename T, std::size_t U>
v8::MaybeLocal<v8::Value> ToV8Value(v8::Local<v8::Context> context,
const std::ranges::elements_view<T, U>& vec,
v8::Isolate* isolate) {
@@ -849,7 +849,7 @@ index f42c7b1250d1eb2e4d9f8e10c5ea9a9ca310924b..d59e30a635b08b97d255ed2e5540a66d
v8::EscapableHandleScope handle_scope(isolate);
MaybeStackBuffer<v8::Local<v8::Value>, 128> arr(vec.size());
@@ -462,7 +462,7 @@ template <typename T, typename U>
@@ -461,7 +461,7 @@ template <typename T, typename U>
v8::MaybeLocal<v8::Value> ToV8Value(v8::Local<v8::Context> context,
const std::unordered_map<T, U>& map,
v8::Isolate* isolate) {
@@ -858,7 +858,7 @@ index f42c7b1250d1eb2e4d9f8e10c5ea9a9ca310924b..d59e30a635b08b97d255ed2e5540a66d
v8::EscapableHandleScope handle_scope(isolate);
v8::Local<v8::Map> ret = v8::Map::New(isolate);
@@ -505,7 +505,7 @@ template <typename T, typename>
@@ -504,7 +504,7 @@ template <typename T, typename>
v8::MaybeLocal<v8::Value> ToV8Value(v8::Local<v8::Context> context,
const T& number,
v8::Isolate* isolate) {
@@ -867,7 +867,7 @@ index f42c7b1250d1eb2e4d9f8e10c5ea9a9ca310924b..d59e30a635b08b97d255ed2e5540a66d
return ConvertNumberToV8Value(isolate, number);
}
@@ -518,7 +518,7 @@ v8::Local<v8::Array> ToV8ValuePrimitiveArray(v8::Local<v8::Context> context,
@@ -517,7 +517,7 @@ v8::Local<v8::Array> ToV8ValuePrimitiveArray(v8::Local<v8::Context> context,
std::is_floating_point_v<T>,
"Only primitive types (bool, integral, floating-point) are supported.");
@@ -876,7 +876,7 @@ index f42c7b1250d1eb2e4d9f8e10c5ea9a9ca310924b..d59e30a635b08b97d255ed2e5540a66d
v8::EscapableHandleScope handle_scope(isolate);
v8::LocalVector<v8::Value> elements(isolate);
@@ -811,7 +811,7 @@ inline v8::MaybeLocal<v8::Object> NewDictionaryInstanceNullProto(
@@ -803,7 +803,7 @@ inline v8::MaybeLocal<v8::Object> NewDictionaryInstanceNullProto(
if (value.IsEmpty()) return v8::MaybeLocal<v8::Object>();
}
v8::Local<v8::Object> obj = tmpl->NewInstance(context, property_values);
@@ -935,7 +935,7 @@ index 660cfff6b8a0c583be843e555e7a06cd09e0d279..c4b39450c5b7f91c46f7027db367c30d
context, that, OneByteString(isolate, name), tmpl, flag);
}
diff --git a/src/util.h b/src/util.h
index 51f0b6463ab6bc33aa4e66bd55e0ab3822840ab0..ee05fa017f48c5b03e7179d6fef39b6e32e488a5 100644
index 81d08c27fb7037d16e12843dc03c3d8f9caee723..52e6a149d6760640d93c56ce91a759ae9207a8c7 100644
--- a/src/util.h
+++ b/src/util.h
@@ -753,7 +753,7 @@ inline v8::MaybeLocal<v8::Value> ToV8Value(

@@ -10,15 +10,11 @@ however those files were cherry-picked from main branch and do not
really in 20/21. We have to wait until 22 is released to be able to
build with upstream GN files.
src/inspector/unofficial.gni - The node_protocol_generated_sources action
was missing gypi_values.node_pdl_files from its inputs, causing Ninja to skip
regeneration when PDL domain files changed. Should be upstreamed.
diff --git a/configure.py b/configure.py
index fa25de8c316b71d3ad5b55b5ce398b69a5d4a965..fc48438060e0dd84edc60d1aebf3d0946be98ea9 100755
index 98a8b147e4cbfd5957c35688f2b37ae0ca52a818..fd13970ae73bbe5db186f81faed792a5597bbcd0 100755
--- a/configure.py
+++ b/configure.py
@@ -1838,7 +1838,7 @@ def configure_v8(o, configs):
@@ -1821,7 +1821,7 @@ def configure_v8(o, configs):
# Until we manage to get rid of all those, v8_enable_sandbox cannot be used.
# Note that enabling pointer compression without enabling sandbox is unsupported by V8,
# so this can be broken at any time.
@@ -27,8 +23,54 @@ index fa25de8c316b71d3ad5b55b5ce398b69a5d4a965..fc48438060e0dd84edc60d1aebf3d094
# We set v8_enable_pointer_compression_shared_cage to 0 always, even when
# pointer compression is enabled so that we don't accidentally enable shared
# cage mode when pointer compression is on.
diff --git a/deps/merve/BUILD.gn b/deps/merve/BUILD.gn
new file mode 100644
index 0000000000000000000000000000000000000000..7bb318f8835dba6f4a6f211d8534bb6923958747
--- /dev/null
+++ b/deps/merve/BUILD.gn
@@ -0,0 +1,14 @@
+##############################################################################
+# #
+# DO NOT EDIT THIS FILE! #
+# #
+##############################################################################
+
+# This file is used by GN for building, which is NOT the build system used for
+# building official binaries.
+# Please modify the gyp files if you are making changes to build system.
+
+import("unofficial.gni")
+
+merve_gn_build("merve") {
+}
diff --git a/deps/merve/unofficial.gni b/deps/merve/unofficial.gni
new file mode 100644
index 0000000000000000000000000000000000000000..dfb508d1d22f84accb146620ed07d89715b367e6
--- /dev/null
+++ b/deps/merve/unofficial.gni
@@ -0,0 +1,20 @@
+# This file is used by GN for building, which is NOT the build system used for
+# building official binaries.
+# Please edit the gyp files if you are making changes to build system.
+
+# The actual configurations are put inside a template in unofficial.gni to
+# prevent accidental edits from contributors.
+template("merve_gn_build") {
+ config("merve_config") {
+ include_dirs = [ "." ]
+ }
+ gypi_values = exec_script("../../tools/gypi_to_gn.py",
+ [ rebase_path("merve.gyp") ],
+ "scope",
+ [ "merve.gyp" ])
+ source_set(target_name) {
+ forward_variables_from(invoker, "*")
+ public_configs = [ ":merve_config" ]
+ sources = gypi_values.merve_sources
+ }
+}
diff --git a/node.gni b/node.gni
index 41f200189a34e150e4c8f25da2a72c2108259720..156fee33b3813fe4d94a1c9585f217a99dbfbd5f 100644
index d4438f7fd61598afac2c1e3184721a759d22b10c..156fee33b3813fe4d94a1c9585f217a99dbfbd5f 100644
--- a/node.gni
+++ b/node.gni
@@ -5,10 +5,10 @@
@@ -44,7 +86,16 @@ index 41f200189a34e150e4c8f25da2a72c2108259720..156fee33b3813fe4d94a1c9585f217a9
# The location of OpenSSL - use the one from node's deps by default.
node_openssl_path = "$node_path/deps/openssl"
@@ -48,7 +48,7 @@ declare_args() {
@@ -26,8 +26,6 @@ declare_args() {
# TODO(zcbenz): This is currently copied from configure.py, we should share
# the list between configure.py and GN configurations.
node_builtin_shareable_builtins = [
- "deps/cjs-module-lexer/lexer.js",
- "deps/cjs-module-lexer/dist/lexer.js",
"deps/undici/undici.js",
"deps/amaro/dist/index.js",
]
@@ -50,7 +48,7 @@ declare_args() {
node_openssl_system_ca_path = ""
# Initialize v8 platform during node.js startup.
@@ -53,7 +104,7 @@ index 41f200189a34e150e4c8f25da2a72c2108259720..156fee33b3813fe4d94a1c9585f217a9
# Custom build tag.
node_tag = ""
@@ -68,10 +68,16 @@ declare_args() {
@@ -70,10 +68,16 @@ declare_args() {
# TODO(zcbenz): There are few broken things for now:
# 1. cross-os compilation is not supported.
# 2. node_mksnapshot crashes when cross-compiling for x64 from arm64.
@@ -71,65 +122,12 @@ index 41f200189a34e150e4c8f25da2a72c2108259720..156fee33b3813fe4d94a1c9585f217a9
}
assert(!node_enable_inspector || node_use_openssl,
diff --git a/src/inspector/unofficial.gni b/src/inspector/unofficial.gni
index 4810d93eb971b253f7dadff7011a632f6dbe6a2b..48cf5102630737ffeba98719a8b508b52fe5e27a 100644
--- a/src/inspector/unofficial.gni
+++ b/src/inspector/unofficial.gni
@@ -29,7 +29,7 @@ template("inspector_gn_build") {
deps = [ ":node_protocol_json" ]
outputs = gypi_values.node_inspector_generated_sources
- inputs = gypi_values.node_protocol_files + [
+ inputs = gypi_values.node_protocol_files + gypi_values.node_pdl_files + [
"node_protocol_config.json",
"$node_gen_dir/src/node_protocol.json",
]
diff --git a/src/node_builtins.cc b/src/node_builtins.cc
index 6506dcea3f4f88a7781975fae1ee5f8b87d4dfb2..a077ad673fdf7eab61878940e5fef43921c2e453 100644
index f25ca01d6ef016489371a3a1c9d8500da65e8023..2c816bef8d64f3e0ba2993c4885641620ee64272 100644
--- a/src/node_builtins.cc
+++ b/src/node_builtins.cc
@@ -455,6 +455,30 @@ MaybeLocal<Function> BuiltinLoader::LookupAndCompileFunction(
return value.As<Function>();
}
+MaybeLocal<Function> BuiltinLoader::LookupAndCompileFunction(
+ Local<Context> context,
+ const char* id,
+ LocalVector<String>* parameters,
+ Realm* optional_realm) {
+ Isolate* isolate = Isolate::GetCurrent();
+ const BuiltinSource* builtin_source = LoadBuiltinSource(isolate, id);
+ if (builtin_source == nullptr) {
+ THROW_ERR_MODULE_NOT_FOUND(isolate, "Cannot find module %s", id);
+ return MaybeLocal<Function>();
+ }
+ std::string filename_s = std::string("node:") + builtin_source->id;
+ Local<String> filename = OneByteString(isolate, filename_s);
+ Local<String> source = builtin_source->source.ToStringChecked(isolate);
+ ScriptOrigin origin(filename, 0, 0, true);
+ ScriptCompiler::Source script_source(source, origin);
+ return ScriptCompiler::CompileFunction(context,
+ &script_source,
+ parameters->size(),
+ parameters->data(),
+ 0,
+ nullptr);
+}
+
MaybeLocal<Value> BuiltinLoader::CompileAndCall(Local<Context> context,
const char* id,
Realm* realm) {
@@ -741,7 +765,7 @@ MaybeLocal<Module> BuiltinLoader::LoadBuiltinSourceTextModule(Realm* realm,
// Pre-fetch all dependencies.
if (requests->Length() > 0) {
for (int i = 0; i < requests->Length(); i++) {
- Local<ModuleRequest> req = requests->Get(context, i).As<ModuleRequest>();
+ Local<ModuleRequest> req = requests->Get(i).As<ModuleRequest>();
std::string specifier =
Utf8Value(isolate, req->GetSpecifier()).ToString();
std::string resolved_id = ResolveRequestForBuiltin(specifier);
@@ -900,6 +924,7 @@ void BuiltinLoader::RegisterExternalReferences(
registry->Register(ImportBuiltinSourceTextModule);
@@ -760,6 +760,7 @@ void BuiltinLoader::RegisterExternalReferences(
registry->Register(GetNatives);
RegisterExternalReferencesForInternalizedBuiltinCode(registry);
+ EmbedderRegisterExternalReferencesForInternalizedBuiltinCode(registry);
@@ -137,10 +135,10 @@ index 6506dcea3f4f88a7781975fae1ee5f8b87d4dfb2..a077ad673fdf7eab61878940e5fef439
} // namespace builtins
diff --git a/src/node_builtins.h b/src/node_builtins.h
index e4af1f42f4442b4c1ec94cf25d8d811f0e82d89e..490f429986e43653e0dd2048d9e3bd2e99ae44b2 100644
index 7a7b84337feb67960819472e43192dbdc151e299..bcdd50f635757f41287c87df1db9cd3b55c4b6b9 100644
--- a/src/node_builtins.h
+++ b/src/node_builtins.h
@@ -82,6 +82,8 @@ using BuiltinCodeCacheMap =
@@ -75,6 +75,8 @@ using BuiltinCodeCacheMap =
// Generated by tools/js2c.cc as node_javascript.cc
void RegisterExternalReferencesForInternalizedBuiltinCode(
ExternalReferenceRegistry* registry);
@@ -149,27 +147,13 @@ index e4af1f42f4442b4c1ec94cf25d8d811f0e82d89e..490f429986e43653e0dd2048d9e3bd2e
// Handles compilation and caching of built-in JavaScript modules and
// bootstrap scripts, whose source are bundled into the binary as static data.
@@ -104,6 +106,13 @@ class NODE_EXTERN_PRIVATE BuiltinLoader {
v8::MaybeLocal<v8::Function> LookupAndCompileFunction(
v8::Local<v8::Context> context, const char* id, Realm* optional_realm);
+ // Overload that accepts custom parameters for embedder scripts.
+ v8::MaybeLocal<v8::Function> LookupAndCompileFunction(
+ v8::Local<v8::Context> context,
+ const char* id,
+ v8::LocalVector<v8::String>* parameters,
+ Realm* optional_realm);
+
v8::MaybeLocal<v8::Value> CompileAndCallWith(v8::Local<v8::Context> context,
const char* id,
int argc,
diff --git a/tools/js2c.cc b/tools/js2c.cc
old mode 100644
new mode 100755
index 2cb09f8e1d7ba6ba389f70cdfc6300458f469caa..1a7b6ec6e6c51cf947694fac5dfd860b345d478f
index 9c2f70de4e00834ff448e573743898072dc14c5d..71a12c606f4da7165cc41a295a278b2e504af1b6
--- a/tools/js2c.cc
+++ b/tools/js2c.cc
@@ -29,6 +29,7 @@ namespace js2c {
@@ -28,6 +28,7 @@ namespace js2c {
int Main(int argc, char* argv[]);
static bool is_verbose = false;
@@ -177,7 +161,7 @@ index 2cb09f8e1d7ba6ba389f70cdfc6300458f469caa..1a7b6ec6e6c51cf947694fac5dfd860b
void Debug(const char* format, ...) {
va_list arguments;
@@ -176,6 +177,7 @@ const char* kTemplate = R"(
@@ -175,6 +176,7 @@ const char* kTemplate = R"(
#include "node_builtins.h"
#include "node_external_reference.h"
#include "node_internals.h"
@@ -185,7 +169,7 @@ index 2cb09f8e1d7ba6ba389f70cdfc6300458f469caa..1a7b6ec6e6c51cf947694fac5dfd860b
namespace node {
@@ -191,7 +193,11 @@ const ThreadsafeCopyOnWrite<BuiltinSourceMap> global_source_map {
@@ -190,7 +192,11 @@ const ThreadsafeCopyOnWrite<BuiltinSourceMap> global_source_map {
} // anonymous namespace
void BuiltinLoader::LoadJavaScriptSource() {
@@ -198,7 +182,7 @@ index 2cb09f8e1d7ba6ba389f70cdfc6300458f469caa..1a7b6ec6e6c51cf947694fac5dfd860b
}
void RegisterExternalReferencesForInternalizedBuiltinCode(
@@ -208,6 +214,45 @@ UnionBytes BuiltinLoader::GetConfig() {
@@ -207,6 +213,45 @@ UnionBytes BuiltinLoader::GetConfig() {
} // namespace node
)";
@@ -244,7 +228,7 @@ index 2cb09f8e1d7ba6ba389f70cdfc6300458f469caa..1a7b6ec6e6c51cf947694fac5dfd860b
Fragment Format(const Fragments& definitions,
const Fragments& initializers,
const Fragments& registrations) {
@@ -217,13 +262,12 @@ Fragment Format(const Fragments& definitions,
@@ -216,13 +261,12 @@ Fragment Format(const Fragments& definitions,
size_t init_size = init_buf.size();
std::vector<char> reg_buf = Join(registrations, "\n");
size_t reg_size = reg_buf.size();
@@ -261,7 +245,7 @@ index 2cb09f8e1d7ba6ba389f70cdfc6300458f469caa..1a7b6ec6e6c51cf947694fac5dfd860b
static_cast<int>(def_buf.size()),
def_buf.data(),
static_cast<int>(init_buf.size()),
@@ -848,12 +892,15 @@ int JS2C(const FileList& js_files,
@@ -836,12 +880,15 @@ int JS2C(const FileList& js_files,
}
}
@@ -277,7 +261,7 @@ index 2cb09f8e1d7ba6ba389f70cdfc6300458f469caa..1a7b6ec6e6c51cf947694fac5dfd860b
Fragment out = Format(definitions, initializers, registrations);
return WriteIfChanged(out, dest);
}
@@ -879,6 +926,8 @@ int Main(int argc, char* argv[]) {
@@ -867,6 +914,8 @@ int Main(int argc, char* argv[]) {
std::string arg(argv[i]);
if (arg == "--verbose") {
is_verbose = true;
@@ -286,7 +270,7 @@ index 2cb09f8e1d7ba6ba389f70cdfc6300458f469caa..1a7b6ec6e6c51cf947694fac5dfd860b
} else if (arg == "--root") {
if (i == argc - 1) {
fprintf(stderr, "--root must be followed by a path\n");
@@ -927,6 +976,14 @@ int Main(int argc, char* argv[]) {
@@ -915,6 +964,14 @@ int Main(int argc, char* argv[]) {
}
}
@@ -301,7 +285,7 @@ index 2cb09f8e1d7ba6ba389f70cdfc6300458f469caa..1a7b6ec6e6c51cf947694fac5dfd860b
// Should have exactly 3 types: `.js`, `.mjs` and `.gypi`.
assert(file_map.size() == 3);
auto gypi_it = file_map.find(".gypi");
@@ -953,6 +1010,7 @@ int Main(int argc, char* argv[]) {
@@ -941,6 +998,7 @@ int Main(int argc, char* argv[]) {
std::sort(mjs_it->second.begin(), mjs_it->second.end());
return JS2C(js_it->second, mjs_it->second, gypi_it->second[0], output);
@@ -322,10 +306,10 @@ index 856878c33681a73d41016729dabe48b0a6a80589..91a11852d206b65485fe90fd037a0bd1
if sys.platform == 'win32':
files = [ x.replace('\\', '/') for x in files ]
diff --git a/unofficial.gni b/unofficial.gni
index aa78f9ce60c0439536eaf6e23880e30ebef0e1a9..df0ae804a5338d8f2ec4d331a1e2ed053c3c3955 100644
index c742b62c484e9dd205eff63dcffad78c76828375..bff7b0650cfe8578a044e45d0f9e352859909695 100644
--- a/unofficial.gni
+++ b/unofficial.gni
@@ -147,32 +147,42 @@ template("node_gn_build") {
@@ -147,31 +147,42 @@ template("node_gn_build") {
public_configs = [
":node_external_config",
"deps/googletest:googletest_config",
@@ -344,7 +328,7 @@ index aa78f9ce60c0439536eaf6e23880e30ebef0e1a9..df0ae804a5338d8f2ec4d331a1e2ed05
"deps/cares",
"deps/histogram",
"deps/llhttp",
"deps/merve",
+ "deps/merve",
"deps/nbytes",
"deps/nghttp2",
- "deps/ngtcp2",
@@ -371,7 +355,7 @@ index aa78f9ce60c0439536eaf6e23880e30ebef0e1a9..df0ae804a5338d8f2ec4d331a1e2ed05
"$target_gen_dir/node_javascript.cc",
] + gypi_values.node_sources
@@ -195,7 +205,7 @@ template("node_gn_build") {
@@ -194,7 +205,7 @@ template("node_gn_build") {
}
if (node_use_openssl) {
deps += [ "deps/ncrypto" ]
@@ -380,7 +364,7 @@ index aa78f9ce60c0439536eaf6e23880e30ebef0e1a9..df0ae804a5338d8f2ec4d331a1e2ed05
sources += gypi_values.node_crypto_sources
}
if (node_use_sqlite) {
@@ -224,6 +234,10 @@ template("node_gn_build") {
@@ -223,6 +234,10 @@ template("node_gn_build") {
}
}
@@ -391,7 +375,7 @@ index aa78f9ce60c0439536eaf6e23880e30ebef0e1a9..df0ae804a5338d8f2ec4d331a1e2ed05
executable(target_name) {
forward_variables_from(invoker, "*")
@@ -315,6 +329,7 @@ template("node_gn_build") {
@@ -314,6 +329,7 @@ template("node_gn_build") {
}
executable("node_js2c") {
@@ -399,9 +383,9 @@ index aa78f9ce60c0439536eaf6e23880e30ebef0e1a9..df0ae804a5338d8f2ec4d331a1e2ed05
deps = [
"deps/uv",
"$node_simdutf_path",
@@ -327,26 +342,75 @@ template("node_gn_build") {
"src/builtin_info.cc",
"src/builtin_info.h",
@@ -324,26 +340,75 @@ template("node_gn_build") {
"src/embedded_data.cc",
"src/embedded_data.h",
]
- include_dirs = [ "src" ]
+ include_dirs = [ "src", "tools" ]
@@ -485,7 +469,7 @@ index aa78f9ce60c0439536eaf6e23880e30ebef0e1a9..df0ae804a5338d8f2ec4d331a1e2ed05
outputs = [ "$target_gen_dir/node_javascript.cc" ]
# Get the path to node_js2c executable of the host toolchain.
-@@ -360,11 +424,11 @@ template("node_gn_build") {
+@@ -357,11 +422,11 @@ template("node_gn_build") {
get_label_info(":node_js2c($host_toolchain)", "name") +
host_executable_suffix


@@ -14,7 +14,7 @@ We don't need to do this for zlib, as the existing gn workflow uses the same
Upstreamed at https://github.com/nodejs/node/pull/55903
diff --git a/unofficial.gni b/unofficial.gni
-index eeffbafa4023ed68f56410c858e459f569ed6344..d7c717d5c2799692c7ea4537d7faea234faefaa2 100644
+index a773152813376bef1fa227c331241a1d944c9317..43f09d1e68c88d3ba3b862a1a74769f73c370894 100644
--- a/unofficial.gni
+++ b/unofficial.gni
@@ -203,7 +203,17 @@ template("node_gn_build") {


@@ -168,10 +168,10 @@ index 06a4f8a239855571dcc67cd81e7da7a255a9ebfd..1fa9e314ad796cdf74f718f0eb2a1553
}
diff --git a/node.gyp b/node.gyp
-index 0620850e0872cfea9a1241b2a56f7bede7fa291a..934bee079ed785e4e72f33f71f445aae4ca6c0a6 100644
+index f5cd416b5fe7a51084bc4af9a4427a8e62599fd8..b7072ce74354495bec49357f962f4ef2999bf727 100644
--- a/node.gyp
+++ b/node.gyp
-@@ -184,9 +184,9 @@
+@@ -182,9 +182,9 @@
'src/timers.cc',
'src/timer_wrap.cc',
'src/tracing/agent.cc',
@@ -182,7 +182,7 @@ index 0620850e0872cfea9a1241b2a56f7bede7fa291a..934bee079ed785e4e72f33f71f445aae
'src/tracing/traced_value.cc',
'src/tty_wrap.cc',
'src/udp_wrap.cc',
-@@ -318,9 +318,9 @@
+@@ -314,9 +314,9 @@
'src/tcp_wrap.h',
'src/timers.h',
'src/tracing/agent.h',
@@ -194,10 +194,10 @@ index 0620850e0872cfea9a1241b2a56f7bede7fa291a..934bee079ed785e4e72f33f71f445aae
'src/tracing/traced_value.h',
'src/timer_wrap.h',
diff --git a/src/async_wrap.cc b/src/async_wrap.cc
-index 4ebb63b2f591dde25ead2ddd869e1740adc243db..f9123d5a629a43783b9a3ffe14b57f6d6cc9ac5d 100644
+index 301f77c419f178c4eea258e0896327f69389dda7..d5068a18392a6128ceee7f0146f8f9c77f9924bb 100644
--- a/src/async_wrap.cc
+++ b/src/async_wrap.cc
-@@ -113,8 +113,7 @@ void AsyncWrap::EmitPromiseResolve(Environment* env, double async_id) {
+@@ -110,8 +110,7 @@ void AsyncWrap::EmitPromiseResolve(Environment* env, double async_id) {
}
void AsyncWrap::EmitTraceAsyncStart() const {
@@ -208,10 +208,10 @@ index 4ebb63b2f591dde25ead2ddd869e1740adc243db..f9123d5a629a43783b9a3ffe14b57f6d
get_trigger_async_id());
TRACE_EVENT_NESTABLE_ASYNC_BEGIN1(TRACING_CATEGORY_NODE1(async_hooks),
diff --git a/src/env.cc b/src/env.cc
-index bbb30dd8a50f7b8550caf1967de8547cc7d8af47..ef6e14245fa668d04d0e9b1f71fcce48ec267f4e 100644
+index fdabe48dd7776c59298f7d972286d0d2ed062752..c185d822b29c0b691bbf5f724f71f59638c6184d 100644
--- a/src/env.cc
+++ b/src/env.cc
-@@ -649,8 +649,8 @@ void TrackingTraceStateObserver::UpdateTraceCategoryState() {
+@@ -650,8 +650,8 @@ void TrackingTraceStateObserver::UpdateTraceCategoryState() {
return;
}
@@ -222,7 +222,7 @@ index bbb30dd8a50f7b8550caf1967de8547cc7d8af47..ef6e14245fa668d04d0e9b1f71fcce48
Isolate* isolate = env_->isolate();
HandleScope handle_scope(isolate);
-@@ -892,8 +892,7 @@ Environment::Environment(IsolateData* isolate_data,
+@@ -893,8 +893,7 @@ Environment::Environment(IsolateData* isolate_data,
time_origin_timestamp_,
MAYBE_FIELD_PTR(env_info, performance_state));
@@ -285,10 +285,10 @@ index 0bc086ccd1ff449c0f3fb08a972a0c45d3178f1c..ca74e83ef6f7b0e8b8496457af3813f0
bool use_wasm_trap_handler =
!per_process::cli_options->disable_wasm_trap_handler;
diff --git a/src/node_contextify.cc b/src/node_contextify.cc
-index d3568da72a0f99419b4029b93d9eb1e9328d6bd5..0f5da3e48a2049a3e61fe81da94426a3b5928c7d 100644
+index 3c234205e89be7e976dae5c3fcc73ca67953e034..986a2d8da7fd04b5d4060d9c8d44c61a231dcce6 100644
--- a/src/node_contextify.cc
+++ b/src/node_contextify.cc
-@@ -1001,8 +1001,7 @@ void ContextifyScript::New(const FunctionCallbackInfo<Value>& args) {
+@@ -1026,8 +1026,7 @@ void ContextifyScript::New(const FunctionCallbackInfo<Value>& args) {
ContextifyScript* contextify_script = New(env, args.This());
@@ -331,7 +331,7 @@ index c9173d404c79a69743fc75ddb6bba0ac9579c1ef..8ffac047a69b3900f37d712334c504a1
#define FS_DIR_ASYNC_TRACE_BEGIN0(fs_type, id) \
TRACE_EVENT_NESTABLE_ASYNC_BEGIN0(TRACING_CATEGORY_NODE2(fs_dir, async), \
diff --git a/src/node_file.cc b/src/node_file.cc
-index bf202f5e2bf5eaf2dd9192dfd701e621126c492c..4c0a128628e0184af53451d37149b33342f6e647 100644
+index 96aac2d86695732bf6805f2ad2168a62241b5045..c7a9648b0f83e910190dc620f4b72577ffde6c44 100644
--- a/src/node_file.cc
+++ b/src/node_file.cc
@@ -147,16 +147,23 @@ static const char* get_fs_func_name_by_type(uv_fs_type req_type) {
@@ -799,7 +799,7 @@ index 0bc9df81d87562243817a6618641a49b602654e3..b6dd8b9a9c21051f3d385d5ecea9c50c
namespace tracing {
diff --git a/unofficial.gni b/unofficial.gni
-index df0ae804a5338d8f2ec4d331a1e2ed053c3c3955..eeffbafa4023ed68f56410c858e459f569ed6344 100644
+index bff7b0650cfe8578a044e45d0f9e352859909695..a773152813376bef1fa227c331241a1d944c9317 100644
--- a/unofficial.gni
+++ b/unofficial.gni
@@ -143,7 +143,10 @@ template("node_gn_build") {


@@ -7,7 +7,7 @@ Subject: build: ensure native module compilation fails if not using a new
This should not be upstreamed, it is a quality-of-life patch for downstream module builders.
diff --git a/common.gypi b/common.gypi
-index d398e33d5acc9acd096e2c263a6f8ad9d947d1b0..f760bb6e6a498c3f9786b46b60fbbf38521ab60a 100644
+index d9eb9527e3cbb3b101274ab19e6d6ace42f0e022..a1243ad39b8fcf564285ace0b51b1482bd85071b 100644
--- a/common.gypi
+++ b/common.gypi
@@ -89,6 +89,8 @@
@@ -42,19 +42,19 @@ index d398e33d5acc9acd096e2c263a6f8ad9d947d1b0..f760bb6e6a498c3f9786b46b60fbbf38
# list in v8/BUILD.gn.
['v8_enable_v8_checks == 1', {
diff --git a/configure.py b/configure.py
-index fc48438060e0dd84edc60d1aebf3d0946be98ea9..4e30f58c3f33ed400301ed08a365a738b49f530f 100755
+index fd13970ae73bbe5db186f81faed792a5597bbcd0..162e3b09c92b49cd39d32a87ff97a54555d3e47b 100755
--- a/configure.py
+++ b/configure.py
-@@ -1810,6 +1810,7 @@ def configure_library(lib, output, pkgname=None):
+@@ -1802,6 +1802,7 @@ def configure_library(lib, output, pkgname=None):
def configure_v8(o, configs):
set_configuration_variable(configs, 'v8_enable_v8_checks', release=0, debug=1)
set_configuration_variable(configs, 'v8_enable_v8_checks', release=1, debug=0)
+ o['variables']['using_electron_config_gypi'] = 1
o['variables']['v8_enable_webassembly'] = 0 if options.v8_lite_mode else 1
o['variables']['v8_enable_javascript_promise_hooks'] = 1
o['variables']['v8_enable_lite_mode'] = 1 if options.v8_lite_mode else 0
diff --git a/src/node.h b/src/node.h
-index 2087509f7961dcacf02c634f2c4940e45f374072..0225ff4f43e8b82c08e8ec5492df73223a82066c 100644
+index ebfd7229b5f0044b628fbe0b03ac211f0c6ed9a6..b92a9d42da8419741c435643b7401efcb21a9e8b 100644
--- a/src/node.h
+++ b/src/node.h
@@ -22,6 +22,12 @@


@@ -34,10 +34,10 @@ index a493c9579669072d97c7caa9049e846bda36f8b9..334ffaa6f2d955125ca8b427ace1442c
let kResistStopPropagation;
diff --git a/src/node_builtins.cc b/src/node_builtins.cc
-index a077ad673fdf7eab61878940e5fef43921c2e453..7922f2f936f64cbb7bd08f0d367f66f0b9eb083b 100644
+index 2c816bef8d64f3e0ba2993c4885641620ee64272..3377d697615ee168e49e83c4202bc227581f1aaf 100644
--- a/src/node_builtins.cc
+++ b/src/node_builtins.cc
-@@ -46,6 +46,7 @@ using v8::Value;
+@@ -39,6 +39,7 @@ using v8::Value;
BuiltinLoader::BuiltinLoader()
: config_(GetConfig()), code_cache_(std::make_shared<BuiltinCodeCache>()) {
LoadJavaScriptSource();
@@ -46,10 +46,10 @@ index a077ad673fdf7eab61878940e5fef43921c2e453..7922f2f936f64cbb7bd08f0d367f66f0
AddExternalizedBuiltin("internal/deps/undici/undici",
STRINGIFY(NODE_SHARED_BUILTIN_UNDICI_UNDICI_PATH));
diff --git a/src/node_builtins.h b/src/node_builtins.h
-index 490f429986e43653e0dd2048d9e3bd2e99ae44b2..05b1c5bbc38f851b11383b7e3e48c1c92f47aba1 100644
+index bcdd50f635757f41287c87df1db9cd3b55c4b6b9..e908f3c0e314b90ff7b6c599940ea8f4e657c709 100644
--- a/src/node_builtins.h
+++ b/src/node_builtins.h
-@@ -150,6 +150,7 @@ class NODE_EXTERN_PRIVATE BuiltinLoader {
+@@ -141,6 +141,7 @@ class NODE_EXTERN_PRIVATE BuiltinLoader {
// Generated by tools/js2c.cc as node_javascript.cc
void LoadJavaScriptSource(); // Loads data into source_


@@ -11,7 +11,7 @@ node-gyp will use the result of `process.config` that reflects the environment
in which the binary got built.
diff --git a/common.gypi b/common.gypi
-index f760bb6e6a498c3f9786b46b60fbbf38521ab60a..d9ba0816ae424e5eb77fa742f8fd3d2d1c9845fc 100644
+index a1243ad39b8fcf564285ace0b51b1482bd85071b..60ac7a50718fd8239fd96b811cdccd1c73b2d606 100644
--- a/common.gypi
+++ b/common.gypi
@@ -128,6 +128,7 @@


@@ -10,7 +10,7 @@ M151, and so we should allow for building until then.
This patch can be removed at the M151 branch point.
diff --git a/common.gypi b/common.gypi
-index d9ba0816ae424e5eb77fa742f8fd3d2d1c9845fc..537332b67dc7a45ef2b9ca4e439081a9f39a5f69 100644
+index 60ac7a50718fd8239fd96b811cdccd1c73b2d606..709eb83801eeed81f79c4305a86d1a19710298c2 100644
--- a/common.gypi
+++ b/common.gypi
@@ -677,7 +677,7 @@


@@ -8,10 +8,10 @@ they use themselves as the entry point. We should try to upstream some form
of this.
diff --git a/lib/internal/process/pre_execution.js b/lib/internal/process/pre_execution.js
-index e9ab2b2a8958e99a6cd9d4280641c05eedec18aa..4a42dd66f451eece98fded909a72dd5ffcfd9708 100644
+index 8ed8802adcda308166d700e463c8d6cbcb26d94a..9a99ff6d44320c0e28f4a787d24ea98ae1c96196 100644
--- a/lib/internal/process/pre_execution.js
+++ b/lib/internal/process/pre_execution.js
-@@ -281,12 +281,14 @@ function patchProcessObject(expandArgv1) {
+@@ -276,12 +276,14 @@ function patchProcessObject(expandArgv1) {
// the entry point.
if (expandArgv1 && process.argv[1] && process.argv[1][0] !== '-') {
// Expand process.argv[1] into a full path.


@@ -14,10 +14,10 @@ and
This patch can be removed once this is fixed upstream in simdjson.
diff --git a/deps/simdjson/simdjson.h b/deps/simdjson/simdjson.h
-index 3a413a7d1e046e93babfdda9982bd602f35ba3fc..76a6d4a30efcf0f5c9d14a314f1be670e8184524 100644
+index 1d6560e80fab0458b22f0ac2437056bce4873e8f..c3dbe2b6fc08c36a07ced5e29a814f7bcd85b748 100644
--- a/deps/simdjson/simdjson.h
+++ b/deps/simdjson/simdjson.h
-@@ -4408,12 +4408,17 @@ private:
+@@ -4215,12 +4215,17 @@ inline std::ostream& operator<<(std::ostream& out, simdjson_result<padded_string
} // namespace simdjson
@@ -35,7 +35,7 @@ index 3a413a7d1e046e93babfdda9982bd602f35ba3fc..76a6d4a30efcf0f5c9d14a314f1be670
namespace simdjson {
namespace internal {
-@@ -5105,6 +5110,9 @@ simdjson_inline bool padded_memory_map::is_valid() const noexcept {
+@@ -4729,6 +4734,9 @@ inline simdjson_result<padded_string> padded_string::load(std::wstring_view file
} // namespace simdjson
@@ -45,16 +45,16 @@ index 3a413a7d1e046e93babfdda9982bd602f35ba3fc..76a6d4a30efcf0f5c9d14a314f1be670
inline simdjson::padded_string operator ""_padded(const char *str, size_t len) {
return simdjson::padded_string(str, len);
}
-@@ -5113,6 +5121,8 @@ inline simdjson::padded_string operator ""_padded(const char8_t *str, size_t len
+@@ -4737,6 +4745,8 @@ inline simdjson::padded_string operator ""_padded(const char8_t *str, size_t len
return simdjson::padded_string(reinterpret_cast<const char *>(str), len);
}
#endif
+#pragma clang diagnostic pop
+
#endif // SIMDJSON_PADDED_STRING_INL_H
/* end file simdjson/padded_string-inl.h */
@@ -72412,12 +72422,16 @@ simdjson_inline simdjson_warn_unused std::unique_ptr<ondemand::parser>& parser::
/* skipped duplicate #include "simdjson/padded_string_view.h" */
@@ -44745,12 +44755,16 @@ simdjson_inline simdjson_warn_unused std::unique_ptr<ondemand::parser>& parser::
return parser_instance;
}
@@ -71,7 +71,7 @@ index 3a413a7d1e046e93babfdda9982bd602f35ba3fc..76a6d4a30efcf0f5c9d14a314f1be670
} // namespace ondemand
} // namespace arm64
-@@ -85564,12 +85578,16 @@ simdjson_inline simdjson_warn_unused std::unique_ptr<ondemand::parser>& parser::
+@@ -59221,12 +59235,16 @@ simdjson_inline simdjson_warn_unused std::unique_ptr<ondemand::parser>& parser::
return parser_instance;
}


@@ -20,7 +20,7 @@ index ab7dc27de3e304f6d912d5834da47e3b4eb25495..b6c0fd4ceee989dac55c7d54e52fef18
}
}
diff --git a/unofficial.gni b/unofficial.gni
-index d7c717d5c2799692c7ea4537d7faea234faefaa2..bf44e39c0debe9d3dce4dcb490b6ce3bc16dca2a 100644
+index 43f09d1e68c88d3ba3b862a1a74769f73c370894..cedd2b0a0941fe66bdae479c4fc768ce3d7bc6ac 100644
--- a/unofficial.gni
+++ b/unofficial.gni
@@ -146,6 +146,7 @@ template("node_gn_build") {
@@ -31,8 +31,8 @@ index d7c717d5c2799692c7ea4537d7faea234faefaa2..bf44e39c0debe9d3dce4dcb490b6ce3b
]
public_configs = [
":node_external_config",
@@ -370,6 +371,7 @@ template("node_gn_build") {
"src/builtin_info.h",
@@ -368,6 +369,7 @@ template("node_gn_build") {
"src/embedded_data.h",
]
include_dirs = [ "src", "tools" ]
+ configs += [ "//build/config/compiler:no_exit_time_destructors" ]


@@ -11,7 +11,7 @@ its own blended handler between Node and Blink.
Not upstreamable.
diff --git a/lib/internal/modules/esm/utils.js b/lib/internal/modules/esm/utils.js
-index 5019477c55e5ff1121a2b51168f12e008d54d59e..8eb390fbe16401a7abf836030233832925ab5c9e 100644
+index 0af25ebbf6c3f2b790238e32f01addfb648e4e52..bd726088f7480853b8507c39668cc4716c4ce61f 100644
--- a/lib/internal/modules/esm/utils.js
+++ b/lib/internal/modules/esm/utils.js
@@ -35,7 +35,7 @@ const {


@@ -9,10 +9,10 @@ modules to sandboxed renderers.
TODO(codebytere): remove and replace with a public facing API.
diff --git a/src/node_binding.cc b/src/node_binding.cc
-index b76ecc8cab47dfb96adce17294eb0191a60a2efd..0ddc080f6f1e1b3d0aaa0e55c9aa5ddb7409b11b 100644
+index 740706e917b7d28c520abdbd743605bf73274f30..9ab30b3c9bc663d2947fcbfaac6f06d2c8f8a5b1 100644
--- a/src/node_binding.cc
+++ b/src/node_binding.cc
-@@ -657,6 +657,10 @@ void GetInternalBinding(const FunctionCallbackInfo<Value>& args) {
+@@ -656,6 +656,10 @@ void GetInternalBinding(const FunctionCallbackInfo<Value>& args) {
args.GetReturnValue().Set(exports);
}
@@ -24,10 +24,10 @@ index b76ecc8cab47dfb96adce17294eb0191a60a2efd..0ddc080f6f1e1b3d0aaa0e55c9aa5ddb
Environment* env = Environment::GetCurrent(args);
diff --git a/src/node_binding.h b/src/node_binding.h
-index bb6547e5dac4086e29fd588d46e1b69d4e5dbc15..58f68485c298dce116a6a8ee6960c85edcb65e93 100644
+index a55a9c6a5787983c0477cb268ef1355162e72911..3455eb3d223a49cd73d80c72c209c26d49b769dc 100644
--- a/src/node_binding.h
+++ b/src/node_binding.h
-@@ -155,6 +155,8 @@ void GetInternalBinding(const v8::FunctionCallbackInfo<v8::Value>& args);
+@@ -154,6 +154,8 @@ void GetInternalBinding(const v8::FunctionCallbackInfo<v8::Value>& args);
void GetLinkedBinding(const v8::FunctionCallbackInfo<v8::Value>& args);
void DLOpen(const v8::FunctionCallbackInfo<v8::Value>& args);


@@ -7,7 +7,7 @@ common.gypi is a file that's included in the node header bundle, despite
the fact that we do not build node with gyp.
diff --git a/common.gypi b/common.gypi
-index 576d5057d988fca43a604a6db6a0b5723e960e2e..d398e33d5acc9acd096e2c263a6f8ad9d947d1b0 100644
+index 283c60eab356a5befc15027cd186ea0416914ee6..d9eb9527e3cbb3b101274ab19e6d6ace42f0e022 100644
--- a/common.gypi
+++ b/common.gypi
@@ -91,6 +91,23 @@


@@ -8,7 +8,7 @@ an ExternalPointerTypeTag parameter. Use kExternalPointerTypeTagDefault
for all existing call sites.
diff --git a/src/crypto/crypto_context.cc b/src/crypto/crypto_context.cc
-index ff7a5917b554fc0c67edf4e36f567e7223e70577..23aaa364e955753a92c3cb575646955c40cdfd55 100644
+index d6af2460c3745901415d4e785cf210da8a730a8d..4b5b892b81727c8f93e3041d33902c31d3776a52 100644
--- a/src/crypto/crypto_context.cc
+++ b/src/crypto/crypto_context.cc
@@ -2336,7 +2336,7 @@ int SecureContext::TicketCompatibilityCallback(SSL* ssl,
@@ -100,7 +100,7 @@ index d067b47e7e30a95740fe0275c70445707dec426b..391c57eed9058602bd8311d885cf5fc6
env->compile_cache_handler()->MaybeSave(cache_entry, utf8.ToStringView());
}
diff --git a/src/node_util.cc b/src/node_util.cc
-index 065ed602b314f367c2e7dec94019521fd5d23bf4..8be8a8b5726a265a838841a21bb023fa41ceeb13 100644
+index e9f4c1cdb60c03dce210f49e18dda57a4934a8b5..263edfd92e38c66f7912c602b306d420b503a839 100644
--- a/src/node_util.cc
+++ b/src/node_util.cc
@@ -93,7 +93,7 @@ static void GetExternalValue(


@@ -9,10 +9,10 @@ conflict with Blink's in renderer and worker processes.
We should try to upstream some version of this.
diff --git a/doc/api/cli.md b/doc/api/cli.md
-index ba9fcc3a3abf48f7ab4416a7ec13e689fd5802c7..4311a6e35ab85876b24c03036e5d91d5adb5c369 100644
+index f05686608297e538f0a6f65abb389281bced4291..c8da076f80a559b9ee6d2ffed831b088c15c8e88 100644
--- a/doc/api/cli.md
+++ b/doc/api/cli.md
-@@ -1802,6 +1802,14 @@ changes:
+@@ -1820,6 +1820,14 @@ changes:
Disable using [syntax detection][] to determine module type.
@@ -27,7 +27,7 @@ index ba9fcc3a3abf48f7ab4416a7ec13e689fd5802c7..4311a6e35ab85876b24c03036e5d91d5
### `--no-experimental-global-navigator`
<!-- YAML
-@@ -3516,6 +3524,7 @@ one is included in the list below.
+@@ -3499,6 +3507,7 @@ one is included in the list below.
* `--no-addons`
* `--no-async-context-frame`
* `--no-deprecation`
@@ -36,7 +36,7 @@ index ba9fcc3a3abf48f7ab4416a7ec13e689fd5802c7..4311a6e35ab85876b24c03036e5d91d5
* `--no-experimental-repl-await`
* `--no-experimental-sqlite`
diff --git a/doc/node.1 b/doc/node.1
-index bed0774b43a21a75eac7466905d7c06d6e9f4f8a..ebc364bbd79972f012a7489853530adc87b580a6 100644
+index 9a0f5beb5b995fb92b31514c166e1c76e18d8ca9..fab0b24b630e755658b58a7281df1dacc4b7f32a 100644
--- a/doc/node.1
+++ b/doc/node.1
@@ -201,6 +201,9 @@ Enable transformation of TypeScript-only syntax into JavaScript code.
@@ -50,7 +50,7 @@ index bed0774b43a21a75eac7466905d7c06d6e9f4f8a..ebc364bbd79972f012a7489853530adc
Disable experimental support for the WebSocket API.
.
diff --git a/lib/internal/process/pre_execution.js b/lib/internal/process/pre_execution.js
-index 4a42dd66f451eece98fded909a72dd5ffcfd9708..a693e35135fc8f7e918e1410a104d4607ece5489 100644
+index 9a99ff6d44320c0e28f4a787d24ea98ae1c96196..8f5810267d1ba430bae02be141f087f2a5d3cf9f 100644
--- a/lib/internal/process/pre_execution.js
+++ b/lib/internal/process/pre_execution.js
@@ -117,6 +117,7 @@ function prepareExecution(options) {
@@ -61,7 +61,7 @@ index 4a42dd66f451eece98fded909a72dd5ffcfd9708..a693e35135fc8f7e918e1410a104d460
setupWebsocket();
setupEventsource();
setupCodeCoverage();
@@ -350,6 +351,16 @@ function setupWarningHandler() {
@@ -345,6 +346,16 @@ function setupWarningHandler() {
}
}
@@ -79,10 +79,10 @@ index 4a42dd66f451eece98fded909a72dd5ffcfd9708..a693e35135fc8f7e918e1410a104d460
function setupWebsocket() {
if (getOptionValue('--no-experimental-websocket')) {
diff --git a/src/node_options.cc b/src/node_options.cc
-index 79d7c8cac002ba85b95c1d58a8e7cbf84fc35e89..cf66678164f4fca371f853c8afe3c0b7e7e3bc82 100644
+index f6f81f50c8bd91a72ca96093dc64c183bd58039b..aa18dab6e4171d8a7f0af4b7db1b8c2c07191859 100644
--- a/src/node_options.cc
+++ b/src/node_options.cc
-@@ -551,7 +551,11 @@ EnvironmentOptionsParser::EnvironmentOptionsParser() {
+@@ -545,7 +545,11 @@ EnvironmentOptionsParser::EnvironmentOptionsParser() {
&EnvironmentOptions::experimental_eventsource,
kAllowedInEnvvar,
false);

Some files were not shown because too many files have changed in this diff.