Compare commits


72 Commits

Author SHA1 Message Date
trop[bot]
1079f3bbfa fix: prevent GBytes leak in GdkPixbufFromSkBitmap on Linux/GTK (#49897)
Inside gtk_util::GdkPixbufFromSkBitmap, g_bytes_new() was called
inline as an argument to gdk_pixbuf_new_from_bytes(), which per
GTK docs does not take ownership of the GBytes - it adds its own
internal reference. The caller's GBytes* was never stored or
unreffed, leaking 4 x width x height bytes of pixel data on every
call.

Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: redeemer <marcin.probola@gmail.com>
2026-02-21 17:11:52 +01:00
trop[bot]
d91adea56f build: use spawn instead of spawnSync for build (#49828)
Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: John Kleinschmidt <kleinschmidtorama@gmail.com>
2026-02-17 15:58:03 -05:00
trop[bot]
b2b584a320 chore: add Copilot CLI instructions (#49823)
chore: add copilot-instructions

Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: John Kleinschmidt <kleinschmidtorama@gmail.com>
2026-02-17 13:25:08 -05:00
trop[bot]
1778a26c46 build: generate artifact attestations for released assets (#49781)
* build: generate artifact attestations for released assets (#48239)

* build: generate artifact attestations for released assets

* chore: address review feedback

---------

Co-authored-by: John Kleinschmidt <kleinschmidtorama@gmail.com>
(cherry picked from commit dec7f937ae)

Co-authored-by: Samuel Attard <sam@electronjs.org>

* build: fixup attestation for release assets (#49732)

* build: fixup attestation for release assets

* Generate artifact attestation for generated artifacts

* set id-token for attestation

* Add artifact-metadata permission for attestation

* add permissions for testing attestations

* Revert "add permissions for testing attestations"

This reverts commit 0284bed175.

* Revert "set id-token for attestation"

This reverts commit 69a1b13a18.

* Revert "Generate artifact attestation for generated artifacts"

This reverts commit ee0536eceb.

(cherry picked from commit 0852893910)

Co-authored-by: John Kleinschmidt <jkleinsc@electronjs.org>

* chore: update publish workflow

Co-authored-by: John Kleinschmidt <kleinschmidtorama@gmail.com>

---------

Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: Samuel Attard <sam@electronjs.org>
Co-authored-by: John Kleinschmidt <jkleinsc@electronjs.org>
Co-authored-by: John Kleinschmidt <kleinschmidtorama@gmail.com>
2026-02-16 10:51:09 +01:00
Keeley Hammond
9fed98cee5 chore: cherry-pick e045399a1ecb from chromium (#49792)
* chore: cherry-pick e045399a1ecb from chromium

* chore: update patch

* chore: fix older method in patch
2026-02-13 10:27:37 +01:00
trop[bot]
d4d1596d2f fix: menu state in macOS dock menus (#49627)
Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: Shelley Vohr <shelley.vohr@gmail.com>
2026-02-04 18:54:16 +01:00
trop[bot]
356bba8060 fix: duplicate fullscreen macOS menu item (#49596)
Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: Shelley Vohr <shelley.vohr@gmail.com>
2026-02-02 11:50:01 -05:00
trop[bot]
ecbe8ee08a test: remove split dependency (#49556)
Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: David Sanders <dsanders11@ucsbalum.com>
2026-01-28 11:49:51 -08:00
trop[bot]
8fb777dab0 revert: use deprecated setAllowedFileTypes in macOS dialogs (#49471)
* revert: use deprecated setAllowedFileTypes in macOS dialogs

Closes https://github.com/electron/electron/issues/48191

Co-authored-by: Shelley Vohr <shelley.vohr@gmail.com>

* chore: remove stray import

Co-authored-by: Shelley Vohr <shelley.vohr@gmail.com>

---------

Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: Shelley Vohr <shelley.vohr@gmail.com>
2026-01-23 11:25:38 -05:00
electron-roller[bot]
c8af46e054 chore: bump node to v22.22.0 (38-x-y) (#49388)
* chore: bump node in DEPS to v22.22.0

* chore: update patches

* chore: fixup sandboxed pointers patch

(cherry picked from commit f52fbdbe51)

* tls: route callback exceptions through error handlers

https://github.com/nodejs-private/node-private/pull/782
(cherry picked from commit 87bc8ebd34)
(cherry picked from commit 2b6f185521)

* chore: remove zero-fill sandbox patch component

xref https://github.com/electron/electron/pull/49452

(cherry picked from commit bdb87f9dbb)

* fixup! chore: remove zero-fill sandbox patch component

(cherry picked from commit 6a4e4e3821)
Co-Authored-By: Robo <hop2deep@gmail.com>

* test: correct conditional secure heap flags test

xref:  https://github.com/nodejs/node/pull/60385
(cherry picked from commit 1304ff2d83)

---------

Co-authored-by: electron-roller[bot] <84116207+electron-roller[bot]@users.noreply.github.com>
Co-authored-by: John Kleinschmidt <kleinschmidtorama@gmail.com>
Co-authored-by: Shelley Vohr <shelley.vohr@gmail.com>
Co-authored-by: deepak1556 <hop2deep@gmail.com>
2026-01-23 10:48:43 -05:00
trop[bot]
e342216d9e ci: detect patch needs update error with problem matcher (#49410)
Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: David Sanders <dsanders11@ucsbalum.com>
2026-01-21 20:45:00 -08:00
trop[bot]
90b4003ad5 build: roll build-image to a82b87d (#49451)
* build: roll build-image to a82b87d

Co-authored-by: Shelley Vohr <shelley.vohr@gmail.com>

* build: roll build-tools SHA to 4430e4a

(cherry picked from commit b989c070c6)

---------

Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: Shelley Vohr <shelley.vohr@gmail.com>
2026-01-21 15:18:11 +01:00
trop[bot]
ab3292cf81 chore: improvements to script/run-clang-tidy.ts (#49344)
* chore: disable color output for clang-tidy in CI

Co-authored-by: David Sanders <dsanders11@ucsbalum.com>

* chore: small QoL improvements to run-clang-tidy.ts

Co-authored-by: David Sanders <dsanders11@ucsbalum.com>

* chore: add --fix option to script/run-clang-tidy.ts

Co-authored-by: David Sanders <dsanders11@ucsbalum.com>

---------

Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: David Sanders <dsanders11@ucsbalum.com>
2026-01-12 11:01:25 +01:00
John Kleinschmidt
75ee26902b build: use @electron-ci/dev-root for package.json default (#49326)
* build: use @electron-ci/dev-root for package.json default

(cherry picked from commit bab6bd3dae)

* fixup

(cherry picked from commit 218300e57f)
2026-01-08 10:10:58 -06:00
trop[bot]
a586dd3045 build: fixup release notes generation (#49306)
Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: John Kleinschmidt <kleinschmidtorama@gmail.com>
2026-01-05 14:55:26 -06:00
trop[bot]
bd1561a5b5 ci: disallow non-maintainer changes to Yarn files (#49248)
Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: David Sanders <dsanders11@ucsbalum.com>
2026-01-05 10:37:58 -05:00
David Sanders
bc8ecdf96f build: disallow non-maintainer changes to GitHub Actions workflows (#49232)
build: disallow non-maintainer changes to GitHub Actions workflows (#48038)

Co-authored-by: Shelley Vohr <shelley.vohr@gmail.com>
2025-12-18 16:23:04 -08:00
trop[bot]
2515880814 build: drop dugite as a dependency (#49207)
Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: John Kleinschmidt <kleinschmidtorama@gmail.com>
2025-12-16 15:56:45 -05:00
trop[bot]
4c06de632e build: upgrade yarn to 4.12.0 (#49182)
Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: John Kleinschmidt <kleinschmidtorama@gmail.com>
2025-12-11 10:34:02 -05:00
John Kleinschmidt
6b2861d063 build: upgrade github-app-auth to 3.2.0 (#49162)
build: upgrade github-app-auth to 3.2.0 (#49152)
2025-12-10 10:51:26 -05:00
trop[bot]
9692c9ea58 ci: don't build yarn modules for linux arm (#49085)
This should fix the oom errors

Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: John Kleinschmidt <jkleinsc@electronjs.org>
2025-12-01 11:42:47 -08:00
trop[bot]
ecb6b6c1c1 ci: use clang problem matcher with nan spec runner (#49100)
Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: David Sanders <dsanders11@ucsbalum.com>
2025-11-27 11:11:39 +01:00
trop[bot]
269a5393c0 fix: ensure menu-did-close is emitted for application menus (#49094)
fix: ensure menu-did-close is emitted for application menus

Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: Shelley Vohr <shelley.vohr@gmail.com>
2025-11-26 18:35:54 -06:00
Shelley Vohr
822fb2cd4d fix: systemPreferences.getAccentColor inverted color (#49066)
2025-11-25 15:47:09 -06:00
John Kleinschmidt
32fcfe4505 test: fixup test failures on linux (#49068)
* test: fixup spec runner to properly fail on linux when tests fail

* test: fixup dbus tests

* test: disable context menu spellcheck tests on linux

https://github.com/electron/electron/pull/48657 broke those tests
(cherry picked from commit cc3c999148)

* test: rebuild native modules

(cherry picked from commit bb8e2a924b)

* fix: wait for devtools blur event in focus test to avoid race condition

(cherry picked from commit 6fd2575cbc)

* fix: wait for devtools blur event in focus test to avoid race condition

(cherry picked from commit ea830139af)

---------

Co-authored-by: Alice Zhao <alicelovescake@anthropic.com>
2025-11-25 20:49:33 +01:00
trop[bot]
933f0d50d1 docs: update linux build instructions (#49061)
* docs: update linux build instructions

Co-authored-by: John Kleinschmidt <jkleinsc@electronjs.org>

* Update docs/development/build-instructions-linux.md

Co-authored-by: Erick Zhao <ezhao@slack-corp.com>

Co-authored-by: John Kleinschmidt <jkleinsc@electronjs.org>

* Update docs/development/build-instructions-linux.md

Co-authored-by: Erick Zhao <ezhao@slack-corp.com>

Co-authored-by: John Kleinschmidt <jkleinsc@electronjs.org>

---------

Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: John Kleinschmidt <jkleinsc@electronjs.org>
2025-11-24 13:50:40 -06:00
trop[bot]
4bd6182e83 fix: only call popup closecallback for top-level menu (#49047)
Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: Shelley Vohr <shelley.vohr@gmail.com>
2025-11-23 21:35:11 +01:00
John Kleinschmidt
3e6dd7f771 build: update to yarn v4 (#48995)
* build: update to yarn v4

(cherry picked from commit 6adec744f3)

* chore: fixup types after yarn v4 migration

* chore: update nan yarn.lock file

* build: automatically install git for dugite
2025-11-20 10:13:44 -05:00
trop[bot]
9b89d19b1b fix: revert the parent window remained interactive after the modal window was opened (#49020)
Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: bill.shen <15865969+cucbin@users.noreply.github.com>
2025-11-19 16:59:27 -05:00
Keeley Hammond
38cb7ab080 chore: cherry-pick 62af07e96173 from v8 (#49009)
* chore: cherry-pick 62af07e96173 from v8

* chore: update patches

* test: move to macos-14-large (needed for Intel)
2025-11-18 15:47:15 -08:00
Robo
f6f0843536 chore: cherry-pick 9fcb46c from v8 (#48984)
2025-11-17 09:44:53 +01:00
trop[bot]
4cc7821d01 build: limit workflow gh token permissions (#48968)
* build: limit workflow gh token permissions

Co-authored-by: Samuel Attard <samuel.r.attard@gmail.com>

* feedback

Co-authored-by: Samuel Attard <sattard@anthropic.com>

---------

Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: Samuel Attard <samuel.r.attard@gmail.com>
Co-authored-by: Samuel Attard <sattard@anthropic.com>
2025-11-15 11:32:26 +01:00
Samuel Attard
17c909924c chore: cherry-pick 4cf9311810b0 from v8 (#48950)
* chore: cherry-pick 4cf9311810b0 from v8

* chore: update patches

---------

Co-authored-by: Keeley Hammond <khammond@slack-corp.com>
2025-11-13 14:57:06 -08:00
Fedor Indutny
40d65d5a9f fix: crash on windows when UTF-8 is in path (#48947)
In 6399527761 we changed the path strings
that `node_modules.cc` operates on from single-byte to wide strings.
Unfortunately, this means that `generic_path()`, which the
"fix: ensure TraverseParent bails on resource path exit" patch was
calling, is no longer safe to call on Windows when the underlying
string contains Unicode characters.

Here we fix it by using `ConvertGenericPathToUTF8` from the Node.js
internal utilities.
2025-11-13 14:49:48 -08:00
Nikita Skovoroda
b32853b8aa fix: devtools crashing on Linux in detach mode (#48926)
Backport of https://github.com/electron/electron/pull/48600
(8756e82b5f24dcda13225968c3655d37f73d195e)

Co-authored-by: Shelley Vohr <shelley.vohr@gmail.com>
2025-11-13 15:28:25 -05:00
trop[bot]
5e9c442b2a fix: restore window's canHide property on macOS (#48900)
Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: bill.shen <15865969+cucbin@users.noreply.github.com>
2025-11-13 15:24:27 -05:00
trop[bot]
c550d938c4 feat: add bypassCustomProtocolHandlers option to net.request (#48881)
* feat: add bypassCustomProtocolHandlers option to net.request

Co-authored-by: Kai <udbmnm@163.com>

* style: fix lint errors in api-protocol-spec

Co-authored-by: Kai <udbmnm@163.com>

---------

Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: Kai <udbmnm@163.com>
2025-11-13 10:34:11 -05:00
trop[bot]
9f19d58510 feat: add app.isHardwareAccelerationEnabled() (#48681)
* feat: add app.isHardwareAccelerationEnabled()

Co-authored-by: Shelley Vohr <shelley.vohr@gmail.com>

* chore: address review feedback

Co-authored-by: Shelley Vohr <shelley.vohr@gmail.com>

---------

Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: Shelley Vohr <shelley.vohr@gmail.com>
2025-11-13 10:33:11 -05:00
trop[bot]
41bcdd71fe fix: the parent window remained interactive after the modal window was opened (#48866)
fix: fix the issue where the parent window remained interactive after the modal window was opened in some cases.

Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: Bill Shen <15865969+cucbin@users.noreply.github.com>
2025-11-13 16:08:29 +01:00
trop[bot]
ca1b9e1c2e ci: exclude top-level docs files from full CI (#48897)
Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: Shelley Vohr <shelley.vohr@gmail.com>
2025-11-11 21:06:44 +01:00
trop[bot]
a9ce0cdf52 fix: ESM-from-CJS import when CJK is in path (#48876)
* fix: ESM-from-CJS import when CJK is in path

Upstream fix: https://github.com/nodejs/node/pull/60575

Co-authored-by: Fedor Indutny <indutny@signal.org>

* chore: update patches

---------

Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: Fedor Indutny <indutny@signal.org>
Co-authored-by: John Kleinschmidt <jkleinsc@electronjs.org>
2025-11-11 12:21:58 +01:00
Shelley Vohr
3e77a1a359 feat: enable resetting accent color (#48853)
2025-11-10 16:45:40 -05:00
trop[bot]
074cedd561 feat: Focus DevTools when breakpoint is triggered (#48701)
The `bringToFront` DevTools message is sent when a breakpoint is triggered
or `inspect` is called. Upon receiving this message, Chromium activates
DevTools via `DevToolsUIBindings::Delegate::ActivateWindow`:
```
void DevToolsWindow::ActivateWindow() {
  if (life_stage_ != kLoadCompleted)
    return;
#if BUILDFLAG(IS_ANDROID)
  NOTIMPLEMENTED();
#else
  if (is_docked_ && GetInspectedBrowserWindow())
    main_web_contents_->Focus();
  else if (!is_docked_ && browser_ && !browser_->window()->IsActive())
    browser_->window()->Activate();
#endif
}
```

Electron also implements this interface, in
`electron::InspectableWebContents`. However, that implementation only set
a zoom level, so this commit extends it to also activate the DevTools.

Only supported for DevTools managed by `electron::InspectableWebContents`.

Closes: #37388

Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: Michał Pichliński <michal.pichlinski@here.io>
2025-11-10 16:41:46 -05:00
trop[bot]
6140359cd3 fix: oom crash in v8 when optimizing wasm (#48817)
* fix: oom crash in v8 when optimizing wasm

Co-authored-by: deepak1556 <hop2deep@gmail.com>

* chore: update patches

---------

Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: deepak1556 <hop2deep@gmail.com>
Co-authored-by: John Kleinschmidt <jkleinsc@electronjs.org>
2025-11-08 10:52:10 +01:00
trop[bot]
a924f1a629 fix: CSD window frame tiles properly on Wayland (#48836)
fix: CSD window frame tiles properly on Linux

Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: Mitchell Cohen <mitch.cohen@me.com>
2025-11-07 18:41:47 +01:00
trop[bot]
3d5f13a44a fix(reland): allow disabling all NSMenuItems (#48829)
* fix: allow disabling all `NSMenuItems` (#48598)

fix: allow disabling all NSMenuItems

Co-authored-by: Shelley Vohr <shelley.vohr@gmail.com>

* fix: add guard for type

Co-authored-by: George Xu <george.xu@slack-corp.com>

---------

Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: Shelley Vohr <shelley.vohr@gmail.com>
Co-authored-by: George Xu <george.xu@slack-corp.com>
2025-11-07 10:37:00 +01:00
trop[bot]
88a4d1c593 fix: revert allow disabling all NSMenuItems, fix menu crash (#48801)
Revert "fix: allow disabling all `NSMenuItems` (#48598)"

This reverts commit 0cb4fdd0f2.

Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: Keeley Hammond <khammond@slack-corp.com>
2025-11-06 10:56:16 -08:00
trop[bot]
fea4fadeda build: use --keep-non-patch flag with git am (#48806)
Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: David Sanders <dsanders11@ucsbalum.com>
2025-11-06 12:00:11 +01:00
trop[bot]
60ff1a18ac fix: draw smoothing round rect corner (#48780)
Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: Bill Shen <15865969+cucbin@users.noreply.github.com>
2025-11-05 18:26:38 -05:00
trop[bot]
4b13582af0 fix: release mouse buttons on focus loss on Wayland (#48757)
* fix: release mouse buttons on focus loss on Wayland

Co-authored-by: Mitchell Cohen <mitch.cohen@me.com>

* chore: update patches after trop

---------

Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: Mitchell Cohen <mitch.cohen@me.com>
Co-authored-by: John Kleinschmidt <jkleinsc@electronjs.org>
2025-11-04 12:18:56 +01:00
electron-roller[bot]
3083fab4e4 chore: bump node to v22.21.1 (38-x-y) (#48615)
* chore: bump node in DEPS to v22.21.0

* chore: bump node in DEPS to v22.21.1

* chore: update patches

* lib,src: refactor assert to load error source from memory

nodejs/node#59751

* src: add percentage support to --max-old-space-size

nodejs/node#59082

---------

Co-authored-by: electron-roller[bot] <84116207+electron-roller[bot]@users.noreply.github.com>
Co-authored-by: John Kleinschmidt <jkleinsc@electronjs.org>
2025-10-30 17:05:42 +01:00
trop[bot]
112489328c fix: allow disabling all NSMenuItems (#48710)
fix: allow disabling all NSMenuItems

Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: Shelley Vohr <shelley.vohr@gmail.com>
2025-10-30 12:07:50 +01:00
trop[bot]
8f23e5a426 fix: use correct signal variable in nan-spec-runner install check (#48708)
The install process spawn was not capturing its own signal variable,
causing the error check to incorrectly reference the build signal
instead. This could lead to:
- Install termination by signal going undetected
- False positive errors when build was killed but install succeeded

This commit ensures the install signal is properly captured and
checked, matching the pattern used for the build process.

Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: KinshukSS2 <kinshuk380@gmail.com>
2025-10-29 14:11:29 +01:00
trop[bot]
e308928159 ci: use <sup> in release notes generator (#48698)
Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: Erick Zhao <erick@hotmail.ca>
2025-10-28 10:56:59 -04:00
trop[bot]
c1eb83c659 docs: add net.isOnline() to online/offline detection tutorial (#48684)
* docs: add net.isOnline() to online/offline detection tutorial

Co-authored-by: CuzImSlymi <fridolinojustin@gmail.com>

* chore: make linter happy

docs/tutorial/online-offline-events.md:12:1 MD004/ul-style Unordered list style [Expected: dash; Actual: asterisk]
docs/tutorial/online-offline-events.md:13:1 MD004/ul-style Unordered list style [Expected: dash; Actual: asterisk]

Co-authored-by: Charles Kerr <charles@charleskerr.com>

---------

Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: CuzImSlymi <fridolinojustin@gmail.com>
Co-authored-by: Charles Kerr <charles@charleskerr.com>
2025-10-28 08:54:08 +01:00
trop[bot]
16099b6cf5 ci: add more fields to Slack payload for backport requested message (#48687)
* ci: add more fields to Slack payload for backport requested message

Co-authored-by: David Sanders <dsanders11@ucsbalum.com>

* chore: wrap values with toJSON

Co-authored-by: David Sanders <dsanders11@ucsbalum.com>

---------

Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: David Sanders <dsanders11@ucsbalum.com>
2025-10-27 19:33:04 -07:00
trop[bot]
74952bd7b4 fix: crash when inspector evaluates on provisional frames (#48513)
* fix: crash when inspector evaluates on provisional frames

Co-authored-by: deepak1556 <hop2deep@gmail.com>

* chore: update .patches

* chore: update patches

---------

Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: deepak1556 <hop2deep@gmail.com>
Co-authored-by: John Kleinschmidt <jkleinsc@electronjs.org>
2025-10-27 14:14:26 -05:00
trop[bot]
3d59235245 fix: logical bug in install.js env var handling (#48672)
If either `npm_config_electron_use_remote_checksums` or
`electron_use_remote_checksums` is set as an environment variable, then
force Electron to verify with remote checksums instead of embedded ones.

Fixes #48594.

Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: Charles Kerr <charles@charleskerr.com>
2025-10-27 14:47:54 -04:00
trop[bot]
370a737ced docs: modify the thickFrame doc (#48678)
* doc: modify the thickFrame doc

Co-authored-by: zoy <zoy-l@outlook.com>

* chore: update description

Co-authored-by: John Kleinschmidt <kleinschmidtorama@gmail.com>

Co-authored-by: zoy <zoy-l@outlook.com>

* update format

Co-authored-by: zoy <zoy-l@outlook.com>

---------

Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: zoy <zoy-l@outlook.com>
2025-10-27 12:51:45 -05:00
trop[bot]
6b98259971 docs: fix Ubuntu version used to build Electron (#48644)
Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: Niklas Wenzel <dev@nikwen.de>
2025-10-27 11:24:55 +01:00
trop[bot]
c1097edd15 feat: enable more granular a11y feature management (#48626)
* feat: enable more granular a11y feature management

Co-authored-by: Shelley Vohr <shelley.vohr@gmail.com>

* Update docs/api/app.md

Co-authored-by: John Kleinschmidt <jkleinsc@electronjs.org>

Co-authored-by: Shelley Vohr <shelley.vohr@gmail.com>

---------

Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: Shelley Vohr <shelley.vohr@gmail.com>
2025-10-27 11:24:23 +01:00
trop[bot]
a0be2f521d fix: crash on empty dialog extensions array on Windows (#48660)
Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: Shelley Vohr <shelley.vohr@gmail.com>
2025-10-27 09:17:27 +01:00
trop[bot]
d14320748d docs: security.md mark 'Enable process sandboxing' as active by defau… (#48649)
* docs: security.md mark 'Enable process sandboxing' as active by default since electron 20

Co-authored-by: LeUser111 <florian.wiedenmann@grob.de>

* Adjusted according to feedback

Co-authored-by: LeUser111 <florian.wiedenmann@grob.de>

* Updated according to feedback - adjusted sandbox.md

Co-authored-by: LeUser111 <florian.wiedenmann@grob.de>

* formatting

Co-authored-by: LeUser111 <florian.wiedenmann@grob.de>

* Fixed broken markup

Co-authored-by: LeUser111 <florian.wiedenmann@grob.de>

* Implemented docs linting suggestions

Co-authored-by: LeUser111 <florian.wiedenmann@grob.de>

* docs: docs/tutorial/sandbox.md - fixed typo

Co-authored-by: Erick Zhao <erick@hotmail.ca>

Co-authored-by: Teaveloper <49181620+LeUser111@users.noreply.github.com>

* docs: web-preferences.md - sandbox: mention default value and relation to nodeIntegration

Co-authored-by: LeUser111 <florian.wiedenmann@grob.de>

---------

Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: LeUser111 <florian.wiedenmann@grob.de>
Co-authored-by: Teaveloper <49181620+LeUser111@users.noreply.github.com>
2025-10-24 11:16:38 +02:00
electron-roller[bot]
e840a3f13f chore: bump chromium to 140.0.7339.249 (38-x-y) (#48569)
chore: bump chromium in DEPS to 140.0.7339.249

Co-authored-by: electron-roller[bot] <84116207+electron-roller[bot]@users.noreply.github.com>
2025-10-23 11:57:21 -04:00
trop[bot]
9008cf70f5 fix: background hover contrast for WCO buttons (#48595)
Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: Shelley Vohr <shelley.vohr@gmail.com>
2025-10-23 16:09:53 +02:00
trop[bot]
c9c048196a fix: icon in Windows toast notification (#48630)
Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: Shelley Vohr <shelley.vohr@gmail.com>
2025-10-22 13:55:41 +02:00
trop[bot]
955afdd92b fix: trafficLightPosition incorrect with customButtonsOnHover (#48620)
fix: trafficLightPosition incorrect with customButtonsOnHover

Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: Shelley Vohr <shelley.vohr@gmail.com>
2025-10-21 19:30:59 +02:00
trop[bot]
5e29e21a60 fix: position window titlebar buttons correctly in Ubuntu on Wayland (#48602)
fix: position window titlebar buttons correctly in Ubuntu on Wayland (#48490)

Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: Mitchell Cohen <mitch.cohen@me.com>
2025-10-21 15:26:57 +02:00
trop[bot]
d0db2ec333 feat: dynamic ESM import in preload without context isolation (#48489)
Dynamic ESM import in non-context-isolated preload

Extend `HostImportModuleWithPhaseDynamically`'s routing to support
Node.js import resolution in non-context-isolated preloads through a
`v8_host_defined_options` length check. The length of the host-defined
options differs between Blink and Node.js, so it can be used to
determine which resolver to use.

Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: Fedor Indutny <indutny@signal.org>
2025-10-21 07:28:25 +02:00
John Kleinschmidt
4a2f733d0a build: use one build target (#48527) (#48605)
Optimizes our builds for use with siso/avoids file contention on Windows
2025-10-21 07:26:22 +02:00
trop[bot]
be4805afdd fix: fixed white flash on call to BrowserWindow.show (#48560)
Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: Cezary Kulakowski <cezary@openfin.co>
2025-10-16 14:49:17 +02:00
trop[bot]
41721fc82e fix: enable shader-f16 on windows (#48555)
* fix: Enable shader-f16 on Windows

Co-authored-by: creeper-0910 <56744841+creeper-0910@users.noreply.github.com>

* fix: include dxil.dll and dxcompiler.dll for windows x64 and arm64

Co-authored-by: creeper-0910 <56744841+creeper-0910@users.noreply.github.com>

* fix: modified to follow the chromium dawn build configuration

Co-authored-by: creeper-0910 <56744841+creeper-0910@users.noreply.github.com>

* fix: include dxil.dll and dxcompiler.dll for windows x86

Co-authored-by: creeper-0910 <56744841+creeper-0910@users.noreply.github.com>

* fix: Modified to avoid explicitly specifying dawn_use_built_dxc

Co-authored-by: creeper-0910 <56744841+creeper-0910@users.noreply.github.com>

---------

Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: creeper-0910 <56744841+creeper-0910@users.noreply.github.com>
2025-10-14 23:46:01 -07:00
178 changed files with 22865 additions and 11995 deletions

View File

@@ -2,7 +2,7 @@ version: '3'
 services:
   buildtools:
-    image: ghcr.io/electron/devcontainer:933c7d6ff6802706875270bec2e3c891cf8add3f
+    image: ghcr.io/electron/devcontainer:a82b87d7a4f5ff0cab61405f8151ac4cf4942aeb
     volumes:
       - ..:/workspaces/gclient/src/electron:cached

View File

@@ -60,7 +60,11 @@ runs:
           sudo launchctl limit maxfiles 65536 200000
         fi
-        NINJA_SUMMARIZE_BUILD=1 e build
+        if [ "${{ inputs.is-release }}" = "true" ]; then
+          NINJA_SUMMARIZE_BUILD=1 e build --target electron:release_build
+        else
+          NINJA_SUMMARIZE_BUILD=1 e build --target electron:testing_build
+        fi
         cp out/Default/.ninja_log out/electron_ninja_log
         node electron/script/check-symlinks.js
@@ -70,11 +74,10 @@ runs:
         else
           echo "Skipping build-stats.mjs upload because DD_API_KEY is not set"
         fi
-    - name: Build Electron dist.zip ${{ inputs.step-suffix }}
+    - name: Verify dist.zip ${{ inputs.step-suffix }}
       shell: bash
       run: |
-        cd src
-        e build --target electron:electron_dist_zip
+        cd src
         if [ "${{ inputs.is-asan }}" != "true" ]; then
           target_os=${{ inputs.target-platform == 'macos' && 'mac' || inputs.target-platform }}
           if [ "${{ inputs.artifact-platform }}" = "mas" ]; then
@@ -82,11 +85,10 @@ runs:
           fi
           electron/script/zip_manifests/check-zip-manifest.py out/Default/dist.zip electron/script/zip_manifests/dist_zip.$target_os.${{ inputs.target-arch }}.manifest
         fi
-    - name: Build Mksnapshot ${{ inputs.step-suffix }}
+    - name: Fixup Mksnapshot ${{ inputs.step-suffix }}
       shell: bash
       run: |
         cd src
-        e build --target electron:electron_mksnapshot_zip
         ELECTRON_DEPOT_TOOLS_DISABLE_LOG=1 e d gn desc out/Default v8:run_mksnapshot_default args > out/Default/mksnapshot_args
         # Remove unused args from mksnapshot_args
         SEDOPTION="-i"
@@ -130,11 +132,6 @@ runs:
       run: |
         cd src
        e build --target electron:electron_chromedriver_zip
-    - name: Build Node.js headers ${{ inputs.step-suffix }}
-      shell: bash
-      run: |
-        cd src
-        e build --target electron:node_headers
     - name: Create installed_software.json ${{ inputs.step-suffix }}
       shell: powershell
       if: ${{ inputs.is-release == 'true' && inputs.target-platform == 'win' }}
@@ -154,17 +151,11 @@ runs:
         # Needed for msdia140.dll on 64-bit windows
         cd src
         export PATH="$PATH:$(pwd)/third_party/llvm-build/Release+Asserts/bin"
-    - name: Generate & Zip Symbols ${{ inputs.step-suffix }}
+    - name: Zip Symbols ${{ inputs.step-suffix }}
       shell: bash
       run: |
-        # Generate breakpad symbols on release builds
-        if [ "${{ inputs.generate-symbols }}" = "true" ]; then
-          e build --target electron:electron_symbols
-        fi
         cd src
         export BUILD_PATH="$(pwd)/out/Default"
-        e build --target electron:licenses
-        e build --target electron:electron_version_file
         if [ "${{ inputs.is-release }}" = "true" ]; then
           DELETE_DSYMS_AFTER_ZIP=1 electron/script/zip-symbols.py -b $BUILD_PATH
         else
@@ -177,18 +168,6 @@ runs:
         cd src
         gn gen out/ffmpeg --args="import(\"//electron/build/args/ffmpeg.gn\") use_remoteexec=true use_siso=true $GN_EXTRA_ARGS"
         e build --target electron:electron_ffmpeg_zip -C ../../out/ffmpeg
-    - name: Generate Hunspell Dictionaries ${{ inputs.step-suffix }}
-      shell: bash
-      if: ${{ inputs.is-release == 'true' && inputs.target-platform == 'linux' }}
-      run: |
-        e build --target electron:hunspell_dictionaries_zip
-    - name: Generate Libcxx ${{ inputs.step-suffix }}
-      shell: bash
-      if: ${{ inputs.is-release == 'true' && inputs.target-platform == 'linux' }}
-      run: |
-        e build --target electron:libcxx_headers_zip
-        e build --target electron:libcxxabi_headers_zip
-        e build --target electron:libcxx_objects_zip
     - name: Remove Clang problem matcher
       shell: bash
       run: echo "::remove-matcher owner=clang::"
@@ -197,10 +176,11 @@ runs:
shell: bash
run: |
cd src/electron
node script/yarn create-typescript-definitions
node script/yarn.js create-typescript-definitions
- name: Publish Electron Dist ${{ inputs.step-suffix }}
if: ${{ inputs.is-release == 'true' }}
shell: bash
id: github-upload
run: |
rm -rf src/out/Default/obj
cd src/electron
@@ -211,6 +191,11 @@ runs:
echo 'Uploading Electron release distribution to GitHub releases'
script/release/uploaders/upload.py --verbose
fi
- name: Generate artifact attestation
if: ${{ inputs.is-release == 'true' }}
uses: actions/attest-build-provenance@96278af6caaf10aea03fd8d33a09a777ca52d62f # v3.2.0
with:
subject-path: ${{ steps.github-upload.outputs.UPLOADED_PATHS }}
- name: Generate siso report
if: ${{ inputs.target-platform != 'win' && !cancelled() }}
shell: bash


@@ -143,16 +143,17 @@ runs:
echo "No changes to patches detected"
fi
fi
- name: Remove patch conflict problem matcher
- name: Remove patch conflict problem matchers
shell: bash
run: |
echo "::remove-matcher owner=merge-conflict::"
echo "::remove-matcher owner=patch-conflict::"
echo "::remove-matcher owner=patch-needs-update::"
- name: Upload patches stats
if: ${{ inputs.target-platform == 'linux' && github.ref == 'refs/heads/main' }}
shell: bash
run: |
npx node src/electron/script/patches-stats.mjs --upload-stats || true
node src/electron/script/patches-stats.mjs --upload-stats || true
# delete all .git directories under src/ except for
# third_party/angle/ and third_party/dawn/ because of build time generation of files
# gen/angle/commit.h depends on third_party/angle/.git/HEAD


@@ -13,12 +13,16 @@ runs:
- name: Generating Types for SHA in ${{ inputs.sha-file }}
shell: bash
run: |
git checkout $(cat ${{ inputs.sha-file }})
rm -rf node_modules
yarn install --frozen-lockfile --ignore-scripts
export ELECTRON_DIR=$(pwd)
if [ "${{ inputs.sha-file }}" == ".dig-old" ]; then
cd /tmp
git clone https://github.com/electron/electron.git
cd electron
fi
git checkout $(cat $ELECTRON_DIR/${{ inputs.sha-file }})
node script/yarn.js install --immutable
echo "#!/usr/bin/env node\nglobal.x=1" > node_modules/typescript/bin/tsc
node node_modules/.bin/electron-docs-parser --dir=./ --outDir=./ --moduleVersion=0.0.0-development
node node_modules/.bin/electron-typescript-definitions --api=electron-api.json --outDir=artifacts
mv artifacts/electron.d.ts artifacts/${{ inputs.filename }}
git checkout .
mv artifacts/electron.d.ts $ELECTRON_DIR/artifacts/${{ inputs.filename }}
working-directory: ./electron


@@ -15,7 +15,7 @@ runs:
git config --global core.preloadindex true
git config --global core.longpaths true
fi
export BUILD_TOOLS_SHA=706147b2376f55078f718576b28129a0457f1795
export BUILD_TOOLS_SHA=a0cc95a1884a631559bcca0c948465b725d9295a
npm i -g @electron/build-tools
# Update depot_tools to ensure python
e d update_depot_tools


@@ -6,7 +6,7 @@ runs:
- name: Get yarn cache directory path
shell: bash
id: yarn-cache-dir-path
run: echo "dir=$(node src/electron/script/yarn cache dir)" >> $GITHUB_OUTPUT
run: echo "dir=$(node src/electron/script/yarn.js config get cacheFolder)" >> $GITHUB_OUTPUT
- uses: actions/cache@5a3ec84eff668545956fd18022155c47e93e2684 # v4.2.3
id: yarn-cache
with:
@@ -18,4 +18,14 @@ runs:
shell: bash
run: |
cd src/electron
node script/yarn install --frozen-lockfile --prefer-offline
if [ "$TARGET_ARCH" = "x86" ]; then
export npm_config_arch="ia32"
fi
# if running on linux arm skip yarn Builds
ARCH=$(uname -m)
if [ "$ARCH" = "armv7l" ]; then
echo "Skipping yarn build on linux arm"
node script/yarn.js install --immutable --mode=skip-build
else
node script/yarn.js install --immutable
fi
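The arch-dependent install logic above can be exercised in isolation. A minimal sketch, assuming the same branch on `uname -m`; the function name is illustrative, not part of the workflow:

```shell
# Mirror of the install-mode branch above: armv7l runners skip yarn's build step.
yarn_install_mode() {
  local uname_m="$1"
  if [ "$uname_m" = "armv7l" ]; then
    echo "--immutable --mode=skip-build"
  else
    echo "--immutable"
  fi
}

yarn_install_mode armv7l   # -> --immutable --mode=skip-build
yarn_install_mode x86_64   # -> --immutable
```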

.github/copilot-instructions.md (new file, 122 lines)

@@ -0,0 +1,122 @@
# Copilot Instructions for Electron
## Build System
Electron uses `@electron/build-tools` (`e` CLI). Install with `npm i -g @electron/build-tools`.
```bash
e sync # Fetch sources and apply patches
e build # Build Electron (GN + Ninja)
e build -k 999 # Build, continuing through errors
e start # Run built Electron
e start --version # Verify Electron launches
e test # Run full test suite
e debug # Run in debugger (lldb on macOS, gdb on Linux)
```
### Linting
```bash
npm run lint # Run all linters (JS, C++, Python, GN, docs)
npm run lint:js # JavaScript/TypeScript only
npm run lint:clang-format # C++ formatting only
npm run lint:cpp # C++ linting only
npm run lint:docs # Documentation only
```
### Running a Single Test
```bash
npm run test -- -g "pattern" # Run tests matching a regex pattern
# Example: npm run test -- -g "ipc"
```
### Running a Single Node.js Test
```bash
node script/node-spec-runner.js parallel/test-crypto-keygen
```
## Architecture
Electron embeds Chromium (rendering) and Node.js (backend) to enable desktop apps with web technologies. The parent directory (`../`) is the Chromium source tree.
### Process Model
Electron has two primary process types, mirroring Chromium:
- **Main process** (`shell/browser/` + `lib/browser/`): Controls app lifecycle, creates windows, system APIs
- **Renderer process** (`shell/renderer/` + `lib/renderer/`): Runs web content in BrowserWindows
### Native ↔ JavaScript Bridge
Each API is implemented as a C++/JS pair:
- C++ side: `shell/browser/api/electron_api_{name}.cc/.h` — uses `gin::Wrappable` and `ObjectTemplateBuilder`
- JS side: `lib/browser/api/{name}.ts` — exports the module, registered in `lib/browser/api/module-list.ts`
- Binding: `NODE_LINKED_BINDING_CONTEXT_AWARE(electron_browser_{name}, Initialize)` in C++ and registered in `shell/common/node_bindings.cc`
- Type declaration: `typings/internal-ambient.d.ts` maps `process._linkedBinding('electron_browser_{name}')`
### Patches System
Electron patches upstream dependencies (Chromium, Node.js, V8, etc.) rather than forking them. Patches live in `patches/` organized by target, with `patches/config.json` mapping directories to repos.
```text
patches/{target}/*.patch → [e sync] → target repo commits
← [e patches] ←
```
Key rules:
- Fix existing patches rather than creating new ones
- Preserve original authorship in TODO comments — never change `TODO(name)` assignees
- Each patch commit message must explain why the patch exists
- After modifying patches, run `e patches {target}` to export
When working on the `roller/chromium/main` branch for Chromium upgrades, use `e sync --3` for 3-way merge conflict resolution.
## Conventions
### File Naming
- JS/TS files: kebab-case (`file-name.ts`)
- C++ files: snake_case with `electron_api_` prefix (`electron_api_safe_storage.cc`)
- Test files: `api-{module-name}-spec.ts` in `spec/`
- Source file lists are maintained in `filenames.gni` (with platform-specific sections)
### JavaScript/TypeScript
- Semicolons required (`"semi": ["error", "always"]`)
- `const` and `let` only (no `var`)
- Arrow functions preferred
- Import order enforced: `@electron/internal``@electron``electron` → external → builtin → relative
- API naming: `PascalCase` for classes (`BrowserWindow`), `camelCase` for module APIs (`globalShortcut`)
- Prefer getters/setters over jQuery-style `.text([text])` patterns
### C++
- Follows Chromium coding style, enforced by `clang-format` and `clang-tidy`
- Uses Chromium abstractions (`base::`, `content::`, etc.)
- Header guards: `#ifndef ELECTRON_SHELL_BROWSER_API_ELECTRON_API_{NAME}_H_`
- Platform-specific files: `_mac.mm`, `_win.cc`, `_linux.cc`
### Testing
- Framework: Mocha + Chai + Sinon
- Test helpers in `spec/lib/` (e.g., `spec-helpers.ts`, `window-helpers.ts`)
- Use `defer()` from spec-helpers for cleanup, `closeAllWindows()` for window teardown
- Tests import from `electron/main` or `electron/renderer`
### Documentation
- API docs in `docs/api/` as Markdown, parsed by `@electron/docs-parser` to generate `electron.d.ts`
- API history tracked via YAML blocks in HTML comments within doc files
- Docs must pass `npm run lint:docs`
### Build Configuration
- `BUILD.gn`: Main GN build config
- `buildflags/buildflags.gni`: Feature flags (PDF viewer, extensions, spellchecker)
- `build/args/`: Build argument profiles (`testing.gn`, `release.gn`, `all.gn`)
- `DEPS`: Dependency versions and checkout paths
- `chromium_src/`: Chromium source file overrides (compiled instead of originals)


@@ -19,6 +19,16 @@
"line": 3
}
]
},
{
"owner": "patch-needs-update",
"pattern": [
{
"regexp": "^((patches\/.*): needs update)$",
"message": 1,
"file": 2
}
]
}
]
}
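The new `patch-needs-update` matcher regex can be sanity-checked outside Actions with `grep -E` (a sketch; the sample lines are hypothetical):

```shell
# Matches only "<patches path>: needs update" lines, as the matcher JSON intends.
echo "patches/chromium/foo.patch: needs update" \
  | grep -E '^(patches/.*): needs update$'   # matches, prints the line
```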


@@ -3,10 +3,14 @@ name: Archaeologist
on:
pull_request:
permissions: {}
jobs:
archaeologist-dig:
name: Archaeologist Dig
runs-on: ubuntu-latest
permissions:
contents: read
steps:
- name: Checkout Electron
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 #v4.0.2


@@ -6,11 +6,15 @@ on:
schedule:
- cron: "0 0 * * *"
permissions: {}
jobs:
build-git-cache-linux:
runs-on: electron-arc-centralus-linux-amd64-32core
permissions:
contents: read
container:
image: ghcr.io/electron/build:bc2f48b2415a670de18d13605b1cf0eb5fdbaae1
image: ghcr.io/electron/build:a82b87d7a4f5ff0cab61405f8151ac4cf4942aeb
options: --user root
volumes:
- /mnt/cross-instance-cache:/mnt/cross-instance-cache
@@ -30,8 +34,10 @@ jobs:
build-git-cache-windows:
runs-on: electron-arc-centralus-linux-amd64-32core
permissions:
contents: read
container:
image: ghcr.io/electron/build:bc2f48b2415a670de18d13605b1cf0eb5fdbaae1
image: ghcr.io/electron/build:a82b87d7a4f5ff0cab61405f8151ac4cf4942aeb
options: --user root --device /dev/fuse --cap-add SYS_ADMIN
volumes:
- /mnt/win-cache:/mnt/win-cache
@@ -52,10 +58,12 @@ jobs:
build-git-cache-macos:
runs-on: electron-arc-centralus-linux-amd64-32core
permissions:
contents: read
# This job updates the same git cache as linux, so it needs to run after the linux one.
needs: build-git-cache-linux
container:
image: ghcr.io/electron/build:bc2f48b2415a670de18d13605b1cf0eb5fdbaae1
image: ghcr.io/electron/build:a82b87d7a4f5ff0cab61405f8151ac4cf4942aeb
options: --user root
volumes:
- /mnt/cross-instance-cache:/mnt/cross-instance-cache


@@ -6,7 +6,7 @@ on:
build-image-sha:
type: string
description: 'SHA for electron/build image'
default: '933c7d6ff6802706875270bec2e3c891cf8add3f'
default: 'a82b87d7a4f5ff0cab61405f8151ac4cf4942aeb'
required: true
skip-macos:
type: boolean
@@ -43,10 +43,13 @@ defaults:
run:
shell: bash
permissions: {}
jobs:
setup:
runs-on: ubuntu-latest
permissions:
contents: read
pull-requests: read
outputs:
docs: ${{ steps.filter.outputs.docs }}
@@ -63,13 +66,17 @@ jobs:
filters: |
docs:
- 'docs/**'
- README.md
- SECURITY.md
- CONTRIBUTING.md
- CODE_OF_CONDUCT.md
src:
- '!docs/**'
- name: Set Outputs for Build Image SHA & Docs Only
id: set-output
run: |
if [ -z "${{ inputs.build-image-sha }}" ]; then
echo "build-image-sha=933c7d6ff6802706875270bec2e3c891cf8add3f" >> "$GITHUB_OUTPUT"
echo "build-image-sha=a82b87d7a4f5ff0cab61405f8151ac4cf4942aeb" >> "$GITHUB_OUTPUT"
else
echo "build-image-sha=${{ inputs.build-image-sha }}" >> "$GITHUB_OUTPUT"
fi
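The `$GITHUB_OUTPUT` fallback above can be tried locally by pointing the variable at a temp file. A sketch assuming an empty `workflow_call` input; `input_sha` is an illustrative stand-in for `inputs.build-image-sha`:

```shell
# Simulate the step: empty input falls back to the pinned build-image SHA.
GITHUB_OUTPUT=$(mktemp)
input_sha=""
if [ -z "$input_sha" ]; then
  echo "build-image-sha=a82b87d7a4f5ff0cab61405f8151ac4cf4942aeb" >> "$GITHUB_OUTPUT"
else
  echo "build-image-sha=$input_sha" >> "$GITHUB_OUTPUT"
fi
cat "$GITHUB_OUTPUT"
```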
@@ -80,6 +87,8 @@ jobs:
needs: setup
if: ${{ !inputs.skip-lint }}
uses: ./.github/workflows/pipeline-electron-lint.yml
permissions:
contents: read
with:
container: '{"image":"ghcr.io/electron/build:${{ needs.setup.outputs.build-image-sha }}","options":"--user root"}'
secrets: inherit
@@ -89,6 +98,8 @@ jobs:
needs: [setup, checkout-linux]
if: ${{ needs.setup.outputs.docs-only == 'true' }}
uses: ./.github/workflows/pipeline-electron-docs-only.yml
permissions:
contents: read
with:
container: '{"image":"ghcr.io/electron/build:${{ needs.checkout-linux.outputs.build-image-sha }}","options":"--user root","volumes":["/mnt/cross-instance-cache:/mnt/cross-instance-cache"]}'
secrets: inherit
@@ -98,6 +109,8 @@ jobs:
needs: setup
if: ${{ needs.setup.outputs.src == 'true' && !inputs.skip-macos}}
runs-on: electron-arc-centralus-linux-amd64-32core
permissions:
contents: read
container:
image: ghcr.io/electron/build:${{ needs.setup.outputs.build-image-sha }}
options: --user root
@@ -126,6 +139,8 @@ jobs:
needs: setup
if: ${{ !inputs.skip-linux}}
runs-on: electron-arc-centralus-linux-amd64-32core
permissions:
contents: read
container:
image: ghcr.io/electron/build:${{ needs.setup.outputs.build-image-sha }}
options: --user root
@@ -155,6 +170,8 @@ jobs:
needs: setup
if: ${{ needs.setup.outputs.src == 'true' && !inputs.skip-windows }}
runs-on: electron-arc-centralus-linux-amd64-32core
permissions:
contents: read
container:
image: ghcr.io/electron/build:${{ needs.setup.outputs.build-image-sha }}
options: --user root --device /dev/fuse --cap-add SYS_ADMIN
@@ -185,6 +202,8 @@ jobs:
# GN Check Jobs
macos-gn-check:
uses: ./.github/workflows/pipeline-segment-electron-gn-check.yml
permissions:
contents: read
needs: checkout-macos
with:
target-platform: macos
@@ -195,6 +214,8 @@ jobs:
linux-gn-check:
uses: ./.github/workflows/pipeline-segment-electron-gn-check.yml
permissions:
contents: read
needs: checkout-linux
if: ${{ needs.setup.outputs.src == 'true' }}
with:
@@ -207,6 +228,8 @@ jobs:
windows-gn-check:
uses: ./.github/workflows/pipeline-segment-electron-gn-check.yml
permissions:
contents: read
needs: checkout-windows
with:
target-platform: win
@@ -226,7 +249,7 @@ jobs:
needs: checkout-macos
with:
build-runs-on: macos-14-xlarge
test-runs-on: macos-13
test-runs-on: macos-14-large
target-platform: macos
target-arch: x64
is-release: false
@@ -310,7 +333,7 @@ jobs:
build-runs-on: electron-arc-centralus-linux-amd64-32core
test-runs-on: electron-arc-centralus-linux-arm64-4core
build-container: '{"image":"ghcr.io/electron/build:${{ needs.checkout-linux.outputs.build-image-sha }}","options":"--user root","volumes":["/mnt/cross-instance-cache:/mnt/cross-instance-cache"]}'
test-container: '{"image":"ghcr.io/electron/test:arm32v7-${{ needs.checkout-linux.outputs.build-image-sha }}","options":"--user root --privileged --init","volumes":["/home/runner/externals:/mnt/runner-externals"]}'
test-container: '{"image":"ghcr.io/electron/test:arm32v7-${{ needs.checkout-linux.outputs.build-image-sha }}","options":"--user root --privileged --init --memory=12g","volumes":["/home/runner/externals:/mnt/runner-externals"]}'
target-platform: linux
target-arch: arm
is-release: false
@@ -400,6 +423,8 @@ jobs:
gha-done:
name: GitHub Actions Completed
runs-on: ubuntu-latest
permissions:
contents: read
needs: [docs-only, macos-x64, macos-arm64, linux-x64, linux-x64-asan, linux-arm, linux-arm64, windows-x64, windows-x86, windows-arm64]
if: always() && !contains(needs.*.result, 'failure')
steps:


@@ -1,16 +1,20 @@
name: Clean Source Cache
description: |
This workflow cleans up the source cache on the cross-instance cache volume
to free up space. It runs daily at midnight and clears files older than 15 days.
# Description:
# This workflow cleans up the source cache on the cross-instance cache volume
# to free up space. It runs daily at midnight and clears files older than 15 days.
on:
schedule:
- cron: "0 0 * * *"
permissions: {}
jobs:
clean-src-cache:
runs-on: electron-arc-centralus-linux-amd64-32core
permissions:
contents: read
container:
image: ghcr.io/electron/build:bc2f48b2415a670de18d13605b1cf0eb5fdbaae1
options: --user root


@@ -4,14 +4,15 @@ on:
issues:
types: [labeled]
permissions: # added using https://github.com/step-security/secure-workflows
contents: read
permissions: {}
jobs:
issue-labeled-with-status:
name: status/{confirmed,reviewed} label added
if: github.event.label.name == 'status/confirmed' || github.event.label.name == 'status/reviewed'
runs-on: ubuntu-latest
permissions:
contents: read
steps:
- name: Generate GitHub App token
uses: electron/github-app-auth-action@384fd19694fe7b6dcc9a684746c6976ad78228ae # v1.1.1
@@ -31,6 +32,8 @@ jobs:
name: blocked/* label added
if: startsWith(github.event.label.name, 'blocked/')
runs-on: ubuntu-latest
permissions:
contents: read
steps:
- name: Generate GitHub App token
uses: electron/github-app-auth-action@384fd19694fe7b6dcc9a684746c6976ad78228ae # v1.1.1


@@ -11,6 +11,7 @@ jobs:
add-to-issue-triage:
if: ${{ contains(github.event.issue.labels.*.name, 'bug :beetle:') }}
runs-on: ubuntu-latest
permissions: {}
steps:
- name: Generate GitHub App token
uses: electron/github-app-auth-action@384fd19694fe7b6dcc9a684746c6976ad78228ae # v1.1.1
@@ -28,6 +29,7 @@ jobs:
set-labels:
if: ${{ contains(github.event.issue.labels.*.name, 'bug :beetle:') }}
runs-on: ubuntu-latest
permissions: {}
steps:
- name: Generate GitHub App token
uses: electron/github-app-auth-action@384fd19694fe7b6dcc9a684746c6976ad78228ae # v1.1.1


@@ -10,6 +10,7 @@ jobs:
issue-transferred:
name: Issue Transferred
runs-on: ubuntu-latest
permissions: {}
if: ${{ !github.event.changes.new_repository.private }}
steps:
- name: Generate GitHub App token


@@ -4,14 +4,15 @@ on:
issues:
types: [unlabeled]
permissions:
contents: read
permissions: {}
jobs:
issue-unlabeled-blocked:
name: All blocked/* labels removed
if: startsWith(github.event.label.name, 'blocked/') && github.event.issue.state == 'open'
runs-on: ubuntu-latest
permissions:
contents: read
steps:
- name: Check for any blocked labels
id: check-for-blocked-labels


@@ -6,7 +6,7 @@ on:
build-image-sha:
type: string
description: 'SHA for electron/build image'
default: '933c7d6ff6802706875270bec2e3c891cf8add3f'
default: 'a82b87d7a4f5ff0cab61405f8151ac4cf4942aeb'
upload-to-storage:
description: 'Uploads to Azure storage'
required: false
@@ -17,9 +17,13 @@ on:
type: boolean
default: false
permissions: {}
jobs:
checkout-linux:
runs-on: electron-arc-centralus-linux-amd64-32core
permissions:
contents: read
container:
image: ghcr.io/electron/build:${{ inputs.build-image-sha }}
options: --user root
@@ -39,7 +43,12 @@ jobs:
uses: ./src/electron/.github/actions/checkout
publish-x64:
uses: ./.github/workflows/pipeline-segment-electron-build.yml
uses: ./.github/workflows/pipeline-segment-electron-publish.yml
permissions:
artifact-metadata: write
attestations: write
contents: read
id-token: write
needs: checkout-linux
with:
environment: production-release
@@ -54,7 +63,12 @@ jobs:
secrets: inherit
publish-arm:
uses: ./.github/workflows/pipeline-segment-electron-build.yml
uses: ./.github/workflows/pipeline-segment-electron-publish.yml
permissions:
artifact-metadata: write
attestations: write
contents: read
id-token: write
needs: checkout-linux
with:
environment: production-release
@@ -69,7 +83,12 @@ jobs:
secrets: inherit
publish-arm64:
uses: ./.github/workflows/pipeline-segment-electron-build.yml
uses: ./.github/workflows/pipeline-segment-electron-publish.yml
permissions:
artifact-metadata: write
attestations: write
contents: read
id-token: write
needs: checkout-linux
with:
environment: production-release


@@ -6,7 +6,7 @@ on:
build-image-sha:
type: string
description: 'SHA for electron/build image'
default: '933c7d6ff6802706875270bec2e3c891cf8add3f'
default: 'a82b87d7a4f5ff0cab61405f8151ac4cf4942aeb'
required: true
upload-to-storage:
description: 'Uploads to Azure storage'
@@ -18,9 +18,13 @@ on:
type: boolean
default: false
permissions: {}
jobs:
checkout-macos:
runs-on: electron-arc-centralus-linux-amd64-32core
permissions:
contents: read
container:
image: ghcr.io/electron/build:${{ inputs.build-image-sha }}
options: --user root
@@ -43,7 +47,12 @@ jobs:
target-platform: macos
publish-x64-darwin:
uses: ./.github/workflows/pipeline-segment-electron-build.yml
uses: ./.github/workflows/pipeline-segment-electron-publish.yml
permissions:
artifact-metadata: write
attestations: write
contents: read
id-token: write
needs: checkout-macos
with:
environment: production-release
@@ -58,7 +67,12 @@ jobs:
secrets: inherit
publish-x64-mas:
uses: ./.github/workflows/pipeline-segment-electron-build.yml
uses: ./.github/workflows/pipeline-segment-electron-publish.yml
permissions:
artifact-metadata: write
attestations: write
contents: read
id-token: write
needs: checkout-macos
with:
environment: production-release
@@ -73,7 +87,12 @@ jobs:
secrets: inherit
publish-arm64-darwin:
uses: ./.github/workflows/pipeline-segment-electron-build.yml
uses: ./.github/workflows/pipeline-segment-electron-publish.yml
permissions:
artifact-metadata: write
attestations: write
contents: read
id-token: write
needs: checkout-macos
with:
environment: production-release
@@ -88,7 +107,12 @@ jobs:
secrets: inherit
publish-arm64-mas:
uses: ./.github/workflows/pipeline-segment-electron-build.yml
uses: ./.github/workflows/pipeline-segment-electron-publish.yml
permissions:
artifact-metadata: write
attestations: write
contents: read
id-token: write
needs: checkout-macos
with:
environment: production-release


@@ -1,16 +1,20 @@
name: Check for Non-Maintainer Dependency Change
name: Check for Disallowed Non-Maintainer Change
on:
pull_request_target:
paths:
- 'yarn.lock'
- 'spec/yarn.lock'
- '.github/workflows/**'
- '.github/actions/**'
- '.yarn/**'
- '.yarnrc.yml'
permissions: {}
jobs:
check-for-non-maintainer-dependency-change:
name: Check for non-maintainer dependency change
name: Check for disallowed non-maintainer change
if: ${{ !contains(fromJSON('["MEMBER", "OWNER"]'), github.event.pull_request.author_association) && github.event.pull_request.user.type != 'Bot' && !github.event.pull_request.draft }}
permissions:
contents: read
@@ -24,7 +28,7 @@ jobs:
PR_URL: ${{ github.event.pull_request.html_url }}
run: |
set -eo pipefail
REVIEW_COUNT=$(gh pr view $PR_URL --json reviews | jq '[ .reviews[] | select(.author.login == "github-actions") | select(.body | startswith("<!-- no-dependency-change -->")) ] | length')
REVIEW_COUNT=$(gh pr view $PR_URL --json reviews | jq '[ .reviews[] | select(.author.login == "github-actions") | select(.body | startswith("<!-- disallowed-non-maintainer-change -->")) ] | length')
if [[ $REVIEW_COUNT -eq 0 ]]; then
echo "SHOULD_REVIEW=1" >> "$GITHUB_OUTPUT"
fi
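The `gh`/`jq` pipeline above counts prior bot reviews carrying the HTML-comment marker. The counting idea can be sketched without `gh` or `jq`, on hypothetical review bodies:

```shell
# Count lines beginning with the marker; stands in for the jq filter over reviews.
reviews='<!-- disallowed-non-maintainer-change -->
LGTM
<!-- disallowed-non-maintainer-change -->'
REVIEW_COUNT=$(printf '%s\n' "$reviews" | grep -c '^<!-- disallowed-non-maintainer-change -->' || true)
echo "$REVIEW_COUNT"   # -> 2, so SHOULD_REVIEW is not set
```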
@@ -34,4 +38,4 @@ jobs:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
PR_URL: ${{ github.event.pull_request.html_url }}
run: |
printf "<!-- no-dependency-change -->\n\nHello @${{ github.event.pull_request.user.login }}! It looks like this pull request touches one of our dependency files, and per [our contribution policy](https://github.com/electron/electron/blob/main/CONTRIBUTING.md#dependencies-upgrades-policy) we do not accept these types of changes in PRs." | gh pr review $PR_URL -r --body-file=-
printf "<!-- disallowed-non-maintainer-change -->\n\nHello @${{ github.event.pull_request.user.login }}! It looks like this pull request touches one of our dependency or CI files, and per [our contribution policy](https://github.com/electron/electron/blob/main/CONTRIBUTING.md#dependencies-upgrades-policy) we do not accept these types of changes in PRs." | gh pr review $PR_URL -r --body-file=-


@@ -55,6 +55,8 @@ on:
type: boolean
default: false
permissions: {}
concurrency:
group: electron-build-and-test-and-nan-${{ inputs.target-platform }}-${{ inputs.target-arch }}-${{ github.ref_protected == true && github.run_id || github.ref }}
cancel-in-progress: ${{ github.ref_protected != true }}
@@ -62,6 +64,8 @@ concurrency:
jobs:
build:
uses: ./.github/workflows/pipeline-segment-electron-build.yml
permissions:
contents: read
with:
build-runs-on: ${{ inputs.build-runs-on }}
build-container: ${{ inputs.build-container }}
@@ -74,6 +78,10 @@ jobs:
secrets: inherit
test:
uses: ./.github/workflows/pipeline-segment-electron-test.yml
permissions:
contents: read
issues: read
pull-requests: read
needs: build
with:
target-arch: ${{ inputs.target-arch }}
@@ -83,6 +91,8 @@ jobs:
secrets: inherit
nn-test:
uses: ./.github/workflows/pipeline-segment-node-nan-test.yml
permissions:
contents: read
needs: build
with:
target-arch: ${{ inputs.target-arch }}


@@ -64,14 +64,13 @@ concurrency:
group: electron-build-and-test-${{ inputs.target-platform }}-${{ inputs.target-arch }}-${{ github.ref_protected == true && github.run_id || github.ref }}
cancel-in-progress: ${{ github.ref_protected != true }}
permissions:
contents: read
issues: read
pull-requests: read
permissions: {}
jobs:
build:
uses: ./.github/workflows/pipeline-segment-electron-build.yml
permissions:
contents: read
with:
build-runs-on: ${{ inputs.build-runs-on }}
build-container: ${{ inputs.build-container }}
@@ -86,6 +85,10 @@ jobs:
secrets: inherit
test:
uses: ./.github/workflows/pipeline-segment-electron-test.yml
permissions:
contents: read
issues: read
pull-requests: read
needs: build
with:
target-arch: ${{ inputs.target-arch }}


@@ -8,6 +8,8 @@ on:
description: 'Container to run the docs-only ts compile in'
type: string
permissions: {}
concurrency:
group: electron-docs-only-${{ github.ref }}
cancel-in-progress: true
@@ -19,6 +21,8 @@ jobs:
docs-only:
name: Docs Only Compile
runs-on: electron-arc-centralus-linux-amd64-4core
permissions:
contents: read
timeout-minutes: 20
container: ${{ fromJSON(inputs.container) }}
steps:
@@ -50,12 +54,12 @@ jobs:
shell: bash
run: |
cd src/electron
node script/yarn create-typescript-definitions
node script/yarn tsc -p tsconfig.default_app.json --noEmit
node script/yarn.js create-typescript-definitions
node script/yarn.js tsc -p tsconfig.default_app.json --noEmit
for f in build/webpack/*.js
do
out="${f:29}"
if [ "$out" != "base.js" ]; then
node script/yarn webpack --config $f --output-filename=$out --output-path=./.tmp --env mode=development
node script/yarn.js webpack --config $f --output-filename=$out --output-path=./.tmp --env mode=development
fi
done
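The `${f:29}` in the loop above is bash substring expansion: it drops the 29-character `build/webpack/webpack.config.` prefix, leaving just the suffix used as the output filename. A standalone check:

```shell
# "build/webpack/" (14 chars) + "webpack.config." (15 chars) = offset 29.
f="build/webpack/webpack.config.base.js"
out="${f:29}"
echo "$out"   # -> base.js, which the loop then skips
```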


@@ -8,6 +8,8 @@ on:
description: 'Container to run lint in'
type: string
permissions: {}
concurrency:
group: electron-lint-${{ github.ref_protected == true && github.run_id || github.ref }}
cancel-in-progress: ${{ github.ref_protected != true }}
@@ -19,6 +21,8 @@ jobs:
lint:
name: Lint
runs-on: electron-arc-centralus-linux-amd64-4core
permissions:
contents: read
timeout-minutes: 20
container: ${{ fromJSON(inputs.container) }}
steps:
@@ -74,11 +78,15 @@ jobs:
# but then we would lint its contents (at least gn format), and it doesn't pass it.
cd src/electron
node script/yarn install --frozen-lockfile
node script/yarn lint
node script/yarn.js install --immutable
node script/yarn.js lint
- name: Run Script Typechecker
shell: bash
run: |
cd src/electron
node script/yarn tsc -p tsconfig.script.json
node script/yarn.js tsc -p tsconfig.script.json
- name: Check GHA Workflows
shell: bash
run: |
cd src/electron
node script/copy-pipeline-segment-publish.js --check


@@ -59,6 +59,8 @@ on:
type: boolean
default: false
permissions: {}
concurrency:
group: electron-build-${{ inputs.target-platform }}-${{ inputs.target-arch }}-${{ inputs.target-variant }}-${{ inputs.is-asan }}-${{ github.ref_protected == true && github.run_id || github.ref }}
cancel-in-progress: ${{ github.ref_protected != true }}
@@ -81,6 +83,8 @@ jobs:
run:
shell: bash
runs-on: ${{ inputs.build-runs-on }}
permissions:
contents: read
container: ${{ fromJSON(inputs.build-container) }}
environment: ${{ inputs.environment }}
env:

View File

@@ -26,6 +26,8 @@ on:
type: string
default: testing
permissions: {}
concurrency:
group: electron-gn-check-${{ inputs.target-platform }}-${{ github.ref }}
cancel-in-progress: true
@@ -41,6 +43,8 @@ jobs:
run:
shell: bash
runs-on: ${{ inputs.check-runs-on }}
permissions:
contents: read
container: ${{ fromJSON(inputs.check-container) }}
steps:
- name: Checkout Electron


@@ -0,0 +1,237 @@
# AUTOGENERATED FILE - DO NOT EDIT MANUALLY
# ONLY EDIT .github/workflows/pipeline-segment-electron-build.yml
name: Pipeline Segment - Electron Build
on:
workflow_call:
inputs:
environment:
description: using the production or testing environment
required: false
type: string
target-platform:
type: string
description: Platform to run on, can be macos, win or linux
required: true
target-arch:
type: string
description: Arch to build for, can be x64, arm64, ia32 or arm
required: true
target-variant:
type: string
description: Variant to build for, no effect on non-macOS target platforms. Can
be darwin, mas or all.
default: all
build-runs-on:
type: string
description: What host to run the build
required: true
build-container:
type: string
description: JSON container information for aks runs-on
required: false
default: '{"image":null}'
is-release:
description: Whether this build job is a release job
required: true
type: boolean
default: false
gn-build-type:
description: The gn build type - testing or release
required: true
type: string
default: testing
generate-symbols:
description: Whether or not to generate symbols
required: true
type: boolean
default: false
upload-to-storage:
description: Whether or not to upload build artifacts to external storage
required: true
type: string
default: "0"
is-asan:
description: Building the Address Sanitizer (ASan) Linux build
required: false
type: boolean
default: false
enable-ssh:
description: Enable SSH debugging
required: false
type: boolean
default: false
permissions: {}
concurrency:
group: electron-build-${{ inputs.target-platform }}-${{ inputs.target-arch
}}-${{ inputs.target-variant }}-${{ inputs.is-asan }}-${{
github.ref_protected == true && github.run_id || github.ref }}
cancel-in-progress: ${{ github.ref_protected != true }}
env:
CHROMIUM_GIT_COOKIE: ${{ secrets.CHROMIUM_GIT_COOKIE }}
CHROMIUM_GIT_COOKIE_WINDOWS_STRING: ${{ secrets.CHROMIUM_GIT_COOKIE_WINDOWS_STRING }}
DD_API_KEY: ${{ secrets.DD_API_KEY }}
ELECTRON_ARTIFACTS_BLOB_STORAGE: ${{ secrets.ELECTRON_ARTIFACTS_BLOB_STORAGE }}
ELECTRON_RBE_JWT: ${{ secrets.ELECTRON_RBE_JWT }}
SUDOWOODO_EXCHANGE_URL: ${{ secrets.SUDOWOODO_EXCHANGE_URL }}
SUDOWOODO_EXCHANGE_TOKEN: ${{ secrets.SUDOWOODO_EXCHANGE_TOKEN }}
GCLIENT_EXTRA_ARGS: ${{ inputs.target-platform == 'macos' &&
'--custom-var=checkout_mac=True --custom-var=host_os=mac' ||
inputs.target-platform == 'win' && '--custom-var=checkout_win=True' ||
'--custom-var=checkout_arm=True --custom-var=checkout_arm64=True' }}
ELECTRON_OUT_DIR: Default
ACTIONS_STEP_DEBUG: ${{ secrets.ACTIONS_STEP_DEBUG }}
jobs:
build:
defaults:
run:
shell: bash
runs-on: ${{ inputs.build-runs-on }}
permissions:
artifact-metadata: write
attestations: write
contents: read
id-token: write
container: ${{ fromJSON(inputs.build-container) }}
environment: ${{ inputs.environment }}
env:
TARGET_ARCH: ${{ inputs.target-arch }}
TARGET_PLATFORM: ${{ inputs.target-platform }}
steps:
- name: Create src dir
run: |
mkdir src
- name: Checkout Electron
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
path: src/electron
fetch-depth: 0
ref: ${{ github.event.pull_request.head.sha }}
- name: Setup SSH Debugging
if: ${{ inputs.target-platform == 'macos' && (inputs.enable-ssh ||
env.ACTIONS_STEP_DEBUG == 'true') }}
uses: ./src/electron/.github/actions/ssh-debug
with:
tunnel: "true"
env:
CLOUDFLARE_TUNNEL_CERT: ${{ secrets.CLOUDFLARE_TUNNEL_CERT }}
CLOUDFLARE_TUNNEL_HOSTNAME: ${{ vars.CLOUDFLARE_TUNNEL_HOSTNAME }}
CLOUDFLARE_USER_CA_CERT: ${{ secrets.CLOUDFLARE_USER_CA_CERT }}
AUTHORIZED_USERS: ${{ secrets.SSH_DEBUG_AUTHORIZED_USERS }}
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
- name: Free up space (macOS)
if: ${{ inputs.target-platform == 'macos' }}
uses: ./src/electron/.github/actions/free-space-macos
- name: Check disk space after freeing up space
if: ${{ inputs.target-platform == 'macos' }}
run: df -h
- name: Setup Node.js/npm
if: ${{ inputs.target-platform == 'macos' }}
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020
with:
node-version: 20.19.x
cache: yarn
cache-dependency-path: src/electron/yarn.lock
- name: Install Dependencies
uses: ./src/electron/.github/actions/install-dependencies
- name: Install AZCopy
if: ${{ inputs.target-platform == 'macos' }}
run: brew install azcopy
- name: Set GN_EXTRA_ARGS for Linux
if: ${{ inputs.target-platform == 'linux' }}
run: >
if [ "${{ inputs.target-arch }}" = "arm" ]; then
if [ "${{ inputs.is-release }}" = true ]; then
GN_EXTRA_ARGS='target_cpu="arm" build_tflite_with_xnnpack=false symbol_level=1'
else
GN_EXTRA_ARGS='target_cpu="arm" build_tflite_with_xnnpack=false'
fi
elif [ "${{ inputs.target-arch }}" = "arm64" ]; then
GN_EXTRA_ARGS='target_cpu="arm64" fatal_linker_warnings=false enable_linux_installer=false'
elif [ "${{ inputs.is-asan }}" = true ]; then
GN_EXTRA_ARGS='is_asan=true'
fi
echo "GN_EXTRA_ARGS=$GN_EXTRA_ARGS" >> $GITHUB_ENV
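
The arch/flag branching in the step above can be restated as a pure function, which makes the mapping easier to eyeball. This is an illustrative sketch only (the function name is hypothetical and the real step writes the result to `$GITHUB_ENV`; the workflow's string inputs are restated here as booleans):

```js
// Illustrative restatement of the GN_EXTRA_ARGS branching above.
// Not part of the Electron build scripts.
function gnExtraArgs ({ targetArch, isRelease, isAsan }) {
  if (targetArch === 'arm') {
    // Release arm builds additionally reduce symbol level.
    return isRelease
      ? 'target_cpu="arm" build_tflite_with_xnnpack=false symbol_level=1'
      : 'target_cpu="arm" build_tflite_with_xnnpack=false'
  }
  if (targetArch === 'arm64') {
    return 'target_cpu="arm64" fatal_linker_warnings=false enable_linux_installer=false'
  }
  // ASAN only applies to the remaining (x64) case, matching the elif order.
  if (isAsan) return 'is_asan=true'
  return ''
}
```

Note that, as in the shell `elif` chain, `is-asan` is only consulted when the arch is neither `arm` nor `arm64`.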
- name: Set Chromium Git Cookie
uses: ./src/electron/.github/actions/set-chromium-cookie
- name: Install Build Tools
uses: ./src/electron/.github/actions/install-build-tools
- name: Generate DEPS Hash
run: |
node src/electron/script/generate-deps-hash.js
DEPSHASH=v1-src-cache-$(cat src/electron/.depshash)
echo "DEPSHASH=$DEPSHASH" >> $GITHUB_ENV
echo "CACHE_PATH=$DEPSHASH.tar" >> $GITHUB_ENV
- name: Restore src cache via AZCopy
if: ${{ inputs.target-platform != 'linux' }}
uses: ./src/electron/.github/actions/restore-cache-azcopy
with:
target-platform: ${{ inputs.target-platform }}
- name: Restore src cache via AKS
if: ${{ inputs.target-platform == 'linux' }}
uses: ./src/electron/.github/actions/restore-cache-aks
- name: Checkout Electron
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
path: src/electron
fetch-depth: 0
ref: ${{ github.event.pull_request.head.sha }}
- name: Fix Sync
if: ${{ inputs.target-platform != 'linux' }}
uses: ./src/electron/.github/actions/fix-sync
with:
target-platform: ${{ inputs.target-platform }}
env:
ELECTRON_DEPOT_TOOLS_DISABLE_LOG: true
- name: Init Build Tools
run: >
e init -f --root=$(pwd) --out=Default ${{ inputs.gn-build-type }}
--import ${{ inputs.gn-build-type }} --target-cpu ${{
inputs.target-arch }} --remote-build siso
- name: Run Electron Only Hooks
run: |
e d gclient runhooks --spec="solutions=[{'name':'src/electron','url':None,'deps_file':'DEPS','custom_vars':{'process_deps':False},'managed':False}]"
- name: Regenerate DEPS Hash
run: |
(cd src/electron && git checkout .) && node src/electron/script/generate-deps-hash.js
echo "DEPSHASH=$(cat src/electron/.depshash)" >> $GITHUB_ENV
- name: Add CHROMIUM_BUILDTOOLS_PATH to env
run: echo "CHROMIUM_BUILDTOOLS_PATH=$(pwd)/src/buildtools" >> $GITHUB_ENV
- name: Free up space (macOS)
if: ${{ inputs.target-platform == 'macos' }}
uses: ./src/electron/.github/actions/free-space-macos
- name: Build Electron
if: ${{ inputs.target-platform != 'macos' || (inputs.target-variant == 'all' ||
inputs.target-variant == 'darwin') }}
uses: ./src/electron/.github/actions/build-electron
with:
target-arch: ${{ inputs.target-arch }}
target-platform: ${{ inputs.target-platform }}
artifact-platform: ${{ inputs.target-platform == 'macos' && 'darwin' ||
inputs.target-platform }}
is-release: ${{ inputs.is-release }}
generate-symbols: ${{ inputs.generate-symbols }}
upload-to-storage: ${{ inputs.upload-to-storage }}
is-asan: ${{ inputs.is-asan }}
- name: Set GN_EXTRA_ARGS for MAS Build
if: ${{ inputs.target-platform == 'macos' && (inputs.target-variant == 'all' ||
inputs.target-variant == 'mas') }}
run: |
echo "MAS_BUILD=true" >> $GITHUB_ENV
GN_EXTRA_ARGS='is_mas_build=true'
echo "GN_EXTRA_ARGS=$GN_EXTRA_ARGS" >> $GITHUB_ENV
- name: Build Electron (MAS)
if: ${{ inputs.target-platform == 'macos' && (inputs.target-variant == 'all' ||
inputs.target-variant == 'mas') }}
uses: ./src/electron/.github/actions/build-electron
with:
target-arch: ${{ inputs.target-arch }}
target-platform: ${{ inputs.target-platform }}
artifact-platform: mas
is-release: ${{ inputs.is-release }}
generate-symbols: ${{ inputs.generate-symbols }}
upload-to-storage: ${{ inputs.upload-to-storage }}
step-suffix: (mas)


@@ -35,10 +35,7 @@ concurrency:
group: electron-test-${{ inputs.target-platform }}-${{ inputs.target-arch }}-${{ inputs.is-asan }}-${{ github.ref_protected == true && github.run_id || github.ref }}
cancel-in-progress: ${{ github.ref_protected != true }}
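
GitHub Actions expressions have no ternary operator, so the concurrency group above uses the `cond && a || b` idiom, which falls back to `b` whenever `a` is falsy — safe here because a run id is always non-empty. A simplified sketch of the semantics in plain Node (illustrative only; the real key also includes platform, arch, and ASAN inputs):

```js
// `cond && a || b` as used in the concurrency group above:
// protected refs key on the unique run id (runs never cancel each other),
// unprotected refs key on the ref (new pushes cancel in-flight runs).
function concurrencyGroup (refProtected, runId, ref) {
  return `electron-test-${(refProtected === true && runId) || ref}`
}
```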
permissions:
contents: read
issues: read
pull-requests: read
permissions: {}
env:
CHROMIUM_GIT_COOKIE: ${{ secrets.CHROMIUM_GIT_COOKIE }}
@@ -53,6 +50,10 @@ jobs:
run:
shell: bash
runs-on: ${{ inputs.test-runs-on }}
permissions:
contents: read
issues: read
pull-requests: read
container: ${{ fromJSON(inputs.test-container) }}
strategy:
fail-fast: false
@@ -182,10 +183,6 @@ jobs:
sudo security authorizationdb write com.apple.trust-settings.admin allow
cd src/electron
./script/codesign/generate-identity.sh
- name: Install Datadog CLI
run: |
cd src/electron
node script/yarn global add @datadog/datadog-ci
- name: Run Electron Tests
shell: bash
env:
@@ -211,7 +208,7 @@ jobs:
export ELECTRON_FORCE_TEST_SUITE_EXIT="true"
fi
fi
node script/yarn test --runners=main --enableRerun=3 --trace-uncaught --enable-logging --files $tests_files
node script/yarn.js test --runners=main --enableRerun=3 --trace-uncaught --enable-logging --files $tests_files
else
chown :builduser .. && chmod g+w ..
chown -R :builduser . && chmod -R g+w .
@@ -228,9 +225,14 @@ jobs:
export MOCHA_TIMEOUT=180000
echo "Piping output to ASAN_SYMBOLIZE ($ASAN_SYMBOLIZE)"
cd electron
runuser -u builduser -- xvfb-run script/actions/run-tests.sh script/yarn test --runners=main --trace-uncaught --enable-logging --files $tests_files | $ASAN_SYMBOLIZE
runuser -u builduser -- xvfb-run script/actions/run-tests.sh script/yarn.js test --runners=main --trace-uncaught --enable-logging --files $tests_files | $ASAN_SYMBOLIZE
else
runuser -u builduser -- xvfb-run script/actions/run-tests.sh script/yarn test --runners=main --trace-uncaught --enable-logging --files $tests_files
if [ "${{ inputs.target-arch }}" = "arm" ]; then
runuser -u builduser -- xvfb-run script/actions/run-tests.sh script/yarn.js test --skipYarnInstall --runners=main --enableRerun=3 --trace-uncaught --enable-logging --files $tests_files
else
runuser -u builduser -- xvfb-run script/actions/run-tests.sh script/yarn.js test --runners=main --enableRerun=3 --trace-uncaught --enable-logging --files $tests_files
fi
fi
fi
- name: Upload Test results to Datadog
@@ -242,9 +244,10 @@ jobs:
DD_TAGS: "os.architecture:${{ inputs.target-arch }},os.family:${{ inputs.target-platform }},os.platform:${{ inputs.target-platform }},asan:${{ inputs.is-asan }}"
run: |
if ! [ -z $DD_API_KEY ] && [ -f src/electron/junit/test-results-main.xml ]; then
export DATADOG_PATH=`node src/electron/script/yarn global bin`
$DATADOG_PATH/datadog-ci junit upload src/electron/junit/test-results-main.xml
fi
cd src/electron
export DATADOG_PATH=`node script/yarn.js bin datadog-ci`
$DATADOG_PATH junit upload junit/test-results-main.xml
fi
if: always() && !cancelled()
- name: Upload Test Artifacts
if: always() && !cancelled()


@@ -26,6 +26,8 @@ on:
type: string
default: testing
permissions: {}
concurrency:
group: electron-node-nan-test-${{ inputs.target-platform }}-${{ inputs.target-arch }}-${{ github.ref_protected == true && github.run_id || github.ref }}
cancel-in-progress: ${{ github.ref_protected != true }}
@@ -39,6 +41,8 @@ jobs:
node-tests:
name: Run Node.js Tests
runs-on: electron-arc-centralus-linux-amd64-8core
permissions:
contents: read
timeout-minutes: 30
env:
TARGET_ARCH: ${{ inputs.target-arch }}
@@ -93,6 +97,8 @@ jobs:
nan-tests:
name: Run Nan Tests
runs-on: electron-arc-centralus-linux-amd64-4core
permissions:
contents: read
timeout-minutes: 30
env:
TARGET_ARCH: ${{ inputs.target-arch }}
@@ -132,10 +138,16 @@ jobs:
unzip -:o dist.zip
- name: Setup Linux for Headless Testing
run: sh -e /etc/init.d/xvfb start
- name: Add Clang problem matcher
shell: bash
run: echo "::add-matcher::src/electron/.github/problem-matchers/clang.json"
- name: Run Nan Tests
run: |
cd src
node electron/script/nan-spec-runner.js
- name: Remove Clang problem matcher
shell: bash
run: echo "::remove-matcher owner=clang::"
- name: Wait for active SSH sessions
shell: bash
if: always() && !cancelled()


@@ -11,6 +11,7 @@ jobs:
name: backport/requested label added
if: github.event.label.name == 'backport/requested 🗳'
runs-on: ubuntu-latest
permissions: {}
steps:
- name: Trigger Slack workflow
uses: slackapi/slack-github-action@b0fa283ad8fea605de13dc3f449259339835fc52 # v2.1.0
@@ -19,12 +20,16 @@ jobs:
webhook-type: webhook-trigger
payload: |
{
"url": "${{ github.event.pull_request.html_url }}"
"base_ref": ${{ toJSON(github.event.pull_request.base.ref) }},
"title": ${{ toJSON(github.event.pull_request.title) }},
"url": ${{ toJSON(github.event.pull_request.html_url) }},
"user": ${{ toJSON(github.event.pull_request.user.login) }}
}
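
The move from a bare `"${{ ... }}"` interpolation to `toJSON(...)` in the payload above matters because a PR title containing quotes or backslashes would otherwise produce malformed JSON (or inject into the payload); `toJSON` emits a fully escaped JSON string literal. The effect in plain Node terms (illustrative only — `JSON.stringify` plays the role of the expression's `toJSON`):

```js
// A title containing quotes breaks naive interpolation but survives escaping.
const title = 'fix: handle "quoted" paths'

const naive = `{ "title": "${title}" }` // malformed JSON: unescaped quotes
const escaped = `{ "title": ${JSON.stringify(title)} }` // valid JSON

let naiveOk = true
try { JSON.parse(naive) } catch { naiveOk = false }
console.log(naiveOk) // the naive payload fails to parse
console.log(JSON.parse(escaped).title === title) // the escaped payload round-trips
```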
pull-request-labeled-deprecation-review-complete:
name: deprecation-review/complete label added
if: github.event.label.name == 'deprecation-review/complete ✅'
runs-on: ubuntu-latest
permissions: {}
steps:
- name: Generate GitHub App token
uses: electron/github-app-auth-action@384fd19694fe7b6dcc9a684746c6976ad78228ae # v1.1.1


@@ -7,8 +7,7 @@ on:
- edited
- synchronize
permissions:
contents: read
permissions: {}
jobs:
main:


@@ -11,6 +11,7 @@ jobs:
check-stable-prep-items:
name: Check Stable Prep Items
runs-on: ubuntu-latest
permissions: {}
steps:
- name: Generate GitHub App token
uses: electron/github-app-auth-action@384fd19694fe7b6dcc9a684746c6976ad78228ae # v1.1.1


@@ -10,6 +10,7 @@ permissions: {}
jobs:
stale:
runs-on: ubuntu-latest
permissions: {}
steps:
- name: Generate GitHub App token
uses: electron/github-app-auth-action@384fd19694fe7b6dcc9a684746c6976ad78228ae # v1.1.1
@@ -31,6 +32,7 @@ jobs:
only-pr-labels: not-a-real-label
pending-repro:
runs-on: ubuntu-latest
permissions: {}
if: ${{ always() }}
needs: stale
steps:


@@ -6,7 +6,7 @@ on:
build-image-sha:
type: string
description: 'SHA for electron/build image'
default: '933c7d6ff6802706875270bec2e3c891cf8add3f'
default: 'a82b87d7a4f5ff0cab61405f8151ac4cf4942aeb'
required: true
upload-to-storage:
description: 'Uploads to Azure storage'
@@ -18,9 +18,13 @@ on:
type: boolean
default: false
permissions: {}
jobs:
checkout-windows:
runs-on: electron-arc-centralus-linux-amd64-32core
permissions:
contents: read
container:
image: ghcr.io/electron/build:${{ inputs.build-image-sha }}
options: --user root --device /dev/fuse --cap-add SYS_ADMIN
@@ -47,7 +51,12 @@ jobs:
target-platform: win
publish-x64-win:
uses: ./.github/workflows/pipeline-segment-electron-build.yml
uses: ./.github/workflows/pipeline-segment-electron-publish.yml
permissions:
artifact-metadata: write
attestations: write
contents: read
id-token: write
needs: checkout-windows
with:
environment: production-release
@@ -61,7 +70,12 @@ jobs:
secrets: inherit
publish-arm64-win:
uses: ./.github/workflows/pipeline-segment-electron-build.yml
uses: ./.github/workflows/pipeline-segment-electron-publish.yml
permissions:
artifact-metadata: write
attestations: write
contents: read
id-token: write
needs: checkout-windows
with:
environment: production-release
@@ -75,7 +89,12 @@ jobs:
secrets: inherit
publish-x86-win:
uses: ./.github/workflows/pipeline-segment-electron-build.yml
uses: ./.github/workflows/pipeline-segment-electron-publish.yml
permissions:
artifact-metadata: write
attestations: write
contents: read
id-token: write
needs: checkout-windows
with:
environment: production-release

.gitignore vendored

@@ -53,3 +53,5 @@ ts-gen
patches/mtime-cache.json
spec/fixtures/logo.png
.yarn/install-state.gz

.yarn/releases/yarn-4.12.0.cjs vendored Executable file

File diff suppressed because one or more lines are too long

.yarnrc.yml Normal file

@@ -0,0 +1,12 @@
enableScripts: false
nmHoistingLimits: workspaces
nodeLinker: node-modules
npmMinimalAgeGate: 10080
npmPreapprovedPackages:
- "@electron/*"
yarnPath: .yarn/releases/yarn-4.12.0.cjs


@@ -1616,6 +1616,29 @@ group("node_headers") {
public_deps = [ ":tar_node_headers" ]
}
group("testing_build") {
public_deps = [
":electron_dist_zip",
":electron_mksnapshot_zip",
":node_headers",
]
}
group("release_build") {
public_deps = [ ":testing_build" ]
if (is_official_build) {
public_deps += [ ":electron_symbols" ]
}
if (is_linux) {
public_deps += [
":hunspell_dictionaries_zip",
":libcxx_headers_zip",
":libcxx_objects_zip",
":libcxxabi_headers_zip",
]
}
}
if (is_linux && is_official_build) {
strip_binary("strip_electron_binary") {
binary_input = "$root_out_dir/$electron_project_name"

DEPS

@@ -2,9 +2,9 @@ gclient_gn_args_from = 'src'
vars = {
'chromium_version':
'140.0.7339.240',
'140.0.7339.249',
'node_version':
'v22.20.0',
'v22.22.0',
'nan_version':
'e14bdcd1f72d62bca1d541b66da43130384ec213',
'squirrel.mac_version':
@@ -30,9 +30,6 @@ vars = {
# The path of the sysroots.json file.
'sysroots_json_path': 'electron/script/sysroots.json',
# KEEP IN SYNC WITH utils.js FILE
'yarn_version': '1.22.22',
# To be able to build clean Chromium from sources.
'apply_patches': True,
@@ -155,7 +152,7 @@ hooks = [
'action': [
'python3',
'-c',
'import os, subprocess; os.chdir(os.path.join("src", "electron")); subprocess.check_call(["python3", "script/lib/npx.py", "yarn@' + (Var("yarn_version")) + '", "install", "--frozen-lockfile"]);',
'import os, subprocess; os.chdir(os.path.join("src", "electron")); subprocess.check_call(["node", ".yarn/releases/yarn-4.12.0.cjs", "install", "--immutable"]);',
],
},
{


@@ -39,7 +39,7 @@ Each Electron release provides binaries for macOS, Windows, and Linux.
* macOS (Big Sur and up): Electron provides 64-bit Intel and Apple Silicon / ARM binaries for macOS.
* Windows (Windows 10 and up): Electron provides `ia32` (`x86`), `x64` (`amd64`), and `arm64` binaries for Windows. Windows on ARM support was added in Electron 5.0.8. Support for Windows 7, 8 and 8.1 was [removed in Electron 23, in line with Chromium's Windows deprecation policy](https://www.electronjs.org/blog/windows-7-to-8-1-deprecation-notice).
* Linux: The prebuilt binaries of Electron are built on Ubuntu 20.04. They have also been verified to work on:
* Linux: The prebuilt binaries of Electron are built on Ubuntu 22.04. They have also been verified to work on:
* Ubuntu 18.04 and newer
* Fedora 32 and newer
* Debian 10 and newer


@@ -24,10 +24,6 @@ enable_printing = true
angle_enable_vulkan_validation_layers = false
dawn_enable_vulkan_validation_layers = false
# Removes dxc dll's that are only used experimentally.
# See https://bugs.chromium.org/p/chromium/issues/detail?id=1474897
dawn_use_built_dxc = false
# These are disabled because they cause the zip manifest to differ between
# testing and release builds.
# See https://chromium-review.googlesource.com/c/chromium/src/+/2774898.


@@ -1214,6 +1214,13 @@ Disables hardware acceleration for current app.
This method can only be called before app is ready.
### `app.isHardwareAccelerationEnabled()`
Returns `boolean` - whether hardware acceleration is currently enabled.
> [!NOTE]
> This information is only usable after the `gpu-info-update` event is emitted.
### `app.disableDomainBlockingFor3DAPIs()`
By default, Chromium disables 3D APIs (e.g. WebGL) until restart on a per
@@ -1397,7 +1404,75 @@ details. Disabled by default.
This API must be called after the `ready` event is emitted.
> [!NOTE]
> Rendering accessibility tree can significantly affect the performance of your app. It should not be enabled by default.
> Rendering accessibility tree can significantly affect the performance of your app. It should not be enabled by default. Calling this method will enable the following accessibility support features: `nativeAPIs`, `webContents`, `inlineTextBoxes`, and `extendedProperties`.
### `app.getAccessibilitySupportFeatures()` _macOS_ _Windows_
Returns `string[]` - Array of strings naming currently enabled accessibility support components. Possible values:
* `nativeAPIs` - Native OS accessibility APIs integration enabled.
* `webContents` - Web contents accessibility tree exposure enabled.
* `inlineTextBoxes` - Inline text boxes (character bounding boxes) enabled.
* `extendedProperties` - Extended accessibility properties enabled.
* `screenReader` - Screen reader specific mode enabled.
* `html` - HTML accessibility tree construction enabled.
* `labelImages` - Accessibility support for automatic image annotations.
* `pdfPrinting` - Accessibility support for PDF printing enabled.
Notes:
* The array may be empty if no accessibility modes are active.
* Use `app.isAccessibilitySupportEnabled()` for the legacy boolean check;
prefer this method for granular diagnostics or telemetry.
Example:
```js
const { app } = require('electron')
app.whenReady().then(() => {
if (app.getAccessibilitySupportFeatures().includes('screenReader')) {
// Change some app UI to better work with Screen Readers.
}
})
```
### `app.setAccessibilitySupportFeatures(features)` _macOS_ _Windows_
* `features` string[] - An array of the accessibility features to enable.
Possible values are:
* `nativeAPIs` - Native OS accessibility APIs integration enabled.
* `webContents` - Web contents accessibility tree exposure enabled.
* `inlineTextBoxes` - Inline text boxes (character bounding boxes) enabled.
* `extendedProperties` - Extended accessibility properties enabled.
* `screenReader` - Screen reader specific mode enabled.
* `html` - HTML accessibility tree construction enabled.
* `labelImages` - Accessibility support for automatic image annotations.
* `pdfPrinting` - Accessibility support for PDF printing enabled.
To disable all supported features, pass an empty array `[]`.
Example:
```js
const { app } = require('electron')
app.whenReady().then(() => {
// Enable a subset of features:
app.setAccessibilitySupportFeatures([
'screenReader',
'pdfPrinting',
'webContents'
])
// Other logic
// Some time later, disable all features:
app.setAccessibilitySupportFeatures([])
})
```
### `app.showAboutPanel()`


@@ -1262,15 +1262,16 @@ Sets the properties for the window's taskbar button.
#### `win.setAccentColor(accentColor)` _Windows_
* `accentColor` boolean | string - The accent color for the window. By default, follows user preference in System Settings.
* `accentColor` boolean | string | null - The accent color for the window. By default, follows user preference in System Settings. To reset to system default, pass `null`.
Sets the system accent color and highlighting of active window border.
The `accentColor` parameter accepts the following values:
* **Color string** - Sets a custom accent color using standard CSS color formats (Hex, RGB, RGBA, HSL, HSLA, or named colors). Alpha values in RGBA/HSLA formats are ignored and the color is treated as fully opaque.
* **`true`** - Uses the system's default accent color from user preferences in System Settings.
* **`false`** - Explicitly disables accent color highlighting for the window.
* **Color string** - Like `true`, but sets a custom accent color using standard CSS color formats (Hex, RGB, RGBA, HSL, HSLA, or named colors). Alpha values in RGBA/HSLA formats are ignored and the color is treated as fully opaque.
* **`true`** - Enable accent color highlighting for the window with the system accent color regardless of whether accent colors are enabled for windows in System Settings.
* **`false`** - Disable accent color highlighting for the window regardless of whether accent colors are currently enabled for windows in System Settings.
* **`null`** - Reset window accent color behavior to follow behavior set in System Settings.
Examples:
@@ -1283,11 +1284,14 @@ win.setAccentColor('#ff0000')
// RGB format (alpha ignored if present).
win.setAccentColor('rgba(255,0,0,0.5)')
// Use system accent color.
// Enable accent color, using the color specified in System Settings.
win.setAccentColor(true)
// Disable accent color.
win.setAccentColor(false)
// Reset window accent color behavior to follow behavior set in System Settings.
win.setAccentColor(null)
```
#### `win.getAccentColor()` _Windows_


@@ -1442,15 +1442,16 @@ Sets the properties for the window's taskbar button.
#### `win.setAccentColor(accentColor)` _Windows_
* `accentColor` boolean | string - The accent color for the window. By default, follows user preference in System Settings.
* `accentColor` boolean | string | null - The accent color for the window. By default, follows user preference in System Settings. To reset to system default, pass `null`.
Sets the system accent color and highlighting of active window border.
The `accentColor` parameter accepts the following values:
* **Color string** - Sets a custom accent color using standard CSS color formats (Hex, RGB, RGBA, HSL, HSLA, or named colors). Alpha values in RGBA/HSLA formats are ignored and the color is treated as fully opaque.
* **`true`** - Uses the system's default accent color from user preferences in System Settings.
* **`false`** - Explicitly disables accent color highlighting for the window.
* **Color string** - Like `true`, but sets a custom accent color using standard CSS color formats (Hex, RGB, RGBA, HSL, HSLA, or named colors). Alpha values in RGBA/HSLA formats are ignored and the color is treated as fully opaque.
* **`true`** - Enable accent color highlighting for the window with the system accent color regardless of whether accent colors are enabled for windows in System Settings.
* **`false`** - Disable accent color highlighting for the window regardless of whether accent colors are currently enabled for windows in System Settings.
* **`null`** - Reset window accent color behavior to follow behavior set in System Settings.
Examples:
@@ -1463,11 +1464,14 @@ win.setAccentColor('#ff0000')
// RGB format (alpha ignored if present).
win.setAccentColor('rgba(255,0,0,0.5)')
// Use system accent color.
// Enable accent color, using the color specified in System Settings.
win.setAccentColor(true)
// Disable accent color.
win.setAccentColor(false)
// Reset window accent color behavior to follow behavior set in System Settings.
win.setAccentColor(null)
```
#### `win.getAccentColor()` _Windows_


@@ -25,6 +25,11 @@ following properties:
with which the request is associated. Defaults to the empty string. The
`session` option supersedes `partition`. Thus if a `session` is explicitly
specified, `partition` is ignored.
* `bypassCustomProtocolHandlers` boolean (optional) - When set to `true`,
custom protocol handlers registered for the request's URL scheme will not be
called. This allows forwarding an intercepted request to the built-in
handler. [webRequest](web-request.md) handlers will still be triggered
when bypassing custom protocols. Defaults to `false`.
* `credentials` string (optional) - Can be `include`, `omit` or
`same-origin`. Whether to send
[credentials](https://fetch.spec.whatwg.org/#credentials) with this


@@ -186,14 +186,3 @@ the one downloaded by `npm install`. Usage:
```sh
export ELECTRON_OVERRIDE_DIST_PATH=/Users/username/projects/electron/out/Testing
```
## Set By Electron
Electron sets some variables in your environment at runtime.
### `ORIGINAL_XDG_CURRENT_DESKTOP`
This variable is set to the value of `XDG_CURRENT_DESKTOP` that your application
originally launched with. Electron sometimes modifies the value of `XDG_CURRENT_DESKTOP`
to affect other logic within Chromium so if you want access to the _original_ value
you should look up this environment variable instead.


@@ -102,9 +102,10 @@
should have rounded corners. Default is `true`. Setting this property
to `false` will prevent the window from being fullscreenable on macOS.
On Windows versions older than Windows 11 Build 22000 this property has no effect, and frameless windows will not have rounded corners.
* `thickFrame` boolean (optional) - Use `WS_THICKFRAME` style for frameless windows on
Windows, which adds standard window frame. Setting it to `false` will remove
window shadow and window animations. Default is `true`.
* `thickFrame` boolean (optional) _Windows_ - Use `WS_THICKFRAME` style for
frameless windows on Windows, which adds the standard window frame. Setting it
to `false` will remove window shadow and window animations, and disable window
resizing via dragging the window edges. Default is `true`.
* `vibrancy` string (optional) _macOS_ - Add a type of vibrancy effect to
the window, only on macOS. Can be `appearance-based`, `titlebar`, `selection`,
`menu`, `popover`, `sidebar`, `header`, `sheet`, `window`, `hud`, `fullscreen-ui`,


@@ -21,7 +21,9 @@
associated with the window, making it compatible with the Chromium
OS-level sandbox and disabling the Node.js engine. This is not the same as
the `nodeIntegration` option and the APIs available to the preload script
are more limited. Read more about the option [here](../../tutorial/sandbox.md).
are more limited. Default is `true` since Electron 20. The sandbox will
automatically be disabled when `nodeIntegration` is set to `true`.
Read more about the option [here](../../tutorial/sandbox.md).
* `session` [Session](../session.md#class-session) (optional) - Sets the session used by the
page. Instead of passing the Session object directly, you can also choose to
use the `partition` option instead, which accepts a partition string. When


@@ -16,9 +16,15 @@ This document uses the following convention to categorize breaking changes:
### Removed: `ELECTRON_OZONE_PLATFORM_HINT` environment variable
The default value of the `--ozone-plaftform` flag [changed to `auto`](https://chromium-review.googlesource.com/c/chromium/src/+/6775426).
The default value of the `--ozone-platform` flag [changed to `auto`](https://chromium-review.googlesource.com/c/chromium/src/+/6775426).
You should use the `XDG_SESSION_TYPE=wayland` environment variable instead to use Wayland.
Electron now defaults to running as a native Wayland app when launched in a Wayland session (when `XDG_SESSION_TYPE=wayland`).
Users can force XWayland by passing `--ozone-platform=x11`.
### Removed: `ORIGINAL_XDG_CURRENT_DESKTOP` environment variable
Previously, Electron changed the value of `XDG_CURRENT_DESKTOP` internally to `Unity`, and stored the original name of the desktop session
in a separate variable. `XDG_CURRENT_DESKTOP` is no longer overridden and now reflects the actual desktop environment.
### Removed: macOS 11 support


@@ -6,77 +6,17 @@ Follow the guidelines below for building **Electron itself** on Linux, for the p
## Prerequisites
* At least 25GB disk space and 8GB RAM.
* Python >= 3.9.
* [Node.js](https://nodejs.org/download/) >= 22.12.0
* [clang](https://clang.llvm.org/get_started.html) 3.4 or later.
* Development headers of GTK 3 and libnotify.
Due to Electron's dependency on Chromium, prerequisites and dependencies for Electron change over time. [Chromium's documentation on building on Linux](https://chromium.googlesource.com/chromium/src/+/HEAD/docs/linux/build_instructions.md) has up to date information for building Chromium on Linux. This documentation can generally
be followed for building Electron on Linux as well.
On Ubuntu >= 20.04, install the following libraries:
```sh
$ sudo apt-get install build-essential clang libdbus-1-dev libgtk-3-dev \
libnotify-dev libasound2-dev libcap-dev \
libcups2-dev libxtst-dev \
libxss1 libnss3-dev gcc-multilib g++-multilib curl \
gperf bison python3-dbusmock openjdk-8-jre
```
On Ubuntu < 20.04, install the following libraries:
```sh
$ sudo apt-get install build-essential clang libdbus-1-dev libgtk-3-dev \
libnotify-dev libgnome-keyring-dev \
libasound2-dev libcap-dev libcups2-dev libxtst-dev \
libxss1 libnss3-dev gcc-multilib g++-multilib curl \
gperf bison python-dbusmock openjdk-8-jre
```
On RHEL / CentOS, install the following libraries:
```sh
$ sudo yum install clang dbus-devel gtk3-devel libnotify-devel \
libgnome-keyring-devel xorg-x11-server-utils libcap-devel \
cups-devel libXtst-devel alsa-lib-devel libXrandr-devel \
nss-devel python-dbusmock openjdk-8-jre
```
On Fedora, install the following libraries:
```sh
$ sudo dnf install clang dbus-devel gperf gtk3-devel \
libnotify-devel libgnome-keyring-devel libcap-devel \
cups-devel libXtst-devel alsa-lib-devel libXrandr-devel \
nss-devel python-dbusmock
```
On Arch Linux / Manjaro, install the following libraries:
```sh
$ sudo pacman -Syu base-devel clang libdbus gtk2 libnotify \
libgnome-keyring alsa-lib libcap libcups libxtst \
libxss nss gcc-multilib curl gperf bison \
python2 python-dbusmock jdk8-openjdk
```
Other distributions may offer similar packages for installation via package
managers such as pacman. Or one can compile from source code.
Additionally, Electron's [Linux dependency installer](https://github.com/electron/build-images/blob/main/tools/install-deps.sh) can be referenced to get the current dependencies that Electron requires in addition to what Chromium installs via [build/install-deps.sh](https://chromium.googlesource.com/chromium/src/+/HEAD/build/install-build-deps.sh).
### Cross compilation
If you want to build for an `arm` target you should also install the following
dependencies:
If you want to build for an `arm` target, you can use Electron's [Linux dependency installer](https://github.com/electron/build-images/blob/main/tools/install-deps.sh) to install the additional dependencies by passing the `--arm` argument:
```sh
$ sudo apt-get install libc6-dev-armhf-cross linux-libc-dev-armhf-cross \
g++-arm-linux-gnueabihf
```
Similarly for `arm64`, install the following:
```sh
$ sudo apt-get install libc6-dev-arm64-cross linux-libc-dev-arm64-cross \
g++-aarch64-linux-gnu
$ sudo install-deps.sh --arm
```
And to cross-compile for `arm` or targets, you should pass the


@@ -32,7 +32,7 @@ This table gives a general overview of where ESM is supported and which ESM load
| Main | Node.js | N/A | <ul><li> [You must use `await` generously before the app's `ready` event](#you-must-use-await-generously-before-the-apps-ready-event) </li></ul> |
| Renderer (Sandboxed) | Chromium | Unsupported | <ul><li> [Sandboxed preload scripts can't use ESM imports](#sandboxed-preload-scripts-cant-use-esm-imports) </li></ul> |
| Renderer (Unsandboxed & Context Isolated) | Chromium | Node.js | <ul><li> [Unsandboxed ESM preload scripts will run after page load on pages with no content](#unsandboxed-esm-preload-scripts-will-run-after-page-load-on-pages-with-no-content) </li> <li>[ESM Preload Scripts must have the `.mjs` extension](#esm-preload-scripts-must-have-the-mjs-extension)</li></ul> |
| Renderer (Unsandboxed & Non Context Isolated) | Chromium | Node.js | <ul><li>[Unsandboxed ESM preload scripts will run after page load on pages with no content](#unsandboxed-esm-preload-scripts-will-run-after-page-load-on-pages-with-no-content)</li><li>[ESM Preload Scripts must have the `.mjs` extension](#esm-preload-scripts-must-have-the-mjs-extension)</li><li>[ESM preload scripts must be context isolated to use dynamic Node.js ESM imports](#esm-preload-scripts-must-be-context-isolated-to-use-dynamic-nodejs-esm-imports)</li></ul> |
| Renderer (Unsandboxed & Non Context Isolated) | Chromium | Node.js | <ul><li>[Unsandboxed ESM preload scripts will run after page load on pages with no content](#unsandboxed-esm-preload-scripts-will-run-after-page-load-on-pages-with-no-content)</li><li>[ESM Preload Scripts must have the `.mjs` extension](#esm-preload-scripts-must-have-the-mjs-extension)</li></ul> |
## Main process


@@ -2,15 +2,15 @@
## Overview
[Online and offline event](https://developer.mozilla.org/en-US/docs/Online_and_offline_events)
detection can be implemented in the Renderer process using the
[`navigator.onLine`](http://html5index.org/Offline%20-%20NavigatorOnLine.html)
attribute, part of standard HTML5 API.
Online and offline event detection can be implemented in both the main and renderer processes:
- **Renderer process**: Use the [`navigator.onLine`](http://html5index.org/Offline%20-%20NavigatorOnLine.html) attribute and [online/offline events](https://developer.mozilla.org/en-US/docs/Online_and_offline_events), part of standard HTML5 API.
- **Main process**: Use the [`net.isOnline()`](../api/net.md#netisonline) method or the [`net.online`](../api/net.md#netonline-readonly) property.
The `navigator.onLine` attribute returns:
* `false` if all network requests are guaranteed to fail (e.g. when disconnected from the network).
* `true` in all other cases.
- `false` if all network requests are guaranteed to fail (e.g. when disconnected from the network).
- `true` in all other cases.
Since many cases return `true`, you should treat with care situations of
getting false positives, as we cannot always assume that `true` value means
@@ -19,7 +19,27 @@ is running a virtualization software that has virtual Ethernet adapters in "alwa
connected" state. Therefore, if you want to determine the Internet access
status of Electron, you should develop additional means for this check.
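Since a `true` value only means requests are not guaranteed to fail, one such "additional means" is a real reachability probe layered on top of the platform flag. A minimal sketch, with the probe injected as a parameter (the probe itself, e.g. a HEAD request to a known endpoint, is an assumption and not part of these docs):

```javascript
// Treat the platform's online flag as a fast negative check only, and
// confirm a `true` value with an actual request. Injecting the probe
// keeps the logic testable without network access.
async function hasInternetAccess (onLineFlag, probe) {
  if (!onLineFlag) return false // guaranteed offline, no need to probe
  try {
    return Boolean(await probe())
  } catch {
    return false
  }
}
```

In a renderer you would pass `navigator.onLine` as the flag; in the main process, `net.isOnline()`.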
## Example
## Main Process Detection
In the main process, you can use the `net` module to detect online/offline status:
```js
const { net } = require('electron')
// Method 1: Using net.isOnline()
const isOnline = net.isOnline()
console.log('Online status:', isOnline)
// Method 2: Using net.online property
console.log('Online status:', net.online)
```
Both `net.isOnline()` and `net.online` return the same boolean value with the same reliability characteristics as `navigator.onLine` - they provide a strong indicator when offline (`false`), but a `true` value doesn't guarantee successful internet connectivity.
> [!NOTE]
> The `net` module is only available after the app emits the `ready` event.
## Renderer Process Example
Starting with an HTML file `index.html`, this example will demonstrate how the `navigator.onLine` API can be used to build a connection status indicator.
@@ -84,4 +104,4 @@ After launching the Electron application, you should see the notification:
![Connection status](../images/connection-status.png)
> [!NOTE]
> If you need to communicate the connection status to the main process, use the [IPC renderer](../api/ipc-renderer.md) API.
> If you need to check the connection status in the main process, you can use [`net.isOnline()`](../api/net.md#netisonline) directly instead of communicating from the renderer process via [IPC](../api/ipc-renderer.md).


@@ -13,7 +13,13 @@ the GPU service and the network service.
See Chromium's [Sandbox design document][sandbox] for more information.
Starting from Electron 20, the sandbox is enabled for renderer processes without any
further configuration. If you want to disable the sandbox for a process, see the
further configuration.
Sandboxing is tied to Node.js integration. _Enabling Node.js integration_ for a
renderer process by setting `nodeIntegration: true` _disables the sandbox_ for the
process.
If you want to disable the sandbox for a process, see the
[Disabling the sandbox for a single process](#disabling-the-sandbox-for-a-single-process)
section.
@@ -98,7 +104,8 @@ app.whenReady().then(() => {
```
Sandboxing is also disabled whenever Node.js integration is enabled in the renderer.
This can be done through the BrowserWindow constructor with the `nodeIntegration: true` flag.
This can be done through the BrowserWindow constructor with the `nodeIntegration: true` flag
or by providing the respective HTML boolean attribute for a `webview`.
```js title='main.js'
app.whenReady().then(() => {
@@ -111,6 +118,10 @@ app.whenReady().then(() => {
})
```
```html title='index.html (Renderer Process)'
<webview nodeIntegration src="page.html"></webview>
```
### Enabling the sandbox globally
If you want to force sandboxing for all renderers, you can also use the


@@ -244,6 +244,10 @@ to enable this behavior.
Even when `nodeIntegration: false` is used, to truly enforce strong isolation
and prevent the use of Node primitives `contextIsolation` **must** also be used.
Beware that _disabling context isolation_ for a renderer process by setting
`nodeIntegration: true` _also disables process sandboxing_ for that process.
See section below.
:::info
For more information on what `contextIsolation` is and how to enable it please
see our dedicated [Context Isolation](context-isolation.md) document.
@@ -251,6 +255,16 @@ see our dedicated [Context Isolation](context-isolation.md) document.
### 4. Enable process sandboxing
:::info
This recommendation is the default behavior in Electron since 20.0.0.
Additionally, process sandboxing can be enforced for all renderer processes
application wide: [Enabling the sandbox globally](sandbox.md#enabling-the-sandbox-globally)
_Disabling context isolation_ (see above) _also disables process sandboxing_,
regardless of the default, `sandbox: false` or globally enabled sandboxing!
:::
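The interplay spelled out above — sandboxed by default since 20.0.0, enforceable application wide, but switched off by `nodeIntegration: true` regardless — can be condensed into a small decision function. This is only a sketch of the documented rules, not an Electron API; all parameter names are illustrative:

```javascript
// Resolve whether a renderer effectively runs sandboxed, per the rules
// in this section. Default since Electron 20: sandboxed.
function effectiveSandbox ({ sandbox = true, nodeIntegration = false, globalSandbox = false }) {
  // Per the note above, nodeIntegration: true (which disables context
  // isolation) turns the sandbox off, even when it is globally enforced.
  if (nodeIntegration) return false
  // Global enforcement overrides a per-window sandbox: false.
  return globalSandbox || sandbox
}
```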
[Sandboxing](https://chromium.googlesource.com/chromium/src/+/HEAD/docs/design/sandbox.md)
is a Chromium feature that uses the operating system to
significantly limit what renderer processes have access to. You should enable


@@ -25,11 +25,30 @@ Menu.prototype._isCommandIdChecked = function (id) {
};
Menu.prototype._isCommandIdEnabled = function (id) {
return this.commandsMap[id] ? this.commandsMap[id].enabled : false;
const item = this.commandsMap[id];
if (!item) return false;
const focusedWindow = BaseWindow.getFocusedWindow();
if (item.role === 'minimize' && focusedWindow) {
return focusedWindow.isMinimizable();
}
if (item.role === 'togglefullscreen' && focusedWindow) {
return focusedWindow.isFullScreenable();
}
if (item.role === 'close' && focusedWindow) {
return focusedWindow.isClosable();
}
return item.enabled;
};
Menu.prototype._shouldCommandIdWorkWhenHidden = function (id) {
return this.commandsMap[id] ? !!this.commandsMap[id].acceleratorWorksWhenHidden : false;
};
Menu.prototype._isCommandIdVisible = function (id) {
return this.commandsMap[id] ? this.commandsMap[id].visible : false;
};
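The role-aware check added above can be exercised outside Electron by substituting a plain object for the focused window. A minimal sketch mirroring that logic (names are illustrative, not Electron internals):

```javascript
// Mirrors _isCommandIdEnabled: certain roles defer to the focused
// window's capabilities instead of the item's static `enabled` flag.
function isCommandEnabled (item, focusedWindow) {
  if (!item) return false
  if (focusedWindow) {
    if (item.role === 'minimize') return focusedWindow.isMinimizable()
    if (item.role === 'togglefullscreen') return focusedWindow.isFullScreenable()
    if (item.role === 'close') return focusedWindow.isClosable()
  }
  return item.enabled
}
```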


@@ -119,7 +119,10 @@ export function fetchWithSession (input: RequestInfo, init: (RequestInit & {bypa
p.reject(err);
});
if (!req.body?.pipeTo(Writable.toWeb(r as unknown as Writable)).then(() => r.end())) { r.end(); }
// pipeTo expects a WritableStream<Uint8Array>. Node.js' Writable.toWeb returns WritableStream<any>,
// which causes a TS structural mismatch.
const writable = Writable.toWeb(r as unknown as Writable) as unknown as WritableStream<Uint8Array>;
if (!req.body?.pipeTo(writable).then(() => r.end())) { r.end(); }
return p.promise;
}


@@ -4,6 +4,8 @@ import { createReadStream } from 'fs';
import { Readable } from 'stream';
import { ReadableStream } from 'stream/web';
import type { ReadableStreamDefaultReader } from 'stream/web';
// Global protocol APIs.
const { registerSchemesAsPrivileged, getStandardSchemes, Protocol } = process._linkedBinding('electron_browser_protocol');
@@ -12,7 +14,7 @@ const ERR_UNEXPECTED = -9;
const isBuiltInScheme = (scheme: string) => ['http', 'https', 'file'].includes(scheme);
function makeStreamFromPipe (pipe: any): ReadableStream {
function makeStreamFromPipe (pipe: any): ReadableStream<Uint8Array> {
const buf = new Uint8Array(1024 * 1024 /* 1 MB */);
return new ReadableStream({
async pull (controller) {
@@ -38,21 +40,26 @@ function makeStreamFromFileInfo ({
filePath: string;
offset?: number;
length?: number;
}): ReadableStream {
}): ReadableStream<Uint8Array> {
// Node's Readable.toWeb produces a WHATWG ReadableStream whose chunks are Uint8Array.
return Readable.toWeb(createReadStream(filePath, {
start: offset,
end: length >= 0 ? offset + length : undefined
}));
})) as ReadableStream<Uint8Array>;
}
function convertToRequestBody (uploadData: ProtocolRequest['uploadData']): RequestInit['body'] {
if (!uploadData) return null;
// Optimization: skip creating a stream if the request is just a single buffer.
if (uploadData.length === 1 && (uploadData[0] as any).type === 'rawData') return uploadData[0].bytes;
if (uploadData.length === 1 && (uploadData[0] as any).type === 'rawData') {
return uploadData[0].bytes as any;
}
const chunks = [...uploadData] as any[]; // TODO: types are wrong
let current: ReadableStreamDefaultReader | null = null;
return new ReadableStream({
const chunks = [...uploadData] as any[]; // TODO: refine ProtocolRequest types
// Use Node's web stream types explicitly to avoid DOM lib vs Node lib structural mismatches.
// Generic <Uint8Array> ensures reader.read() returns value?: Uint8Array consistent with enqueue.
let current: ReadableStreamDefaultReader<Uint8Array> | null = null;
return new ReadableStream<Uint8Array>({
async pull (controller) {
if (current) {
const { done, value } = await current.read();
@@ -67,7 +74,7 @@ function convertToRequestBody (uploadData: ProtocolRequest['uploadData']): Reque
if (!chunks.length) { return controller.close(); }
const chunk = chunks.shift()!;
if (chunk.type === 'rawData') {
controller.enqueue(chunk.bytes);
controller.enqueue(chunk.bytes as Uint8Array);
} else if (chunk.type === 'file') {
current = makeStreamFromFileInfo(chunk).getReader();
return this.pull!(controller);


@@ -40,7 +40,7 @@ process.on('uncaughtException', function (error) {
// Emit 'exit' event on quit.
const { app } = require('electron');
app.on('quit', (_event, exitCode) => {
app.on('quit', (_event: any, exitCode: number) => {
process.emit('exit', exitCode);
});
@@ -162,27 +162,6 @@ require('@electron/internal/browser/api/web-contents-view');
// Set main startup script of the app.
const mainStartupScript = packageJson.main || 'index.js';
const KNOWN_XDG_DESKTOP_VALUES = new Set(['Pantheon', 'Unity:Unity7', 'pop:GNOME']);
function currentPlatformSupportsAppIndicator () {
if (process.platform !== 'linux') return false;
const currentDesktop = process.env.XDG_CURRENT_DESKTOP;
if (!currentDesktop) return false;
if (KNOWN_XDG_DESKTOP_VALUES.has(currentDesktop)) return true;
// ubuntu based or derived session (default ubuntu one, communitheme…) supports
// indicator too.
if (/ubuntu/ig.test(currentDesktop)) return true;
return false;
}
// Workaround for electron/electron#5050 and electron/electron#9046
process.env.ORIGINAL_XDG_CURRENT_DESKTOP = process.env.XDG_CURRENT_DESKTOP;
if (currentPlatformSupportsAppIndicator()) {
process.env.XDG_CURRENT_DESKTOP = 'Unity';
}
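The deleted helper is a pure function of the platform and `XDG_CURRENT_DESKTOP`; extracted for illustration it looks like this (note the original's `/ubuntu/ig` carried a latent `g`-flag hazard, since a regex's `lastIndex` persists across `test()` calls — the sketch drops it):

```javascript
const KNOWN_XDG_DESKTOP_VALUES = new Set(['Pantheon', 'Unity:Unity7', 'pop:GNOME'])

// Same logic as the removed currentPlatformSupportsAppIndicator(), with
// the inputs passed explicitly instead of read from process globals.
function supportsAppIndicator (platform, currentDesktop) {
  if (platform !== 'linux') return false
  if (!currentDesktop) return false
  if (KNOWN_XDG_DESKTOP_VALUES.has(currentDesktop)) return true
  // Ubuntu-based or derived sessions support indicators too.
  return /ubuntu/i.test(currentDesktop)
}
```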
// Quit when all windows are closed and no other one is listening to this.
app.on('window-all-closed', () => {
if (app.listenerCount('window-all-closed') === 1) {


@@ -289,7 +289,8 @@ function parseOptions (optionsIn: ClientRequestConstructorOptions | string): Nod
referrerPolicy: options.referrerPolicy,
cache: options.cache,
allowNonHttpProtocols: Object.hasOwn(options, kAllowNonHttpProtocols),
priority: options.priority
priority: options.priority,
bypassCustomProtocolHandlers: options.bypassCustomProtocolHandlers
};
if ('priorityIncremental' in options) {
urlLoaderOptions.priorityIncremental = options.priorityIncremental;


@@ -44,7 +44,7 @@ downloadArtifact({
artifactName: 'electron',
force: process.env.force_no_cache === 'true',
cacheRoot: process.env.electron_config_cache,
checksums: process.env.electron_use_remote_checksums ?? process.env.npm_config_electron_use_remote_checksums ? undefined : require('./checksums.json'),
checksums: (process.env.electron_use_remote_checksums || process.env.npm_config_electron_use_remote_checksums) ? undefined : require('./checksums.json'),
platform,
arch
}).then(extractFile).catch(err => {
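The switch from `??` to `||` above matters when the first env var is set to an empty string: `''` is defined (so `??` keeps it and never consults the npm-prefixed fallback) but falsy. A reduced sketch of the two conditionals (function names are illustrative):

```javascript
// Env vars are strings or undefined; '' is defined but falsy, which is
// exactly where nullish coalescing and logical-or diverge.
function useRemoteChecksumsOld (primary, fallback) {
  return Boolean(primary ?? fallback) // '' short-circuits the fallback
}
function useRemoteChecksumsNew (primary, fallback) {
  return Boolean(primary || fallback) // any falsy primary defers to fallback
}
```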


@@ -1,14 +1,15 @@
{
"name": "electron",
"name": "@electron-ci/dev-root",
"version": "0.0.0-development",
"repository": "https://github.com/electron/electron",
"description": "Build cross platform desktop apps with JavaScript, HTML, and CSS",
"devDependencies": {
"@azure/storage-blob": "^12.25.0",
"@datadog/datadog-ci": "^4.1.2",
"@electron/asar": "^3.2.13",
"@electron/docs-parser": "^2.0.0",
"@electron/fiddle-core": "^1.3.4",
"@electron/github-app-auth": "^2.2.1",
"@electron/github-app-auth": "^3.2.0",
"@electron/lint-roller": "^3.1.1",
"@electron/typescript-definitions": "^9.1.2",
"@octokit/rest": "^20.0.2",
@@ -24,7 +25,6 @@
"buffer": "^6.0.3",
"chalk": "^4.1.0",
"check-for-leaks": "^1.2.1",
"dugite": "^2.7.1",
"eslint": "^8.57.1",
"eslint-config-standard": "^17.1.0",
"eslint-plugin-import": "^2.32.0",
@@ -40,6 +40,7 @@
"lint-staged": "^16.1.0",
"markdownlint-cli2": "^0.18.0",
"minimist": "^1.2.8",
"node-gyp": "^11.4.2",
"null-loader": "^4.0.1",
"pre-flight": "^2.0.0",
"process": "^0.11.10",
@@ -56,7 +57,8 @@
"url": "^0.11.4",
"webpack": "^5.95.0",
"webpack-cli": "^5.1.4",
"wrapper-webpack-plugin": "^2.2.0"
"wrapper-webpack-plugin": "^2.2.0",
"yaml": "^2.8.1"
},
"private": true,
"scripts": {
@@ -131,9 +133,25 @@
"DEPS": [
"node script/gen-hunspell-filenames.js",
"node script/gen-libc++-filenames.js"
],
".github/workflows/pipeline-segment-electron-build.yml": [
"node script/copy-pipeline-segment-publish.js",
"git add .github/workflows/pipeline-segment-electron-publish.yml"
]
},
"resolutions": {
"nan": "nodejs/nan#e14bdcd1f72d62bca1d541b66da43130384ec213"
"dbus-native/xml2js": "0.5.0",
"abstract-socket": "github:deepak1556/node-abstractsocket#928cc591decd12aff7dad96449da8afc29832c19",
"minimist@npm:~0.0.1": "0.2.4"
},
"packageManager": "yarn@4.12.0",
"workspaces": [
"spec",
"spec/fixtures/native-addon/*"
],
"dependenciesMeta": {
"abstract-socket": {
"built": true
}
}
}


@@ -137,3 +137,7 @@ fix_add_macos_memory_query_fallback_to_avoid_crash.patch
fix_resolve_dynamic_background_material_update_issue_on_windows_11.patch
feat_add_support_for_embedder_snapshot_validation.patch
band-aid_over_an_issue_with_using_deprecated_nsopenpanel_api.patch
inspectorpageagent_provisional_frame_speculative_fix.patch
expose_referrerscriptinfo_hostdefinedoptionsindex.patch
fix_release_mouse_buttons_on_focus_loss_on_wayland.patch
cherry-pick-e045399a1ecb.patch


@@ -0,0 +1,133 @@
From 0000000000000000000000000000000000000000 Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?Dominik=20R=C3=B6ttsches?= <drott@chromium.org>
Date: Thu, 12 Feb 2026 06:35:36 -0800
Subject: Avoid stale iteration in CSSFontFeatureValuesMap
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
To avoid invalid iterator state, take a snapshot of the
map when creating the iteration source. This addresses
the immediate problem of iterating while modifying.
Remaining work tracked in https://crbug.com/483936078
Fixed: 483569511
Change-Id: Ie29cfdf7ed94bbe189b44c842a5efce571bb2cee
Reviewed-on: https://chromium-review.googlesource.com/c/chromium/src/+/7566570
Commit-Queue: Dominik Röttsches <drott@chromium.org>
Reviewed-by: Anders Hartvoll Ruud <andruud@chromium.org>
Cr-Commit-Position: refs/heads/main@{#1583927}
diff --git a/third_party/blink/renderer/core/css/css_font_feature_values_map.cc b/third_party/blink/renderer/core/css/css_font_feature_values_map.cc
index 24303069e2531afebec29977378ba708051e117d..1862dae14a63769f0fe1fe1cf5f6f880148ce37b 100644
--- a/third_party/blink/renderer/core/css/css_font_feature_values_map.cc
+++ b/third_party/blink/renderer/core/css/css_font_feature_values_map.cc
@@ -13,17 +13,16 @@ class FontFeatureValuesMapIterationSource final
: public PairSyncIterable<CSSFontFeatureValuesMap>::IterationSource {
public:
FontFeatureValuesMapIterationSource(const CSSFontFeatureValuesMap& map,
- const FontFeatureAliases* aliases)
- : map_(map), aliases_(aliases), iterator_(aliases->begin()) {}
+ const FontFeatureAliases aliases)
+ : map_(map),
+ aliases_(std::move(aliases)),
+ iterator_(aliases_.begin()) {}
bool FetchNextItem(ScriptState* script_state,
String& map_key,
Vector<uint32_t>& map_value,
ExceptionState&) override {
- if (!aliases_) {
- return false;
- }
- if (iterator_ == aliases_->end()) {
+ if (iterator_ == aliases_.end()) {
return false;
}
map_key = iterator_->key;
@@ -38,9 +37,13 @@ class FontFeatureValuesMapIterationSource final
}
private:
- // Needs to be kept alive while we're iterating over it.
const Member<const CSSFontFeatureValuesMap> map_;
- const FontFeatureAliases* aliases_;
+ // Create a copy to keep the iterator from becoming invalid if there are
+ // modifications to the aliases HashMap while iterating.
+ // TODO(https://crbug.com/483936078): Implement live/stable iteration over
+ // FontFeatureAliases by changing its storage type, avoiding taking a copy
+ // here.
+ const FontFeatureAliases aliases_;
FontFeatureAliases::const_iterator iterator_;
};
@@ -50,8 +53,8 @@ uint32_t CSSFontFeatureValuesMap::size() const {
PairSyncIterable<CSSFontFeatureValuesMap>::IterationSource*
CSSFontFeatureValuesMap::CreateIterationSource(ScriptState*, ExceptionState&) {
- return MakeGarbageCollected<FontFeatureValuesMapIterationSource>(*this,
- aliases_);
+ return MakeGarbageCollected<FontFeatureValuesMapIterationSource>(
+ *this, aliases_ ? *aliases_ : FontFeatureAliases());
}
bool CSSFontFeatureValuesMap::GetMapEntry(ScriptState*,
diff --git a/third_party/blink/web_tests/external/wpt/css/css-fonts/font_feature_values_map_iteration.html b/third_party/blink/web_tests/external/wpt/css/css-fonts/font_feature_values_map_iteration.html
new file mode 100644
index 0000000000000000000000000000000000000000..eac7198b0b4a58007cbcc77ad3e9357a1009117c
--- /dev/null
+++ b/third_party/blink/web_tests/external/wpt/css/css-fonts/font_feature_values_map_iteration.html
@@ -0,0 +1,52 @@
+<!DOCTYPE html>
+<html>
+ <head>
+ <title>CSSFontFeatureValuesMap Iteration and Modification</title>
+ <link
+ rel="help"
+ href="https://drafts.csswg.org/css-fonts-4/#om-fontfeaturevalues"
+ />
+ <meta
+ name="assert"
+ content="Iteration while modifying CSSFontFeatureValuesMap does not crash."
+ />
+ <script type="text/javascript" src="/resources/testharness.js"></script>
+ <script
+ type="text/javascript"
+ src="/resources/testharnessreport.js"
+ ></script>
+ </head>
+ <body>
+ <style>
+ @font-feature-values TestFont {
+ @styleset {
+ a: 1;
+ b: 2;
+ c: 3;
+ }
+ }
+ </style>
+ <script>
+ test(() => {
+ const rule = document.styleSheets[0].cssRules[0];
+ const map = rule.styleset;
+ const iterator = map.entries();
+ let count = 0;
+
+ while (count < 10) {
+ const { value: entry, done } = iterator.next();
+ if (done) break;
+
+ const [key, value] = entry;
+
+ map.delete(key);
+ for (let i = 0; i < 100; i++) {
+ map.set(`newkey_${count}_${i}`, i);
+ }
+
+ count++;
+ }
+ }, "Iteration of the CSSFontFeatureValuesMap does not crash.");
+ </script>
+ </body>
+</html>
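The snapshot strategy in this patch has a direct JavaScript analogue: copying a map's entries before iterating makes the loop immune to mutations performed inside it, which is what the WPT above exercises. A sketch:

```javascript
// Iterate a snapshot of the map, mirroring the copied FontFeatureAliases
// in the patch: deletes and inserts inside the loop cannot change which
// entries the loop visits.
function visitSnapshot (map, visit) {
  let visited = 0
  for (const [key, value] of [...map.entries()]) {
    visit(key, value) // may freely delete/set on `map`
    visited++
  }
  return visited
}
```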


@@ -0,0 +1,53 @@
From 0000000000000000000000000000000000000000 Mon Sep 17 00:00:00 2001
From: Fedor Indutny <indutny@signal.org>
Date: Wed, 24 Sep 2025 10:08:48 -0700
Subject: Expose ReferrerScriptInfo::HostDefinedOptionsIndex
In `shell/common/node_bindings.cc`'s
`HostImportModuleWithPhaseDynamically` we route dynamic imports to
either Node.js's or Blink's resolver based on presence of Node.js
environment, process type, etc. Exporting `HostDefinedOptionsIndex`
allows us to route based on the size of `v8_host_defined_options` data
which enables us to support dynamic imports in non-context-isolated
preload scripts.
diff --git a/third_party/blink/renderer/bindings/core/v8/referrer_script_info.cc b/third_party/blink/renderer/bindings/core/v8/referrer_script_info.cc
index 1b797783987255622735047bd78ca0e8bb635d5e..b209c736bb80c186ed51999af1dac0a1d50fc232 100644
--- a/third_party/blink/renderer/bindings/core/v8/referrer_script_info.cc
+++ b/third_party/blink/renderer/bindings/core/v8/referrer_script_info.cc
@@ -12,15 +12,6 @@ namespace blink {
namespace {
-enum HostDefinedOptionsIndex : size_t {
- kBaseURL,
- kCredentialsMode,
- kNonce,
- kParserState,
- kReferrerPolicy,
- kLength
-};
-
// Omit storing base URL if it is same as ScriptOrigin::ResourceName().
// Note: This improves chance of getting into a fast path in
// ReferrerScriptInfo::ToV8HostDefinedOptions.
diff --git a/third_party/blink/renderer/bindings/core/v8/referrer_script_info.h b/third_party/blink/renderer/bindings/core/v8/referrer_script_info.h
index 0119624a028bec3e53e4e402938a98fe6def1483..743865839448748fe00e3e7d5027587cb65393c9 100644
--- a/third_party/blink/renderer/bindings/core/v8/referrer_script_info.h
+++ b/third_party/blink/renderer/bindings/core/v8/referrer_script_info.h
@@ -23,6 +23,15 @@ class CORE_EXPORT ReferrerScriptInfo {
STACK_ALLOCATED();
public:
+ enum HostDefinedOptionsIndex : size_t {
+ kBaseURL,
+ kCredentialsMode,
+ kNonce,
+ kParserState,
+ kReferrerPolicy,
+ kLength
+ };
+
ReferrerScriptInfo() {}
ReferrerScriptInfo(const KURL& base_url,
network::mojom::CredentialsMode credentials_mode,


@@ -0,0 +1,34 @@
From 0000000000000000000000000000000000000000 Mon Sep 17 00:00:00 2001
From: Mitchell Cohen <mitch.cohen@me.com>
Date: Sun, 2 Nov 2025 15:30:56 -0500
Subject: fix: release mouse buttons on focus loss on Wayland
Fixes an issue where the mouse flags would get stuck if you right-click
the CSD titlebar in Wayland.
Bug: 455147429
Change-Id: I9f5a9f395b3c1d85094a40a92d40691a897dbd05
Reviewed-on: https://chromium-review.googlesource.com/c/chromium/src/+/7091872
Reviewed-by: Thomas Anderson <thomasanderson@chromium.org>
Reviewed-by: Kramer Ge <fangzhoug@chromium.org>
Commit-Queue: Thomas Anderson <thomasanderson@chromium.org>
Cr-Commit-Position: refs/heads/main@{#1538048}
diff --git a/ui/ozone/platform/wayland/host/wayland_event_source.cc b/ui/ozone/platform/wayland/host/wayland_event_source.cc
index 4cc15e842b633e93f1d6654225765769eb75fffd..4e421ccbd36d4efebd43c9def5b575b7d8d5e336 100644
--- a/ui/ozone/platform/wayland/host/wayland_event_source.cc
+++ b/ui/ozone/platform/wayland/host/wayland_event_source.cc
@@ -336,6 +336,13 @@ void WaylandEventSource::OnPointerFocusChanged(
// Save new pointer location.
pointer_location_ = location;
window_manager_->SetPointerFocusedWindow(window);
+ } else {
+ // The compositor may swallow the release event for any buttons that are
+ // pressed when the window loses focus, e.g. when right-clicking the
+ // titlebar to open the system menu on GNOME.
+ if (!connection_->IsDragInProgress()) {
+ ReleasePressedPointerButtons(window, ui::EventTimeForNow());
+ }
}
auto closure = focused ? base::NullCallback()


@@ -0,0 +1,116 @@
From 0000000000000000000000000000000000000000 Mon Sep 17 00:00:00 2001
From: Joey Arhar <jarhar@chromium.org>
Date: Wed, 1 Oct 2025 02:03:37 -0700
Subject: InspectorPageAgent provisional frame speculative fix
According to crash reports, addScriptToEvaluateOnNewDocument is running
on provisional frames.
Fixed: 390710982
Change-Id: I5cecf63c9517d0b28fff40361c607b0aa54e68cf
Reviewed-on: https://chromium-review.googlesource.com/c/chromium/src/+/6216479
Reviewed-by: Alex Rudenko <alexrudenko@chromium.org>
Commit-Queue: Alex Rudenko <alexrudenko@chromium.org>
Auto-Submit: Joey Arhar <jarhar@chromium.org>
Cr-Commit-Position: refs/heads/main@{#1523418}
diff --git a/third_party/blink/renderer/core/inspector/inspector_page_agent.cc b/third_party/blink/renderer/core/inspector/inspector_page_agent.cc
index fe17063fa8f572368096b88e1e3cc35b469e816a..07c1b9dd216518f70257a6637e3b50d87f098e8b 100644
--- a/third_party/blink/renderer/core/inspector/inspector_page_agent.cc
+++ b/third_party/blink/renderer/core/inspector/inspector_page_agent.cc
@@ -603,7 +603,11 @@ protocol::Response InspectorPageAgent::addScriptToEvaluateOnNewDocument(
// Runtime.enable that forces main context creation. In this case, we would
// not normally evaluate the script, but we should.
for (LocalFrame* frame : *inspected_frames_) {
- EvaluateScriptOnNewDocument(*frame, *identifier);
+ // Don't evaluate scripts on provisional frames:
+ // https://crbug.com/390710982
+ if (!frame->IsProvisional()) {
+ EvaluateScriptOnNewDocument(*frame, *identifier);
+ }
}
}
diff --git a/third_party/blink/web_tests/FlagExpectations/disable-site-isolation-trials b/third_party/blink/web_tests/FlagExpectations/disable-site-isolation-trials
index f462af59547e93378034b7e9301053a43f636ea4..3b829d94fa447e58a1acc57b7c7bdbd4564c9507 100644
--- a/third_party/blink/web_tests/FlagExpectations/disable-site-isolation-trials
+++ b/third_party/blink/web_tests/FlagExpectations/disable-site-isolation-trials
@@ -63,6 +63,7 @@ http/tests/inspector-protocol/target/target-filter.js [ Skip ]
virtual/fenced-frame-mparch/http/tests/inspector-protocol/fenced-frame/fenced-frame-in-oopif-auto-attach.js [ Skip ]
http/tests/inspector-protocol/target/target-info-changed-auto-attach.js [ Skip ]
http/tests/inspector-protocol/page/frame-detached-oopif.js [ Skip ]
+http/tests/inspector-protocol/page/addScriptToEvaluateOnNewDocument-reload.js [ Skip ]
# Rely on OOPIF for an iframe to be a separate devtools target
http/tests/inspector-protocol/timeline/auction-worklet-frame.js [ Skip ]
diff --git a/third_party/blink/web_tests/http/tests/inspector-protocol/page/addScriptToEvaluateOnNewDocument-reload-expected.txt b/third_party/blink/web_tests/http/tests/inspector-protocol/page/addScriptToEvaluateOnNewDocument-reload-expected.txt
new file mode 100644
index 0000000000000000000000000000000000000000..0131df6c227e1803741e654d42b15f589275061a
--- /dev/null
+++ b/third_party/blink/web_tests/http/tests/inspector-protocol/page/addScriptToEvaluateOnNewDocument-reload-expected.txt
@@ -0,0 +1,28 @@
+Tests that Page.addScriptToEvaluateOnNewDocument on auto-attach with runImmediately=true.
+Regression test for crbug.com/390710982.
+console called: {
+ method : Runtime.consoleAPICalled
+ params : {
+ args : [
+ [0] : {
+ type : string
+ value : evaluated
+ }
+ ]
+ executionContextId : <number>
+ stackTrace : {
+ callFrames : [
+ [0] : {
+ columnNumber : 8
+ functionName :
+ lineNumber : 0
+ scriptId : <string>
+ url :
+ }
+ ]
+ }
+ timestamp : <number>
+ type : log
+ }
+ sessionId : <string>
+}
diff --git a/third_party/blink/web_tests/http/tests/inspector-protocol/page/addScriptToEvaluateOnNewDocument-reload.js b/third_party/blink/web_tests/http/tests/inspector-protocol/page/addScriptToEvaluateOnNewDocument-reload.js
new file mode 100644
index 0000000000000000000000000000000000000000..52ebe845c323c6d692147052f3458777dcd7f966
--- /dev/null
+++ b/third_party/blink/web_tests/http/tests/inspector-protocol/page/addScriptToEvaluateOnNewDocument-reload.js
@@ -0,0 +1,31 @@
+(async function(/** @type {import('test_runner').TestRunner} */ testRunner) {
+ const { session, dp } = await testRunner.startBlank(
+ `Tests that Page.addScriptToEvaluateOnNewDocument on auto-attach with runImmediately=true.
+Regression test for crbug.com/390710982.`);
+
+ await dp.Page.enable();
+ await dp.Target.enable();
+ await dp.Target.setAutoAttach({ flatten: true, autoAttach: true, waitForDebuggerOnStart: true });
+
+ dp.Target.onAttachedToTarget(async event => {
+ const dp2 = session.createChild(event.params.sessionId).protocol;
+ dp2.Page.enable();
+ dp2.Runtime.enable();
+ dp2.Runtime.onConsoleAPICalled(event => {
+ testRunner.log(event, 'console called: ');
+ });
+ dp2.Page.addScriptToEvaluateOnNewDocument({
+ source: 'console.log("evaluated")',
+ runImmediately: true,
+ });
+ await dp2.Runtime.runIfWaitingForDebugger();
+ });
+
+ const loaded = dp.Page.onceLoadEventFired();
+ await dp.Page.navigate({
+ url: testRunner.url('resources/iframe-src.html')
+ });
+ await loaded;
+
+ testRunner.completeTest();
+});


@@ -6,3 +6,4 @@ apply_allcan_read_write.patch
fix_support_new_variant_of_namedpropertyhandlerconfiguration_and.patch
fix_correct_usages_of_v8_returnvalue_void_set_nonempty_for_new.patch
chore_remove_deprecated_functioncallbackinfo_holder.patch
chore_add_yarnrc_yml_and_yarn_lock_file_to_use_yarn_v4.patch

File diff suppressed because it is too large


@@ -17,7 +17,6 @@ chore_expose_importmoduledynamically_and.patch
test_formally_mark_some_tests_as_flaky.patch
fix_do_not_resolve_electron_entrypoints.patch
ci_ensure_node_tests_set_electron_run_as_node.patch
fix_assert_module_in_the_renderer_process.patch
fix_allow_passing_fileexists_fn_to_legacymainresolve.patch
fix_remove_deprecated_errno_constants.patch
build_enable_perfetto.patch
@@ -40,10 +39,14 @@ build_change_crdtp_protocoltypetraits_signatures_to_avoid_conflict.patch
fix_adjust_wpt_and_webidl_tests_for_enabled_float16array.patch
chore_add_createexternalizabletwobytestring_to_globals.patch
refactor_attach_cppgc_heap_on_v8_isolate_creation.patch
fix_ensure_traverseparent_bails_on_resource_path_exit.patch
cli_move_--trace-atomics-wait_to_eol.patch
fix_cppgc_initializing_twice.patch
fix_task_starvation_in_inspector_context_test.patch
fix_expose_readfilesync_override_for_modules.patch
fix_array_out-of-bounds_read_in_boyer-moore_search.patch
chore_add_missing_include_of_iterator.patch
src_use_cp_utf8_for_wide_file_names_on_win32.patch
fix_ensure_traverseparent_bails_on_resource_path_exit.patch
remove_obsolete_noarraybufferzerofillscope.patch
src_prepare_for_v8_sandboxing.patch
test_correct_conditional_secure_heap_flags_test.patch


@@ -14,7 +14,7 @@ error: duplicate symbol: crdtp::ProtocolTypeTraits<std::__Cr::basic_string<char,
Some distinguishing change should be upstreamed to Node.js.
diff --git a/src/inspector/node_string.cc b/src/inspector/node_string.cc
index 8521730bd03cdfce47e9b5d0f5d68a568bc3de8c..28f4598aa7ea0e93350f79566c06d0f08313be9f 100644
index e2148e954217b9b999e9713e95f1a115ccf7d657..7ec7464cdc0ef00e6600fb897ae99e44ed0f4ad8 100644
--- a/src/inspector/node_string.cc
+++ b/src/inspector/node_string.cc
@@ -7,7 +7,8 @@


@@ -33,10 +33,10 @@ index 8d7204f6cb48f783adc4d1c1eb2de0c83b7fffe2..a154559a56bf383d3c26af523c9bb07b
// Non-alphabetic chars.
diff --git a/lib/internal/http.js b/lib/internal/http.js
index 251f51ec454f9cba4023b8b6729241ee753aac13..1de8cac6e3953ce9cab9db03530da327199acfd5 100644
index 4f250a2e70a20fddeeb7886e0c269822883f7ccf..4e5c2dd1f13e6818576c3c4c88200b5cf5fb1257 100644
--- a/lib/internal/http.js
+++ b/lib/internal/http.js
@@ -8,8 +8,8 @@ const {
@@ -10,8 +10,8 @@ const {
const { setUnrefTimeout } = require('internal/timers');
const { getCategoryEnabledBuffer, trace } = internalBinding('trace_events');
const {
@@ -46,8 +46,8 @@ index 251f51ec454f9cba4023b8b6729241ee753aac13..1de8cac6e3953ce9cab9db03530da327
+ CHAR_UPPERCASE_E,
} = require('internal/constants');
let utcCache;
@@ -44,11 +44,13 @@ function isTraceHTTPEnabled() {
const { URL } = require('internal/url');
@@ -50,11 +50,13 @@ function isTraceHTTPEnabled() {
const traceEventCategory = 'node,node.http';
function traceBegin(...args) {
@@ -62,7 +62,7 @@ index 251f51ec454f9cba4023b8b6729241ee753aac13..1de8cac6e3953ce9cab9db03530da327
+ trace(CHAR_UPPERCASE_E, traceEventCategory, ...args);
}
module.exports = {
function ipToInt(ip) {
diff --git a/node.gyp b/node.gyp
index 0e0071b508f605bb9b7722f8304814dc176d907e..bcb9f371c4e4d8c665058115dc39eaa65125d679 100644
--- a/node.gyp


@@ -8,10 +8,10 @@ they use themselves as the entry point. We should try to upstream some form
of this.
diff --git a/lib/internal/process/pre_execution.js b/lib/internal/process/pre_execution.js
index 98ed40e3076f6628b1771dade63ac51600e8e447..1eba13caf1e00a8b41b2cf8afc4168c8f98be69f 100644
index 15443a710ccf53fae333da3b1fbb52a970c658d5..464b34829c1a566836bfca6bbc2b87fcf5e50016 100644
--- a/lib/internal/process/pre_execution.js
+++ b/lib/internal/process/pre_execution.js
@@ -245,12 +245,14 @@ function patchProcessObject(expandArgv1) {
@@ -265,12 +265,14 @@ function patchProcessObject(expandArgv1) {
// the entry point.
if (expandArgv1 && process.argv[1] && process.argv[1][0] !== '-') {
// Expand process.argv[1] into a full path.


@@ -40,11 +40,11 @@ index 9b41db8b0714b7408f79cbd5b4c460d9bc08f239..35ecfb9bbaf2c8e7351e1c69da84c82a
/**
diff --git a/src/module_wrap.cc b/src/module_wrap.cc
index c52e20d742942667f43ea3e151fc6702260b176b..cbb3e7f4df72f83cb8a1afc25a7429218792e964 100644
index 1ff4971d6fedf6090120a63ba0af420dd6929c8b..4a35e41e78a22993f87ab9d5919f401a7b742438 100644
--- a/src/module_wrap.cc
+++ b/src/module_wrap.cc
@@ -901,7 +901,7 @@ MaybeLocal<Module> ModuleWrap::ResolveModuleCallback(
return module->module_.Get(isolate);
@@ -1063,7 +1063,7 @@ Maybe<ModuleWrap*> ModuleWrap::ResolveModule(
return Just(module_wrap);
}
-static MaybeLocal<Promise> ImportModuleDynamically(
@@ -52,7 +52,7 @@ index c52e20d742942667f43ea3e151fc6702260b176b..cbb3e7f4df72f83cb8a1afc25a742921
Local<Context> context,
Local<Data> host_defined_options,
Local<Value> resource_name,
@@ -973,12 +973,13 @@ void ModuleWrap::SetImportModuleDynamicallyCallback(
@@ -1135,12 +1135,13 @@ void ModuleWrap::SetImportModuleDynamicallyCallback(
Realm* realm = Realm::GetCurrent(args);
HandleScope handle_scope(isolate);
@@ -68,7 +68,7 @@ index c52e20d742942667f43ea3e151fc6702260b176b..cbb3e7f4df72f83cb8a1afc25a742921
}
void ModuleWrap::HostInitializeImportMetaObjectCallback(
-@@ -1020,13 +1021,14 @@ void ModuleWrap::SetInitializeImportMetaObjectCallback(
+@@ -1182,13 +1183,14 @@ void ModuleWrap::SetInitializeImportMetaObjectCallback(
Realm* realm = Realm::GetCurrent(args);
Isolate* isolate = realm->isolate();
@@ -87,7 +87,7 @@ index c52e20d742942667f43ea3e151fc6702260b176b..cbb3e7f4df72f83cb8a1afc25a742921
MaybeLocal<Value> ModuleWrap::SyntheticModuleEvaluationStepsCallback(
diff --git a/src/module_wrap.h b/src/module_wrap.h
-index 9363ce73e51cde3d3a94f9912f072d532d0f8560..c0e972ed293157726efc5fa76dfa62d3da51c22a 100644
+index 695b73ca7ffea850cd88e94546125d12ddc991f6..09602a8e619ba52f3525e3d312047c6dba0484a5 100644
--- a/src/module_wrap.h
+++ b/src/module_wrap.h
@@ -8,6 +8,7 @@
@@ -98,8 +98,8 @@ index 9363ce73e51cde3d3a94f9912f072d532d0f8560..c0e972ed293157726efc5fa76dfa62d3
#include "v8-script.h"
namespace node {
@@ -33,7 +34,14 @@ enum HostDefinedOptions : int {
kLength = 9,
@@ -86,7 +87,14 @@ struct ModuleCacheKey : public MemoryRetainer {
hash(hash) {}
};
-class ModuleWrap : public BaseObject {
@@ -111,10 +111,10 @@ index 9363ce73e51cde3d3a94f9912f072d532d0f8560..c0e972ed293157726efc5fa76dfa62d3
+ v8::Local<v8::FixedArray> import_assertions);
+
+class NODE_EXTERN ModuleWrap : public BaseObject {
public:
enum InternalFields {
kModuleSlot = BaseObject::kInternalFieldCount,
@@ -92,6 +100,8 @@ class ModuleWrap : public BaseObject {
using ResolveCache =
std::unordered_map<ModuleCacheKey, uint32_t, ModuleCacheKey::Hash>;
@@ -151,6 +159,8 @@ class ModuleWrap : public BaseObject {
static void CreateRequiredModuleFacade(
const v8::FunctionCallbackInfo<v8::Value>& args);
@@ -123,11 +123,11 @@ index 9363ce73e51cde3d3a94f9912f072d532d0f8560..c0e972ed293157726efc5fa76dfa62d3
private:
ModuleWrap(Realm* realm,
v8::Local<v8::Object> object,
-@@ -131,7 +141,6 @@ class ModuleWrap : public BaseObject {
+@@ -190,7 +200,6 @@ class ModuleWrap : public BaseObject {
v8::Local<v8::String> specifier,
v8::Local<v8::FixedArray> import_attributes,
v8::Local<v8::Module> referrer);
- static ModuleWrap* GetFromModule(node::Environment*, v8::Local<v8::Module>);
v8::Global<v8::Module> module_;
std::unordered_map<std::string, v8::Global<v8::Object>> resolve_cache_;
// This method may throw a JavaScript exception, so the return type is
// wrapped in a Maybe.


@@ -15,10 +15,10 @@ Reviewed-By: Benjamin Gruenbaum <benjamingr@gmail.com>
Reviewed-By: Yagiz Nizipli <yagiz.nizipli@sentry.io>
diff --git a/doc/api/cli.md b/doc/api/cli.md
-index 9a0e83b95a72486ab9751b3b8818f4beeb527041..1da7126b9d51238e9b89ee6bed602df3f5598a9e 100644
+index a97053929c81ac18bcb3beda7cecb69621b6e70c..a54d0e46c8e4e3aa6be433fba73ef9a3228fa175 100644
--- a/doc/api/cli.md
+++ b/doc/api/cli.md
-@@ -2727,39 +2727,6 @@ added: v12.0.0
+@@ -2749,39 +2749,6 @@ added: v12.0.0
Set default [`tls.DEFAULT_MIN_VERSION`][] to 'TLSv1.3'. Use to disable support
for TLSv1.2, which is not as secure as TLSv1.3.
@@ -58,7 +58,7 @@ index 9a0e83b95a72486ab9751b3b8818f4beeb527041..1da7126b9d51238e9b89ee6bed602df3
### `--trace-deprecation`
<!-- YAML
-@@ -3445,7 +3412,6 @@ one is included in the list below.
+@@ -3483,7 +3450,6 @@ one is included in the list below.
* `--tls-min-v1.1`
* `--tls-min-v1.2`
* `--tls-min-v1.3`
@@ -67,10 +67,10 @@ index 9a0e83b95a72486ab9751b3b8818f4beeb527041..1da7126b9d51238e9b89ee6bed602df3
* `--trace-env-js-stack`
* `--trace-env-native-stack`
diff --git a/doc/node.1 b/doc/node.1
-index e3b2c45af01b2e9b9522964da2572988edd2b9e9..64e975546285a1042dda6fdb54fdd502f338a929 100644
+index fed2c77c2afed665be7aa17c2d53824f049a909e..7a3c09a40fca9458f83be1e7d8eec930b7d2d263 100644
--- a/doc/node.1
+++ b/doc/node.1
@@ -542,11 +542,6 @@ but the option is supported for compatibility with older Node.js versions.
@@ -556,11 +556,6 @@ but the option is supported for compatibility with older Node.js versions.
Set default minVersion to 'TLSv1.3'. Use to disable support for TLSv1.2 in
favour of TLSv1.3, which is more secure.
.
@@ -83,7 +83,7 @@ index e3b2c45af01b2e9b9522964da2572988edd2b9e9..64e975546285a1042dda6fdb54fdd502
Print stack traces for deprecations.
.
diff --git a/src/node.cc b/src/node.cc
-index f0c0b6229048a2e9bc05684fab44ab09bc34e1f6..9027df9a321f7db76edd1218c194df519017dfaf 100644
+index 029b0d219e3bbfa4306d7d5fb5e75f1abc1571bb..2224b5d4e21f435cff1f1ebef2b9e8635bd0368e 100644
--- a/src/node.cc
+++ b/src/node.cc
@@ -232,44 +232,6 @@ void Environment::WaitForInspectorFrontendByOptions() {
@@ -150,10 +150,10 @@ index f0c0b6229048a2e9bc05684fab44ab09bc34e1f6..9027df9a321f7db76edd1218c194df51
isolate_->SetPromiseHook(TracePromises);
}
diff --git a/src/node_options.cc b/src/node_options.cc
-index e8424d7539db191a55edebb7d33a3c1dc37e2403..556776b79282d953fdc371d1901f21ca301bec1a 100644
+index 4bbace2f702777fa12ba9246984894721df99b50..b067685822dc056e446e1a9402a5a6cba86cc722 100644
--- a/src/node_options.cc
+++ b/src/node_options.cc
-@@ -773,10 +773,6 @@ EnvironmentOptionsParser::EnvironmentOptionsParser() {
+@@ -827,10 +827,6 @@ EnvironmentOptionsParser::EnvironmentOptionsParser() {
"throw an exception on deprecations",
&EnvironmentOptions::throw_deprecation,
kAllowedInEnvvar);
@@ -165,7 +165,7 @@ index e8424d7539db191a55edebb7d33a3c1dc37e2403..556776b79282d953fdc371d1901f21ca
"show stack traces on deprecations",
&EnvironmentOptions::trace_deprecation,
diff --git a/src/node_options.h b/src/node_options.h
-index 418dee360f867c363f1576012b32213a51c4fdd0..7078d2493ed696bc5bd92df9c629b714c1a8fbfb 100644
+index d8751a6ee734233e2fc24866ed87d9cd516072ae..e12abb55e43068e8446eaabc65deb63cc469f554 100644
--- a/src/node_options.h
+++ b/src/node_options.h
@@ -205,7 +205,6 @@ class EnvironmentOptions : public Options {


@@ -18,10 +18,10 @@ Reviewed-By: Michaël Zasso <targos@protonmail.com>
Reviewed-By: Yagiz Nizipli <yagiz@nizipli.com>
diff --git a/doc/api/cli.md b/doc/api/cli.md
-index b8f9fb49fcb45602828e79bd79902233b5987dda..9a0e83b95a72486ab9751b3b8818f4beeb527041 100644
+index 7ff68d28f1c80d8a852f649e2c39216a2f4bdb16..a97053929c81ac18bcb3beda7cecb69621b6e70c 100644
--- a/doc/api/cli.md
+++ b/doc/api/cli.md
-@@ -3483,7 +3483,6 @@ V8 options that are allowed are:
+@@ -3522,7 +3522,6 @@ V8 options that are allowed are:
* `--disallow-code-generation-from-strings`
* `--enable-etw-stack-walking`
* `--expose-gc`
@@ -30,10 +30,10 @@ index b8f9fb49fcb45602828e79bd79902233b5987dda..9a0e83b95a72486ab9751b3b8818f4be
* `--jitless`
* `--max-old-space-size`
diff --git a/src/node_options.cc b/src/node_options.cc
-index 8afded658c3f569de7b329ea9dddc11010748cf9..e8424d7539db191a55edebb7d33a3c1dc37e2403 100644
+index 3026b3d814ae652a9996c1dcba62b4fa678ac871..4bbace2f702777fa12ba9246984894721df99b50 100644
--- a/src/node_options.cc
+++ b/src/node_options.cc
-@@ -1001,11 +1001,6 @@ PerIsolateOptionsParser::PerIsolateOptionsParser(
+@@ -1060,11 +1060,6 @@ PerIsolateOptionsParser::PerIsolateOptionsParser(
"disallow eval and friends",
V8Option{},
kAllowedInEnvvar);


@@ -53,10 +53,10 @@ index e3afd30ba1f591d0298793bc42fd7166a4219bce..408dc96307d7f52f92db41004b358051
const maybeMain = resolvedOption <= legacyMainResolveExtensionsIndexes.kResolvedByMainIndexNode ?
packageConfig.main || './' : '';
diff --git a/src/node_file.cc b/src/node_file.cc
-index e78326ed0de805a8bf4f621cad9158635eb44aa2..d7009937b31729f33d9c45cbda7f5440fbdac2aa 100644
+index 77f8f1bd4e8294f2ebc7e0724aea5902eb0f95ab..5de3ebb04b12286a07e3041d0a6dd1cc9072e76a 100644
--- a/src/node_file.cc
+++ b/src/node_file.cc
-@@ -3502,13 +3502,25 @@ static void CpSyncCopyDir(const FunctionCallbackInfo<Value>& args) {
+@@ -3504,13 +3504,25 @@ static void CpSyncCopyDir(const FunctionCallbackInfo<Value>& args) {
}
BindingData::FilePathIsFileReturnType BindingData::FilePathIsFile(
@@ -83,7 +83,7 @@ index e78326ed0de805a8bf4f621cad9158635eb44aa2..d7009937b31729f33d9c45cbda7f5440
uv_fs_t req;
int rc = uv_fs_stat(env->event_loop(), &req, file_path.c_str(), nullptr);
-@@ -3566,6 +3578,11 @@ void BindingData::LegacyMainResolve(const FunctionCallbackInfo<Value>& args) {
+@@ -3568,6 +3580,11 @@ void BindingData::LegacyMainResolve(const FunctionCallbackInfo<Value>& args) {
std::optional<std::string> initial_file_path;
std::string file_path;
@@ -95,7 +95,7 @@ index e78326ed0de805a8bf4f621cad9158635eb44aa2..d7009937b31729f33d9c45cbda7f5440
if (args.Length() >= 2 && args[1]->IsString()) {
auto package_config_main = Utf8Value(isolate, args[1]).ToString();
-@@ -3586,7 +3603,7 @@ void BindingData::LegacyMainResolve(const FunctionCallbackInfo<Value>& args) {
+@@ -3588,7 +3605,7 @@ void BindingData::LegacyMainResolve(const FunctionCallbackInfo<Value>& args) {
BufferValue buff_file_path(isolate, local_file_path);
ToNamespacedPath(env, &buff_file_path);
@@ -104,7 +104,7 @@ index e78326ed0de805a8bf4f621cad9158635eb44aa2..d7009937b31729f33d9c45cbda7f5440
case BindingData::FilePathIsFileReturnType::kIsFile:
return args.GetReturnValue().Set(i);
case BindingData::FilePathIsFileReturnType::kIsNotFile:
-@@ -3623,7 +3640,7 @@ void BindingData::LegacyMainResolve(const FunctionCallbackInfo<Value>& args) {
+@@ -3625,7 +3642,7 @@ void BindingData::LegacyMainResolve(const FunctionCallbackInfo<Value>& args) {
BufferValue buff_file_path(isolate, local_file_path);
ToNamespacedPath(env, &buff_file_path);


@@ -1,67 +0,0 @@
From 0000000000000000000000000000000000000000 Mon Sep 17 00:00:00 2001
From: Shelley Vohr <shelley.vohr@gmail.com>
Date: Wed, 16 Aug 2023 19:15:29 +0200
Subject: fix: assert module in the renderer process
When creating a Node.js Environment, embedders have the option to disable Node.js'
default overriding of Error.prepareStackTrace. However, the assert module depends on
a WeakMap that is populated with the error stacktraces in the overridden function.
This adds handling to fall back to the default implementation of Error.prepareStackTrace
if the override has been disabled.
This will be upstreamed.
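The fallback this message describes can be sketched in plain JavaScript (a minimal illustration of the technique only — `callerSite` and `report` are hypothetical names, not Electron's or Node's code):

```javascript
// Temporarily install a prepareStackTrace that returns the raw CallSite
// array, read the structured stack, then restore the previous handler.
function callerSite(fn) {
  const err = new Error();
  Error.captureStackTrace(err, fn); // hide fn and the frames above it
  const tmpPrepare = Error.prepareStackTrace;
  Error.prepareStackTrace = (_, stack) => stack;
  const call = err.stack[0]; // reading .stack invokes the callback lazily
  Error.prepareStackTrace = tmpPrepare; // restore whatever was there before
  return call;
}

function report() {
  // site describes the location report() was called from
  const site = callerSite(report);
  return typeof site.getLineNumber();
}
```

This is the same swap-and-restore dance the patch performs when the embedder has not installed a `prepareStackTrace` callback of its own.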
diff --git a/lib/internal/assert/utils.js b/lib/internal/assert/utils.js
index 13e41d67c635c27bd5e69eb4960eace34beaef0d..9a99c9ca93907630f9f3ba7ba24577a11465661c 100644
--- a/lib/internal/assert/utils.js
+++ b/lib/internal/assert/utils.js
@@ -24,6 +24,7 @@ const AssertionError = require('internal/assert/assertion_error');
const { openSync, closeSync, readSync } = require('fs');
const { EOL } = require('internal/constants');
const { BuiltinModule } = require('internal/bootstrap/realm');
+const { getEmbedderOptions } = require('internal/options');
const { isError } = require('internal/util');
const errorCache = new SafeMap();
@@ -166,8 +167,16 @@ function getErrMessage(message, fn) {
ErrorCaptureStackTrace(err, fn);
if (errorStackTraceLimitIsWritable) Error.stackTraceLimit = tmpLimit;
- overrideStackTrace.set(err, (_, stack) => stack);
- const call = err.stack[0];
+ let call;
+ if (getEmbedderOptions().hasPrepareStackTraceCallback) {
+ overrideStackTrace.set(err, (_, stack) => stack);
+ call = err.stack[0];
+ } else {
+ const tmpPrepare = Error.prepareStackTrace;
+ Error.prepareStackTrace = (_, stack) => stack;
+ call = err.stack[0];
+ Error.prepareStackTrace = tmpPrepare;
+ }
let filename = call.getFileName();
const line = call.getLineNumber() - 1;
diff --git a/src/node_options.cc b/src/node_options.cc
index e3509abbc3bf84ac0edcd495eb3dde6219dbfc2d..8afded658c3f569de7b329ea9dddc11010748cf9 100644
--- a/src/node_options.cc
+++ b/src/node_options.cc
@@ -1566,14 +1566,16 @@ void GetEmbedderOptions(const FunctionCallbackInfo<Value>& args) {
}
Isolate* isolate = args.GetIsolate();
- constexpr size_t kOptionsSize = 4;
+ constexpr size_t kOptionsSize = 5;
std::array<Local<Name>, kOptionsSize> names = {
+ FIXED_ONE_BYTE_STRING(env->isolate(), "hasPrepareStackTraceCallback"),
FIXED_ONE_BYTE_STRING(env->isolate(), "shouldNotRegisterESMLoader"),
FIXED_ONE_BYTE_STRING(env->isolate(), "noGlobalSearchPaths"),
FIXED_ONE_BYTE_STRING(env->isolate(), "noBrowserGlobals"),
FIXED_ONE_BYTE_STRING(env->isolate(), "hasEmbedderPreload")};
std::array<Local<Value>, kOptionsSize> values = {
+ Boolean::New(isolate, env->prepare_stack_trace_callback().IsEmpty()),
Boolean::New(isolate, env->should_not_register_esm_loader()),
Boolean::New(isolate, env->no_global_search_paths()),
Boolean::New(isolate, env->no_browser_globals()),


@@ -12,10 +12,10 @@ This can be removed/refactored once Node.js upgrades to a version of V8
containing the above CL.
diff --git a/src/node.cc b/src/node.cc
-index 9027df9a321f7db76edd1218c194df519017dfaf..cc1c35da5601fffc3c53985c5d95cc466662649d 100644
+index 2224b5d4e21f435cff1f1ebef2b9e8635bd0368e..4830d5b738f40e035394d5eff76bb3b801f193fb 100644
--- a/src/node.cc
+++ b/src/node.cc
-@@ -1246,7 +1246,7 @@ InitializeOncePerProcessInternal(const std::vector<std::string>& args,
+@@ -1239,7 +1239,7 @@ InitializeOncePerProcessInternal(const std::vector<std::string>& args,
result->platform_ = per_process::v8_platform.Platform();
}


@@ -8,10 +8,10 @@ resource path. This commit ensures that the TraverseParent function
bails out if the parent path is outside of the resource path.
diff --git a/src/node_modules.cc b/src/node_modules.cc
-index 55d628f0c5e7f330e548878807de26d51ef025b5..c06779dea471b6f6a8dd29d4657162ef0faec043 100644
+index 0e820f8623ecf41008a938f16d11bd3206a7b0c1..6e06717cd4e50a8c2a1d91d14a23b7332e817bdd 100644
--- a/src/node_modules.cc
+++ b/src/node_modules.cc
-@@ -291,8 +291,41 @@ const BindingData::PackageConfig* BindingData::TraverseParent(
+@@ -315,8 +315,41 @@ const BindingData::PackageConfig* BindingData::TraverseParent(
Realm* realm, const std::filesystem::path& check_path) {
std::filesystem::path current_path = check_path;
auto env = realm->env();
@@ -47,22 +47,22 @@ index 55d628f0c5e7f330e548878807de26d51ef025b5..c06779dea471b6f6a8dd29d4657162ef
+ });
+ };
+
+ bool did_original_path_start_with_resources_path = starts_with(check_path.
+ generic_string(), resources_path);
+ bool did_original_path_start_with_resources_path = starts_with(
+ ConvertGenericPathToUTF8(check_path), resources_path);
+
do {
current_path = current_path.parent_path();
@@ -311,6 +344,12 @@ const BindingData::PackageConfig* BindingData::TraverseParent(
return nullptr;
@@ -336,6 +369,12 @@ const BindingData::PackageConfig* BindingData::TraverseParent(
}
}
+ // If current path is outside the resources path, bail.
+ if (did_original_path_start_with_resources_path &&
+ !starts_with(current_path.generic_string(), resources_path)) {
+ !starts_with(ConvertGenericPathToUTF8(current_path), resources_path)) {
+ return nullptr;
+ }
+
// Check if the path ends with `/node_modules`
if (current_path.generic_string().ends_with("/node_modules")) {
if (current_path.filename() == "node_modules") {
return nullptr;


@@ -20,7 +20,7 @@ index 82225b0a53dd828750991a4e15a060b736b6ea2b..4b0d31356a2496a7fc67876a22da2453
V(performance_entry_callback, v8::Function) \
V(prepare_stack_trace_callback, v8::Function) \
diff --git a/src/node_modules.cc b/src/node_modules.cc
-index c06779dea471b6f6a8dd29d4657162ef0faec043..6204986dc97686a248d6ae483f3a413ee5c51e47 100644
+index 60b03b1563b230f78d8a9bdd8480da4d2ae90a27..62050791174563f88b8629d576eed8959b3c2e20 100644
--- a/src/node_modules.cc
+++ b/src/node_modules.cc
@@ -21,6 +21,7 @@ namespace modules {
@@ -76,8 +76,8 @@ index c06779dea471b6f6a8dd29d4657162ef0faec043..6204986dc97686a248d6ae483f3a413e
+ if (read_err < 0) {
return nullptr;
}
// In some systems, std::string is annotated to generate an
@@ -249,6 +279,12 @@ const BindingData::PackageConfig* BindingData::GetPackageJSON(
simdjson::ondemand::document document;
@@ -237,6 +267,12 @@ const BindingData::PackageConfig* BindingData::GetPackageJSON(
return &cached.first->second;
}
@@ -90,7 +90,7 @@ index c06779dea471b6f6a8dd29d4657162ef0faec043..6204986dc97686a248d6ae483f3a413e
void BindingData::ReadPackageJSON(const FunctionCallbackInfo<Value>& args) {
CHECK_GE(args.Length(), 1); // path, [is_esm, base, specifier]
CHECK(args[0]->IsString()); // path
-@@ -643,6 +679,8 @@ void InitImportMetaPathHelpers(const FunctionCallbackInfo<Value>& args) {
+@@ -558,6 +594,8 @@ void InitImportMetaPathHelpers(const FunctionCallbackInfo<Value>& args) {
void BindingData::CreatePerIsolateProperties(IsolateData* isolate_data,
Local<ObjectTemplate> target) {
Isolate* isolate = isolate_data->isolate();
@@ -99,7 +99,7 @@ index c06779dea471b6f6a8dd29d4657162ef0faec043..6204986dc97686a248d6ae483f3a413e
SetMethod(isolate, target, "readPackageJSON", ReadPackageJSON);
SetMethod(isolate,
target,
-@@ -685,6 +723,8 @@ void BindingData::CreatePerContextProperties(Local<Object> target,
+@@ -596,6 +634,8 @@ void BindingData::CreatePerContextProperties(Local<Object> target,
void BindingData::RegisterExternalReferences(
ExternalReferenceRegistry* registry) {
@@ -107,9 +107,9 @@ index c06779dea471b6f6a8dd29d4657162ef0faec043..6204986dc97686a248d6ae483f3a413e
+
registry->Register(ReadPackageJSON);
registry->Register(GetNearestParentPackageJSONType);
registry->Register(GetNearestParentPackageJSON);
registry->Register(GetPackageScopeConfig<false>);
diff --git a/src/node_modules.h b/src/node_modules.h
-index eb2900d8f8385238f89a6dcc972a28e5fcb1d288..e28f38d98f4f8749048af135f0dcbe55aa69c4fe 100644
+index e4ba6b75bc86d14deada835903ba68a4cb0eccc5..ae77f9ec81b358bd356993617cd07671d382e8ca 100644
--- a/src/node_modules.h
+++ b/src/node_modules.h
@@ -54,6 +54,8 @@ class BindingData : public SnapshotableObject {
@@ -119,5 +119,5 @@ index eb2900d8f8385238f89a6dcc972a28e5fcb1d288..e28f38d98f4f8749048af135f0dcbe55
+ static void OverrideReadFileSync(
+ const v8::FunctionCallbackInfo<v8::Value>& args);
static void ReadPackageJSON(const v8::FunctionCallbackInfo<v8::Value>& args);
static void GetNearestParentPackageJSON(
static void GetNearestParentPackageJSONType(
const v8::FunctionCallbackInfo<v8::Value>& args);


@@ -138,10 +138,10 @@ index 757f093becd112002f3422302f4c29bb464f1a6c..c8cea2117080930105b33e4e50586a2c
// This translator function must be sync, as `require` is sync.
translators.set('require-commonjs-typescript', (url, source, isMain) => {
diff --git a/lib/internal/url.js b/lib/internal/url.js
-index d0c04be7c6ebc352d5958a987f3a4ba538e0d23a..00f9f3b73ed84c04ae712f6057b68737bd416333 100644
+index ad1c2c9966085b8febd261b2fc776ce49bc1bd36..96fdce31168ae70ce20f3bfb81931705b2d55f31 100644
--- a/lib/internal/url.js
+++ b/lib/internal/url.js
-@@ -1605,6 +1605,8 @@ function fileURLToPath(path, options = kEmptyObject) {
+@@ -1608,6 +1608,8 @@ function fileURLToPath(path, options = kEmptyObject) {
path = new URL(path);
else if (!isURL(path))
throw new ERR_INVALID_ARG_TYPE('path', ['string', 'URL'], path);


@@ -228,7 +228,7 @@ index d94f6e1c82c4a62547b3b395f375c86ce4deb5de..b81b9005365272217c77e2b9289bd9f8
X509View ca(sk_X509_value(peer_certs.get(), i));
if (!cert->view().isIssuedBy(ca)) continue;
diff --git a/src/crypto/crypto_context.cc b/src/crypto/crypto_context.cc
-index c08dab17fa229d1d67d3ad5174c97192989b2bd0..a3d309d832c73ddc79564b9644d825bec7459e7f 100644
+index d1430cd66dd045dcb52dd166e1eabc7202d1bd94..8f50d0cc132ac65fa74cf1fc2172247b5ad42962 100644
--- a/src/crypto/crypto_context.cc
+++ b/src/crypto/crypto_context.cc
@@ -141,7 +141,7 @@ int SSL_CTX_use_certificate_chain(SSL_CTX* ctx,
@@ -240,7 +240,7 @@ index c08dab17fa229d1d67d3ad5174c97192989b2bd0..a3d309d832c73ddc79564b9644d825be
X509* ca = sk_X509_value(extra_certs, i);
// NOTE: Increments reference count on `ca`
-@@ -1773,11 +1773,12 @@ void SecureContext::SetDHParam(const FunctionCallbackInfo<Value>& args) {
+@@ -1831,11 +1831,12 @@ void SecureContext::SetDHParam(const FunctionCallbackInfo<Value>& args) {
// If the user specified "auto" for dhparams, the JavaScript layer will pass
// true to this function instead of the original string. Any other string
// value will be interpreted as custom DH parameters below.
@@ -254,7 +254,7 @@ index c08dab17fa229d1d67d3ad5174c97192989b2bd0..a3d309d832c73ddc79564b9644d825be
DHPointer dh;
{
BIOPointer bio(LoadBIO(env, args[0]));
-@@ -2003,7 +2004,7 @@ void SecureContext::LoadPKCS12(const FunctionCallbackInfo<Value>& args) {
+@@ -2061,7 +2062,7 @@ void SecureContext::LoadPKCS12(const FunctionCallbackInfo<Value>& args) {
}
// Add CA certs too
@@ -396,10 +396,10 @@ index ca5edc8ebdf2550bb62b7969a5650733a2647f4f..198e18b58f31e361a9d2865cbe81e067
return EVPKeyCtxPointer();
diff --git a/src/crypto/crypto_keys.cc b/src/crypto/crypto_keys.cc
-index 7238cda445fd663e6b45fa134f31d017bb267dfc..522655555cdb2ab2083797f736bf167d1f42c15e 100644
+index fe852bdebacae49dce19a731a46fe9f1bb66eb1b..5dbcaafaf26aa3b638fdfd471cedab87c9296319 100644
--- a/src/crypto/crypto_keys.cc
+++ b/src/crypto/crypto_keys.cc
-@@ -949,6 +949,7 @@ void KeyObjectHandle::GetAsymmetricKeyType(
+@@ -948,6 +948,7 @@ void KeyObjectHandle::GetAsymmetricKeyType(
}
bool KeyObjectHandle::CheckEcKeyData() const {
@@ -407,7 +407,7 @@ index 7238cda445fd663e6b45fa134f31d017bb267dfc..522655555cdb2ab2083797f736bf167d
MarkPopErrorOnReturn mark_pop_error_on_return;
const auto& key = data_.GetAsymmetricKey();
-@@ -965,6 +966,9 @@ bool KeyObjectHandle::CheckEcKeyData() const {
+@@ -964,6 +965,9 @@ bool KeyObjectHandle::CheckEcKeyData() const {
#else
return EVP_PKEY_public_check(ctx.get()) == 1;
#endif
@@ -449,7 +449,7 @@ index 05a3882c7e17d78e27aabb29891aa250789a47c0..1f2fccce6ed8f14525557644e0bdd130
if (target
diff --git a/src/crypto/crypto_util.cc b/src/crypto/crypto_util.cc
-index 7c548d32b40365343f0e208c3aa856a1c847f4c3..6346f8f7199cf7b7d3736c59571606fff102fbb6 100644
+index 9c2360df7150571377eff37fc5e958d17900da30..4505786745c54a529f904d5e7813a86204e0a78b 100644
--- a/src/crypto/crypto_util.cc
+++ b/src/crypto/crypto_util.cc
@@ -207,7 +207,8 @@ void TestFipsCrypto(const v8::FunctionCallbackInfo<v8::Value>& args) {
@@ -506,15 +506,14 @@ index 7c548d32b40365343f0e208c3aa856a1c847f4c3..6346f8f7199cf7b7d3736c59571606ff
},
data);
Local<ArrayBuffer> buffer = ArrayBuffer::New(env->isolate(), store);
-@@ -705,10 +697,12 @@ void SecureBuffer(const FunctionCallbackInfo<Value>& args) {
+@@ -705,9 +697,11 @@ void SecureBuffer(const FunctionCallbackInfo<Value>& args) {
}
void SecureHeapUsed(const FunctionCallbackInfo<Value>& args) {
+#ifndef OPENSSL_IS_BORINGSSL
Environment* env = Environment::GetCurrent(args);
if (CRYPTO_secure_malloc_initialized())
args.GetReturnValue().Set(
BigInt::New(env->isolate(), CRYPTO_secure_used()));
BigInt::New(args.GetIsolate(), CRYPTO_secure_used()));
+#endif
}
} // namespace
@@ -555,20 +554,20 @@ index d9c533f100d25aeab1fe8589932a8ddead431258..2acab8786a8a752b17961445edeb872c
#if NODE_OPENSSL_HAS_QUIC
#include <openssl/quic.h>
diff --git a/src/node_options.cc b/src/node_options.cc
-index ed85bf11f6f325823b59b3b0275908f9210c1b24..e3509abbc3bf84ac0edcd495eb3dde6219dbfc2d 100644
+index 31fc23fdbfabceab3cffd81a3e6650dde1ccd13a..3026b3d814ae652a9996c1dcba62b4fa678ac871 100644
--- a/src/node_options.cc
+++ b/src/node_options.cc
@@ -7,7 +7,7 @@
#include "node_external_reference.h"
@@ -8,7 +8,7 @@
#include "node_internals.h"
#include "node_sea.h"
#include "uv.h"
-#if HAVE_OPENSSL
+#if HAVE_OPENSSL && !defined(OPENSSL_IS_BORINGSSL)
#include "openssl/opensslv.h"
#endif
diff --git a/src/node_options.h b/src/node_options.h
-index cdbd9ca39e907ab22515293eac2c5512223f4ca2..418dee360f867c363f1576012b32213a51c4fdd0 100644
+index 2e73fd2a05e329910d4c064474880f770c9f5957..d8751a6ee734233e2fc24866ed87d9cd516072ae 100644
--- a/src/node_options.h
+++ b/src/node_options.h
@@ -11,7 +11,7 @@


@@ -86,10 +86,10 @@ index 0ca643aa74d13f278685d2330b791182b55c15b4..cbcecfba33070b820aca0e2814982160
NODE_DEFINE_CONSTANT(target, ETIMEDOUT);
#endif
diff --git a/src/node_errors.cc b/src/node_errors.cc
-index 5f51add4cdf68a9487edfc9382f586cc94539571..befb642f1effa3c4139e4cd99ff64d9c5175fd72 100644
+index 238942d45a136facec55ca5a2534e2dc407137e9..36a21b9523351fe2f225ffe7fca184d737640b62 100644
--- a/src/node_errors.cc
+++ b/src/node_errors.cc
-@@ -862,10 +862,6 @@ const char* errno_string(int errorno) {
+@@ -899,10 +899,6 @@ const char* errno_string(int errorno) {
ERRNO_CASE(ENOBUFS);
#endif
@@ -100,7 +100,7 @@ index 5f51add4cdf68a9487edfc9382f586cc94539571..befb642f1effa3c4139e4cd99ff64d9c
#ifdef ENODEV
ERRNO_CASE(ENODEV);
#endif
-@@ -904,14 +900,6 @@ const char* errno_string(int errorno) {
+@@ -941,14 +937,6 @@ const char* errno_string(int errorno) {
ERRNO_CASE(ENOSPC);
#endif
@@ -115,7 +115,7 @@ index 5f51add4cdf68a9487edfc9382f586cc94539571..befb642f1effa3c4139e4cd99ff64d9c
#ifdef ENOSYS
ERRNO_CASE(ENOSYS);
#endif
-@@ -994,10 +982,6 @@ const char* errno_string(int errorno) {
+@@ -1031,10 +1019,6 @@ const char* errno_string(int errorno) {
ERRNO_CASE(ESTALE);
#endif


@@ -48,7 +48,7 @@ index fe669d40c31a29334b047b9cfee3067f64ef0a7b..9e5de7bbe574add017cd12ee091304d0
static CFunction fast_timing_safe_equal(CFunction::Make(FastTimingSafeEqual));
diff --git a/src/node_buffer.cc b/src/node_buffer.cc
-index e39852c8e0392e0a9ae5d4ea58be115416e19233..c94b14741c827a81d69a6f036426a344e563ad72 100644
+index b9f0c97938203b4652780a7d707c5e83319330b0..8a5b6b57321c2843a965a7e51b2ebed991a1e424 100644
--- a/src/node_buffer.cc
+++ b/src/node_buffer.cc
@@ -44,6 +44,14 @@
@@ -74,7 +74,7 @@ index e39852c8e0392e0a9ae5d4ea58be115416e19233..c94b14741c827a81d69a6f036426a344
using v8::FunctionCallbackInfo;
using v8::Global;
using v8::HandleScope;
-@@ -584,19 +591,24 @@ void SlowCopy(const FunctionCallbackInfo<Value>& args) {
+@@ -583,19 +590,24 @@ void SlowCopy(const FunctionCallbackInfo<Value>& args) {
// Assume caller has properly validated args.
uint32_t FastCopy(Local<Value> receiver,
@@ -107,7 +107,7 @@ index e39852c8e0392e0a9ae5d4ea58be115416e19233..c94b14741c827a81d69a6f036426a344
return to_copy;
}
-@@ -865,19 +877,17 @@ void Compare(const FunctionCallbackInfo<Value> &args) {
+@@ -864,19 +876,17 @@ void Compare(const FunctionCallbackInfo<Value> &args) {
}
int32_t FastCompare(v8::Local<v8::Value>,
@@ -135,7 +135,7 @@ index e39852c8e0392e0a9ae5d4ea58be115416e19233..c94b14741c827a81d69a6f036426a344
}
static v8::CFunction fast_compare(v8::CFunction::Make(FastCompare));
-@@ -1149,14 +1159,13 @@ void SlowIndexOfNumber(const FunctionCallbackInfo<Value>& args) {
+@@ -1148,14 +1158,13 @@ void SlowIndexOfNumber(const FunctionCallbackInfo<Value>& args) {
}
int32_t FastIndexOfNumber(v8::Local<v8::Value>,
@@ -153,7 +153,7 @@ index e39852c8e0392e0a9ae5d4ea58be115416e19233..c94b14741c827a81d69a6f036426a344
}
static v8::CFunction fast_index_of_number(
-@@ -1496,21 +1505,31 @@ void SlowWriteString(const FunctionCallbackInfo<Value>& args) {
+@@ -1510,21 +1519,31 @@ void SlowWriteString(const FunctionCallbackInfo<Value>& args) {
template <encoding encoding>
uint32_t FastWriteString(Local<Value> receiver,
@@ -194,7 +194,7 @@ index e39852c8e0392e0a9ae5d4ea58be115416e19233..c94b14741c827a81d69a6f036426a344
static const v8::CFunction fast_write_string_ascii(
diff --git a/src/util.h b/src/util.h
-index dcd6548d41be786c42ce8328d89e532a8e9d43a2..7c98de621ca4d53cbaaa5bd4488aab20c7b033a7 100644
+index 8f27afbb9e4e453655f94b94daf47026b9cf2177..8460fe26bbf9e83d080fdfc458d570d0ee29e6f0 100644
--- a/src/util.h
+++ b/src/util.h
@@ -62,6 +62,7 @@
@@ -205,7 +205,7 @@ index dcd6548d41be786c42ce8328d89e532a8e9d43a2..7c98de621ca4d53cbaaa5bd4488aab20
#ifdef _WIN32
/* MAX_PATH is in characters, not bytes. Make sure we have enough headroom. */
-@@ -589,6 +590,16 @@ class BufferValue : public MaybeStackBuffer<char> {
+@@ -582,6 +583,16 @@ class BufferValue : public MaybeStackBuffer<char> {
static_cast<char*>(name->Buffer()->Data()) + name##_offset; \
if (name##_length > 0) CHECK_NE(name##_data, nullptr);


@@ -13,7 +13,7 @@ This patch can be removed when we upgrade to a V8 version that
contains the above CLs.
diff --git a/src/node.cc b/src/node.cc
-index 07684482f855363e26c3d7299a585a8a5654015e..627337efae49319e2a77b4686176ce92a8493024 100644
+index 2d8d0000d52411992d2bd513cc7dd96b2292bab9..bebd75922733bbf219f125a3a6f35d98bc8210d3 100644
--- a/src/node.cc
+++ b/src/node.cc
@@ -816,7 +816,7 @@ static ExitCode ProcessGlobalArgsInternal(std::vector<std::string>* args,
@@ -23,9 +23,9 @@ index 07684482f855363e26c3d7299a585a8a5654015e..627337efae49319e2a77b4686176ce92
- v8_args.emplace_back("--no-harmony-import-assertions");
+ // v8_args.emplace_back("--no-harmony-import-assertions");
auto env_opts = per_process::cli_options->per_isolate->per_env;
if (std::find(v8_args.begin(), v8_args.end(),
@@ -828,7 +828,7 @@ static ExitCode ProcessGlobalArgsInternal(std::vector<std::string>* args,
if (!per_process::cli_options->per_isolate->max_old_space_size_percentage
.empty()) {
@@ -835,7 +835,7 @@ static ExitCode ProcessGlobalArgsInternal(std::vector<std::string>* args,
// Support stable Phase 5 WebAssembly proposals
v8_args.emplace_back("--experimental-wasm-imported-strings");


@@ -18,10 +18,10 @@ This can be removed when Node.js upgrades to a version of V8 containing CLs
from the above issue.
diff --git a/src/api/environment.cc b/src/api/environment.cc
-index cb37fa080fc8e8d524cfa2758c4a8c2c5652324d..8e227ddd1be50c046a8cf2895a31d607eb7d31de 100644
+index fd71ceac65ccef1d2832b45b0b5612877cee22c1..ceac508418f489a8077c1bc85a2feaf85bf60480 100644
--- a/src/api/environment.cc
+++ b/src/api/environment.cc
-@@ -316,6 +316,10 @@ Isolate* NewIsolate(Isolate::CreateParams* params,
+@@ -308,6 +308,10 @@ Isolate* NewIsolate(Isolate::CreateParams* params,
MultiIsolatePlatform* platform,
const SnapshotData* snapshot_data,
const IsolateSettings& settings) {
@@ -32,7 +32,7 @@ index cb37fa080fc8e8d524cfa2758c4a8c2c5652324d..8e227ddd1be50c046a8cf2895a31d607
Isolate* isolate = Isolate::Allocate();
if (isolate == nullptr) return nullptr;
-@@ -359,9 +363,12 @@ Isolate* NewIsolate(ArrayBufferAllocator* allocator,
+@@ -351,9 +355,12 @@ Isolate* NewIsolate(ArrayBufferAllocator* allocator,
uv_loop_t* event_loop,
MultiIsolatePlatform* platform,
const EmbedderSnapshotData* snapshot_data,
@@ -102,10 +102,10 @@ index 2d5fa8dbd75851bca30453548f6cbe0159509f26..c346e3a9c827993036438685d758a734
worker::Worker* worker_context_ = nullptr;
PerIsolateWrapperData* wrapper_data_;
diff --git a/src/node.cc b/src/node.cc
-index 627337efae49319e2a77b4686176ce92a8493024..f0c0b6229048a2e9bc05684fab44ab09bc34e1f6 100644
+index bebd75922733bbf219f125a3a6f35d98bc8210d3..029b0d219e3bbfa4306d7d5fb5e75f1abc1571bb 100644
--- a/src/node.cc
+++ b/src/node.cc
-@@ -1295,6 +1295,14 @@ InitializeOncePerProcessInternal(const std::vector<std::string>& args,
+@@ -1288,6 +1288,14 @@ InitializeOncePerProcessInternal(const std::vector<std::string>& args,
result->platform_ = per_process::v8_platform.Platform();
}
@@ -120,7 +120,7 @@ index 627337efae49319e2a77b4686176ce92a8493024..f0c0b6229048a2e9bc05684fab44ab09
if (!(flags & ProcessInitializationFlags::kNoInitializeV8)) {
V8::Initialize();
-@@ -1304,14 +1312,6 @@ InitializeOncePerProcessInternal(const std::vector<std::string>& args,
+@@ -1297,14 +1305,6 @@ InitializeOncePerProcessInternal(const std::vector<std::string>& args,
absl::SetMutexDeadlockDetectionMode(absl::OnDeadlockCycle::kIgnore);
}


@@ -0,0 +1,653 @@
From 0000000000000000000000000000000000000000 Mon Sep 17 00:00:00 2001
From: Robo <hop2deep@gmail.com>
Date: Wed, 21 Jan 2026 09:53:15 +0000
Subject: remove obsolete NoArrayBufferZeroFillScope
Replace the scope in favor of the V8 api added in
https://chromium-review.googlesource.com/c/v8/v8/+/5679067
Ports changes from
1) https://github.com/nodejs/node/commit/869ea331f3a8215229290e2e6038956874c382a6
2) https://github.com/nodejs/node/commit/ef9dc0857a73610f5de5dc9f37afd0a927c4c17f
3) partially from https://github.com/nodejs/node/commit/e0a71517fef4ca83f2d40d2d1600022bc82a7f9f
This is needed to remove dependency on the zero_fill_field_
that is exposed to JS
Refs https://github.com/nodejs/node/commit/3cdb1cd437f63dd256ae2ab3b7e9016257326cb4
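For context — this is an observation about the JS-visible behavior, not part of the patch itself — the zero-fill toggle mentioned above is what distinguishes `Buffer.alloc` from `Buffer.allocUnsafe`: `alloc` guarantees zeroed memory, while `allocUnsafe` may return uninitialized bytes, which is what `BackingStoreInitializationMode::kUninitialized` expresses at the V8 layer:

```javascript
// Buffer.alloc always zero-fills; Buffer.allocUnsafe skips the fill and is
// the JS-visible analogue of an uninitialized backing store.
const zeroed = Buffer.alloc(8);
const uninit = Buffer.allocUnsafe(8); // contents are unspecified

console.log(zeroed.every((b) => b === 0)); // always true
console.log(uninit.length); // only the length is guaranteed
```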
diff --git a/src/api/environment.cc b/src/api/environment.cc
index ceac508418f489a8077c1bc85a2feaf85bf60480..645c4cbc0fcf9ec004dcb55493104796b0d64de2 100644
--- a/src/api/environment.cc
+++ b/src/api/environment.cc
@@ -107,11 +107,7 @@ MaybeLocal<Value> PrepareStackTraceCallback(Local<Context> context,
}
void* NodeArrayBufferAllocator::Allocate(size_t size) {
- void* ret;
- if (zero_fill_field_ || per_process::cli_options->zero_fill_all_buffers)
- ret = allocator_->Allocate(size);
- else
- ret = allocator_->AllocateUninitialized(size);
+ void* ret = allocator_->Allocate(size);
if (ret != nullptr) [[likely]] {
total_mem_usage_.fetch_add(size, std::memory_order_relaxed);
}
diff --git a/src/crypto/crypto_cipher.cc b/src/crypto/crypto_cipher.cc
index c00d3616e08b00b1e0a3a29b2dbb5278e1e14fcc..8939c5e5085d00b098f66074b9ee033f5be55d08 100644
--- a/src/crypto/crypto_cipher.cc
+++ b/src/crypto/crypto_cipher.cc
@@ -20,6 +20,7 @@ using ncrypto::SSLPointer;
using v8::Array;
using v8::ArrayBuffer;
using v8::BackingStore;
+using v8::BackingStoreInitializationMode;
using v8::Context;
using v8::FunctionCallbackInfo;
using v8::FunctionTemplate;
@@ -774,10 +775,10 @@ CipherBase::UpdateResult CipherBase::Update(
return kErrorState;
}
- {
- NoArrayBufferZeroFillScope no_zero_fill_scope(env()->isolate_data());
- *out = ArrayBuffer::NewBackingStore(env()->isolate(), buf_len);
- }
+ *out = ArrayBuffer::NewBackingStore(
+ env()->isolate(),
+ buf_len,
+ BackingStoreInitializationMode::kUninitialized);
buffer = {
.data = reinterpret_cast<const unsigned char*>(data),
@@ -852,11 +853,10 @@ bool CipherBase::Final(std::unique_ptr<BackingStore>* out) {
const int mode = ctx_.getMode();
- {
- NoArrayBufferZeroFillScope no_zero_fill_scope(env()->isolate_data());
- *out = ArrayBuffer::NewBackingStore(
- env()->isolate(), static_cast<size_t>(ctx_.getBlockSize()));
- }
+ *out = ArrayBuffer::NewBackingStore(
+ env()->isolate(),
+ static_cast<size_t>(ctx_.getBlockSize()),
+ BackingStoreInitializationMode::kUninitialized);
if (kind_ == kDecipher &&
Cipher::FromCtx(ctx_).isSupportedAuthenticatedMode()) {
@@ -972,10 +972,10 @@ bool PublicKeyCipher::Cipher(
return false;
}
- {
- NoArrayBufferZeroFillScope no_zero_fill_scope(env->isolate_data());
- *out = ArrayBuffer::NewBackingStore(env->isolate(), out_len);
- }
+ *out = ArrayBuffer::NewBackingStore(
+ env->isolate(),
+ out_len,
+ BackingStoreInitializationMode::kUninitialized);
if (EVP_PKEY_cipher(
ctx.get(),
diff --git a/src/crypto/crypto_common.cc b/src/crypto/crypto_common.cc
index b81b9005365272217c77e2b9289bd9f877c0e77c..185b1d8fe657b5db64dc92497ca742d69f7a2764 100644
--- a/src/crypto/crypto_common.cc
+++ b/src/crypto/crypto_common.cc
@@ -37,6 +37,7 @@ using ncrypto::X509Pointer;
using ncrypto::X509View;
using v8::ArrayBuffer;
using v8::BackingStore;
+using v8::BackingStoreInitializationMode;
using v8::Context;
using v8::EscapableHandleScope;
using v8::Integer;
@@ -307,11 +308,10 @@ MaybeLocal<Object> ECPointToBuffer(Environment* env,
return MaybeLocal<Object>();
}
- std::unique_ptr<BackingStore> bs;
- {
- NoArrayBufferZeroFillScope no_zero_fill_scope(env->isolate_data());
- bs = ArrayBuffer::NewBackingStore(env->isolate(), len);
- }
+ std::unique_ptr<BackingStore> bs = ArrayBuffer::NewBackingStore(
+ env->isolate(),
+ len,
+ BackingStoreInitializationMode::kUninitialized);
len = EC_POINT_point2oct(group,
point,
diff --git a/src/crypto/crypto_ec.cc b/src/crypto/crypto_ec.cc
index 2a3107dbbf5c0dfe70c2e105338d186e5620ddbf..f36c84a77313bd57d0a7a902d1a2529636ca1422 100644
--- a/src/crypto/crypto_ec.cc
+++ b/src/crypto/crypto_ec.cc
@@ -29,6 +29,7 @@ using ncrypto::MarkPopErrorOnReturn;
using v8::Array;
using v8::ArrayBuffer;
using v8::BackingStore;
+using v8::BackingStoreInitializationMode;
using v8::Context;
using v8::FunctionCallbackInfo;
using v8::FunctionTemplate;
@@ -201,14 +202,10 @@ void ECDH::ComputeSecret(const FunctionCallbackInfo<Value>& args) {
return;
}
- std::unique_ptr<BackingStore> bs;
- {
- NoArrayBufferZeroFillScope no_zero_fill_scope(env->isolate_data());
- // NOTE: field_size is in bits
- int field_size = EC_GROUP_get_degree(ecdh->group_);
- size_t out_len = (field_size + 7) / 8;
- bs = ArrayBuffer::NewBackingStore(env->isolate(), out_len);
- }
+ int field_size = EC_GROUP_get_degree(ecdh->group_);
+ size_t out_len = (field_size + 7) / 8;
+ auto bs = ArrayBuffer::NewBackingStore(
+ env->isolate(), out_len, BackingStoreInitializationMode::kUninitialized);
if (!ECDH_compute_key(
bs->Data(), bs->ByteLength(), pub, ecdh->key_.get(), nullptr))
@@ -257,12 +254,10 @@ void ECDH::GetPrivateKey(const FunctionCallbackInfo<Value>& args) {
return THROW_ERR_CRYPTO_OPERATION_FAILED(env,
"Failed to get ECDH private key");
- std::unique_ptr<BackingStore> bs;
- {
- NoArrayBufferZeroFillScope no_zero_fill_scope(env->isolate_data());
- bs = ArrayBuffer::NewBackingStore(env->isolate(),
- BignumPointer::GetByteCount(b));
- }
+ auto bs = ArrayBuffer::NewBackingStore(
+ env->isolate(),
+ BignumPointer::GetByteCount(b),
+ BackingStoreInitializationMode::kUninitialized);
CHECK_EQ(bs->ByteLength(),
BignumPointer::EncodePaddedInto(
b, static_cast<unsigned char*>(bs->Data()), bs->ByteLength()));
diff --git a/src/crypto/crypto_rsa.cc b/src/crypto/crypto_rsa.cc
index 1f2fccce6ed8f14525557644e0bdd130eedf3337..1099a8f89bb53083f01942ee14fff453d8cdc0af 100644
--- a/src/crypto/crypto_rsa.cc
+++ b/src/crypto/crypto_rsa.cc
@@ -20,6 +20,7 @@ using ncrypto::EVPKeyPointer;
using ncrypto::RSAPointer;
using v8::ArrayBuffer;
using v8::BackingStore;
+using v8::BackingStoreInitializationMode;
using v8::FunctionCallbackInfo;
using v8::Int32;
using v8::JustVoid;
@@ -535,12 +536,10 @@ Maybe<void> GetRsaKeyDetail(Environment* env,
return Nothing<void>();
}
- std::unique_ptr<BackingStore> public_exponent;
- {
- NoArrayBufferZeroFillScope no_zero_fill_scope(env->isolate_data());
- public_exponent = ArrayBuffer::NewBackingStore(
- env->isolate(), BignumPointer::GetByteCount(e));
- }
+ auto public_exponent = ArrayBuffer::NewBackingStore(
+ env->isolate(),
+ BignumPointer::GetByteCount(e),
+ BackingStoreInitializationMode::kUninitialized);
CHECK_EQ(BignumPointer::EncodePaddedInto(
e,
static_cast<unsigned char*>(public_exponent->Data()),
diff --git a/src/crypto/crypto_sig.cc b/src/crypto/crypto_sig.cc
index 2f6e683e3497d4315259773d09443e5215bff28f..c33e93c34ef32c18e6de6bc03698ed6b851b4aa3 100644
--- a/src/crypto/crypto_sig.cc
+++ b/src/crypto/crypto_sig.cc
@@ -21,6 +21,7 @@ using ncrypto::EVPKeyPointer;
using ncrypto::EVPMDCtxPointer;
using v8::ArrayBuffer;
using v8::BackingStore;
+using v8::BackingStoreInitializationMode;
using v8::Boolean;
using v8::FunctionCallbackInfo;
using v8::FunctionTemplate;
@@ -92,11 +93,8 @@ std::unique_ptr<BackingStore> Node_SignFinal(Environment* env,
return nullptr;
size_t sig_len = pkey.size();
- std::unique_ptr<BackingStore> sig;
- {
- NoArrayBufferZeroFillScope no_zero_fill_scope(env->isolate_data());
- sig = ArrayBuffer::NewBackingStore(env->isolate(), sig_len);
- }
+ auto sig = ArrayBuffer::NewBackingStore(
+ env->isolate(), sig_len, BackingStoreInitializationMode::kUninitialized);
EVPKeyCtxPointer pkctx = pkey.newCtx();
if (pkctx && EVP_PKEY_sign_init(pkctx.get()) > 0 &&
ApplyRSAOptions(pkey, pkctx.get(), padding, pss_salt_len) &&
@@ -168,11 +166,9 @@ std::unique_ptr<BackingStore> ConvertSignatureToP1363(
if (n == kNoDsaSignature)
return std::move(signature);
- std::unique_ptr<BackingStore> buf;
- {
- NoArrayBufferZeroFillScope no_zero_fill_scope(env->isolate_data());
- buf = ArrayBuffer::NewBackingStore(env->isolate(), 2 * n);
- }
+ auto buf = ArrayBuffer::NewBackingStore(
+ env->isolate(), 2 * n, BackingStoreInitializationMode::kUninitialized);
+
if (!ExtractP1363(static_cast<unsigned char*>(signature->Data()),
static_cast<unsigned char*>(buf->Data()),
signature->ByteLength(), n))
diff --git a/src/crypto/crypto_tls.cc b/src/crypto/crypto_tls.cc
index a80685790bd29102d99929ff81f866e0aee370f1..24336b86f6f25533a7b668e9f9806a5635e3a398 100644
--- a/src/crypto/crypto_tls.cc
+++ b/src/crypto/crypto_tls.cc
@@ -46,6 +46,7 @@ using v8::Array;
using v8::ArrayBuffer;
using v8::ArrayBufferView;
using v8::BackingStore;
+using v8::BackingStoreInitializationMode;
using v8::Boolean;
using v8::Context;
using v8::DontDelete;
@@ -1087,10 +1088,10 @@ int TLSWrap::DoWrite(WriteWrap* w,
// and copying it when it could just be used.
if (nonempty_count != 1) {
- {
- NoArrayBufferZeroFillScope no_zero_fill_scope(env()->isolate_data());
- bs = ArrayBuffer::NewBackingStore(env()->isolate(), length);
- }
+ bs = ArrayBuffer::NewBackingStore(
+ env()->isolate(),
+ length,
+ BackingStoreInitializationMode::kUninitialized);
size_t offset = 0;
for (i = 0; i < count; i++) {
memcpy(static_cast<char*>(bs->Data()) + offset,
@@ -1107,8 +1108,10 @@ int TLSWrap::DoWrite(WriteWrap* w,
written = SSL_write(ssl_.get(), buf->base, buf->len);
if (written == -1) {
- NoArrayBufferZeroFillScope no_zero_fill_scope(env()->isolate_data());
- bs = ArrayBuffer::NewBackingStore(env()->isolate(), length);
+ bs = ArrayBuffer::NewBackingStore(
+ env()->isolate(),
+ length,
+ BackingStoreInitializationMode::kUninitialized);
memcpy(bs->Data(), buf->base, buf->len);
}
}
@@ -1746,11 +1749,8 @@ void TLSWrap::GetFinished(const FunctionCallbackInfo<Value>& args) {
if (len == 0)
return;
- std::unique_ptr<BackingStore> bs;
- {
- NoArrayBufferZeroFillScope no_zero_fill_scope(env->isolate_data());
- bs = ArrayBuffer::NewBackingStore(env->isolate(), len);
- }
+ auto bs = ArrayBuffer::NewBackingStore(
+ env->isolate(), len, BackingStoreInitializationMode::kUninitialized);
CHECK_EQ(bs->ByteLength(),
SSL_get_finished(w->ssl_.get(), bs->Data(), bs->ByteLength()));
@@ -1777,11 +1777,8 @@ void TLSWrap::GetPeerFinished(const FunctionCallbackInfo<Value>& args) {
if (len == 0)
return;
- std::unique_ptr<BackingStore> bs;
- {
- NoArrayBufferZeroFillScope no_zero_fill_scope(env->isolate_data());
- bs = ArrayBuffer::NewBackingStore(env->isolate(), len);
- }
+ auto bs = ArrayBuffer::NewBackingStore(
+ env->isolate(), len, BackingStoreInitializationMode::kUninitialized);
CHECK_EQ(bs->ByteLength(),
SSL_get_peer_finished(w->ssl_.get(), bs->Data(), bs->ByteLength()));
@@ -1806,11 +1803,8 @@ void TLSWrap::GetSession(const FunctionCallbackInfo<Value>& args) {
if (slen <= 0)
return; // Invalid or malformed session.
- std::unique_ptr<BackingStore> bs;
- {
- NoArrayBufferZeroFillScope no_zero_fill_scope(env->isolate_data());
- bs = ArrayBuffer::NewBackingStore(env->isolate(), slen);
- }
+ auto bs = ArrayBuffer::NewBackingStore(
+ env->isolate(), slen, BackingStoreInitializationMode::kUninitialized);
unsigned char* p = static_cast<unsigned char*>(bs->Data());
CHECK_LT(0, i2d_SSL_SESSION(sess, &p));
@@ -1993,11 +1987,8 @@ void TLSWrap::ExportKeyingMaterial(const FunctionCallbackInfo<Value>& args) {
uint32_t olen = args[0].As<Uint32>()->Value();
Utf8Value label(env->isolate(), args[1]);
- std::unique_ptr<BackingStore> bs;
- {
- NoArrayBufferZeroFillScope no_zero_fill_scope(env->isolate_data());
- bs = ArrayBuffer::NewBackingStore(env->isolate(), olen);
- }
+ auto bs = ArrayBuffer::NewBackingStore(
+ env->isolate(), olen, BackingStoreInitializationMode::kUninitialized);
ByteSource context;
bool use_context = !args[2]->IsUndefined();
diff --git a/src/crypto/crypto_x509.cc b/src/crypto/crypto_x509.cc
index eb6dad44a49d997097c8fb5009eeb60a7305da27..0a1a3c694f3f544ada62235ade6a03a9deadfede 100644
--- a/src/crypto/crypto_x509.cc
+++ b/src/crypto/crypto_x509.cc
@@ -28,6 +28,7 @@ using v8::Array;
using v8::ArrayBuffer;
using v8::ArrayBufferView;
using v8::BackingStore;
+using v8::BackingStoreInitializationMode;
using v8::Boolean;
using v8::Context;
using v8::Date;
@@ -683,11 +684,8 @@ MaybeLocal<Object> GetPubKey(Environment* env, OSSL3_CONST RSA* rsa) {
int size = i2d_RSA_PUBKEY(rsa, nullptr);
CHECK_GE(size, 0);
- std::unique_ptr<BackingStore> bs;
- {
- NoArrayBufferZeroFillScope no_zero_fill_scope(env->isolate_data());
- bs = ArrayBuffer::NewBackingStore(env->isolate(), size);
- }
+ auto bs = ArrayBuffer::NewBackingStore(
+ env->isolate(), size, BackingStoreInitializationMode::kUninitialized);
unsigned char* serialized = reinterpret_cast<unsigned char*>(bs->Data());
CHECK_GE(i2d_RSA_PUBKEY(rsa, &serialized), 0);
diff --git a/src/encoding_binding.cc b/src/encoding_binding.cc
index 31ed995714bb99ab534f26ba9ebc6051c258a1c9..9bdb2a660364c66f3f141b505225dcf35c5bbc65 100644
--- a/src/encoding_binding.cc
+++ b/src/encoding_binding.cc
@@ -15,6 +15,7 @@ namespace encoding_binding {
using v8::ArrayBuffer;
using v8::BackingStore;
+using v8::BackingStoreInitializationMode;
using v8::Context;
using v8::FunctionCallbackInfo;
using v8::Isolate;
@@ -123,9 +124,8 @@ void BindingData::EncodeUtf8String(const FunctionCallbackInfo<Value>& args) {
Local<ArrayBuffer> ab;
{
- NoArrayBufferZeroFillScope no_zero_fill_scope(env->isolate_data());
- std::unique_ptr<BackingStore> bs =
- ArrayBuffer::NewBackingStore(isolate, length);
+ std::unique_ptr<BackingStore> bs = ArrayBuffer::NewBackingStore(
+ isolate, length, BackingStoreInitializationMode::kUninitialized);
CHECK(bs);
diff --git a/src/env-inl.h b/src/env-inl.h
index 98e1e1e75bae94038bba0049447ab48b0acfb8cc..fe395bf89f9c1e5bb2dabc8fceda7b9b2b877415 100644
--- a/src/env-inl.h
+++ b/src/env-inl.h
@@ -44,16 +44,6 @@
namespace node {
-NoArrayBufferZeroFillScope::NoArrayBufferZeroFillScope(
- IsolateData* isolate_data)
- : node_allocator_(isolate_data->node_allocator()) {
- if (node_allocator_ != nullptr) node_allocator_->zero_fill_field()[0] = 0;
-}
-
-NoArrayBufferZeroFillScope::~NoArrayBufferZeroFillScope() {
- if (node_allocator_ != nullptr) node_allocator_->zero_fill_field()[0] = 1;
-}
-
inline v8::Isolate* IsolateData::isolate() const {
return isolate_;
}
diff --git a/src/env.cc b/src/env.cc
index 5fa667382bc957aee800d612f78b18c37a58c67f..49a85e9a23e50f201d0b93d8b205e104c0f0fe2c 100644
--- a/src/env.cc
+++ b/src/env.cc
@@ -39,6 +39,9 @@ namespace node {
using errors::TryCatchScope;
using v8::Array;
+using v8::ArrayBuffer;
+using v8::BackingStore;
+using v8::BackingStoreInitializationMode;
using v8::Boolean;
using v8::Context;
using v8::CppHeap;
@@ -724,9 +727,10 @@ void Environment::add_refs(int64_t diff) {
}
uv_buf_t Environment::allocate_managed_buffer(const size_t suggested_size) {
- NoArrayBufferZeroFillScope no_zero_fill_scope(isolate_data());
- std::unique_ptr<v8::BackingStore> bs =
- v8::ArrayBuffer::NewBackingStore(isolate(), suggested_size);
+ std::unique_ptr<BackingStore> bs = ArrayBuffer::NewBackingStore(
+ isolate(),
+ suggested_size,
+ BackingStoreInitializationMode::kUninitialized);
uv_buf_t buf = uv_buf_init(static_cast<char*>(bs->Data()), bs->ByteLength());
released_allocated_buffers_.emplace(buf.base, std::move(bs));
return buf;
diff --git a/src/env.h b/src/env.h
index c346e3a9c827993036438685d758a734f9ce8c05..28c8df87c8e2f06e2ed8c554260bfdedb860bb4a 100644
--- a/src/env.h
+++ b/src/env.h
@@ -114,19 +114,6 @@ class ModuleWrap;
class Environment;
class Realm;
-// Disables zero-filling for ArrayBuffer allocations in this scope. This is
-// similar to how we implement Buffer.allocUnsafe() in JS land.
-class NoArrayBufferZeroFillScope {
- public:
- inline explicit NoArrayBufferZeroFillScope(IsolateData* isolate_data);
- inline ~NoArrayBufferZeroFillScope();
-
- private:
- NodeArrayBufferAllocator* node_allocator_;
-
- friend class Environment;
-};
-
struct IsolateDataSerializeInfo {
std::vector<SnapshotIndex> primitive_values;
std::vector<PropInfo> template_values;
diff --git a/src/node_buffer.cc b/src/node_buffer.cc
index 8a5b6b57321c2843a965a7e51b2ebed991a1e424..d7bd35a1025487d7a810b14cd5d3f2ba74cc2378 100644
--- a/src/node_buffer.cc
+++ b/src/node_buffer.cc
@@ -66,6 +66,7 @@ namespace Buffer {
using v8::ArrayBuffer;
using v8::ArrayBufferView;
using v8::BackingStore;
+using v8::BackingStoreInitializationMode;
using v8::Context;
using v8::EscapableHandleScope;
using v8::FunctionCallbackInfo;
@@ -375,9 +376,8 @@ MaybeLocal<Object> New(Environment* env, size_t length) {
Local<ArrayBuffer> ab;
{
- NoArrayBufferZeroFillScope no_zero_fill_scope(env->isolate_data());
- std::unique_ptr<BackingStore> bs =
- ArrayBuffer::NewBackingStore(isolate, length);
+ std::unique_ptr<BackingStore> bs = ArrayBuffer::NewBackingStore(
+ isolate, length, BackingStoreInitializationMode::kUninitialized);
CHECK(bs);
@@ -416,18 +416,14 @@ MaybeLocal<Object> Copy(Environment* env, const char* data, size_t length) {
return Local<Object>();
}
- Local<ArrayBuffer> ab;
- {
- NoArrayBufferZeroFillScope no_zero_fill_scope(env->isolate_data());
- std::unique_ptr<BackingStore> bs =
- ArrayBuffer::NewBackingStore(isolate, length);
+ std::unique_ptr<BackingStore> bs = ArrayBuffer::NewBackingStore(
+ isolate, length, BackingStoreInitializationMode::kUninitialized);
- CHECK(bs);
+ CHECK(bs);
- memcpy(bs->Data(), data, length);
+ memcpy(bs->Data(), data, length);
- ab = ArrayBuffer::New(isolate, std::move(bs));
- }
+ Local<ArrayBuffer> ab = ArrayBuffer::New(isolate, std::move(bs));
MaybeLocal<Object> obj =
New(env, ab, 0, ab->ByteLength())
@@ -1439,14 +1435,16 @@ void CreateUnsafeArrayBuffer(const FunctionCallbackInfo<Value>& args) {
Local<ArrayBuffer> buf;
- NodeArrayBufferAllocator* allocator = env->isolate_data()->node_allocator();
// 0-length, or zero-fill flag is set, or building snapshot
if (size == 0 || per_process::cli_options->zero_fill_all_buffers ||
- allocator == nullptr) {
+ env->isolate_data()->is_building_snapshot()) {
buf = ArrayBuffer::New(isolate, size);
} else {
- std::unique_ptr<BackingStore> store =
- ArrayBuffer::NewBackingStore(isolate, size);
+ std::unique_ptr<BackingStore> store = ArrayBuffer::NewBackingStore(
+ isolate,
+ size,
+ BackingStoreInitializationMode::kUninitialized,
+ v8::BackingStoreOnFailureMode::kReturnNull);
if (!store) {
return env->ThrowRangeError("Array buffer allocation failed");
}
diff --git a/src/node_http2.cc b/src/node_http2.cc
index 8237c9b7d325dd925ae8798d7795fcd94eeb13d0..a22cf6c4e33e5cf2d3168ce03dc35af8a9584af7 100644
--- a/src/node_http2.cc
+++ b/src/node_http2.cc
@@ -27,6 +27,7 @@ using v8::Array;
using v8::ArrayBuffer;
using v8::ArrayBufferView;
using v8::BackingStore;
+using v8::BackingStoreInitializationMode;
using v8::Boolean;
using v8::Context;
using v8::EscapableHandleScope;
@@ -298,11 +299,10 @@ Local<Value> Http2Settings::Pack(
size_t count,
const nghttp2_settings_entry* entries) {
EscapableHandleScope scope(env->isolate());
- std::unique_ptr<BackingStore> bs;
- {
- NoArrayBufferZeroFillScope no_zero_fill_scope(env->isolate_data());
- bs = ArrayBuffer::NewBackingStore(env->isolate(), count * 6);
- }
+ std::unique_ptr<BackingStore> bs = ArrayBuffer::NewBackingStore(
+ env->isolate(),
+ count * 6,
+ BackingStoreInitializationMode::kUninitialized);
if (nghttp2_pack_settings_payload(static_cast<uint8_t*>(bs->Data()),
bs->ByteLength(),
entries,
@@ -468,13 +468,11 @@ Origins::Origins(
return;
}
- {
- NoArrayBufferZeroFillScope no_zero_fill_scope(env->isolate_data());
- bs_ = ArrayBuffer::NewBackingStore(env->isolate(),
- alignof(nghttp2_origin_entry) - 1 +
- count_ * sizeof(nghttp2_origin_entry) +
- origin_string_len);
- }
+ bs_ = ArrayBuffer::NewBackingStore(
+ env->isolate(),
+ alignof(nghttp2_origin_entry) - 1 +
+ count_ * sizeof(nghttp2_origin_entry) + origin_string_len,
+ BackingStoreInitializationMode::kUninitialized);
// Make sure the start address is aligned appropriately for an nghttp2_nv*.
char* start = nbytes::AlignUp(static_cast<char*>(bs_->Data()),
@@ -2120,12 +2118,10 @@ void Http2Session::OnStreamRead(ssize_t nread, const uv_buf_t& buf_) {
// happen, we concatenate the data we received with the already-stored
// pending input data, slicing off the already processed part.
size_t pending_len = stream_buf_.len - stream_buf_offset_;
- std::unique_ptr<BackingStore> new_bs;
- {
- NoArrayBufferZeroFillScope no_zero_fill_scope(env()->isolate_data());
- new_bs = ArrayBuffer::NewBackingStore(env()->isolate(),
- pending_len + nread);
- }
+ std::unique_ptr<BackingStore> new_bs = ArrayBuffer::NewBackingStore(
+ env()->isolate(),
+ pending_len + nread,
+ BackingStoreInitializationMode::kUninitialized);
memcpy(static_cast<char*>(new_bs->Data()),
stream_buf_.base + stream_buf_offset_,
pending_len);
diff --git a/src/node_internals.h b/src/node_internals.h
index 12ea72b61b0a5e194207bb369dfed4b8667107cb..18844e18a32d6b07e62481138fa2342765643484 100644
--- a/src/node_internals.h
+++ b/src/node_internals.h
@@ -121,8 +121,6 @@ v8::MaybeLocal<v8::Object> InitializePrivateSymbols(
class NodeArrayBufferAllocator : public ArrayBufferAllocator {
public:
- inline uint32_t* zero_fill_field() { return &zero_fill_field_; }
-
void* Allocate(size_t size) override; // Defined in src/node.cc
void* AllocateUninitialized(size_t size) override;
void Free(void* data, size_t size) override;
@@ -139,7 +137,6 @@ class NodeArrayBufferAllocator : public ArrayBufferAllocator {
}
private:
- uint32_t zero_fill_field_ = 1; // Boolean but exposed as uint32 to JS land.
std::atomic<size_t> total_mem_usage_ {0};
// Delegate to V8's allocator for compatibility with the V8 memory cage.
diff --git a/src/stream_base.cc b/src/stream_base.cc
index fc81108120f0066f2d5dabedc74e22cb6c84d8e4..0bf2642599ee91e2d2aa20d6d066c80b3026ecc5 100644
--- a/src/stream_base.cc
+++ b/src/stream_base.cc
@@ -19,6 +19,7 @@ namespace node {
using v8::Array;
using v8::ArrayBuffer;
using v8::BackingStore;
+using v8::BackingStoreInitializationMode;
using v8::ConstructorBehavior;
using v8::Context;
using v8::DontDelete;
@@ -243,8 +244,8 @@ int StreamBase::Writev(const FunctionCallbackInfo<Value>& args) {
std::unique_ptr<BackingStore> bs;
if (storage_size > 0) {
- NoArrayBufferZeroFillScope no_zero_fill_scope(env->isolate_data());
- bs = ArrayBuffer::NewBackingStore(isolate, storage_size);
+ bs = ArrayBuffer::NewBackingStore(
+ isolate, storage_size, BackingStoreInitializationMode::kUninitialized);
}
offset = 0;
@@ -398,14 +399,14 @@ int StreamBase::WriteString(const FunctionCallbackInfo<Value>& args) {
if (try_write) {
// Copy partial data
- NoArrayBufferZeroFillScope no_zero_fill_scope(env->isolate_data());
- bs = ArrayBuffer::NewBackingStore(isolate, buf.len);
+ bs = ArrayBuffer::NewBackingStore(
+ isolate, buf.len, BackingStoreInitializationMode::kUninitialized);
memcpy(bs->Data(), buf.base, buf.len);
data_size = buf.len;
} else {
// Write it
- NoArrayBufferZeroFillScope no_zero_fill_scope(env->isolate_data());
- bs = ArrayBuffer::NewBackingStore(isolate, storage_size);
+ bs = ArrayBuffer::NewBackingStore(
+ isolate, storage_size, BackingStoreInitializationMode::kUninitialized);
data_size = StringBytes::Write(isolate,
static_cast<char*>(bs->Data()),
storage_size,


@@ -0,0 +1,276 @@
From 0000000000000000000000000000000000000000 Mon Sep 17 00:00:00 2001
From: James M Snell <jasnell@gmail.com>
Date: Sun, 18 May 2025 10:46:30 -0700
Subject: src: prepare for v8 sandboxing

PR-URL: https://github.com/nodejs/node/pull/58376
Reviewed-By: Yagiz Nizipli <yagiz@nizipli.com>
Reviewed-By: Chengzhong Wu <legendecas@gmail.com>
diff --git a/src/crypto/crypto_dh.cc b/src/crypto/crypto_dh.cc
index f23cedf4f2449d8edc9a8de1b70332e75d693cdd..5ac2b1a83688fe99b13c37bf375ca6e22497dc18 100644
--- a/src/crypto/crypto_dh.cc
+++ b/src/crypto/crypto_dh.cc
@@ -22,6 +22,8 @@ using ncrypto::DHPointer;
using ncrypto::EVPKeyCtxPointer;
using ncrypto::EVPKeyPointer;
using v8::ArrayBuffer;
+using v8::BackingStoreInitializationMode;
+using v8::BackingStoreOnFailureMode;
using v8::ConstructorBehavior;
using v8::Context;
using v8::DontDelete;
@@ -55,12 +57,27 @@ void DiffieHellman::MemoryInfo(MemoryTracker* tracker) const {
namespace {
MaybeLocal<Value> DataPointerToBuffer(Environment* env, DataPointer&& data) {
+#ifdef V8_ENABLE_SANDBOX
+ auto backing = ArrayBuffer::NewBackingStore(
+ env->isolate(),
+ data.size(),
+ BackingStoreInitializationMode::kUninitialized,
+ BackingStoreOnFailureMode::kReturnNull);
+ if (!backing) {
+ THROW_ERR_MEMORY_ALLOCATION_FAILED(env);
+ return MaybeLocal<Value>();
+ }
+ if (data.size() > 0) {
+ memcpy(backing->Data(), data.get(), data.size());
+ }
+#else
auto backing = ArrayBuffer::NewBackingStore(
data.get(),
data.size(),
[](void* data, size_t len, void* ptr) { DataPointer free_me(data, len); },
nullptr);
data.release();
+#endif // V8_ENABLE_SANDBOX
auto ab = ArrayBuffer::New(env->isolate(), std::move(backing));
return Buffer::New(env, ab, 0, ab->ByteLength()).FromMaybe(Local<Value>());
diff --git a/src/crypto/crypto_util.cc b/src/crypto/crypto_util.cc
index eab18ab9888e2f7c0757fefab80505d8c99dc742..7ecf810ea0f4106c7c44593dae1b0a3cf0673380 100644
--- a/src/crypto/crypto_util.cc
+++ b/src/crypto/crypto_util.cc
@@ -37,6 +37,8 @@ using ncrypto::SSLCtxPointer;
using ncrypto::SSLPointer;
using v8::ArrayBuffer;
using v8::BackingStore;
+using v8::BackingStoreInitializationMode;
+using v8::BackingStoreOnFailureMode;
using v8::BigInt;
using v8::Context;
using v8::Exception;
@@ -359,34 +361,29 @@ ByteSource& ByteSource::operator=(ByteSource&& other) noexcept {
return *this;
}
-std::unique_ptr<BackingStore> ByteSource::ReleaseToBackingStore(Environment* env) {
+std::unique_ptr<BackingStore> ByteSource::ReleaseToBackingStore(
+ Environment* env) {
// It's ok for allocated_data_ to be nullptr but
// only if size_ is zero.
CHECK_IMPLIES(size_ > 0, allocated_data_ != nullptr);
-#if defined(V8_ENABLE_SANDBOX)
- // When V8 sandboxed pointers are enabled, we have to copy into the memory
- // cage. We still want to ensure we erase the data on free though, so
- // provide a custom deleter that calls OPENSSL_cleanse.
- if (!size())
- return ArrayBuffer::NewBackingStore(env->isolate(), 0);
- std::unique_ptr<ArrayBuffer::Allocator> allocator(ArrayBuffer::Allocator::NewDefaultAllocator());
- void* v8_data = allocator->Allocate(size());
- CHECK(v8_data);
- memcpy(v8_data, allocated_data_, size());
- OPENSSL_clear_free(allocated_data_, size());
+#ifdef V8_ENABLE_SANDBOX
+ // If the v8 sandbox is enabled, then all array buffers must be allocated
+ // via the isolate. External buffers are not allowed. So, instead of wrapping
+ // the allocated data we'll copy it instead.
+
+ // TODO(@jasnell): It would be nice to use an abstracted utility to do this
+ // branch instead of duplicating the V8_ENABLE_SANDBOX check each time.
std::unique_ptr<BackingStore> ptr = ArrayBuffer::NewBackingStore(
- v8_data,
+ env->isolate(),
size(),
- [](void* data, size_t length, void*) {
- OPENSSL_cleanse(data, length);
- std::unique_ptr<ArrayBuffer::Allocator> allocator(ArrayBuffer::Allocator::NewDefaultAllocator());
- allocator->Free(data, length);
- }, nullptr);
- CHECK(ptr);
- allocated_data_ = nullptr;
- data_ = nullptr;
- size_ = 0;
- return ptr;
+ BackingStoreInitializationMode::kUninitialized,
+ BackingStoreOnFailureMode::kReturnNull);
+ if (!ptr) {
+ THROW_ERR_MEMORY_ALLOCATION_FAILED(env);
+ return nullptr;
+ }
+ memcpy(ptr->Data(), allocated_data_, size());
+ OPENSSL_clear_free(allocated_data_, size_);
#else
std::unique_ptr<BackingStore> ptr = ArrayBuffer::NewBackingStore(
allocated_data_,
@@ -394,12 +391,12 @@ std::unique_ptr<BackingStore> ByteSource::ReleaseToBackingStore(Environment* env
[](void* data, size_t length, void* deleter_data) {
OPENSSL_clear_free(deleter_data, length);
}, allocated_data_);
+#endif // V8_ENABLE_SANDBOX
CHECK(ptr);
allocated_data_ = nullptr;
data_ = nullptr;
size_ = 0;
return ptr;
-#endif // defined(V8_ENABLE_SANDBOX)
}
Local<ArrayBuffer> ByteSource::ToArrayBuffer(Environment* env) {
@@ -711,8 +708,19 @@ void SecureBuffer(const FunctionCallbackInfo<Value>& args) {
}
#else
void SecureBuffer(const FunctionCallbackInfo<Value>& args) {
- CHECK(args[0]->IsUint32());
Environment* env = Environment::GetCurrent(args);
+#ifdef V8_ENABLE_SANDBOX
+ // The v8 sandbox is enabled, so we cannot use the secure heap because
+ // the sandbox requires that all array buffers be allocated via the isolate.
+ // That is fundamentally incompatible with the secure heap which allocates
+ // in openssl's secure heap area. Instead we'll just throw an error here.
+ //
+ // That said, we really shouldn't get here in the first place since the
+ // option to enable the secure heap is only available when the sandbox
+ // is disabled.
+ UNREACHABLE();
+#else
+ CHECK(args[0]->IsUint32());
uint32_t len = args[0].As<Uint32>()->Value();
void* data = OPENSSL_malloc(len);
if (data == nullptr) {
@@ -730,6 +738,7 @@ void SecureBuffer(const FunctionCallbackInfo<Value>& args) {
data);
Local<ArrayBuffer> buffer = ArrayBuffer::New(env->isolate(), store);
args.GetReturnValue().Set(Uint8Array::New(buffer, 0, len));
+#endif // V8_ENABLE_SANDBOX
}
#endif // defined(V8_ENABLE_SANDBOX)
diff --git a/src/crypto/crypto_x509.cc b/src/crypto/crypto_x509.cc
index 0a1a3c694f3f544ada62235ade6a03a9deadfede..0ac2379c0899f1080dd325d492496555c5e1c6af 100644
--- a/src/crypto/crypto_x509.cc
+++ b/src/crypto/crypto_x509.cc
@@ -29,6 +29,7 @@ using v8::ArrayBuffer;
using v8::ArrayBufferView;
using v8::BackingStore;
using v8::BackingStoreInitializationMode;
+using v8::BackingStoreOnFailureMode;
using v8::Boolean;
using v8::Context;
using v8::Date;
@@ -168,18 +169,20 @@ MaybeLocal<Value> ToV8Value(Local<Context> context, const BIOPointer& bio) {
MaybeLocal<Value> ToBuffer(Environment* env, BIOPointer* bio) {
if (bio == nullptr || !*bio) return {};
BUF_MEM* mem = *bio;
-#if defined(V8_ENABLE_SANDBOX)
- std::unique_ptr<ArrayBuffer::Allocator> allocator(ArrayBuffer::Allocator::NewDefaultAllocator());
- void* v8_data = allocator->Allocate(mem->length);
- CHECK(v8_data);
- memcpy(v8_data, mem->data, mem->length);
- std::unique_ptr<v8::BackingStore> backing = ArrayBuffer::NewBackingStore(
- v8_data,
+#ifdef V8_ENABLE_SANDBOX
+ // If the v8 sandbox is enabled, then all array buffers must be allocated
+ // via the isolate. External buffers are not allowed. So, instead of wrapping
+ // the BIOPointer we'll copy it instead.
+ auto backing = ArrayBuffer::NewBackingStore(
+ env->isolate(),
mem->length,
- [](void* data, size_t length, void*) {
- std::unique_ptr<ArrayBuffer::Allocator> allocator(ArrayBuffer::Allocator::NewDefaultAllocator());
- allocator->Free(data, length);
- }, nullptr);
+ BackingStoreInitializationMode::kUninitialized,
+ BackingStoreOnFailureMode::kReturnNull);
+ if (!backing) {
+ THROW_ERR_MEMORY_ALLOCATION_FAILED(env);
+ return MaybeLocal<Value>();
+ }
+ memcpy(backing->Data(), mem->data, mem->length);
#else
auto backing = ArrayBuffer::NewBackingStore(
mem->data,
@@ -188,8 +191,7 @@ MaybeLocal<Value> ToBuffer(Environment* env, BIOPointer* bio) {
BIOPointer free_me(static_cast<BIO*>(data));
},
bio->release());
-#endif
-
+#endif // V8_ENABLE_SANDBOX
auto ab = ArrayBuffer::New(env->isolate(), std::move(backing));
Local<Value> ret;
if (!Buffer::New(env, ab, 0, ab->ByteLength()).ToLocal(&ret)) return {};
diff --git a/src/js_native_api_v8.cc b/src/js_native_api_v8.cc
index 7b2efa49468c0bed2f5935552addd3ab37d0a50b..97eb62047b6692d63deacf2e7346e85351337e85 100644
--- a/src/js_native_api_v8.cc
+++ b/src/js_native_api_v8.cc
@@ -114,7 +114,7 @@ napi_status NewExternalString(napi_env env,
CHECK_NEW_STRING_ARGS(env, str, length, result);
napi_status status;
-#if defined(V8_ENABLE_SANDBOX)
+#ifdef V8_ENABLE_SANDBOX
status = create_api(env, str, length, result);
if (status == napi_ok) {
if (copied != nullptr) {
diff --git a/src/node_api.cc b/src/node_api.cc
index 2769997f0ede0e921fcb8826942609e497e5f9cb..d9b17780f6143f1c3f8488a20144376963e43fbc 100644
--- a/src/node_api.cc
+++ b/src/node_api.cc
@@ -1056,7 +1056,7 @@ napi_create_external_buffer(napi_env env,
NAPI_PREAMBLE(env);
CHECK_ARG(env, result);
-#if defined(V8_ENABLE_SANDBOX)
+#ifdef V8_ENABLE_SANDBOX
return napi_set_last_error(env, napi_no_external_buffers_allowed);
#else
v8::Isolate* isolate = env->isolate;
diff --git a/src/node_options.cc b/src/node_options.cc
index b067685822dc056e446e1a9402a5a6cba86cc722..cf70816d3ebacee1aaec6d465e6ebdc5f1dec5c3 100644
--- a/src/node_options.cc
+++ b/src/node_options.cc
@@ -83,6 +83,8 @@ void PerProcessOptions::CheckOptions(std::vector<std::string>* errors,
}
// Any value less than 2 disables use of the secure heap.
+#ifndef V8_ENABLE_SANDBOX
+ // The secure heap is not supported when V8_ENABLE_SANDBOX is enabled.
if (secure_heap >= 2) {
if ((secure_heap & (secure_heap - 1)) != 0)
errors->push_back("--secure-heap must be a power of 2");
@@ -95,6 +97,7 @@ void PerProcessOptions::CheckOptions(std::vector<std::string>* errors,
if ((secure_heap_min & (secure_heap_min - 1)) != 0)
errors->push_back("--secure-heap-min must be a power of 2");
}
+#endif // V8_ENABLE_SANDBOX
#endif // HAVE_OPENSSL
if (use_largepages != "off" &&
@@ -1243,6 +1246,7 @@ PerProcessOptionsParser::PerProcessOptionsParser(
"force FIPS crypto (cannot be disabled)",
&PerProcessOptions::force_fips_crypto,
kAllowedInEnvvar);
+#ifndef V8_ENABLE_SANDBOX
AddOption("--secure-heap",
"total size of the OpenSSL secure heap",
&PerProcessOptions::secure_heap,
@@ -1251,6 +1255,7 @@ PerProcessOptionsParser::PerProcessOptionsParser(
"minimum allocation size from the OpenSSL secure heap",
&PerProcessOptions::secure_heap_min,
kAllowedInEnvvar);
+#endif // V8_ENABLE_SANDBOX
#endif // HAVE_OPENSSL
#if OPENSSL_VERSION_MAJOR >= 3
AddOption("--openssl-legacy-provider",


@@ -16,7 +16,7 @@ patch:
(cherry picked from commit 30329d06235a9f9733b1d4da479b403462d1b326)
diff --git a/src/env-inl.h b/src/env-inl.h
index 67b4cc2037b8e02f6382cd12a7abb157d0dbac65..4906c6c4c0ab5260d6e6387d0ed8e0687f982a38 100644
index da2c468f11cdc320cfec794b1b8b24904b93491e..98e1e1e75bae94038bba0049447ab48b0acfb8cc 100644
--- a/src/env-inl.h
+++ b/src/env-inl.h
@@ -62,31 +62,6 @@ inline uv_loop_t* IsolateData::event_loop() const {


@@ -26,7 +26,7 @@ index dbc46400501b61814d5be0ec1cb01b0dcd94e1d0..fe669d40c31a29334b047b9cfee3067f
}
diff --git a/src/histogram.cc b/src/histogram.cc
index b655808e43d7c700ddeab7690e287bdbc9bfa50a..b0f7ae4e3af652c6dfe09f66d88485c5783f4037 100644
index 4a67b4a725ff768cbd10aef72a84af311d6ec9ec..ec7d5b1120724b7c752f92a3e3124a33cf51c0f5 100644
--- a/src/histogram.cc
+++ b/src/histogram.cc
@@ -187,7 +187,8 @@ void HistogramBase::FastRecord(Local<Value> unused,
@@ -40,10 +40,10 @@ index b655808e43d7c700ddeab7690e287bdbc9bfa50a..b0f7ae4e3af652c6dfe09f66d88485c5
}
HistogramBase* histogram;
diff --git a/src/node_wasi.cc b/src/node_wasi.cc
index 090866960beb8f1759c99e95536924b8b61fb723..3f91b651b83a20e70d5b368e012f5ee4b9d16092 100644
index 85e549e4592e1d718eced31ae165dee250149b08..9b5ada71c174567498c4902259d97f9d11fefb91 100644
--- a/src/node_wasi.cc
+++ b/src/node_wasi.cc
@@ -275,17 +275,19 @@ R WASI::WasiFunction<FT, F, R, Args...>::FastCallback(
@@ -274,17 +274,19 @@ R WASI::WasiFunction<FT, F, R, Args...>::FastCallback(
return EinvalError<R>();
}


@@ -0,0 +1,308 @@
From 0000000000000000000000000000000000000000 Mon Sep 17 00:00:00 2001
From: Fedor Indutny <238531+indutny@users.noreply.github.com>
Date: Fri, 7 Nov 2025 19:41:44 -0800
Subject: src: use CP_UTF8 for wide file names on win32
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
`src/node_modules.cc` needs to be consistent with `src/node_file.cc` in
how it translates UTF-8 strings to `std::wstring`; otherwise we might
end up in a situation where we can read the source code of an imported
package from disk but fail to recognize it as ESM (or CJS), causing
runtime errors. This type of error is possible on Windows when the
path contains Unicode characters and "Language for non-Unicode programs"
is set to "Chinese (Traditional, Taiwan)".
See: #58768
PR-URL: https://github.com/nodejs/node/pull/60575
Reviewed-By: Anna Henningsen <anna@addaleax.net>
Reviewed-By: Darshan Sen <raisinten@gmail.com>
Reviewed-By: Stefan Stojanovic <stefan.stojanovic@janeasystems.com>
Reviewed-By: Juan José Arboleda <soyjuanarbol@gmail.com>
Reviewed-By: Joyee Cheung <joyeec9h3@gmail.com>
Reviewed-By: Rafael Gonzaga <rafael.nunu@hotmail.com>
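The fix below hinges on always widening UTF-8 path bytes with `CP_UTF8` rather than the active ANSI code page (`GetACP()`). A minimal, platform-free sketch of why the code page matters (the decoder here is illustrative, not Node's code, and handles only 1- and 2-byte sequences):

```cpp
#include <cassert>
#include <string>

// Correct UTF-8 widening: multi-byte sequences collapse into one wide char.
// Only 1- and 2-byte sequences are handled in this sketch.
std::wstring widen_utf8(const std::string& s) {
  std::wstring out;
  for (size_t i = 0; i < s.size();) {
    unsigned char b = s[i];
    if (b < 0x80) {
      out.push_back(b);
      i += 1;
    } else if ((b >> 5) == 0x6) {  // 110xxxxx: 2-byte sequence
      out.push_back(((b & 0x1F) << 6) | (s[i + 1] & 0x3F));
      i += 2;
    } else {
      i += 1;  // 3/4-byte sequences elided in this sketch
    }
  }
  return out;
}

// What a single-byte "ANSI" interpretation does: one byte -> one wide char.
std::wstring widen_ansi_like(const std::string& s) {
  return std::wstring(s.begin(), s.end());
}
```

For U+00E9 ('é', UTF-8 bytes `0xC3 0xA9`) the two functions disagree on both length and content, so a `std::filesystem::path` built one way never compares equal to the same path built the other way.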
diff --git a/src/node_file.cc b/src/node_file.cc
index 5de3ebb04b12286a07e3041d0a6dd1cc9072e76a..52075b7d21c7c4674ccd81698ea5595c95698120 100644
--- a/src/node_file.cc
+++ b/src/node_file.cc
@@ -3056,42 +3056,6 @@ static void GetFormatOfExtensionlessFile(
return args.GetReturnValue().Set(EXTENSIONLESS_FORMAT_JAVASCRIPT);
}
-#ifdef _WIN32
-#define BufferValueToPath(str) \
- std::filesystem::path(ConvertToWideString(str.ToString(), CP_UTF8))
-
-std::string ConvertWideToUTF8(const std::wstring& wstr) {
- if (wstr.empty()) return std::string();
-
- int size_needed = WideCharToMultiByte(CP_UTF8,
- 0,
- &wstr[0],
- static_cast<int>(wstr.size()),
- nullptr,
- 0,
- nullptr,
- nullptr);
- std::string strTo(size_needed, 0);
- WideCharToMultiByte(CP_UTF8,
- 0,
- &wstr[0],
- static_cast<int>(wstr.size()),
- &strTo[0],
- size_needed,
- nullptr,
- nullptr);
- return strTo;
-}
-
-#define PathToString(path) ConvertWideToUTF8(path.wstring());
-
-#else // _WIN32
-
-#define BufferValueToPath(str) std::filesystem::path(str.ToStringView());
-#define PathToString(path) path.native();
-
-#endif // _WIN32
-
static void CpSyncCheckPaths(const FunctionCallbackInfo<Value>& args) {
Environment* env = Environment::GetCurrent(args);
Isolate* isolate = env->isolate();
@@ -3104,7 +3068,7 @@ static void CpSyncCheckPaths(const FunctionCallbackInfo<Value>& args) {
THROW_IF_INSUFFICIENT_PERMISSIONS(
env, permission::PermissionScope::kFileSystemRead, src.ToStringView());
- auto src_path = BufferValueToPath(src);
+ auto src_path = src.ToPath();
BufferValue dest(isolate, args[1]);
CHECK_NOT_NULL(*dest);
@@ -3112,7 +3076,7 @@ static void CpSyncCheckPaths(const FunctionCallbackInfo<Value>& args) {
THROW_IF_INSUFFICIENT_PERMISSIONS(
env, permission::PermissionScope::kFileSystemWrite, dest.ToStringView());
- auto dest_path = BufferValueToPath(dest);
+ auto dest_path = dest.ToPath();
bool dereference = args[2]->IsTrue();
bool recursive = args[3]->IsTrue();
@@ -3141,8 +3105,8 @@ static void CpSyncCheckPaths(const FunctionCallbackInfo<Value>& args) {
(src_status.type() == std::filesystem::file_type::directory) ||
(dereference && src_status.type() == std::filesystem::file_type::symlink);
- auto src_path_str = PathToString(src_path);
- auto dest_path_str = PathToString(dest_path);
+ auto src_path_str = ConvertPathToUTF8(src_path);
+ auto dest_path_str = ConvertPathToUTF8(dest_path);
if (!error_code) {
// Check if src and dest are identical.
@@ -3237,7 +3201,7 @@ static bool CopyUtimes(const std::filesystem::path& src,
uv_fs_t req;
auto cleanup = OnScopeLeave([&req]() { uv_fs_req_cleanup(&req); });
- auto src_path_str = PathToString(src);
+ auto src_path_str = ConvertPathToUTF8(src);
int result = uv_fs_stat(nullptr, &req, src_path_str.c_str(), nullptr);
if (is_uv_error(result)) {
env->ThrowUVException(result, "stat", nullptr, src_path_str.c_str());
@@ -3248,7 +3212,7 @@ static bool CopyUtimes(const std::filesystem::path& src,
const double source_atime = s->st_atim.tv_sec + s->st_atim.tv_nsec / 1e9;
const double source_mtime = s->st_mtim.tv_sec + s->st_mtim.tv_nsec / 1e9;
- auto dest_file_path_str = PathToString(dest);
+ auto dest_file_path_str = ConvertPathToUTF8(dest);
int utime_result = uv_fs_utime(nullptr,
&req,
dest_file_path_str.c_str(),
@@ -3383,7 +3347,7 @@ static void CpSyncCopyDir(const FunctionCallbackInfo<Value>& args) {
std::error_code error;
for (auto dir_entry : std::filesystem::directory_iterator(src)) {
auto dest_file_path = dest / dir_entry.path().filename();
- auto dest_str = PathToString(dest);
+ auto dest_str = ConvertPathToUTF8(dest);
if (dir_entry.is_symlink()) {
if (verbatim_symlinks) {
@@ -3446,7 +3410,7 @@ static void CpSyncCopyDir(const FunctionCallbackInfo<Value>& args) {
}
} else if (std::filesystem::is_regular_file(dest_file_path)) {
if (!dereference || (!force && error_on_exist)) {
- auto dest_file_path_str = PathToString(dest_file_path);
+ auto dest_file_path_str = ConvertPathToUTF8(dest_file_path);
env->ThrowStdErrException(
std::make_error_code(std::errc::file_exists),
"cp",
diff --git a/src/node_modules.cc b/src/node_modules.cc
index 62050791174563f88b8629d576eed8959b3c2e20..0e820f8623ecf41008a938f16d11bd3206a7b0c1 100644
--- a/src/node_modules.cc
+++ b/src/node_modules.cc
@@ -327,22 +327,24 @@ const BindingData::PackageConfig* BindingData::TraverseParent(
// Stop the search when the process doesn't have permissions
// to walk upwards
- if (is_permissions_enabled &&
- !env->permission()->is_granted(
- env,
- permission::PermissionScope::kFileSystemRead,
- current_path.generic_string())) [[unlikely]] {
- return nullptr;
+ if (is_permissions_enabled) {
+ if (!env->permission()->is_granted(
+ env,
+ permission::PermissionScope::kFileSystemRead,
+ ConvertGenericPathToUTF8(current_path))) [[unlikely]] {
+ return nullptr;
+ }
}
// Check if the path ends with `/node_modules`
- if (current_path.generic_string().ends_with("/node_modules")) {
+ if (current_path.filename() == "node_modules") {
return nullptr;
}
auto package_json_path = current_path / "package.json";
+
auto package_json =
- GetPackageJSON(realm, package_json_path.string(), nullptr);
+ GetPackageJSON(realm, ConvertPathToUTF8(package_json_path), nullptr);
if (package_json != nullptr) {
return package_json;
}
@@ -364,20 +366,12 @@ void BindingData::GetNearestParentPackageJSONType(
ToNamespacedPath(realm->env(), &path_value);
- std::string path_value_str = path_value.ToString();
+ auto path = path_value.ToPath();
+
if (slashCheck) {
- path_value_str.push_back(kPathSeparator);
+ path /= "";
}
- std::filesystem::path path;
-
-#ifdef _WIN32
- std::wstring wide_path = ConvertToWideString(path_value_str, GetACP());
- path = std::filesystem::path(wide_path);
-#else
- path = std::filesystem::path(path_value_str);
-#endif
-
auto package_json = TraverseParent(realm, path);
if (package_json == nullptr) {
diff --git a/src/util-inl.h b/src/util-inl.h
index 17b870e2dd91ab6affd1097d0a4f691d5a1d9d80..9ca2ff66d812b54a7e0f845f0f0fd63f7accbfe1 100644
--- a/src/util-inl.h
+++ b/src/util-inl.h
@@ -678,12 +678,11 @@ inline bool IsWindowsBatchFile(const char* filename) {
return !extension.empty() && (extension == "cmd" || extension == "bat");
}
-inline std::wstring ConvertToWideString(const std::string& str,
- UINT code_page) {
+inline std::wstring ConvertUTF8ToWideString(const std::string& str) {
int size_needed = MultiByteToWideChar(
- code_page, 0, &str[0], static_cast<int>(str.size()), nullptr, 0);
+ CP_UTF8, 0, &str[0], static_cast<int>(str.size()), nullptr, 0);
std::wstring wstrTo(size_needed, 0);
- MultiByteToWideChar(code_page,
+ MultiByteToWideChar(CP_UTF8,
0,
&str[0],
static_cast<int>(str.size()),
@@ -691,6 +690,59 @@ inline std::wstring ConvertToWideString(const std::string& str,
size_needed);
return wstrTo;
}
+
+std::string ConvertWideStringToUTF8(const std::wstring& wstr) {
+ if (wstr.empty()) return std::string();
+
+ int size_needed = WideCharToMultiByte(CP_UTF8,
+ 0,
+ &wstr[0],
+ static_cast<int>(wstr.size()),
+ nullptr,
+ 0,
+ nullptr,
+ nullptr);
+ std::string strTo(size_needed, 0);
+ WideCharToMultiByte(CP_UTF8,
+ 0,
+ &wstr[0],
+ static_cast<int>(wstr.size()),
+ &strTo[0],
+ size_needed,
+ nullptr,
+ nullptr);
+ return strTo;
+}
+
+template <typename T, size_t kStackStorageSize>
+std::filesystem::path MaybeStackBuffer<T, kStackStorageSize>::ToPath() const {
+ std::wstring wide_path = ConvertUTF8ToWideString(ToString());
+ return std::filesystem::path(wide_path);
+}
+
+std::string ConvertPathToUTF8(const std::filesystem::path& path) {
+ return ConvertWideStringToUTF8(path.wstring());
+}
+
+std::string ConvertGenericPathToUTF8(const std::filesystem::path& path) {
+ return ConvertWideStringToUTF8(path.generic_wstring());
+}
+
+#else // _WIN32
+
+template <typename T, size_t kStackStorageSize>
+std::filesystem::path MaybeStackBuffer<T, kStackStorageSize>::ToPath() const {
+ return std::filesystem::path(ToStringView());
+}
+
+std::string ConvertPathToUTF8(const std::filesystem::path& path) {
+ return path.native();
+}
+
+std::string ConvertGenericPathToUTF8(const std::filesystem::path& path) {
+ return path.generic_string();
+}
+
#endif // _WIN32
} // namespace node
diff --git a/src/util.h b/src/util.h
index 8460fe26bbf9e83d080fdfc458d570d0ee29e6f0..6137a32dc8f6208e05b8505e3603f3a747192cea 100644
--- a/src/util.h
+++ b/src/util.h
@@ -507,6 +507,8 @@ class MaybeStackBuffer {
inline std::basic_string_view<T> ToStringView() const {
return {out(), length()};
}
+ // This can only be used if the buffer contains path data in UTF8
+ inline std::filesystem::path ToPath() const;
private:
size_t length_;
@@ -1022,9 +1024,15 @@ class JSONOutputStream final : public v8::OutputStream {
// Returns true if OS==Windows and filename ends in .bat or .cmd,
// case insensitive.
inline bool IsWindowsBatchFile(const char* filename);
-inline std::wstring ConvertToWideString(const std::string& str, UINT code_page);
+inline std::wstring ConvertUTF8ToWideString(const std::string& str);
+inline std::string ConvertWideStringToUTF8(const std::wstring& wstr);
+
#endif // _WIN32
+inline std::filesystem::path ConvertUTF8ToPath(const std::string& str);
+inline std::string ConvertPathToUTF8(const std::filesystem::path& path);
+inline std::string ConvertGenericPathToUTF8(const std::filesystem::path& path);
+
} // namespace node
#endif // defined(NODE_WANT_INTERNALS) && NODE_WANT_INTERNALS



@@ -6,65 +6,8 @@ Subject: support V8 sandboxed pointers
This refactors several allocators to allocate within the V8 memory cage,
allowing them to be compatible with the V8_SANDBOXED_POINTERS feature.
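The pattern repeated throughout this patch is a switch between two ownership strategies: without the sandbox, a backing store adopts an external pointer zero-copy and frees it later through a deleter; with `V8_ENABLE_SANDBOX`, backing memory must live inside V8's memory cage, so the bytes are copied into allocator-owned memory. A hypothetical sketch in plain C++ (no real V8 types; `std::malloc` stands in for the V8 allocator, and the names are illustrative):

```cpp
#include <cassert>
#include <cstdlib>
#include <cstring>

// Minimal stand-in for a BackingStore: owns `data` and releases it
// through `deleter` on destruction. (Move-only in real code; kept as a
// bare aggregate here for brevity.)
struct Backing {
  void* data;
  size_t size;
  void (*deleter)(void*, size_t);
  ~Backing() {
    if (deleter) deleter(data, size);
  }
};

// Sandbox path: copy the source bytes into freshly allocated memory.
Backing CopyIntoCage(const void* src, size_t n) {
  void* dst = std::malloc(n);
  std::memcpy(dst, src, n);
  return {dst, n, [](void* p, size_t) { std::free(p); }};
}

// Non-sandbox path: take ownership of the existing buffer, zero-copy.
Backing AdoptExternal(void* src, size_t n, void (*del)(void*, size_t)) {
  return {src, n, del};
}
```

The copy costs a `memcpy` per buffer, which is why the non-sandbox build keeps the zero-copy path behind `#else`.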
diff --git a/src/api/environment.cc b/src/api/environment.cc
index fd71ceac65ccef1d2832b45b0b5612877cee22c1..cb37fa080fc8e8d524cfa2758c4a8c2c5652324d 100644
--- a/src/api/environment.cc
+++ b/src/api/environment.cc
@@ -106,6 +106,14 @@ MaybeLocal<Value> PrepareStackTraceCallback(Local<Context> context,
return result;
}
+NodeArrayBufferAllocator::NodeArrayBufferAllocator() {
+ zero_fill_field_ = static_cast<uint32_t*>(allocator_->Allocate(sizeof(*zero_fill_field_)));
+}
+
+NodeArrayBufferAllocator::~NodeArrayBufferAllocator() {
+ allocator_->Free(zero_fill_field_, sizeof(*zero_fill_field_));
+}
+
void* NodeArrayBufferAllocator::Allocate(size_t size) {
void* ret;
if (zero_fill_field_ || per_process::cli_options->zero_fill_all_buffers)
diff --git a/src/crypto/crypto_dh.cc b/src/crypto/crypto_dh.cc
index f23cedf4f2449d8edc9a8de1b70332e75d693cdd..976653dd1e9363e046788fc3419a9b649ceb2ea4 100644
--- a/src/crypto/crypto_dh.cc
+++ b/src/crypto/crypto_dh.cc
@@ -55,13 +55,32 @@ void DiffieHellman::MemoryInfo(MemoryTracker* tracker) const {
namespace {
MaybeLocal<Value> DataPointerToBuffer(Environment* env, DataPointer&& data) {
+#if defined(V8_ENABLE_SANDBOX)
+ std::unique_ptr<v8::BackingStore> backing;
+ if (data.size() > 0) {
+ std::unique_ptr<ArrayBuffer::Allocator> allocator(ArrayBuffer::Allocator::NewDefaultAllocator());
+ void* v8_data = allocator->Allocate(data.size());
+ CHECK(v8_data);
+ memcpy(v8_data, data.get(), data.size());
+ backing = ArrayBuffer::NewBackingStore(
+ v8_data,
+ data.size(),
+ [](void* data, size_t length, void*) {
+ std::unique_ptr<ArrayBuffer::Allocator> allocator(ArrayBuffer::Allocator::NewDefaultAllocator());
+ allocator->Free(data, length);
+ }, nullptr);
+ } else {
+ NoArrayBufferZeroFillScope no_zero_fill_scope(env->isolate_data());
+ backing = v8::ArrayBuffer::NewBackingStore(env->isolate(), data.size());
+ }
+#else
auto backing = ArrayBuffer::NewBackingStore(
data.get(),
data.size(),
[](void* data, size_t len, void* ptr) { DataPointer free_me(data, len); },
nullptr);
data.release();
-
+#endif
auto ab = ArrayBuffer::New(env->isolate(), std::move(backing));
return Buffer::New(env, ab, 0, ab->ByteLength()).FromMaybe(Local<Value>());
}
diff --git a/src/crypto/crypto_util.cc b/src/crypto/crypto_util.cc
index 6346f8f7199cf7b7d3736c59571606fff102fbb6..7eea2eaefcad5780663a6b87985925ae5d70a5f9 100644
index 4505786745c54a529f904d5e7813a86204e0a78b..eab18ab9888e2f7c0757fefab80505d8c99dc742 100644
--- a/src/crypto/crypto_util.cc
+++ b/src/crypto/crypto_util.cc
@@ -359,10 +359,35 @@ ByteSource& ByteSource::operator=(ByteSource&& other) noexcept {
@@ -143,10 +86,10 @@ index 6346f8f7199cf7b7d3736c59571606fff102fbb6..7eea2eaefcad5780663a6b87985925ae
void SecureHeapUsed(const FunctionCallbackInfo<Value>& args) {
#ifndef OPENSSL_IS_BORINGSSL
diff --git a/src/crypto/crypto_util.h b/src/crypto/crypto_util.h
index ebc7fddeccf04a92c610849b626b33f900d63493..ed7d202d1b041f8a6cd43ae767d696fb29ab9cd9 100644
index 1592134716da2de40de4ba028ee937b765423e37..8f3ba65f1fef2c066d6df6087a08ba71100d1090 100644
--- a/src/crypto/crypto_util.h
+++ b/src/crypto/crypto_util.h
@@ -243,7 +243,7 @@ class ByteSource {
@@ -242,7 +242,7 @@ class ByteSource {
// Creates a v8::BackingStore that takes over responsibility for
// any allocated data. The ByteSource will be reset with size = 0
// after being called.
@@ -188,8 +131,30 @@ index f616223cfb0f6e10f7cf57ada9704316bde2797e..eb6dad44a49d997097c8fb5009eeb60a
auto ab = ArrayBuffer::New(env->isolate(), std::move(backing));
Local<Value> ret;
if (!Buffer::New(env, ab, 0, ab->ByteLength()).ToLocal(&ret)) return {};
diff --git a/src/node_buffer.cc b/src/node_buffer.cc
index 357dc5f6d1c1c2d3756a94c1326b0502403e1eaf..b9f0c97938203b4652780a7d707c5e83319330b0 100644
--- a/src/node_buffer.cc
+++ b/src/node_buffer.cc
@@ -1412,7 +1412,7 @@ inline size_t CheckNumberToSize(Local<Value> number) {
CHECK(value >= 0 && value < maxSize);
size_t size = static_cast<size_t>(value);
#ifdef V8_ENABLE_SANDBOX
- CHECK_LE(size, kMaxSafeBufferSizeForSandbox);
+ CHECK_LE(size, v8::internal::kMaxSafeBufferSizeForSandbox);
#endif
return size;
}
@@ -1437,7 +1437,7 @@ void CreateUnsafeArrayBuffer(const FunctionCallbackInfo<Value>& args) {
buf = ArrayBuffer::New(isolate, size);
} else {
std::unique_ptr<BackingStore> store =
- ArrayBuffer::NewBackingStoreForNodeLTS(isolate, size);
+ ArrayBuffer::NewBackingStore(isolate, size);
if (!store) {
return env->ThrowRangeError("Array buffer allocation failed");
}
diff --git a/src/node_i18n.cc b/src/node_i18n.cc
index 61b6ecd240c9500f21f683065a2f920af3afb502..ad2b1c76325cb5c8f18a618c5a85ae87b6a7bbe7 100644
index 6be3920632b25db450025ebab6a2636e4811cdbe..b49916d2b5fc5e58cf3fb67329430fd3df8fb813 100644
--- a/src/node_i18n.cc
+++ b/src/node_i18n.cc
@@ -104,7 +104,7 @@ namespace {
@@ -228,30 +193,6 @@ index 61b6ecd240c9500f21f683065a2f920af3afb502..ad2b1c76325cb5c8f18a618c5a85ae87
}
constexpr const char* EncodingName(const enum encoding encoding) {
diff --git a/src/node_internals.h b/src/node_internals.h
index 12ea72b61b0a5e194207bb369dfed4b8667107cb..64442215714a98f648971e517ddd9c77e38fe3f2 100644
--- a/src/node_internals.h
+++ b/src/node_internals.h
@@ -121,7 +121,9 @@ v8::MaybeLocal<v8::Object> InitializePrivateSymbols(
class NodeArrayBufferAllocator : public ArrayBufferAllocator {
public:
- inline uint32_t* zero_fill_field() { return &zero_fill_field_; }
+ NodeArrayBufferAllocator();
+ ~NodeArrayBufferAllocator() override;
+ inline uint32_t* zero_fill_field() { return zero_fill_field_; }
void* Allocate(size_t size) override; // Defined in src/node.cc
void* AllocateUninitialized(size_t size) override;
@@ -139,7 +141,7 @@ class NodeArrayBufferAllocator : public ArrayBufferAllocator {
}
private:
- uint32_t zero_fill_field_ = 1; // Boolean but exposed as uint32 to JS land.
+ uint32_t* zero_fill_field_ = nullptr; // Boolean but exposed as uint32 to JS land.
std::atomic<size_t> total_mem_usage_ {0};
// Delegate to V8's allocator for compatibility with the V8 memory cage.
diff --git a/src/node_serdes.cc b/src/node_serdes.cc
index c55a2e28066147ae5ca5def10ec76ccc03c634b4..c54183c72944989219b6437c9e571a3f7f3f8dd5 100644
--- a/src/node_serdes.cc


@@ -0,0 +1,33 @@
From 0000000000000000000000000000000000000000 Mon Sep 17 00:00:00 2001
From: Shelley Vohr <shelley.vohr@gmail.com>
Date: Tue, 4 Nov 2025 21:20:26 +0100
Subject: test: correct conditional secure heap flags test
PR-URL: https://github.com/nodejs/node/pull/60385
Reviewed-By: Colin Ihrig <cjihrig@gmail.com>
Reviewed-By: Luigi Pinca <luigipinca@gmail.com>
(cherry picked from commit 53c4a39fec941e04150554fdd3e654b48f2e1b31)
diff --git a/test/parallel/test-process-env-allowed-flags-are-documented.js b/test/parallel/test-process-env-allowed-flags-are-documented.js
index afd43cfffe638f4f084f1c36949068e7239eadc3..c70e073bab888c349e8f5c691f5679a3796c896c 100644
--- a/test/parallel/test-process-env-allowed-flags-are-documented.js
+++ b/test/parallel/test-process-env-allowed-flags-are-documented.js
@@ -49,6 +49,8 @@ if (!hasOpenSSL3) {
documented.delete('--openssl-shared-config');
}
+const isV8Sandboxed = process.config.variables.v8_enable_sandbox;
+
// Filter out options that are conditionally present.
const conditionalOpts = [
{
@@ -74,6 +76,9 @@ const conditionalOpts = [
}, {
include: process.features.inspector,
filter: (opt) => opt.startsWith('--inspect') || opt === '--debug-port'
+ }, {
+ include: !isV8Sandboxed,
+ filter: (opt) => ['--secure-heap', '--secure-heap-min'].includes(opt)
},
];
documented.forEach((opt) => {


@@ -7,7 +7,7 @@ Instead of disabling the tests, flag them as flaky so they still run
but don't cause CI failures on flakes.
diff --git a/test/parallel/parallel.status b/test/parallel/parallel.status
index 67c0c04d2365e59db111258d008f8c73173e3e96..a4204e7580e7823399f6057d57c09cba56b5ff78 100644
index 9822dc622ebfc2d1a31539474b78e03e64426195..045a415f7b9f461bd9b2c97d42051f2580b7df1a 100644
--- a/test/parallel/parallel.status
+++ b/test/parallel/parallel.status
@@ -5,6 +5,16 @@ prefix parallel
@@ -28,7 +28,7 @@ index 67c0c04d2365e59db111258d008f8c73173e3e96..a4204e7580e7823399f6057d57c09cba
test-net-write-fully-async-hex-string: PASS, FLAKY
# https://github.com/nodejs/node/issues/52273
diff --git a/test/sequential/sequential.status b/test/sequential/sequential.status
index 4ae3b6c5fd2eb633ae78bed1824046d862d7579b..d291954d4451b63aeb2bf46232e8705150eb9e79 100644
index f6c9c77379930a8234cd4c5f933c261d6bd1a238..28d5378e4051b78e9a7ff81886c3382e92a5da2e 100644
--- a/test/sequential/sequential.status
+++ b/test/sequential/sequential.status
@@ -7,6 +7,18 @@ prefix sequential


@@ -1 +1,5 @@
chore_allow_customizing_microtask_policy_per_context.patch
turboshaft_avoid_introducing_too_many_variables.patch
cherry-pick-4cf9311810b0.patch
merged_maglev_fix_left_over_register_allocations_from_regalloc.patch
cherry-pick-62af07e96173.patch


@@ -0,0 +1,302 @@
From 0000000000000000000000000000000000000000 Mon Sep 17 00:00:00 2001
From: Leszek Swirski <leszeks@chromium.org>
Date: Wed, 12 Nov 2025 16:46:01 +0100
Subject: [compiler] Preserve field repr in property array extension
Walk the descriptor array in lockstep with the property array when
extending the latter.
Fixed: 460017370
Change-Id: If0b4fc3c5f62fc0cc373588cbddc3c0a95c7225c
Reviewed-on: https://chromium-review.googlesource.com/c/v8/v8/+/7146166
Commit-Queue: Leszek Swirski <leszeks@chromium.org>
Reviewed-by: Nico Hartmann <nicohartmann@chromium.org>
Reviewed-by: Igor Sheludko <ishell@chromium.org>
Cr-Commit-Position: refs/heads/main@{#103674}
diff --git a/src/compiler/access-builder.cc b/src/compiler/access-builder.cc
index 0df286daceb0002af9c8ce73633b4ef2ece5f2d5..bad4292c7ef198cacd3a9e80b563234e8a69fe36 100644
--- a/src/compiler/access-builder.cc
+++ b/src/compiler/access-builder.cc
@@ -4,6 +4,8 @@
#include "src/compiler/access-builder.h"
+#include "src/codegen/machine-type.h"
+#include "src/compiler/property-access-builder.h"
#include "src/compiler/type-cache.h"
#include "src/handles/handles-inl.h"
#include "src/objects/arguments.h"
@@ -1099,12 +1101,16 @@ FieldAccess AccessBuilder::ForFeedbackVectorSlot(int index) {
}
// static
-FieldAccess AccessBuilder::ForPropertyArraySlot(int index) {
+FieldAccess AccessBuilder::ForPropertyArraySlot(int index,
+ Representation representation) {
int offset = PropertyArray::OffsetOfElementAt(index);
- FieldAccess access = {kTaggedBase, offset,
- Handle<Name>(), OptionalMapRef(),
- Type::Any(), MachineType::AnyTagged(),
- kFullWriteBarrier, "PropertyArraySlot"};
+ MachineType machine_type =
+ representation.IsHeapObject() || representation.IsDouble()
+ ? MachineType::TaggedPointer()
+ : MachineType::AnyTagged();
+ FieldAccess access = {
+ kTaggedBase, offset, Handle<Name>(), OptionalMapRef(),
+ Type::Any(), machine_type, kFullWriteBarrier, "PropertyArraySlot"};
return access;
}
diff --git a/src/compiler/access-builder.h b/src/compiler/access-builder.h
index 2974012bc3d41ddad7bf8c89d8d91ac7cf2a9c49..2b648443fde2aa1a73b64ecb56d9c7fe8221ced8 100644
--- a/src/compiler/access-builder.h
+++ b/src/compiler/access-builder.h
@@ -11,6 +11,7 @@
#include "src/compiler/write-barrier-kind.h"
#include "src/objects/elements-kind.h"
#include "src/objects/js-objects.h"
+#include "src/objects/property-details.h"
namespace v8 {
namespace internal {
@@ -323,7 +324,8 @@ class V8_EXPORT_PRIVATE AccessBuilder final
static FieldAccess ForFeedbackVectorSlot(int index);
// Provides access to PropertyArray slots.
- static FieldAccess ForPropertyArraySlot(int index);
+ static FieldAccess ForPropertyArraySlot(int index,
+ Representation representation);
// Provides access to ScopeInfo flags.
static FieldAccess ForScopeInfoFlags();
diff --git a/src/compiler/js-native-context-specialization.cc b/src/compiler/js-native-context-specialization.cc
index c16f14ebacbd45311583f7e22195142ddb47ab73..8200bd59f0c4ad05bd3e3c739155807af85a8f93 100644
--- a/src/compiler/js-native-context-specialization.cc
+++ b/src/compiler/js-native-context-specialization.cc
@@ -38,6 +38,7 @@
#include "src/objects/elements-kind.h"
#include "src/objects/feedback-vector.h"
#include "src/objects/heap-number.h"
+#include "src/objects/property-details.h"
#include "src/objects/string.h"
namespace v8 {
@@ -4227,25 +4228,59 @@ Node* JSNativeContextSpecialization::BuildExtendPropertiesBackingStore(
// for intermediate states of chains of property additions. That makes
// it unclear what the best approach is here.
DCHECK_EQ(map.UnusedPropertyFields(), 0);
- int length = map.NextFreePropertyIndex() - map.GetInObjectProperties();
+ int in_object_length = map.GetInObjectProperties();
+ int length = map.NextFreePropertyIndex() - in_object_length;
// Under normal circumstances, NextFreePropertyIndex() will always be larger
// than GetInObjectProperties(). However, an attacker able to corrupt heap
// memory can break this invariant, in which case we'll get confused here,
// potentially causing a sandbox violation. This CHECK defends against that.
SBXCHECK_GE(length, 0);
int new_length = length + JSObject::kFieldsAdded;
+
+ // Find the descriptor index corresponding to the first out-of-object
+ // property.
+ DescriptorArrayRef descs = map.instance_descriptors(broker());
+ InternalIndex first_out_of_object_descriptor(in_object_length);
+ InternalIndex number_of_descriptors(descs.object()->number_of_descriptors());
+ for (InternalIndex i(in_object_length); i < number_of_descriptors; ++i) {
+ PropertyDetails details = descs.GetPropertyDetails(i);
+ // Skip over non-field properties.
+ if (details.location() != PropertyLocation::kField) {
+ continue;
+ }
+ // Skip over in-object fields.
+ // TODO(leszeks): We could make this smarter, like a binary search.
+ if (details.field_index() < in_object_length) {
+ continue;
+ }
+ first_out_of_object_descriptor = i;
+ break;
+ }
+
// Collect the field values from the {properties}.
- ZoneVector<Node*> values(zone());
+ ZoneVector<std::pair<Node*, Representation>> values(zone());
values.reserve(new_length);
- for (int i = 0; i < length; ++i) {
+
+ // Walk the property descriptors alongside the property values, to make
+ // sure to get and store them with the right machine type.
+ InternalIndex descriptor = first_out_of_object_descriptor;
+ for (int i = 0; i < length; ++i, ++descriptor) {
+ PropertyDetails details = descs.GetPropertyDetails(descriptor);
+ while (details.location() != PropertyLocation::kField) {
+ ++descriptor;
+ details = descs.GetPropertyDetails(descriptor);
+ }
+ DCHECK_EQ(i, details.field_index() - in_object_length);
Node* value = effect = graph()->NewNode(
- simplified()->LoadField(AccessBuilder::ForFixedArraySlot(i)),
+ simplified()->LoadField(
+ AccessBuilder::ForPropertyArraySlot(i, details.representation())),
properties, effect, control);
- values.push_back(value);
+ values.push_back({value, details.representation()});
}
// Initialize the new fields to undefined.
for (int i = 0; i < JSObject::kFieldsAdded; ++i) {
- values.push_back(jsgraph()->UndefinedConstant());
+ values.push_back(
+ {jsgraph()->UndefinedConstant(), Representation::Tagged()});
}
// Compute new length and hash.
@@ -4283,7 +4318,8 @@ Node* JSNativeContextSpecialization::BuildExtendPropertiesBackingStore(
a.Store(AccessBuilder::ForMap(), jsgraph()->PropertyArrayMapConstant());
a.Store(AccessBuilder::ForPropertyArrayLengthAndHash(), new_length_and_hash);
for (int i = 0; i < new_length; ++i) {
- a.Store(AccessBuilder::ForFixedArraySlot(i), values[i]);
+ a.Store(AccessBuilder::ForPropertyArraySlot(i, values[i].second),
+ values[i].first);
}
return a.Finish();
}
diff --git a/src/compiler/turboshaft/turbolev-early-lowering-reducer-inl.h b/src/compiler/turboshaft/turbolev-early-lowering-reducer-inl.h
index bf0832ecc67bec1a6da711b9bff7266ba3df7d51..a6452e7cfb9e7d7d6d222bd625d13dabe4f97cb7 100644
--- a/src/compiler/turboshaft/turbolev-early-lowering-reducer-inl.h
+++ b/src/compiler/turboshaft/turbolev-early-lowering-reducer-inl.h
@@ -14,6 +14,7 @@
#include "src/compiler/turboshaft/representations.h"
#include "src/deoptimizer/deoptimize-reason.h"
#include "src/objects/contexts.h"
+#include "src/objects/descriptor-array-inl.h"
#include "src/objects/instance-type-inl.h"
namespace v8::internal::compiler::turboshaft {
@@ -323,8 +324,32 @@ class TurbolevEarlyLoweringReducer : public Next {
}
V<PropertyArray> ExtendPropertiesBackingStore(
- V<PropertyArray> old_property_array, V<JSObject> object, int old_length,
+ V<PropertyArray> old_property_array, V<JSObject> object,
+ const compiler::MapRef& old_map, int old_length,
V<FrameState> frame_state, const FeedbackSource& feedback) {
+ int in_object_length = old_map.GetInObjectProperties();
+
+ // Find the descriptor index corresponding to the first out-of-object
+ // property.
+ DescriptorArrayRef descs = old_map.instance_descriptors(broker_);
+ InternalIndex first_out_of_object_descriptor(in_object_length);
+ InternalIndex number_of_descriptors(
+ descs.object()->number_of_descriptors());
+ for (InternalIndex i(in_object_length); i < number_of_descriptors; ++i) {
+ PropertyDetails details = descs.GetPropertyDetails(i);
+ // Skip over non-field properties.
+ if (details.location() != PropertyLocation::kField) {
+ continue;
+ }
+ // Skip over in-object fields.
+ // TODO(leszeks): We could make this smarter, like a binary search.
+ if (details.field_index() < in_object_length) {
+ continue;
+ }
+ first_out_of_object_descriptor = i;
+ break;
+ }
+
// Allocate new PropertyArray.
int new_length = old_length + JSObject::kFieldsAdded;
Uninitialized<PropertyArray> new_property_array =
@@ -335,18 +360,28 @@ class TurbolevEarlyLoweringReducer : public Next {
__ HeapConstant(factory_->property_array_map()));
// Copy existing properties over.
- for (int i = 0; i < old_length; i++) {
+ InternalIndex descriptor = first_out_of_object_descriptor;
+ for (int i = 0; i < old_length; ++i, ++descriptor) {
+ PropertyDetails details = descs.GetPropertyDetails(descriptor);
+ while (details.location() != PropertyLocation::kField) {
+ ++descriptor;
+ details = descs.GetPropertyDetails(descriptor);
+ }
+ DCHECK_EQ(i, details.field_index() - in_object_length);
+ Representation r = details.representation();
+
V<Object> old_value = __ template LoadField<Object>(
- old_property_array, AccessBuilder::ForPropertyArraySlot(i));
+ old_property_array, AccessBuilder::ForPropertyArraySlot(i, r));
__ InitializeField(new_property_array,
- AccessBuilder::ForPropertyArraySlot(i), old_value);
+ AccessBuilder::ForPropertyArraySlot(i, r), old_value);
}
// Initialize new properties to undefined.
V<Undefined> undefined = __ HeapConstant(factory_->undefined_value());
for (int i = 0; i < JSObject::kFieldsAdded; ++i) {
__ InitializeField(new_property_array,
- AccessBuilder::ForPropertyArraySlot(old_length + i),
+ AccessBuilder::ForPropertyArraySlot(
+ old_length + i, Representation::Tagged()),
undefined);
}
diff --git a/src/compiler/turboshaft/turbolev-graph-builder.cc b/src/compiler/turboshaft/turbolev-graph-builder.cc
index e9db0d8260c784a1da406b7b70d2e623135b572d..8758e09fd7bcdb9221904147b5f7e9ac366e49b6 100644
--- a/src/compiler/turboshaft/turbolev-graph-builder.cc
+++ b/src/compiler/turboshaft/turbolev-graph-builder.cc
@@ -2666,10 +2666,11 @@ class GraphBuildingNodeProcessor {
maglev::ProcessResult Process(maglev::ExtendPropertiesBackingStore* node,
const maglev::ProcessingState& state) {
GET_FRAME_STATE_MAYBE_ABORT(frame_state, node->eager_deopt_info());
- SetMap(node, __ ExtendPropertiesBackingStore(
- Map(node->property_array_input()),
- Map(node->object_input()), node->old_length(), frame_state,
- node->eager_deopt_info()->feedback_to_update()));
+ SetMap(node,
+ __ ExtendPropertiesBackingStore(
+ Map(node->property_array_input()), Map(node->object_input()),
+ node->old_map(), node->old_length(), frame_state,
+ node->eager_deopt_info()->feedback_to_update()));
return maglev::ProcessResult::kContinue;
}
diff --git a/src/maglev/maglev-graph-builder.cc b/src/maglev/maglev-graph-builder.cc
index 784f6b4eaad502b514b4a9e36560138a7fd50df2..733fad289b326f6f52369431aa00136715ee0283 100644
--- a/src/maglev/maglev-graph-builder.cc
+++ b/src/maglev/maglev-graph-builder.cc
@@ -5282,7 +5282,7 @@ ReduceResult MaglevGraphBuilder::BuildExtendPropertiesBackingStore(
// potentially causing a sandbox violation. This CHECK defends against that.
SBXCHECK_GE(length, 0);
return AddNewNode<ExtendPropertiesBackingStore>({property_array, receiver},
- length);
+ map, length);
}
MaybeReduceResult MaglevGraphBuilder::TryBuildStoreField(
diff --git a/src/maglev/maglev-ir.h b/src/maglev/maglev-ir.h
index a994392e3f4697ec8c6ffc3f9a64d6093a712f20..7f20ae81b5b12b2b747f0b46caebd9258d399ab1 100644
--- a/src/maglev/maglev-ir.h
+++ b/src/maglev/maglev-ir.h
@@ -8531,8 +8531,10 @@ class ExtendPropertiesBackingStore
using Base = FixedInputValueNodeT<2, ExtendPropertiesBackingStore>;
public:
- explicit ExtendPropertiesBackingStore(uint64_t bitfield, int old_length)
- : Base(bitfield), old_length_(old_length) {}
+ explicit ExtendPropertiesBackingStore(uint64_t bitfield,
+ const compiler::MapRef& old_map,
+ int old_length)
+ : Base(bitfield), old_map_(old_map), old_length_(old_length) {}
static constexpr OpProperties kProperties =
OpProperties::CanAllocate() | OpProperties::CanRead() |
@@ -8552,9 +8554,11 @@ class ExtendPropertiesBackingStore
void GenerateCode(MaglevAssembler*, const ProcessingState&);
void PrintParams(std::ostream&) const;
+ const compiler::MapRef& old_map() const { return old_map_; }
int old_length() const { return old_length_; }
private:
+ const compiler::MapRef old_map_;
const int old_length_;
};
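The descriptor scan added at the top of this patch locates the first descriptor whose field lives out-of-object, skipping non-field properties (accessors) and in-object fields. A simplified standalone sketch of that scan — the `Descriptor` model here is hypothetical, not V8's actual `DescriptorArray`:

```cpp
#include <cassert>
#include <cstddef>
#include <optional>
#include <vector>

// Simplified model of a property descriptor: entries can describe fields
// (stored in-object or in the PropertyArray) or non-field properties such
// as accessors, which the scan must skip over.
struct Descriptor {
  bool is_field;    // PropertyLocation::kField in V8 terms
  int field_index;  // slot index across in-object + out-of-object storage
};

// Mirrors the loop in the patch: starting at in_object_count (a field can
// appear no earlier than its own index), find the first descriptor whose
// field lives in the out-of-object backing store.
std::optional<size_t> FirstOutOfObjectDescriptor(
    const std::vector<Descriptor>& descs, int in_object_count) {
  for (size_t i = in_object_count; i < descs.size(); ++i) {
    if (!descs[i].is_field) continue;                      // skip accessors etc.
    if (descs[i].field_index < in_object_count) continue;  // skip in-object fields
    return i;
  }
  return std::nullopt;
}
```

The copy loop later in the hunk then walks descriptors and backing-store slots in lockstep from this starting point, which is why the patch threads the old map (and with it the descriptor array) into `ExtendPropertiesBackingStore`.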


@@ -0,0 +1,405 @@
From 0000000000000000000000000000000000000000 Mon Sep 17 00:00:00 2001
From: Igor Sheludko <ishell@chromium.org>
Date: Mon, 13 Oct 2025 14:22:20 +0200
Subject: [ic] Cleanup AccessorAssembler::CallGetterIfAccessor()
This CL
- reorders parameters to make |expected_receiver_mode| a mandatory
one and properly computed,
- makes sure we don't pass PropertyCell as a holder when JSReceiver is
expected.
Bug: 450328966
Change-Id: I921dfbd99245d01143600b4f4713fe602c817657
Reviewed-on: https://chromium-review.googlesource.com/c/v8/v8/+/7036691
Commit-Queue: Igor Sheludko <ishell@chromium.org>
Reviewed-by: Leszek Swirski <leszeks@chromium.org>
Cr-Commit-Position: refs/heads/main@{#103085}
diff --git a/src/codegen/code-stub-assembler.cc b/src/codegen/code-stub-assembler.cc
index f56eba583a0a2117fc895f12b7a0712f59e3a1b6..e27d74877d9ef9b9bfa958b9b49773fd7b3c3d08 100644
--- a/src/codegen/code-stub-assembler.cc
+++ b/src/codegen/code-stub-assembler.cc
@@ -11549,7 +11549,8 @@ void CodeStubAssembler::ForEachEnumerableOwnProperty(
var_value = CallGetterIfAccessor(
value_or_accessor, object, var_details.value(), context,
- object, next_key, &slow_load, kCallJSGetterUseCachedName);
+ object, kExpectingJSReceiver, next_key, &slow_load,
+ kCallJSGetterUseCachedName);
Goto(&value_ready);
BIND(&slow_load);
@@ -12045,15 +12046,11 @@ template void CodeStubAssembler::LoadPropertyFromDictionary(
TNode<SwissNameDictionary> dictionary, TNode<IntPtrT> name_index,
TVariable<Uint32T>* var_details, TVariable<Object>* var_value);
-// |value| is the property backing store's contents, which is either a value or
-// an accessor pair, as specified by |details|. |holder| is a JSReceiver or a
-// PropertyCell. Returns either the original value, or the result of the getter
-// call.
TNode<Object> CodeStubAssembler::CallGetterIfAccessor(
- TNode<Object> value, TNode<Union<JSReceiver, PropertyCell>> holder,
+ TNode<Object> value, std::optional<TNode<JSReceiver>> holder,
TNode<Uint32T> details, TNode<Context> context, TNode<JSAny> receiver,
- TNode<Object> name, Label* if_bailout, GetOwnPropertyMode mode,
- ExpectedReceiverMode expected_receiver_mode) {
+ ExpectedReceiverMode expected_receiver_mode, TNode<Object> name,
+ Label* if_bailout, GetOwnPropertyMode mode) {
TVARIABLE(Object, var_value, value);
Label done(this), if_accessor_info(this, Label::kDeferred);
@@ -12094,44 +12091,51 @@ TNode<Object> CodeStubAssembler::CallGetterIfAccessor(
BIND(&if_function_template_info);
{
- Label use_cached_property(this);
- TNode<HeapObject> cached_property_name = LoadObjectField<HeapObject>(
- getter, FunctionTemplateInfo::kCachedPropertyNameOffset);
-
- Label* has_cached_property = mode == kCallJSGetterUseCachedName
- ? &use_cached_property
- : if_bailout;
- GotoIfNot(IsTheHole(cached_property_name), has_cached_property);
-
- TNode<JSReceiver> js_receiver;
- switch (expected_receiver_mode) {
- case kExpectingJSReceiver:
- js_receiver = CAST(receiver);
- break;
- case kExpectingAnyReceiver:
- // TODO(ishell): in case the function template info has a signature
- // and receiver is not a JSReceiver the signature check in
- // CallFunctionTemplate builtin will fail anyway, so we can short
- // cut it here and throw kIllegalInvocation immediately.
- js_receiver = ToObject_Inline(context, receiver);
- break;
- }
- TNode<JSReceiver> holder_receiver = CAST(holder);
- TNode<NativeContext> creation_context =
- GetCreationContext(holder_receiver, if_bailout);
- TNode<Context> caller_context = context;
- var_value = CallBuiltin(
- Builtin::kCallFunctionTemplate_Generic, creation_context, getter,
- Int32Constant(i::JSParameterCount(0)), caller_context, js_receiver);
- Goto(&done);
+ if (holder.has_value()) {
+ Label use_cached_property(this);
+ TNode<HeapObject> cached_property_name = LoadObjectField<HeapObject>(
+ getter, FunctionTemplateInfo::kCachedPropertyNameOffset);
+
+ Label* has_cached_property = mode == kCallJSGetterUseCachedName
+ ? &use_cached_property
+ : if_bailout;
+ GotoIfNot(IsTheHole(cached_property_name), has_cached_property);
+
+ TNode<JSReceiver> js_receiver;
+ switch (expected_receiver_mode) {
+ case kExpectingJSReceiver:
+ js_receiver = CAST(receiver);
+ break;
+ case kExpectingAnyReceiver:
+ // TODO(ishell): in case the function template info has a
+ // signature and receiver is not a JSReceiver the signature check
+ // in CallFunctionTemplate builtin will fail anyway, so we can
+ // short cut it here and throw kIllegalInvocation immediately.
+ js_receiver = ToObject_Inline(context, receiver);
+ break;
+ }
+ TNode<JSReceiver> holder_receiver = *holder;
+ TNode<NativeContext> creation_context =
+ GetCreationContext(holder_receiver, if_bailout);
+ TNode<Context> caller_context = context;
+ var_value = CallBuiltin(Builtin::kCallFunctionTemplate_Generic,
+ creation_context, getter,
+ Int32Constant(i::JSParameterCount(0)),
+ caller_context, js_receiver);
+ Goto(&done);
- if (mode == kCallJSGetterUseCachedName) {
- Bind(&use_cached_property);
+ if (mode == kCallJSGetterUseCachedName) {
+ Bind(&use_cached_property);
- var_value =
- GetProperty(context, holder_receiver, cached_property_name);
+ var_value =
+ GetProperty(context, holder_receiver, cached_property_name);
- Goto(&done);
+ Goto(&done);
+ }
+ } else {
+ // |holder| must be available in order to handle lazy AccessorPair
+ // case (we need it for computing the function's context).
+ Unreachable();
}
}
} else {
@@ -12143,56 +12147,61 @@ TNode<Object> CodeStubAssembler::CallGetterIfAccessor(
// AccessorInfo case.
BIND(&if_accessor_info);
{
- TNode<AccessorInfo> accessor_info = CAST(value);
- Label if_array(this), if_function(this), if_wrapper(this);
-
- // Dispatch based on {holder} instance type.
- TNode<Map> holder_map = LoadMap(holder);
- TNode<Uint16T> holder_instance_type = LoadMapInstanceType(holder_map);
- GotoIf(IsJSArrayInstanceType(holder_instance_type), &if_array);
- GotoIf(IsJSFunctionInstanceType(holder_instance_type), &if_function);
- Branch(IsJSPrimitiveWrapperInstanceType(holder_instance_type), &if_wrapper,
- if_bailout);
-
- // JSArray AccessorInfo case.
- BIND(&if_array);
- {
- // We only deal with the "length" accessor on JSArray.
- GotoIfNot(IsLengthString(
- LoadObjectField(accessor_info, AccessorInfo::kNameOffset)),
- if_bailout);
- TNode<JSArray> array = CAST(holder);
- var_value = LoadJSArrayLength(array);
- Goto(&done);
- }
-
- // JSFunction AccessorInfo case.
- BIND(&if_function);
- {
- // We only deal with the "prototype" accessor on JSFunction here.
- GotoIfNot(IsPrototypeString(
- LoadObjectField(accessor_info, AccessorInfo::kNameOffset)),
- if_bailout);
+ if (holder.has_value()) {
+ TNode<AccessorInfo> accessor_info = CAST(value);
+ Label if_array(this), if_function(this), if_wrapper(this);
+ // Dispatch based on {holder} instance type.
+ TNode<Map> holder_map = LoadMap(*holder);
+ TNode<Uint16T> holder_instance_type = LoadMapInstanceType(holder_map);
+ GotoIf(IsJSArrayInstanceType(holder_instance_type), &if_array);
+ GotoIf(IsJSFunctionInstanceType(holder_instance_type), &if_function);
+ Branch(IsJSPrimitiveWrapperInstanceType(holder_instance_type),
+ &if_wrapper, if_bailout);
+
+ // JSArray AccessorInfo case.
+ BIND(&if_array);
+ {
+ // We only deal with the "length" accessor on JSArray.
+ GotoIfNot(IsLengthString(LoadObjectField(accessor_info,
+ AccessorInfo::kNameOffset)),
+ if_bailout);
+ TNode<JSArray> array = CAST(*holder);
+ var_value = LoadJSArrayLength(array);
+ Goto(&done);
+ }
- TNode<JSFunction> function = CAST(holder);
- GotoIfPrototypeRequiresRuntimeLookup(function, holder_map, if_bailout);
- var_value = LoadJSFunctionPrototype(function, if_bailout);
- Goto(&done);
- }
+ // JSFunction AccessorInfo case.
+ BIND(&if_function);
+ {
+ // We only deal with the "prototype" accessor on JSFunction here.
+ GotoIfNot(IsPrototypeString(LoadObjectField(accessor_info,
+ AccessorInfo::kNameOffset)),
+ if_bailout);
+
+ TNode<JSFunction> function = CAST(*holder);
+ GotoIfPrototypeRequiresRuntimeLookup(function, holder_map, if_bailout);
+ var_value = LoadJSFunctionPrototype(function, if_bailout);
+ Goto(&done);
+ }
- // JSPrimitiveWrapper AccessorInfo case.
- BIND(&if_wrapper);
- {
- // We only deal with the "length" accessor on JSPrimitiveWrapper string
- // wrappers.
- GotoIfNot(IsLengthString(
- LoadObjectField(accessor_info, AccessorInfo::kNameOffset)),
- if_bailout);
- TNode<Object> holder_value = LoadJSPrimitiveWrapperValue(CAST(holder));
- GotoIfNot(TaggedIsNotSmi(holder_value), if_bailout);
- GotoIfNot(IsString(CAST(holder_value)), if_bailout);
- var_value = LoadStringLengthAsSmi(CAST(holder_value));
- Goto(&done);
+ // JSPrimitiveWrapper AccessorInfo case.
+ BIND(&if_wrapper);
+ {
+ // We only deal with the "length" accessor on JSPrimitiveWrapper string
+ // wrappers.
+ GotoIfNot(IsLengthString(LoadObjectField(accessor_info,
+ AccessorInfo::kNameOffset)),
+ if_bailout);
+ TNode<Object> holder_value = LoadJSPrimitiveWrapperValue(CAST(*holder));
+ GotoIfNot(TaggedIsNotSmi(holder_value), if_bailout);
+ GotoIfNot(IsString(CAST(holder_value)), if_bailout);
+ var_value = LoadStringLengthAsSmi(CAST(holder_value));
+ Goto(&done);
+ }
+ } else {
+ // |holder| must be available in order to handle AccessorInfo case (we
+ // need to pass it to the callback).
+ Unreachable();
}
}
@@ -12277,7 +12286,7 @@ void CodeStubAssembler::TryGetOwnProperty(
}
TNode<Object> value = CallGetterIfAccessor(
var_value->value(), object, var_details->value(), context, receiver,
- unique_name, if_bailout, mode, expected_receiver_mode);
+ expected_receiver_mode, unique_name, if_bailout, mode);
*var_value = value;
Goto(if_found_value);
}
diff --git a/src/codegen/code-stub-assembler.h b/src/codegen/code-stub-assembler.h
index 506442eddf629a893a6661ff87bdf85c21d93445..f143592226c4009167e38e1b883e3448eb588776 100644
--- a/src/codegen/code-stub-assembler.h
+++ b/src/codegen/code-stub-assembler.h
@@ -4582,12 +4582,16 @@ class V8_EXPORT_PRIVATE CodeStubAssembler
const ForEachKeyValueFunction& body,
Label* bailout);
+ // |value| is the property backing store's contents, which is either a value
+ // or an accessor pair, as specified by |details|. |holder| is a JSReceiver
+ // or empty std::nullopt if holder is not available.
+ // Returns either the original value, or the result of the getter call.
TNode<Object> CallGetterIfAccessor(
- TNode<Object> value, TNode<Union<JSReceiver, PropertyCell>> holder,
+ TNode<Object> value, std::optional<TNode<JSReceiver>> holder,
TNode<Uint32T> details, TNode<Context> context, TNode<JSAny> receiver,
- TNode<Object> name, Label* if_bailout,
- GetOwnPropertyMode mode = kCallJSGetterDontUseCachedName,
- ExpectedReceiverMode expected_receiver_mode = kExpectingJSReceiver);
+ ExpectedReceiverMode expected_receiver_mode, TNode<Object> name,
+ Label* if_bailout,
+ GetOwnPropertyMode mode = kCallJSGetterDontUseCachedName);
TNode<IntPtrT> TryToIntptr(TNode<Object> key, Label* if_not_intptr,
TVariable<Int32T>* var_instance_type = nullptr);
diff --git a/src/ic/accessor-assembler.cc b/src/ic/accessor-assembler.cc
index c1cd3790f12418ecf7d3f027475abde3b8e2dedd..457fbc25d91e740190662a564ba315e390830b7f 100644
--- a/src/ic/accessor-assembler.cc
+++ b/src/ic/accessor-assembler.cc
@@ -845,9 +845,13 @@ void AccessorAssembler::HandleLoadICSmiHandlerLoadNamedCase(
TVARIABLE(Object, var_value);
LoadPropertyFromDictionary<PropertyDictionary>(
properties, var_name_index.value(), &var_details, &var_value);
+
+ ExpectedReceiverMode expected_receiver_mode =
+ p->IsLoadSuperIC() ? kExpectingAnyReceiver : kExpectingJSReceiver;
+
TNode<Object> value = CallGetterIfAccessor(
var_value.value(), CAST(holder), var_details.value(), p->context(),
- p->receiver(), p->name(), miss);
+ p->receiver(), expected_receiver_mode, p->name(), miss);
exit_point->Return(value);
}
}
@@ -925,17 +929,18 @@ void AccessorAssembler::HandleLoadICSmiHandlerLoadNamedCase(
BIND(&global);
{
- CSA_DCHECK(this, IsPropertyCell(CAST(holder)));
// Ensure the property cell doesn't contain the hole.
- TNode<Object> value =
- LoadObjectField(CAST(holder), PropertyCell::kValueOffset);
+ TNode<Object> value = LoadPropertyCellValue(CAST(holder));
+ GotoIf(IsPropertyCellHole(value), miss);
TNode<Uint32T> details = Unsigned(LoadAndUntagToWord32ObjectField(
CAST(holder), PropertyCell::kPropertyDetailsRawOffset));
- GotoIf(IsPropertyCellHole(value), miss);
- exit_point->Return(CallGetterIfAccessor(value, CAST(holder), details,
- p->context(), p->receiver(),
- p->name(), miss));
+ ExpectedReceiverMode expected_receiver_mode =
+ p->IsLoadSuperIC() ? kExpectingAnyReceiver : kExpectingJSReceiver;
+
+ exit_point->Return(CallGetterIfAccessor(
+ value, std::nullopt, details, p->context(), p->receiver(),
+ expected_receiver_mode, p->name(), miss));
}
BIND(&interceptor);
@@ -1221,9 +1226,14 @@ void AccessorAssembler::HandleLoadICProtoHandler(
TVARIABLE(Object, var_value);
LoadPropertyFromDictionary<PropertyDictionary>(
properties, name_index, &var_details, &var_value);
+
+ ExpectedReceiverMode expected_receiver_mode =
+ p->IsLoadSuperIC() ? kExpectingAnyReceiver : kExpectingJSReceiver;
+
TNode<Object> value = CallGetterIfAccessor(
var_value.value(), CAST(var_holder->value()), var_details.value(),
- p->context(), p->receiver(), p->name(), miss);
+ p->context(), p->receiver(), expected_receiver_mode, p->name(),
+ miss);
exit_point->Return(value);
}
},
@@ -2960,9 +2970,12 @@ void AccessorAssembler::GenericPropertyLoad(
BIND(&if_found_on_lookup_start_object);
{
+ ExpectedReceiverMode expected_receiver_mode =
+ p->IsLoadSuperIC() ? kExpectingAnyReceiver : kExpectingJSReceiver;
+
TNode<Object> value = CallGetterIfAccessor(
var_value.value(), CAST(lookup_start_object), var_details.value(),
- p->context(), p->receiver(), p->name(), slow);
+ p->context(), p->receiver(), expected_receiver_mode, p->name(), slow);
Return(value);
}
diff --git a/src/ic/accessor-assembler.h b/src/ic/accessor-assembler.h
index 29cbf283ff143d1c62ba13e93e3b4fbef0931226..30bb186feb0cec20228b27aadd1c26949188e991 100644
--- a/src/ic/accessor-assembler.h
+++ b/src/ic/accessor-assembler.h
@@ -138,9 +138,7 @@ class V8_EXPORT_PRIVATE AccessorAssembler : public CodeStubAssembler {
TNode<Object> name() const { return name_; }
TNode<TaggedIndex> slot() const { return slot_; }
TNode<HeapObject> vector() const { return vector_; }
- TNode<JSAny> lookup_start_object() const {
- return lookup_start_object_.value();
- }
+ TNode<JSAny> lookup_start_object() const { return lookup_start_object_; }
TNode<Smi> enum_index() const { return *enum_index_; }
TNode<Object> cache_type() const { return *cache_type_; }
@@ -152,6 +150,11 @@ class V8_EXPORT_PRIVATE AccessorAssembler : public CodeStubAssembler {
return receiver_;
}
+ // This is useful for figuring out whether we know anything about receiver
+ // type. If |receiver| and |lookup_start_object| are different TNodes
+ // then this ICParameters object belongs to LoadSuperIC.
+ bool IsLoadSuperIC() const { return lookup_start_object_ != receiver_; }
+
bool IsEnumeratedKeyedLoad() const { return enum_index_ != std::nullopt; }
private:
@@ -160,7 +163,7 @@ class V8_EXPORT_PRIVATE AccessorAssembler : public CodeStubAssembler {
TNode<Object> name_;
TNode<TaggedIndex> slot_;
TNode<HeapObject> vector_;
- std::optional<TNode<JSAny>> lookup_start_object_;
+ TNode<JSAny> lookup_start_object_;
std::optional<TNode<Smi>> enum_index_;
std::optional<TNode<Object>> cache_type_;
};
@@ -202,6 +205,11 @@ class V8_EXPORT_PRIVATE AccessorAssembler : public CodeStubAssembler {
return receiver_;
}
+ // This is useful for figuring out whether we know anything about receiver
+ // type. If |receiver| and |lookup_start_object| are different TNodes
+ // then this ICParameters object belongs to LoadSuperIC.
+ bool IsLoadSuperIC() const { return lookup_start_object_ != receiver_; }
+
private:
LazyNode<Context> context_;
TNode<JSAny> receiver_;
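The core refactor in this patch replaces the `TNode<Union<JSReceiver, PropertyCell>>` holder parameter with `std::optional<TNode<JSReceiver>>`: the global/PropertyCell load path now passes `std::nullopt`, and branches that genuinely need a receiver check for a value instead of `CAST`ing a union blindly. A minimal sketch of the pattern, with a hypothetical type standing in for the assembler's `TNode`:

```cpp
#include <cassert>
#include <optional>
#include <string>

// Hypothetical stand-in for TNode<JSReceiver>; the name is illustrative.
struct JSReceiver {
  std::string name;
};

// Before the cleanup, the holder parameter was a two-type union and each
// branch CAST it to the type it expected. With std::optional, callers that
// only have a PropertyCell pass std::nullopt, and the accessor branches
// guard on has_value() (the real patch marks the nullopt path Unreachable()
// because those branches require a holder).
std::string DescribeHolder(std::optional<JSReceiver> holder) {
  if (holder.has_value()) {
    return "holder: " + holder->name;  // AccessorInfo/template paths use *holder
  }
  return "no holder (global/PropertyCell load)";
}
```

This is also why the PropertyCell branch in `accessor-assembler.cc` switches from `CAST(holder)` to `LoadPropertyCellValue` plus a `std::nullopt` argument: the cell is no longer smuggled through the holder parameter.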


@@ -0,0 +1,109 @@
From 0000000000000000000000000000000000000000 Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?Olivier=20Fl=C3=BCckiger?= <olivf@chromium.org>
Date: Wed, 5 Nov 2025 14:11:51 +0100
Subject: Merged: [maglev] Fix left over register allocations from regalloc
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
The regalloc should clear the node allocations when it is done.
Failing to do so can cause the codegen to use stale register state.
In this concrete example the exception handler trampolines would not
load from the spill slot due to the left over allocation.
Bug: 457351015
(cherry picked from commit 7ef5ae531a9e79a084b5f0bebd5496d5d481e0ea)
Change-Id: Ibf50e9c77f68654abf1b610bc1f37ccd17904c84
Reviewed-on: https://chromium-review.googlesource.com/c/v8/v8/+/7137280
Reviewed-by: Victor Gomes <victorgomes@chromium.org>
Commit-Queue: Victor Gomes <victorgomes@chromium.org>
Auto-Submit: Olivier Flückiger <olivf@chromium.org>
Cr-Commit-Position: refs/branch-heads/14.2@{#35}
Cr-Branched-From: 37f82dbb9f640dc5eea09870dd391cd3712546e5-refs/heads/14.2.231@{#1}
Cr-Branched-From: d1a6089b861336cf4b3887edfd3fdd280b23b5dd-refs/heads/main@{#102804}
diff --git a/src/maglev/maglev-code-generator.cc b/src/maglev/maglev-code-generator.cc
index 4207c84b70267b98a3fa5069cb1bbcee6df78d6a..d096c6d4d5b6ff175a960b1a0dcaf0ff68d21bdd 100644
--- a/src/maglev/maglev-code-generator.cc
+++ b/src/maglev/maglev-code-generator.cc
@@ -801,6 +801,12 @@ class MaglevCodeGeneratingNodeProcessor {
template <typename NodeT>
ProcessResult Process(NodeT* node, const ProcessingState& state) {
+#ifdef DEBUG
+ if constexpr (std::is_base_of_v<ValueNode, NodeT>) {
+ // Regalloc must clear its temp allocations.
+ DCHECK(!node->regalloc_info()->has_register());
+ }
+#endif
if (v8_flags.code_comments) {
std::stringstream ss;
ss << "-- " << graph_labeller()->NodeId(node) << ": "
diff --git a/src/maglev/maglev-regalloc.cc b/src/maglev/maglev-regalloc.cc
index efba3bc4758a18ed03acc97732c971a2fb59cac2..1906a263bd01b01c616bf3fc75a805f24a747852 100644
--- a/src/maglev/maglev-regalloc.cc
+++ b/src/maglev/maglev-regalloc.cc
@@ -631,6 +631,9 @@ void StraightForwardRegisterAllocator::AllocateRegisters() {
AllocateControlNode(block->control_node(), block);
ApplyPatches(block);
}
+
+ // Clean up remaining register allocations at the end
+ ClearRegisters();
}
void StraightForwardRegisterAllocator::FreeRegistersUsedBy(ValueNode* node) {
@@ -1607,8 +1610,8 @@ void StraightForwardRegisterAllocator::SpillRegisters() {
double_registers_.ForEachUsedRegister(spill);
}
-template <typename RegisterT>
-void StraightForwardRegisterAllocator::SpillAndClearRegisters(
+template <typename RegisterT, bool spill>
+void StraightForwardRegisterAllocator::ClearRegisters(
RegisterFrameState<RegisterT>& registers) {
while (registers.used() != registers.empty()) {
RegisterT reg = registers.used().first();
@@ -1617,7 +1620,9 @@ void StraightForwardRegisterAllocator::SpillAndClearRegisters(
printing_visitor_->os()
<< " clearing registers with " << PrintNodeLabel(node) << "\n";
}
- Spill(node);
+ if (spill) {
+ Spill(node);
+ }
registers.FreeRegistersUsedBy(node);
DCHECK(!registers.used().has(reg));
}
@@ -1628,6 +1633,11 @@ void StraightForwardRegisterAllocator::SpillAndClearRegisters() {
SpillAndClearRegisters(double_registers_);
}
+void StraightForwardRegisterAllocator::ClearRegisters() {
+ ClearRegisters(general_registers_);
+ ClearRegisters(double_registers_);
+}
+
void StraightForwardRegisterAllocator::SaveRegisterSnapshot(NodeBase* node) {
RegisterSnapshot snapshot;
general_registers_.ForEachUsedRegister([&](Register reg, ValueNode* node) {
diff --git a/src/maglev/maglev-regalloc.h b/src/maglev/maglev-regalloc.h
index 6688f494c4cfcee74e9836765960089b42926c72..ad0ff7beb73e5e02130a310941b0e4bb69c2c9c1 100644
--- a/src/maglev/maglev-regalloc.h
+++ b/src/maglev/maglev-regalloc.h
@@ -224,8 +224,13 @@ class StraightForwardRegisterAllocator {
void Spill(ValueNode* node);
void SpillRegisters();
+ template <typename RegisterT, bool spill = false>
+ void ClearRegisters(RegisterFrameState<RegisterT>& registers);
template <typename RegisterT>
- void SpillAndClearRegisters(RegisterFrameState<RegisterT>& registers);
+ void SpillAndClearRegisters(RegisterFrameState<RegisterT>& registers) {
+ ClearRegisters<RegisterT, true>(registers);
+ }
+ void ClearRegisters();
void SpillAndClearRegisters();
void SaveRegisterSnapshot(NodeBase* node);
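The regalloc fix folds `SpillAndClearRegisters` and the new non-spilling `ClearRegisters` into one helper parameterized by a `bool spill` template argument, then calls the non-spilling variant once at the end of `AllocateRegisters()` so codegen never sees stale register state. A simplified sketch of that pattern — the `RegisterFile` model here is hypothetical:

```cpp
#include <cassert>
#include <vector>

// Minimal model of a register file: used slots hold node ids, and a spilled
// list records which nodes were written back to the stack frame.
struct RegisterFile {
  std::vector<int> used;     // node ids currently allocated to registers
  std::vector<int> spilled;  // node ids spilled before being freed
};

// Mirrors the patch: one templated helper frees every used register, and the
// bool template parameter decides whether nodes are spilled first.
// SpillAndClearRegisters becomes ClearRegisters<true>; the new end-of-
// allocation cleanup is ClearRegisters<false>, which just drops the stale
// allocations without touching the frame.
template <bool spill>
void ClearRegisters(RegisterFile& regs) {
  while (!regs.used.empty()) {
    int node = regs.used.back();
    if (spill) regs.spilled.push_back(node);
    regs.used.pop_back();
  }
}
```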


@@ -0,0 +1,475 @@
From 0000000000000000000000000000000000000000 Mon Sep 17 00:00:00 2001
From: Darius Mercadier <dmercadier@chromium.org>
Date: Wed, 5 Nov 2025 14:06:54 +0100
Subject: [turboshaft] Avoid introducing too many Variables
.... if we have very large merges.
Cf https://crbug.com/418027512#comment5 for explanations of why this
is necessary (and the following comment for why I don't see a good
alternative to this CL).
I've locally confirmed that this fixes the OOM from
https://crbug.com/457625181, and it reduces memory consumption on
binaries/crbug-40219016-zelda/zelda.wasm (from
https://crbug.com/418027512) by 20+%.
Bug: 418027512, 457625181
Change-Id: If55af659667723ce85ff71bcac66a43aff863e05
Reviewed-on: https://chromium-review.googlesource.com/c/v8/v8/+/7119378
Commit-Queue: Darius Mercadier <dmercadier@chromium.org>
Auto-Submit: Darius Mercadier <dmercadier@chromium.org>
Reviewed-by: Matthias Liedtke <mliedtke@chromium.org>
Cr-Commit-Position: refs/heads/main@{#103534}
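The guard this patch threads through the reducers, `CanCreateNVariables`, works because the per-Variable memory cost is proportional to the widest merge in the graph (tracked incrementally as `max_merge_pred_count_`): cloning N operations may require N Variables, each holding one entry per merge predecessor. A rough sketch of such a budget check, with an assumed per-entry size — the numbers are illustrative, not V8's:

```cpp
#include <cassert>
#include <cstdint>

// Hypothetical model of the guard added in this patch: phases that would
// clone n operations first check the projected Variable cost against a
// budget, where per-Variable cost scales with the widest merge seen so far.
struct VariableBudget {
  uint64_t max_merge_pred_count;  // widest merge bound in the graph
  uint64_t budget_bytes;          // assumed allocation budget

  bool CanCreateNVariables(uint64_t n) const {
    const uint64_t kBytesPerEntry = 8;  // assumed per-predecessor entry size
    return n * max_merge_pred_count * kBytesPerEntry <= budget_bytes;
  }
};
```

When the check fails, each optimization (block inlining, branch elimination, loop peeling/unrolling) simply bails out and emits the unoptimized form, which is why the `void` peel/unroll entry points become `bool` in the hunks below.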
diff --git a/src/compiler/turboshaft/branch-elimination-reducer.h b/src/compiler/turboshaft/branch-elimination-reducer.h
index f115c86894f0cf739d6381f7844e5589831cc209..d917d27bd3964ba07b41efa49b86435ae7720064 100644
--- a/src/compiler/turboshaft/branch-elimination-reducer.h
+++ b/src/compiler/turboshaft/branch-elimination-reducer.h
@@ -323,6 +323,10 @@ class BranchEliminationReducer : public Next {
goto no_change;
}
+ if (!__ CanCreateNVariables(destination_origin->OpCountUpperBound())) {
+ goto no_change;
+ }
+
if (const BranchOp* branch = last_op.template TryCast<BranchOp>()) {
V<Word32> condition =
__ template MapToNewGraph<true>(branch->condition());
diff --git a/src/compiler/turboshaft/copying-phase.h b/src/compiler/turboshaft/copying-phase.h
index c1d20d06016a61927dc1791df44079b0793ba3a0..095b2ed41e6cbd524e364a4b71279262d3f7b8e6 100644
--- a/src/compiler/turboshaft/copying-phase.h
+++ b/src/compiler/turboshaft/copying-phase.h
@@ -714,9 +714,23 @@ class GraphVisitor : public OutputGraphAssembler<GraphVisitor<AfterNext>,
if (Asm().CanAutoInlineBlocksWithSinglePredecessor() &&
terminator.Is<GotoOp>()) {
Block* destination = terminator.Cast<GotoOp>().destination;
- if (destination->PredecessorCount() == 1) {
- block_to_inline_now_ = destination;
- return;
+ // Inlining the destination will require setting it in needs_variables_
+ // mode; we thus check that we can actually create enough variables to do
+ // this.
+ // TODO(dmercadier): in practice, the only reason we need variables for
+ // the destination is because we could be currently in a phase that cloned
+ // the current block, which could lead to {destination} being cloned as
+ // well. Not all phases can do this, so we could check that we're not in
+ // such a phase, and if so, not use variables for the destination. One way
+ // to do this would be to have a DisallowCloningReducer which would
+ // static_assert that LoopUnrolling/LoopPeeling/BranchElimination aren't
+ // on the stack and would also prevent using CloneSubGraph,
+ // CloneAndInlineBlock and CloneBlockAndGoto.
+ if (Asm().CanCreateNVariables(destination->OpCountUpperBound())) {
+ if (destination->PredecessorCount() == 1) {
+ block_to_inline_now_ = destination;
+ return;
+ }
}
}
// Just going through the regular VisitOp function.
diff --git a/src/compiler/turboshaft/graph.h b/src/compiler/turboshaft/graph.h
index 030171f54156b2c6c89887856444f99463b3b98a..7fa8245290a0e311150f7ac3d86212bd1943a37c 100644
--- a/src/compiler/turboshaft/graph.h
+++ b/src/compiler/turboshaft/graph.h
@@ -608,6 +608,7 @@ class Graph {
operation_origins_.Reset();
operation_types_.Reset();
dominator_tree_depth_ = 0;
+ max_merge_pred_count_ = 0;
#ifdef DEBUG
block_type_refinement_.Reset();
// Do not reset of graph_created_from_turbofan_ as it is propagated along
@@ -791,6 +792,8 @@ class Graph {
bound_blocks_.push_back(block);
uint32_t depth = block->ComputeDominator();
dominator_tree_depth_ = std::max<uint32_t>(dominator_tree_depth_, depth);
+ max_merge_pred_count_ =
+ std::max<uint32_t>(max_merge_pred_count_, block->PredecessorCount());
#ifdef DEBUG
if (v8_flags.turboshaft_trace_emitted) {
@@ -1016,6 +1019,8 @@ class Graph {
uint32_t DominatorTreeDepth() const { return dominator_tree_depth_; }
+ uint32_t max_merge_pred_count() const { return max_merge_pred_count_; }
+
const GrowingOpIndexSidetable<Type>& operation_types() const {
return operation_types_;
}
@@ -1068,6 +1073,7 @@ class Graph {
std::swap(next_block_, companion.next_block_);
std::swap(block_permutation_, companion.block_permutation_);
std::swap(graph_zone_, companion.graph_zone_);
+ std::swap(max_merge_pred_count_, companion.max_merge_pred_count_);
op_to_block_.SwapData(companion.op_to_block_);
source_positions_.SwapData(companion.source_positions_);
operation_origins_.SwapData(companion.operation_origins_);
@@ -1204,6 +1210,9 @@ class Graph {
GrowingOpIndexSidetable<SourcePosition> source_positions_;
GrowingOpIndexSidetable<OpIndex> operation_origins_;
uint32_t dominator_tree_depth_ = 0;
+ // {max_merge_pred_count_} stores the maximum number of predecessors that any
+ // Merge in the graph has.
+ uint32_t max_merge_pred_count_ = 0;
GrowingOpIndexSidetable<Type> operation_types_;
#ifdef DEBUG
GrowingBlockSidetable<TypeRefinements> block_type_refinement_;
diff --git a/src/compiler/turboshaft/loop-peeling-reducer.h b/src/compiler/turboshaft/loop-peeling-reducer.h
index a9b5eaaf4c88354164b3a5833d4bd6b2760b12a0..b7df7acb61d048669a2cacfbc4e2156df69788dc 100644
--- a/src/compiler/turboshaft/loop-peeling-reducer.h
+++ b/src/compiler/turboshaft/loop-peeling-reducer.h
@@ -57,8 +57,7 @@ class LoopPeelingReducer : public Next {
const Block* dst = gto.destination;
if (dst->IsLoop() && !gto.is_backedge && CanPeelLoop(dst)) {
if (ShouldSkipOptimizationStep()) goto no_change;
- PeelFirstIteration(dst);
- return {};
+ if (PeelFirstIteration(dst)) return {};
} else if (IsEmittingPeeledIteration() && dst == current_loop_header_) {
// We skip the backedge of the loop: PeelFirstIeration will instead emit a
// forward edge to the non-peeled header.
@@ -111,13 +110,21 @@ class LoopPeelingReducer : public Next {
kEmittingUnpeeledBody
};
- void PeelFirstIteration(const Block* header) {
+ bool PeelFirstIteration(const Block* header) {
TRACE("LoopPeeling: peeling loop at " << header->index());
DCHECK_EQ(peeling_, PeelingStatus::kNotPeeling);
ScopedModification<PeelingStatus> scope(&peeling_,
PeelingStatus::kEmittingPeeledLoop);
current_loop_header_ = header;
+ constexpr int kNumberOfLoopCopies = 2; // peeled + unpeeled
+ size_t op_count_upper_bound =
+ loop_finder_.GetLoopInfo(header).op_count * kNumberOfLoopCopies;
+ if (!__ CanCreateNVariables(op_count_upper_bound)) {
+ TRACE("> Too many variables, skipping peeling");
+ return false;
+ }
+
// Emitting the peeled iteration.
auto loop_body = loop_finder_.GetLoopBody(header);
// Note that this call to CloneSubGraph will not emit the backedge because
@@ -133,7 +140,7 @@ class LoopPeelingReducer : public Next {
// While peeling, we realized that the 2nd iteration of the loop is not
// reachable.
TRACE("> Second iteration is not reachable, stopping now");
- return;
+ return true;
}
// We now emit the regular unpeeled loop.
@@ -141,6 +148,7 @@ class LoopPeelingReducer : public Next {
TRACE("> Emitting unpeeled loop body");
__ CloneSubGraph(loop_body, /* keep_loop_kinds */ true,
/* is_loop_after_peeling */ true);
+ return true;
}
bool CanPeelLoop(const Block* header) {
diff --git a/src/compiler/turboshaft/loop-unrolling-reducer.h b/src/compiler/turboshaft/loop-unrolling-reducer.h
index 181d298bfa27d21f013016b34a586078d12f8a58..92d6f7b36d4c5c0a64723f7d18427a62347bad9f 100644
--- a/src/compiler/turboshaft/loop-unrolling-reducer.h
+++ b/src/compiler/turboshaft/loop-unrolling-reducer.h
@@ -211,6 +211,11 @@ class V8_EXPORT_PRIVATE LoopUnrollingAnalyzer {
info.op_count < kMaxLoopSizeForPartialUnrolling;
}
+ size_t GetLoopOpCount(const Block* loop_header) {
+ DCHECK(loop_header->IsLoop());
+ return loop_finder_.GetLoopInfo(loop_header).op_count;
+ }
+
// The returned unroll count is the total number of copies of the loop body
// in the resulting graph, i.e., an unroll count of N means N-1 copies of the
// body which were partially unrolled, and 1 for the original/remaining body.
@@ -383,14 +388,12 @@ class LoopUnrollingReducer : public Next {
// header (note that loop headers only have 2 predecessor, including the
// backedge), and that isn't the backedge.
if (ShouldSkipOptimizationStep()) goto no_change;
- if (analyzer_.ShouldRemoveLoop(dst)) {
- RemoveLoop(dst);
+ if (analyzer_.ShouldRemoveLoop(dst) && RemoveLoop(dst)) {
return {};
- } else if (analyzer_.ShouldFullyUnrollLoop(dst)) {
- FullyUnrollLoop(dst);
+ } else if (analyzer_.ShouldFullyUnrollLoop(dst) && FullyUnrollLoop(dst)) {
return {};
- } else if (analyzer_.ShouldPartiallyUnrollLoop(dst)) {
- PartiallyUnrollLoop(dst);
+ } else if (analyzer_.ShouldPartiallyUnrollLoop(dst) &&
+ PartiallyUnrollLoop(dst)) {
return {};
}
} else if ((unrolling_ == UnrollingStatus::kUnrolling) &&
@@ -467,9 +470,9 @@ class LoopUnrollingReducer : public Next {
// and would like to not emit the loop body that follows.
kRemoveLoop,
};
- void RemoveLoop(const Block* header);
- void FullyUnrollLoop(const Block* header);
- void PartiallyUnrollLoop(const Block* header);
+ bool RemoveLoop(const Block* header);
+ bool FullyUnrollLoop(const Block* header);
+ bool PartiallyUnrollLoop(const Block* header);
void FixLoopPhis(const Block* input_graph_loop, Block* output_graph_loop,
const Block* backedge_block);
bool IsRunningBuiltinPipeline() {
@@ -508,10 +511,16 @@ class LoopUnrollingReducer : public Next {
};
template <class Next>
-void LoopUnrollingReducer<Next>::PartiallyUnrollLoop(const Block* header) {
+bool LoopUnrollingReducer<Next>::PartiallyUnrollLoop(const Block* header) {
TRACE("LoopUnrolling: partially unrolling loop at " << header->index().id());
DCHECK_EQ(unrolling_, UnrollingStatus::kNotUnrolling);
DCHECK(!skip_next_stack_check_);
+
+ if (!__ CanCreateNVariables(analyzer_.GetLoopOpCount(header))) {
+ TRACE("> Too many variables, skipping unrolling");
+ return false;
+ }
+
unrolling_ = UnrollingStatus::kUnrolling;
auto loop_body = analyzer_.GetLoopBody(header);
@@ -533,7 +542,7 @@ void LoopUnrollingReducer<Next>::PartiallyUnrollLoop(const Block* header) {
__ CloneSubGraph(loop_body, /* keep_loop_kinds */ true);
if (StopUnrollingIfUnreachable(output_graph_header)) {
TRACE("> Next iteration is unreachable, stopping unrolling");
- return;
+ return true;
}
// Emitting the subsequent folded iterations. We set `unrolling_` to
@@ -549,7 +558,7 @@ void LoopUnrollingReducer<Next>::PartiallyUnrollLoop(const Block* header) {
__ CloneSubGraph(loop_body, /* keep_loop_kinds */ false);
if (StopUnrollingIfUnreachable(output_graph_header)) {
TRACE("> Next iteration is unreachable, stopping unrolling");
- return;
+ return true;
}
}
@@ -567,6 +576,7 @@ void LoopUnrollingReducer<Next>::PartiallyUnrollLoop(const Block* header) {
unrolling_ = UnrollingStatus::kNotUnrolling;
TRACE("> Finished partially unrolling loop " << header->index().id());
+ return true;
}
template <class Next>
@@ -622,10 +632,20 @@ void LoopUnrollingReducer<Next>::FixLoopPhis(const Block* input_graph_loop,
}
template <class Next>
-void LoopUnrollingReducer<Next>::RemoveLoop(const Block* header) {
+bool LoopUnrollingReducer<Next>::RemoveLoop(const Block* header) {
TRACE("LoopUnrolling: removing loop at " << header->index().id());
DCHECK_EQ(unrolling_, UnrollingStatus::kNotUnrolling);
DCHECK(!skip_next_stack_check_);
+
+ if (!__ CanCreateNVariables(analyzer_.GetLoopOpCount(header))) {
+ TRACE("> Too many variables, skipping removal");
+ // TODO(dmercadier): in theory, RemoveLoop shouldn't need Variables, since
+ // it cannot be called while unrolling an outer loop (we only unroll
+ // innermost loops). We should teach CloneAndInlineBlock that it doesn't
+ // always need to introduce Variables, and then remove this bailout.
+ return false;
+ }
+
// When removing a loop, we still need to emit the header (since it has to
// always be executed before the 1st iteration anyways), but by setting
// {unrolling_} to `kRemoveLoop`, the final Branch of the loop will become a
@@ -633,15 +653,21 @@ void LoopUnrollingReducer<Next>::RemoveLoop(const Block* header) {
unrolling_ = UnrollingStatus::kRemoveLoop;
__ CloneAndInlineBlock(header);
unrolling_ = UnrollingStatus::kNotUnrolling;
+ return true;
}
template <class Next>
-void LoopUnrollingReducer<Next>::FullyUnrollLoop(const Block* header) {
+bool LoopUnrollingReducer<Next>::FullyUnrollLoop(const Block* header) {
TRACE("LoopUnrolling: fully unrolling loop at " << header->index().id());
DCHECK_EQ(unrolling_, UnrollingStatus::kNotUnrolling);
DCHECK(!skip_next_stack_check_);
ScopedModification<bool> skip_stack_checks(&skip_next_stack_check_, true);
+ if (!__ CanCreateNVariables(analyzer_.GetLoopOpCount(header))) {
+ TRACE("> Too many variables, skipping unrolling");
+ return false;
+ }
+
size_t iter_count = analyzer_.GetIterationCount(header).exact_count();
TRACE("> iter_count: " << iter_count);
@@ -654,7 +680,7 @@ void LoopUnrollingReducer<Next>::FullyUnrollLoop(const Block* header) {
__ CloneSubGraph(loop_body, /* keep_loop_kinds */ false);
if (StopUnrollingIfUnreachable()) {
TRACE("> Next iteration is unreachable, stopping unrolling");
- return;
+ return true;
}
}
@@ -667,6 +693,7 @@ void LoopUnrollingReducer<Next>::FullyUnrollLoop(const Block* header) {
unrolling_ = UnrollingStatus::kNotUnrolling;
TRACE("> Finished fully unrolling loop " << header->index().id());
+ return true;
}
#undef TRACE
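The control-flow change in this file can be summarized outside of V8: each transformation now returns whether it actually ran, and the caller's `&&` chain falls through to emitting the loop unchanged when one of them bails out. A minimal standalone sketch with illustrative names (these stand-ins are not V8's API):

```cpp
#include <cassert>

// Illustrative stand-ins (not V8 code): each transformation may bail out,
// e.g. when creating the Variables it needs would exceed the memory budget.
enum class Result { kRemoved, kFullyUnrolled, kPartiallyUnrolled, kUnchanged };

struct Loop {
  bool removable, fully_unrollable, partially_unrollable;
  bool has_variable_budget;  // stand-in for CanCreateNVariables()
};

Result ReduceLoop(const Loop& loop) {
  // Mirrors the patch: Should* expresses intent, the transformation itself
  // returns bool, and && short-circuits to "no change" on bailout.
  if (loop.removable && loop.has_variable_budget) {
    return Result::kRemoved;
  } else if (loop.fully_unrollable && loop.has_variable_budget) {
    return Result::kFullyUnrolled;
  } else if (loop.partially_unrollable && loop.has_variable_budget) {
    return Result::kPartiallyUnrolled;
  }
  return Result::kUnchanged;  // loop is emitted as-is
}
```

With the budget available a removable loop is removed; without it, the same loop falls through and is emitted unchanged, exactly like the reducer after this patch.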
diff --git a/src/compiler/turboshaft/turbolev-graph-builder.cc b/src/compiler/turboshaft/turbolev-graph-builder.cc
index 1963ddb4442e450b7f186f517574e31b4d961030..e9db0d8260c784a1da406b7b70d2e623135b572d 100644
--- a/src/compiler/turboshaft/turbolev-graph-builder.cc
+++ b/src/compiler/turboshaft/turbolev-graph-builder.cc
@@ -104,12 +104,7 @@ class BlockOriginTrackingReducer : public Next {
}
void Bind(Block* block) {
Next::Bind(block);
- // The 1st block we bind doesn't exist in Maglev and is meant to hold
- // Constants (which in Maglev are not in any block), and thus
- // {maglev_input_block_} should still be nullptr. In all other cases,
- // {maglev_input_block_} should not be nullptr.
- DCHECK_EQ(maglev_input_block_ == nullptr,
- block == &__ output_graph().StartBlock());
+ DCHECK_NOT_NULL(maglev_input_block_);
turboshaft_block_origins_[block->index()] = maglev_input_block_;
}
@@ -504,9 +499,11 @@ class GraphBuildingNodeProcessor {
block_mapping_[block] =
block->is_loop() ? __ NewLoopHeader() : __ NewBlock();
}
- // Constants are not in a block in Maglev but are in Turboshaft. We bind a
- // block now, so that Constants can then be emitted.
- __ Bind(__ NewBlock());
+ // Constants are not in a block in Maglev but are in Turboshaft. We bind the
+ // 1st block now, so that Constants can then be emitted.
+ const maglev::BasicBlock* first_maglev_block = graph->blocks().front();
+ __ SetMaglevInputBlock(first_maglev_block);
+ __ Bind(block_mapping_[first_maglev_block]);
// Initializing undefined constant so that we don't need to recreate it too
// often.
@@ -590,9 +587,20 @@ class GraphBuildingNodeProcessor {
Block* turboshaft_block = Map(maglev_block);
if (__ current_block() != nullptr) {
- // The first block for Constants doesn't end with a Jump, so we add one
- // now.
- __ Goto(turboshaft_block);
+ // We must be in the first block of the graph, inserted by Turboshaft in
+ // PreProcessGraph so that constants can be bound in a block. No need to
+ // do anything else: we don't emit a Goto so that the actual 1st block of
+ // the Maglev graph gets inlined into this first block of the Turboshaft
+ // graph, which, in addition to saving a Goto, saves the need to clone the
+ // destination into the current block later, and also ensures that
+ // Parameters are always in the 1st block.
+ DCHECK_EQ(__ output_graph().block_count(), 1);
+ DCHECK_EQ(maglev_block->id(), 0);
+ DCHECK_EQ(__ current_block(), block_mapping_[maglev_block]);
+ // maglev_input_block should have been set by calling SetMaglevInputBlock
+ // in PreProcessGraph.
+ DCHECK_EQ(__ maglev_input_block(), maglev_block);
+ return maglev::BlockProcessResult::kContinue;
}
#ifdef DEBUG
diff --git a/src/compiler/turboshaft/variable-reducer.h b/src/compiler/turboshaft/variable-reducer.h
index b11338bdf6e928cd09a0bdbad42fd835c8210c36..03cc2fa77f0d4a194893a8be5747d6de887e5ee9 100644
--- a/src/compiler/turboshaft/variable-reducer.h
+++ b/src/compiler/turboshaft/variable-reducer.h
@@ -9,6 +9,7 @@
#include <optional>
#include "src/base/logging.h"
+#include "src/base/macros.h"
#include "src/codegen/machine-type.h"
#include "src/compiler/turboshaft/assembler.h"
#include "src/compiler/turboshaft/graph.h"
@@ -91,6 +92,15 @@ class VariableReducer : public RequiredOptimizationReducer<AfterNext> {
public:
TURBOSHAFT_REDUCER_BOILERPLATE(VariableReducer)
+ ~VariableReducer() {
+ if (too_many_variables_bailouts_count_ != 0 &&
+ V8_UNLIKELY(v8_flags.trace_turbo_bailouts)) {
+ std::cout << "Bailing out from block cloning "
+ << too_many_variables_bailouts_count_ << " time"
+ << (too_many_variables_bailouts_count_ > 1 ? "s" : "") << "\n";
+ }
+ }
+
void Bind(Block* new_block) {
Next::Bind(new_block);
@@ -190,6 +200,26 @@ class VariableReducer : public RequiredOptimizationReducer<AfterNext> {
return table_.GetPredecessorValue(var, predecessor_index);
}
+ bool CanCreateNVariables(size_t n) {
+ // Merges with many predecessors combined with many variables can quickly
+ // blow up memory since the SnapshotTable needs to create a state whose
+ // size can be up to number_of_predecessors * variable_count (note: in
+ // practice, it's often less than variable_count, since only variables
+ // that are live in at least one predecessor are counted). To avoid OOM
+ // or otherwise huge memory consumption, we thus stop creating variables
+ // (and bail out on optimizations that need variables) when this number
+ // becomes too large. I somewhat arbitrarily selected 100K here, which
+ // sounds high, but in terms of memory it's just 100K*8 = 800KB, which is
+ // less than 1MB and isn't going to amount to much in a function that is
+ // probably very large if it managed to reach this limit.
+ constexpr uint32_t kMaxAllowedMergeStateSize = 100'000;
+ bool can_create =
+ __ input_graph().max_merge_pred_count() * (variable_count_ + n) <
+ kMaxAllowedMergeStateSize;
+ if (!can_create) too_many_variables_bailouts_count_++;
+ return can_create;
+ }
+
void SetVariable(Variable var, OpIndex new_index) {
DCHECK(!is_temporary_);
if (V8_UNLIKELY(__ generating_unreachable_operations())) return;
@@ -206,10 +236,12 @@ class VariableReducer : public RequiredOptimizationReducer<AfterNext> {
Variable NewLoopInvariantVariable(MaybeRegisterRepresentation rep) {
DCHECK(!is_temporary_);
+ variable_count_++;
return table_.NewKey(VariableData{rep, true}, OpIndex::Invalid());
}
Variable NewVariable(MaybeRegisterRepresentation rep) {
DCHECK(!is_temporary_);
+ variable_count_++;
return table_.NewKey(VariableData{rep, false}, OpIndex::Invalid());
}
@@ -314,6 +346,10 @@ class VariableReducer : public RequiredOptimizationReducer<AfterNext> {
__ input_graph().block_count(), std::nullopt, __ phase_zone()};
bool is_temporary_ = false;
+ // Tracks the number of variables that have been created.
+ uint32_t variable_count_ = 0;
+ uint32_t too_many_variables_bailouts_count_ = 0;
+
// {predecessors_} is used during merging, but we use an instance variable for
// it, in order to save memory and not reallocate it for each merge.
ZoneVector<Snapshot> predecessors_{__ phase_zone()};
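The bound enforced by `CanCreateNVariables` is plain arithmetic. A hedged sketch of the same check as a free function (the names mirror the patch, but this is not the V8 implementation):

```cpp
#include <cassert>
#include <cstdint>

// Same limit as the patch: ~100K slots * 8 bytes/slot = ~800KB worst case.
constexpr uint32_t kMaxAllowedMergeStateSize = 100'000;

// Sketch of the check: a merge snapshot can need up to
// predecessors * variables slots, so refuse to grow past the cap.
bool CanCreateNVariables(uint32_t max_merge_pred_count,
                         uint32_t variable_count, uint32_t n) {
  return max_merge_pred_count * (variable_count + n) <
         kMaxAllowedMergeStateSize;
}
```

For example, with 32 merge predecessors and 2,000 existing variables, 200 more still fit (32 * 2,200 = 70,400 slots), but the same request at 3,000 variables trips the limit (32 * 3,200 = 102,400 >= 100,000) and the optimization bails out.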
diff --git a/test/unittests/compiler/turboshaft/control-flow-unittest.cc b/test/unittests/compiler/turboshaft/control-flow-unittest.cc
index 49e1c8c2561bd010d12e5229c4d6594b9846b40b..b39b073a2ea899550fe0df6a81dcebc2d75efa49 100644
--- a/test/unittests/compiler/turboshaft/control-flow-unittest.cc
+++ b/test/unittests/compiler/turboshaft/control-flow-unittest.cc
@@ -55,7 +55,7 @@ TEST_F(ControlFlowTest, DefaultBlockInlining) {
// BranchElimination should remove such branches by cloning the block with the
// branch. In the end, the graph should contain (almost) no branches anymore.
TEST_F(ControlFlowTest, BranchElimination) {
- static constexpr int kSize = 10000;
+ static constexpr int kSize = 200;
auto test = CreateFromGraph(1, [](auto& Asm) {
V<Word32> cond =


@@ -40,6 +40,7 @@ DevToolsSecurity -enable
# security import "$dir"/public.key -k $KEY_CHAIN
# Generate Trust Settings
# TODO: Remove NPX
npm_config_yes=true npx ts-node "$(dirname $0)"/gen-trust.ts "$dir"/certificate.cer "$dir"/trust.xml
# Import Trust Settings


@@ -0,0 +1,32 @@
const yaml = require('yaml');
const fs = require('node:fs');
const path = require('node:path');
const PREFIX = '# AUTOGENERATED FILE - DO NOT EDIT MANUALLY\n# ONLY EDIT .github/workflows/pipeline-segment-electron-build.yml\n\n';
const base = path.resolve(__dirname, '../.github/workflows/pipeline-segment-electron-build.yml');
const target = path.resolve(__dirname, '../.github/workflows/pipeline-segment-electron-publish.yml');
const baseContents = fs.readFileSync(base, 'utf-8');
const parsedBase = yaml.parse(baseContents);
parsedBase.jobs.build.permissions = {
'artifact-metadata': 'write',
attestations: 'write',
contents: 'read',
'id-token': 'write'
};
if (process.argv.includes('--check')) {
if (fs.readFileSync(target, 'utf-8') !== PREFIX + yaml.stringify(parsedBase)) {
console.error(`${target} is out of date`);
console.error('Please run "copy-pipeline-segment-publish.js" to update it');
process.exit(1);
}
} else {
fs.writeFileSync(
target,
PREFIX + yaml.stringify(parsedBase)
);
}
