Compare commits

...

144 Commits

Author SHA1 Message Date
Keeley Hammond
26437c6202 fix: discover ThinLTO cache path from GN instead of hardcoding
Addresses review feedback from @deepak1556: the hardcoded
`out\Default\thinlto-cache` path goes out of sync if upstream
changes `cache_dir` in Chromium's build/config/compiler/BUILD.gn.

Read the `/lldltocache:` flag from `gn desc` on a linked target
(`//electron:electron_app`) and pre-create whatever path GN
actually configured. Skips the pre-create entirely when ThinLTO
is disabled (non-official builds), which is the correct no-op.
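The flag-scraping step described above can be sketched as a small pure helper (hypothetical; the real script shells out to `gn desc` on `//electron:electron_app` and this only shows the parsing):

```typescript
// Sketch: find the ThinLTO cache dir that GN actually configured, instead
// of hardcoding out\Default\thinlto-cache. `ldflags` stands in for the
// linker-flag list reported by `gn desc`; helper name is illustrative.
function thinLtoCacheDir(ldflags: string[]): string | null {
  // The flag looks like: /lldltocache:../../out/Default/thinlto-cache
  const prefix = '/lldltocache:';
  for (const flag of ldflags) {
    if (flag.startsWith(prefix)) return flag.slice(prefix.length);
  }
  // ThinLTO disabled (non-official builds): nothing to pre-create.
  return null;
}
```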
2026-04-23 12:12:31 -07:00
Keeley Hammond
f038413cf9 fix: pre-create thinlto-cache dir on Windows to avoid bindflt race
Co-Authored-By: Claude <svc-devxp-claude@slack-corp.com>
2026-04-23 10:16:38 -07:00
Samuel Attard
099c5c0038 build: use 32-core Windows ARC runners for build jobs (#51256)
* build: use 32-core Windows ARC runners for build jobs

* ci: add siso patch to retry ERROR_INVALID_PARAMETER on ninja file open

The existing patch removes the redundant per-chunk re-open in
fileParser.readFile, but the single remaining os.Open per subninja can
still hit the bindflt race under the ~90k-file concurrent open burst on
the 32-core Windows runners. Layer a second patch on top that wraps that
open in a Windows-only 5-attempt retry (5-80ms backoff) so a single
transient failure no longer aborts the whole manifest load.
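The retry shape described above (the actual patch is in siso's Go source) can be illustrated generically:

```typescript
// Generic sketch of the retry described above: a handful of attempts with
// short exponential backoff, retrying only the transient error class.
// Illustrative only; the real patch wraps os.Open in siso's file parser.
async function openWithRetry<T>(
  open: () => Promise<T>,
  isTransient: (err: unknown) => boolean,
  attempts = 5,
  baseDelayMs = 5
): Promise<T> {
  for (let i = 0; ; i++) {
    try {
      return await open();
    } catch (err) {
      if (i >= attempts - 1 || !isTransient(err)) throw err;
      // Backoff doubles each attempt: 5, 10, 20, 40ms between five tries.
      await new Promise((resolve) => setTimeout(resolve, baseDelayMs * 2 ** i));
    }
  }
}
```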
2026-04-23 02:23:32 -07:00
Charles Kerr
2c46abe361 test: add linux coverage for default protocol client APIs (#51253)
Add Linux-only app tests to check the default protocol handler.
This includes adding reusable XDG mock fixtures.
2026-04-23 10:10:08 +02:00
David Sanders
05e0cd085c build: drop script/run-gn-format.py (#51263) 2026-04-23 09:52:36 +02:00
Asish Kumar
7c56577639 fix: preserve return value in deprecate.removeFunction (#51028)
The wrapper returned by `deprecate.removeFunction` dropped the wrapped
function's return value because it called `fn.apply` without returning the result.
Every other function wrapper in this module (`renameFunction`,
`moveAPI`) forwards the return value, and the generic type signature
`<T extends Function>(fn: T, ...): T` promises that `T`'s return type
is preserved. Callers that relied on the return value of a function
wrapped by `removeFunction` would silently receive `undefined` from
the wrapper.

Mirror the forwarding done by `renameFunction` / `moveAPI` and extend
the existing spec to assert that the wrapper preserves both the return
value and the `this` context of the deprecated function.

Assisted-By: Claude Opus 4.6

Signed-off-by: Asish Kumar <officialasishkumar@gmail.com>
2026-04-23 00:05:30 +00:00
Robo
350de668e2 refactor: api::autoUpdater managed by cppgc (#51241) 2026-04-23 05:38:47 +09:00
Robo
111b6275ee build: track PDL files as inputs in inspector GN build (#51239)
* fix: track PDL files as inputs in inspector GN build

* chore: address review feedback
2026-04-22 13:48:44 +00:00
Charles Kerr
54eb30a642 chore: remove calls to DeprecatedLayoutImmediately (#51183)
* chore: remove calls to DeprecatedLayoutImmediately

Replace the calls to `DeprecatedLayoutImmediately` that have test
coverage.

- The docked DevTools test added last week covers the three calls in IWCV.

- api-web-contents-view-spec covers the calls from NativeWindow{Views,Mac}.

There are a couple of remaining calls that don't have test coverage yet.
I'll get to them in a followup.

* test: handle both sync and microtask layout

* refactor: add a FlushPendingRootLayout() helper
2026-04-22 12:34:07 +02:00
Charles Kerr
71e8a5ca80 build: FTBFS when pdf is disabled (#51223)
fix: FTBFS when pdf is disabled

pdf_features.h has a static_assert that pdf is enabled
2026-04-22 12:33:59 +02:00
Shelley Vohr
c74de25b01 fix: ignore draggable regions in hidden WebContentsView (#51200)
fix: ignore draggable regions in hidden WebContentsView

Hidden child WebContentsViews were still contributing their draggable
regions to the parent window's non-client hit test, so clicks in the
area where a hidden view's draggable element would render still dragged
the window. Early-return HTNOWHERE when the view is not visible.
2026-04-22 12:33:41 +02:00
Samuel Attard
44e4839580 build: update @electron/get to v5 (#51234) 2026-04-21 19:54:24 -07:00
Samuel Attard
fac0a1624b build: use Yarn JsZipImpl for node-modules link to fix arm32 OOM (#51226)
* build: use Yarn JsZipImpl for node-modules link step

Patch the vendored .yarn/releases/yarn-4.12.0.cjs so the node-modules
(and pnpm-loose) linker constructs its read-only ZipOpenFS with
customZipImplementation = JsZipImpl instead of the default WASM
LibZipImpl.

LibZipImpl loads each cache zip fully into the Emscripten WASM heap and
malloc's a WASM buffer per file read. With up to 80 zips held open, the
32-bit arm32v7 test container intermittently fails to allocate the ~9 MB
buffer for typescript/lib/typescript.js. Yarn's cross-FS copyFilePromise
swallows the real error and surfaces it as a generic
"EINVAL: invalid argument, copyfile", which has been failing ~1-in-3
linux-arm test shards at Install Dependencies since 2026-04-13.

JsZipImpl opens zips by fd, reads only the central directory, and pulls
individual entries into plain Node Buffers — no WASM heap involved.
There is no .yarnrc.yml or env knob for this, so the vendored release is
edited directly. .claude/README.md documents the patch and how to
re-apply it on Yarn upgrades.

Refs: yarnpkg/berry#3972, yarnpkg/berry#6722, yarnpkg/berry#6550

* docs: move JsZipImpl patch notes to .yarn/README.md

Relocate the patch rationale next to the vendored release it documents,
reword the intro for its new home, and update the header comment in
yarn-4.12.0.cjs to point at .yarn/README.md.
2026-04-21 17:58:46 -07:00
Robo
23d95ea9f8 refactor: api::cookies managed by cppgc (#51196)
* refactor: api::cookies managed by cppgc

* chore: remove unused header
2026-04-22 05:28:31 +09:00
Shelley Vohr
89050762c6 fix: reset printToPDF queue after a rejection (#51174)
fix: reset printToPDF queue after a rejection

The module-scoped `pendingPromise` in `webContents.printToPDF` was chained
with `.then(onFulfilled)` and never cleared. Once a call rejected (e.g.
an out-of-range `pageRanges` like `"999"`), subsequent calls chained onto
the rejected promise and short-circuited without ever invoking
`_printToPDF` — so every following call re-surfaced the original error.

Replace the shared variable with a per-`WebContents` `WeakMap` queue that
swallows prior rejections before chaining and clears its entry once the
tail drains.
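The per-`WebContents` queue can be sketched as a generic helper (a hypothetical simplification; the real fix lives in `webContents.printToPDF`):

```typescript
// Sketch: serialize async work per key, swallowing earlier rejections so
// one failed call cannot short-circuit later ones, and clearing the
// WeakMap entry once the tail of the chain drains.
const queues = new WeakMap<object, Promise<unknown>>();

function enqueue<T>(key: object, work: () => Promise<T>): Promise<T> {
  const prev = queues.get(key) ?? Promise.resolve();
  // Swallow a prior rejection before chaining the next call onto it.
  const next = prev.catch(() => {}).then(work);
  queues.set(key, next);
  // Remove the entry when this call settles and is still the tail.
  next.catch(() => {}).finally(() => {
    if (queues.get(key) === next) queues.delete(key);
  });
  return next;
}
```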
2026-04-21 11:36:02 -07:00
Niklas Wenzel
b8be33814e fix: skip heap profiling tests in CI (#51209) 2026-04-21 18:26:14 +00:00
Mr.Chaofan
76a03e1010 feat: Add WebContents::Clone method (#49959) 2026-04-21 10:37:48 -07:00
Shelley Vohr
2ba6d28c09 fix: preserve transparency across setResizable toggles on Windows (#51175)
After #49428 made `NativeWindowViews::CanResize()` return `resizable_`
for frameless windows (instead of `resizable_ && thick_frame_`),
`HWNDMessageHandler::SizeConstraintsChanged()` started adding
`WS_THICKFRAME` to the window style whenever `CanResize()` reported true.
`WS_THICKFRAME` is incompatible with layered (translucent) windows and
destroys their transparency.

`SetContentSizeConstraints` already guards against this by skipping
`OnSizeConstraintsChanged()` when `!thick_frame_`. `SetResizable` did
not, so toggling resizability on a transparent window (e.g.
`setResizable(false)` then `setResizable(true)`) caused the Chromium
path to add `WS_THICKFRAME` and strip transparency.

Apply the same guard in `SetResizable`. Min/max constraints are still
enforced — Chromium reads them from the widget delegate on every
`WM_GETMINMAXINFO`, independent of `SizeConstraintsChanged()`.
2026-04-21 10:32:29 -07:00
Charles Kerr
5ed6e4bf62 test: add Linux-specific test for app.getApplicationNameForProtocol() (#51197)
* test: add Linux-specific test for getApplicationNameForProtocol()

On Linux, use XDG env vars to inject a mock that we can use
to test app.getApplicationNameForProtocol().

* fixup! test: add Linux-specific test for getApplicationNameForProtocol()

better system mocks
2026-04-21 10:16:14 -07:00
Charles Kerr
4d780c67f9 test: improve browser window layout coverage (#51189)
* test: browser window webContents viewport matches content bounds

* test: browser window webContents same after vibrancy change

* fixup! test: browser window webContents viewport matches content bounds

* refactor: extract-method helper getViewportSize()

* refactor: remove non-spec change

* chore: more accurate naming
2026-04-21 10:06:49 -07:00
Keeley Hammond
313f8955d1 fix: trigger ShipIt Mach service after SMJobSubmit to unblock on-demand-only mode (#51191)
* fix: trigger ShipIt Mach service to unblock on-demand-only mode

When a macOS system update is pending, launchd puts the user domain
into on-demand-only mode, preventing ShipIt from starting. The
MachServices endpoint in the job dictionary was registered but never
connected to (a leftover from the XPC removal in 2013).

Instead of removing MachServices, fire a lightweight XPC connection
to the Mach port after SMJobSubmit. This satisfies launchd's
on-demand trigger, starting ShipIt immediately while preserving
KeepAlive retry behavior.

Co-Authored-By: Claude <svc-devxp-claude@slack-corp.com>

* fix: add ResetAtClose to ShipIt MachServices to prevent standing demand

The XPC trigger message sent after SMJobSubmit sits in the Mach port's
kernel queue unread. Without ResetAtClose, this creates standing demand
that causes launchd to respawn ShipIt after a successful exit(0),
defeating KeepAlive.SuccessfulExit = NO.

Set ResetAtClose on the MachServices registration so launchd tears down
and recreates the port when ShipIt exits, flushing the stale trigger.

Co-Authored-By: Claude <svc-devxp-claude@slack-corp.com>

* fix: drain Mach port before exit(0) instead of using ResetAtClose

ResetAtClose blocks KeepAlive.SuccessfulExit retries in on-demand-only
mode because it removes demand when the port resets. Instead, have
ShipIt drain its own Mach service port (via bootstrap_check_in +
mach_msg) before each exit(EXIT_SUCCESS). This clears the standing
demand from the trigger message so launchd won't respawn after a
successful exit, while leaving the message in place on failure exits
so KeepAlive retries remain demand-backed.

Tested in on-demand-only mode (pending macOS update):
- exit(0) + drain: 1 run, no respawn ✓
- exit(1) + no drain: continuous respawn every 2s ✓

Co-Authored-By: Claude <svc-devxp-claude@slack-corp.com>

* chore: update patch

* chore: harden ShipIt Mach trigger and simplify port drain

Scope the XPC trigger to the unprivileged path and add a send barrier
so the connection cannot be released before the message is on the wire.
Reduce drainMachServicePort to bootstrap_check_in (process exit flushes
the queue), dropping the mach_msg loop whose buffer/dealloc usage was
incorrect, and remove the no-op drain from the posix_spawn'd launch
helper. Patch filename regenerated to match the commit subject.

* fix: restore explicit mach_msg drain in drainMachServicePort

bootstrap_check_in alone does not prevent respawn: launchd tracks
outstanding demand independently of the receive right's lifetime, so the
queued trigger message must be explicitly dequeued with mach_msg before
exit(0). Verified empirically (check-in-only: 5 respawns in 10s; full
drain: 1 run). Keep the correctness fixes from the previous commit
(4K buffer, mach_msg_destroy on each receive, no mach_port_deallocate).

---------

Co-authored-by: Claude <svc-devxp-claude@slack-corp.com>
Co-authored-by: Samuel Attard <sattard@anthropic.com>
2026-04-21 09:46:18 -07:00
Robo
1ad6173286 fix: add crash diagnostics for ARM64 power notification crash (#51198)
On ARM64 Windows, UnregisterSuspendResumeNotification (user32) forwards
to PowerUnregisterSuspendResumeNotification (powrprof), which treats the
HPOWERNOTIFY handle as a pointer and dereferences it. The user32 API
returns an opaque handle, not a pointer-backed allocation, causing an
access violation at shutdown.

Add crash keys (pm-reg-handle, pm-reg-memstate, pm-unreg-memstate) to
capture:
- The handle value
- VirtualQuery memory state at both registration and unregistration

If the handle address is MEM_FREE, it confirms the handle is an opaque
index and powrprof is incorrectly dereferencing it. If MEM_COMMIT, it
would indicate a use-after-free of the underlying allocation.

Refs https://github.com/MicrosoftDocs/sdk-api/blob/docs/sdk-api-src/content/powerbase/nf-powerbase-powerunregistersuspendresumenotification.md
2026-04-21 21:20:51 +09:00
dependabot[bot]
9861250310 build(deps): bump dsanders11/project-actions from 2.0.0 to 2.0.1 (#51195)
Bumps [dsanders11/project-actions](https://github.com/dsanders11/project-actions) from 2.0.0 to 2.0.1.
- [Release notes](https://github.com/dsanders11/project-actions/releases)
- [Commits](https://github.com/dsanders11/project-actions/compare/v2.0.0...4b06452b0128cf601dac14399aa668a8eed2d684)

---
updated-dependencies:
- dependency-name: dsanders11/project-actions
  dependency-version: 2.0.1
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-04-21 09:10:23 +00:00
David Sanders
0396220dce ci: don't upload build stats on Windows if build fails (#51199) 2026-04-21 09:05:25 +00:00
dependabot[bot]
a0c3be5656 build(deps): bump actions/setup-node from 6.3.0 to 6.4.0 (#51193)
Bumps [actions/setup-node](https://github.com/actions/setup-node) from 6.3.0 to 6.4.0.
- [Release notes](https://github.com/actions/setup-node/releases)
- [Commits](53b83947a5...48b55a011b)

---
updated-dependencies:
- dependency-name: actions/setup-node
  dependency-version: 6.4.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-04-21 10:46:09 +02:00
dependabot[bot]
b5c68d13d5 build(deps): bump github/codeql-action from 4.35.1 to 4.35.2 (#51194)
Bumps [github/codeql-action](https://github.com/github/codeql-action) from 4.35.1 to 4.35.2.
- [Release notes](https://github.com/github/codeql-action/releases)
- [Changelog](https://github.com/github/codeql-action/blob/main/CHANGELOG.md)
- [Commits](c10b8064de...95e58e9a2c)

---
updated-dependencies:
- dependency-name: github/codeql-action
  dependency-version: 4.35.2
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-04-21 10:45:59 +02:00
Samuel Attard
f7ba34064e ci: centralize build-image SHA and pre-seed node-gyp headers (#51148)
* ci: centralize build-image SHA and pre-seed node-gyp headers

- Add .github/actions/build-image-sha as the single source of truth for
  the ghcr.io/electron/build (and arch-tagged electron/test) image SHA,
  with an optional override input for workflow_dispatch.
- Refactor build.yml, apply-patches.yml, build-git-cache.yml,
  clean-src-cache.yml, clean-orphaned-cache-uploads.yml, and the three
  publish workflows to resolve the SHA via a small ubuntu-slim setup job
  instead of hardcoding it in each file.
- Bump the image to daad061f (electron/build-images#68, which pre-warms
  the node-gyp header cache in the Linux images).
- Run the build.yml setup job on ubuntu-slim instead of ubuntu-latest.
- In install-dependencies (and the inline yarn installs in
  pipeline-electron-lint and generate-types), link deps with
  --mode=skip-build first, run `node-gyp install` with up to 3 retries
  (5s backoff) to populate the header cache, then run the build phase.
  This avoids the parallel-download race that intermittently fails the
  first native-addon configure with an empty common.gypi on cold
  macOS/Windows runners.

* ci: skip node-gyp header pre-seed on Linux

* ci: invoke node-gyp via its JS entrypoint for Windows compat
2026-04-20 15:44:10 -07:00
David Sanders
61e815c28a ci: run clang-tidy on macOS and Windows (#50771)
* ci: run clang-tidy on macOS and Windows

* ci: copy framework headers for clang-tidy on macOS

* chore: exclude electron_smooth_round_rect.cc in CI

* chore: C-style casts are discouraged; use static_cast [google-readability-casting]

* chore: add extra args on Windows to clear out warnings

* ci: fix for macOS --remote-build none
2026-04-20 13:31:22 -07:00
Asish Kumar
51c0b025d8 docs: remove obsolete websql from clearStorageData storages list (#51097)
WebSQL support was removed in Electron 31 (see breaking-changes.md), and
the 'websql' key was subsequently removed from the storages lookup in
session.ClearStorageData() during the Chromium 126 bump (#41868).

The documentation for `ses.clearStorageData()` was not updated at the
time and still lists `websql` as a valid value for the `storages`
option. Passing `websql` now silently has no effect, leaving the docs
out of sync with the implementation.

Remove `websql` from the documented list of storage types so the
documentation matches the actual behavior.

Refs: #33900

Signed-off-by: Asish Kumar <officialasishkumar@gmail.com>
2026-04-20 12:45:00 -07:00
David Franco
bc8ed1808c feat: import shared texture supports nv16 (#50728) 2026-04-20 12:39:13 -07:00
Shelley Vohr
db9bcbd3e4 chore: add Phase Three (node smoke tests) to Node.js upgrade skill (#51127)
Adds test suite workflow, BoringSSL incompatibility reference table,
snapshot regeneration instructions, and commit guidelines.
2026-04-20 12:38:13 -07:00
Samuel Attard
92f0993d94 fix: ensure corsEnabled: false protocol handlers do not work across protocols (#51152)
* fix: ensure corsEnabled: false protocol handlers do not work across protocols

Subresource requests for registered custom protocols are routed to
ElectronURLLoaderFactory via the renderer's per-scheme URLLoaderFactoryBundle
entry, which bypasses the network service's CorsURLLoaderFactory. This meant a
cross-origin page could fetch() a scheme registered with {supportFetchAPI: true}
and read the response body even when {corsEnabled: true} was not set.

Replicate CorsURLLoader::StartRequest's kCorsDisabledScheme gate in
ElectronURLLoaderFactory::CreateLoaderAndStart so cross-origin mode=cors
requests to such schemes fail before the JS handler runs, and tag cross-origin
mode=no-cors responses as opaque so the body is not script-readable while <img>
and similar subresource loads continue to work.

Re-enable the long-disabled "disallows CORS and fetch requests when only
supportFetchAPI is specified" test, add coverage for the opaque/no-cors,
same-origin, handler-not-invoked, corsEnabled-unaffected and net.fetch-unaffected
cases, and migrate spec helpers that were exercising a {supportFetchAPI: true}
scheme cross-origin to a corsEnabled scheme.

* chore: oxfmt
2026-04-20 09:34:37 -07:00
Charles Kerr
bef68b6bb7 fix: dangling raw_ptr regression in DesktopCapturer (#51158)
fix: dangling raw_ptr in desktopCapturer
2026-04-20 10:26:43 -05:00
Erick Zhao
0866d39006 docs: update versioning references (#51030)
* docs: update versioning references

* fixups
2026-04-20 12:02:00 +02:00
Charles Kerr
2e17a57d49 test: increase test coverage for docked DevTools and attached WebContentsView (#51071)
* test: strengthen layout-sensitive coverage for docked DevTools and attached WebContentsView

Improve test coverage for layout-dependent behavior by asserting geometry results:

- Test that opening right-docked DevTools shrinks the inspected page
  viewport and that closing DevTools restores it.

- Test that a newly-attached WebContentsView becomes visible with a
  viewport that immediately matches the window’s content bounds.

* fixup! test: strengthen layout-sensitive coverage for docked DevTools and attached WebContentsView
2026-04-20 12:01:25 +02:00
electron-roller[bot]
0ab23201e7 chore: bump node to v24.15.0 (main) (#51091) 2026-04-20 11:52:38 +02:00
David Sanders
1bea748a45 ci: randomly assign stable prep tasks (#51131)
Assisted-by: Claude Opus 4.6
2026-04-20 11:52:11 +02:00
Samuel Attard
09361b6d7b build: update ANGLE repository URL to GitHub mirror (#51156)
Clone angle from github.com/google/angle in fix-sync action

Co-authored-by: Claude <noreply@anthropic.com>
2026-04-20 11:51:39 +02:00
Samuel Attard
a98d19a3ba build: resolve electron_version from git when building in a worktree (#51146)
BUILD.gn previously hard-coded read_file(".git/packed-refs", ...) and
".git/HEAD" to derive electron_version. In a `git worktree` checkout
.git is a file containing a gitdir: pointer, not a directory, so GN's
read_file() fails and gn gen aborts unless override_electron_version is
set manually.

Ask git itself for the real locations via `git rev-parse --git-dir` /
`--git-common-dir` in a small helper script, and feed those resolved
paths to read_file() and the exec_script dependency list. Behaviour in
a plain clone is unchanged (both resolve to electron/.git/...), and the
tarball case still fails loudly with a pointer to
override_electron_version.
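The pointer-file shape that broke the hardcoded approach can be illustrated with a pure helper (hypothetical; the commit's actual fix shells out to `git rev-parse --git-dir` / `--git-common-dir` rather than parsing the file itself):

```typescript
// In a `git worktree` checkout, .git is a one-line file rather than a
// directory: "gitdir: /abs/path/to/main/.git/worktrees/<name>".
// This helper only illustrates that pointer format; the real build
// script asks git for the resolved paths.
function resolveGitDir(dotGitContents: string, checkoutDir: string): string {
  const m = /^gitdir:\s*(.+?)\s*$/m.exec(dotGitContents);
  // Plain clone: .git is a directory, no pointer file to follow.
  return m ? m[1] : `${checkoutDir}/.git`;
}
```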
2026-04-20 00:54:14 -07:00
Niklas Wenzel
e2143f5e8e feat: support heap profiling in contentTracing (#50826)
Co-authored-by: deepak1556 <hop2deep@gmail.com>
2026-04-20 15:01:13 +09:00
Charles Kerr
a1d28e6764 fix: do not block indefinitely on thumbnails in desktopCapturer (#51128)
* fix: do not block indefinitely on thumbnails in desktopCapturer

fixes dad4ab6 regression

* fix: build error

* fixup! fix: do not block indefinitely on thumbnails in desktopCapturer

chore: remove unnecessary code

* Update shell/browser/api/electron_api_desktop_capturer.cc

Co-authored-by: Niklas Wenzel <dev@nikwen.de>

---------

Co-authored-by: Niklas Wenzel <dev@nikwen.de>
2026-04-19 13:48:38 -05:00
Charles Kerr
72a168f653 fix: linux test shutdown error "AttributeError: type object 'DBusTestCase' has no attribute 'stop_dbus'" (#51129)
stop_dbus() was removed on 2025-09-14 by
99c4800e9e

I think CI isn't seeing this yet because its image has an older version.

This patched script should work on old & new versions of python-dbusmock.
2026-04-18 17:30:53 -07:00
Niklas Wenzel
f6f71fa787 build: remove maintainer issue template (#51142) 2026-04-18 16:59:12 -07:00
Robo
9d77099a8c build: update rules for chrome-release-cls skill (#51140) 2026-04-18 07:43:22 +00:00
Samuel Attard
85be1a05e1 build: add chrome-release-verify and chrome-release-cls skills (#51138)
* build: add chrome-release-verify and chrome-release-cls skills

Adds two project skills under .claude/skills/ for security backports:

* chrome-release-cls — given a Chrome Releases blog post URL, extract
  every CVE/bug and locate the underlying Gerrit CL by searching the
  local Chromium checkout and sub-repos.
* chrome-release-verify — end-to-end backport flow for a release
  branch: maps CVEs→CLs, verifies which fixes are already in the synced
  source tree, writes the cherry-pick patches locally, validates with
  `e sync --3` + `lint --patches` (with the export→lint→re-apply loop),
  then opens a single PR with the linked-CL/crbug/CVE body format.

* ci: skip platform builds for .claude/** changes
2026-04-17 20:28:36 -07:00
Charles Kerr
2f749e24ed fix: intermittent CI failure is-not-alwaysOnTop (#51110)
* fix: intermittent CI failure is-not-alwaysOnTop

Ensure that the `always-on-top-changed` event always fires with the
right 'alwaysOnTop' boolean, regardless of interaction between
SetZOrderLevel() and MoveBehindTaskBarIfNeeded(). We know what the
value will be when all of the HWND events settle, so use that value.

* test: temporary commit to torture-test the new change with 1000 iterations

* test: keep eventually-becomes-consistent test but do not loop 1000 times
2026-04-17 19:03:16 -05:00
David Sanders
15ed78d807 build: don't use //third_party/depot_tools in lint.js (#51034)
* build: don't use //third_party/depot_tools in lint.js

* chore: also run python3 through depot tools
2026-04-17 12:20:27 +02:00
Keeley Hammond
2fbd11d978 feat: add Notification.getHistory() for macOS (#50325)
* feat: add `Notification.getHistory()` static method (macOS)

Add `Notification.getHistory()` which returns a `Promise<Notification[]>`
of all delivered notifications still present in Notification Center.

Each returned Notification is a live object connected to the corresponding
delivered notification — interaction events (click, reply, action, close)
will fire on these objects, enabling apps to re-attach event handlers after
a restart.

Key implementation details:
- Queries UNUserNotificationCenter's getDeliveredNotifications API
- Creates live Notification objects with populated id, groupId, title,
  subtitle, and body properties from what macOS provides
- Registers each object with the presenter via Restore() so the
  NotificationCenterDelegate routes events correctly
- Restored notifications use is_restored_ flag to prevent removal from
  Notification Center when the JS object is garbage collected
- Requires code-signed builds (unsigned builds resolve with empty array)

Co-Authored-By: Claude <svc-devxp-claude@slack-corp.com>

* test: fix typecheck

* fix: avoid dangling presenter pointer in GetHistory callback

* fix: document show() behavior

Notifications returned by getHistory() now set is_restored_ so that Dismiss() skips removal from Notification Center on GC. Calling show() on a restored notification removes the original from NC and posts a new one.

* fix: address code review feedback

* test: fix oxfmt linting

* docs: update docs/api/notification.md

Co-authored-by: Erick Zhao <erick@hotmail.ca>

---------

Co-authored-by: Claude <svc-devxp-claude@slack-corp.com>
Co-authored-by: Erick Zhao <erick@hotmail.ca>
2026-04-16 16:49:10 -07:00
Keeley Hammond
d813618538 fix: remove vestigial MachServices from ShipIt launchd job (#51070) 2026-04-16 15:12:38 -07:00
Keeley Hammond
99e8170f3f fix: fix types in devtools console for release (#51104) 2026-04-16 13:28:18 -07:00
Shelley Vohr
02d90a5ba3 chore: add Node.js skill to settings (#51092) 2026-04-16 15:28:45 -04:00
Niklas Wenzel
3c826c7503 fix: linter issue (#51105) 2026-04-16 12:07:16 -07:00
Charles Kerr
f35122b21e test: add tab source ID tests for media handler (#51068)
* test: add getMediaSourceId tab source coverage

* chore: move captureWithTabSourceId() to a shared helper

* test: improve "webContents module getMediaSourceId()" testing
2026-04-16 13:13:10 -04:00
Samuel Attard
5523130c92 ci: build a patched siso for Windows builds (#51077)
* ci: build a patched siso for Windows builds

The Windows Chromium builds intermittently fail during manifest load
with 'The parameter is incorrect.' (ERROR_INVALID_PARAMETER) out of
bindflt.sys. Root cause is a handle-relative NtCreateFile race in
siso/toolsupport/ninjautil/file_parser.go, which opens each subninja
twice — once in the outer goroutine and once more per chunk for
ReadAt. (*os.File).ReadAt is documented as safe for concurrent use,
so the extra open is redundant and removing it both halves the
CreateFileW calls per subninja and sidesteps the race.

Add a new build-siso-windows job on ubuntu-latest (runs in parallel
with checkout-windows) that:

- reads chromium_version from DEPS and pulls the matching siso_version
  SHA from the Chromium mirror's DEPS at that ref
- shallow-clones chromium.googlesource.com/build at that SHA
- applies the in-tree patches under .github/siso-patches/ via git am
- cross-compiles siso.exe for windows/amd64
- caches the binary keyed on siso SHA + sha256 of the patches, so
  subsequent runs hit the cache and skip the clone/patch/build steps
- uploads the result as a siso-windows-amd64 artifact

The Windows build jobs now depend on build-siso-windows, download the
artifact into $RUNNER_TEMP/siso, and export SISO_PATH, which
depot_tools/siso.py already honors. Mirrored into windows-publish.yml
and the regenerated pipeline-segment-electron-publish.yml so release
builds pick it up too.

Notes: none

* ci: extract siso build into a reusable workflow segment

Move the build-siso-windows job body into
pipeline-segment-build-siso-windows.yml and call it from both build.yml
and windows-publish.yml via workflow_call. Also pin actions/cache to
v5.0.5 and add version comments next to the action SHAs introduced by
this change.
2026-04-16 12:40:33 -04:00
Samuel Attard
abffba4548 fix: use CreateDataProperty when copying objects across contextBridge (#50900) 2026-04-16 12:39:53 +02:00
Shelley Vohr
d164b7af01 fix: prevent uaf when destroying guest WebContents during event emission (#50833)
fix: prevent use-after-free when destroying guest WebContents during event emission

Multiple event emission sites in WebContents destroy the underlying C++
object via a JavaScript event handler calling webContents.destroy(), then
continue to dereference the freed `this` pointer. This is exploitable
through <webview> guest WebContents because Destroy() calls `delete this`
synchronously for guests, unlike non-guests which safely defer deletion.

The fix has two layers:

1. A new `is_emitting_event_` flag is checked in Destroy() — when true,
   guest deletion is deferred to a posted task instead of executing
   synchronously. This is separate from `is_safe_to_delete_` (which
   gates LoadURL re-entrancy) to avoid rejecting legitimate loadURL
   calls from event handlers.

2. AutoReset<bool> guards on `is_emitting_event_` are added to
   CloseContents, RenderViewDeleted, DidFinishNavigation, and
   SetContentsBounds, preventing synchronous destruction while their
   Emit() calls are on the stack.

Destroy() now requires both `is_safe_to_delete_` (navigation re-entrancy)
and `!is_emitting_event_` (event emission) to allow synchronous guest
deletion. The existing AutoReset guards on `is_safe_to_delete_` in
DidStartNavigation, DidRedirectNavigation, and ReadyToCommitNavigation
are also now effective for guests.
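The guard pattern can be sketched as a TypeScript analogue of the C++ `AutoReset<bool>` + deferred-delete scheme (class and member names are illustrative, not the actual implementation):

```typescript
// Sketch: while an event is being emitted, destroy() defers teardown to a
// queued task instead of running synchronously, so the emitter never
// frees itself while Emit() is still on the stack.
class GuestContents {
  private emitting = false;
  destroyed = false;
  deferredDestroys = 0;

  emit(handler: (self: GuestContents) => void): void {
    this.emitting = true; // AutoReset<bool> equivalent
    try {
      handler(this);
    } finally {
      this.emitting = false;
    }
  }

  destroy(): void {
    if (this.emitting) {
      // Unsafe to delete synchronously during emission: defer instead.
      this.deferredDestroys++;
      queueMicrotask(() => this.reallyDestroy());
      return;
    }
    this.reallyDestroy();
  }

  private reallyDestroy(): void {
    this.destroyed = true;
  }
}
```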
2026-04-16 12:26:55 +02:00
Fedor Indutny
2434c5a73c fix: show 'Electron Isolated Context' in Dev Tools (#51062)
Because of a bug after the [upstream refactor][0] Dev Tools stopped
showing 'Electron Isolated Context' in the execution context selector.
'Electron Isolated Context' runs with origin set to `file://`. Since
domain name is empty for the origin the respective UI item in the
context selector is created with an empty `subtitle`. However, with the
upstream change items with either of `title` or `subtitle` are omitted
from rendering.

Here we float an [in-review patch][1] until it is fixed upstream.

[0]: dbb61cf4b2
[1]: https://chromium-review.googlesource.com/c/devtools/devtools-frontend/+/7761316
2026-04-16 11:48:09 +02:00
Charles Kerr
cc738f2a6c refactor: avoid unnecessary extends EventEmitter in lib (#51058)
* refactor: MessageChannel does not need to extend EventEmitter

Added in f66d4c7 but never used.

Apparently added as a copy-paste side effect when adding better interface
info into the lib types, eg extends/implements.

* refactor: ShareMenu does not need to extend EventEmitter

Added in f66d4c7 but never used.

Apparently added as a copy-paste side effect when adding better interface
info into the lib types, eg extends/implements.
2026-04-15 23:24:15 -05:00
Charles Kerr
9569c48bfe refactor: SafeStorage never emits, so do not inherit from EventEmitter (#51057) 2026-04-15 23:24:00 -05:00
Charles Kerr
0a80d4d879 fix: UAF in api::UtilityProcessWrapper (#51069)
* fix: UAF in api::UtilityProcessWrapper

Detach the wrapper from ServiceProcessHost during termination instead
of waiting for destruction. Add a regression test that forces GC.

This fixes a UAF error reported by ASAN: the wrapper lost its last JS
reference and became collectible after emitting exit *but* before it
had been removed from the global observer list.

UtilityProcessWrapper is now cppgc-managed as of b9e462f397, but its
ServiceProcessHost observer cleanup still depended on destructor-time
teardown.

* fixup! fix: UAF in api::UtilityProcessWrapper

fix: much better cleanup from Deepak code review
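The lifecycle hazard is generic enough to model in a few lines of JavaScript (class and method names below are illustrative, not Electron's): detach from the host's observer list at termination time, rather than in a destructor whose timing is controlled by GC.

```javascript
// Toy model of the fix: the wrapper removes itself from the host's
// observer list when the process exits, instead of waiting for a
// GC-timed destructor, so a collectible wrapper is never left
// registered in a global observer list.
class ServiceHost {
  constructor () { this.observers = new Set(); }
  notifyExit () { for (const o of this.observers) o.onProcessExit(); }
}

class ProcessWrapper {
  constructor (host) {
    this.host = host;
    host.observers.add(this);
  }

  onProcessExit () {
    // Detach eagerly: after this point the wrapper may be collected
    // without the host ever touching a dangling reference.
    this.host.observers.delete(this);
  }
}

const host = new ServiceHost();
new ProcessWrapper(host);
host.notifyExit();
console.log(host.observers.size); // 0
```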
2026-04-15 16:56:57 -07:00
Shelley Vohr
0227bcfb9f fix: allow PDF viewer to show save file picker (#51042)
The PDF viewer's "save with changes" feature uses
`window.showSaveFilePicker()`, but the PDF extension runs in a
cross-origin iframe (chrome-extension:// inside the app's origin).
Chromium's File System Access API blocks cross-origin subframes from
showing file pickers unless the embedder explicitly allows them via
`ContentClient::IsFilePickerAllowedForCrossOriginSubframe()`.

Chrome overrides this in `ChromeContentClient` to allowlist the PDF
extension origin, but Electron never did — so the picker was always
blocked with a SecurityError.

This adds the same override to `ElectronContentClient`, allowing the
built-in PDF extension origin to bypass the cross-origin check.
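The shape of the added override is a simple origin allowlist; a JavaScript model (the extension ID shown is the well-known built-in PDF viewer ID, included for illustration rather than taken from this patch):

```javascript
// Model of the new ElectronContentClient check: a cross-origin subframe
// may show a file picker only when it is the built-in PDF extension.
const PDF_EXTENSION_ORIGIN = 'chrome-extension://mhjfbmdgcfjbbpaeojofohoefgiehjai';

function isFilePickerAllowedForCrossOriginSubframe (frameOrigin) {
  return frameOrigin === PDF_EXTENSION_ORIGIN;
}

console.log(isFilePickerAllowedForCrossOriginSubframe(PDF_EXTENSION_ORIGIN)); // true
console.log(isFilePickerAllowedForCrossOriginSubframe('https://example.com')); // false
```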
2026-04-15 18:04:24 -05:00
Anirudh Sevugan
0ad0d44b3d docs: add ELECTRON_INSTALL env vars and remove ELECTRON_SKIP_BINARY_D… (#50979)
docs: add ELECTRON_INSTALL env vars and remove ELECTRON_SKIP_BINARY_DOWNLOAD

These were added in a recent commit but were never documented.

Also, ELECTRON_SKIP_BINARY_DOWNLOAD was deprecated in Electron v42, so its entry is removed as well.
2026-04-15 11:35:35 -05:00
Samuel Attard
04b9b7bc22 build: fail gha-done check when required job fails (#50959)
fix: fail gha-done when any required job failed

Previously, the `gha-done` gate job used an `if:` expression that
evaluated to false whenever any needed job reported a failure, which
caused the job to be *skipped* rather than *failed*. GitHub branch
protection treats skipped required checks as non-blocking, so a PR
could be marked mergeable even though one of its test jobs had failed.

Keep the job always running and move the failure check into a step
that explicitly exits 1 when any dependency failed or was cancelled,
so the "GitHub Actions Completed" required check actually blocks the
merge in that case.

Notes: none
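The fix boils down to a predicate over the `needs` context exposed to the always-running gate job; a rough model in JavaScript (the job names and object shape are illustrative, not taken from the workflow):

```javascript
// Decide whether the "GitHub Actions Completed" gate step should exit 1:
// fail when any needed job failed or was cancelled, but never by being
// skipped (which branch protection would treat as non-blocking).
function gateShouldFail (needs) {
  return Object.values(needs).some(
    (job) => job.result === 'failure' || job.result === 'cancelled'
  );
}

// A skipped optional job does not fail the gate; a failed one does.
console.log(gateShouldFail({ build: { result: 'success' }, test: { result: 'skipped' } })); // false
console.log(gateShouldFail({ build: { result: 'success' }, test: { result: 'failure' } })); // true
```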
2026-04-15 16:12:13 +02:00
Samuel Attard
22f15ec476 build: authenticate sudowoodo /token exchange via Actions OIDC (#51051) 2026-04-14 15:54:45 -07:00
Charles Kerr
53bf94fdf4 refactor: move electron::api::GlobalShortcut to cppgc (#50192)
* refactor: migrate electron::api::GlobalShortcut to cppgc

* refactor: lazy-create electron::api::GlobalShortcut

copy the lazy-create idiom used by electron::api::Screen

* refactor: use gin::WeakCellFactory in GlobalCallbacks

* fix: make a copy of `callback` before running it

safeguard against the callback changing the map, invalidating `cb`

* chore: reduce unnecessary diffs with main

* fixup! refactor: use gin::WeakCellFactory in GlobalCallbacks

fix: must Trace() the weak cell factory

* fix: destruction order

- Setup isolate dispose observer to run destruction sequences
and remove self persistent reference
- Skip NOTREACHED check during destruction; it can happen
as a result of platform listeners scheduling callbacks when Unregister is invoked.
- Fix the order of unregistration in GlobalShortcut::Unregister
- Add GlobalShortcut::UnregisterAllInternal to avoid any callsites
that can re-enter V8

* fix: crash during gc from incorrect cppgc object headers

* chore: update patches

* chore: cleanup

* chore: fix lint

---------

Co-authored-by: deepak1556 <hop2deep@gmail.com>
2026-04-14 15:52:13 -04:00
Michaela Laurencin
21c5e25f04 ci: allow manual backports to pass template check (#50916)
* ci: allow manual backports to pass template check

Co-Authored-By: Claude <noreply@anthropic.com>
Generated-By: GitHub Copilot

* Update .github/workflows/pr-template-check.yml

Co-authored-by: Niklas Wenzel <dev@nikwen.de>

---------

Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: Niklas Wenzel <dev@nikwen.de>
2026-04-14 12:15:35 -07:00
Erick Zhao
cf2cc4d80f docs: mention pre-release installation (#51020)
* docs: pre-release installation

* Update installation.md

Co-authored-by: Niklas Wenzel <dev@nikwen.de>

---------

Co-authored-by: Niklas Wenzel <dev@nikwen.de>
2026-04-14 09:23:46 -07:00
Charles Kerr
8ea3d16ce9 chore: remove unused parts of chore_provide_iswebcontentscreationoverridden_with_full_params.patch (#51025)
chore: remove dead patches from chore_provide_iswebcontentscreationoverridden_with_full_params.patch
2026-04-14 09:23:43 -07:00
Charles Kerr
c30655785b refactor: remove FramelessView::Init() (#51008)
* refactor: remove FramelessView::Init()

move to constructor instead

* refactor: make FramelessView::window_ const

refactor: make FramelessView::frame_ const

* refactor: simplify NativeWindowViews::CreateFrameView() logic branches
2026-04-14 16:06:07 +02:00
David Sanders
d223ad67ea build: allow non-zero exit codes locally when linting Chromium rolls (#51031) 2026-04-14 09:04:09 -05:00
Charles Kerr
dad4ab658a fix: timing issue DCHECK crash in DesktopCapturer on macOS (#50960)
refactor: use StartUpdating in desktopCapturer

Replace the one-shot Update() callback model with the continuous
StartUpdating() observer model for NativeDesktopMediaList.

Fixes a macOS DCHECK(can_refresh()) crash in UpdateSourceThumbnail(),
where ScreenCaptureKit's recurrent thumbnail capturer would post
UpdateSourceThumbnail callbacks after the one-shot refresh_callback_
had been consumed. Now, can_refresh() is always true because
refresh_callback_ is repopulated via ScheduleNextRefresh().

Each capturer (window, screen) gets its own ListObserver that tracks
readiness via OnSourceAdded and OnSourceThumbnailChanged events.
Once a list has both sources and thumbnails (or thumbnails aren't
requested), its data is snapshotted and the capturer checks if all
requested types are ready before resolving to JS.

Also remove the "skip_next_refresh_" Chromium patch, which was a
workaround for the timing mismatch between the one-shot Update()
model and ScreenCaptureKit's asynchronous thumbnail delivery.

refactor: simplify state logic in DesktopCapturer
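The per-capturer readiness bookkeeping described above reduces to a small predicate; a sketch in JavaScript (field names are illustrative, not Electron's):

```javascript
// A capture list is "ready" once it has sources and, when thumbnails
// were requested, every source has delivered a thumbnail.
function listReady (list, thumbnailsRequested) {
  if (list.sources.length === 0) return false;
  if (!thumbnailsRequested) return true;
  return list.sources.every((source) => source.hasThumbnail);
}

// The capturer resolves to JS only once every requested list is ready.
function allListsReady (lists, thumbnailsRequested) {
  return lists.every((list) => listReady(list, thumbnailsRequested));
}

const lists = [
  { sources: [{ hasThumbnail: true }] },  // screen list
  { sources: [{ hasThumbnail: false }] }  // window list, still waiting
];
console.log(allListsReady(lists, true));  // false
console.log(allListsReady(lists, false)); // true
```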
2026-04-14 09:03:18 -05:00
dependabot[bot]
9b85b9c0bc build(deps): bump actions/upload-artifact from 7.0.0 to 7.0.1 (#51032)
Bumps [actions/upload-artifact](https://github.com/actions/upload-artifact) from 7.0.0 to 7.0.1.
- [Release notes](https://github.com/actions/upload-artifact/releases)
- [Commits](bbbca2ddaa...043fb46d1a)

---
updated-dependencies:
- dependency-name: actions/upload-artifact
  dependency-version: 7.0.1
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-04-14 12:44:07 +02:00
John Kleinschmidt
abd29a397e ci: don't login to RBE for clang-tidy and gn-check (#51022)
* ci: don't login to RBE for clang-tidy

* ci: don't login to RBE for gn check
2026-04-14 12:33:34 +02:00
John Kleinschmidt
604e7e82f2 test: fixup autoupdater tests failures (#51024) 2026-04-14 12:33:28 +02:00
Shelley Vohr
2e3da1d079 fix: crash when closing devtools after focus (#47435) 2026-04-14 12:32:19 +02:00
dependabot[bot]
edbff16029 build(deps): bump actions/github-script from 8.0.0 to 9.0.0 (#51033)
Bumps [actions/github-script](https://github.com/actions/github-script) from 8.0.0 to 9.0.0.
- [Release notes](https://github.com/actions/github-script/releases)
- [Commits](ed597411d8...3a2844b7e9)

---
updated-dependencies:
- dependency-name: actions/github-script
  dependency-version: 9.0.0
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-04-14 11:22:02 +02:00
John Kleinschmidt
b5fbbed4db ci: only move items to Needs Review when appropriate (#51023) 2026-04-14 11:02:09 +02:00
Niklas Wenzel
663773a677 fix: linter issue (#51009) 2026-04-13 08:14:34 -07:00
Niklas Wenzel
351f35da8c fix: include missing metadata in trace files (#50892) 2026-04-13 09:36:06 -04:00
Robo
b9e462f397 refactor: api::utilityProcessWrapper managed by cppgc (#50955)
* refactor: api::utilityProcessWrapper managed by cppgc

* chore: fix lint
2026-04-13 10:46:51 +02:00
Shelley Vohr
a39108c5a4 fix: nodeIntegrationInWorker not working in AudioWorklet (#47244)
* fix: nodeIntegrationInWorker not working in AudioWorklet

* fix: deadlock on Windows when destroying non-AudioWorklet worker contexts

The previous change kept the WebWorkerObserver alive across
ContextWillDestroy so the worker thread could be reused for the next
context (AudioWorklet thread pooling, Chromium CL:5270028). This is
correct for AudioWorklet but wrong for PaintWorklet and other worker
types, which Blink does not pool — each teardown destroys the thread.

For those worker types, ~NodeBindings was deferred to the thread-exit
TLS callback. By that point set_uv_env(nullptr) had already run, so on
Windows the embed thread was parked in GetQueuedCompletionStatus with a
stale async_sent latch that swallowed the eventual WakeupEmbedThread()
from ~NodeBindings. uv_thread_join then blocked forever, deadlocking
renderer navigation. The worker-multiple-destroy crash case timed out
on win-x64/x86/arm64 as a result. macOS/Linux (epoll/kqueue) don't have
the latch and were unaffected.

Plumb is_audio_worklet from WillDestroyWorkerContextOnWorkerThread into
ContextWillDestroy. For non-AudioWorklet contexts, restore the
pre-existing behavior of calling lazy_tls->Set(nullptr) at the end of
the last-context cleanup so ~NodeBindings runs while the worker thread
is still healthy. AudioWorklet continues to keep the observer alive so
the next pooled context can share NodeBindings.

* chore: address review feedback

* fix: stop embed thread before destroying environments in worker teardown

FreeEnvironment (called via environments_.clear()) runs uv_run to drain
handle close callbacks. On Windows, both that uv_run and the embed
thread's PollEvents call GetQueuedCompletionStatus on the same IOCP
handle. IOCP completions are consumed by exactly one waiter, so the
embed thread can steal completions that FreeEnvironment needs, causing
uv_run to block indefinitely. On Linux/Mac epoll_wait/kevent can wake
multiple waiters for the same event so the race doesn't manifest.

Add NodeBindings::StopPolling() which cleanly joins the embed thread
without destroying handles or the loop, and allows PrepareEmbedThread +
StartPolling to restart it later. Call StopPolling() in
WebWorkerObserver::ContextWillDestroy before environments_.clear() so
FreeEnvironment's uv_run is the only thread touching the IOCP.

Split PrepareEmbedThread's handle initialization (uv_async_init,
uv_sem_init) from thread creation via a new embed_thread_prepared_ flag
so the handles survive across stop/restart cycles for pooled worklets
while the embed thread itself can be recreated.

* chore: address outstanding feedback
2026-04-13 10:27:30 +02:00
Kunal Dubey
04d9de6f73 fix: avoid window drag during corner resize in MAS build (#50637)
* fix: avoid window drag during corner resize in MAS build

* chore: update chromium patch offsets
2026-04-13 09:54:27 +02:00
Charles Kerr
b8dbe21b38 chore: do not patch fake_desktop_media_list.cc (#50953)
chore: do not patch files we do not use

do not patch fake_desktop_media_list.cc, .h
2026-04-13 09:27:32 +02:00
Samuel Attard
a57dbb55cc ci: split macos-x64 tests into 3 shards (#50968) 2026-04-13 09:26:19 +02:00
David Sanders
860a544534 ci: capture fatal errors in clang problem matcher (#50984) 2026-04-13 09:25:43 +02:00
David Sanders
e31cd64fe5 ci: ignore canceled jobs in audit (#50981)
* ci: ignore canceled jobs in audit

* chore: add another variation
2026-04-13 09:25:26 +02:00
Samuel Attard
10eb512d1d test: run oxfmt on simpleFullScreen test (#50990)
chore: fix oxfmt formatting in api-browser-window-spec

Co-authored-by: Claude <noreply@anthropic.com>
2026-04-12 19:58:02 -07:00
Calvin
052efc9727 chore: add AI tool policy to CONTRIBUTING.md & update PR template (#50451)
* chore: update PR template and add AI tool policy to CONTRIBUTING.md

* sentencesmithing
2026-04-12 19:10:42 -05:00
Shelley Vohr
a007bafaf1 fix: simpleFullScreen exits when web content calls requestFullscreen (#50874)
fix: simpleFullScreen exits when web content calls requestFullscreen

SetHtmlApiFullscreen only checked IsFullscreen() to detect that the
window was already fullscreen, missing the simple-fullscreen case on
macOS. When web content triggered requestFullscreen the code fell
through to SetFullScreen(true) which toggled simple fullscreen off.

Include IsSimpleFullScreen() in the guard so the HTML-API fullscreen
state is updated without touching the window's fullscreen mode.
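The corrected guard can be sketched in JavaScript (property and method names are illustrative): treat either fullscreen mode as "already fullscreen" so only the HTML-API state is updated.

```javascript
// Sketch of the fix: when the window is in regular *or* simple
// fullscreen, update the HTML-API fullscreen state and leave the
// window's own fullscreen mode alone.
function setHtmlApiFullscreen (win, enter) {
  if (win.isFullscreen || win.isSimpleFullscreen) {
    win.htmlApiFullscreen = enter; // update state, do not toggle the mode
    return;
  }
  win.setFullScreen(enter); // normal path
}

const win = {
  isFullscreen: false,
  isSimpleFullscreen: true,
  htmlApiFullscreen: false,
  setFullScreen (flag) { this.isFullscreen = flag; }
};
setHtmlApiFullscreen(win, true);
console.log(win.isSimpleFullscreen && win.htmlApiFullscreen); // true
```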
2026-04-12 19:06:07 -05:00
Robo
ea757689b3 refactor: rm chore_add_electron_objects_to_wrappablepointertag.patch (#50957) 2026-04-12 19:02:40 -05:00
Samuel Attard
2c94aac330 build: add oxfmt for JS/TS formatting and import sorting (#50692)
* build: add oxfmt for code formatting and import sorting

Adds oxfmt as a devDependency alongside oxlint and wires it into the
lint pipeline. The .oxfmtrc.json config matches Electron's current JS
style (single quotes, semicolons, 2-space indent, trailing commas off,
printWidth 100) and configures sortImports with custom groups that
mirror the import/order pathGroups previously enforced by ESLint:
@electron/internal, @electron/*, and {electron,electron/**} each get
their own ordered group ahead of external modules.

- `yarn lint:fmt` runs `oxfmt --check` over JS/TS sources and is
  chained into `yarn lint` so CI enforces it automatically.
- `yarn format` runs `oxfmt --write` for local fix-up.
- lint-staged invokes `oxfmt --write` on staged .js/.ts/.mjs/.cjs
  files before oxlint, so formatting is applied at commit time.

The next commit applies the formatter to the existing codebase so the
check actually passes.

* chore: apply oxfmt formatting to JS and TS sources

Runs `yarn format` across lib/, spec/, script/, build/, default_app/,
and npm/ to bring the codebase in line with the .oxfmtrc.json settings
added in the previous commit. This is a pure formatting pass: import
statements are sorted into the groups defined by the config, method
chains longer than printWidth are broken, single-quoted strings
containing apostrophes are switched to double quotes, and a handful of
single-statement `if` bodies are re-wrapped and get braces added by
`oxlint --fix` to satisfy the `curly: multi-line` rule.

No behavior changes.
2026-04-12 02:03:04 -07:00
Shelley Vohr
dcb844c201 chore: add Claude Code skill for Node.js upgrades (#50910)
Adds a new skill mirroring the Chromium upgrade skill, adapted for
Node.js rolls. Covers patch conflict resolution, build fix workflow,
commit guidelines, and documents high-churn patches and major version
upgrade patterns (V8 bridge patch deletions, BoringSSL complexity).
2026-04-11 20:11:49 -07:00
Charles Kerr
9bc55a255c chore: remove some unnecessary diffs in refactor_expose_file_system_access_blocklist.patch (#50909)
chore: remove some unnecessary diffs in refactor_expose_file_system_access_blocklist.patch
2026-04-11 16:48:47 -05:00
Samuel Attard
12b74eac26 fix: respect iframe sandbox flags for external protocol navigation (#50901) 2026-04-11 16:16:23 -04:00
Charles Kerr
6e7938af1d refactor: migrate api::Extensions to cppgc (#50932)
* refactor: migrate api::Extensions to cppgc

* chore: update patch indices
2026-04-12 01:59:54 +09:00
Charles Kerr
5fded05add fix: dangling raw_ptr api::Protocol::protocol_registry_ (#50829) 2026-04-11 08:53:53 -05:00
Charles Kerr
1879998865 refactor: migrate api::ServiceWorkerContext to cppgc (#50931)
refactor: migrate api::ServiceWorkerContext to cppgc
2026-04-11 18:56:31 +09:00
Samuel Attard
b1b02d9123 fix: restrict window.open features to allowlisted BrowserWindow options (#50902) 2026-04-11 03:43:55 -04:00
Samuel Attard
3f140e1b4b fix: clamp autofill popup bounds to the requesting frame viewport (#50903) 2026-04-11 03:43:43 -04:00
Samuel Attard
3ff923990d fix: validate OSR frame geometry against shared-memory mapping size (#50904) 2026-04-11 03:43:24 -04:00
Samuel Attard
b4e14a9004 fix: use ShowItemInFolder for devtools showItemInFolder embedder message (#50905) 2026-04-11 03:43:14 -04:00
Samuel Attard
61bb03ca75 fix: use audit token instead of PID for parent code-signature check (#50907) 2026-04-11 03:42:52 -04:00
Samuel Attard
1a2029c3a2 fix: apply IsSafeRedirectTarget to net module redirects (#50869) 2026-04-10 19:19:51 -07:00
Samuel Attard
bfa5c93332 refactor: attach translator holder via v8::Function data slot (#50867) 2026-04-10 19:19:34 -07:00
Samuel Attard
f36def6601 fix: scope extension tab-ID resolution to the calling BrowserContext (#50906) 2026-04-10 19:16:21 -07:00
David Sanders
97347c4223 chore: clean up clang-tidy warnings (#50862)
* chore: use emplace and use it correctly

* chore: redundant cast to the same type [google-readability-casting]

* chore: do not create objects with +new [google-objc-avoid-nsobject-new]

* chore: default arguments on virtual or override methods are prohibited [google-default-arguments]

* chore: warning: C-style casts are discouraged; use static_cast [google-readability-casting]

CFLocaleGetValue already returns CFTypeRef so that redundant static_cast was removed

* chore: refactor block to avoid use after move warning from clang-tidy

Clang-tidy apparently couldn't tell these were two mutually exclusive
branches, so there was no actual issue, but the refactor is cleaner
anyway since it makes the code more DRY.

* chore: C-style casts are discouraged; use static_cast [google-readability-casting]

No cast needed here, everything is already the correct type

* chore: C-style casts are discouraged; use static_cast/const_cast/reinterpret_cast [google-readability-casting]

* chore: use '= default' to define a trivial destructor [modernize-use-equals-default]

* chore: use range-based for loop instead [modernize-loop-convert]

* chore: redundant void argument list [modernize-redundant-void-arg]

* chore: address code review feedback

* chore: use auto

Co-authored-by: Charles Kerr <charles@charleskerr.com>

---------

Co-authored-by: Charles Kerr <charles@charleskerr.com>
2026-04-10 16:37:57 -07:00
Alexey
8d8847d478 feat: capture JS stack trace on renderer OOM (#50043)
* feat: capture JS stack trace on renderer OOM

When a renderer process approaches its V8 heap limit, capture the
JavaScript stack trace and write it to both a Crashpad crash key
("js-oom-stack") and stderr.

The stack trace is captured via RequestInterrupt rather than directly
inside the NearHeapLimitCallback because CurrentStackTrace is unsafe
to call during OOM — V8 FATALs on optimized (TurboFan) frames that
have had their deoptimization data garbage-collected. RequestInterrupt
defers the capture to the next V8 safe point, where all frames are
guaranteed to have deopt data available. This matches Node.js's
approach of never capturing JS stacks inside the heap limit callback.

The callback is registered once per isolate via an atomic guard in
RendererClientBase::DidCreateScriptContext, preventing the CHECK
failure V8 raises on duplicate AddNearHeapLimitCallback registrations
(which would otherwise occur on page navigations or multiple contexts).

Refs: #46078
Made-with: Cursor

* Update shell/renderer/oom_stack_trace.cc

Co-authored-by: Niklas Wenzel <dev@nikwen.de>

* Update shell/renderer/oom_stack_trace.cc

Co-authored-by: Niklas Wenzel <dev@nikwen.de>

* test: add crash reporter test for OOM JS stack trace

Add a test that verifies the `electron.v8-oom.stack` crash key contains
the JS stack trace (including function names) when a renderer process
runs out of memory. Also deduplicate the heap info formatting in
oom_stack_trace.cc.

Refs: #46078
Made-with: Cursor

* fix: lint formatting in oom_stack_trace.cc

Made-with: Cursor

* fix: use proper logger API instead of cstdio

* fix: check heap headroom before capturing OOM stack trace

deepak1556: "Should there be check for available heap size [for]
CurrentStackTrace and formatting"

CurrentStackTrace allocates StackTraceInfo + StackFrameInfo on the V8
heap. If the 20 MB bump is partially consumed by the time the interrupt
fires, these allocations trigger a secondary OOM. Guard with a 2 MB
headroom check.

Made-with: Cursor

* fix: handle V8 cage limit when bumping heap for OOM stack capture

deepak1556: "Does this bumping work when we are at the cage limit of
4GB"

V8's pointer compression cage caps the heap at ~4 GB. When
current_heap_limit is already near the ceiling, our 20 MB bump gets
clamped to zero and the interrupt never fires. Detect this and record
heap info as the final crash key instead of waiting for a stack trace
that won't arrive.

Made-with: Cursor

* feat: add V8 heap statistics as OOM crash keys

deepak1556: "V8 seems to capture heap stats as crash keys but it gets
missed today due to the OOM callback override... wonder if we can
include that to get some more heuristics in the dump."

Record heap used/total/limit/available, per-space stats for old_space
and large_object_space, native/detached context counts, and utilization
percentage as crash keys. Also add heap stats in the V8OOMErrorCallback
in node_bindings.cc for the final OOM crash report.

Made-with: Cursor

* feat: support worker thread isolates for OOM stack trace

deepak1556: "You need a separate registration for worker threads via
WorkerScriptReadyForEvaluationOnWorkerThread but that also means the
process global g_registered_isolate would break."

Chromium has one V8 isolate per thread (main + one per web worker), so
thread_local is equivalent to per-isolate storage. Replace the global
atomic + mutex/set with a constinit thread_local OomState* that holds
the isolate pointer and per-isolate is_in_oom flag. The void* data
parameter on AddNearHeapLimitCallback delivers OomState* directly into
callbacks, so the hot path needs no TLS lookup.

Add WorkerScriptReadyForEvaluationOnWorkerThread and
WillDestroyWorkerContextOnWorkerThread overrides to RendererClientBase
so both ElectronRendererClient and ElectronSandboxedRendererClient get
worker OOM registration. Update ElectronRendererClient to call the base
class in both worker lifecycle methods.

Add a web worker OOM test that spawns a dedicated Worker with a memory
leak and verifies the stack trace captures the worker function name.

Made-with: Cursor

* fix: register OOM callback for all script contexts

When context isolation is enabled, ShouldNotifyClient skips
DidCreateScriptContext for the main world, but user JS still runs there
and can OOM. Register in DidInstallConditionalFeatures which fires for
every script context. The TLS dedup guard prevents double-registration
on the same isolate.

Made-with: Cursor

* fix: guard against division by zero and cage size changes in OOM handler

Add a zero-guard on heap_size_limit before computing utilization
percentage — maximizes robustness in an OOM code path.

Add static_assert on kPtrComprCageReservationSize to catch any
upstream V8 change to the cage size at compile time.

Made-with: Cursor

* fix: address review feedback on OOM stack trace PR

- Remove redundant RegisterOomStackTraceCallback from
  electron_render_frame_observer.cc; DidCreateScriptContext is sufficient
  since main world and isolated world share the same isolate
- Replace thread_local OomState* with base::ThreadLocalOwnedPointer
  wrapped in base::NoDestructor per Chromium style for non-trivially
  destructible types
- Change heap-headroom and cage-limit logs from ERROR to INFO since
  users cannot act on these diagnostics
- Add comment explaining why base class is called last in
  WillDestroyWorkerContextOnWorkerThread (OOM deregistration ordering)

Made-with: Cursor

* fix: skip OOM stack trace registration for worklet contexts

Worklets can share a thread and isolate via WorkletThreadHolder's
per-process singleton pattern. With per-thread OOM state, the first
worklet to be destroyed would prematurely remove the callback for
any remaining worklets on the same thread. Skip worklets entirely
to avoid this; can be revisited with ref-counting if needed.

Made-with: Cursor

* fix: prevent dangling raw_ptr<v8::Isolate> in OOM state

The OomState held a raw_ptr<v8::Isolate> that outlived the isolate on
the main thread: gin::IsolateHolder destroyed the isolate during
shutdown, but the OomState (stored in thread-local storage) was only
released later in JavascriptEnvironment::~JavascriptEnvironment. This
triggers a dangling pointer check when building with
enable_dangling_raw_ptr_checks.

Register OomState as a gin::PerIsolateData::DisposeObserver so it
clears the raw_ptr and removes the NearHeapLimitCallback before the
isolate is destroyed, regardless of destructor ordering.

Suggested-by: Deepak Mohan
Made-with: Cursor

* test: verify OOM crash keys end-to-end via crash reporter

Replace stderr-based OOM tests with end-to-end crash dump validation.
Instead of parsing log output, start a crash reporter server, trigger
renderer OOM, and verify the uploaded crash dump contains the expected
`electron.v8-oom.*` annotations — the same code path production crash
reports take.

Consolidate all OOM test scenarios (basic heap leak, JSON.stringify,
web worker) into a single `describe('OOM crash keys')` block inside
api-crash-reporter-spec using the existing crash fixture app with new
renderer-oom-json and renderer-oom-worker crash types.

The web worker test verifies that OOM crash keys are present but does
not assert on the JS function name: the 20 MB heap bump may be
exhausted before V8 reaches a safe point to fire the stack-capture
interrupt, leaving the crash key at "(stack pending)". Increasing the
bump or switching to a synchronous capture strategy would fix this but
is left for a follow-up.

Remove the standalone oom-stack-trace-spec.ts and its fixture app.

Made-with: Cursor

---------

Co-authored-by: Niklas Wenzel <dev@nikwen.de>
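Two of the guards discussed in this thread (the 20 MB bump and the 2 MB headroom check) can be sketched in JavaScript; the constants come from the commit messages above, while the function names are made up for illustration:

```javascript
const MB = 1024 * 1024;

// Bump the heap limit by 20 MB inside the near-heap-limit callback; a
// hard ceiling (the ~4 GB pointer-compression cage) can clamp the bump
// away entirely, in which case the stack-capture interrupt never fires.
function bumpHeapLimit (currentLimit, ceiling) {
  return Math.min(currentLimit + 20 * MB, ceiling);
}

// At interrupt time, only capture the stack when at least 2 MB of the
// bump is still free, since the capture itself allocates on the heap.
function canCaptureStack (heapUsed, heapLimit) {
  return heapLimit - heapUsed >= 2 * MB;
}

const ceiling = 4096 * MB;
console.log(bumpHeapLimit(ceiling, ceiling) === ceiling); // true: no room to bump
console.log(canCaptureStack(100 * MB, 121 * MB));         // true
console.log(canCaptureStack(120 * MB, 121 * MB));         // false
```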
2026-04-10 15:13:16 -04:00
Niklas Wenzel
b9825ba835 fix: preference initialization with app.setPath('sessionData') (#50891)
fix: preference initialization with app.setPath('sessionData')
2026-04-11 03:22:09 +09:00
Robo
9be03cbe54 fix: enable blink gc plugin (#50465)
chore: address blink gc plugin errors

Key fixes:
- Replace `base::WeakPtrFactory` with `gin::WeakCellFactory` in
  MenuMac, MenuViews, and NetLog, since weak pointers to cppgc-managed
  objects must go through weak cells
- Replace `v8::Global<v8::Value>` with `cppgc::Persistent<Menu>` for
  the menu reference in BaseWindow
- Stop using `gin_helper::Handle<T>` with cppgc types; use raw `T*`
  and add a `static_assert` to prevent future misuse
- Add proper `Trace()` overrides for Menu, MenuMac, MenuViews, and
  NetLog to ensure cppgc members are visited during garbage collection
- Replace `SelfKeepAlive` prevent-GC mechanism in Menu with a
  `cppgc::Persistent` prevent-GC captured in `BindSelfToClosure`
- Introduce `GC_PLUGIN_IGNORE` macro to suppress
  known-safe violations: mojo::Remote fields, ObjC bridging pointers,
  and intentional persistent self-references
- Mark `ArgumentHolder` as `CPPGC_STACK_ALLOCATED()` in both Electron's
  and gin's function_template.h to silence raw-pointer-to-GC-type
  warnings
2026-04-11 03:21:39 +09:00
Michaela Laurencin
861ef95598 fix: remove decorateURL from default_app (#50852)
remove decorateURL from default_app

Co-authored-by: Shelley Vohr <shelley.vohr@gmail.com>
2026-04-10 13:18:26 -05:00
Jan Hannemann
20ed34a2fd feat: add id, groupId, and groupTitle support for Windows notifications (#50328)
* feat: allow to set id and groupId

* feat: use Id's without hash but check length

* feat: adds visual grouping via groupTitle

* test: tests added for id, groupId and groupTitle

* fix: unused vars on Mac and Linux

* fix: remove redundant parameter

* fix: add doc links for id and group

* fix: throw if groupId is missing

* fix: test
2026-04-10 11:01:57 -07:00
David Sanders
82aa603698 build: don't use //third_party/depot_tools in gn build scripts (#50858) 2026-04-10 13:43:25 -04:00
Charles Kerr
b4f725a763 test: guard permission handlers in File System API tests (#50890)
fix: guard permission handlers in File System API tests

Manual port of #50865 for `main` due to code shear.

1. Chromium can fire unrelated permission checks (e.g. 'background-sync')
on the default session. Copy the `permission === 'fileSystem'` safeguard from
"calls twice when trying to query a read/write file handle permissions".

2. Add afterEach cleanup that resets setPermissionCheckHandler(null).
2026-04-10 18:43:04 +02:00
Robo
05f1cb553d fix: shutdown crash when unregistering power notification on windows (#50878)
* fix: crash on shutdown when unregistering power notification

Refs https://learn.microsoft.com/en-us/windows/win32/api/winuser/nf-winuser-unregistersuspendresumenotification
the handle should be the return value of the registration API,
not the window handle.

* chore: remove redundant selfkeepalive

Followup to 7c0cb61b3c

* chore: fix lint

* chore: address review feedback
2026-04-11 00:47:05 +09:00
Robo
c0f187f90d feat: capture Node.js trace categories via Perfetto (#50591)
* feat: capture Node.js trace categories via Perfetto

* fix: crash in ELECTRON_RUN_AS_NODE

* chore: cleanup macro usage

* chore: update patches
2026-04-10 08:12:46 -04:00
Robo
5f820b7f69 test: add cppgc backed menu leak regression test (#50879)
* spec: add menu leak regression test

* spec: reduce menu count to remove CI flakiness
2026-04-10 20:48:28 +09:00
Calvin
8b7e7de208 fix: return numeric blksize and blocks from asar fs.stat (#50825)
fix: return numeric `blksize` and `blocks` from asar `fs.stat`

Previously, `fs.stat` on files inside `.asar` archives returned
`undefined` for `blksize` and `blocks`, violating the Node.js API
contract where these fields must be `number | bigint`.

Use `4096` for `blksize` (matching the convention used by `memfs` and
the proposed `node:vfs` module in nodejs/node#61478) and compute
`blocks` as `ceil(size / 512)` (standard 512-byte block units).

Fixes #42686

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
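Per the convention stated above, the shim's new fields can be computed like this (a direct sketch of the described behavior, not Electron's actual code):

```javascript
// The values the asar fs.stat shim now reports: a conventional
// 4096-byte blksize, and blocks counted in standard 512-byte units.
function asarStatBlockFields (size) {
  return {
    blksize: 4096,
    blocks: Math.ceil(size / 512)
  };
}

console.log(asarStatBlockFields(1000)); // { blksize: 4096, blocks: 2 }
console.log(asarStatBlockFields(0));    // { blksize: 4096, blocks: 0 }
```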
2026-04-10 10:19:09 +02:00
Shelley Vohr
b59f573097 fix: pass root_gen_dir from GN to generate_node_headers.py (#50847)
fix: pass root_gen_dir from GN to generate_node_headers.py

PR #50828 replaced a local get_out_dir() (defaulting to 'Testing') with
the shared one from script/lib/util.py (defaulting to 'Default').
Neither default is correct because the actual output directory depends
on the active build config. Pass $root_gen_dir from the GN action so
the script always uses the correct path.
2026-04-10 09:41:45 +02:00
Charles Kerr
b417696d6b refactor: migrate electron::api::Protocol to cppgc (#50857)
refactor: migrate api::Protocol to cppgc
2026-04-10 15:58:33 +09:00
Mitchell Cohen
4203d7688f fix: external resize hit targets for frameless windows on Windows (#50706) 2026-04-09 18:13:13 -05:00
Zeenat Lawal
62e637275a fix: move Electron help menu links to default app only (#50629)
* fix: remove Electron links from default help menu

* fix: remove help menu entirely from default menu

* fix: move Electron help menu links to default app

* docs: update default menu items list in menu.md
2026-04-09 12:14:22 -07:00
Shelley Vohr
28c0eb29df fix: webContents.print() ignoring mediaSize when silent (#50808)
fix: webContents.print() ignoring mediaSize when silent

PR #49523 moved the default media size fallback into OnGetDeviceNameToUse,
but the new code unconditionally writes kSettingMediaSize — clobbering
any mediaSize the caller had already set in WebContents::Print() from
options.mediaSize / pageSize. As a result, silent prints with an
explicit pageSize (e.g. "Letter") fell back to A4 with tiny content.

Only populate the default/printer media size when the caller hasn't
already supplied one, preserving the precedence:
  1. user-supplied mediaSize / pageSize
  2. printer default (when usePrinterDefaultPageSize is true)
  3. A4 fallback
2026-04-09 12:16:40 -05:00
Charles Kerr
8a730e2aec fix: remove dangling raw_ptr api::WebContents::zoom_controller_ (#50812)
fix: remove dangling raw_ptr api::WebContents::zoom_controller_
2026-04-09 12:16:17 -05:00
Shelley Vohr
044be7ce40 fix: avoid crash in window.print() when prefilling native print dialog (#50843)
fix: avoid crash in window.print() when prefilling native print dialog

When UpdatePrinterSettings() fails (e.g. the printer rejects the
requested resolution), OnError() nullifies print_info_ via
ReleaseContext(). The return value was not checked, so
AskUserForSettings() passed nil to [NSPrintPanel runModalWithPrintInfo:],
crashing in PJCSessionHasApplicationSetPrinter with a null PMPrintSession.

Check the return value and fall back to UseDefaultSettings() on failure
so the dialog opens with defaults instead of crashing.
2026-04-09 13:14:36 -04:00
Shelley Vohr
7245c6a3f0 ci: re-check signed commits on every PR synchronize (#50811)
The needs-signed-commits label was previously added by the lightweight
synchronize workflow but only removed by a job in build.yml gated on
`gha-done`, which requires every macOS/Linux/Windows build to finish
green. That made label removal both slow (waits on the full pipeline)
and fragile (any unrelated build failure leaves the label pinned even
after commits are properly signed).

Drop the `if` guard on the synchronize job so it re-evaluates signing
on every push, and add a removal step that runs on success when the
label is present. Force-pushing signed commits now clears the label as
soon as the check completes, with no dependency on the build pipeline.
2026-04-09 11:02:01 -04:00
Charles Kerr
b484b0bde9 fix: fix inset and stop using gfx::ToFlooredRectDeprecated() (#50809)
fix: fix inset and stop using ToFlooredRectDeprecated()
2026-04-09 09:55:11 -05:00
Charles Kerr
6c8a910232 refactor: remove unnecessary raw_ptr SavePageHandler::web_contents_ (#50810)
refactor: remove unnecessary field raw_ptr<content::WebContents> SavePageHandler::web_contents_
2026-04-09 09:54:44 -05:00
Noah Gregory
cc3d4f5f58 fix: PDF support when site isolation trials disabled (#50689)
* fix: use proper OOPIF PDF check in `StreamsPrivateAPI`

* fix: add `ShouldEnableSubframeZoom` override to `ElectronBrowserClient` for upstream parity

* fix: add `MaybeOverrideLocalURLCrossOriginEmbedderPolicy` override to `ElectronBrowserClient` for upstream parity

* fix: add `DoesSiteRequireDedicatedProcess` override to `ElectronBrowserClient` for upstream parity

* style: move `DoesSiteRequireDedicatedProcess` to correct override section
2026-04-09 15:35:26 +02:00
Shelley Vohr
b711ce7b04 chore: remove window enlargement revert patch (#50612)
* chore: remove window enlargement revert patch

Chromium removed the `window_enlargement_` system from
DesktopWindowTreeHostWin (1771dbae), which was a workaround for an AMD
driver bug from 2013 (crbug.com/286609) where translucent HWNDs smaller
than 64x64 caused graphical glitches. Chromium confirmed this is no
longer needed and shipped the removal.

This removes the revert patch and all Electron-side code that depended
on the `kEnableTransparentHwndEnlargement` feature flag, including the
`GetExpandedWindowSize` helper and max size constraint expansion in
`NativeWindow::GetContentMaximumSize`.

* test: remove obsolete <64x64 transparent window test

The test was added in 2018 (#12904) to verify the AMD driver
workaround that artificially enlarged translucent HWNDs smaller than
64x64 (crbug.com/286609). The workaround set the real HWND to 64x64
and subtracted a stored window_enlargement_ from every client/window
bounds query, so getContentSize() reported the originally-requested
size even though the actual HWND was larger.

With both the Chromium window_enlargement_ system and Electron's
GetExpandedWindowSize gone, setContentSize on a transparent
thickFrame window calls SetWindowPos directly. WS_THICKFRAME windows
are subject to DefWindowProc's MINMAXINFO.ptMinTrackSize clamp on
programmatic resizes (Chromium's OnGetMinMaxInfo ends with
SetMsgHandled(FALSE), so DefWindowProc overwrites the zeroed
min-track with system defaults), which on Windows Server 2025
floors at 32x39 — hence the failing [32, 39] vs [30, 30].

The removed feature_list.cc comment explicitly flagged this test as
the blocker for retiring kEnableTransparentHwndEnlargement, so
delete it alongside the workaround it was validating.
2026-04-09 15:34:10 +02:00
Alexey
adf9a6e303 fix: restore std::deque for dynamic crash key storage (#50795)
#47171 migrated `std::deque` to `base::circular_deque` in
`shell/common/crash_keys.cc`. However, `CrashKeyString` wraps a
`crashpad::Annotation` that holds self-referential pointers and
registers itself in a process-global linked list. `circular_deque`
relocates elements on growth (via `VectorBuffer::MoveConstructRange`),
leaving those pointers dangling — causing missing crash keys or a hung
crashpad handler (especially on macOS). The `base/containers/README.md`
warns: "Since `base::deque` does not have stable iterators and it will
move the objects it contains, it may not be appropriate for all uses."

Reverts to `std::deque`, whose block-based layout never relocates
existing elements. Adds a regression test that registers 50 dynamic
crash keys and verifies they all survive a renderer crash.

Notes: Fixed crash keys being lost and the crash reporter hanging on
macOS when many dynamic crash keys were registered.

Made-with: Cursor
2026-04-09 10:50:32 +02:00
Calvin
6744293e96 fix: account for extraSize in aspect ratio min/max clamping on macOS (#50794)
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-04-09 10:50:17 +02:00
dependabot[bot]
0d3342debf build(deps-dev): bump @xmldom/xmldom from 0.8.11 to 0.8.12 in the npm_and_yarn group across 1 directory (#50824)
build(deps-dev): bump @xmldom/xmldom

Bumps the npm_and_yarn group with 1 update in the / directory: [@xmldom/xmldom](https://github.com/xmldom/xmldom).


Updates `@xmldom/xmldom` from 0.8.11 to 0.8.12
- [Release notes](https://github.com/xmldom/xmldom/releases)
- [Changelog](https://github.com/xmldom/xmldom/blob/master/CHANGELOG.md)
- [Commits](https://github.com/xmldom/xmldom/compare/0.8.11...0.8.12)

---
updated-dependencies:
- dependency-name: "@xmldom/xmldom"
  dependency-version: 0.8.12
  dependency-type: direct:development
  dependency-group: npm_and_yarn
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-04-09 09:54:13 +02:00
Calvin
157cdac4b9 test: use shared get_out_dir() in generate_node_headers.py (#50828)
The local get_out_dir() defaulted to 'Testing' instead of 'Default',
causing `e test` to fail when using a non-Testing build config. Replace
it with the canonical version from script/lib/util.py.
2026-04-09 09:52:14 +02:00
Shelley Vohr
4dfada86ce fix: menu items not cleaned up after rebuild (#50806)
Menu was holding a SelfKeepAlive to itself from construction, so any
Menu that was never opened (e.g. an application menu replaced before
being shown) stayed pinned in cppgc forever. Repeated calls to
Menu.setApplicationMenu leaked every prior Menu along with its model
and items.

Restore the original Pin/Unpin lifecycle: start keep_alive_ empty and
only assign `this` in OnMenuWillShow. OnMenuWillClose already clears
it.
2026-04-09 11:56:39 +09:00
Kanishk Ranjan
df81a1d4ac test: add desktopCapturer icon validation (#50261)
* chore: testing of desktopCapturer can run on arm

* fix: DesktopMediaListCaptureThread crash

Fixed a crash when Windows calls ::CoCreateInstance() in the
DesktopMediaListCaptureThread before COM is initialized.

* test: added test for desktopCapturer fetchWindowIcons

* chore: updating Chromium patch hash

---------

Co-authored-by: Charles Kerr <charles@charleskerr.com>
2026-04-08 14:56:27 -04:00
Shelley Vohr
c3e3958668 fix: devtools re-attaches on open when previously detached (#50807)
PR #50646 added a dock state allowlist in SetDockState() that collapsed any
non-matching value to "right". WebContents::OpenDevTools passes an empty
string when no `mode` option is given, which is the sentinel LoadCompleted()
uses to restore `currentDockState` from prefs. The allowlist clobbered that
sentinel to "right", so previously-undocked devtools would flash detached
and then snap back to the right dock.

Preserve the empty string through SetDockState() so the pref-restore path
runs; still reject any non-empty invalid value to keep the JS-injection
guard from #50646 intact.
2026-04-08 13:36:47 -04:00
electron-roller[bot]
afd5fb4a60 chore: bump chromium to 148.0.7778.0 (main) (#50769)
* chore: bump chromium in DEPS to 148.0.7776.0

* chore: bump chromium in DEPS to 148.0.7778.0

* fix(patch): buffered_data_source_host_impl include added upstream

Ref: https://chromium-review.googlesource.com/c/chromium/src/+/7712714

Co-Authored-By: Claude <svc-devxp-claude@slack-corp.com>

* fix(patch): ASan process info callback added upstream

Ref: https://chromium-review.googlesource.com/c/chromium/src/+/7724018

Co-Authored-By: Claude <svc-devxp-claude@slack-corp.com>

* fix(patch): ServiceProcessHost per-instance observer migration

Ref: https://chromium-review.googlesource.com/c/chromium/src/+/7700794

Co-Authored-By: Claude <svc-devxp-claude@slack-corp.com>

* fix(patch): FSA BlockPath factory method refactor

Upstream refactored BlockPath initialization to use factory methods
(CreateRelative, CreateAbsolute, CreateSuffix) and a switch statement.
Updated the exposed code in the header to match.

Ref: https://chromium-review.googlesource.com/c/chromium/src/+/7665590

Co-Authored-By: Claude <svc-devxp-claude@slack-corp.com>

* fix(patch): service process tracker per-instance observer refactor

Ref: https://chromium-review.googlesource.com/c/chromium/src/+/7700794

Co-Authored-By: Claude <svc-devxp-claude@slack-corp.com>

* chore: update patches (trivial only)

* 7723958: Rename blink::WebString::FromUTF16() to FromUtf16()

Ref: https://chromium-review.googlesource.com/c/chromium/src/+/7723958

Co-Authored-By: Claude <svc-devxp-claude@slack-corp.com>

* fixup! fix(patch): ASan process info callback added upstream

---------

Co-authored-by: electron-roller[bot] <84116207+electron-roller[bot]@users.noreply.github.com>
Co-authored-by: Samuel Maddock <samuelmaddock@electronjs.org>
Co-authored-by: Claude <svc-devxp-claude@slack-corp.com>
2026-04-08 13:34:24 -04:00
Charles Kerr
8679522922 chore: iwyu commonly-included headers in shell/ (#50778)
* chore: iwyu in shell/browser/api/electron_api_web_contents.h

* chore: iwyu in shell/browser/browser.h

* chore: iwyu in shell/browser/javascript_environment.h

* chore: iwyu in shell/common/gin_helper/function_template.h

* chore: do not include node_includes.h if we are not using it

* chore: fix transitive include
2026-04-08 09:33:42 -05:00
David Sanders
0828de3ccd ci: include .obj checksums when calculating object change rate (#50772) 2026-04-08 14:57:40 +02:00
Michaela Laurencin
6b5a4ff66c ci: allow ai-pr label without comment (#50792) 2026-04-08 13:09:23 +02:00
Charles Kerr
ca28023d4d chore: remove unused enum classes (#50782)
chore: remove unused FileSystemAccessPermissionContext::Access enum class

chore: remove unused FileSystemAccessPermissionContext::RequestType enum class

declared in 344aba08 but never used
2026-04-08 09:41:38 +02:00
Samuel Attard
e60441ad60 build: update build-tools to latest (#50786) 2026-04-08 09:31:12 +02:00
Charles Kerr
a189425373 fix: dangling raw_ptr api::Session::browser_context_ (#50784)
* fix: dangling raw_ptr api::Session::browser_context_

* fix: address code review feedback
2026-04-08 15:04:29 +09:00
Charles Kerr
7eccea1315 refactor: remove use of deprecated class base::MemoryPressureListener (#50763) 2026-04-07 20:11:02 -05:00
669 changed files with 23743 additions and 13072 deletions


@@ -11,6 +11,7 @@
"Bash(e patches:*)",
"Bash(e sync:*)",
"Skill(electron-chromium-upgrade)",
"Skill(electron-node-upgrade)",
"Read(*)",
"Bash(echo:*)",
"Bash(e build:*)",


@@ -0,0 +1,106 @@
---
name: chrome-release-cls
description: Given a Chrome Releases blog post URL (chromereleases.googleblog.com), extract every CVE/bug and find the underlying Gerrit CL that fixed it by searching the local Chromium checkout and sub-repos. Use when asked to map Chrome security release notes to fixing CLs, or to find which commits correspond to CVEs in a Chrome stable update.
---
# Chrome Release → Fixing CL Mapper
Maps every security fix in a Chrome Releases blog post to the Gerrit CL(s) that fixed it.
## Input
`$ARGUMENTS` — a `https://chromereleases.googleblog.com/...` URL. If empty, ask the user for one.
## Procedure
### 1. Extract CVE → bug ID pairs from the blog post
The blog HTML buries bug IDs inside `<a>` tags, so strip tags first. Run:
```bash
curl -sL "$URL" | python3 -c '
import sys, re, html
t = re.sub(r"<[^>]+>", " ", sys.stdin.read())
t = re.sub(r"\s+", " ", html.unescape(t))
seen = set()
for m in re.finditer(r"\[\s*(\d{6,})\s*\]\s*(Critical|High|Medium|Low)\s*(CVE-\d{4}-\d+):\s*([^.]+?)\.", t):
if m.group(3) in seen: continue
seen.add(m.group(3))
print(f"{m.group(3)}|{m.group(1)}|{m.group(2)}|{m.group(4).strip()}")
' > /tmp/cve_bugs.txt
cat /tmp/cve_bugs.txt
```
If this yields nothing, the page may have changed format — fall back to `grep -oE 'CVE-[0-9]{4}-[0-9]+'` and `grep -oE 'crbug\.com/[0-9]+'` and pair them by order.
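The order-based pairing fallback can be sketched as follows (`text` is a made-up sample standing in for the tag-stripped page content; a real run would feed it the curl output):

```bash
# Fallback sketch: extract CVEs and crbug IDs in document order and zip them.
text='High CVE-2026-1111: Use after free in Media. crbug.com/448890123
Medium CVE-2026-2222: Out of bounds read in PDF. crbug.com/448890456'
pairs=$(paste -d'|' \
  <(grep -oE 'CVE-[0-9]{4}-[0-9]+' <<<"$text") \
  <(grep -oE 'crbug\.com/[0-9]+' <<<"$text" | cut -d/ -f2))
echo "$pairs"
```

This only works when CVEs and bug IDs appear in matching order, so sanity-check the pair count against the blog's fix count before trusting it.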
### 2. Find the fixing CL for each bug
Search git history in the Chromium checkout and relevant sub-repos for commits whose `Bug:` or `Fixed:` footer references the bug ID, then extract the `Reviewed-on:` Gerrit URL.
Repo selection by component keyword:
- ANGLE → `third_party/angle`
- Skia, Graphite → `third_party/skia`
- PDFium → `third_party/pdfium`
- Dawn → `third_party/dawn`
- V8, Turbofan, Maglev, Turboshaft → `v8`
- everything else → `.` (chromium/src)
Always also fall back to `.` if the hinted repo has no match.
```bash
cd /root/src/electron/src # chromium root (parent of electron/)
lookup() {
local bug="$1" repos="$2"
for repo in $repos . v8 third_party/skia third_party/angle third_party/pdfium third_party/dawn; do
local hits
hits=$(git -C "$repo" log --all --since='6 months ago' -E \
--grep="(Bug|Fixed):.*\\b${bug}\\b" --format='%H' 2>/dev/null | sort -u)
[[ -z "$hits" ]] && continue
while read -r h; do
git -C "$repo" log -1 --format='%B' "$h" | grep '^Reviewed-on:' | sed 's/^/ /'
echo "$(git -C "$repo" log -1 --format='%s' "$h")"
done <<<"$hits"
return 0
done
echo " (not found locally)"
}
```
Drive it from `/tmp/cve_bugs.txt`. Prefer the **non-`[M1xx]`-prefixed** commit subject as the canonical main CL; the `[M1xx]` ones are branch cherry-picks.
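A possible driver loop, assuming `lookup()` from the block above is already defined and `/tmp/cve_bugs.txt` exists; `repo_for` is a hypothetical helper encoding the keyword table:

```bash
# Map a component description to its starting repo (sketch of the table above).
repo_for() {
  case "$1" in
    *ANGLE*)                                echo third_party/angle ;;
    *Skia*|*Graphite*)                      echo third_party/skia ;;
    *PDFium*)                               echo third_party/pdfium ;;
    *Dawn*)                                 echo third_party/dawn ;;
    *V8*|*Turbofan*|*Maglev*|*Turboshaft*)  echo v8 ;;
    *)                                      echo . ;;
  esac
}
# Feed each extracted CVE|bug|severity|desc line to lookup().
if [ -f /tmp/cve_bugs.txt ]; then
  while IFS='|' read -r cve bug sev desc; do
    echo "== $cve ($sev) bug $bug"
    lookup "$bug" "$(repo_for "$desc")"
  done < /tmp/cve_bugs.txt
fi
```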
### 3. Handle misses
For any bug with no local hit:
- `git -C <repo> fetch origin` then re-search `--remotes` (fix may be newer than the checkout).
- Query Gerrit directly: `curl -s "https://chromium-review.googlesource.com/changes/?q=bug:${BUG}&n=10" | tail -n +2 | python3 -m json.tool` (also try `skia-review`, `pdfium-review`, `dawn-review`, `aomedia-review`).
- **`b/` bug format (Skia, Graphite, Dawn):** These repos reference bugs as `b/<id>` in commit messages rather than `Bug: <id>` footers. The Gerrit `bug:` query will return nothing. Use `message:<id>` search instead:
```bash
curl -s "https://skia-review.googlesource.com/changes/?q=message:${BUG}&n=5" | tail -n +2
```
Apply the same pattern for `dawn-review.googlesource.com` when the component is Dawn.
- **Tracing main CLs from merges:** When only `[M1xx]` merge CLs are found, query the CL detail for `cherry_pick_of_change` to find the original main CL number:
```bash
curl -s "https://chromium-review.googlesource.com/changes/${CL_NUM}?o=CURRENT_REVISION" | tail -n +2 | python3 -c "
import sys, json
d = json.load(sys.stdin)
print(d.get('cherry_pick_of_change', 'none'))
"
```
- If still nothing and the bug was reported very recently (especially by "Google Threat Intelligence" or marked in-the-wild), the CL is likely still access-restricted — report it as such rather than guessing.
### 4. Special cases
- **Roll CLs — skip and find the upstream fix:** For components whose fixes land in upstream repos (PDFium, Dawn, Skia, Graphite, libaom, libvpx, ffmpeg), the chromium-review hit will be a `Roll src/third_party/...` commit. Do not report the roll CL as the fix. Instead, query the component's own Gerrit instance directly for the actual fixing CL:
- PDFium → `pdfium-review.googlesource.com` (use `bug:` or `message:` query)
- Dawn → `dawn-review.googlesource.com` (use `message:` query — uses `b/` format)
- Skia / Graphite → `skia-review.googlesource.com` (use `message:` query — uses `b/` format)
- libaom → `aomedia-review.googlesource.com`
Only if the upstream Gerrit instance returns no results should you fall back to reporting the roll CL — in that case, include the roll CL and note that the actual fix is upstream but the specific CL could not be identified.
- Multiple `Reviewed-on:` lines in one commit body: cherry-picks keep the original line plus a new one. The **first** `Reviewed-on:` is the original CL.
- A bug may have multiple distinct fix CLs (fix + follow-up hardening) — list all of them.
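A minimal sketch of picking the original CL from a cherry-pick commit body (the body and CL numbers below are made up):

```bash
# Cherry-picks keep the original Reviewed-on: line plus the branch one;
# the first occurrence is the original CL.
body='Fix heap overflow in Foo

Reviewed-on: https://chromium-review.googlesource.com/c/chromium/src/+/7700001
(cherry picked from commit 1234abcd)
Reviewed-on: https://chromium-review.googlesource.com/c/chromium/src/+/7700099'
orig_cl=$(grep '^Reviewed-on:' <<<"$body" | head -n1 | awk '{print $2}')
echo "$orig_cl"
```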
### 5. Output
Produce a markdown table per severity level: `CVE | Bug | Component | Fix CL (main)`. Link bugs as `https://crbug.com/<id>`. Save raw output (including all branch merges) to `/tmp/cve_cls.txt` and mention the path.
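For example, a High-severity section might look like (all values illustrative):

```markdown
### High
| CVE | Bug | Component | Fix CL (main) |
|-----|-----|-----------|---------------|
| CVE-2026-1111 | [448890123](https://crbug.com/448890123) | Media | https://chromium-review.googlesource.com/c/chromium/src/+/7700001 |
```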


@@ -0,0 +1,123 @@
---
name: chrome-release-verify
description: End-to-end Chrome security backport for an Electron release branch. Given a Chrome Releases blog URL and a branch (e.g. 41-x-y), determines which CVE fixes are missing from the *actual synced source*, writes the cherry-pick patches locally, validates them with `e sync --3` + `lint --patches`, then pushes a single PR. Use when asked to backport a Chrome security release to N-x-y, "is CVE-X already in N-x-y?", or to produce/validate the cherry-pick set for a release branch.
---
# Chrome Release → Validated Backport PR
Input: `$ARGUMENTS` = `<release-branch> <chrome-releases-blog-url>` (e.g. `41-x-y https://chromereleases.googleblog.com/2026/04/stable-channel-update-for-desktop_15.html`). Ask if either is missing.
The flow is **local-first**: nothing is pushed until every patch applies via `e sync --3` and passes `lint --patches`.
## 1. Map CVE → bug → fix CL
Run `/chrome-release-cls <blog-url>` (or its inline procedure) to produce `/tmp/cve_bugs.txt` (`CVE|bug|severity|desc`) and a per-bug canonical fix CL. For each CL also note `repo` (path under `src/`: `.`, `v8`, `third_party/{skia,angle,pdfium,dawn}`, `third_party/libaom/source/libaom`) and `gerrit-host`.
**Prefer the target-milestone merge CL** if one exists (e.g. on `41-x-y` ≈ M146, prefer the `[M146]` cherry-pick over the main CL) — it's already rebased and far less likely to conflict. Find it via `git log --all --grep` on the Change-Id, or Gerrit `?q=bug:<n>`. If Chrome did *not* merge a fix to the target milestone, that's a strong signal the vulnerable code doesn't exist there — flag it for skip rather than forcing a port.
## 2. Prepare a synced worktree
Reuse `bp-<NN>` from `e show configs` if present, else `e worktree add bp-<NN> ~/src/electron-bp-<NN> --source <current> --no-sync`.
```bash
cd <root>/src/electron
git fetch origin <branch>
git checkout -B security-backport/<branch>/<short-date> origin/<branch>
e use bp-<NN>
e sync 2>&1 | tee /tmp/bp_sync.log
```
If sync fails with `NotADirectoryError: '<root>/src/.git/objects/info/alternates'`, remove `GIT_CACHE_PATH` from the bp config's `env` and retry.
## 3. Verify IN-TREE vs NEEDS-BACKPORT
For each bug, three checks against the **synced** repo:
1. `git -C "$repo" log HEAD --since='1 year ago' -E --grep="\b${bug}\b" --format='%h %s'`
2. Fetch Change-Id from Gerrit, then `git log HEAD --grep="^Change-Id: ${cid}$"`
3. `grep -rlE "(\b${bug}\b|${cid})" <root>/src/electron/patches/`
Any hit ⇒ IN-TREE. All empty ⇒ NEEDS-BACKPORT.
For each NEEDS-BACKPORT CL, also fetch its file list (`/changes/<proj>~<cl>/revisions/current/files`) and **skip** if every file is under `chrome/browser/`, `chrome/android/`, `ios/`, or `components/**/android/` — Electron doesn't compile those.
Report the table now (`CVE | Sev | Bug | Component | Verdict | CL`) and the proposed backport set; get user sign-off before continuing.
## 4. Write patches locally (no push yet)
For each backport CL, fetch the raw patch and write it into `patches/<dir>/`:
```bash
curl -s "https://${host}.googlesource.com/changes/${proj//\//%2F}~${cl}/revisions/current/patch" \
| base64 -d > "patches/${dir}/cherry-pick-${short}.patch"
echo "cherry-pick-${short}.patch" >> "patches/${dir}/.patches"
```
For repos hosted on a Gerrit instance that `e cherry-pick` does not support (e.g. **libaom** on aomedia), instead `git cherry-pick` the upstream commits onto the synced sub-repo HEAD and `git format-patch` the result.
For any newly-created `patches/<dir>/`, append to `patches/config.json` **preserving the compact one-line-per-entry style**:
```json
{ "patch_dir": "src/electron/patches/<dir>", "repo": "src/third_party/<dir-or-nested-path>" }
```
## 5. Validate with `e sync --3`
```bash
e sync --3 2>&1 | tee /tmp/bp_sync3.log
```
On `Patch failed at NNNN <subject>`:
- `cd` into the failing repo, inspect `git diff` for conflict markers.
- **Test-only files** (e.g. `web_tests/VirtualTestSuites`, `*_unittest.cc` context drift): take ours (`git checkout --ours -- <file>`) if the security-relevant hunks merged cleanly.
- **Substantive code conflicts**: check whether a target-milestone merge CL exists and swap to it. If none exists upstream and the surrounding code is structurally different, **drop the patch** (delete the file, remove from `.patches` and `config.json`) and note it for a separate manual-port PR — do not improvise security-fix semantics.
- After resolving: `git add <files> && git -c commit.gpgsign=false am --continue`, then `e patches <repo>` to export the resolved patch, then re-run `e sync --3`. Repeat until clean.
## 6. Export → lint → re-apply loop
```bash
e patches all
node script/lint.js --patches # must exit 0
```
If lint reports findings (typically trailing whitespace on `+` content lines), fixing them **changes the bytes the patch writes**, which invalidates the `index <old>..<new>` blob hashes that `e patches` baked in. Hand-editing a `.patch` and pushing it as-is will pass lint locally but fail CI's Apply Patches re-export check with a one-line `index` hash diff.
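To see why the hashes shift: the `index <old>..<new>` values are git blob SHAs of the file's pre- and post-images, so any byte change invalidates them. A minimal demonstration (assumes only that `git` is on `PATH`; the files are throwaway samples):

```bash
# Two files differing only by trailing whitespace hash to different blob SHAs —
# exactly what breaks a hand-edited patch's index line.
printf 'int x = 1;\n'  > /tmp/blob_a.c
printf 'int x = 1; \n' > /tmp/blob_b.c   # trailing space added
sha_a=$(git hash-object /tmp/blob_a.c)
sha_b=$(git hash-object /tmp/blob_b.c)
echo "$sha_a"; echo "$sha_b"
```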
So whenever lint (or you) modifies any `.patch` file after export, round-trip once more:
```bash
# fix the lint findings in patches/**/*.patch, then:
e sync # re-apply the edited patches (no --3 needed; they applied cleanly last time)
e patches all # re-export so index blob hashes match the edited content
node script/lint.js --patches # must now exit 0
git diff --quiet -- patches/ || { echo "patches changed again — repeat the loop"; }
```
Repeat until `lint --patches` exits 0 **and** `git diff -- patches/` is empty after the final `e patches all`. Only then is the patch set CI-stable.
## 7. Commit, push, PR
```bash
git add patches/
git commit -m "chore: cherry-pick <N> changes from <dirs>"
git push origin HEAD
gh pr create --repo electron/electron --base <branch> --head <this-branch> \
--title "chore: cherry-pick <N> changes from <dirs>" \
--label "<branch>" --label backport-check-skip --label semver/patch --label "security 🔒" \
--body-file /tmp/pr_body.md
```
PR body format:
```markdown
Backports the following changes:
* [`<shortCommit>`](<gerrit-CL-url>) from <patchDir> — <subject> ([<bug>](https://crbug.com/<bug>), CVE-YYYY-NNNN)
* ...
Notes: Security: backported fixes for CVE-YYYY-NNNN, CVE-YYYY-NNNN, ....
```
Short commit links to the **Gerrit CL**; bug links to `crbug.com`; CVE comes from the blog mapping (the patch's own `Bug:` footer may differ); `Notes:` is the last line. Mention any dropped patches (with reason) above the `Notes:` line.
Restore `e use <previous>` when done.


@@ -0,0 +1,323 @@
---
name: electron-node-upgrade
description: Guide for performing Node.js version upgrades in the Electron project. Use when working on the roller/node/main branch to fix patch conflicts during `e sync --3`. Covers the patch application workflow, conflict resolution, analyzing upstream Node.js changes, building, running the Node.js test suite, and proper commit formatting for patch fixes.
---
# Electron Node.js Upgrade: Phase One
## Summary
Run `e sync --3` repeatedly, fixing patch conflicts as they arise, until it succeeds. Then export patches and commit changes atomically.
## Success Criteria
Phase One is complete when:
- `e sync --3` exits with code 0 (no patch failures)
- All changes are committed per the commit guidelines
Do not stop until these criteria are met.
**CRITICAL** Do not delete or skip patches unless 100% certain the patch is no longer needed. For major version upgrades, patches that shim deprecated V8 APIs or backport upstream changes are often deletable because the new Node.js version already incorporates them — but verify before removing. Complicated conflicts or hard-to-resolve issues should be presented to the user after you have exhausted all other options. Do not delete a patch just because you can't solve it.
**CRITICAL** Never use `git am --skip` and then manually recreate a patch by making a new commit. This destroys the original patch's authorship, commit message, and position in the series. If `git am --continue` reports "No changes", investigate why — the changes were likely absorbed by a prior conflict resolution's 3-way merge. Present this situation to the user rather than skipping and recreating.
## Context
The `roller/node/main` branch is created by automation to update Electron's Node.js dependency version in `DEPS`. No work has been done to handle breaking changes between the old and new versions.
There are two types of Node.js version updates:
- **Bumps** (patch/minor): Automated by `electron-roller[bot]` with commit title `chore: bump node to v{version}`. Trivial patch index updates are handled automatically by `patchup[bot]`. These often land cleanly, but may require manual patch fixes.
- **Major upgrades** (e.g., v22 → v24): Manual, large PRs with commit title `chore: upgrade Node.js to v{X}.{Y}.{Z}`. These typically involve deleting obsolete patches, adapting many others, and updating `@types/node` in `package.json`.
**Key directories:**
- Current directory: Electron repo (always run `e` commands here)
- `../third_party/electron_node`: Node.js repo (where patches apply)
- `patches/node/`: Patch files for Node.js
- `docs/development/patches.md`: Patch system documentation
## Pre-flight Checks
Run these once at the start of each upgrade session:
1. **Clear rerere cache** (if enabled): `git rerere clear` in both the electron and `../third_party/electron_node` repos. Stale recorded resolutions from a prior attempt can silently apply wrong merges.
2. **Ensure pre-commit hooks are installed**: Check that `.git/hooks/pre-commit` exists. If not, run `yarn husky` to install it. The hook runs `lint-staged` which handles clang-format for C++ files.
## Workflow
1. Run `e sync --3` (the `--3` flag enables 3-way merge, always required)
2. If succeeds → skip to step 5
3. If patch fails:
- Identify target repo and patch from error output
- Analyze failure (see references/patch-analysis.md)
- Fix conflict in `../third_party/electron_node` working directory
- Run `git am --continue` in `../third_party/electron_node`
- Repeat until all patches for that repo apply
- IMPORTANT: Once `git am --continue` succeeds you MUST run `e patches node` to export fixes
- Return to step 1
4. When `e sync --3` succeeds, run `e patches all`
5. **Read `references/phase-one-commit-guidelines.md` NOW**, then commit changes following those instructions exactly.
## Commands Reference
| Command | Purpose |
|---------|---------|
| `e sync --3` | Clone deps and apply patches with 3-way merge |
| `git am --continue` | Continue after resolving conflict (run in node repo) |
| `e patches node` | Export commits from node repo to patch files |
| `e patches all` | Export all patches from all targets |
| `e patches node --commit-updates` | Export patches and auto-commit trivial changes |
| `e patches --list-targets` | List targets and config paths |
## Patch System Mental Model
```
patches/node/*.patch → [e sync --3] → ../third_party/electron_node commits
← [e patches] ←
```
## When to Edit Patches
| Situation | Action |
|-----------|--------|
| During active `git am` conflict | Fix in node repo, then `git am --continue` |
| Modifying patch outside conflict | Edit `.patch` file directly |
| Creating new patch (rare, avoid) | Commit in node repo, then `e patches node` |
Fix existing patches 99% of the time rather than creating new ones.
## Patch Fixing Rules
1. **Preserve authorship**: Keep original author in TODO comments (from patch `From:` field)
2. **Never change TODO assignees**: `TODO(name)` must retain original name
3. **Update descriptions**: If upstream changed APIs or macros, update patch commit message to reflect current state
4. **Never skip-and-recreate a patch**: If `git am --continue` says "No changes — did you forget to use 'git add'?", do NOT run `git am --skip` and create a replacement commit. The patch's changes were already absorbed by a prior 3-way merge resolution. This means an earlier conflict resolution pulled in too many changes. Present the situation to the user for guidance — the correct fix may require re-doing an earlier resolution more carefully to keep each patch's changes separate.
# Electron Node.js Upgrade: Phase Two
## Summary
Run `e build -k 999 -- --quiet` repeatedly, fixing build issues as they arise, until it succeeds. Then run `e start --version` to validate that Electron launches, and commit changes atomically.
Run Phase Two immediately after Phase One is complete.
## Success Criteria
Phase Two is complete when:
- `e build -k 999 -- --quiet` exits with code 0 (no build failures)
- `e start --version` has been run to verify that Electron launches
- All changes are committed per the commit guidelines
Do not stop until these criteria are met. Do not delete code or features, and never comment out code to take a shortcut. Make all existing code, logic, and intent work.
## Context
The `roller/node/main` branch is created by automation to update Electron's Node.js dependency version in `DEPS`. No work has been done to handle breaking changes between the old and new versions. Node.js APIs (especially internal V8 integration, OpenSSL/BoringSSL compatibility, and build system files) frequently change between versions. In every case the code in Electron must be updated to account for the change in Node.js, strongly avoid making changes to the code in Node.js to fix Electron's build.
**Key directories:**
- Current directory: Electron repo (always run `e` commands here)
- `../third_party/electron_node`: Node.js repo (do not touch this code to fix build issues, just read it to obtain context)
## Workflow
1. Run `e build -k 999 -- --quiet` (the `--quiet` flag suppresses per-target status lines, showing only errors and the final result)
2. If succeeds → skip to step 6
3. If build fails:
- Identify underlying file in "electron" from the compilation error message
- Analyze failure
- Fix build issue by adapting Electron's code for the change in Node.js
- Run `e build -t {target_that_failed}.o` to build just the failed target you are fixing
- You can identify `{target_that_failed}` from the failure line in the build log. E.g. in `FAILED: 2e506007-8d5d-4f38-bdd1-b5cd77999a77 "./obj/electron/shell/browser/api/electron_api_utility_process.o" CXX obj/electron/shell/browser/api/electron_api_utility_process.o`, the target name is `obj/electron/shell/browser/api/electron_api_utility_process.o`
- **Read `references/phase-two-commit-guidelines.md` NOW**, then commit changes following those instructions exactly.
- Return to step 1
4. **CRITICAL**: After ANY commit (especially patch commits), immediately run `git status` in the electron repo
- Look for other modified `.patch` files that only have index/hunk header changes
- These are dependent patches affected by your fix
- Commit them immediately with: `git commit -am "chore: update patches (trivial only)"`
5. Return to step 1
6. When `e build` succeeds, run `e start --version`
7. Check if you have any pending changes in the Node.js repo by running `git status` in `../third_party/electron_node`
- If you have changes, follow the instructions in "A. Patch Fixes" below to correctly commit those modifications into the appropriate patch file
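As a concrete illustration of step 3, the `{target_that_failed}` name can be pulled out of a `FAILED:` log line mechanically. A minimal sketch, assuming the line format shown in the example above (the `line` variable stands in for a real log line):

```shell
# A FAILED: line quotes the object path as "./obj/...": capture the
# quoted path and strip the leading "./" to get the target name.
line='FAILED: 2e506007-8d5d-4f38-bdd1-b5cd77999a77 "./obj/electron/shell/browser/api/electron_api_utility_process.o" CXX obj/electron/shell/browser/api/electron_api_utility_process.o'
target=$(printf '%s\n' "$line" | sed -E 's|^FAILED: [^ ]+ "\./([^"]+)".*|\1|')
echo "$target"
```

The printed value is what the workflow calls `{target_that_failed}`.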
## Commands Reference
| Command | Purpose |
|---------|---------|
| `e build -k 999 -- --quiet` | Build Electron, continue on errors, suppress status lines |
| `e build -t {target}.o` | Build just one specific target to verify a fix |
| `e start --version` | Validate Electron launches after successful build |
## Two Types of Build Fixes
### A. Patch Fixes (for patched Node.js files)
When the error is in a file that Electron patches (check with `grep -l "filename" patches/node/*.patch`):
1. Edit the file in the Node.js source tree (`../third_party/electron_node/...`)
2. Create a fixup commit targeting the original patch commit:
```bash
cd ../third_party/electron_node
git add <modified-file>
git commit --fixup=<original-patch-commit-hash>
GIT_SEQUENCE_EDITOR=: git rebase --autosquash --autostash -i <commit>^
```
3. Export the updated patch: `e patches node`
4. Commit the updated patch file following `references/phase-one-commit-guidelines.md`.
To find the original patch commit to fixup: `git log --oneline | grep -i "keyword from patch name"`
The base commit for rebase is the Node.js commit before patches were applied. Find it by checking the `refs/patches/upstream-head` ref.
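A minimal sketch of resolving that base commit (assuming the `refs/patches/upstream-head` ref exists as described; `find_rebase_base` is a hypothetical helper name):

```shell
# Prints the Node.js commit before Electron's patches were applied,
# i.e. the base commit for the autosquash rebase.
find_rebase_base() {
  git rev-parse refs/patches/upstream-head
}
# From ../third_party/electron_node, after the --fixup commit:
#   GIT_SEQUENCE_EDITOR=: git rebase --autosquash --autostash -i "$(find_rebase_base)"
```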
### B. Electron Code Fixes (for files in shell/, electron/, etc.)
When the error is in Electron's own source code:
1. Edit files directly in the electron repo
2. Commit directly (no patch export needed)
# Electron Node.js Upgrade: Phase Three
## Summary
Run the Node.js test suite via `script/node-spec-runner.js`, fix failing tests, and commit fixes until all tests pass. Certain tests are permanently disabled (listed in `script/node-disabled-tests.json`) and should not be run.
Run Phase Three immediately after Phase Two is complete.
## Success Criteria
Phase Three is complete when:
- `node script/node-spec-runner.js --default` exits with zero failures
- All changes are committed per the commit guidelines
Do not stop until these criteria are met.
## Context
Electron runs a subset of Node.js's upstream test suite using a custom runner (`script/node-spec-runner.js`). Tests are executed with the built Electron binary via `ELECTRON_RUN_AS_NODE=true`. Many tests need adaptation because Electron uses BoringSSL (not OpenSSL) and Chromium's V8 (which may differ from Node.js's bundled V8).
**Key files:**
- `script/node-spec-runner.js` — Test runner script
- `script/node-disabled-tests.json` — Permanently disabled tests (do not try to fix these)
- `../third_party/electron_node/test/` — Node.js test files (where patches apply)
- `patches/node/fix_crypto_tests_to_run_with_bssl.patch` — BoringSSL crypto test adaptations
- `patches/node/test_formally_mark_some_tests_as_flaky.patch` — Flaky test list
## Workflow
1. Run `node script/node-spec-runner.js --default` from the electron repo
2. If all tests pass → Phase Three is complete
3. If tests fail:
- Identify the failing test file(s) from the output
- Analyze each failure (see "Common Failure Patterns" below)
- Fix the test in `../third_party/electron_node/test/...`
- Re-run the specific failing test to verify: `node script/node-spec-runner.js {test-path}`
- The test path is relative to the node `test/` directory, e.g. `test/parallel/test-crypto-key-objects-raw.js`
- Do NOT use `--default` when running specific tests — it adds the full suite flags
- Do NOT run tests directly with `ELECTRON_RUN_AS_NODE` — the runner handles environment setup (e.g. temporarily switching `package.json` from ESM to CommonJS)
- Commit the fix using the fixup workflow and commit guidelines
- Return to step 1
## Commands Reference
| Command | Purpose |
|---------|---------|
| `node script/node-spec-runner.js --default` | Run full Node.js test suite |
| `node script/node-spec-runner.js test/parallel/test-foo.js` | Run a single test |
| `NODE_REGENERATE_SNAPSHOTS=1 node script/node-spec-runner.js test/test-runner/test-foo.mjs` | Regenerate snapshot for a snapshot-based test |
## Common Failure Patterns
### BoringSSL incompatibilities
Electron uses BoringSSL (via Chromium) instead of OpenSSL. Many crypto features are missing or behave differently:
| Unsupported in BoringSSL | Guard pattern |
|--------------------------|---------------|
| ChaCha20-Poly1305 | `if (!process.features.openssl_is_boringssl)` |
| AES-CCM (aes-128-ccm, aes-256-ccm) | `if (ciphers.includes('aes-128-ccm'))` |
| AES-KW (key wrapping) | `if (!process.features.openssl_is_boringssl)` |
| DSA keys | `if (!process.features.openssl_is_boringssl)` |
| Ed448 / X448 curves | `if (!process.features.openssl_is_boringssl)` |
| DH key PEM loading | `if (!process.features.openssl_is_boringssl)` |
| PQC algorithms (ML-KEM, ML-DSA, SLH-DSA) | `if (hasOpenSSL(3, 5))` (already guards these) |
When guarding tests, prefer checking cipher availability (`ciphers.includes(algo)`) over blanket BoringSSL checks where possible, as it's more precise and self-documenting.
New upstream tests that exercise these features will need guards added to the `fix_crypto_tests_to_run_with_bssl` patch.
### Snapshot test mismatches
Some tests compare output against committed `.snapshot` files using `assert.strictEqual` — these are NOT wildcard comparisons. When Chromium's V8 produces different output (e.g. different stack traces due to V8 enhancements), the snapshot must be regenerated:
```bash
NODE_REGENERATE_SNAPSHOTS=1 node script/node-spec-runner.js test/test-runner/test-foo.mjs
```
Then inspect the diff to verify the changes are expected, and commit the updated snapshot into the appropriate patch.
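One way to do that inspection before committing (a sketch; `review_snapshots` is a hypothetical helper, and the pathspec assumes regenerated files keep the `.snapshot` extension):

```shell
# Run from ../third_party/electron_node: show which snapshot files the
# regeneration touched, then the actual output differences.
review_snapshots() {
  git diff --stat -- '*.snapshot'
  git diff -- '*.snapshot'
}
```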
### V8 behavioral differences
Chromium's V8 may be ahead of Node.js's bundled V8. This can cause:
- Different stack trace formats (e.g. thenable async stack frames)
- Different error messages
- Features available in Chromium V8 that aren't in stock Node.js V8 (or vice versa)
## Two Types of Test Fixes
### A. Patch Fixes (most common for test failures)
Most test fixes go into existing patches in `patches/node/`. Use the fixup workflow:
1. Edit the test file in `../third_party/electron_node/test/...`
2. Find the relevant patch commit: `git log --oneline | grep -i "keyword"`
- Crypto/BoringSSL tests → `fix crypto tests to run with bssl`
- Snapshot tests → the specific snapshot patch (e.g. `test: accomodate V8 thenable`)
- Flaky tests → `test: formally mark some tests as flaky`
3. Create a fixup commit:
```bash
cd ../third_party/electron_node
git add test/path/to/test.js
git commit --fixup=<patch-commit-hash>
GIT_SEQUENCE_EDITOR=: git rebase --autosquash --autostash -i <commit>^
```
4. Export: `e patches node`
5. **Read `references/phase-three-commit-guidelines.md` NOW**, then commit the updated patch file.
### B. New Patches (rare)
Only create a new patch when the fix doesn't belong in any existing patch. The new patch commit in `../third_party/electron_node` must include a description explaining why the patch exists and when it can be removed — the lint check enforces this.
## Adding to Disabled Tests
Only add a test to `script/node-disabled-tests.json` as a **last resort** — when the test is fundamentally incompatible with Electron's architecture (not just a BoringSSL difference that can be guarded). Tests disabled here are completely skipped and never run.
# Critical: Read Before Committing
- Before ANY Phase One commits: Read `references/phase-one-commit-guidelines.md`
- Before ANY Phase Two commits: Read `references/phase-two-commit-guidelines.md`
- Before ANY Phase Three commits: Read `references/phase-three-commit-guidelines.md`
# High-Churn Patches
These patches consistently require the most work during Node.js upgrades:
- **`fix_handle_boringssl_and_openssl_incompatibilities.patch`** — Electron uses BoringSSL (via Chromium) while Node.js expects OpenSSL. This patch is large and complex, and upstream OpenSSL API changes frequently break it.
- **`fix_crypto_tests_to_run_with_bssl.patch`** — Companion to the above; adapts Node.js crypto tests for BoringSSL. Can grow significantly during major upgrades.
- **`support_v8_sandboxed_pointers.patch`** — V8 sandbox pointer support requires careful adaptation when V8 APIs change.
- **`build_add_gn_build_files.patch`** — The GN build file patch is large and touches many build targets. Upstream build system changes frequently conflict.
# Major Version Upgrades
Major Node.js version transitions (e.g., v22 → v24) are significantly more involved than patch bumps:
1. **Expect patch deletions.** Electron uses Chromium's V8, which is often ahead of the V8 version bundled in Node.js. Many patches exist to bridge this gap — shimming newer V8 APIs that Chromium's V8 has but Node.js' older V8 doesn't. When Node.js bumps to a newer major version, its V8 catches up to Chromium's, and those bridge patches can be deleted. In the v22 → v24 upgrade, 17 patches were deleted for this reason.
2. **Update `@types/node`** in `package.json` to match the new major version.
3. **Post-upgrade regressions are expected.** Even after the upgrade lands, follow-up fix PRs for edge cases (ESM path handling, certificate loading, platform-specific issues) are normal.
# Skill Directory Structure
This skill has additional reference files in `references/`:
- patch-analysis.md - How to analyze patch failures
- phase-one-commit-guidelines.md - Commit format for Phase One
- phase-two-commit-guidelines.md - Commit format for Phase Two
- phase-three-commit-guidelines.md - Commit format for Phase Three
Read these when referenced in the workflow steps.

View File

@@ -0,0 +1,112 @@
# Analyzing Patch Failures
## Investigation Steps
1. **Read the patch file** at `patches/node/{patch_name}.patch`
2. **Examine current state** of the file in the Node.js repo at mentioned line numbers
3. **Check recent upstream changes:**
```bash
cd ../third_party/electron_node
git log --oneline -10 -- {file}
```
4. **Find Node.js PR** in commit messages:
```
PR-URL: https://github.com/nodejs/node/pull/{PR_NUMBER}
```
## Critical: Resolve by Intent, Not by Mechanical Merge
When resolving a patch conflict, do NOT blindly preserve the patch's old code. Instead:
1. **Understand the upstream commit's full scope** — not just the conflicting hunk.
Run `git show <commit> --stat` and read diffs for all affected files.
Upstream may have removed structs, members, or methods that the patch
references in other hunks or files.
2. **Re-read the patch commit message** to understand its *intent* — what
behavior does it need to preserve or add?
3. **Implement the intent against the new upstream code.** If the patch's
purpose is "add BoringSSL compatibility", add only the compatibility
layer — don't also restore old code that upstream separately removed.
### Lesson: Upstream Removals Break Patch References
- **Trigger:** Patch conflict involves an upstream refactor (not just context drift)
- **Strategy:** After identifying the upstream commit, check its full diff for
removed types, members, and methods. If the patch's old code references
something removed, the resolution must use the new upstream mechanism.
### Lesson: Separate Patch Purpose from Patch Implementation
- **Trigger:** Conflict between "upstream simplified code" vs "patch has older code"
- **Strategy:** Identify the *minimal* change the patch needs. If the patch
wraps code in a conditional, only add the conditional — don't restore old
code that was inside the conditional but was separately cleaned up upstream.
### Lesson: Finish the Adaptation at Conflict Time
- **Trigger:** A patch conflict involves an upstream API removal or replacement
- **Strategy:** When resolving the conflict, fully adapt the patch to use the
new API in the same commit. Don't remove the old code and leave behind stale
references that will "be fixed in Phase Two." Each patch fix commit should be
a complete resolution.
## Common Failure Patterns
| Pattern | Cause | Solution |
|---------|-------|----------|
| Context lines don't match | Surrounding code changed | Update context in patch |
| File not found | File renamed/moved | Update patch target path |
| Function not found | Refactored upstream | Find new function name |
| OpenSSL → BoringSSL mismatch | Crypto API change | Update to BoringSSL-compatible API |
| GYP/GN build change | Build system refactor | Adapt build patch to new structure |
| Deleted code | Feature removed | Verify patch still needed |
| V8 API bridge patch conflicts | Node.js caught up to Chromium's V8 | Patch may be deletable — verify the API is now in Node.js' V8 natively |
## Using Git Blame
To find the commit that changed specific lines:
```bash
cd ../third_party/electron_node
git blame -L {start},{end} -- {file}
git log -1 {commit_sha} # Look for PR-URL: line
```
## Verifying Patch Necessity
Before deleting a patch, verify:
1. The patched functionality was intentionally removed upstream
2. Electron doesn't need the patch for other reasons
3. No other code depends on the patched behavior
**V8 bridge patches:** Electron uses Chromium's V8, which is often ahead of the V8 bundled in Node.js. Many patches exist to bridge this version gap — adapting Node.js code to work with newer V8 APIs that Chromium's V8 exposes. During major Node.js upgrades, Node.js' V8 catches up to Chromium's, and these bridge patches often become unnecessary. Check whether the API the patch shims is now available natively in the new Node.js version's V8.
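One quick, non-authoritative first check is to grep the bundled V8 headers in the new Node.js checkout for the shimmed symbol (a sketch; `check_v8_api` is a hypothetical helper, and it assumes the headers live under `deps/v8/include` in a standard Node.js tree):

```shell
# If the symbol the bridge patch shims now appears in Node.js' bundled
# V8 headers, the patch is a deletion candidate; otherwise keep it.
check_v8_api() {
  symbol="$1"
  node_dir="${2:-../third_party/electron_node}"
  if grep -rqn "$symbol" "$node_dir/deps/v8/include"; then
    echo "found: $symbol"
  else
    echo "missing: $symbol"
  fi
}
```

A grep hit is only the starting point; still confirm the native API actually matches what the patch's shim provided.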
When in doubt, keep the patch and adapt it.
## Phase Two: Build-Time Patch Issues
Sometimes patches that applied successfully in Phase One cause build errors in Phase Two. This can happen when:
1. **Incomplete types**: A patch disables a header include, but new upstream code uses the type
2. **Missing members**: A patch modifies a class, but upstream added new code referencing the original
### Finding Which Patch Affects a File
```bash
grep -l "filename.cc" patches/node/*.patch
```
### Matching Existing Patch Patterns
When fixing build errors in patched files, examine the existing patch to understand its style:
- Does it use `#if 0` / `#endif` guards?
- Does it use `#if BUILDFLAG(...)` conditionals?
- Does it use `#ifndef` / `#ifdef` guards for BoringSSL vs OpenSSL?
- What's the pattern for disabled functionality?
Apply fixes consistent with the existing patch style.

View File

@@ -0,0 +1,111 @@
# Phase One Commit Guidelines
Only follow these instructions if there are uncommitted changes to `patches/` after Phase One succeeds.
Ignore other instructions about making commit messages, our guidelines are CRITICALLY IMPORTANT and must be followed.
## Each Commit Must Be Complete
When resolving a patch conflict, fully adapt the patch to the new upstream code in the same commit. If the upstream change removes an API the patch uses, update the patch to use the replacement API now — don't leave stale references knowing they'll need fixing later. The goal is that each commit represents a finished resolution, not a partial one that defers known work to a future phase.
## Commit Message Style
**Titles** follow the 60/80-character guideline: simple changes fit within 60 characters, otherwise the limit is 80 characters.
Always include a `Co-Authored-By` trailer identifying the AI model that assisted (e.g., `Co-Authored-By: <AI model attribution>`).
### Patch conflict fixes
Use `fix(patch):` prefix. The title should name the upstream change, not your response to it:
```
fix(patch): {topic headline}
Ref: {Node.js commit or issue link}
Co-Authored-By: <AI model attribution>
```
Only add a description body if it provides clarity beyond the title. For straightforward context drift or simple API renames, the title + Ref is sufficient.
Examples:
- `fix(patch): stop using v8::PropertyCallbackInfo<T>::This()`
- `fix(patch): BoringSSL and OpenSSL incompatibilities`
- `fix(patch): refactor module_wrap.cc FixedArray::Get params`
### Upstreamed patch removal
When patches are no longer needed (applied cleanly with "already applied" or confirmed upstreamed), group ALL removals into a single commit:
```
chore: remove upstreamed patch
```
or (if multiple):
```
chore: remove upstreamed patches
```
Most Node.js patches in Electron are Electron-authored (no upstream `PR-URL:`). If the patch originated from an upstream Node.js PR, no extra `Ref:` is needed. Otherwise, add a `Ref:` pointing to the relevant Node.js issue or commit if one exists.
### Trivial patch updates
After all fix commits, stage remaining trivial changes (index, line numbers, context only):
```bash
git add patches
git commit -m "chore: update patches (trivial only)"
```
**Conflict resolution can produce trivial results.** A `git am` conflict doesn't always mean the patch content changed — context drift alone can cause a conflict. After resolving and exporting, inspect the patch diff: if only index hashes, line numbers, and context lines changed (not the patch's own `+`/`-` lines), it's trivial and belongs here, not in a `fix(patch):` commit.
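That inspection can be roughly automated. A heuristic sketch (`is_trivial_patch_change` is a hypothetical helper): in a diff of a `.patch` file, edits to the patch's own `+`/`-` content lines surface as lines starting with `++`, `+-`, `-+`, or `--`, so their absence (ignoring the outer `+++`/`---` file headers) suggests a trivial-only change:

```shell
# Succeeds (exit 0) when the working-tree change to the given .patch
# file touches only index hashes, hunk headers, and context lines.
is_trivial_patch_change() {
  ! git diff -U0 -- "$1" \
      | grep -E '^(\+\+|\+-|-\+|--)' \
      | grep -vE '^(\+\+\+ |--- )' \
      | grep -q .
}
```

Treat it as a first pass only; still read the diff before filing the change under `chore: update patches (trivial only)`.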
## Atomic Commits
Each patch conflict fix gets its own commit with its own Ref.
IMPORTANT: Try really hard to find the PR or commit reference per the instructions below. Each change you made should in theory have been in response to a change made in Node.js that you identified or can identify. Try for a while to identify and include the ref in the commit message. Do not give up easily.
## Finding Commit/Issue References
Use `git log` or `git blame` on Node.js source files in `../third_party/electron_node`. Look for:
```
PR-URL: https://github.com/nodejs/node/pull/XXXXX
```
or issue references in the patch itself:
```
Refs: https://github.com/nodejs/node/issues/XXXXX
```
Note: Most Node.js patches in Electron are Electron-authored and won't have upstream references. In that case, check `git log` in the Node.js repo to find which upstream commit caused the conflict.
If no reference found after searching: `Ref: Unable to locate reference`
## Example Commits
### Patch conflict fix (simple — title is sufficient)
```
fix(patch): stop using v8::PropertyCallbackInfo<T>::This()
Ref: https://github.com/nodejs/node/issues/60616
Co-Authored-By: <AI model attribution>
```
### Patch conflict fix (complex — description adds value)
```
fix(patch): BoringSSL and OpenSSL incompatibilities
Upstream updated OpenSSL APIs that diverge from BoringSSL. Adapted
the compatibility shims in crypto patches to use the BoringSSL
equivalents.
Ref: Unable to locate reference
Co-Authored-By: <AI model attribution>
```

View File

@@ -0,0 +1,80 @@
# Phase Three Commit Guidelines
Only follow these instructions if there are uncommitted changes after fixing a test failure during Phase Three.
Ignore other instructions about making commit messages, our guidelines are CRITICALLY IMPORTANT and must be followed.
## Commit Message Style
**Titles** follow the 60/80-character guideline: simple changes fit within 60 characters, otherwise the limit is 80 characters.
Always include a `Co-Authored-By` trailer identifying the AI model that assisted (e.g., `Co-Authored-By: <AI model attribution>`).
## Commit Types
### Patch updates (most test fixes)
Test fixes go into existing patches via the fixup workflow. Use `fix(patch):` prefix with a descriptive topic:
```
fix(patch): {topic headline}
Ref: {Node.js commit or issue link}
Co-Authored-By: <AI model attribution>
```
Examples:
- `fix(patch): guard DH key test for BoringSSL`
- `fix(patch): adapt new crypto tests for BoringSSL`
- `fix(patch): correct thenable snapshot for Chromium V8`
- `fix(patch): skip AES-KW tests with BoringSSL`
Group related test fixes into a single commit when they address the same root cause (e.g., multiple crypto tests all needing BoringSSL guards for the same missing cipher). Don't create one commit per test file if they share the same fix pattern.
### Snapshot regeneration
When a snapshot test fails because Chromium's V8 produces different output, regenerate it:
```bash
NODE_REGENERATE_SNAPSHOTS=1 node script/node-spec-runner.js test/test-runner/test-foo.mjs
```
Then commit the updated snapshot patch with a title describing what changed:
```
fix(patch): correct {name} snapshot for Chromium V8
Ref: {V8 CL or issue link if known}
Co-Authored-By: <AI model attribution>
```
### Trivial patch updates
After any patch modification, check for dependent patches that only have index/hunk header changes:
```bash
git status
# If other .patch files show as modified with only trivial changes:
git add patches/
git commit -m "chore: update patches (trivial only)"
```
## Finding References
For BoringSSL-related test fixes, the reference is typically the upstream Node.js PR that added the new test:
```bash
cd ../third_party/electron_node
git log --oneline -5 -- test/parallel/test-crypto-foo.js
git log -1 <commit> --format="%B" | grep "PR-URL"
```
For V8 behavioral differences, reference the Chromium CL:
```
Ref: https://chromium-review.googlesource.com/c/v8/v8/+/NNNNNNN
```
If no reference found after searching: `Ref: Unable to locate reference`

View File

@@ -0,0 +1,96 @@
# Phase Two Commit Guidelines
Only follow these instructions if there are uncommitted changes in the Electron repo after Phase Two fixes that result in a previously failing target building successfully.
Ignore other instructions about making commit messages, our guidelines are CRITICALLY IMPORTANT and must be followed.
## Commit Message Style
**Titles** follow the 60/80-character guideline: simple changes fit within 60 characters, otherwise the limit is 80 characters. Exception: upstream Node.js PR titles are used verbatim even if longer.
Always include a `Co-Authored-By` trailer identifying the AI model that assisted (e.g., `Co-Authored-By: <AI model attribution>`).
## Two Commit Types
### For Electron Source Changes (shell/, electron/, etc.)
When the upstream Node.js commit has a `PR-URL:`:
```
node#{PR-Number}: {upstream PR's original title}
Ref: {Node.js PR link}
Co-Authored-By: <AI model attribution>
```
When there is no `PR-URL:` but there is an issue reference or commit:
```
fix: {description of the adaptation}
Ref: {Node.js issue or commit link}
Co-Authored-By: <AI model attribution>
```
Use the **upstream commit's original title** when available — do not paraphrase or rewrite it. To find it: check the commit message in `../third_party/electron_node` for `PR-URL:` or `Refs:` lines.
Only add a description body if it provides clarity beyond what the title already says (e.g., when Electron's adaptation is non-obvious). For simple renames, method additions, or straightforward API updates, the title + Ref link is sufficient.
Each change should have its own commit and its own Ref. Logically group into commits that make sense rather than one giant commit. You may include multiple "Ref" links if required.
IMPORTANT: Try really hard to find a reference. Each change you made should in theory have been in response to a change in Node.js. Check `git log` and `git blame` in the Node.js repo. Do not give up easily.
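A sketch of pulling the verbatim upstream title and any reference trailers out of the identified Node.js commit (`upstream_commit_info` is a hypothetical helper; pass it the SHA that `git log`/`git blame` turned up):

```shell
# Run from ../third_party/electron_node. Prints the upstream subject
# line (use it verbatim in the commit title) and any PR-URL/Refs
# trailers (use them for the Ref line).
upstream_commit_info() {
  git log -1 --format=%s "$1"
  git log -1 --format=%B "$1" | grep -E '^(PR-URL|Refs):' || true
}
```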
### For Patch Updates (patches/node/*.patch)
Use the same fixup workflow as Phase One and follow `references/phase-one-commit-guidelines.md` for the commit message format (`fix(patch):` prefix, topic style).
## Dependent Patch Header Updates
After any patch modification, check for other affected patches:
```bash
git status
# If other .patch files show as modified with only index, line number, and context changes:
git add patches/
git commit -m "chore: update patches (trivial only)"
```
## Finding References
Use `git log` or `git blame` on Node.js source files in `../third_party/electron_node`. Look for:
```
PR-URL: https://github.com/nodejs/node/pull/XXXXX
Refs: https://github.com/nodejs/node/issues/XXXXX
```
Note: Many Node.js patches in Electron are Electron-authored and won't have upstream `PR-URL:` lines. Check the patch's own commit message for `Refs:` lines, or use `git log` in the Node.js repo to find which upstream commit caused the build break.
If no reference found after searching: `Ref: Unable to locate reference`
## Example Commits
### Electron Source Fix (with upstream PR)
```
node#61898: src: stop using v8::PropertyCallbackInfo<T>::This()
Ref: https://github.com/nodejs/node/pull/61898
Co-Authored-By: <AI model attribution>
```
### Electron Source Fix (with issue reference, no PR)
```
fix: adapt to v8::PropertyCallbackInfo<T>::This() removal
Updated NodeBindings to use HolderV2() after upstream Node.js
stopped using the deprecated This() API.
Ref: https://github.com/nodejs/node/issues/60616
Co-Authored-By: <AI model attribution>
```

View File

@@ -1,14 +0,0 @@
name: Maintainer Issue (not for public use)
description: Only to be created by Electron maintainers
body:
- type: checkboxes
attributes:
label: Confirmation
options:
- label: I am a [maintainer](https://github.com/orgs/electron/people) of the Electron project. (If not, please create a [different issue type](https://github.com/electron/electron/issues/new/).)
required: true
- type: textarea
attributes:
label: Description
validations:
required: true

View File

@@ -6,14 +6,17 @@ the requirements below.
Contributors guide: https://github.com/electron/electron/blob/main/CONTRIBUTING.md
NOTE: PRS submitted without this template will be automatically closed.
Using a coding agent / AI? Read the policy: https://github.com/electron/governance/blob/main/policy/ai.md
NOTE: PRs submitted that do not follow this template will be automatically closed.
-->
#### Checklist
<!-- Remove items that do not apply. For completed items, change [ ] to [x]. -->
- [ ] PR description included
- [ ] I have built and tested this PR
- [ ] I have built and tested this change
- [ ] I have filled out the PR description
- [ ] [I have reviewed and verified the changes](https://github.com/electron/governance/blob/main/policy/ai.md)
- [ ] `npm test` passes
- [ ] tests are [changed or added](https://github.com/electron/electron/blob/main/docs/development/testing.md)
- [ ] relevant API documentation, tutorials, and examples are updated and follow the [documentation style guide](https://github.com/electron/electron/blob/main/docs/development/style-guide.md)

View File

@@ -101,12 +101,31 @@ runs:
git pack-refs
cd ..
# Pre-create the ThinLTO cache directory so lld-link does not need to
# call CreateDirectoryW through the bindflt filter driver, which can
# return ERROR_INVALID_PARAMETER under concurrent I/O on ARC runners.
# Discover the path from GN instead of hardcoding it so we stay in
# sync with `cache_dir` in build/config/compiler/BUILD.gn; skip the
# pre-create when ThinLTO is disabled (non-official builds).
$env:ELECTRON_DEPOT_TOOLS_DISABLE_LOG = "1"
$ltoFlag = e d gn desc out/Default //electron:electron_app ldflags 2>$null |
Select-String -Pattern '^/lldltocache:(.+)$' |
Select-Object -First 1
if ($ltoFlag) {
$cachePath = Join-Path 'out\Default' $ltoFlag.Matches[0].Groups[1].Value
New-Item -ItemType Directory -Force -Path $cachePath | Out-Null
}
$env:NINJA_SUMMARIZE_BUILD = 1
if ("${{ inputs.is-release }}" -eq "true") {
e build --target electron:release_build
} else {
e build --target electron:testing_build
}
if ($LASTEXITCODE -ne 0) {
Write-Host "e build failed with exit code $LASTEXITCODE"
exit $LASTEXITCODE
}
Copy-Item out\Default\.ninja_log out\electron_ninja_log
node electron\script\check-symlinks.js

View File

@@ -0,0 +1,24 @@
name: 'Build Image SHA'
description: 'Single source of truth for the ghcr.io/electron/build image SHA'
inputs:
override:
description: 'Optional override SHA (e.g. from a workflow_dispatch input)'
required: false
default: ''
outputs:
build-image-sha:
description: 'The electron/build image SHA to use'
value: ${{ steps.set.outputs.build-image-sha }}
runs:
using: 'composite'
steps:
- id: set
shell: bash
env:
OVERRIDE: ${{ inputs.override }}
run: |
if [ -n "$OVERRIDE" ]; then
echo "build-image-sha=$OVERRIDE" >> "$GITHUB_OUTPUT"
else
echo "build-image-sha=daad061f4b99a0ae1c841be4aa09188280a9c8a4" >> "$GITHUB_OUTPUT"
fi

View File

@@ -133,7 +133,7 @@ runs:
run : |
cd src/third_party/angle
rm -f .git/objects/info/alternates
git remote set-url origin https://chromium.googlesource.com/angle/angle.git
git remote set-url origin https://github.com/google/angle.git
cp .git/config .git/config.backup
git remote remove origin
mv .git/config.backup .git/config

View File

@@ -15,7 +15,7 @@ runs:
git config --global core.preloadindex true
git config --global core.longpaths true
fi
export BUILD_TOOLS_SHA=a0cc95a1884a631559bcca0c948465b725d9295a
export BUILD_TOOLS_SHA=1b7bd25dae4a780bb3170fff56c9327b53aaf7eb
npm i -g @electron/build-tools
# Update depot_tools to ensure python
e d update_depot_tools
@@ -29,4 +29,4 @@ runs:
else
echo "$HOME/.electron_build_tools/third_party/depot_tools" >> $GITHUB_PATH
echo "$HOME/.electron_build_tools/third_party/depot_tools/python-bin" >> $GITHUB_PATH
fi
fi

View File

@@ -21,11 +21,28 @@ runs:
if [ "$TARGET_ARCH" = "x86" ]; then
export npm_config_arch="ia32"
fi
# if running on linux arm skip yarn Builds
ARCH=$(uname -m)
node script/yarn.js install --immutable --mode=skip-build
# if running on linux arm skip yarn Builds
if [ "$ARCH" = "armv7l" ]; then
echo "Skipping yarn build on linux arm"
node script/yarn.js install --immutable --mode=skip-build
else
# Pre-seed the node-gyp header cache so the parallel native-addon
# builds below don't race on a cold cache. Linux build containers
# already ship a warm cache (electron/build-images#68), so only do
# this on macOS / Windows runners.
if [ "$(uname -s)" != "Linux" ]; then
for i in 1 2 3; do
if node node_modules/node-gyp/bin/node-gyp.js install; then
break
fi
if [ "$i" = "3" ]; then
echo "node-gyp header pre-seed failed after 3 attempts" >&2
exit 1
fi
echo "node-gyp header pre-seed failed (attempt $i), retrying in 5s..." >&2
sleep 5
done
fi
node script/yarn.js install --immutable
fi

View File

@@ -5,7 +5,7 @@
"fromPath": "src/out/Default/args.gn",
"pattern": [
{
"regexp": "^(.+)[(:](\\d+)[:,](\\d+)\\)?:\\s+(warning|error):\\s+(.*)$",
"regexp": "^(.+)[(:](\\d+)[:,](\\d+)\\)?:\\s+(warning|fatal error|error):\\s+(.*)$",
"file": 1,
"line": 2,
"column": 3,

View File

@@ -0,0 +1,47 @@
From 85b561ea4dbc76ba98af020b970f3aa6b20fdb9e Mon Sep 17 00:00:00 2001
From: Samuel Attard <sam@electronjs.org>
Date: Wed, 8 Apr 2026 23:24:15 -0700
Subject: [PATCH] siso: reuse the outer *os.File for chunked ReadAt in
fileParser.readFile
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
The per-chunk goroutine currently re-opens fname to get its own handle
for ReadAt. (*os.File).ReadAt is documented as safe for concurrent
calls on the same File (on Windows it is ReadFile with an OVERLAPPED
offset, so there is no shared seek state), so the extra open is
redundant — the goroutines can share the outer f.
Besides halving the CreateFileW calls per subninja, this avoids an
intermittent 'The parameter is incorrect.' (ERROR_INVALID_PARAMETER)
from bindflt.sys when out/ is a mapped directory inside a Windows
container: bindflt's handle-relative NtCreateFile path races when a
second relative open arrives while the first handle to the same target
is still being set up. Absolute paths and single opens do not trigger
it; see microsoft/Windows-Containers#<tbd>.
---
siso/toolsupport/ninjautil/file_parser.go | 7 -------
1 file changed, 7 deletions(-)
diff --git a/siso/toolsupport/ninjautil/file_parser.go b/siso/toolsupport/ninjautil/file_parser.go
index 8c18d084..63116662 100644
--- a/siso/toolsupport/ninjautil/file_parser.go
+++ b/siso/toolsupport/ninjautil/file_parser.go
@@ -111,13 +111,6 @@ func (p *fileParser) readFile(ctx context.Context, fname string) ([]byte, error)
eg.Go(func() error {
p.sema <- struct{}{}
defer func() { <-p.sema }()
- f, err := os.Open(fname)
- if err != nil {
- return err
- }
- defer func() {
- _ = f.Close()
- }()
for len(chunkBuf) > 0 {
n, err := f.ReadAt(chunkBuf, pos)
if err != nil {
--
2.53.0

View File

@@ -0,0 +1,132 @@
From a8afee1089ec2ae9ab5837b438d07338aefb3bc4 Mon Sep 17 00:00:00 2001
From: Samuel Attard <sam@electronjs.org>
Date: Wed, 22 Apr 2026 16:27:51 -0700
Subject: [PATCH] siso: retry transient ERROR_INVALID_PARAMETER when opening
ninja files on Windows
ManifestParser.Load fans out across all subninja files (~90k in a
Chromium build) at NumCPU parallelism. On Windows builders where out/
is served through a filesystem filter driver (e.g. bindflt/wcifs for
container bind mounts), CreateFileW can intermittently return
ERROR_INVALID_PARAMETER under this concurrent open burst. The previous
patch removes the redundant per-chunk re-open, but the single remaining
open per file can still hit the race; without a retry a single transient
failure aborts the entire manifest load.
Wrap the remaining os.Open call in readFile in a small Windows-only
retry for ERROR_INVALID_PARAMETER (5 attempts, 5-80ms backoff). Each
retry is logged via clog.Warningf and also written to stderr so it is
visible in CI step output where glog warnings are file-only by default.
Other platforms keep the direct os.Open path.
---
siso/toolsupport/ninjautil/file_parser.go | 3 +-
siso/toolsupport/ninjautil/openfile_other.go | 18 +++++++
.../toolsupport/ninjautil/openfile_windows.go | 50 +++++++++++++++++++
3 files changed, 69 insertions(+), 2 deletions(-)
create mode 100644 siso/toolsupport/ninjautil/openfile_other.go
create mode 100644 siso/toolsupport/ninjautil/openfile_windows.go
diff --git a/siso/toolsupport/ninjautil/file_parser.go b/siso/toolsupport/ninjautil/file_parser.go
index 6311666..324528d 100644
--- a/siso/toolsupport/ninjautil/file_parser.go
+++ b/siso/toolsupport/ninjautil/file_parser.go
@@ -7,7 +7,6 @@ package ninjautil
import (
"context"
"fmt"
- "os"
"runtime/trace"
"sync"
"time"
@@ -91,7 +90,7 @@ func (p *fileParser) parseFile(ctx context.Context, fname string) error {
// readFile reads a file of fname in parallel.
func (p *fileParser) readFile(ctx context.Context, fname string) ([]byte, error) {
defer trace.StartRegion(ctx, "ninja.read").End()
- f, err := os.Open(fname)
+ f, err := openFile(ctx, fname)
if err != nil {
return nil, err
}
diff --git a/siso/toolsupport/ninjautil/openfile_other.go b/siso/toolsupport/ninjautil/openfile_other.go
new file mode 100644
index 0000000..9fca690
--- /dev/null
+++ b/siso/toolsupport/ninjautil/openfile_other.go
@@ -0,0 +1,18 @@
+// Copyright 2026 The Chromium Authors
+// Use of this source code is governed by a BSD-style license that can be
+// found in the LICENSE file.
+
+//go:build !windows
+
+package ninjautil
+
+import (
+ "context"
+ "os"
+)
+
+// openFile opens fname for reading.
+// See openfile_windows.go for the Windows variant with transient-error retry.
+func openFile(ctx context.Context, fname string) (*os.File, error) {
+ return os.Open(fname)
+}
diff --git a/siso/toolsupport/ninjautil/openfile_windows.go b/siso/toolsupport/ninjautil/openfile_windows.go
new file mode 100644
index 0000000..f9d8e9d
--- /dev/null
+++ b/siso/toolsupport/ninjautil/openfile_windows.go
@@ -0,0 +1,50 @@
+// Copyright 2026 The Chromium Authors
+// Use of this source code is governed by a BSD-style license that can be
+// found in the LICENSE file.
+
+//go:build windows
+
+package ninjautil
+
+import (
+ "context"
+ "errors"
+ "fmt"
+ "os"
+ "time"
+
+ "golang.org/x/sys/windows"
+
+ "go.chromium.org/build/siso/o11y/clog"
+)
+
+// openFile opens fname for reading, retrying transient
+// ERROR_INVALID_PARAMETER failures.
+//
+// On Windows, CreateFileW can intermittently return
+// ERROR_INVALID_PARAMETER when the target lives behind a filesystem
+// filter driver (e.g. bindflt/wcifs for container bind mounts) under
+// highly concurrent opens. loadFile fans out across ~90k subninja
+// files at NumCPU parallelism, so a single transient failure would
+// otherwise abort the whole manifest load.
+func openFile(ctx context.Context, fname string) (*os.File, error) {
+ const maxAttempts = 5
+ delay := 5 * time.Millisecond
+ for i := 0; ; i++ {
+ f, err := os.Open(fname)
+ if err == nil {
+ return f, nil
+ }
+ if i+1 >= maxAttempts || !errors.Is(err, windows.ERROR_INVALID_PARAMETER) {
+ return nil, err
+ }
+ clog.Warningf(ctx, "open %s: %v; retrying (%d/%d) after %s", fname, err, i+1, maxAttempts, delay)
+ fmt.Fprintf(os.Stderr, "siso: open %s: %v; retrying (%d/%d) after %s\n", fname, err, i+1, maxAttempts, delay)
+ select {
+ case <-time.After(delay):
+ case <-ctx.Done():
+ return nil, context.Cause(ctx)
+ }
+ delay *= 2
+ }
+}
--
2.53.0

View File

@@ -18,6 +18,7 @@ jobs:
pull-requests: read
outputs:
has-patches: ${{ steps.filter.outputs.patches }}
build-image-sha: ${{ steps.build-image-sha.outputs.build-image-sha }}
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd
with:
@@ -33,6 +34,9 @@ jobs:
patches:
- DEPS
- 'patches/**'
- name: Set Build Image SHA
id: build-image-sha
uses: ./.github/actions/build-image-sha
apply-patches:
needs: setup
@@ -41,7 +45,7 @@ jobs:
permissions:
contents: read
container:
image: ghcr.io/electron/build:eac3529546ea8f3aa356d31e345715eef342233b
image: ghcr.io/electron/build:${{ needs.setup.outputs.build-image-sha }}
options: --user root
volumes:
- /mnt/cross-instance-cache:/mnt/cross-instance-cache
@@ -73,7 +77,7 @@ jobs:
target-platform: linux
- name: Upload Patch Conflict Fix
if: ${{ failure() }}
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
uses: actions/upload-artifact@043fb46d1a93c77aae656e7c1c64a875d1fc6a0a # v7.0.1
with:
name: update-patches
path: patches/update-patches.patch

View File

@@ -17,7 +17,7 @@ jobs:
with:
fetch-depth: 0
- name: Setup Node.js/npm
uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f
uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e
with:
node-version: 24.12.x
- name: Setting Up Dig Site
@@ -49,7 +49,7 @@ jobs:
sha-file: .dig-old
filename: electron.old.d.ts
- name: Upload artifacts
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f #v7.0.0
uses: actions/upload-artifact@043fb46d1a93c77aae656e7c1c64a875d1fc6a0a #v7.0.1
with:
name: artifacts
path: electron/artifacts

View File

@@ -17,7 +17,7 @@ jobs:
contents: read
steps:
- name: Setup Node.js
uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f # v6.3.0
uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v6.4.0
with:
node-version: 22.17.x
- name: Sparse checkout repository
@@ -28,7 +28,7 @@ jobs:
.github
.yarn
- run: yarn workspaces focus @electron/gha-workflows
- uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
- uses: actions/github-script@3a2844b7e9c422d3c10d287c895573f7108da1b3 # v9.0.0
id: audit-errors
with:
github-token: ${{ secrets.GITHUB_TOKEN }}
@@ -87,6 +87,8 @@ jobs:
!message.startsWith("The hosted runner lost communication with the server") &&
!message.startsWith("Dependabot encountered an error performing the update") &&
!message.startsWith("The action 'Run Electron Tests' has timed out") &&
!message.startsWith("The operation was canceled") &&
!message.startsWith("Canceling since") &&
!/Unable to make request/.test(message) &&
!/The requested URL returned error/.test(message),
)

View File

@@ -105,7 +105,7 @@ jobs:
org: electron
- name: Generate Release Project Board Metadata
if: ${{ steps.check-major-version.outputs.MAJOR }}
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
uses: actions/github-script@3a2844b7e9c422d3c10d287c895573f7108da1b3 # v9.0.0
id: generate-project-metadata
env:
MAJOR: ${{ steps.check-major-version.outputs.MAJOR }}
@@ -157,7 +157,7 @@ jobs:
}))
- name: Create Release Project Board
if: ${{ steps.check-major-version.outputs.MAJOR }}
uses: dsanders11/project-actions/copy-project@5767984408ccc6742f83acc8b8d8ea5e09f329af # v2.0.0
uses: dsanders11/project-actions/copy-project@4b06452b0128cf601dac14399aa668a8eed2d684 # v2.0.1
id: create-release-board
with:
drafts: true
@@ -170,6 +170,60 @@ jobs:
template-view: ${{ steps.generate-project-metadata.outputs.template-view }}
title: ${{ steps.generate-project-metadata.outputs.major }}-x-y
token: ${{ steps.generate-token.outputs.token }}
- name: Randomly Assign Draft Issues to Release WG Members
if: ${{ steps.check-major-version.outputs.MAJOR }}
uses: dsanders11/project-actions/github-script@4b06452b0128cf601dac14399aa668a8eed2d684 # v2.0.1
env:
PROJECT_ID: ${{ steps.create-release-board.outputs.id }}
with:
token: ${{ steps.generate-token.outputs.token }}
script: |
const { data: members } = await github.rest.teams.listMembersInOrg({
org: 'electron',
team_slug: 'wg-releases',
});
const excludedLogins = ['nikwen'];
const memberLogins = new Set(members.map(m => m.login));
for (const login of excludedLogins) {
if (!memberLogins.has(login)) {
core.warning(`Excluded member "${login}" is not in @electron/wg-releases`);
}
}
const eligible = members.filter(m => !excludedLogins.includes(m.login));
if (eligible.length === 0) {
core.warning('No eligible members found in @electron/wg-releases team');
return;
}
const projectId = process.env.PROJECT_ID;
const draftIssues = await actions.getDraftIssues(projectId);
if (draftIssues.length === 0) {
core.info('No draft issues found in the project');
return;
}
// Fisher-Yates shuffle for uniform random assignment
const shuffled = [...eligible];
for (let i = shuffled.length - 1; i > 0; i--) {
const j = Math.floor(Math.random() * (i + 1));
[shuffled[i], shuffled[j]] = [shuffled[j], shuffled[i]];
}
// Assign draft issues round-robin across team members
for (let i = 0; i < draftIssues.length; i++) {
const member = shuffled[i % shuffled.length];
const draftIssue = draftIssues[i];
core.info(`Assigning "${draftIssue.content.title}" to ${member.login}`);
await actions.editItem(projectId, draftIssue.content.id, {
assignees: [member.login],
});
}
core.info(`Assigned ${draftIssues.length} draft issues to ${eligible.length} team members`);
- name: Dump Release Project Board Contents
if: ${{ steps.check-major-version.outputs.MAJOR }}
run: gh project item-list ${{ steps.create-release-board.outputs.number }} --owner electron --format json | jq
@@ -177,7 +231,7 @@ jobs:
GITHUB_TOKEN: ${{ steps.generate-token.outputs.token }}
- name: Find Previous Release Project Board
if: ${{ steps.check-major-version.outputs.MAJOR }}
uses: dsanders11/project-actions/find-project@5767984408ccc6742f83acc8b8d8ea5e09f329af # v2.0.0
uses: dsanders11/project-actions/find-project@4b06452b0128cf601dac14399aa668a8eed2d684 # v2.0.1
id: find-prev-release-board
with:
fail-if-project-not-found: false
@@ -185,7 +239,7 @@ jobs:
token: ${{ steps.generate-token.outputs.token }}
- name: Close Previous Release Project Board
if: ${{ steps.find-prev-release-board.outputs.number }}
uses: dsanders11/project-actions/close-project@5767984408ccc6742f83acc8b8d8ea5e09f329af # v2.0.0
uses: dsanders11/project-actions/close-project@4b06452b0128cf601dac14399aa668a8eed2d684 # v2.0.1
with:
project-number: ${{ steps.find-prev-release-board.outputs.number }}
token: ${{ steps.generate-token.outputs.token }}

View File

@@ -2,25 +2,38 @@ name: Build Git Cache
# This workflow updates git cache on the cross-instance cache volumes
# It runs daily at midnight.
on:
on:
schedule:
- cron: "0 0 * * *"
permissions: {}
jobs:
build-git-cache-linux:
setup:
if: github.repository == 'electron/electron'
runs-on: ubuntu-slim
permissions:
contents: read
outputs:
build-image-sha: ${{ steps.build-image-sha.outputs.build-image-sha }}
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd
- name: Set Build Image SHA
id: build-image-sha
uses: ./.github/actions/build-image-sha
build-git-cache-linux:
needs: setup
runs-on: electron-arc-centralus-linux-amd64-32core
permissions:
contents: read
container:
image: ghcr.io/electron/build:eac3529546ea8f3aa356d31e345715eef342233b
image: ghcr.io/electron/build:${{ needs.setup.outputs.build-image-sha }}
options: --user root
volumes:
- /mnt/cross-instance-cache:/mnt/cross-instance-cache
env:
CHROMIUM_GIT_COOKIE: ${{ secrets.CHROMIUM_GIT_COOKIE }}
CHROMIUM_GIT_COOKIE: ${{ secrets.CHROMIUM_GIT_COOKIE }}
GCLIENT_EXTRA_ARGS: '--custom-var=checkout_arm=True --custom-var=checkout_arm64=True'
steps:
- name: Checkout Electron
@@ -34,12 +47,12 @@ jobs:
target-platform: linux
build-git-cache-windows:
if: github.repository == 'electron/electron'
needs: setup
runs-on: electron-arc-centralus-linux-amd64-32core
permissions:
contents: read
container:
image: ghcr.io/electron/build:eac3529546ea8f3aa356d31e345715eef342233b
image: ghcr.io/electron/build:${{ needs.setup.outputs.build-image-sha }}
options: --user root --device /dev/fuse --cap-add SYS_ADMIN
volumes:
- /mnt/win-cache:/mnt/win-cache
@@ -59,14 +72,13 @@ jobs:
target-platform: win
build-git-cache-macos:
if: github.repository == 'electron/electron'
# This job updates the same git cache as linux, so it needs to run after the linux one.
needs: [setup, build-git-cache-linux]
runs-on: electron-arc-centralus-linux-amd64-32core
permissions:
contents: read
# This job updates the same git cache as linux, so it needs to run after the linux one.
needs: build-git-cache-linux
container:
image: ghcr.io/electron/build:eac3529546ea8f3aa356d31e345715eef342233b
image: ghcr.io/electron/build:${{ needs.setup.outputs.build-image-sha }}
options: --user root
volumes:
- /mnt/cross-instance-cache:/mnt/cross-instance-cache
@@ -82,4 +94,4 @@ jobs:
- name: Build Git Cache
uses: ./src/electron/.github/actions/build-git-cache
with:
target-platform: macos
target-platform: macos

View File

@@ -6,8 +6,8 @@ on:
build-image-sha:
type: string
description: 'SHA for electron/build image'
default: 'eac3529546ea8f3aa356d31e345715eef342233b'
required: true
default: ''
required: false
skip-macos:
type: boolean
description: 'Skip macOS builds'
@@ -48,14 +48,14 @@ permissions: {}
jobs:
setup:
if: github.repository == 'electron/electron'
runs-on: ubuntu-latest
runs-on: ubuntu-slim
permissions:
contents: read
pull-requests: read
outputs:
docs: ${{ steps.filter.outputs.docs }}
src: ${{ steps.filter.outputs.src }}
build-image-sha: ${{ steps.set-output.outputs.build-image-sha }}
build-image-sha: ${{ steps.build-image-sha.outputs.build-image-sha }}
docs-only: ${{ steps.set-output.outputs.docs-only }}
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd
@@ -67,20 +67,21 @@ jobs:
filters: |
docs:
- 'docs/**'
- '.claude/**'
- README.md
- SECURITY.md
- CONTRIBUTING.md
- CODE_OF_CONDUCT.md
src:
- '!docs/**'
- name: Set Outputs for Build Image SHA & Docs Only
- '!{docs,.claude}/**'
- name: Set Build Image SHA
id: build-image-sha
uses: ./.github/actions/build-image-sha
with:
override: ${{ inputs.build-image-sha }}
- name: Set Docs Only
id: set-output
run: |
if [ -z "${{ inputs.build-image-sha }}" ]; then
echo "build-image-sha=eac3529546ea8f3aa356d31e345715eef342233b" >> "$GITHUB_OUTPUT"
else
echo "build-image-sha=${{ inputs.build-image-sha }}" >> "$GITHUB_OUTPUT"
fi
echo "docs-only=${{ steps.filter.outputs.docs == 'true' && steps.filter.outputs.src == 'false' }}" >> "$GITHUB_OUTPUT"
# Lint Jobs
@@ -200,6 +201,15 @@ jobs:
generate-sas-token: 'true'
target-platform: win
# Build a patched siso binary for Windows CI in parallel with checkout-windows.
# The Windows build jobs download the resulting artifact and use it via SISO_PATH.
build-siso-windows:
needs: setup
if: ${{ needs.setup.outputs.src == 'true' && !inputs.skip-windows }}
uses: ./.github/workflows/pipeline-segment-build-siso-windows.yml
permissions:
contents: read
# GN Check Jobs
macos-gn-check:
uses: ./.github/workflows/pipeline-segment-electron-gn-check.yml
@@ -265,10 +275,11 @@ jobs:
contents: read
issues: read
pull-requests: read
uses: ./.github/workflows/pipeline-electron-build-and-test.yml
uses: ./.github/workflows/pipeline-electron-build-and-tidy-and-test.yml
needs: checkout-macos
with:
build-runs-on: macos-15-xlarge
clang-tidy-runs-on: macos-15-large
test-runs-on: macos-15
target-platform: macos
target-arch: arm64
@@ -383,12 +394,14 @@ jobs:
contents: read
issues: read
pull-requests: read
uses: ./.github/workflows/pipeline-electron-build-and-test.yml
needs: checkout-windows
uses: ./.github/workflows/pipeline-electron-build-and-tidy-and-test.yml
needs: [checkout-windows, build-siso-windows]
if: ${{ needs.setup.outputs.src == 'true' && !inputs.skip-windows }}
with:
build-runs-on: electron-arc-centralus-windows-amd64-16core
build-runs-on: electron-arc-centralus-windows-amd64-32core
clang-tidy-runs-on: electron-arc-centralus-linux-amd64-8core
test-runs-on: windows-latest
clang-tidy-container: '{"image":"ghcr.io/electron/build:${{ needs.checkout-windows.outputs.build-image-sha }}","options":"--user root --device /dev/fuse --cap-add SYS_ADMIN","volumes":["/mnt/win-cache:/mnt/win-cache"]}'
target-platform: win
target-arch: x64
is-release: false
@@ -403,10 +416,10 @@ jobs:
issues: read
pull-requests: read
uses: ./.github/workflows/pipeline-electron-build-and-test.yml
needs: checkout-windows
needs: [checkout-windows, build-siso-windows]
if: ${{ needs.setup.outputs.src == 'true' && !inputs.skip-windows }}
with:
build-runs-on: electron-arc-centralus-windows-amd64-16core
build-runs-on: electron-arc-centralus-windows-amd64-32core
test-runs-on: windows-latest
target-platform: win
target-arch: x86
@@ -422,10 +435,10 @@ jobs:
issues: read
pull-requests: read
uses: ./.github/workflows/pipeline-electron-build-and-test.yml
needs: checkout-windows
needs: [checkout-windows, build-siso-windows]
if: ${{ needs.setup.outputs.src == 'true' && !inputs.skip-windows }}
with:
build-runs-on: electron-arc-centralus-windows-amd64-16core
build-runs-on: electron-arc-centralus-windows-amd64-32core
test-runs-on: windows-11-arm
target-platform: win
target-arch: arm64
@@ -440,36 +453,12 @@ jobs:
runs-on: ubuntu-latest
permissions:
contents: read
needs: [docs-only, macos-x64, macos-arm64, linux-x64, linux-x64-asan, linux-arm, linux-arm64, windows-x64, windows-x86, windows-arm64]
if: always() && github.repository == 'electron/electron' && !contains(needs.*.result, 'failure')
steps:
needs: [docs-only, macos-x64, macos-arm64, linux-x64, linux-x64-asan, linux-arm, linux-arm64, build-siso-windows, windows-x64, windows-x86, windows-arm64]
if: always() && github.repository == 'electron/electron'
steps:
- name: Fail if any needed job failed or was cancelled
if: contains(needs.*.result, 'failure') || contains(needs.*.result, 'cancelled')
run: exit 1
- name: GitHub Actions Jobs Done
run: |
echo "All GitHub Actions Jobs are done"
check-signed-commits:
name: Check signed commits in green PR
needs: gha-done
if: ${{ contains(github.event.pull_request.labels.*.name, 'needs-signed-commits')}}
runs-on: ubuntu-slim
permissions:
contents: read
pull-requests: write
steps:
- name: Check signed commits in PR
uses: 1Password/check-signed-commits-action@ed2885f3ed2577a4f5d3c3fe895432a557d23d52 # v1
with:
comment: |
⚠️ This PR contains unsigned commits. This repository enforces [commit signatures](https://docs.github.com/en/authentication/managing-commit-signature-verification)
for all incoming PRs. To get your PR merged, please sign those commits
(`git rebase --exec 'git commit -S --amend --no-edit -n' @{upstream}`) and force push them to this branch
(`git push --force-with-lease`)
For more information on signing commits, see GitHub's documentation on [Telling Git about your signing key](https://docs.github.com/en/authentication/managing-commit-signature-verification/telling-git-about-your-signing-key).
- name: Remove needs-signed-commits label
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
PR_URL: ${{ github.event.pull_request.html_url }}
run: |
gh pr edit $PR_URL --remove-label needs-signed-commits

View File

@@ -13,13 +13,26 @@ on:
permissions: {}
jobs:
clean-orphaned-uploads:
setup:
if: github.repository == 'electron/electron'
runs-on: ubuntu-slim
permissions:
contents: read
outputs:
build-image-sha: ${{ steps.build-image-sha.outputs.build-image-sha }}
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd
- name: Set Build Image SHA
id: build-image-sha
uses: ./.github/actions/build-image-sha
clean-orphaned-uploads:
needs: setup
runs-on: electron-arc-centralus-linux-amd64-32core
permissions:
contents: read
container:
image: ghcr.io/electron/build:bc2f48b2415a670de18d13605b1cf0eb5fdbaae1
image: ghcr.io/electron/build:${{ needs.setup.outputs.build-image-sha }}
options: --user root
volumes:
- /mnt/cross-instance-cache:/mnt/cross-instance-cache

View File

@@ -12,15 +12,28 @@ on:
permissions: {}
jobs:
clean-src-cache:
setup:
if: github.repository == 'electron/electron'
runs-on: ubuntu-slim
permissions:
contents: read
outputs:
build-image-sha: ${{ steps.build-image-sha.outputs.build-image-sha }}
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd
- name: Set Build Image SHA
id: build-image-sha
uses: ./.github/actions/build-image-sha
clean-src-cache:
needs: setup
runs-on: electron-arc-centralus-linux-amd64-32core
permissions:
contents: read
env:
DD_API_KEY: ${{ secrets.DD_API_KEY }}
container:
image: ghcr.io/electron/build:bc2f48b2415a670de18d13605b1cf0eb5fdbaae1
image: ghcr.io/electron/build:${{ needs.setup.outputs.build-image-sha }}
options: --user root
volumes:
- /mnt/cross-instance-cache:/mnt/cross-instance-cache

View File

@@ -21,7 +21,7 @@ jobs:
creds: ${{ secrets.ISSUE_TRIAGE_GH_APP_CREDS }}
org: electron
- name: Set status
uses: dsanders11/project-actions/edit-item@5767984408ccc6742f83acc8b8d8ea5e09f329af # v2.0.0
uses: dsanders11/project-actions/edit-item@4b06452b0128cf601dac14399aa668a8eed2d684 # v2.0.1
with:
token: ${{ steps.generate-token.outputs.token }}
project-number: 90
@@ -42,7 +42,7 @@ jobs:
creds: ${{ secrets.ISSUE_TRIAGE_GH_APP_CREDS }}
org: electron
- name: Set status
uses: dsanders11/project-actions/edit-item@5767984408ccc6742f83acc8b8d8ea5e09f329af # v2.0.0
uses: dsanders11/project-actions/edit-item@4b06452b0128cf601dac14399aa668a8eed2d684 # v2.0.1
with:
token: ${{ steps.generate-token.outputs.token }}
project-number: 90

View File

@@ -20,7 +20,7 @@ jobs:
creds: ${{ secrets.ISSUE_TRIAGE_GH_APP_CREDS }}
org: electron
- name: Add to Issue Triage
uses: dsanders11/project-actions/add-item@5767984408ccc6742f83acc8b8d8ea5e09f329af # v2.0.0
uses: dsanders11/project-actions/add-item@4b06452b0128cf601dac14399aa668a8eed2d684 # v2.0.1
with:
field: Reporter
field-value: ${{ github.event.issue.user.login }}
@@ -46,7 +46,7 @@ jobs:
.yarn
- run: yarn workspaces focus @electron/gha-workflows
- name: Add labels
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
uses: actions/github-script@3a2844b7e9c422d3c10d287c895573f7108da1b3 # v9.0.0
id: add-labels
env:
ISSUE_BODY: ${{ github.event.issue.body }}

View File

@@ -20,7 +20,7 @@ jobs:
creds: ${{ secrets.ISSUE_TRIAGE_GH_APP_CREDS }}
org: electron
- name: Remove from issue triage
uses: dsanders11/project-actions/delete-item@5767984408ccc6742f83acc8b8d8ea5e09f329af # v2.0.0
uses: dsanders11/project-actions/delete-item@4b06452b0128cf601dac14399aa668a8eed2d684 # v2.0.1
with:
token: ${{ steps.generate-token.outputs.token }}
project-number: 90

View File

@@ -33,7 +33,7 @@ jobs:
org: electron
- name: Set status
if: ${{ steps.check-for-blocked-labels.outputs.NOT_BLOCKED }}
uses: dsanders11/project-actions/edit-item@5767984408ccc6742f83acc8b8d8ea5e09f329af # v2.0.0
uses: dsanders11/project-actions/edit-item@4b06452b0128cf601dac14399aa668a8eed2d684 # v2.0.1
with:
token: ${{ steps.generate-token.outputs.token }}
project-number: 90

View File

@@ -6,7 +6,8 @@ on:
build-image-sha:
type: string
description: 'SHA for electron/build image'
default: 'eac3529546ea8f3aa356d31e345715eef342233b'
default: ''
required: false
upload-to-storage:
description: 'Uploads to Azure storage'
required: false
@@ -20,13 +21,28 @@ on:
permissions: {}
jobs:
checkout-linux:
setup:
if: github.repository == 'electron/electron'
runs-on: ubuntu-slim
permissions:
contents: read
outputs:
build-image-sha: ${{ steps.build-image-sha.outputs.build-image-sha }}
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd
- name: Set Build Image SHA
id: build-image-sha
uses: ./.github/actions/build-image-sha
with:
override: ${{ inputs.build-image-sha }}
checkout-linux:
needs: setup
runs-on: electron-arc-centralus-linux-amd64-32core
permissions:
contents: read
container:
image: ghcr.io/electron/build:${{ inputs.build-image-sha }}
image: ghcr.io/electron/build:${{ needs.setup.outputs.build-image-sha }}
options: --user root
volumes:
- /mnt/cross-instance-cache:/mnt/cross-instance-cache
@@ -50,11 +66,11 @@ jobs:
attestations: write
contents: read
id-token: write
needs: checkout-linux
needs: [setup, checkout-linux]
with:
environment: production-release
build-runs-on: electron-arc-centralus-linux-amd64-32core
build-container: '{"image":"ghcr.io/electron/build:${{ inputs.build-image-sha }}","options":"--user root","volumes":["/mnt/cross-instance-cache:/mnt/cross-instance-cache"]}'
build-container: '{"image":"ghcr.io/electron/build:${{ needs.setup.outputs.build-image-sha }}","options":"--user root","volumes":["/mnt/cross-instance-cache:/mnt/cross-instance-cache"]}'
target-platform: linux
target-arch: x64
is-release: true
@@ -70,11 +86,11 @@ jobs:
attestations: write
contents: read
id-token: write
needs: checkout-linux
needs: [setup, checkout-linux]
with:
environment: production-release
build-runs-on: electron-arc-centralus-linux-amd64-32core
build-container: '{"image":"ghcr.io/electron/build:${{ inputs.build-image-sha }}","options":"--user root","volumes":["/mnt/cross-instance-cache:/mnt/cross-instance-cache"]}'
build-container: '{"image":"ghcr.io/electron/build:${{ needs.setup.outputs.build-image-sha }}","options":"--user root","volumes":["/mnt/cross-instance-cache:/mnt/cross-instance-cache"]}'
target-platform: linux
target-arch: arm
is-release: true
@@ -90,11 +106,11 @@ jobs:
attestations: write
contents: read
id-token: write
needs: checkout-linux
needs: [setup, checkout-linux]
with:
environment: production-release
build-runs-on: electron-arc-centralus-linux-amd64-32core
build-container: '{"image":"ghcr.io/electron/build:${{ inputs.build-image-sha }}","options":"--user root","volumes":["/mnt/cross-instance-cache:/mnt/cross-instance-cache"]}'
build-container: '{"image":"ghcr.io/electron/build:${{ needs.setup.outputs.build-image-sha }}","options":"--user root","volumes":["/mnt/cross-instance-cache:/mnt/cross-instance-cache"]}'
target-platform: linux
target-arch: arm64
is-release: true

View File

@@ -6,8 +6,8 @@ on:
build-image-sha:
type: string
description: 'SHA for electron/build image'
default: 'eac3529546ea8f3aa356d31e345715eef342233b'
required: true
default: ''
required: false
upload-to-storage:
description: 'Uploads to Azure storage'
required: false
@@ -21,13 +21,28 @@ on:
permissions: {}
jobs:
checkout-macos:
setup:
if: github.repository == 'electron/electron'
runs-on: ubuntu-slim
permissions:
contents: read
outputs:
build-image-sha: ${{ steps.build-image-sha.outputs.build-image-sha }}
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd
- name: Set Build Image SHA
id: build-image-sha
uses: ./.github/actions/build-image-sha
with:
override: ${{ inputs.build-image-sha }}
checkout-macos:
needs: setup
runs-on: electron-arc-centralus-linux-amd64-32core
permissions:
contents: read
container:
image: ghcr.io/electron/build:${{ inputs.build-image-sha }}
image: ghcr.io/electron/build:${{ needs.setup.outputs.build-image-sha }}
options: --user root
volumes:
- /mnt/cross-instance-cache:/mnt/cross-instance-cache

View File

@@ -0,0 +1,98 @@
name: Pipeline Segment - Build Siso (Windows)
# Builds a patched siso binary for Windows CI. Reads the siso revision from
# the Chromium DEPS file at the pinned chromium_version, shallow-clones
# chromium.googlesource.com/build at that revision, applies the patches under
# .github/siso-patches/, cross-compiles siso.exe for windows/amd64, and
# publishes it as the `siso-windows-amd64` artifact. The Windows build jobs
# download it and use it via SISO_PATH. The built binary is cached keyed on
# the siso revision + sha256 of the patch contents, so subsequent runs just
# restore it.
on:
workflow_call: {}
permissions: {}
jobs:
build:
runs-on: ubuntu-latest
permissions:
contents: read
steps:
- name: Checkout Electron
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
fetch-depth: 1
ref: ${{ github.event.pull_request.head.sha }}
sparse-checkout: |
DEPS
.github/siso-patches
- name: Resolve siso revision from Chromium DEPS
id: resolve
run: |
set -euo pipefail
CHROMIUM_VERSION=$(python3 -c "import re; print(re.search(r\"'chromium_version':\s*\n\s*'([^']+)'\", open('DEPS').read()).group(1))")
if ! [[ "$CHROMIUM_VERSION" =~ ^[0-9]+(\.[0-9]+){1,3}$ ]]; then
echo "error: unexpected chromium_version format: $CHROMIUM_VERSION" >&2
exit 1
fi
curl -sfL "https://raw.githubusercontent.com/chromium/chromium/${CHROMIUM_VERSION}/DEPS" -o /tmp/chromium-DEPS
SISO_SHA=$(python3 -c "import re; print(re.search(r\"'siso_version':\s*'git_revision:([0-9a-f]+)'\", open('/tmp/chromium-DEPS').read()).group(1))")
if ! [[ "$SISO_SHA" =~ ^[0-9a-f]{40}$ ]]; then
echo "error: unexpected siso_version SHA: $SISO_SHA" >&2
exit 1
fi
PATCHES_HASH=$(find .github/siso-patches -type f -name '*.patch' | sort | xargs sha256sum | sha256sum | awk '{print $1}')
echo "siso-sha=${SISO_SHA}" >> "$GITHUB_OUTPUT"
echo "patches-hash=${PATCHES_HASH}" >> "$GITHUB_OUTPUT"
echo "Chromium ${CHROMIUM_VERSION} pins siso at ${SISO_SHA}"
echo "Patches hash: ${PATCHES_HASH}"
- name: Restore cached siso binary
id: cache-siso
uses: actions/cache@27d5ce7f107fe9357f9df03efb73ab90386fccae # v5.0.5
with:
path: siso-out/siso.exe
key: siso-windows-amd64-${{ steps.resolve.outputs.siso-sha }}-${{ steps.resolve.outputs.patches-hash }}
- name: Shallow clone chromium build repo at pinned revision
if: steps.cache-siso.outputs.cache-hit != 'true'
env:
SISO_SHA: ${{ steps.resolve.outputs.siso-sha }}
run: |
set -euo pipefail
mkdir chromium-build
cd chromium-build
git init -q
git remote add origin https://chromium.googlesource.com/build
git -c protocol.version=2 fetch --depth=1 origin "$SISO_SHA"
git checkout --detach FETCH_HEAD
- name: Apply in-tree siso patches
if: steps.cache-siso.outputs.cache-hit != 'true'
run: |
set -euo pipefail
cd chromium-build
git -c user.name=electron-ci -c user.email=ci@electronjs.org \
am --3way "${GITHUB_WORKSPACE}/.github/siso-patches"/*.patch
- name: Set up Go
if: steps.cache-siso.outputs.cache-hit != 'true'
uses: actions/setup-go@4a3601121dd01d1626a1e23e37211e3254c1c06c # v6.4.0
with:
go-version-file: chromium-build/siso/go.mod
cache: false
- name: Build siso (windows/amd64)
if: steps.cache-siso.outputs.cache-hit != 'true'
working-directory: chromium-build/siso
env:
CGO_ENABLED: '0'
GOOS: windows
GOARCH: amd64
run: |
mkdir -p "${GITHUB_WORKSPACE}/siso-out"
go build -trimpath -o "${GITHUB_WORKSPACE}/siso-out/siso.exe" .
- name: Upload siso artifact
uses: actions/upload-artifact@043fb46d1a93c77aae656e7c1c64a875d1fc6a0a # v7.0.1
with:
name: siso-windows-amd64
path: siso-out/siso.exe
if-no-files-found: error
retention-days: 1

View File

@@ -77,7 +77,6 @@ env:
ELECTRON_ARTIFACTS_BLOB_STORAGE: ${{ secrets.ELECTRON_ARTIFACTS_BLOB_STORAGE }}
ELECTRON_RBE_JWT: ${{ secrets.ELECTRON_RBE_JWT }}
SUDOWOODO_EXCHANGE_URL: ${{ secrets.SUDOWOODO_EXCHANGE_URL }}
SUDOWOODO_EXCHANGE_TOKEN: ${{ secrets.SUDOWOODO_EXCHANGE_TOKEN }}
GCLIENT_EXTRA_ARGS: ${{ inputs.target-platform == 'macos' && '--custom-var=checkout_mac=True --custom-var=host_os=mac' || inputs.target-platform == 'win' && '--custom-var=checkout_win=True' || '--custom-var=checkout_arm=True --custom-var=checkout_arm64=True' }}
ELECTRON_OUT_DIR: Default
ACTIONS_STEP_DEBUG: ${{ secrets.ACTIONS_STEP_DEBUG }}
@@ -124,7 +123,7 @@ jobs:
run: df -h
- name: Setup Node.js/npm
if: ${{ inputs.target-platform == 'macos' }}
uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f
uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e
with:
node-version: 22.21.x
cache: yarn
@@ -195,6 +194,22 @@ jobs:
- name: Free up space (macOS)
if: ${{ inputs.target-platform == 'macos' }}
uses: ./src/electron/.github/actions/free-space-macos
- name: Download custom siso binary (Windows)
if: ${{ inputs.target-platform == 'win' }}
uses: actions/download-artifact@3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c # v8.0.1
with:
name: siso-windows-amd64
path: ${{ runner.temp }}/siso
- name: Set SISO_PATH (Windows)
if: ${{ inputs.target-platform == 'win' }}
run: |
SISO_BIN="${RUNNER_TEMP}/siso/siso.exe"
if [ ! -f "$SISO_BIN" ]; then
echo "error: expected siso binary at $SISO_BIN" >&2
exit 1
fi
echo "SISO_PATH=$SISO_BIN" >> "$GITHUB_ENV"
echo "Using custom siso binary at $SISO_BIN"
- name: Build Electron
if: ${{ inputs.target-platform != 'macos' || (inputs.target-variant == 'all' || inputs.target-variant == 'darwin') }}
uses: ./src/electron/.github/actions/build-electron


@@ -135,15 +135,34 @@ jobs:
run: echo "::add-matcher::src/electron/.github/problem-matchers/clang.json"
- name: Run Clang-Tidy
run: |
e init -f --root=$(pwd) --out=${ELECTRON_OUT_DIR} testing --target-cpu ${TARGET_ARCH}
e init -f --root=$(pwd) --out=${ELECTRON_OUT_DIR} testing --target-cpu ${TARGET_ARCH} --remote-build none
export GN_EXTRA_ARGS="target_cpu=\"${TARGET_ARCH}\""
# For macOS use_remoteexec=false will cause GN errors, so even though we're doing no remote build, set it
export GN_EXTRA_ARGS="use_remoteexec=true target_cpu=\"${TARGET_ARCH}\""
if [ "${{ inputs.target-platform }}" = "win" ]; then
export GN_EXTRA_ARGS="$GN_EXTRA_ARGS use_v8_context_snapshot=true target_os=\"win\""
fi
e build --only-gen
# Copy macOS framework headers so clang-tidy can find them via -F.
# This must happen after e build --only-gen since e init -f may
# recreate the output directory.
if [ "${{ inputs.target-platform }}" = "macos" ]; then
OUT=src/out/${ELECTRON_OUT_DIR}
SQRL=src/third_party/squirrel.mac
mkdir -p ${OUT}/{ReactiveObjC,Squirrel,Mantle}.framework/Headers
cp ${SQRL}/vendor/ReactiveObjC/ReactiveObjC/*.h ${OUT}/ReactiveObjC.framework/Headers/
cp ${SQRL}/vendor/ReactiveObjC/ReactiveObjC/extobjc/*.h ${OUT}/ReactiveObjC.framework/Headers/
cp ${SQRL}/Squirrel/*.h ${OUT}/Squirrel.framework/Headers/
cp ${SQRL}/vendor/Mantle/Mantle/include/*.h ${OUT}/Mantle.framework/Headers/
cp ${SQRL}/vendor/Mantle/Mantle/extobjc/include/*.h ${OUT}/Mantle.framework/Headers/
fi
cd src/electron
node script/yarn.js lint:clang-tidy --jobs 8 --out-dir ../out/${ELECTRON_OUT_DIR}
- name: Remove Clang problem matcher


@@ -130,7 +130,7 @@ jobs:
run: |
for target_cpu in ${{ inputs.target-archs }}
do
e init -f --root=$(pwd) --out=Default ${{ inputs.gn-build-type }} --import ${{ inputs.gn-build-type }} --target-cpu $target_cpu
e init -f --root=$(pwd) --out=Default ${{ inputs.gn-build-type }} --import ${{ inputs.gn-build-type }} --target-cpu $target_cpu --remote-build none
cd src
export GN_EXTRA_ARGS="target_cpu=\"$target_cpu\""
if [ "${{ inputs.target-platform }}" = "linux" ]; then


@@ -79,7 +79,6 @@ env:
ELECTRON_ARTIFACTS_BLOB_STORAGE: ${{ secrets.ELECTRON_ARTIFACTS_BLOB_STORAGE }}
ELECTRON_RBE_JWT: ${{ secrets.ELECTRON_RBE_JWT }}
SUDOWOODO_EXCHANGE_URL: ${{ secrets.SUDOWOODO_EXCHANGE_URL }}
SUDOWOODO_EXCHANGE_TOKEN: ${{ secrets.SUDOWOODO_EXCHANGE_TOKEN }}
GCLIENT_EXTRA_ARGS: ${{ inputs.target-platform == 'macos' &&
'--custom-var=checkout_mac=True --custom-var=host_os=mac' ||
inputs.target-platform == 'win' && '--custom-var=checkout_win=True' ||
@@ -132,7 +131,7 @@ jobs:
run: df -h
- name: Setup Node.js/npm
if: ${{ inputs.target-platform == 'macos' }}
uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f
uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e
with:
node-version: 22.21.x
cache: yarn
@@ -208,6 +207,22 @@ jobs:
- name: Free up space (macOS)
if: ${{ inputs.target-platform == 'macos' }}
uses: ./src/electron/.github/actions/free-space-macos
- name: Download custom siso binary (Windows)
if: ${{ inputs.target-platform == 'win' }}
uses: actions/download-artifact@3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c
with:
name: siso-windows-amd64
path: ${{ runner.temp }}/siso
- name: Set SISO_PATH (Windows)
if: ${{ inputs.target-platform == 'win' }}
run: |
SISO_BIN="${RUNNER_TEMP}/siso/siso.exe"
if [ ! -f "$SISO_BIN" ]; then
echo "error: expected siso binary at $SISO_BIN" >&2
exit 1
fi
echo "SISO_PATH=$SISO_BIN" >> "$GITHUB_ENV"
echo "Using custom siso binary at $SISO_BIN"
- name: Build Electron
if: ${{ inputs.target-platform != 'macos' || (inputs.target-variant == 'all' ||
inputs.target-variant == 'darwin') }}


@@ -66,7 +66,7 @@ jobs:
fail-fast: false
matrix:
build-type: ${{ inputs.target-platform == 'macos' && fromJSON('["darwin","mas"]') || (inputs.target-platform == 'win' && fromJSON('["win"]') || fromJSON('["linux"]')) }}
shard: ${{ case(inputs.display-server == 'wayland', fromJSON('[1]'), inputs.target-platform == 'linux', fromJSON('[1, 2, 3]'), fromJSON('[1, 2]')) }}
shard: ${{ case(inputs.display-server == 'wayland', fromJSON('[1]'), inputs.target-platform == 'linux', fromJSON('[1, 2, 3]'), inputs.target-platform == 'macos' && inputs.target-arch == 'x64', fromJSON('[1, 2, 3]'), fromJSON('[1, 2]')) }}
env:
BUILD_TYPE: ${{ matrix.build-type }}
TARGET_ARCH: ${{ inputs.target-arch }}
@@ -79,7 +79,7 @@ jobs:
cp $(which node) /mnt/runner-externals/node24/bin/
- name: Setup Node.js/npm
if: ${{ inputs.target-platform == 'win' }}
uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f
uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e
with:
node-version: 22.21.x
- name: Add TCC permissions on macOS
@@ -227,7 +227,7 @@ jobs:
cd src/electron
export ELECTRON_TEST_RESULTS_DIR=`pwd`/junit
# Get which tests are on this shard
tests_files=$(node script/split-tests ${{ matrix.shard }} ${{ case(inputs.display-server == 'wayland', 1, inputs.target-platform == 'linux', 3, 2) }})
tests_files=$(node script/split-tests ${{ matrix.shard }} ${{ case(inputs.display-server == 'wayland', 1, inputs.target-platform == 'linux', 3, inputs.target-platform == 'macos' && inputs.target-arch == 'x64', 3, 2) }})
if [ "${{ inputs.display-server }}" = "wayland" ]; then
allowlist_file=script/wayland-test-allowlist.txt
filtered_tests=""
@@ -315,7 +315,7 @@ jobs:
if: always() && !cancelled()
- name: Upload Test Artifacts
if: always()
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f #v7.0.0
uses: actions/upload-artifact@043fb46d1a93c77aae656e7c1c64a875d1fc6a0a #v7.0.1
with:
name: ${{ inputs.target-platform == 'linux' && format('test_artifacts_{0}_{1}_{2}', env.ARTIFACT_KEY, inputs.display-server, matrix.shard) || format('test_artifacts_{0}_{1}', env.ARTIFACT_KEY, matrix.shard) }}
path: src/electron/spec/artifacts
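The `case()` expressions above widen macOS x64 to three shards while keeping Wayland at one, and the same selection appears twice (matrix and `split-tests`). Restated as plain control flow for readability (a sketch, not the workflow expression syntax):

```javascript
// Mirrors the shard-count selection used by both the matrix and the
// split-tests invocation above, so the two stay in sync.
function shardCount({ displayServer, targetPlatform, targetArch }) {
  if (displayServer === 'wayland') return 1;
  if (targetPlatform === 'linux') return 3;
  if (targetPlatform === 'macos' && targetArch === 'x64') return 3;
  return 2; // win, and macos on arm64
}

console.log(shardCount({ targetPlatform: 'macos', targetArch: 'x64' })); // → 3
```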


@@ -25,7 +25,7 @@ jobs:
sparse-checkout: .github/PULL_REQUEST_TEMPLATE.md
sparse-checkout-cone-mode: false
- name: Check for required sections
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
uses: actions/github-script@3a2844b7e9c422d3c10d287c895573f7108da1b3 # v9.0.0
with:
script: |
const fs = require('fs');
@@ -38,6 +38,12 @@ jobs:
return;
}
const body = context.payload.pull_request.body || '';
// Allow through if body contains a valid backport line
const backportRegex = /Backport of (?:#|https:\/\/github.com\/electron\/electron\/pull\/)\d+/i;
if (backportRegex.test(body)) {
console.log('Backport PR detected, skipping required section check.');
return;
}
const missingSections = requiredSections.filter(
(section) => !body.includes(section),
);


@@ -36,8 +36,18 @@ jobs:
with:
creds: ${{ secrets.ISSUE_TRIAGE_GH_APP_CREDS }}
org: electron
- name: Get project item status
uses: dsanders11/project-actions/get-item@4b06452b0128cf601dac14399aa668a8eed2d684 # v2.0.1
id: get-item
with:
token: ${{ steps.generate-token.outputs.token }}
project-number: 118
fail-if-item-not-found: false
- name: Set status to Needs Review
uses: dsanders11/project-actions/edit-item@5767984408ccc6742f83acc8b8d8ea5e09f329af # v2.0.0
if: >-
(steps.get-item.outputs.field-status == '🛑 Needs Submitter Response'
|| steps.get-item.outputs.field-status == '🟡 WIP')
uses: dsanders11/project-actions/edit-item@4b06452b0128cf601dac14399aa668a8eed2d684 # v2.0.1
with:
token: ${{ steps.generate-token.outputs.token }}
project-number: 118


@@ -42,7 +42,7 @@ jobs:
creds: ${{ secrets.RELEASE_BOARD_GH_APP_CREDS }}
org: electron
- name: Set status
uses: dsanders11/project-actions/edit-item@5767984408ccc6742f83acc8b8d8ea5e09f329af # v2.0.0
uses: dsanders11/project-actions/edit-item@4b06452b0128cf601dac14399aa668a8eed2d684 # v2.0.1
with:
token: ${{ steps.generate-token.outputs.token }}
project-number: 94
@@ -50,7 +50,7 @@ jobs:
field-value: ✅ Reviewed
pull-request-labeled-ai-pr:
name: ai-pr label added
if: github.event.label.name == 'ai-pr'
if: github.event.label.name == 'ai-pr' && github.event.pull_request.state != 'closed'
runs-on: ubuntu-latest
permissions: {}
steps:


@@ -13,7 +13,6 @@ permissions: {}
jobs:
check-signed-commits:
name: Check signed commits in PR
if: ${{ !contains(github.event.pull_request.labels.*.name, 'needs-signed-commits')}}
runs-on: ubuntu-slim
permissions:
contents: read
@@ -23,9 +22,9 @@ jobs:
uses: 1Password/check-signed-commits-action@ed2885f3ed2577a4f5d3c3fe895432a557d23d52 # v1
with:
comment: |
⚠️ This PR contains unsigned commits. This repository enforces [commit signatures](https://docs.github.com/en/authentication/managing-commit-signature-verification)
for all incoming PRs. To get your PR merged, please sign those commits
(`git rebase --exec 'git commit -S --amend --no-edit -n' @{upstream}`) and force push them to this branch
⚠️ This PR contains unsigned commits. This repository enforces [commit signatures](https://docs.github.com/en/authentication/managing-commit-signature-verification)
for all incoming PRs. To get your PR merged, please sign those commits
(`git rebase --exec 'git commit -S --amend --no-edit -n' @{upstream}`) and force push them to this branch
(`git push --force-with-lease`)
For more information on signing commits, see GitHub's documentation on [Telling Git about your signing key](https://docs.github.com/en/authentication/managing-commit-signature-verification/telling-git-about-your-signing-key).
@@ -37,3 +36,11 @@ jobs:
PR_URL: ${{ github.event.pull_request.html_url }}
run: |
gh pr edit $PR_URL --add-label needs-signed-commits
- name: Remove needs-signed-commits label
if: ${{ success() && contains(github.event.pull_request.labels.*.name, 'needs-signed-commits') }}
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
PR_URL: ${{ github.event.pull_request.html_url }}
run: |
gh pr edit $PR_URL --remove-label needs-signed-commits


@@ -43,7 +43,7 @@ jobs:
# Upload the results as artifacts (optional). Commenting out will disable uploads of run results in SARIF
# format to the repository Actions tab.
- name: "Upload artifact"
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
uses: actions/upload-artifact@043fb46d1a93c77aae656e7c1c64a875d1fc6a0a # v7.0.1
with:
name: SARIF file
path: results.sarif
@@ -51,6 +51,6 @@ jobs:
# Upload the results to GitHub's code scanning dashboard.
- name: "Upload to code-scanning"
uses: github/codeql-action/upload-sarif@c10b8064de6f491fea524254123dbe5e09572f13 # v3.29.5
uses: github/codeql-action/upload-sarif@95e58e9a2cdfd71adc6e0353d5c52f41a045d225 # v3.29.5
with:
sarif_file: results.sarif


@@ -29,7 +29,7 @@ jobs:
PROJECT_NUMBER=$(gh project list --owner electron --format json | jq -r '.projects | map(select(.title | test("^[0-9]+-x-y$"))) | max_by(.number) | .number')
echo "PROJECT_NUMBER=$PROJECT_NUMBER" >> "$GITHUB_OUTPUT"
- name: Update Completed Stable Prep Items
uses: dsanders11/project-actions/completed-by@5767984408ccc6742f83acc8b8d8ea5e09f329af # v2.0.0
uses: dsanders11/project-actions/completed-by@4b06452b0128cf601dac14399aa668a8eed2d684 # v2.0.1
with:
field: Prep Status
field-value: ✅ Complete


@@ -6,8 +6,8 @@ on:
build-image-sha:
type: string
description: 'SHA for electron/build image'
default: 'eac3529546ea8f3aa356d31e345715eef342233b'
required: true
default: ''
required: false
upload-to-storage:
description: 'Uploads to Azure storage'
required: false
@@ -21,13 +21,28 @@ on:
permissions: {}
jobs:
checkout-windows:
setup:
if: github.repository == 'electron/electron'
runs-on: ubuntu-slim
permissions:
contents: read
outputs:
build-image-sha: ${{ steps.build-image-sha.outputs.build-image-sha }}
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd
- name: Set Build Image SHA
id: build-image-sha
uses: ./.github/actions/build-image-sha
with:
override: ${{ inputs.build-image-sha }}
checkout-windows:
needs: setup
runs-on: electron-arc-centralus-linux-amd64-32core
permissions:
contents: read
container:
image: ghcr.io/electron/build:${{ inputs.build-image-sha }}
image: ghcr.io/electron/build:${{ needs.setup.outputs.build-image-sha }}
options: --user root --device /dev/fuse --cap-add SYS_ADMIN
volumes:
- /mnt/win-cache:/mnt/win-cache
@@ -37,8 +52,6 @@ jobs:
GCLIENT_EXTRA_ARGS: '--custom-var=checkout_win=True'
TARGET_OS: 'win'
ELECTRON_DEPOT_TOOLS_WIN_TOOLCHAIN: '1'
outputs:
build-image-sha: ${{ inputs.build-image-sha }}
steps:
- name: Checkout Electron
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd
@@ -51,6 +64,14 @@ jobs:
generate-sas-token: 'true'
target-platform: win
# Build the patched siso binary in parallel with checkout-windows; the
# publish-*-win jobs consume it via SISO_PATH.
build-siso-windows:
needs: setup
uses: ./.github/workflows/pipeline-segment-build-siso-windows.yml
permissions:
contents: read
publish-x64-win:
uses: ./.github/workflows/pipeline-segment-electron-publish.yml
permissions:
@@ -58,10 +79,10 @@ jobs:
attestations: write
contents: read
id-token: write
needs: checkout-windows
needs: [checkout-windows, build-siso-windows]
with:
environment: production-release
build-runs-on: electron-arc-centralus-windows-amd64-16core
build-runs-on: electron-arc-centralus-windows-amd64-32core
target-platform: win
target-arch: x64
is-release: true
@@ -77,10 +98,10 @@ jobs:
attestations: write
contents: read
id-token: write
needs: checkout-windows
needs: [checkout-windows, build-siso-windows]
with:
environment: production-release
build-runs-on: electron-arc-centralus-windows-amd64-16core
build-runs-on: electron-arc-centralus-windows-amd64-32core
target-platform: win
target-arch: arm64
is-release: true
@@ -96,10 +117,10 @@ jobs:
attestations: write
contents: read
id-token: write
needs: checkout-windows
needs: [checkout-windows, build-siso-windows]
with:
environment: production-release
build-runs-on: electron-arc-centralus-windows-amd64-16core
build-runs-on: electron-arc-centralus-windows-amd64-32core
target-platform: win
target-arch: x86
is-release: true

.oxfmtrc.json (new file, 47 lines)

@@ -0,0 +1,47 @@
{
"$schema": "./node_modules/oxfmt/configuration_schema.json",
"printWidth": 120,
"tabWidth": 2,
"singleQuote": true,
"trailingComma": "none",
"sortImports": {
"newlinesBetween": true,
"groups": [
"electron-internal",
"electron-scoped",
"electron",
"external",
"builtin",
["sibling", "parent"],
"index",
"type",
"unknown"
],
"customGroups": [
{
"groupName": "electron-internal",
"elementNamePattern": ["@electron/internal", "@electron/internal/**"]
},
{
"groupName": "electron-scoped",
"elementNamePattern": ["@electron/**"]
},
{
"groupName": "electron",
"elementNamePattern": ["electron", "electron/**"]
}
]
},
"ignorePatterns": [
"node_modules",
"out",
"ts-gen",
"spec/node_modules",
"spec/fixtures/native-addon",
".github/workflows/node_modules",
"docs/fiddles",
"shell/browser/resources/win/resource.h",
"shell/common/node_includes.h",
"spec/fixtures/pages/jquery-3.6.0.min.js"
]
}

.yarn/README.md (new file, 82 lines)

@@ -0,0 +1,82 @@
# Vendored Yarn release
This directory holds the Yarn release used by this repo (`yarnPath` in
`.yarnrc.yml`). The release file is checked in so every contributor and CI job
runs the exact same Yarn, and so we can carry small local patches when needed.
`releases/yarn-4.12.0.cjs` currently carries one such patch, described below.
If you bump the Yarn version, read the **Upgrading Yarn** section first.
## Patch: use `JsZipImpl` for the node-modules link step
### What changed
Two call sites in `releases/yarn-4.12.0.cjs` are modified so the
`node-modules` linker (and the `pnpm`-loose linker) construct their read-only
`ZipOpenFS` with `customZipImplementation: ST` — Yarn's pure-JS `JsZipImpl`
instead of falling through to the default WASM-backed `LibZipImpl`:
```text
new $f({maxOpenFiles:80,readOnlyArchives:!0})
→ new $f({maxOpenFiles:80,readOnlyArchives:!0,customZipImplementation:ST})
```
A comment block at the top of the `.cjs` file marks the file as patched and
points back here.
### Why
On the `linux-arm` CI test shards we run a 32-bit `arm32v7` container. During
`yarn install`'s **Link step**, Yarn opens up to 80 cache zips concurrently.
With `LibZipImpl`, each open zip is `readFileSync`'d into a Node `Buffer`
**and copied again into the WASM linear memory**, and every file read does a
WASM `_malloc(size)` for the entry. The WASM heap has to grow as a single
contiguous region of the 32-bit address space; once enough zips are resident,
the `_malloc` for a large entry — most often `typescript/lib/typescript.js`
(~9 MB inside a ~22 MB zip) — fails.
Yarn's cross-FS `copyFilePromise` swallows the underlying error and re-throws
a generic one, so CI shows:
```text
YN0001: While persisting .../typescript-patch-...zip/node_modules/typescript/
EINVAL: invalid argument, copyfile '/node_modules/typescript/lib/typescript.js' -> '...'
```
The unmasked form (occasionally seen on `pdfjs-dist`) is the WASM-heap failure
string `Couldn't allocate enough memory`. This started failing ~1-in-3
`linux-arm / test` shards at **Install Dependencies** on 2026-04-13, after
[#50692](https://github.com/electron/electron/pull/50692) grew the cache enough
to push the 32-bit process over the edge nondeterministically — e.g.
[run 24739817558](https://github.com/electron/electron/actions/runs/24739817558/job/72380803746).
`JsZipImpl` avoids the problem entirely: it opens the zip by file descriptor,
reads only the central directory into memory, and `readSync`s individual
entries into ordinary Node `Buffer`s — **no WASM heap involved**. It is
read-only and path-based, which is exactly how the linker uses these archives.
There is no `.yarnrc.yml` setting or environment variable to select the zip
implementation (verified against the bundle), so editing the vendored release
is the only way to switch it short of re-implementing the linker in a plugin.
Upstream references:
[yarnpkg/berry#3972](https://github.com/yarnpkg/berry/issues/3972),
[yarnpkg/berry#6722](https://github.com/yarnpkg/berry/issues/6722),
[yarnpkg/berry#6550](https://github.com/yarnpkg/berry/issues/6550).
### Upgrading Yarn
When bumping `releases/yarn-*.cjs`:
1. Check whether upstream now defaults `readOnlyArchives` opens to `JsZipImpl`,
or exposes a config knob for the zip implementation. If so, drop this patch.
2. Otherwise, re-apply: search the new bundle for
`maxOpenFiles:80,readOnlyArchives:!0` (the surrounding minified identifiers
will differ) and add `,customZipImplementation:<JsZipImpl symbol>` — that
symbol is whatever the new bundle exports as `JsZipImpl` from
`@yarnpkg/libzip`.
3. Re-add the header comment pointing back to this README.
4. Verify with
`rm -rf node_modules spec/node_modules && node script/yarn.js install --immutable --mode=skip-build`
and confirm `node_modules/typescript/lib/typescript.js` is byte-identical to
an unpatched install.

File diff suppressed because one or more lines are too long


@@ -105,21 +105,25 @@ electron_mac_bundle_id = branding.mac_bundle_id
if (override_electron_version != "") {
electron_version = override_electron_version
} else {
# When building from source code tarball there is no git tag available and
# When building from a source code tarball there is no git tag available and
# builders must explicitly pass override_electron_version in gn args.
#
# Resolve the real locations of packed-refs and HEAD via git so that this
# also works when electron/ is a `git worktree` (where .git is a file, not a
# directory, and GN's read_file cannot follow the gitdir indirection).
electron_git_ref_paths =
exec_script("script/get-git-ref-paths.py", [], "list lines")
# This read_file call will assert if there is no git information, without it
# gn will generate a malformed build configuration and ninja will get into
# infinite loop.
read_file(".git/packed-refs", "string")
read_file(electron_git_ref_paths[0], "string")
# Set electron version from git tag.
electron_version = exec_script("script/get-git-version.py",
[],
"trim string",
[
".git/packed-refs",
".git/HEAD",
])
electron_git_ref_paths)
}
if (is_mas_build) {
@@ -147,6 +151,25 @@ config("branding") {
config("electron_lib_config") {
include_dirs = [ "." ]
cflags = []
if (is_clang && clang_use_chrome_plugins) {
# The plugin is built directly into clang, so there's no need to load it
# dynamically.
cflags += [
"-Xclang",
"-add-plugin",
"-Xclang",
"blink-gc-plugin",
"-Xclang",
"-plugin-arg-blink-gc-plugin",
"-Xclang",
"check-directory=electron/shell/",
"-Xclang",
"-plugin-arg-blink-gc-plugin",
"-Xclang",
"check-directory=gin/",
]
}
}
# We generate the definitions twice here, once in //electron/electron.d.ts
@@ -476,8 +499,10 @@ source_set("electron_lib") {
"//components/certificate_transparency",
"//components/compose:buildflags",
"//components/embedder_support:user_agent",
"//components/heap_profiling/multi_process",
"//components/input",
"//components/language/core/browser",
"//components/memory_system",
"//components/net_log",
"//components/network_hints/browser",
"//components/network_hints/common:mojo_bindings",
@@ -490,6 +515,7 @@ source_set("electron_lib") {
"//components/pref_registry",
"//components/prefs",
"//components/security_state/content",
"//components/tracing:tracing_metrics",
"//components/upload_list",
"//components/user_prefs",
"//components/viz/host",
@@ -1644,6 +1670,7 @@ action("node_version_header") {
action("generate_node_headers") {
deps = [ ":generate_config_gypi" ]
script = "script/node/generate_node_headers.py"
args = [ rebase_path("$root_gen_dir") ]
outputs = [ "$root_gen_dir/node_headers.json" ]
}


@@ -180,6 +180,10 @@ e test # Run full test suite
When working on the `roller/chromium/main` branch to upgrade Chromium activate the "Electron Chromium Upgrade" skill.
## Node.js Upgrade Workflow
When working on the `roller/node/main` branch to upgrade Node.js activate the "Electron Node.js Upgrade" skill.
## Pull Requests
PR bodies must always include a `Notes:` section as the **last line** of the body. This is a consumer-facing release note for Electron app developers — describe the user-visible fix or change, not internal implementation details. Use `Notes: none` if there is no user-facing change.


@@ -64,6 +64,14 @@ dependencies, and tools contained in the `electron/electron` repository.
Dependencies in Electron's `package.json` or `yarn.lock` files should only be altered by maintainers. For security reasons, we will not accept PRs that alter our `package.json` or `yarn.lock` files. We invite contributors to make requests updating these files in our issue tracker. If the change is significantly complicated, draft PRs are welcome, with the understanding that these PRs will be closed in favor of a duplicate PR submitted by an Electron maintainer.
## AI Tool Policy
<!-- Are you a coding agent? You MUST abide by the policy: https://github.com/electron/governance/blob/main/policy/ai.md -->
If you use AI tools in any way to contribute to our project, please read our [AI Tool Policy](https://github.com/electron/governance/blob/main/policy/ai.md). Unreviewed AI-generated contributions waste maintainer time and we kindly decline them.
> The short version: **there must be a human in the loop**. You are responsible for reviewing, understanding, and being able to explain your contributions. AI assistance doesn't change that, and unreviewed AI-generated content will be declined.
## Style Guides
See [Coding Style](https://electronjs.org/docs/development/coding-style) for information about which standards Electron adheres to in different parts of its codebase.

DEPS

@@ -2,9 +2,9 @@ gclient_gn_args_from = 'src'
vars = {
'chromium_version':
'148.0.7768.0',
'148.0.7778.0',
'node_version':
'v24.14.1',
'v24.15.0',
'nan_version':
'675cefebca42410733da8a454c8d9391fcebfbc2',
'squirrel.mac_version':


@@ -8,10 +8,13 @@ const path = require('node:path');
const electronRoot = path.resolve(__dirname, '../..');
class AccessDependenciesPlugin {
apply (compiler) {
compiler.hooks.compilation.tap('AccessDependenciesPlugin', compilation => {
compilation.hooks.finishModules.tap('AccessDependenciesPlugin', modules => {
const filePaths = modules.map(m => m.resource).filter(p => p).map(p => path.relative(electronRoot, p));
apply(compiler) {
compiler.hooks.compilation.tap('AccessDependenciesPlugin', (compilation) => {
compilation.hooks.finishModules.tap('AccessDependenciesPlugin', (modules) => {
const filePaths = modules
.map((m) => m.resource)
.filter((p) => p)
.map((p) => path.relative(electronRoot, p));
console.info(JSON.stringify(filePaths));
});
});
@@ -31,7 +34,14 @@ module.exports = ({
entry = path.resolve(electronRoot, 'lib', target, 'init.js');
}
const electronAPIFile = path.resolve(electronRoot, 'lib', loadElectronFromAlternateTarget || target, 'api', 'exports', 'electron.ts');
const electronAPIFile = path.resolve(
electronRoot,
'lib',
loadElectronFromAlternateTarget || target,
'api',
'exports',
'electron.ts'
);
return (env = {}, argv = {}) => {
const onlyPrintingGraph = !!env.PRINT_WEBPACK_GRAPH;
@@ -61,49 +71,59 @@ module.exports = ({
}
if (targetDeletesNodeGlobals) {
plugins.push(new webpack.ProvidePlugin({
Buffer: ['@electron/internal/common/webpack-provider', 'Buffer'],
global: ['@electron/internal/common/webpack-provider', '_global'],
process: ['@electron/internal/common/webpack-provider', 'process']
}));
plugins.push(
new webpack.ProvidePlugin({
Buffer: ['@electron/internal/common/webpack-provider', 'Buffer'],
global: ['@electron/internal/common/webpack-provider', '_global'],
process: ['@electron/internal/common/webpack-provider', 'process']
})
);
}
// Webpack 5 no longer polyfills process or Buffer.
if (!alwaysHasNode) {
plugins.push(new webpack.ProvidePlugin({
Buffer: ['buffer', 'Buffer'],
process: 'process/browser'
}));
plugins.push(
new webpack.ProvidePlugin({
Buffer: ['buffer', 'Buffer'],
process: 'process/browser'
})
);
}
plugins.push(new webpack.ProvidePlugin({
Promise: ['@electron/internal/common/webpack-globals-provider', 'Promise']
}));
plugins.push(
new webpack.ProvidePlugin({
Promise: ['@electron/internal/common/webpack-globals-provider', 'Promise']
})
);
plugins.push(new webpack.DefinePlugin(defines));
if (wrapInitWithProfilingTimeout) {
plugins.push(new WrapperPlugin({
header: 'function ___electron_webpack_init__() {',
footer: `
plugins.push(
new WrapperPlugin({
header: 'function ___electron_webpack_init__() {',
footer: `
};
if ((globalThis.process || binding.process).argv.includes("--profile-electron-init")) {
setTimeout(___electron_webpack_init__, 0);
} else {
___electron_webpack_init__();
}`
}));
})
);
}
if (wrapInitWithTryCatch) {
plugins.push(new WrapperPlugin({
header: 'try {',
footer: `
plugins.push(
new WrapperPlugin({
header: 'try {',
footer: `
} catch (err) {
console.error('Electron ${outputFilename} script failed to run');
console.error(err);
}`
}));
})
);
}
return {
@@ -133,23 +153,26 @@ if ((globalThis.process || binding.process).argv.includes("--profile-electron-in
}
},
module: {
rules: [{
test: (moduleName) => !onlyPrintingGraph && ignoredModules.includes(moduleName),
loader: 'null-loader'
}, {
test: /\.ts$/,
loader: 'ts-loader',
options: {
configFile: path.resolve(electronRoot, 'tsconfig.electron.json'),
transpileOnly: onlyPrintingGraph,
ignoreDiagnostics: [
// File '{0}' is not under 'rootDir' '{1}'.
6059,
// Private field '{0}' must be declared in an enclosing class.
1111
]
rules: [
{
test: (moduleName) => !onlyPrintingGraph && ignoredModules.includes(moduleName),
loader: 'null-loader'
},
{
test: /\.ts$/,
loader: 'ts-loader',
options: {
configFile: path.resolve(electronRoot, 'tsconfig.electron.json'),
transpileOnly: onlyPrintingGraph,
ignoreDiagnostics: [
// File '{0}' is not under 'rootDir' '{1}'.
6059,
// Private field '{0}' must be declared in an enclosing class.
1111
]
}
}
}]
]
},
node: {
__dirname: false,


@@ -1,5 +1,5 @@
import { shell } from 'electron/common';
import { app, dialog, BrowserWindow, ipcMain } from 'electron/main';
import { app, dialog, BrowserWindow, ipcMain, Menu } from 'electron/main';
import * as path from 'node:path';
import * as url from 'node:url';
@@ -11,23 +11,62 @@ app.on('window-all-closed', () => {
app.quit();
});
function decorateURL (url: string) {
// safely add `?utm_source=default_app
const parsedUrl = new URL(url);
parsedUrl.searchParams.append('utm_source', 'default_app');
return parsedUrl.toString();
}
const isMac = process.platform === 'darwin';
app.whenReady().then(() => {
const helpMenu: Electron.MenuItemConstructorOptions = {
role: 'help',
submenu: [
{
label: 'Learn More',
click: async () => {
await shell.openExternal('https://electronjs.org');
}
},
{
label: 'Documentation',
click: async () => {
const version = process.versions.electron;
await shell.openExternal(`https://github.com/electron/electron/tree/v${version}/docs#readme`);
}
},
{
label: 'Community Discussions',
click: async () => {
await shell.openExternal('https://discord.gg/electronjs');
}
},
{
label: 'Search Issues',
click: async () => {
await shell.openExternal('https://github.com/electron/electron/issues');
}
}
]
};
const macAppMenu: Electron.MenuItemConstructorOptions = { role: 'appMenu' };
const template: Electron.MenuItemConstructorOptions[] = [
...(isMac ? [macAppMenu] : []),
{ role: 'fileMenu' },
{ role: 'editMenu' },
{ role: 'viewMenu' },
{ role: 'windowMenu' },
helpMenu
];
Menu.setApplicationMenu(Menu.buildFromTemplate(template));
});
// Find the shortest path to the electron binary
const absoluteElectronPath = process.execPath;
const relativeElectronPath = path.relative(process.cwd(), absoluteElectronPath);
const electronPath = absoluteElectronPath.length < relativeElectronPath.length
? absoluteElectronPath
: relativeElectronPath;
const electronPath =
absoluteElectronPath.length < relativeElectronPath.length ? absoluteElectronPath : relativeElectronPath;
const indexPath = path.resolve(app.getAppPath(), 'index.html');
function isTrustedSender (webContents: Electron.WebContents) {
function isTrustedSender(webContents: Electron.WebContents) {
if (webContents !== (mainWindow && mainWindow.webContents)) {
return false;
}
@@ -43,7 +82,7 @@ ipcMain.handle('bootstrap', (event) => {
return isTrustedSender(event.sender) ? electronPath : null;
});
async function createWindow (backgroundColor?: string) {
async function createWindow(backgroundColor?: string) {
await app.whenReady();
const options: Electron.BrowserWindowConstructorOptions = {
@@ -68,8 +107,8 @@ async function createWindow (backgroundColor?: string) {
mainWindow = new BrowserWindow(options);
mainWindow.on('ready-to-show', () => mainWindow!.show());
mainWindow.webContents.setWindowOpenHandler(details => {
shell.openExternal(decorateURL(details.url));
mainWindow.webContents.setWindowOpenHandler((details) => {
shell.openExternal(details.url);
return { action: 'deny' };
});


@@ -15,7 +15,7 @@ type DefaultAppOptions = {
interactive: boolean;
abi: boolean;
modules: string[];
}
};
// Parse command line options.
const argv = process.argv.slice(1);
@@ -74,7 +74,7 @@ if (option.modules.length > 0) {
(Module as any)._preloadModules(option.modules);
}
async function loadApplicationPackage (packagePath: string) {
async function loadApplicationPackage(packagePath: string) {
// Add a flag indicating app is started from default app.
Object.defineProperty(process, 'defaultApp', {
configurable: false,
@@ -92,9 +92,11 @@ async function loadApplicationPackage (packagePath: string) {
const emitWarning = process.emitWarning;
try {
process.emitWarning = () => {};
packageJson = (await import(url.pathToFileURL(packageJsonPath).toString(), {
with: { type: 'json' }
})).default;
packageJson = (
await import(url.pathToFileURL(packageJsonPath).toString(), {
with: { type: 'json' }
})
).default;
} catch (e) {
showErrorMessage(`Unable to parse ${packageJsonPath}\n\n${(e as Error).message}`);
return;
@@ -143,23 +145,23 @@ async function loadApplicationPackage (packagePath: string) {
}
}
function showErrorMessage (message: string) {
function showErrorMessage(message: string) {
app.focus();
dialog.showErrorBox('Error launching app', message);
process.exit(1);
}
async function loadApplicationByURL (appUrl: string) {
async function loadApplicationByURL(appUrl: string) {
const { loadURL } = await import('./default_app.js');
loadURL(appUrl);
}
async function loadApplicationByFile (appPath: string) {
async function loadApplicationByFile(appPath: string) {
const { loadFile } = await import('./default_app.js');
loadFile(appPath);
}
async function startRepl () {
async function startRepl() {
if (process.platform === 'win32') {
console.error('Electron REPL not currently supported on Windows');
process.exit(1);
@@ -187,7 +189,7 @@ async function startRepl () {
process.exit(0);
});
function defineBuiltin (context: any, name: string, getter: Function) {
function defineBuiltin(context: any, name: string, getter: Function) {
const setReal = (val: any) => {
// Deleting the property before re-assigning it disables the
// getter/setter mechanism.
@@ -225,11 +227,42 @@ async function startRepl () {
we only trigger custom tab-completion when no common words
potentially match.
const commonWords = [
'async', 'await', 'break', 'case', 'catch', 'const', 'continue',
'debugger', 'default', 'delete', 'do', 'else', 'export', 'false',
'finally', 'for', 'function', 'if', 'import', 'in', 'instanceof', 'let',
'new', 'null', 'return', 'switch', 'this', 'throw', 'true', 'try',
'typeof', 'var', 'void', 'while', 'with', 'yield'
'async',
'await',
'break',
'case',
'catch',
'const',
'continue',
'debugger',
'default',
'delete',
'do',
'else',
'export',
'false',
'finally',
'for',
'function',
'if',
'import',
'in',
'instanceof',
'let',
'new',
'null',
'return',
'switch',
'this',
'throw',
'true',
'try',
'typeof',
'var',
'void',
'while',
'with',
'yield'
];
const electronBuiltins = [...Object.keys(electron), 'original-fs', 'electron'];


@@ -2,10 +2,10 @@ const { ipcRenderer, contextBridge } = require('electron/renderer');
const policy = window.trustedTypes.createPolicy('electron-default-app', {
// we trust the SVG contents
createHTML: input => input
createHTML: (input) => input
});
async function getOcticonSvg (name: string) {
async function getOcticonSvg(name: string) {
try {
const response = await fetch(`octicon/${name}.svg`);
const div = document.createElement('div');
@@ -16,7 +16,7 @@ async function getOcticonSvg (name: string) {
}
}
async function loadSVG (element: HTMLSpanElement) {
async function loadSVG(element: HTMLSpanElement) {
for (const cssClass of element.classList) {
if (cssClass.startsWith('octicon-')) {
const icon = await getOcticonSvg(cssClass.substr(8));
@@ -32,9 +32,9 @@ async function loadSVG (element: HTMLSpanElement) {
}
}
async function initialize () {
async function initialize() {
const electronPath = await ipcRenderer.invoke('bootstrap');
function replaceText (selector: string, text: string, link?: string) {
function replaceText(selector: string, text: string, link?: string) {
const element = document.querySelector<HTMLElement>(selector);
if (element) {
if (link) {
@@ -51,7 +51,11 @@ async function initialize () {
replaceText('.electron-version', `Electron v${process.versions.electron}`, 'https://electronjs.org/docs');
replaceText('.chrome-version', `Chromium v${process.versions.chrome}`, 'https://developer.chrome.com/docs/chromium');
replaceText('.node-version', `Node v${process.versions.node}`, `https://nodejs.org/docs/v${process.versions.node}/api`);
replaceText(
'.node-version',
`Node v${process.versions.node}`,
`https://nodejs.org/docs/v${process.versions.node}/api`
);
replaceText('.v8-version', `v8 v${process.versions.v8}`, 'https://v8.dev/docs');
replaceText('.command-example', `${electronPath} path-to-app`);


@@ -124,4 +124,65 @@ Returns `Promise<Object>` - Resolves with an object containing the `value` and `
Get the maximum usage of the trace buffer across processes, as a percentage of its
full capacity.
### `contentTracing.enableHeapProfiling([options])` _Experimental_
<!--
```YAML history
added:
- pr-url: https://github.com/electron/electron/pull/50826
```
-->
* `options` ([EnableHeapProfilingOptions](structures/enable-heap-profiling-options.md)) (optional)
Returns `Promise<void>` - Resolves once heap profiling has been enabled.
Enable [heap profiling](https://chromium.googlesource.com/chromium/src/+/lkgr/docs/memory-infra/heap_profiler.md)
for MemoryInfra traces. Equivalent to the `--memlog` switch in Chrome.
Only takes effect if the `disabled-by-default-memory-infra` category is included.
Needs to be called before `contentTracing.startRecording()`.
Usage:
```js
const { contentTracing } = require('electron')
async function recordTrace () {
await contentTracing.enableHeapProfiling()
await contentTracing.startRecording({
included_categories: ['disabled-by-default-memory-infra'],
excluded_categories: ['*'],
memory_dump_config: {
triggers: [
{ mode: 'detailed', periodic_interval_ms: 1000 }
]
}
})
await new Promise(resolve => setTimeout(resolve, 5000))
const filePath = await contentTracing.stopRecording()
}
```
To view the recorded heap dumps:
1. Download the breakpad symbols for your Electron version from the Electron GitHub
[releases](https://github.com/electron/electron/releases)
2. Clone the [Electron source code](../development/build-instructions-gn.md)
3. In your Chromium checkout for Electron, run this command to symbolicate the heap dump:
```bash
python3 third_party/catapult/tracing/bin/symbolize_trace --use-breakpad-symbols --breakpad-symbols-directory /path/to/breakpad_symbols /path/to/trace.json
```
4. Open the symbolicated trace in `chrome://tracing` (the Perfetto UI does not support memory dumps
yet)
5. Click on one of the `M` symbols
6. Click on a triple bar icon (e.g., in the `malloc` column)
<img src="../images/viewing-heap-dumps.png" alt="Screenshot showing how to view a heapdump in Chromium's tracing view" />
[trace viewer]: https://chromium.googlesource.com/catapult/+/HEAD/tracing/README.md


@@ -203,13 +203,22 @@ the one downloaded by `npm install`. Usage:
export ELECTRON_OVERRIDE_DIST_PATH=/Users/username/projects/electron/out/Testing
```
### `ELECTRON_SKIP_BINARY_DOWNLOAD`
### `ELECTRON_INSTALL_PLATFORM`
If you want to install your project's dependencies but don't need to use Electron functionality,
you can set the `ELECTRON_SKIP_BINARY_DOWNLOAD` environment variable to prevent the binary from being
downloaded. For instance, this feature can be useful in continuous integration environments when
running unit tests that mock out the `electron` module.
Manually overrides the platform used by the `electron` package during an install.
This can be useful if you are on one platform (e.g. macOS) but want to
download binaries for another platform (e.g. Windows or Linux). Usage:
```sh
ELECTRON_SKIP_BINARY_DOWNLOAD=1 npm install
ELECTRON_INSTALL_PLATFORM=darwin npm install
```
### `ELECTRON_INSTALL_ARCH`
Manually overrides the architecture used by the `electron` package during an install.
This can be useful if you are on one arch (e.g. `arm64`) but want to download
binaries meant for another arch. Note that this will not work under Rosetta. Usage:
```sh
ELECTRON_INSTALL_ARCH=arm64 npm install
```


@@ -46,7 +46,7 @@ this has the additional effect of removing the menu bar from the window.
> [!NOTE]
> The default menu will be created automatically if the app does not set one.
> It contains standard items such as `File`, `Edit`, `View`, `Window` and `Help`.
> It contains standard items such as `File`, `Edit`, `View`, and `Window`.
#### `Menu.getApplicationMenu()`


@@ -76,11 +76,51 @@ app.whenReady().then(() => {
})
```
#### `Notification.getHistory()` _macOS_
Returns `Promise<Notification[]>` - Resolves with an array of `Notification` objects representing all delivered notifications still present in Notification Center.
Each returned `Notification` is a live object connected to the corresponding delivered notification. Interaction events (`click`, `reply`, `action`, `close`) will fire on these objects when the user interacts with the notification in Notification Center. This is useful after an app restart to re-attach event handlers to notifications from a previous session.
The returned notifications have their `id`, `groupId`, `title`, `subtitle`, and `body` properties populated from information available in the Notification Center. Other properties (e.g., `actions`, `silent`, `icon`) are not available from delivered notifications and will have default values.
> [!NOTE]
> Like all macOS notification APIs, this method requires the application to be
> code-signed. In unsigned development builds, notifications are not delivered
> to Notification Center and this method will resolve with an empty array.
> [!NOTE]
> Unlike notifications created with `new Notification()`, notifications returned
> by `getHistory()` will remain visible in Notification Center when the object
> is garbage collected. Calling `show()` on a restored notification will remove
> the original from Notification Center and post a new one with the same
> properties.
```js
const { Notification, app } = require('electron')
app.whenReady().then(async () => {
// Restore notifications from a previous session
const notifications = await Notification.getHistory()
for (const n of notifications) {
console.log(`Found delivered notification: ${n.id} - ${n.title}`)
n.on('click', () => {
console.log(`User clicked: ${n.id}`)
})
n.on('reply', (event) => {
console.log(`User replied to ${n.id}: ${event.reply}`)
})
}
// Keep references so events continue to fire
})
```
### `new Notification([options])`
* `options` Object (optional)
* `id` string (optional) _macOS_ - A unique identifier for the notification, mapping to `UNNotificationRequest`'s [`identifier`](https://developer.apple.com/documentation/usernotifications/unnotificationrequest/identifier) property. Defaults to a random UUID if not provided or if an empty string is passed. This can be used to remove or update previously delivered notifications.
* `groupId` string (optional) _macOS_ - A string identifier used to visually group notifications together in Notification Center. Maps to `UNNotificationContent`'s [`threadIdentifier`](https://developer.apple.com/documentation/usernotifications/unnotificationcontent/threadidentifier) property.
* `id` string (optional) _macOS_ _Windows_ - A unique identifier for the notification. On macOS, maps to `UNNotificationRequest`'s [`identifier`](https://developer.apple.com/documentation/usernotifications/unnotificationrequest/identifier) property. On Windows, maps to the toast notification's [`Tag`](https://learn.microsoft.com/en-us/uwp/api/windows.ui.notifications.toastnotification.tag) property. Defaults to a random UUID if not provided or if an empty string is passed. This can be used to remove or update previously delivered notifications.
* `groupId` string (optional) _macOS_ _Windows_ - A string identifier used to visually group notifications together in Notification Center / Action Center. On macOS, maps to `UNNotificationContent`'s [`threadIdentifier`](https://developer.apple.com/documentation/usernotifications/unnotificationcontent/threadidentifier) property. On Windows, maps to the toast notification's [`Group`](https://learn.microsoft.com/en-us/uwp/api/windows.ui.notifications.toastnotification.group) property.
* `groupTitle` string (optional) _Windows_ - A title for the notification group header. When both `groupId` and `groupTitle` are specified, Windows will display a header above the notification that groups related notifications together. Maps to the toast notification's [`header`](https://learn.microsoft.com/en-us/windows/apps/design/shell/tiles-and-notifications/toast-headers) element.
* `title` string (optional) - A title for the notification, which will be displayed at the top of the notification window when it is shown.
* `subtitle` string (optional) _macOS_ - A subtitle for the notification, which will be displayed below the title.
* `body` string (optional) - The body text of the notification, which will be displayed below the title or subtitle.
@@ -291,6 +331,10 @@ call this method before the OS will display it.
If the notification has been shown before, this method will dismiss the previously
shown notification and create a new one with identical properties.
On macOS, calling `show()` on a notification returned by `Notification.getHistory()` will
remove the original notification from Notification Center and post a new one with the same
properties.
```js
const { Notification, app } = require('electron')
@@ -329,13 +373,17 @@ app.whenReady().then(() => {
### Instance Properties
#### `notification.id` _macOS_ _Readonly_
#### `notification.id` _macOS_ _Windows_ _Readonly_
A `string` property representing the unique identifier of the notification. This is set at construction time — either from the `id` option or as a generated UUID if none was provided.
#### `notification.groupId` _macOS_ _Readonly_
#### `notification.groupId` _macOS_ _Windows_ _Readonly_
A `string` property representing the group identifier of the notification. Notifications with the same `groupId` will be visually grouped together in Notification Center.
A `string` property representing the group identifier of the notification. Notifications with the same `groupId` will be visually grouped together in Notification Center (macOS) or Action Center (Windows).
#### `notification.groupTitle` _Windows_ _Readonly_
A `string` property representing the title of the notification group header.
#### `notification.title`

View File

@@ -650,7 +650,7 @@ Clears the sessions HTTP cache.
`scheme://host:port`.
* `storages` string[] (optional) - The types of storages to clear, can be
`cookies`, `filesystem`, `indexdb`, `localstorage`,
`shadercache`, `websql`, `serviceworkers`, `cachestorage`. If not
`shadercache`, `serviceworkers`, `cachestorage`. If not
specified, clear all storage types.
Returns `Promise<void>` - resolves when the storage data has been cleared.
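A minimal sketch of calling this API from the main process (the origin and the list of storage types are example values, not recommendations):

```js
const { session } = require('electron')

// Sketch: clear one origin's service workers and caches while leaving
// its cookies intact. 'https://example.com' is a placeholder origin.
async function clearSiteData () {
  await session.defaultSession.clearStorageData({
    origin: 'https://example.com',
    storages: ['serviceworkers', 'cachestorage']
  })
}
```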


@@ -0,0 +1,26 @@
# EnableHeapProfilingOptions Object
* `mode` string (optional) - Controls which processes are profiled. Equivalent to `--memlog` in
Chrome. Default is `all`.
* `all` - Profile all processes.
* `browser` - Profile only the browser process.
* `gpu` - Profile only the GPU process.
* `minimal` - Profile only the browser and GPU processes.
* `renderer-sampling` - Profile at most 1 renderer process. Each renderer process has a fixed
probability of being profiled when the renderer process is started or, for existing processes,
when heap profiling is enabled.
* `all-renderers` - Profile all renderer processes.
* `utility-sampling` - Each utility process has a fixed probability of being profiled.
* `all-utilities` - Profile all utility processes.
* `utility-and-browser` - Profile all utility processes and the browser process.
* `samplingRate` number (optional) - Controls the sampling interval in bytes. The lower the
interval, the more precise the profile, at the cost of performance. Default
is `100000` (100KB). That is enough to observe allocation sites that make allocations >500KB
in total, where the total equals a single allocation size times the number of such allocations at the
same call site. Equivalent to `--memlog-sampling-rate` in Chrome. Must be an integer between
`1000` and `10000000`.
* `stackMode` string (optional) - Controls the type of metadata recorded for each allocation.
Equivalent to `--memlog-stack-mode` in Chrome. Default is `native`.
* `native` - Instruction addresses from unwinding the stack.
* `native-with-thread-names` - Instruction addresses from unwinding the stack. Includes the thread
name as the first frame.
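Assuming the API described above, these options might be combined as follows (the values shown are illustrative only):

```js
const { contentTracing } = require('electron')

async function startProfiling () {
  // Example values: profile only the browser process, with a finer
  // 10 KB sampling interval, recording thread names in each stack.
  await contentTracing.enableHeapProfiling({
    mode: 'browser',
    samplingRate: 10000,
    stackMode: 'native-with-thread-names'
  })
}
```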


@@ -5,6 +5,7 @@
* `rgba` - 32bpp RGBA (byte-order), 1 plane.
* `rgbaf16` - Half float RGBA, 1 plane.
* `nv12` - 12bpp with Y plane followed by a 2x2 interleaved UV plane.
* `nv16` - 16bpp with Y plane followed by a 2x1 interleaved UV plane.
* `p010le` - 4:2:0 10-bit YUV (little-endian), Y plane followed by a 2x2 interleaved UV plane.
* `colorSpace` [ColorSpace](color-space.md) (optional) - The color space of the texture.
* `codedSize` [Size](size.md) - The full dimensions of the shared texture.


@@ -2275,6 +2275,20 @@ Returns `Integer` - The Chromium internal `pid` of the associated renderer. Can
be compared to the `frameProcessId` passed by frame specific navigation events
(e.g. `did-frame-navigate`)
#### `contents.clone()`
Returns `WebContents` - A cloned WebContents instance. This method creates a copy
of the WebContents with the following attributes:
* **WebPreferences** - All preferences from the original WebContents are copied
* **SiteInstance** - Uses the same SiteInstance as the original. This means the cloned WebContents will reuse the same render process as the original when loading same-origin pages, and only spawn a new render process for cross-origin navigations. This process allocation behavior is consistent with window.open and tab duplication in Chromium. For more details, see [Chromium's Site Isolation](https://www.chromium.org/developers/design-documents/site-isolation/) design document.
* **Opener relationship** - Inherits the opener (window.opener) relationship
* **Navigation state** - Copies the navigation history and controller state
The cloned WebContents is an independent instance with its own lifecycle; it can be destroyed separately and does not contain any open web pages.
This API is useful for use cases where you want to create a new WebContents that shares the same render process with the original for same-origin content, while maintaining full lifecycle independence. Additionally, reusing the existing render process can help optimize memory usage and page load speed to a certain extent, as it eliminates the overhead of spawning and initializing a new render process from scratch.
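As a sketch of the duplication use case described above (attaching the clone via the `webContents` constructor option is an assumption not covered by this text):

```js
const { BrowserWindow } = require('electron')

// Sketch: duplicate an existing window, tab-duplication style.
// Same-origin pages in the duplicate reuse the original's render process.
function duplicateWindow (win) {
  const cloned = win.webContents.clone()
  const dup = new BrowserWindow({ show: false, webContents: cloned })
  dup.loadURL(win.webContents.getURL())
  dup.once('ready-to-show', () => dup.show())
  return dup
}
```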
#### `contents.takeHeapSnapshot(filePath)`
* `filePath` string - Path to the output file.


@@ -33,10 +33,14 @@ because it is invoked in the main process.
Returns [`Window`](https://developer.mozilla.org/en-US/docs/Web/API/Window) | null
`features` is a comma-separated key-value list, following the standard format of
the browser. Electron will parse [`BrowserWindowConstructorOptions`](structures/browser-window-options.md) out of this
list where possible, for convenience. For full control and better ergonomics,
consider using `webContents.setWindowOpenHandler` to customize the
BrowserWindow creation.
the browser. For convenience, Electron will parse a subset of presentational
[`BrowserWindowConstructorOptions`](structures/browser-window-options.md) out of
this list (such as `width`, `height`, `x`, `y`, `show`, `frame`, `title`,
`backgroundColor`). Because the renderer is untrusted, options that cause the
main process to access the filesystem or that are otherwise privileged (such as
`icon`) are ignored. For full control and better ergonomics, use
`webContents.setWindowOpenHandler` to customize the BrowserWindow creation from
the main process.
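The key-value parsing described above can be sketched in plain JavaScript (a simplified illustration, not Electron's actual parser; real option handling is more involved):

```js
// Parse a comma-separated key-value features string into an options
// object: bare keys become true, 'true'/'false' become booleans,
// numeric strings become numbers, everything else stays a string.
function parseFeatures (features) {
  const options = {}
  for (const pair of features.split(',')) {
    const [key, value] = pair.split('=').map(s => s.trim())
    if (key === '') continue
    if (value === 'true' || value === undefined) options[key] = true
    else if (value === 'false') options[key] = false
    else if (value !== '' && !Number.isNaN(Number(value))) options[key] = Number(value)
    else options[key] = value
  }
  return options
}

parseFeatures('top=500,left=200,frame=false')
// => { top: 500, left: 200, frame: false }
```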
A subset of [`WebPreferences`](structures/web-preferences.md) can be set directly,
unnested, from the features string: `zoomFactor`, `nodeIntegration`, `javascript`,
@@ -56,9 +60,10 @@ window.open('https://github.com', '_blank', 'top=500,left=200,frame=false,nodeIn
enabled on the parent window.
* JavaScript will always be disabled in the opened `window` if it is disabled on
the parent window.
* Non-standard features (that are not handled by Chromium or Electron) given in
`features` will be passed to any registered `webContents`'s
`did-create-window` event handler in the `options` argument.
* Features that are not handled by Chromium and not in Electron's allowlist of
presentational `BrowserWindowConstructorOptions` are ignored. The raw
`features` string is still available to the main process via
`setWindowOpenHandler`.
* `frameName` follows the specification of `target` located in the [native documentation](https://developer.mozilla.org/en-US/docs/Web/API/Window/open#parameters).
* When opening `about:blank`, the child window's [`WebPreferences`](structures/web-preferences.md) will be copied
from the parent window, and there is no way to override it because Chromium


@@ -2,28 +2,53 @@
Electron frequently releases major versions alongside every other Chromium release.
This document focuses on the release cadence and version support policy.
For a more in-depth guide on our git branches and how Electron uses semantic versions,
check out our [Electron Versioning](./electron-versioning.md) doc.
> [!TIP]
> See the [Electron Versioning](./electron-versioning.md) document for more details
> on how Electron is versioned.
## Timeline
[Electron's Release Schedule](https://releases.electronjs.org/schedule) lists a schedule of Electron major releases showing key milestones including alpha, beta, and stable release dates, as well as end-of-life dates and dependency versions.
:::info Official support dates may change
> [!IMPORTANT]
> Electron's official support policy is the latest 3 stable releases. Our stable
> release and end-of-life dates are determined by Chromium, and may be subject to
> change. While we try to keep our planned release and end-of-life dates frequently
> updated here, future dates may change if affected by upstream scheduling changes,
> and may not always be accurately reflected.
>
> See [Chromium's public release schedule](https://chromiumdash.appspot.com/schedule) for
> definitive information about Chromium's scheduled release dates.
Electron's official support policy is the latest 3 stable releases. Our stable
release and end-of-life dates are determined by Chromium, and may be subject to
change. While we try to keep our planned release and end-of-life dates frequently
updated here, future dates may change if affected by upstream scheduling changes,
and may not always be accurately reflected.
Electron's cadence between major version releases is 8 weeks long. Before each major
version hits stable, it goes through a four-week **alpha** phase and a four-week
**beta** phase.
See [Chromium's public release schedule](https://chromiumdash.appspot.com/schedule) for
definitive information about Chromium's scheduled release dates.
:::
```mermaid
gantt
title Electron release cycle
dateFormat YYYY-MM-DD
axisFormat Week %W
todayMarker off
section v41
Alpha phase :a1, 2026-01-19, 4w
M146 enters Chrome beta :milestone, bm1, after a1, 0d
Beta phase :b1, after a1, 4w
M146 enters Chrome stable :milestone, s1, after b1, 0d
Supported until v44 release :active, after b1, 12w
section v42
Alpha phase :a2, after b1, 4w
M148 enters Chrome beta :milestone, bm2, after a2, 0d
Beta phase :b2, after a2, 4w
M148 enters Chrome stable :milestone, s2, after b2, 0d
Supported until v45 release :active, after b2, 4w
```
**Notes:**
* Alphas are generally less stable than beta releases. The cutoff between the two
corresponds to when the underlying Chromium version enters Chrome's Beta channel.
* The `-alpha.1`, `-beta.1`, and `stable` dates are our solid release dates.
* We strive for weekly alpha/beta releases, but we often release more than scheduled.
* All dates are our goals but there may be reasons for adjusting the stable deadline, such as security bugs.
@@ -38,10 +63,11 @@ and may not always be accurately reflected.
## Version support policy
The latest three _stable_ major versions are supported by the Electron team.
For example, if the latest release is 6.1.x, then the 5.0.x as well
as the 4.2.x series are supported. We only support the latest minor release
For example, if the latest release is 42.1.x, then the 41.0.x as well
as the 40.2.x series are supported. We only support the latest minor release
for each stable release series. This means that in the case of a security fix,
6.1.x will receive the fix, but we will not release a new version of 6.0.x.
42.1.x will receive the fix, but we will not release a new version of 42.0.x.
The latest stable release unilaterally receives all fixes from `main`,
and the version prior to that receives the vast majority of those fixes
@@ -50,11 +76,8 @@ only security fixes directly.
### Chromium version support
:::info Chromium release schedule
Chromium's public release schedule is [here](https://chromiumdash.appspot.com/schedule).
:::
> [!TIP]
> Chromium's public release schedule is [here](https://chromiumdash.appspot.com/schedule).
Electron targets Chromium even-number versions, releasing every 8 weeks in concert
with Chromium's 4-week release schedule. For example, Electron 26 uses Chromium 116, while Electron 27 uses Chromium 118.
@@ -82,3 +105,7 @@ and that number is reduced to two in major version 10, the three-argument versio
continue to work until, at minimum, major version 12. Past the minimum two-version
threshold, we will attempt to support backwards compatibility beyond two versions
until the maintainers feel the maintenance burden is too high to continue doing so.
> [!TIP]
> For a canonical list of breaking changes, see the [Breaking Changes](../breaking-changes.md)
> document.


@@ -14,18 +14,6 @@ To update an existing project to use the latest stable version:
npm install --save-dev electron@latest
```
## Versioning scheme
There are several major changes from our 1.x strategy outlined below. Each change is intended to satisfy the needs and priorities of developers/maintainers and app developers.
1. Strict use of the [SemVer](#semver) spec
2. Introduction of semver-compliant `-beta` tags
3. Introduction of [conventional commit messages](https://conventionalcommits.org/)
4. Well-defined stabilization branches
5. The `main` branch is versionless; only stabilization branches contain version information
We will cover in detail how git branching works, how npm tagging works, what developers should expect to see, and how one can backport changes.
## SemVer
Below is a table explicitly mapping types of changes to their corresponding category of SemVer (e.g. Major, Minor, Patch).
@@ -34,7 +22,7 @@ Below is a table explicitly mapping types of changes to their corresponding cate
| ------------------------------- | ---------------------------------- | ----------------------------- |
| Electron breaking API changes | Electron non-breaking API changes | Electron bug fixes |
| Node.js major version updates | Node.js minor version updates | Node.js patch version updates |
| Chromium version updates | | fix-related chromium patches |
| Chromium version updates | | fix-related Chromium patches |
For more information, see the [Semantic Versioning 2.0.0](https://semver.org/) spec.
@@ -44,68 +32,189 @@ Note that most Chromium updates will be considered breaking. Fixes that can be b
Stabilization branches are branches that run parallel to `main`, taking in only cherry-picked commits that are related to security or stability. These branches are never merged back to `main`.
![Stabilization Branches](../images/versioning-sketch-1.png)
Since Electron 8, stabilization branches are always **major** version lines, and named against the following template `$MAJOR-x-y` e.g. `8-x-y`. Prior to that we used **minor** version lines and named them as `$MAJOR-$MINOR-x` e.g. `2-0-x`.
We allow for multiple stabilization branches to exist simultaneously, one for each supported version. For more details on which versions are supported, see our [Electron Releases](./electron-timelines.md) doc.
![Multiple Stability Branches](../images/versioning-sketch-2.png)
Older lines will not be supported by the Electron project, but other groups can take ownership and backport stability and security fixes on their own. We discourage this, but recognize that it makes life easier for many app developers.
## Beta releases and bug fixes
Developers want to know which releases are _safe_ to use. Even seemingly innocent features can introduce regressions in complex applications. At the same time, locking to a fixed version is dangerous because you're ignoring security patches and bug fixes that may have come out since your version. Our goal is to allow the following standard semver ranges in `package.json`:
* Use `~2.0.0` to admit only stability or security related fixes to your `2.0.0` release.
* Use `^2.0.0` to admit non-breaking _reasonably stable_ feature work as well as security and bug fixes.
What's important about the second point is that apps using `^` should still be able to expect a reasonable level of stability. To accomplish this, SemVer allows for a _pre-release identifier_ to indicate a particular version is not yet _safe_ or _stable_.
Whatever you choose, you will periodically have to bump the version in your `package.json` as breaking changes are a fact of Chromium life.
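The two range operators can be illustrated with a toy matcher (prerelease tags are ignored here; use the `semver` package for real range matching):

```js
// Toy illustration of '~' (patch-only) vs '^' (minor + patch) ranges.
// Not a full semver implementation.
function satisfies (version, range) {
  const [vMaj, vMin, vPat] = version.split('.').map(Number)
  const op = range[0]
  const [rMaj, rMin, rPat] = range.slice(1).split('.').map(Number)
  if (op === '~') {
    // ~2.0.0: same major.minor, patch may move forward
    return vMaj === rMaj && vMin === rMin && vPat >= rPat
  }
  if (op === '^') {
    // ^2.0.0: same major, minor and patch may move forward
    if (vMaj !== rMaj) return false
    return vMin > rMin || (vMin === rMin && vPat >= rPat)
  }
  return version === range
}

satisfies('2.0.5', '~2.0.0') // true: a patch-level fix is admitted
satisfies('2.1.0', '~2.0.0') // false: new feature work is excluded
satisfies('2.1.0', '^2.0.0') // true: non-breaking feature work is admitted
satisfies('3.0.0', '^2.0.0') // false: a breaking major is excluded
```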
The process is as follows:
1. All new major and minor releases lines begin with a beta series indicated by SemVer prerelease tags of `beta.N`, e.g. `2.0.0-beta.1`. After the first beta, subsequent beta releases must meet all of the following conditions:
1. The change is backwards API-compatible (deprecations are allowed)
2. The risk to meeting our stability timeline must be low.
2. If allowed changes need to be made once a release is beta, they are applied and the prerelease tag is incremented, e.g. `2.0.0-beta.2`.
3. If a particular beta release is _generally regarded_ as stable, it will be re-released as a stable build, changing only the version information. e.g. `2.0.0`. After the first stable, all changes must be backwards-compatible bug or security fixes.
4. If future bug fixes or security patches need to be made once a release is stable, they are applied and the _patch_ version is incremented, e.g. `2.0.1`.
Specifically, the above means:
1. Admitting non-breaking-API changes before Week 3 in the beta cycle is okay, even if those changes have the potential to cause moderate side-effects.
2. Admitting feature-flagged changes that do not otherwise alter existing code paths is okay at most points in the beta cycle. Users can explicitly enable those flags in their apps.
3. Admitting features of any sort after Week 3 in the beta cycle is 👎 without a very good reason.
For each major and minor bump, you should expect to see something like the following:
```plaintext
2.0.0-beta.1
2.0.0-beta.2
2.0.0-beta.3
2.0.0
2.0.1
2.0.2
```

```mermaid
gitGraph
commit
commit
branch N-x-y
checkout main
commit id:"fix-1"
checkout N-x-y
cherry-pick id:"fix-1"
checkout main
commit id:"fix-2"
checkout N-x-y
cherry-pick id:"fix-2"
checkout main
commit
commit
```
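The sequence above follows SemVer precedence: every `2.0.0-beta.N` sorts before `2.0.0`, which in turn sorts before `2.0.1`. A minimal comparator sketch (assuming numeric `beta.N`-style prerelease identifiers; the full spec also covers alphanumeric identifiers):

```javascript
// Compare two SemVer versions, enough to order a release line's lifecycle.
function compareSemver (a, b) {
  const parse = (v) => {
    const [core, pre] = v.split('-');
    return { core: core.split('.').map(Number), pre };
  };
  const pa = parse(a);
  const pb = parse(b);
  // Compare major.minor.patch first.
  for (let i = 0; i < 3; i++) {
    if (pa.core[i] !== pb.core[i]) return pa.core[i] - pb.core[i];
  }
  // A prerelease sorts *before* its corresponding stable release.
  if (pa.pre && !pb.pre) return -1;
  if (!pa.pre && pb.pre) return 1;
  if (!pa.pre) return 0;
  // Compare the numeric suffix: beta.1 < beta.2 < beta.10.
  return Number(pa.pre.split('.')[1] ?? 0) - Number(pb.pre.split('.')[1] ?? 0);
}

const lifecycle = ['2.0.1', '2.0.0-beta.2', '2.0.0', '2.0.0-beta.1'];
console.log(lifecycle.sort(compareSemver));
// ['2.0.0-beta.1', '2.0.0-beta.2', '2.0.0', '2.0.1']
```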
Since Electron 8, stabilization branches are always **major** version lines, named using the template `$MAJOR-x-y`, e.g. `8-x-y`. (Prior to that, we used **minor** version lines, named `$MAJOR-$MINOR-x`, e.g. `2-0-x`.)
An example lifecycle in pictures:
* A new release branch is created that includes the latest set of features. It is published as `2.0.0-beta.1`.
![New Release Branch](../images/versioning-sketch-3.png)
* A bug fix comes into master that can be backported to the release branch. The patch is applied, and a new beta is published as `2.0.0-beta.2`.
![Bugfix Backport to Beta](../images/versioning-sketch-4.png)
* The beta is considered _generally stable_ and it is published again as a non-beta under `2.0.0`.
![Beta to Stable](../images/versioning-sketch-5.png)
* Later, a zero-day exploit is revealed and a fix is applied to master. We backport the fix to the `2-0-x` line and release `2.0.1`.
![Security Backports](../images/versioning-sketch-6.png)
We allow for multiple stabilization branches to exist simultaneously, one for each supported version.
A few examples of how various SemVer ranges will pick up new releases:
> [!TIP]
> For more details on which versions are supported, see our [Electron Releases](./electron-timelines.md) doc.
![Semvers and Releases](../images/versioning-sketch-7.png)
```mermaid
gitGraph
commit
branch "41-x-y"
checkout main
commit
commit
commit id:"fix-a"
checkout "41-x-y"
cherry-pick id:"fix-a"
checkout main
commit
commit id:"fix-b"
checkout "41-x-y"
cherry-pick id:"fix-b"
checkout main
commit
branch "42-x-y"
checkout main
commit
commit id:"fix-c"
checkout "41-x-y"
cherry-pick id:"fix-c"
checkout "42-x-y"
cherry-pick id:"fix-c"
checkout main
commit
commit id:"fix-d"
checkout "41-x-y"
cherry-pick id:"fix-d"
checkout "42-x-y"
cherry-pick id:"fix-d"
checkout main
commit
```
Older lines will not be supported by the Electron project.
## Release cycle
Electron follows an **8-week regular release cycle** where key milestones correspond to
matching dates in the Chromium release cycle.
```mermaid
gantt
title Electron release cycle
dateFormat YYYY-MM-DD
axisFormat Week %W
todayMarker off
section v41
Alpha phase :a1, 2026-01-19, 4w
M146 enters Chrome beta :milestone, bm1, after a1, 0d
Beta phase :b1, after a1, 4w
M146 enters Chrome stable :milestone, s1, after b1, 0d
Supported until v44 release :active, after b1, 12w
section v42
Alpha phase :a2, after b1, 4w
M148 enters Chrome beta :milestone, bm2, after a2, 0d
Beta phase :b2, after a2, 4w
M148 enters Chrome stable :milestone, s2, after b2, 0d
Supported until v45 release :active, after b2, 4w
```
### Example
When Electron 41 hits its stable release, the release line for Electron 42 is branched off of `main`.
Its first alpha release is created with all the changes contained on `main`:
```mermaid
gitGraph
commit
commit
commit
branch "42-x-y"
checkout "42-x-y"
commit tag:"v42.0.0-alpha.1"
```
A bug fix comes into `main` that can be backported to the release branch. The patch is applied,
and it is published in the next `v42.0.0-alpha.2` release.
```mermaid
gitGraph
commit
commit
commit
branch "42-x-y"
checkout "42-x-y"
commit id:"42.0.0-alpha.1" tag:"v42.0.0-alpha.1"
checkout "main"
commit
commit id:"fix-1"
checkout "42-x-y"
cherry-pick id:"fix-1" tag:"v42.0.0-alpha.2"
```
The version of Chromium that powers Electron 42 hits Chrome's beta channel. The `alpha` line is
promoted to `beta`.
```mermaid
gitGraph
commit
commit
commit
branch "42-x-y"
checkout "42-x-y"
commit id:"42.0.0-alpha.1" tag:"v42.0.0-alpha.1"
checkout "main"
commit
commit id:"fix-1"
checkout "42-x-y"
cherry-pick id:"fix-1" tag:"v42.0.0-alpha.2"
checkout "main"
commit
commit
commit id:"fix-2"
checkout "42-x-y"
cherry-pick id:"fix-2" tag:"v42.0.0-beta.1"
```
Beta releases continue weekly until Electron 42 is promoted to stable and the same cycle starts again
with `43-x-y`. Later, a zero-day exploit is revealed and a fix is applied to `main`. We backport the
fix to the `42-x-y` line and release `42.0.1`.
```mermaid
gitGraph
commit
commit
commit
branch "42-x-y"
checkout "42-x-y"
commit id:"42.0.0-alpha.1" tag:"v42.0.0-alpha.1"
checkout "main"
commit
commit id:"fix-1"
checkout "42-x-y"
cherry-pick id:"fix-1" tag:"v42.0.0-alpha.2"
checkout "main"
commit
commit
commit id:"fix-2"
checkout "42-x-y"
cherry-pick id:"fix-2" tag:"v42.0.0-beta.1"
checkout "main"
commit id:"fix-3"
checkout "42-x-y"
cherry-pick id:"fix-3" tag:"v42.0.0"
checkout "main"
branch "43-x-y"
checkout "43-x-y"
commit id:"43.0.0-alpha.1" tag:"v43.0.0-alpha.1"
checkout "main"
commit id:"security-fix"
checkout "42-x-y"
cherry-pick id:"security-fix" tag:"v42.0.1"
checkout "43-x-y"
cherry-pick id:"security-fix" tag:"v43.0.0-alpha.2"
```
### Backport request process
@@ -136,10 +245,11 @@ The `electron/electron` repository also enforces squash merging, so you only nee
## Versioned `main` branch
* The `main` branch will always contain the next major version `X.0.0-nightly.DATE` in its `package.json`.
* The `main` branch always corresponds to the major version above the current pre-release line.
* Unstable nightly releases of `main` are released under the [`electron-nightly`](https://www.npmjs.com/package/electron-nightly)
package on npm.
* Release branches are never merged back to `main`.
* Release branches _do_ contain the correct version in their `package.json`.
* As soon as a release branch is cut for a major, `main` must be bumped to the next major (i.e. `main` is always versioned as the next theoretical release branch).
* All `package.json` values are fixed at `0.0.0-development`.
## Historical versioning (Electron 1.X)
@@ -147,6 +257,29 @@ Electron versions _< 2.0_ did not conform to the [SemVer](https://semver.org) sp
Here is an example of the 1.x strategy:
![1.x Versioning](../images/versioning-sketch-0.png)
```mermaid
---
config:
gitGraph:
mainBranchName: 'master'
---
gitGraph
commit
branch "bugfix-1"
checkout "bugfix-1"
commit
checkout master
merge "bugfix-1" tag:"1.8.1"
branch "feature"
checkout "feature"
commit
checkout master
merge "feature" tag:"1.8.2"
branch "bugfix-2"
checkout "bugfix-2"
commit
checkout master
merge "bugfix-2" tag:"1.8.3"
```
An app developed with `1.8.1` cannot take the `1.8.3` bug fix without either absorbing the `1.8.2` feature or backporting the fix and maintaining a new release line.


@@ -25,6 +25,27 @@ included in the `electron` package:
npx install-electron --no
```
## Installing prereleases
Electron [distributes experimental releases of future major versions](./electron-timelines.md)
via npm as well.
Nightly builds contain the latest changes from the `main` branch:
```sh
npm install electron-nightly --save-dev
```
Alpha and beta builds contain changes slated for the next major version:
```sh
npm install electron@alpha --save-dev
npm install electron@beta --save-dev
```
> [!TIP]
> For more information on available Electron releases, see the [Release Status dashboard](https://releases.electronjs.org).
## Running Electron ad-hoc
If you're in a pinch and would prefer to not use `npm install` in your local


@@ -91,6 +91,7 @@ auto_filenames = {
"docs/api/structures/custom-scheme.md",
"docs/api/structures/desktop-capturer-source.md",
"docs/api/structures/display.md",
"docs/api/structures/enable-heap-profiling-options.md",
"docs/api/structures/extension-info.md",
"docs/api/structures/extension.md",
"docs/api/structures/file-filter.md",


@@ -432,6 +432,10 @@ filenames = {
"shell/browser/media/media_capture_devices_dispatcher.h",
"shell/browser/media/media_device_id_salt.cc",
"shell/browser/media/media_device_id_salt.h",
"shell/browser/metrics/electron_metrics_log_uploader.cc",
"shell/browser/metrics/electron_metrics_log_uploader.h",
"shell/browser/metrics/electron_metrics_service_client.cc",
"shell/browser/metrics/electron_metrics_service_client.h",
"shell/browser/microtasks_runner.cc",
"shell/browser/microtasks_runner.h",
"shell/browser/native_window.cc",
@@ -508,6 +512,10 @@ filenames = {
"shell/browser/session_preferences.h",
"shell/browser/special_storage_policy.cc",
"shell/browser/special_storage_policy.h",
"shell/browser/tracing/electron_background_tracing_metrics_provider.cc",
"shell/browser/tracing/electron_background_tracing_metrics_provider.h",
"shell/browser/tracing/electron_tracing_delegate.cc",
"shell/browser/tracing/electron_tracing_delegate.h",
"shell/browser/ui/accelerator_util.cc",
"shell/browser/ui/accelerator_util.h",
"shell/browser/ui/autofill_popup.cc",
@@ -680,6 +688,7 @@ filenames = {
"shell/common/gin_helper/wrappable.cc",
"shell/common/gin_helper/wrappable.h",
"shell/common/gin_helper/wrappable_base.h",
"shell/common/gin_helper/wrappable_pointer_tags.h",
"shell/common/heap_snapshot.cc",
"shell/common/heap_snapshot.h",
"shell/common/key_weak_map.h",
@@ -734,6 +743,8 @@ filenames = {
"shell/renderer/electron_sandboxed_renderer_client.h",
"shell/renderer/electron_smooth_round_rect.cc",
"shell/renderer/electron_smooth_round_rect.h",
"shell/renderer/oom_stack_trace.cc",
"shell/renderer/oom_stack_trace.h",
"shell/renderer/preload_realm_context.cc",
"shell/renderer/preload_realm_context.h",
"shell/renderer/preload_utils.cc",
@@ -781,6 +792,8 @@ filenames = {
"shell/browser/extensions/electron_extension_system_factory.h",
"shell/browser/extensions/electron_extension_system.cc",
"shell/browser/extensions/electron_extension_system.h",
"shell/browser/extensions/electron_extension_tab_util.cc",
"shell/browser/extensions/electron_extension_tab_util.h",
"shell/browser/extensions/electron_extension_web_contents_observer.cc",
"shell/browser/extensions/electron_extension_web_contents_observer.h",
"shell/browser/extensions/electron_extensions_api_client.cc",


@@ -41,7 +41,8 @@ Object.assign(app, {
commandLine: {
hasSwitch: (theSwitch: string) => commandLine.hasSwitch(String(theSwitch)),
getSwitchValue: (theSwitch: string) => commandLine.getSwitchValue(String(theSwitch)),
appendSwitch: (theSwitch: string, value?: string) => commandLine.appendSwitch(String(theSwitch), typeof value === 'undefined' ? value : String(value)),
appendSwitch: (theSwitch: string, value?: string) =>
commandLine.appendSwitch(String(theSwitch), typeof value === 'undefined' ? value : String(value)),
appendArgument: (arg: string) => commandLine.appendArgument(String(arg)),
removeSwitch: (theSwitch: string) => commandLine.removeSwitch(String(theSwitch))
} as Electron.CommandLine
@@ -50,10 +51,10 @@ Object.assign(app, {
// we define this here because it'd be overly complicated to
// do in native land
Object.defineProperty(app, 'applicationMenu', {
get () {
get() {
return Menu.getApplicationMenu();
},
set (menu: Electron.Menu | null) {
set(menu: Electron.Menu | null) {
return Menu.setApplicationMenu(menu);
}
});
@@ -116,17 +117,22 @@ for (const name of events) {
}
app._clientCertRequestPasswordHandler = null;
app.setClientCertRequestPasswordHandler = function (handler: (params: Electron.ClientCertRequestParams) => Promise<string>) {
app.setClientCertRequestPasswordHandler = function (
handler: (params: Electron.ClientCertRequestParams) => Promise<string>
) {
app._clientCertRequestPasswordHandler = handler;
};
app.on('-client-certificate-request-password', async (event: Electron.Event<Electron.ClientCertRequestParams>, callback: (password: string) => void) => {
event.preventDefault();
const { hostname, tokenName, isRetry } = event;
if (!app._clientCertRequestPasswordHandler) {
callback('');
return;
app.on(
'-client-certificate-request-password',
async (event: Electron.Event<Electron.ClientCertRequestParams>, callback: (password: string) => void) => {
event.preventDefault();
const { hostname, tokenName, isRetry } = event;
if (!app._clientCertRequestPasswordHandler) {
callback('');
return;
}
const password = await app._clientCertRequestPasswordHandler({ hostname, tokenName, isRetry });
callback(password);
}
const password = await app._clientCertRequestPasswordHandler({ hostname, tokenName, isRetry });
callback(password);
});
);


@@ -135,7 +135,7 @@ class AutoUpdater extends EventEmitter implements Electron.AutoUpdater {
allowAnyVersion: boolean = false;
// Private: Validate that the URL points to an MSIX file (following redirects)
private async validateMsixUrl (url: string): Promise<void> {
private async validateMsixUrl(url: string): Promise<void> {
try {
// Make a HEAD request to follow redirects and get the final URL
const response = await net.fetch(url, {
@@ -153,7 +153,9 @@ class AutoUpdater extends EventEmitter implements Electron.AutoUpdater {
const hasMsixExtension = pathname.endsWith('.msix') || pathname.endsWith('.msixbundle');
if (!hasMsixExtension) {
throw new Error(`Update URL does not point to an MSIX file. Expected .msix or .msixbundle extension, got final URL: ${finalUrl}`);
throw new Error(
`Update URL does not point to an MSIX file. Expected .msix or .msixbundle extension, got final URL: ${finalUrl}`
);
}
} catch (error) {
if (error instanceof TypeError) {
@@ -164,7 +166,7 @@ class AutoUpdater extends EventEmitter implements Electron.AutoUpdater {
}
// Private: Check if URL is a direct MSIX file (following redirects)
private async isDirectMsixUrl (url: string, emitError: boolean = false): Promise<boolean> {
private async isDirectMsixUrl(url: string, emitError: boolean = false): Promise<boolean> {
try {
await this.validateMsixUrl(url);
return true;
@@ -178,12 +180,12 @@ class AutoUpdater extends EventEmitter implements Electron.AutoUpdater {
// Supports both versioning (x.y.z) and Windows version format (x.y.z.a)
// Returns: 1 if v1 > v2, -1 if v1 < v2, 0 if v1 === v2
private compareVersions (v1: string, v2: string): number {
const parts1 = v1.split('.').map(part => {
private compareVersions(v1: string, v2: string): number {
const parts1 = v1.split('.').map((part) => {
const parsed = parseInt(part, 10);
return isNaN(parsed) ? 0 : parsed;
});
const parts2 = v2.split('.').map(part => {
const parts2 = v2.split('.').map((part) => {
const parsed = parseInt(part, 10);
return isNaN(parsed) ? 0 : parsed;
});
@@ -203,9 +205,12 @@ class AutoUpdater extends EventEmitter implements Electron.AutoUpdater {
// Private: Parse the static releases array format
// This is a static JSON file containing all releases
private parseStaticReleasFile (json: any, currentVersion: string): { ok: boolean; available: boolean; url?: string; name?: string; notes?: string; pub_date?: string } {
private parseStaticReleasFile(
json: any,
currentVersion: string
): { ok: boolean; available: boolean; url?: string; name?: string; notes?: string; pub_date?: string } {
if (!Array.isArray(json.releases) || !json.currentRelease || typeof json.currentRelease !== 'string') {
this.emitError(new Error('Invalid releases format. Expected \'releases\' array and \'currentRelease\' string.'));
this.emitError(new Error("Invalid releases format. Expected 'releases' array and 'currentRelease' string."));
return { ok: false, available: false };
}
@@ -234,14 +239,18 @@ class AutoUpdater extends EventEmitter implements Electron.AutoUpdater {
const releaseEntry = json.releases.find((r: any) => r.version === currentReleaseVersion);
if (!releaseEntry || !releaseEntry.updateTo) {
this.emitError(new Error(`Release entry for version '${currentReleaseVersion}' not found or missing 'updateTo' property.`));
this.emitError(
new Error(`Release entry for version '${currentReleaseVersion}' not found or missing 'updateTo' property.`)
);
return { ok: false, available: false };
}
const updateTo = releaseEntry.updateTo;
if (!updateTo.url) {
this.emitError(new Error(`Invalid release entry. 'updateTo.url' is missing for version ${currentReleaseVersion}.`));
this.emitError(
new Error(`Invalid release entry. 'updateTo.url' is missing for version ${currentReleaseVersion}.`)
);
return { ok: false, available: false };
}
@@ -255,15 +264,22 @@ class AutoUpdater extends EventEmitter implements Electron.AutoUpdater {
};
}
private parseDynamicReleasFile (json: any): { ok: boolean; available: boolean; url?: string; name?: string; notes?: string; pub_date?: string } {
private parseDynamicReleasFile(json: any): {
ok: boolean;
available: boolean;
url?: string;
name?: string;
notes?: string;
pub_date?: string;
} {
if (!json.url) {
this.emitError(new Error('Invalid releases format. Expected \'url\' string property.'));
this.emitError(new Error("Invalid releases format. Expected 'url' string property."));
return { ok: false, available: false };
}
return { ok: true, available: true, url: json.url, name: json.name, notes: json.notes, pub_date: json.pub_date };
}
private async fetchSquirrelJson (url: string) {
private async fetchSquirrelJson(url: string) {
const headers: Record<string, string> = {
...this.updateHeaders,
Accept: 'application/json' // Always set Accept header, overriding any user-provided Accept
@@ -299,8 +315,8 @@ class AutoUpdater extends EventEmitter implements Electron.AutoUpdater {
}
}
private async getUpdateInfo (url: string): Promise<UpdateInfo> {
if (url && await this.isDirectMsixUrl(url)) {
private async getUpdateInfo(url: string): Promise<UpdateInfo> {
if (url && (await this.isDirectMsixUrl(url))) {
return { ok: true, available: true, updateUrl: url, releaseDate: new Date() };
} else {
const updateJson = await this.fetchSquirrelJson(url);
@@ -321,7 +337,7 @@ class AutoUpdater extends EventEmitter implements Electron.AutoUpdater {
const releaseName = updateJson.name ?? '';
releaseDate = releaseDate ?? new Date();
if (!await this.isDirectMsixUrl(updateUrl, true)) {
if (!(await this.isDirectMsixUrl(updateUrl, true))) {
return { ok: false };
} else {
return {
@@ -337,11 +353,11 @@ class AutoUpdater extends EventEmitter implements Electron.AutoUpdater {
}
}
getFeedURL () {
getFeedURL() {
return this.updateURL ?? '';
}
setFeedURL (options: { url: string; headers?: Record<string, string>; allowAnyVersion?: boolean } | string) {
setFeedURL(options: { url: string; headers?: Record<string, string>; allowAnyVersion?: boolean } | string) {
let updateURL: string;
let headers: Record<string, string> | undefined;
let allowAnyVersion: boolean | undefined;
@@ -351,23 +367,23 @@ class AutoUpdater extends EventEmitter implements Electron.AutoUpdater {
headers = options.headers;
allowAnyVersion = options.allowAnyVersion;
} else {
throw new TypeError('Expected options object to contain a \'url\' string property in setFeedUrl call');
throw new TypeError("Expected options object to contain a 'url' string property in setFeedUrl call");
}
} else if (typeof options === 'string') {
updateURL = options;
} else {
throw new TypeError('Expected an options object with a \'url\' property to be provided');
throw new TypeError("Expected an options object with a 'url' property to be provided");
}
this.updateURL = updateURL;
this.updateHeaders = headers ?? null;
this.allowAnyVersion = allowAnyVersion ?? false;
}
getPackageInfo (): MSIXPackageInfo {
getPackageInfo(): MSIXPackageInfo {
return msixUpdate.getPackageInfo() as MSIXPackageInfo;
}
async checkForUpdates () {
async checkForUpdates() {
const url = this.updateURL;
if (!url) {
return this.emitError(new Error('Update URL is not set'));
@@ -382,7 +398,11 @@ class AutoUpdater extends EventEmitter implements Electron.AutoUpdater {
// If appInstallerUri is set, Windows App Installer manages updates automatically
// Prevent updates here to avoid conflicts
if (packageInfo.appInstallerUri) {
return this.emitError(new Error('Auto-updates are managed by Windows App Installer. Updates are not allowed when installed via Application Manifest.'));
return this.emitError(
new Error(
'Auto-updates are managed by Windows App Installer. Updates are not allowed when installed via Application Manifest.'
)
);
}
this.emit('checking-for-update');
@@ -405,18 +425,26 @@ class AutoUpdater extends EventEmitter implements Electron.AutoUpdater {
forceUpdateFromAnyVersion: this.allowAnyVersion
} as UpdateMsixOptions);
this.emit('update-downloaded', {}, msixUrlInfo.releaseNotes, msixUrlInfo.releaseName, msixUrlInfo.releaseDate, msixUrlInfo.updateUrl, () => {
this.quitAndInstall();
});
this.emit(
'update-downloaded',
{},
msixUrlInfo.releaseNotes,
msixUrlInfo.releaseName,
msixUrlInfo.releaseDate,
msixUrlInfo.updateUrl,
() => {
this.quitAndInstall();
}
);
}
} catch (error) {
this.emitError(error as Error);
}
}
async quitAndInstall () {
async quitAndInstall() {
if (!this.updateAvailable) {
this.emitError(new Error('No update available, can\'t quit and install'));
this.emitError(new Error("No update available, can't quit and install"));
app.quit();
return;
}
@@ -441,7 +469,7 @@ class AutoUpdater extends EventEmitter implements Electron.AutoUpdater {
// Private: Emit both error object and message, this is to keep compatibility
// with Old APIs.
emitError (error: Error) {
emitError(error: Error) {
this.emit('error', error, error.message);
}
}


@@ -8,40 +8,40 @@ class AutoUpdater extends EventEmitter implements Electron.AutoUpdater {
updateAvailable: boolean = false;
updateURL: string | null = null;
quitAndInstall () {
quitAndInstall() {
if (!this.updateAvailable) {
return this.emitError(new Error('No update available, can\'t quit and install'));
return this.emitError(new Error("No update available, can't quit and install"));
}
squirrelUpdate.processStart();
app.quit();
}
getFeedURL () {
getFeedURL() {
return this.updateURL ?? '';
}
getPackageInfo () {
getPackageInfo() {
// Squirrel-based Windows apps don't have MSIX package information
return undefined;
}
setFeedURL (options: { url: string } | string) {
setFeedURL(options: { url: string } | string) {
let updateURL: string;
if (typeof options === 'object') {
if (typeof options.url === 'string') {
updateURL = options.url;
} else {
throw new TypeError('Expected options object to contain a \'url\' string property in setFeedUrl call');
throw new TypeError("Expected options object to contain a 'url' string property in setFeedUrl call");
}
} else if (typeof options === 'string') {
updateURL = options;
} else {
throw new TypeError('Expected an options object with a \'url\' property to be provided');
throw new TypeError("Expected an options object with a 'url' property to be provided");
}
this.updateURL = updateURL;
}
async checkForUpdates () {
async checkForUpdates() {
const url = this.updateURL;
if (!url) {
return this.emitError(new Error('Update URL is not set'));
@@ -72,7 +72,7 @@ class AutoUpdater extends EventEmitter implements Electron.AutoUpdater {
// Private: Emit both error object and message, this is to keep compatibility
// with Old APIs.
emitError (error: Error) {
emitError(error: Error) {
this.emit('error', error, error.message);
}
}


@@ -1,4 +1,5 @@
const { updateMsix, registerPackage, registerRestartOnUpdate, getPackageInfo } =
process._linkedBinding('electron_browser_msix_updater');
const { updateMsix, registerPackage, registerRestartOnUpdate, getPackageInfo } = process._linkedBinding(
'electron_browser_msix_updater'
);
export { updateMsix, registerPackage, registerRestartOnUpdate, getPackageInfo };


@@ -34,8 +34,12 @@ const spawnUpdate = async function (args: string[], options: { detached: boolean
let stdout = '';
let stderr = '';
spawnedProcess.stdout.on('data', (data) => { stdout += data; });
spawnedProcess.stderr.on('data', (data) => { stderr += data; });
spawnedProcess.stdout.on('data', (data) => {
stdout += data;
});
spawnedProcess.stderr.on('data', (data) => {
stderr += data;
});
spawnedProcess.on('error', (error) => {
spawnedProcess = undefined;
@@ -59,12 +63,12 @@ const spawnUpdate = async function (args: string[], options: { detached: boolean
};
// Start an instance of the installed app.
export function processStart () {
export function processStart() {
spawnUpdate(['--processStartAndWait', exeName], { detached: true });
}
// Download the releases specified by the URL and write new results to stdout.
export async function checkForUpdate (updateURL: string): Promise<any> {
export async function checkForUpdate(updateURL: string): Promise<any> {
const stdout = await spawnUpdate(['--checkForUpdate', updateURL], { detached: false });
try {
// Last line of output is the JSON details about the releases
@@ -76,12 +80,12 @@ export async function checkForUpdate (updateURL: string): Promise<any> {
}
// Update the application to the latest remote version specified by URL.
export async function update (updateURL: string): Promise<void> {
export async function update(updateURL: string): Promise<void> {
await spawnUpdate(['--update', updateURL], { detached: false });
}
// Is the Update.exe installed with the current application?
export function supported () {
export function supported() {
try {
fs.accessSync(updateExe, fs.constants.R_OK);
return true;


@@ -25,88 +25,156 @@ BaseWindow.prototype.setTouchBar = function (touchBar) {
// Properties
Object.defineProperty(BaseWindow.prototype, 'autoHideMenuBar', {
get: function () { return this.isMenuBarAutoHide(); },
set: function (autoHide) { this.setAutoHideMenuBar(autoHide); }
get: function () {
return this.isMenuBarAutoHide();
},
set: function (autoHide) {
this.setAutoHideMenuBar(autoHide);
}
});
Object.defineProperty(BaseWindow.prototype, 'visibleOnAllWorkspaces', {
get: function () { return this.isVisibleOnAllWorkspaces(); },
set: function (visible) { this.setVisibleOnAllWorkspaces(visible); }
get: function () {
return this.isVisibleOnAllWorkspaces();
},
set: function (visible) {
this.setVisibleOnAllWorkspaces(visible);
}
});
Object.defineProperty(BaseWindow.prototype, 'fullScreen', {
get: function () { return this.isFullScreen(); },
set: function (full) { this.setFullScreen(full); }
get: function () {
return this.isFullScreen();
},
set: function (full) {
this.setFullScreen(full);
}
});
Object.defineProperty(BaseWindow.prototype, 'simpleFullScreen', {
get: function () { return this.isSimpleFullScreen(); },
set: function (simple) { this.setSimpleFullScreen(simple); }
get: function () {
return this.isSimpleFullScreen();
},
set: function (simple) {
this.setSimpleFullScreen(simple);
}
});
Object.defineProperty(BaseWindow.prototype, 'focusable', {
get: function () { return this.isFocusable(); },
set: function (focusable) { this.setFocusable(focusable); }
get: function () {
return this.isFocusable();
},
set: function (focusable) {
this.setFocusable(focusable);
}
});
Object.defineProperty(BaseWindow.prototype, 'kiosk', {
get: function () { return this.isKiosk(); },
set: function (kiosk) { this.setKiosk(kiosk); }
get: function () {
return this.isKiosk();
},
set: function (kiosk) {
this.setKiosk(kiosk);
}
});
Object.defineProperty(BaseWindow.prototype, 'documentEdited', {
get: function () { return this.isDocumentEdited(); },
set: function (edited) { this.setDocumentEdited(edited); }
get: function () {
return this.isDocumentEdited();
},
set: function (edited) {
this.setDocumentEdited(edited);
}
});
Object.defineProperty(BaseWindow.prototype, 'shadow', {
get: function () { return this.hasShadow(); },
set: function (shadow) { this.setHasShadow(shadow); }
get: function () {
return this.hasShadow();
},
set: function (shadow) {
this.setHasShadow(shadow);
}
});
Object.defineProperty(BaseWindow.prototype, 'representedFilename', {
get: function () { return this.getRepresentedFilename(); },
set: function (filename) { this.setRepresentedFilename(filename); }
get: function () {
return this.getRepresentedFilename();
},
set: function (filename) {
this.setRepresentedFilename(filename);
}
});
Object.defineProperty(BaseWindow.prototype, 'minimizable', {
get: function () { return this.isMinimizable(); },
set: function (min) { this.setMinimizable(min); }
get: function () {
return this.isMinimizable();
},
set: function (min) {
this.setMinimizable(min);
}
});
Object.defineProperty(BaseWindow.prototype, 'title', {
get: function () { return this.getTitle(); },
set: function (title) { this.setTitle(title); }
get: function () {
return this.getTitle();
},
set: function (title) {
this.setTitle(title);
}
});
Object.defineProperty(BaseWindow.prototype, 'maximizable', {
get: function () { return this.isMaximizable(); },
set: function (max) { this.setMaximizable(max); }
get: function () {
return this.isMaximizable();
},
set: function (max) {
this.setMaximizable(max);
}
});
Object.defineProperty(BaseWindow.prototype, 'resizable', {
get: function () { return this.isResizable(); },
set: function (res) { this.setResizable(res); }
get: function () {
return this.isResizable();
},
set: function (res) {
this.setResizable(res);
}
});
Object.defineProperty(BaseWindow.prototype, 'menuBarVisible', {
get: function () { return this.isMenuBarVisible(); },
set: function (visible) { this.setMenuBarVisibility(visible); }
get: function () {
return this.isMenuBarVisible();
},
set: function (visible) {
this.setMenuBarVisibility(visible);
}
});
Object.defineProperty(BaseWindow.prototype, 'fullScreenable', {
get: function () { return this.isFullScreenable(); },
set: function (full) { this.setFullScreenable(full); }
get: function () {
return this.isFullScreenable();
},
set: function (full) {
this.setFullScreenable(full);
}
});
Object.defineProperty(BaseWindow.prototype, 'closable', {
get: function () { return this.isClosable(); },
set: function (close) { this.setClosable(close); }
get: function () {
return this.isClosable();
},
set: function (close) {
this.setClosable(close);
}
});
Object.defineProperty(BaseWindow.prototype, 'movable', {
get: function () { return this.isMovable(); },
set: function (move) { this.setMovable(move); }
get: function () {
return this.isMovable();
},
set: function (move) {
this.setMovable(move);
}
});
BaseWindow.getFocusedWindow = () => {


@@ -1,4 +1,11 @@
import { BrowserWindow, AutoResizeOptions, Rectangle, WebContentsView, WebPreferences, WebContents } from 'electron/main';
import {
BrowserWindow,
AutoResizeOptions,
Rectangle,
WebContentsView,
WebPreferences,
WebContents
} from 'electron/main';
const v8Util = process._linkedBinding('electron_common_v8_util');
@@ -10,10 +17,10 @@ export default class BrowserView {
// AutoResize state
#resizeListener: ((...args: any[]) => void) | null = null;
#lastWindowSize: {width: number, height: number} = { width: 0, height: 0 };
#lastWindowSize: { width: number; height: number } = { width: 0, height: 0 };
#autoResizeFlags: AutoResizeOptions = {};
constructor (options: {webPreferences: WebPreferences, webContents?: WebContents} = { webPreferences: {} }) {
constructor(options: { webPreferences: WebPreferences; webContents?: WebContents } = { webPreferences: {} }) {
const { webPreferences = {}, webContents } = options;
if (webContents) {
v8Util.setHiddenValue(webPreferences, 'webContents', webContents);
@@ -25,21 +32,21 @@ export default class BrowserView {
this.#webContentsView.webContents.once('destroyed', this.#destroyListener);
}
get webContents () {
get webContents() {
return this.#webContentsView.webContents;
}
setBounds (bounds: Rectangle) {
setBounds(bounds: Rectangle) {
this.#webContentsView.setBounds(bounds);
this.#autoHorizontalProportion = null;
this.#autoVerticalProportion = null;
}
getBounds () {
getBounds() {
return this.#webContentsView.getBounds();
}
setAutoResize (options: AutoResizeOptions) {
setAutoResize(options: AutoResizeOptions) {
if (options == null || typeof options !== 'object') {
throw new Error('Invalid auto resize options');
}
@@ -55,19 +62,19 @@ export default class BrowserView {
this.#autoVerticalProportion = null;
}
setBackgroundColor (color: string) {
setBackgroundColor(color: string) {
this.#webContentsView.setBackgroundColor(color);
}
// Internal methods
get ownerWindow (): BrowserWindow | null {
get ownerWindow(): BrowserWindow | null {
return this.#ownerWindow;
}
// We can't rely solely on the webContents' owner window because
// a webContents can be closed by the user while the BrowserView
// remains alive and attached to a BrowserWindow.
set ownerWindow(w: BrowserWindow | null) {
this.#removeResizeListener();
if (this.webContents && !this.webContents.isDestroyed()) {
@@ -77,7 +84,7 @@ export default class BrowserView {
this.#ownerWindow = w;
if (w) {
this.#lastWindowSize = w.getBounds();
w.on('resize', (this.#resizeListener = this.#autoResize.bind(this)));
w.on('closed', () => {
this.#removeResizeListener();
this.#ownerWindow = null;
@@ -86,25 +93,25 @@ export default class BrowserView {
}
}
#onDestroy() {
// Ensure that if #webContentsView's webContents is destroyed,
// the WebContentsView is removed from the view hierarchy.
this.#ownerWindow?.contentView.removeChildView(this.webContentsView);
}
#removeResizeListener() {
if (this.#ownerWindow && this.#resizeListener) {
this.#ownerWindow.off('resize', this.#resizeListener);
this.#resizeListener = null;
}
}
#autoHorizontalProportion: { width: number; left: number } | null = null;
#autoVerticalProportion: { height: number; top: number } | null = null;
#autoResize() {
if (!this.ownerWindow) {
throw new Error('Electron bug: #autoResize called without owner window');
}
if (this.#autoResizeFlags.horizontal && this.#autoHorizontalProportion == null) {
const viewBounds = this.#webContentsView.getBounds();
@@ -158,7 +165,7 @@ export default class BrowserView {
};
}
get webContentsView() {
return this.#webContentsView;
}
}


@@ -40,10 +40,14 @@ BrowserWindow.prototype._init = function (this: BWT) {
let unresponsiveEvent: NodeJS.Timeout | null = null;
const emitUnresponsiveEvent = () => {
unresponsiveEvent = null;
if (!this.isDestroyed() && this.isEnabled()) {
this.emit('unresponsive');
}
};
this.webContents.on('unresponsive', () => {
if (!unresponsiveEvent) {
unresponsiveEvent = setTimeout(emitUnresponsiveEvent, 50);
}
});
this.webContents.on('responsive', () => {
if (unresponsiveEvent) {
@@ -83,16 +87,16 @@ BrowserWindow.prototype._init = function (this: BWT) {
this._browserViews = [];
this.on('closed', () => {
this._browserViews.forEach((b) => b.webContents?.close({ waitForBeforeUnload: true }));
});
// Notify the creation of the window.
app.emit('browser-window-created', { preventDefault() {} }, this);
Object.defineProperty(this, 'devToolsWebContents', {
enumerable: true,
configurable: false,
get() {
return this.webContents.devToolsWebContents;
}
});
@@ -104,7 +108,7 @@ const isBrowserWindow = (win: any) => {
BrowserWindow.fromId = (id: number) => {
const win = BaseWindow.fromId(id);
return isBrowserWindow(win) ? (win as any as BWT) : null;
};
BrowserWindow.getAllWindows = () => {
@@ -214,10 +218,12 @@ BrowserWindow.prototype.addBrowserView = function (browserView: BrowserView) {
};
BrowserWindow.prototype.setBrowserView = function (browserView: BrowserView) {
this._browserViews.forEach((bv) => {
this.removeBrowserView(bv);
});
if (browserView) {
this.addBrowserView(browserView);
}
};
BrowserWindow.prototype.removeBrowserView = function (browserView: BrowserView) {


@@ -5,7 +5,7 @@ import { app } from 'electron/main';
const binding = process._linkedBinding('electron_browser_crash_reporter');
class CrashReporter implements Electron.CrashReporter {
start(options: Electron.CrashReporterStartOptions) {
const {
productName = app.name,
companyName,
@@ -21,7 +21,9 @@ class CrashReporter implements Electron.CrashReporter {
if (uploadToServer && !submitURL) throw new Error('submitURL must be specified when uploadToServer is true');
if (!compress && uploadToServer) {
deprecate.log(
'Sending uncompressed crash reports is deprecated and will be removed in a future version of Electron. Set { compress: true } to opt-in to the new behavior. Crash reports will be uploaded gzipped, which most crash reporting servers support.'
);
}
const appVersion = app.getVersion();
@@ -34,26 +36,33 @@ class CrashReporter implements Electron.CrashReporter {
...globalExtra
};
binding.start(
submitURL,
uploadToServer,
ignoreSystemCrashHandler,
rateLimit,
compress,
globalExtraAmended,
extra,
false
);
}
getLastCrashReport() {
const reports = this.getUploadedReports().sort((a, b) => {
const ats = a && a.date ? new Date(a.date).getTime() : 0;
const bts = b && b.date ? new Date(b.date).getTime() : 0;
return bts - ats;
});
return reports.length > 0 ? reports[0] : null;
}
getUploadedReports(): Electron.CrashReport[] {
return binding.getUploadedReports();
}
getUploadToServer() {
if (process.type === 'browser') {
return binding.getUploadToServer();
} else {
@@ -61,7 +70,7 @@ class CrashReporter implements Electron.CrashReporter {
}
}
setUploadToServer(uploadToServer: boolean) {
if (process.type === 'browser') {
return binding.setUploadToServer(uploadToServer);
} else {
@@ -69,15 +78,15 @@ class CrashReporter implements Electron.CrashReporter {
}
}
addExtraParameter(key: string, value: string) {
binding.addExtraParameter(key, value);
}
removeExtraParameter(key: string) {
binding.removeExtraParameter(key);
}
getParameters() {
return binding.getParameters();
}
}


@@ -1,8 +1,11 @@
import { BrowserWindow } from 'electron/main';
const { createDesktopCapturer, isDisplayMediaSystemPickerAvailable } = process._linkedBinding(
'electron_browser_desktop_capturer'
);
const deepEqual = (a: ElectronInternal.GetSourcesOptions, b: ElectronInternal.GetSourcesOptions) =>
JSON.stringify(a) === JSON.stringify(b);
let currentlyRunning: {
options: ElectronInternal.GetSourcesOptions;
@@ -10,13 +13,13 @@ let currentlyRunning: {
}[] = [];
// |options.types| can't be empty and must be an array
function isValid(options: Electron.SourcesOptions) {
return Array.isArray(options?.types);
}
export { isDisplayMediaSystemPickerAvailable };
export async function getSources(args: Electron.SourcesOptions) {
if (!isValid(args)) throw new Error('Invalid options');
const resizableValues = new Map();
@@ -50,45 +53,50 @@ export async function getSources (args: Electron.SourcesOptions) {
}
}
let resolveGetSources!: (value: ElectronInternal.GetSourcesResult[]) => void;
let rejectGetSources!: (reason?: any) => void;
const getSources = new Promise<ElectronInternal.GetSourcesResult[]>((resolve, reject) => {
resolveGetSources = resolve;
rejectGetSources = reject;
});
// Register in currentlyRunning BEFORE startHandling so that synchronous
// completion (e.g. when no capturers are created) can properly clean up.
currentlyRunning.push({ options, getSources });
let capturer: ElectronInternal.DesktopCapturer | null = createDesktopCapturer();
const stopRunning = () => {
if (capturer) {
delete capturer._onerror;
delete capturer._onfinished;
capturer = null;
if (process.platform === 'darwin') {
for (const win of BrowserWindow.getAllWindows()) {
if (resizableValues.has(win.id)) {
win.resizable = resizableValues.get(win.id);
}
}
}
}
// Remove from currentlyRunning once we resolve or reject
currentlyRunning = currentlyRunning.filter((running) => running.options !== options);
};
capturer._onerror = (error: string) => {
stopRunning();
// eslint-disable-next-line prefer-promise-reject-errors
rejectGetSources(error);
};
capturer._onfinished = (sources: Electron.DesktopCapturerSource[]) => {
stopRunning();
resolveGetSources(sources);
};
capturer.startHandling(captureWindow, captureScreen, thumbnailSize, fetchWindowIcons);
return getSources;
}
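The rewritten getSources above uses a deferred-promise shape: resolve and reject are captured via definite-assignment assertions so the request can be registered in currentlyRunning before startHandling runs and before any handler can fire. A minimal self-contained sketch of the pattern (the helper name is illustrative, not an Electron API):

```typescript
// Deferred-promise helper: capture resolve/reject outside the executor so
// callers can do bookkeeping (e.g. registering the request in a shared list)
// before any completion handler is attached.
function createDeferred<T> () {
  let resolve!: (value: T) => void;
  let reject!: (reason?: unknown) => void;
  const promise = new Promise<T>((res, rej) => {
    resolve = res;
    reject = rej;
  });
  return { promise, resolve, reject };
}

// Usage: the promise can be handed out (or registered) immediately,
// and settled later from a callback.
const deferred = createDeferred<string[]>();
deferred.resolve(['screen:0:0']);
```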


@@ -1,5 +1,13 @@
import { app, BaseWindow } from 'electron/main';
import type {
OpenDialogOptions,
OpenDialogReturnValue,
MessageBoxOptions,
SaveDialogOptions,
SaveDialogReturnValue,
MessageBoxReturnValue,
CertificateTrustDialogOptions
} from 'electron/main';
const dialogBinding = process._linkedBinding('electron_browser_dialog');
@@ -60,7 +68,9 @@ const checkAppInitialized = function () {
const setupOpenDialogProperties = (properties: (keyof typeof OpenFileDialogProperties)[]): number => {
let dialogProperties = 0;
for (const property of properties) {
if (Object.hasOwn(OpenFileDialogProperties, property)) {
dialogProperties |= OpenFileDialogProperties[property];
}
}
return dialogProperties;
};
@@ -68,7 +78,9 @@ const setupOpenDialogProperties = (properties: (keyof typeof OpenFileDialogPrope
const setupSaveDialogProperties = (properties: (keyof typeof SaveFileDialogProperties)[]): number => {
let dialogProperties = 0;
for (const property of properties) {
if (Object.hasOwn(SaveFileDialogProperties, property)) {
dialogProperties |= SaveFileDialogProperties[property];
}
}
return dialogProperties;
};
@@ -150,7 +162,7 @@ const openDialog = (sync: boolean, window: BaseWindow | null, options?: OpenDial
properties: setupOpenDialogProperties(properties)
};
return sync ? dialogBinding.showOpenDialogSync(settings) : dialogBinding.showOpenDialog(settings);
};
const messageBox = (sync: boolean, window: BaseWindow | null, options?: MessageBoxOptions) => {
@@ -194,7 +206,7 @@ const messageBox = (sync: boolean, window: BaseWindow | null, options?: MessageB
// Choose a default button to get selected when dialog is cancelled.
if (cancelId == null) {
// If the defaultId is set to 0, ensure the cancel button is a different index (1)
cancelId = defaultId === 0 && buttons.length > 1 ? 1 : 0;
for (const [i, button] of buttons.entries()) {
const text = button.toLowerCase();
if (text === 'cancel' || text === 'no') {
@@ -210,7 +222,9 @@ const messageBox = (sync: boolean, window: BaseWindow | null, options?: MessageB
// Generate an ID used for closing the message box.
id = getNextId();
// Close the message box when signal is aborted.
if (signal.aborted) {
return Promise.resolve({ cancelId, checkboxChecked });
}
signal.addEventListener('abort', () => dialogBinding._closeMessageBox(id));
}
@@ -240,59 +254,80 @@ const messageBox = (sync: boolean, window: BaseWindow | null, options?: MessageB
export function showOpenDialog(window: BaseWindow, options: OpenDialogOptions): OpenDialogReturnValue;
export function showOpenDialog(options: OpenDialogOptions): OpenDialogReturnValue;
export function showOpenDialog(
windowOrOptions: BaseWindow | OpenDialogOptions,
maybeOptions?: OpenDialogOptions
): OpenDialogReturnValue {
const window = windowOrOptions && !(windowOrOptions instanceof BaseWindow) ? null : windowOrOptions;
const options = windowOrOptions && !(windowOrOptions instanceof BaseWindow) ? windowOrOptions : maybeOptions;
return openDialog(false, window, options);
}
export function showOpenDialogSync(window: BaseWindow, options: OpenDialogOptions): OpenDialogReturnValue;
export function showOpenDialogSync(options: OpenDialogOptions): OpenDialogReturnValue;
export function showOpenDialogSync(
windowOrOptions: BaseWindow | OpenDialogOptions,
maybeOptions?: OpenDialogOptions
): OpenDialogReturnValue {
const window = windowOrOptions && !(windowOrOptions instanceof BaseWindow) ? null : windowOrOptions;
const options = windowOrOptions && !(windowOrOptions instanceof BaseWindow) ? windowOrOptions : maybeOptions;
return openDialog(true, window, options);
}
export function showSaveDialog(window: BaseWindow, options: SaveDialogOptions): SaveDialogReturnValue;
export function showSaveDialog(options: SaveDialogOptions): SaveDialogReturnValue;
export function showSaveDialog(
windowOrOptions: BaseWindow | SaveDialogOptions,
maybeOptions?: SaveDialogOptions
): SaveDialogReturnValue {
const window = windowOrOptions && !(windowOrOptions instanceof BaseWindow) ? null : windowOrOptions;
const options = windowOrOptions && !(windowOrOptions instanceof BaseWindow) ? windowOrOptions : maybeOptions;
return saveDialog(false, window, options);
}
export function showSaveDialogSync(window: BaseWindow, options: SaveDialogOptions): SaveDialogReturnValue;
export function showSaveDialogSync(options: SaveDialogOptions): SaveDialogReturnValue;
export function showSaveDialogSync(
windowOrOptions: BaseWindow | SaveDialogOptions,
maybeOptions?: SaveDialogOptions
): SaveDialogReturnValue {
const window = windowOrOptions && !(windowOrOptions instanceof BaseWindow) ? null : windowOrOptions;
const options = windowOrOptions && !(windowOrOptions instanceof BaseWindow) ? windowOrOptions : maybeOptions;
return saveDialog(true, window, options);
}
export function showMessageBox(window: BaseWindow, options: MessageBoxOptions): MessageBoxReturnValue;
export function showMessageBox(options: MessageBoxOptions): MessageBoxReturnValue;
export function showMessageBox(
windowOrOptions: BaseWindow | MessageBoxOptions,
maybeOptions?: MessageBoxOptions
): MessageBoxReturnValue {
const window = windowOrOptions && !(windowOrOptions instanceof BaseWindow) ? null : windowOrOptions;
const options = windowOrOptions && !(windowOrOptions instanceof BaseWindow) ? windowOrOptions : maybeOptions;
return messageBox(false, window, options);
}
export function showMessageBoxSync(window: BaseWindow, options: MessageBoxOptions): MessageBoxReturnValue;
export function showMessageBoxSync(options: MessageBoxOptions): MessageBoxReturnValue;
export function showMessageBoxSync(
windowOrOptions: BaseWindow | MessageBoxOptions,
maybeOptions?: MessageBoxOptions
): MessageBoxReturnValue {
const window = windowOrOptions && !(windowOrOptions instanceof BaseWindow) ? null : windowOrOptions;
const options = windowOrOptions && !(windowOrOptions instanceof BaseWindow) ? windowOrOptions : maybeOptions;
return messageBox(true, window, options);
}
export function showErrorBox(...args: any[]) {
return dialogBinding.showErrorBox(...args);
}
export function showCertificateTrustDialog(
windowOrOptions: BaseWindow | CertificateTrustDialogOptions,
maybeOptions?: CertificateTrustDialogOptions
) {
const window = windowOrOptions && !(windowOrOptions instanceof BaseWindow) ? null : windowOrOptions;
const options = windowOrOptions && !(windowOrOptions instanceof BaseWindow) ? windowOrOptions : maybeOptions;
if (options == null || typeof options !== 'object') {
throw new TypeError('options must be an object');


@@ -1,2 +1,39 @@
const { createGlobalShortcut } = process._linkedBinding('electron_browser_global_shortcut');
let globalShortcut: Electron.GlobalShortcut;
const createGlobalShortcutIfNeeded = () => {
if (globalShortcut === undefined) {
globalShortcut = createGlobalShortcut();
}
};
export default new Proxy(
{},
{
get: (_target, property: keyof Electron.GlobalShortcut) => {
createGlobalShortcutIfNeeded();
const value = globalShortcut[property];
if (typeof value === 'function') {
return value.bind(globalShortcut);
}
return value;
},
set: (_target, property: string, value: unknown) => {
createGlobalShortcutIfNeeded();
return Reflect.set(globalShortcut, property, value);
},
ownKeys: () => {
createGlobalShortcutIfNeeded();
return Reflect.ownKeys(globalShortcut);
},
has: (_target, property: string) => {
createGlobalShortcutIfNeeded();
return property in globalShortcut;
},
getOwnPropertyDescriptor: (_target, property: string) => {
createGlobalShortcutIfNeeded();
return Reflect.getOwnPropertyDescriptor(globalShortcut, property);
}
}
);


@@ -4,8 +4,12 @@ let _inAppPurchase;
if (process.platform === 'darwin') {
const { inAppPurchase } = process._linkedBinding('electron_browser_in_app_purchase');
const _purchase = inAppPurchase.purchaseProduct as (
productID: string,
quantity?: number,
username?: string
) => Promise<boolean>;
inAppPurchase.purchaseProduct = (productID: string, opts?: number | { quantity?: number; username?: string }) => {
const quantity = typeof opts === 'object' ? opts.quantity : opts;
const username = typeof opts === 'object' ? opts.username : undefined;
return _purchase.apply(inAppPurchase, [productID, quantity, username]);


@@ -1,20 +1,66 @@
import {
app,
BaseWindow,
BrowserWindow,
session,
webContents,
WebContents,
MenuItemConstructorOptions
} from 'electron/main';
const isMac = process.platform === 'darwin';
const isWindows = process.platform === 'win32';
const isLinux = process.platform === 'linux';
type RoleId =
| 'about'
| 'close'
| 'copy'
| 'cut'
| 'delete'
| 'forcereload'
| 'front'
| 'help'
| 'hide'
| 'hideothers'
| 'minimize'
| 'paste'
| 'pasteandmatchstyle'
| 'quit'
| 'redo'
| 'reload'
| 'resetzoom'
| 'selectall'
| 'services'
| 'recentdocuments'
| 'clearrecentdocuments'
| 'showsubstitutions'
| 'togglesmartquotes'
| 'togglesmartdashes'
| 'toggletextreplacement'
| 'startspeaking'
| 'stopspeaking'
| 'toggledevtools'
| 'togglefullscreen'
| 'undo'
| 'unhide'
| 'window'
| 'zoom'
| 'zoomin'
| 'zoomout'
| 'togglespellchecker'
| 'appmenu'
| 'filemenu'
| 'editmenu'
| 'viewmenu'
| 'windowmenu'
| 'sharemenu';
interface Role {
label: string;
accelerator?: string;
checked?: boolean;
windowMethod?: ((window: BaseWindow) => void);
webContentsMethod?: ((webContents: WebContents) => void);
windowMethod?: (window: BaseWindow) => void;
webContentsMethod?: (webContents: WebContents) => void;
appMethod?: () => void;
registerAccelerator?: boolean;
nonNativeMacOSRole?: boolean;
@@ -23,7 +69,7 @@ interface Role {
export const roleList: Record<RoleId, Role> = {
about: {
get label() {
return isLinux ? 'About' : `About ${app.name}`;
},
...((isWindows || isLinux) && { appMethod: () => app.showAboutPanel() })
@@ -31,23 +77,23 @@ export const roleList: Record<RoleId, Role> = {
close: {
label: isMac ? 'Close Window' : 'Close',
accelerator: 'CommandOrControl+W',
windowMethod: (w) => w.close()
},
copy: {
label: 'Copy',
accelerator: 'CommandOrControl+C',
webContentsMethod: (wc) => wc.copy(),
registerAccelerator: false
},
cut: {
label: 'Cut',
accelerator: 'CommandOrControl+X',
webContentsMethod: (wc) => wc.cut(),
registerAccelerator: false
},
delete: {
label: 'Delete',
webContentsMethod: (wc) => wc.delete()
},
forcereload: {
label: 'Force Reload',
@@ -66,7 +112,7 @@ export const roleList: Record<RoleId, Role> = {
label: 'Help'
},
hide: {
get label() {
return `Hide ${app.name}`;
},
accelerator: 'Command+H'
@@ -78,28 +124,31 @@ export const roleList: Record<RoleId, Role> = {
minimize: {
label: 'Minimize',
accelerator: 'CommandOrControl+M',
windowMethod: (w) => {
if (w.minimizable) w.minimize();
}
},
paste: {
label: 'Paste',
accelerator: 'CommandOrControl+V',
webContentsMethod: (wc) => wc.paste(),
registerAccelerator: false
},
pasteandmatchstyle: {
label: 'Paste and Match Style',
accelerator: isMac ? 'Cmd+Option+Shift+V' : 'Shift+CommandOrControl+V',
webContentsMethod: (wc) => wc.pasteAndMatchStyle(),
registerAccelerator: false
},
quit: {
get label() {
switch (process.platform) {
case 'darwin':
return `Quit ${app.name}`;
case 'win32':
return 'Exit';
default:
return 'Quit';
}
},
accelerator: isWindows ? undefined : 'CommandOrControl+Q',
@@ -108,7 +157,7 @@ export const roleList: Record<RoleId, Role> = {
redo: {
label: 'Redo',
accelerator: isWindows ? 'Control+Y' : 'Shift+CommandOrControl+Z',
webContentsMethod: (wc) => wc.redo()
},
reload: {
label: 'Reload',
@@ -131,7 +180,7 @@ export const roleList: Record<RoleId, Role> = {
selectall: {
label: 'Select All',
accelerator: 'CommandOrControl+A',
webContentsMethod: (wc) => wc.selectAll()
},
services: {
label: 'Services'
@@ -164,7 +213,7 @@ export const roleList: Record<RoleId, Role> = {
label: 'Toggle Developer Tools',
accelerator: isMac ? 'Alt+Command+I' : 'Ctrl+Shift+I',
nonNativeMacOSRole: true,
webContentsMethod: (wc) => {
const bw = wc.getOwnerBrowserWindow();
if (bw) bw.webContents.toggleDevTools();
}
@@ -179,7 +228,7 @@ export const roleList: Record<RoleId, Role> = {
undo: {
label: 'Undo',
accelerator: 'CommandOrControl+Z',
webContentsMethod: (wc) => wc.undo()
},
unhide: {
label: 'Show All'
@@ -208,7 +257,7 @@ export const roleList: Record<RoleId, Role> = {
},
togglespellchecker: {
label: 'Check Spelling While Typing',
get checked() {
const wc = webContents.getFocusedWebContents();
const ses = wc ? wc.session : session.defaultSession;
return ses.spellCheckerEnabled;
@@ -221,7 +270,7 @@ export const roleList: Record<RoleId, Role> = {
},
// App submenu should be used for Mac only
appmenu: {
get label() {
return app.name;
},
submenu: [
@@ -239,9 +288,7 @@ export const roleList: Record<RoleId, Role> = {
// File submenu
filemenu: {
label: 'File',
submenu: [isMac ? { role: 'close' } : { role: 'quit' }]
},
// Edit submenu
editmenu: {
@@ -254,34 +301,27 @@ export const roleList: Record<RoleId, Role> = {
{ role: 'copy' },
{ role: 'paste' },
...(isMac
? ([
{ role: 'pasteAndMatchStyle' },
{ role: 'delete' },
{ role: 'selectAll' },
{ type: 'separator' },
{
label: 'Substitutions',
submenu: [
{ role: 'showSubstitutions' },
{ type: 'separator' },
{ role: 'toggleSmartQuotes' },
{ role: 'toggleSmartDashes' },
{ role: 'toggleTextReplacement' }
]
},
{
label: 'Speech',
submenu: [{ role: 'startSpeaking' }, { role: 'stopSpeaking' }]
}
] as MenuItemConstructorOptions[])
: ([{ role: 'delete' }, { type: 'separator' }, { role: 'selectAll' }] as MenuItemConstructorOptions[]))
]
},
// View submenu
@@ -306,13 +346,8 @@ export const roleList: Record<RoleId, Role> = {
{ role: 'minimize' },
{ role: 'zoom' },
...(isMac
? ([{ type: 'separator' }, { role: 'front' }] as MenuItemConstructorOptions[])
: ([{ role: 'close' }] as MenuItemConstructorOptions[]))
]
},
// Share submenu
@@ -334,34 +369,34 @@ const canExecuteRole = (role: keyof typeof roleList) => {
return roleList[role].nonNativeMacOSRole;
};
export function getDefaultType(role: RoleId) {
if (shouldOverrideCheckStatus(role)) return 'checkbox';
return 'normal';
}
export function getDefaultLabel(role: RoleId) {
return hasRole(role) ? roleList[role].label : '';
}
export function getCheckStatus(role: RoleId) {
if (hasRole(role)) return roleList[role].checked;
}
export function shouldOverrideCheckStatus(role: RoleId) {
return hasRole(role) && Object.hasOwn(roleList[role], 'checked');
}
export function getDefaultAccelerator(role: RoleId) {
if (hasRole(role)) return roleList[role].accelerator;
return undefined;
}
export function shouldRegisterAccelerator(role: RoleId) {
const hasRoleRegister = hasRole(role) && roleList[role].registerAccelerator !== undefined;
return hasRoleRegister ? roleList[role].registerAccelerator : true;
}
export function getDefaultSubmenu(role: RoleId) {
if (!hasRole(role)) return;
let { submenu } = roleList[role];
@@ -374,7 +409,7 @@ export function getDefaultSubmenu (role: RoleId) {
return submenu;
}
export function execute(role: RoleId, focusedWindow: BaseWindow, focusedWebContents: WebContents) {
if (!canExecuteRole(role)) return false;
const { appMethod, webContentsMethod, windowMethod } = roleList[role];


@@ -56,8 +56,7 @@ const MenuItem = function (this: any, options: any) {
const click = options.click;
this.click = (event: KeyboardEvent, focusedWindow: BaseWindow, focusedWebContents: WebContents) => {
// Manually flip the checked flags when clicked.
if (!roles.shouldOverrideCheckStatus(this.role) && (this.type === 'checkbox' || this.type === 'radio')) {
this.checked = !this.checked;
}


@@ -1,13 +1,16 @@
function splitArray<T>(arr: T[], predicate: (x: T) => boolean) {
const result = arr.reduce(
(multi, item) => {
const current = multi[multi.length - 1];
if (predicate(item)) {
if (current.length > 0) multi.push([]);
} else {
current.push(item);
}
return multi;
},
[[]] as T[][]
);
if (result[result.length - 1].length === 0) {
return result.slice(0, result.length - 1);
@@ -15,7 +18,7 @@ function splitArray<T> (arr: T[], predicate: (x: T) => boolean) {
return result;
}
function joinArrays (arrays: any[][], joinIDs: any[]) {
function joinArrays(arrays: any[][], joinIDs: any[]) {
return arrays.reduce((joined, arr, i) => {
if (i > 0 && arr.length) {
if (joinIDs.length > 0) {
@@ -29,26 +32,23 @@ function joinArrays (arrays: any[][], joinIDs: any[]) {
}, []);
}
function pushOntoMultiMap<K, V> (map: Map<K, V[]>, key: K, value: V) {
function pushOntoMultiMap<K, V>(map: Map<K, V[]>, key: K, value: V) {
if (!map.has(key)) {
map.set(key, []);
}
map.get(key)!.push(value);
}
function indexOfGroupContainingID<T> (groups: {id?: T}[][], id: T, ignoreGroup: {id?: T}[]) {
function indexOfGroupContainingID<T>(groups: { id?: T }[][], id: T, ignoreGroup: { id?: T }[]) {
return groups.findIndex(
candidateGroup =>
candidateGroup !== ignoreGroup &&
candidateGroup.some(
candidateItem => candidateItem.id === id
)
(candidateGroup) =>
candidateGroup !== ignoreGroup && candidateGroup.some((candidateItem) => candidateItem.id === id)
);
}
// Sort nodes topologically using a depth-first approach. Encountered cycles
// are broken.
function sortTopologically<T> (originalOrder: T[], edgesById: Map<T, T[]>) {
function sortTopologically<T>(originalOrder: T[], edgesById: Map<T, T[]>) {
const sorted = [] as T[];
const marked = new Set<T>();
@@ -71,7 +71,7 @@ function sortTopologically<T> (originalOrder: T[], edgesById: Map<T, T[]>) {
return sorted;
}
function attemptToMergeAGroup<T> (groups: {before?: T[], after?: T[], id?: T}[][]) {
function attemptToMergeAGroup<T>(groups: { before?: T[]; after?: T[]; id?: T }[][]) {
for (let i = 0; i < groups.length; i++) {
const group = groups[i];
for (const item of group) {
@@ -90,7 +90,7 @@ function attemptToMergeAGroup<T> (groups: {before?: T[], after?: T[], id?: T}[][
return false;
}
function mergeGroups<T> (groups: {before?: T[], after?: T[], id?: T}[][]) {
function mergeGroups<T>(groups: { before?: T[]; after?: T[]; id?: T }[][]) {
let merged = true;
while (merged) {
merged = attemptToMergeAGroup(groups);
@@ -98,7 +98,7 @@ function mergeGroups<T> (groups: {before?: T[], after?: T[], id?: T}[][]) {
return groups;
}
function sortItemsInGroup<T> (group: {before?: T[], after?: T[], id?: T}[]) {
function sortItemsInGroup<T>(group: { before?: T[]; after?: T[]; id?: T }[]) {
const originalOrder = group.map((node, i) => i);
const edges = new Map();
const idToIndex = new Map(group.map((item, i) => [item.id, i]));
@@ -123,10 +123,14 @@ function sortItemsInGroup<T> (group: {before?: T[], after?: T[], id?: T}[]) {
}
const sortedNodes = sortTopologically(originalOrder, edges);
return sortedNodes.map(i => group[i]);
return sortedNodes.map((i) => group[i]);
}
function findEdgesInGroup<T> (groups: {beforeGroupContaining?: T[], afterGroupContaining?: T[], id?: T}[][], i: number, edges: Map<any, any>) {
function findEdgesInGroup<T>(
groups: { beforeGroupContaining?: T[]; afterGroupContaining?: T[]; id?: T }[][],
i: number,
edges: Map<any, any>
) {
const group = groups[i];
for (const item of group) {
if (item.beforeGroupContaining) {
@@ -150,7 +154,7 @@ function findEdgesInGroup<T> (groups: {beforeGroupContaining?: T[], afterGroupCo
}
}
function sortGroups<T> (groups: {id?: T}[][]) {
function sortGroups<T>(groups: { id?: T }[][]) {
const originalOrder = groups.map((item, i) => i);
const edges = new Map();
@@ -159,13 +163,15 @@ function sortGroups<T> (groups: {id?: T}[][]) {
}
const sortedGroupIndexes = sortTopologically(originalOrder, edges);
return sortedGroupIndexes.map(i => groups[i]);
return sortedGroupIndexes.map((i) => groups[i]);
}
export function sortMenuItems (menuItems: (Electron.MenuItemConstructorOptions | Electron.MenuItem)[]) {
export function sortMenuItems(menuItems: (Electron.MenuItemConstructorOptions | Electron.MenuItem)[]) {
const isSeparator = (i: Electron.MenuItemConstructorOptions | Electron.MenuItem) => {
const opts = i as Electron.MenuItemConstructorOptions;
return i.type === 'separator' && !opts.before && !opts.after && !opts.beforeGroupContaining && !opts.afterGroupContaining;
return (
i.type === 'separator' && !opts.before && !opts.after && !opts.beforeGroupContaining && !opts.afterGroupContaining
);
};
const separators = menuItems.filter(isSeparator);
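
Aside: the `splitArray` helper at the top of this file is the piece that partitions a menu template into groups at each separator. A standalone sketch of the same reduce-based shape (plain TypeScript, no Electron imports; a re-typed illustration, not the shipped module):

```typescript
// Sketch of the splitArray helper shown above: split an array into
// groups at every element matching `predicate`, dropping the matched
// separators themselves and any empty leading/trailing group.
function splitArray<T>(arr: T[], predicate: (x: T) => boolean): T[][] {
  const result = arr.reduce(
    (multi, item) => {
      const current = multi[multi.length - 1];
      if (predicate(item)) {
        // Start a new group, but never leave an empty one behind.
        if (current.length > 0) multi.push([]);
      } else {
        current.push(item);
      }
      return multi;
    },
    [[]] as T[][]
  );
  if (result[result.length - 1].length === 0) {
    return result.slice(0, result.length - 1);
  }
  return result;
}

// "Menu items" split on zero-valued separators:
console.log(splitArray([1, 0, 2, 0, 3], (x) => x === 0)); // → [[1], [2], [3]]
```

Because separators open a new group only when the current group is non-empty, runs of adjacent separators collapse instead of producing empty groups.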

View File

@@ -92,7 +92,7 @@ Menu.prototype._executeCommand = function (event, id) {
 Menu.prototype._menuWillShow = function () {
   // Ensure radio groups have at least one menu item selected
   for (const id of Object.keys(this.groupsMap)) {
-    const found = this.groupsMap[id].find(item => item.checked) || null;
+    const found = this.groupsMap[id].find((item) => item.checked) || null;
     if (!found) checked.set(this.groupsMap[id][0], true);
   }
 };
@@ -141,7 +141,7 @@ Menu.prototype.closePopup = function (window) {
 Menu.prototype.getMenuItemById = function (id) {
   const items = this.items;
-  let found = items.find(item => item.id === id) || null;
+  let found = items.find((item) => item.id === id) || null;
   for (let i = 0; !found && i < items.length; i++) {
     const { submenu } = items[i];
     if (submenu) {
@@ -213,7 +213,7 @@ Menu.setApplicationMenu = function (menu: MenuType) {
     bindings.setApplicationMenu(menu);
   } else {
     const windows = BaseWindow.getAllWindows();
-    windows.map(w => w.setMenu(menu));
+    windows.map((w) => w.setMenu(menu));
   }
 };
@@ -244,14 +244,16 @@ Menu.buildFromTemplate = function (template) {
 /* Helper Functions */
 // validate the template against having the wrong attribute
-function areValidTemplateItems (template: (MenuItemConstructorOptions | MenuItem)[]) {
-  return template.every(item =>
-    item != null &&
-    typeof item === 'object' &&
-    (Object.hasOwn(item, 'label') || Object.hasOwn(item, 'role') || item.type === 'separator'));
+function areValidTemplateItems(template: (MenuItemConstructorOptions | MenuItem)[]) {
+  return template.every(
+    (item) =>
+      item != null &&
+      typeof item === 'object' &&
+      (Object.hasOwn(item, 'label') || Object.hasOwn(item, 'role') || item.type === 'separator')
+  );
 }
-function sortTemplate (template: (MenuItemConstructorOptions | MenuItem)[]) {
+function sortTemplate(template: (MenuItemConstructorOptions | MenuItem)[]) {
   const sorted = sortMenuItems(template);
   for (const item of sorted) {
     if (Array.isArray(item.submenu)) {
@@ -262,7 +264,7 @@ function sortTemplate (template: (MenuItemConstructorOptions | MenuItem)[]) {
 }
 // Search between separators to find a radio menu item and return its group id
-function generateGroupId (items: (MenuItemConstructorOptions | MenuItem)[], pos: number) {
+function generateGroupId(items: (MenuItemConstructorOptions | MenuItem)[], pos: number) {
   if (pos > 0) {
     for (let idx = pos - 1; idx >= 0; idx--) {
       if (items[idx].type === 'radio') return (items[idx] as MenuItem).groupId;
@@ -278,7 +280,7 @@ function generateGroupId (items: (MenuItemConstructorOptions | MenuItem)[], pos:
   return groupIdIndex;
 }
-function removeExtraSeparators (items: (MenuItemConstructorOptions | MenuItem)[]) {
+function removeExtraSeparators(items: (MenuItemConstructorOptions | MenuItem)[]) {
   // fold adjacent separators together
   let ret = items.filter((e, idx, arr) => {
     if (e.visible === false) return true;
@@ -294,7 +296,7 @@ function removeExtraSeparators (items: (MenuItemConstructorOptions | MenuItem)[]
   return ret;
 }
-function insertItemByType (this: MenuType, item: MenuItem, pos: number) {
+function insertItemByType(this: MenuType, item: MenuItem, pos: number) {
   const types = {
     normal: () => this.insertItem(pos, item.commandId, item.label),
     header: () => this.insertItem(pos, item.commandId, item.label),
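
Aside: the `getMenuItemById` hunk above shows the lookup pattern — check the current level first, then recurse into each submenu until a match turns up. A plain-data sketch of that pattern (hypothetical `Item` type standing in for `MenuItem`; not the shipped method, which walks live `MenuItem` instances):

```typescript
// Sketch of the recursive id lookup used by Menu.prototype.getMenuItemById:
// breadth at the current level first, then depth into submenus.
type Item = { id?: string; submenu?: { items: Item[] } };

function findItemById(items: Item[], id: string): Item | null {
  let found = items.find((item) => item.id === id) || null;
  for (let i = 0; !found && i < items.length; i++) {
    const { submenu } = items[i];
    if (submenu) {
      found = findItemById(submenu.items, id);
    }
  }
  return found;
}

const menu: Item[] = [
  { id: 'file', submenu: { items: [{ id: 'open' }, { id: 'save' }] } },
  { id: 'edit' }
];
console.log(findItemById(menu, 'save')?.id); // → 'save'
```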

View File

@@ -1,14 +1,11 @@
 import { MessagePortMain } from '@electron/internal/browser/message-port-main';
-import { EventEmitter } from 'events';
 const { createPair } = process._linkedBinding('electron_browser_message_port');
-export default class MessageChannelMain extends EventEmitter implements Electron.MessageChannelMain {
+export default class MessageChannelMain implements Electron.MessageChannelMain {
   port1: MessagePortMain;
   port2: MessagePortMain;
-  constructor () {
-    super();
+  constructor() {
     const { port1, port2 } = createPair();
     this.port1 = new MessagePortMain(port1);
     this.port2 = new MessagePortMain(port2);

View File

@@ -4,7 +4,11 @@ import { ClientRequestConstructorOptions, ClientRequest, IncomingMessage, Sessio
 import { Readable, Writable, isReadable } from 'stream';
-function createDeferredPromise<T, E extends Error = Error> (): { promise: Promise<T>; resolve: (x: T) => void; reject: (e: E) => void; } {
+function createDeferredPromise<T, E extends Error = Error>(): {
+  promise: Promise<T>;
+  resolve: (x: T) => void;
+  reject: (e: E) => void;
+} {
   let res: (x: T) => void;
   let rej: (e: E) => void;
   const promise = new Promise<T>((resolve, reject) => {
@@ -15,8 +19,12 @@ function createDeferredPromise<T, E extends Error = Error> (): { promise: Promis
   return { promise, resolve: res!, reject: rej! };
 }
-export function fetchWithSession (input: RequestInfo, init: (RequestInit & {bypassCustomProtocolHandlers?: boolean}) | undefined, session: SessionT | undefined,
-  request: (options: ClientRequestConstructorOptions | string) => ClientRequest) {
+export function fetchWithSession(
+  input: RequestInfo,
+  init: (RequestInit & { bypassCustomProtocolHandlers?: boolean }) | undefined,
+  session: SessionT | undefined,
+  request: (options: ClientRequestConstructorOptions | string) => ClientRequest
+) {
   const p = createDeferredPromise<Response>();
   let req: Request;
   try {
@@ -76,16 +84,18 @@ export function fetchWithSession (input: RequestInfo, init: (RequestInit & {bypa
   // We can't set credentials to same-origin unless there's an origin set.
   const credentials = req.credentials === 'same-origin' && !origin ? 'include' : req.credentials;
-  const r = request(allowAnyProtocol({
-    session,
-    method: req.method,
-    url: req.url,
-    origin,
-    credentials,
-    cache: req.cache,
-    referrerPolicy: req.referrerPolicy,
-    redirect: req.redirect
-  }));
+  const r = request(
+    allowAnyProtocol({
+      session,
+      method: req.method,
+      url: req.url,
+      origin,
+      credentials,
+      cache: req.cache,
+      referrerPolicy: req.referrerPolicy,
+      redirect: req.redirect
+    })
+  );
   (r as any)._urlLoaderOptions.bypassCustomProtocolHandlers = !!init?.bypassCustomProtocolHandlers;
@@ -105,7 +115,10 @@ export function fetchWithSession (input: RequestInfo, init: (RequestInit & {bypa
     headers.set(k, Array.isArray(v) ? v.join(', ') : v);
   }
   const nullBodyStatus = [101, 204, 205, 304];
-  const body = nullBodyStatus.includes(resp.statusCode) || req.method === 'HEAD' ? null : Readable.toWeb(resp as unknown as Readable) as ReadableStream;
+  const body =
+    nullBodyStatus.includes(resp.statusCode) || req.method === 'HEAD'
+      ? null
+      : (Readable.toWeb(resp as unknown as Readable) as ReadableStream);
   const rResp = new Response(body, {
     headers,
     status: resp.statusCode,
@@ -122,7 +135,9 @@ export function fetchWithSession (input: RequestInfo, init: (RequestInit & {bypa
   // pipeTo expects a WritableStream<Uint8Array>. Node.js' Writable.toWeb returns WritableStream<any>,
   // which causes a TS structural mismatch.
   const writable = Writable.toWeb(r as unknown as Writable) as unknown as WritableStream<Uint8Array>;
-  if (!req.body?.pipeTo(writable).then(() => r.end())) { r.end(); }
+  if (!req.body?.pipeTo(writable).then(() => r.end())) {
+    r.end();
+  }
   return p.promise;
 }
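
Aside: `createDeferredPromise`, reformatted above, is the classic deferred-promise pattern — expose `resolve`/`reject` so a callback-style API (here, `ClientRequest` events) can settle the promise later. A standalone sketch (the setTimeout usage is illustrative; newer runtimes also ship the equivalent `Promise.withResolvers`):

```typescript
// Deferred promise: capture the executor's resolve/reject so callers
// outside the executor can settle the promise.
function createDeferredPromise<T, E extends Error = Error>(): {
  promise: Promise<T>;
  resolve: (x: T) => void;
  reject: (e: E) => void;
} {
  let res: (x: T) => void;
  let rej: (e: E) => void;
  const promise = new Promise<T>((resolve, reject) => {
    res = resolve;
    rej = reject;
  });
  // The executor runs synchronously, so res/rej are assigned by now;
  // the non-null assertions are safe.
  return { promise, resolve: res!, reject: rej! };
}

// Settle the promise from outside, e.g. from an event handler:
const d = createDeferredPromise<number>();
setTimeout(() => d.resolve(42), 0);
d.promise.then((v) => console.log(v)); // → 42
```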

View File

@@ -15,7 +15,7 @@ const stopLogging: typeof session.defaultSession.netLog.stopLogging = async () =
 export default {
   startLogging,
   stopLogging,
-  get currentlyLogging (): boolean {
+  get currentlyLogging(): boolean {
     if (!app.isReady()) return false;
     return session.defaultSession.netLog.currentlyLogging;
   }

View File

@@ -5,18 +5,21 @@ import type { ClientRequestConstructorOptions } from 'electron/main';
 const { isOnline } = process._linkedBinding('electron_common_net');
-export function request (options: ClientRequestConstructorOptions | string, callback?: (message: IncomingMessage) => void) {
+export function request(
+  options: ClientRequestConstructorOptions | string,
+  callback?: (message: IncomingMessage) => void
+) {
   if (!app.isReady()) {
     throw new Error('net module can only be used after app is ready');
   }
   return new ClientRequest(options, callback);
 }
-export function fetch (input: RequestInfo, init?: RequestInit): Promise<Response> {
+export function fetch(input: RequestInfo, init?: RequestInit): Promise<Response> {
   return session.defaultSession.fetch(input, init);
 }
-export function resolveHost (host: string, options?: Electron.ResolveHostOptions): Promise<Electron.ResolvedHost> {
+export function resolveHost(host: string, options?: Electron.ResolveHostOptions): Promise<Electron.ResolvedHost> {
   return session.defaultSession.resolveHost(host, options);
 }

Some files were not shown because too many files have changed in this diff.