Compare commits


30 Commits

Author SHA1 Message Date
trop[bot]
2d7e11a76c ci: use hermetic mac SDK for the release ffmpeg build (#50758)
* ci: use hermetic mac SDK for the release ffmpeg build

`gn gen out/ffmpeg` runs as a raw gn invocation, so it never receives the
mac_sdk_path arg that `e build` injects for out/Default. On macOS runners
that means out/Default builds against the hermetic build-tools SDK while
out/ffmpeg falls through to the runner's system Xcode SDK. Reuse the
value `e build` already wrote so both builds share the same sysroot.

Co-authored-by: Samuel Attard <sattard@anthropic.com>

* ci: copy hermetic SDK symlink into out/ffmpeg and rewrite path

mac_sdk_path must live under root_build_dir, so pointing out/ffmpeg at
//out/Default/... doesn't work. Copy the xcode_links symlink tree into
out/ffmpeg and rewrite the path. Gate on Darwin so Windows/Linux don't
run the sed/cp at all.

Co-authored-by: Samuel Attard <sattard@anthropic.com>

---------

Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: Samuel Attard <sattard@anthropic.com>
2026-04-06 23:15:27 -07:00
Shelley Vohr
3fc23369b6 chore: cherry-pick 89b42d2d3326 from chromium (#50624)
* chore: cherry-pick 89b42d2d3326 from chromium

* chore: update patches

---------

Co-authored-by: PatchUp <73610968+patchup[bot]@users.noreply.github.com>
2026-04-06 18:40:55 -05:00
trop[bot]
965ac948e0 ci: make src-cache upload atomic (#50749)
ci: make src-cache upload atomic and sweep orphaned temp files

The checkout action's cp of the ~6GB zstd archive directly to the final
path on the cache share is non-atomic; an interrupted copy or a
concurrent reader produces zstd "Read error (39): premature end" on
restore, and the truncated file then satisfies the existence check so
no later run repairs it.

Upload to a run-unique *.tar.upload-<run_id>-<attempt> temp name on the
share and mv to the final path, discarding our temp if a concurrent run
got there first. A new clean-orphaned-cache-uploads workflow removes
temp files older than 4h every 4 hours.
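
A condensed, runnable sketch of that temp-name-then-rename dance, with a
temp directory standing in for the cache share (all paths, the archive
contents, and the fallback run IDs are illustrative stand-ins for the real
checkout-action step):

```shell
share="$(mktemp -d)"                       # stands in for $CACHE_DRIVE
final="$share/v2-src-cache-deadbeef.tar"   # illustrative final path
tmp="$final.upload-${GITHUB_RUN_ID:-1}-${GITHUB_RUN_ATTEMPT:-1}"
echo "cache-bytes" > "$share/local-cache.tar"
cp "$share/local-cache.tar" "$tmp"         # never visible at the final name
if [ -f "$final" ]; then
  rm -f "$tmp"                             # a concurrent run won; keep theirs
else
  mv -f "$tmp" "$final"                    # rename is atomic within one filesystem
fi
```

Readers only ever see the final name once it is complete, and an interrupted
copy leaves behind only a run-unique temp file for the sweeper to remove.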

Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: Sam Attard <sattard@anthropic.com>
2026-04-06 22:07:51 +00:00
Samuel Attard
e58fcdece3 build: derive patches upstream-head ref from script path (39-x-y) (#50742)
build: derive patches upstream-head ref from script path (#50727)

* build: derive patches upstream-head ref from script path

gclient-new-workdir.py symlinks each repo's .git/refs back to the source
checkout, so the fixed refs/patches/upstream-head was shared across all
worktrees. Parallel `e sync` runs in different worktrees clobbered each
other's upstream-head, breaking `e patches` and check-patch-diff.

Suffix the ref with an md5 of the script directory so each worktree writes
a distinct ref into the shared refs dir. Fall back to the legacy ref name
in guess_base_commit so existing checkouts keep working until next sync.

* fixup: also write legacy upstream-head ref and note it in docs
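
A hypothetical sketch of the per-worktree ref naming this describes; the
helper name and use of the full 32-character digest are assumptions, not
the script's actual code:

```shell
# Derive a ref name unique to this worktree's script directory, so
# parallel worktrees sharing one refs dir never clobber each other.
worktree_ref() {
  script_dir="$1"
  digest="$(printf '%s' "$script_dir" | md5sum | cut -d' ' -f1)"
  printf 'refs/patches/upstream-head-%s\n' "$digest"
}
worktree_ref /work/electron-a/src/electron/script
worktree_ref /work/electron-b/src/electron/script
```

The two calls print distinct refs, which is the whole point: each worktree
writes its own `refs/patches/upstream-head-<digest>` into the shared dir.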
2026-04-06 16:08:35 -04:00
Samuel Attard
73552d2720 ci: use github mirror to get lint dependency versions (#50737)
ci: use github mirror to get lint dependency versions (#50733)

Co-authored-by: John Kleinschmidt <jkleinsc@electronjs.org>
2026-04-06 09:49:40 -07:00
Samuel Attard
da458ffd10 build: replace npx with lockfile-pinned binaries (#50711)
build: replace npx with lockfile-pinned binaries (#50598)

* build: replace npx with lockfile-pinned binaries

- nan-spec-runner: reorder yarn install first, invoke nan node-gyp bin directly
- publish-to-npm: use host npm with E404 try/catch (closes existing TODO)
- upload-symbols: add @sentry/cli devDep, invoke from node_modules/.bin
- remove script/lib/npx.py (dead since #48243)

* build: bump @sentry/cli to 1.70.0 for arm support

* build: bump @sentry/cli to 1.72.0, skip CDN download on test jobs

@sentry/cli fetches its platform binary from Sentry CDN at postinstall.
Only upload-symbols.py (release pipeline) needs the binary; set
SENTRYCLI_SKIP_DOWNLOAD=1 in the two test-segment workflows that
call install-dependencies. The 64k variant uses pre-built artifacts
and does not install deps.
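
A hedged sketch of what the test-segment workflows do: `SENTRYCLI_SKIP_DOWNLOAD`
is documented sentry-cli behavior, but the install command shown is
illustrative rather than the workflow's exact step:

```shell
# Set before dependency install so @sentry/cli's postinstall skips
# fetching the platform binary from the Sentry CDN.
export SENTRYCLI_SKIP_DOWNLOAD=1
# yarn install --frozen-lockfile   # postinstall now skips the binary fetch
echo "SENTRYCLI_SKIP_DOWNLOAD=$SENTRYCLI_SKIP_DOWNLOAD"
```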
2026-04-06 10:15:12 -04:00
Samuel Attard
0cbdf2f037 fix: propagate requesting frame through sync permission checks (#50714)
fix: propagate requesting frame through sync permission checks (#50679)

WebContentsPermissionHelper::CheckPermission was hardcoding
GetPrimaryMainFrame() and deriving the requesting origin from
web_contents_->GetLastCommittedURL(), so the setPermissionCheckHandler
callback always received the top frame's origin and
details.isMainFrame/details.requestingUrl always reflected the main
frame, even when a cross-origin subframe with allow="serial" or
allow="camera; microphone" triggered the check.

Thread the requesting RenderFrameHost through CheckPermission,
CheckSerialAccessPermission, and CheckMediaAccessPermission so the
permission manager receives the real requesting frame. Update the
serial delegate and WebContents::CheckMediaAccessPermission callers to
pass the frame they already have.

Adds a regression test that loads a cross-origin iframe with
allow="camera; microphone", calls enumerateDevices() from within the
iframe, and asserts the permission check handler receives the iframe
origin for requestingOrigin, isMainFrame, and requestingUrl.
2026-04-06 10:07:27 -04:00
Samuel Attard
5442f1d7fb ci: zstd-compress the src cache and drop the doubled win_toolchain (#50720)
ci: zstd-compress the src cache and drop the doubled win_toolchain (#50702)

* ci: shrink src cache and fix Windows tar cleanup

- Exclude platform-specific toolchains (llvm-build, rust-toolchain) from
  the src cache; all platforms now fetch them via fix-sync post-restore
- Exclude unused test data and benchmarks: blink/web_tests, jetstream,
  speedometer, catapult/tracing/test_data, swiftshader/tests/regres
- Fix Windows restore leaving the tarball on disk after extraction
  ($src_cache was scoped to the previous PowerShell step)
- Bump src-cache key v1 -> v2

* ci: fetch llvm/rust toolchains in gn-check and clang-tidy

These workflows restore the src cache but don't run fix-sync. Now that
llvm-build and rust-toolchain are excluded from the cache, they need to
download them directly — gn gen read_file()s both, and clang-tidy runs
the binary from llvm-build.

* ci: fetch clang-tidy package explicitly

update.py's default 'clang' package doesn't include the clang-tidy
binary; it ships as a separate package.

* ci: preserve blink/web_tests/BUILD.gn when stripping test data

//BUILD.gn references //third_party/blink/web_tests:wpt_tests as a
target label, so the BUILD.gn must exist for gn gen. The data = [...]
entries it declares are runtime-only and not existence-checked at gen
time, so the actual test directories can still be removed.

* ci: compress src cache with zstd and drop gclient sync -vv

The src cache was an uncompressed tar (~16GB after exclusions). Switch
to zstd -T0 --long=30 for ~4x smaller transfer and multi-threaded
compression. Decompress on restore:
- Linux/macOS: zstd -d -c | tar -xf -
- Windows: zstd -d to an intermediate .tar, then the existing 7z
  -snld20 extraction (preserves symlink handling)

All filename references updated .tar -> .tar.zst. -f added to the two
-o invocations so re-runs overwrite instead of failing.

Also drop -vv from gclient sync; default verbosity is sufficient.

* ci: keep .tar extension for src cache (zstd content inside)

The sas-sidecar that issues Azure SAS tokens validates filenames against
/^v[0-9]+-[a-z\-]+-[a-f0-9]+\.(tar|tgz)$/ and is not easily redeployed,
so keep the .tar extension and decode zstd on restore. Windows
decompresses to a distinct intermediate (src_cache.tar) so input and
output don't collide.
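
The constraint can be checked directly against the allowlist regex quoted
above (sample filenames are illustrative):

```shell
# Returns success iff a filename passes the sas-sidecar's allowlist.
allowed() {
  printf '%s\n' "$1" | grep -Eq '^v[0-9]+-[a-z-]+-[a-f0-9]+\.(tar|tgz)$'
}
allowed "v2-src-cache-0123abcd.tar" && echo "kept .tar: allowed"
allowed "v2-src-cache-0123abcd.tar.zst" || echo ".tar.zst: rejected"
```

This is why the cache keeps its `.tar` extension even though the bytes
inside are zstd: a `.tar.zst` name would never be granted a SAS token.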

* ci: log NTFS 8.3/lastaccess/Defender state before Windows cache extract

Temporary diagnostics to see whether 8.3 short-name generation is the
cause of the ~20 min tar extraction.

* ci: revert src-cache exclusion additions

The new exclusions (web_tests contents, jetstream, speedometer,
catapult test_data, regres, llvm-build, rust-toolchain) caused siso/RBE
cache misses — even data-only deps are part of action input hashes.
Revert to the original exclusion list and drop the corresponding
toolchain-fetch plumbing. zstd compression, the Windows tar cleanup,
and the -vv removal remain.

* ci: drop win_toolchain from src cache; remove NTFS diagnostics

The Windows src cache includes 14.6GB of depot_tools/win_toolchain —
7.3GB of MSVC/SDK doubled because tar captures both the vs_files.ciopfs
backing store and the live ciopfs mount at vs_files/. Every Windows
cache consumer already re-fetches this via vs_toolchain.py update
--force (fix-sync for build/publish, inline for gn-check/clang-tidy),
so the cached copy is never used.

Diagnostics removed — CI confirmed 8dot3, last-access, and Defender are
all already off on the AKS Windows nodes.

* ci: unmount ciopfs vs_files before removing win_toolchain

vs_files is a live ciopfs mount during the win-targeted checkout; rm -rf
fails with EBUSY until it's unmounted.

* ci: skip win_toolchain download during checkout instead of removing after

fusermount isn't on the checkout container, so the ciopfs mount can't be
torn down before rm. Setting DEPOT_TOOLS_WIN_TOOLCHAIN=0 makes the
win_toolchain hook a no-op (vs_toolchain.py:525-527), so there's no
download and no mount. All Windows consumers re-fetch it post-restore
anyway. The rm -rf stays as a safety net.

* ci: also set ELECTRON_DEPOT_TOOLS_WIN_TOOLCHAIN=0 for checkout sync

build.yml sets ELECTRON_DEPOT_TOOLS_WIN_TOOLCHAIN=1 at the job level for
the Windows checkout, which makes e d inject DEPOT_TOOLS_WIN_TOOLCHAIN=1
and override the inline =0. Need both: the ELECTRON_ var stops e d from
overriding, the plain one stops vs_toolchain.py from defaulting to 1.

* ci: extract Windows src cache with piped tar instead of 7z

7z takes ~20 min to extract the ~1.1M-entry tar regardless of size —
~1ms per entry of header parsing and path handling, single-threaded,
well under the 75k IOPS / 1000 MBps the ephemeral disk can do. Switch
to the same zstd -d | tar -xf - pipe used on Linux/macOS (via Git Bash
tar). No intermediate src_cache.tar, download deleted after extract.

The -snld20 flag was working around 7z's own "dangerous symlink"
refusal; GNU tar extracts symlinks as-is so it shouldn't be needed.
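
A back-of-envelope check of the timing claim above, using the rounded
figures from the text (~1.1M entries at roughly 1 ms of per-entry
overhead):

```shell
entries=1100000
ms_per_entry=1
# 1,100,000 ms -> seconds -> minutes
echo "$(( entries * ms_per_entry / 1000 / 60 )) minutes"
```

That lands at about 18 minutes, consistent with the ~20 min extractions
observed regardless of tool or disk throughput.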

* ci: keep depot_tools/win_toolchain scripts in src cache

The rm -rf removed get_toolchain_if_necessary.py (a depot_tools source
file), breaking vs_toolchain.py update --force on restore.
DEPOT_TOOLS_WIN_TOOLCHAIN=0 on the sync already prevents the vs_files
download, so the rm was only removing scripts.

* ci: split src cache into 4 parallel-extractable shards

Windows tar extraction is ~1ms/entry for ~1.2M entries (~20 min)
regardless of tool, well under the 75k IOPS / 1000 MBps the D16lds_v5
ephemeral disk can do. Tar is a sequential stream so the only way to
parallelize is to split at creation time.

Shards (balanced by entry count, ~220-360k each):
  a: src/third_party/blink
  b: src/third_party/{dawn,electron_node,tflite,devtools-frontend}
  c: src/third_party (rest)
  d: src (excluding third_party)

DEPSHASH is now the raw hash; shard files are
v2-src-cache-shard-{a..d}-${DEPSHASH}.tar (all pass the sas-sidecar
filename regex). sas-token is now a JSON keyed by shard letter. All
restore paths extract the four shards in parallel with per-PID wait so
a failed shard aborts the step.

* Revert "ci: split src cache into 4 parallel-extractable shards"

This reverts commit 970574998b.
2026-04-06 10:01:23 -04:00
trop[bot]
be77994af2 ci: fetch clang-tidy package in fix-sync (#50723)
fix-sync re-downloads llvm-build on macOS/Windows with the base clang
and objdump packages, but not clang-tidy. A local gclient sync pulls
clang-tidy (checkout_clang_tidy=True in DEPS), so CI's llvm-build tree
diverges from a local one. siso hashes the toolchain as action input,
so cache-only local runs against the CI-populated RBE cache miss.

Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: Samuel Attard <sattard@anthropic.com>
2026-04-06 07:34:27 +00:00
trop[bot]
2046ae8773 fix: validate dock_state_ against allowlist before JS execution (#50667)
fix: validate dock_state_ against allowlist before JS execution

The dock_state_ member was concatenated directly into a JavaScript
string and executed via ExecuteJavaScript() in the DevTools context.

For safety, validate against the four known dock states and fall back
to "right" for any unrecognized value.
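
An illustrative shell analog of the allowlist check (the actual fix is in
C++, and the four state names used here are assumptions; the commit only
names the "right" fallback):

```shell
# Map an untrusted dock-state string to a known-safe value before it is
# ever interpolated into executable code.
validate_dock_state() {
  case "$1" in
    right|bottom|left|undocked) printf '%s' "$1" ;;
    *) printf 'right' ;;  # safe fallback for unrecognized values
  esac
}
validate_dock_state bottom; echo
validate_dock_state '";alert(1);//'; echo
```

The second call shows the point of the allowlist: an injection-shaped
value never reaches the interpolation, only the safe fallback does.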

Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: Shelley Vohr <shelley.vohr@gmail.com>
2026-04-03 15:45:40 -05:00
Shelley Vohr
92892ca481 chore: cherry-pick d8b01057f740 from chromium (#50620)
* chore: cherry-pick d8b01057f740 from chromium

* fixup! chore: cherry-pick d8b01057f740 from chromium

* chore: manually update cherry-pick-d8b01057f740.patch for 142.0.7444.265

---------

Co-authored-by: Charles Kerr <charles@charleskerr.com>
2026-04-03 17:25:15 +02:00
John Kleinschmidt
0ba01d5cc6 ci: update actions to node24 (#50524)
ci: update actions to node24 (#50373)

* ci: update actions to node24

* chore: fixup actions/cache to 5.0.4 everywhere

(cherry picked from commit 639d3b99b7)
2026-03-31 15:27:05 +02:00
trop[bot]
a32b124d64 ci: update nick-fields/retry to v4.0.0 (#50545)
Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: John Kleinschmidt <kleinschmidtorama@gmail.com>
2026-03-31 15:15:43 +02:00
trop[bot]
8f26c7a1b8 fix: add missing HandleScope in contentTracing.getTraceBufferUsage() (#50595)
The `OnTraceBufferUsageAvailable` callback creates V8 handles via
`Dictionary::CreateEmpty()` before `promise.Resolve()` enters its
`SettleScope` (which provides a `HandleScope`). When the callback
fires asynchronously from a Mojo response (i.e. when a trace session
is active), there is no `HandleScope` on the stack, causing a fatal
V8 error: "Cannot create a handle without a HandleScope".

Add an explicit `v8::HandleScope` at the top of the callback, matching
the pattern used by the other contentTracing APIs which resolve their
promises through `SettleScope` or the static `ResolvePromise` helper.

Made-with: Cursor

Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: Alexey Kozy <alexey@anysphere.co>
2026-03-31 11:37:27 +02:00
Michaela Laurencin
45d03a5392 ci: add functionality for programmatic add/remove needs-signed-commits label (#50316) (#50585)
* remove comment based label removal

* ci: add functionality for programmatic add/remove needs-signed-commits label

* add new line to pull-request-opened-synchronized
2026-03-31 10:16:42 +02:00
Samuel Attard
c8be8adebf build: upload patch conflict fix as CI artifact (#50578)
* build: add patch conflict resolution workflow with CI artifacts (#50235)

ci: upload patch conflict fix as artifact in apply-patches

When patch-up.js cannot auto-push the 3-way-merged patch diff (e.g. on
fork PRs), the checkout action already writes patches/update-patches.patch
and tells the user to check CI artifacts — but nothing was uploading it.

This adds the missing upload-artifact step to the apply-patches job so
the resolved diff is available for download, and documents in CLAUDE.md
that pulling this artifact and applying it with `git am` is the fast
path for fixing patch conflicts on PR branches without a full local sync.

Co-authored-by: Claude <noreply@anthropic.com>
(cherry picked from commit 816e5964fb)

* build: skip archiving patch conflict fix artifact (#50251)

The update-patches artifact is a single .patch file, so zipping it
is unnecessary overhead. With archive: false, gh run download fetches
the raw file directly without requiring a decompression step.

Co-authored-by: Claude <noreply@anthropic.com>
(cherry picked from commit f4a50a8fde)

---------

Co-authored-by: Claude <noreply@anthropic.com>
2026-03-30 18:04:29 +00:00
trop[bot]
f10a9b784c refactor: improve input handling in FilePath gin converter (#50548)
refactor: improve input handling in file_path_converter

Properly handle paths containing ASCII control characters in the FilePath gin converter

Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: Keeley Hammond <vertedinde@electronjs.org>
2026-03-27 14:05:00 -07:00
trop[bot]
9d2f8cb4da refactor: remove dead named-window lookup from guest-window-manager (#50498)
The frameNamesToWindow map was a holdover from the BrowserWindowProxy
IPC shim. Since nativeWindowOpen became the only code path, Blink's
FrameTree::FindOrCreateFrameForNavigation resolves named window targets
directly in the renderer, scoped to the opener's browsing context
group. When a matching named window exists, Blink navigates it without
ever sending a CreateNewWindow IPC to the browser, so this map was
never consulted in the legitimate same-opener case.

The only time the map found a match was when two unrelated renderers
happened to use the same target name, in which case openGuestWindow
would short-circuit before consuming the guest WebContents that
Chromium had already created for the new window, leaking it.

Adds a test verifying Blink handles same-opener named-target reuse
end-to-end without any browser-side tracking.

Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: Sam Attard <sattard@anthropic.com>
2026-03-26 00:50:03 -07:00
trop[bot]
1173004739 fix: crash calling OSR shared texture release() after texture GC'd (#50499)
The weak persistent tracking the OffscreenReleaseHolderMonitor was tied
to the texture object, but the release() closure holds a raw pointer to
the monitor via its v8::External data. If JS retained texture.release
while dropping the texture itself, the monitor would be freed on GC and
a later release() call would crash.

Track the release function instead of the texture object. Since the
texture holds release as a property, this keeps the monitor alive as
long as either is reachable.

Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: Sam Attard <sattard@anthropic.com>
2026-03-26 00:49:22 -07:00
trop[bot]
be37adefd0 fix: crash in clipboard.readImage() on malformed image data (#50493)
gfx::PNGCodec::Decode() returns a null SkBitmap when it fails to decode
the clipboard contents as a PNG. Passing that null bitmap to
gfx::Image::CreateFrom1xBitmap() triggers a crash.

Return an empty gfx::Image instead, matching the existing null-check
pattern in skia_util.cc.

Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: Sam Attard <sattard@anthropic.com>
2026-03-25 22:16:26 -07:00
Keeley Hammond
7007907df0 chore: cherry-pick 3 changes from chromium (#50461)
* chore: cherry-pick 45c5a70d984d from chromium

Describe a vector of segments as "segments", not "tokens"

Bug: 487117772
Change-Id: I2dc132c4e618e398e1f8bdabc03a8d2ab6c118e7
Reviewed-on: https://chromium-review.googlesource.com/c/chromium/src/+/7606599
Commit-Queue: Anders Hartvoll Ruud <andruud@chromium.org>
Reviewed-by: Steinar H Gunderson <sesse@chromium.org>
Cr-Commit-Position: refs/heads/main@{#1590040}

Co-Authored-By: Claude <svc-devxp-claude@slack-corp.com>

* chore: cherry-pick 05e4b544803c from chromium

Stringify CSSUnparsedValues via toString, as normal

Bug: 484751092
Change-Id: I5db45ad85f780c67a2ea3ba8482c390ebab10068
Reviewed-on: https://chromium-review.googlesource.com/c/chromium/src/+/7600415
Commit-Queue: Anders Hartvoll Ruud <andruud@chromium.org>
Reviewed-by: Steinar H Gunderson <sesse@chromium.org>
Cr-Commit-Position: refs/heads/main@{#1590041}

Co-Authored-By: Claude <svc-devxp-claude@slack-corp.com>

* chore: cherry-pick 5efc7a0127a6 from chromium

Validate CSSUnparsedValues upon assignment

Fixed: 484751092
Change-Id: Id7f888a6df8c02ade24910900f5d01909cb2dfad
Reviewed-on: https://chromium-review.googlesource.com/c/chromium/src/+/7595347
Reviewed-by: Steinar H Gunderson <sesse@chromium.org>
Commit-Queue: Anders Hartvoll Ruud <andruud@chromium.org>
Cr-Commit-Position: refs/heads/main@{#1590110}

Co-Authored-By: Claude <svc-devxp-claude@slack-corp.com>

* chore: update patches

Co-Authored-By: Claude <svc-devxp-claude@slack-corp.com>

---------

Co-authored-by: Claude <svc-devxp-claude@slack-corp.com>
2026-03-25 01:42:34 +00:00
Keeley Hammond
2c8b6ee0c0 chore: cherry-pick fbfb27470bf6 from chromium (#50436)
* chore: cherry-pick fbfb27470bf6 from chromium

* chore: cherry-pick bf6dd974238b from angle (#50435)

* fix: remove duplicate MaxGeometryUniformBlocks from angle cherry-pick patch

The angle cherry-pick added MaxGeometryUniformBlocks in new locations,
but it already existed in the EXT_geometry_shader section, causing a
duplicate struct member build error in ShaderLang.h.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-24 23:22:22 +00:00
Keeley Hammond
4c64377ead chore: cherry-pick 50b057660b4d from chromium (#50440)
* chore: cherry-pick 50b057660b4d from chromium

* chore: update patches

---------

Co-authored-by: PatchUp <73610968+patchup[bot]@users.noreply.github.com>
Co-authored-by: John Kleinschmidt <jkleinsc@electronjs.org>
2026-03-24 19:01:06 -04:00
Keeley Hammond
0ef056130c fix: read nodeIntegrationInWorker from per-frame WebPreferences (#50122) (#50468)
* fix: read nodeIntegrationInWorker from per-frame WebPreferences (#50122)

Previously the renderer checked a process-wide command-line switch to
decide whether to create a Node.js environment for dedicated workers.
When a renderer process hosted multiple WebContents with different
nodeIntegrationInWorker values (e.g. via window.open with overridden
webPreferences in setWindowOpenHandler), all workers in the process
used whichever value the first WebContents set on the command line.

Instead, plumb the flag through blink's WorkerSettings at worker
creation time, copying it from the initiating frame's WebPreferences.
The check on the worker thread then reads the per-worker value. Nested
workers inherit the flag from their parent worker via
WorkerSettings::Copy.

The --node-integration-in-worker command-line switch is removed as it
is no longer consumed.

* fix: restore base/command_line.h include needed by SetUpWebAssemblyTrapHandler

The backported PR removed this include (matching main where
SetUpWebAssemblyTrapHandler was refactored), but on 39-x-y the function
still uses base::CommandLine.

Co-Authored-By: Claude <svc-devxp-claude@slack-corp.com>

---------

Co-authored-by: Samuel Attard <sam@electronjs.org>
Co-authored-by: Claude <svc-devxp-claude@slack-corp.com>
2026-03-24 22:49:16 +00:00
Keeley Hammond
64373df3ca chore: cherry-pick 074d472db745 from chromium (#50443)
* chore: cherry-pick 074d472db745 from chromium

* chore: update patches

---------

Co-authored-by: PatchUp <73610968+patchup[bot]@users.noreply.github.com>
2026-03-23 21:40:49 -04:00
trop[bot]
13e44072be fix: don't re-parse URL unnecessarily when handling dialogs (#50400)
* fix: fallback to opaque URL when needed inside dialog callback

Co-authored-by: Noah Gregory <noahmgregory@gmail.com>

* refactor: remove additional URL parsing entirely when showing dialogs

Co-authored-by: Noah Gregory <noahmgregory@gmail.com>

* test: add crash test case for URL-less dialogs

Co-authored-by: Noah Gregory <noahmgregory@gmail.com>

* refactor: exit on events instead of on timeout for dialog crash test

Co-authored-by: Robo <hop2deep@gmail.com>

Co-authored-by: Noah Gregory <noahmgregory@gmail.com>

* style: make linter happy

Co-authored-by: Noah Gregory <noahmgregory@gmail.com>

* style: make linter actually happy

Co-authored-by: Noah Gregory <noahmgregory@gmail.com>

* fix: address failing `safeDialogs` tests

Co-authored-by: Noah Gregory <noahmgregory@gmail.com>

---------

Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: Noah Gregory <noahmgregory@gmail.com>
2026-03-20 17:31:06 -04:00
trop[bot]
16a038502a ci: output build cache hit rate as GHA annotation (#50369)
Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: David Sanders <dsanders11@ucsbalum.com>
2026-03-19 14:53:25 -04:00
trop[bot]
00a492d282 chore: Respect HTTP(S) proxy env variable for Yarn (#50349)
Respect HTTP(S) proxy env variable for Yarn

Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: Filip Mösner <filip.mosner@seznam.cz>
2026-03-18 19:15:19 -04:00
trop[bot]
290a77b843 fix: correctly track BaseWindow::IsActive() on MacOS (#50338)
fix: correctly set IsActive() in BaseWindow on MacOS

Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: Kyle Cutler <kycutler@microsoft.com>
2026-03-18 19:12:31 -04:00
trop[bot]
87baa17e65 fix: ensure WebContents::WasShown runs when window is shown (#50341)
Avoids a freeze when failing to enter fullscreen on macOS.

Co-authored-by: trop[bot] <37223003+trop[bot]@users.noreply.github.com>
Co-authored-by: John Beutner <beutner.john@gmail.com>
2026-03-18 14:57:20 -04:00
65 changed files with 2790 additions and 210 deletions

View File

@@ -174,7 +174,17 @@ runs:
if: ${{ inputs.is-release == 'true' }}
run: |
cd src
gn gen out/ffmpeg --args="import(\"//electron/build/args/ffmpeg.gn\") use_remoteexec=true use_siso=true $GN_EXTRA_ARGS"
# Reuse the hermetic mac_sdk_path that `e build` wrote for out/Default so
# out/ffmpeg builds against the same SDK instead of the runner's system Xcode.
# The path has to live under root_build_dir, so copy the symlink tree and
# rewrite Default -> ffmpeg.
MAC_SDK_ARG=""
if [ "$(uname)" = "Darwin" ]; then
mkdir -p out/ffmpeg
cp -a out/Default/xcode_links out/ffmpeg/
MAC_SDK_ARG=$(sed -n 's|^\(mac_sdk_path = "//out/\)Default/|\1ffmpeg/|p' out/Default/args.gn)
fi
gn gen out/ffmpeg --args="import(\"//electron/build/args/ffmpeg.gn\") use_remoteexec=true use_siso=true $MAC_SDK_ARG $GN_EXTRA_ARGS"
e build --target electron:electron_ffmpeg_zip -C ../../out/ffmpeg
- name: Remove Clang problem matcher
shell: bash
@@ -243,12 +253,12 @@ runs:
run: ./src/electron/script/actions/move-artifacts.sh
- name: Upload Generated Artifacts ${{ inputs.step-suffix }}
if: always() && !cancelled()
uses: actions/upload-artifact@65462800fd760344b1a7b4382951275a0abb4808
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f #v7.0.0
with:
name: generated_artifacts_${{ env.ARTIFACT_KEY }}
path: ./generated_artifacts_${{ inputs.artifact-platform }}_${{ inputs.target-arch }}
- name: Upload Src Artifacts ${{ inputs.step-suffix }}
uses: actions/upload-artifact@65462800fd760344b1a7b4382951275a0abb4808
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f #v7.0.0
with:
name: src_artifacts_${{ env.ARTIFACT_KEY }}
path: ./src_artifacts_${{ inputs.artifact-platform }}_${{ inputs.target-arch }}

View File

@@ -28,7 +28,7 @@ runs:
shell: bash
run: |
node src/electron/script/generate-deps-hash.js
DEPSHASH="v1-src-cache-$(cat src/electron/.depshash)"
DEPSHASH="v2-src-cache-$(cat src/electron/.depshash)"
echo "DEPSHASH=$DEPSHASH" >> $GITHUB_ENV
echo "CACHE_FILE=$DEPSHASH.tar" >> $GITHUB_ENV
if [ "${{ inputs.target-platform }}" = "win" ]; then
@@ -43,7 +43,7 @@ runs:
curl --unix-socket /var/run/sas/sas.sock --fail "http://foo/$CACHE_FILE?platform=${{ inputs.target-platform }}&getAccountName=true" > sas-token
- name: Save SAS Key
if: ${{ inputs.generate-sas-token == 'true' }}
uses: actions/cache/save@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v5.0.3
uses: actions/cache/save@668228422ae6a00e4ad889ee87cd7109ec5666a7 # v5.0.4
with:
path: sas-token
key: sas-key-${{ inputs.target-platform }}-${{ github.run_number }}-${{ github.run_attempt }}
@@ -109,7 +109,7 @@ runs:
echo "target_os=['$TARGET_OS']" >> ./.gclient
fi
ELECTRON_USE_THREE_WAY_MERGE_FOR_PATCHES=1 e d gclient sync --with_branch_heads --with_tags -vv
ELECTRON_DEPOT_TOOLS_WIN_TOOLCHAIN=0 DEPOT_TOOLS_WIN_TOOLCHAIN=0 ELECTRON_USE_THREE_WAY_MERGE_FOR_PATCHES=1 e d gclient sync --with_branch_heads --with_tags
if [[ "${{ inputs.is-release }}" != "true" ]]; then
# Re-export all the patches to check if there were changes.
python3 src/electron/script/export_all_patches.py src/electron/patches/config.json
@@ -187,21 +187,35 @@ runs:
shell: bash
run: |
echo "Uncompressed src size: $(du -sh src | cut -f1 -d' ')"
tar -cf $CACHE_FILE src
# Named .tar but zstd-compressed; the sas-sidecar's filename allowlist
# only permits .tar/.tgz so we keep the extension and decode on restore.
tar -cf - src | zstd -T0 --long=30 -f -o $CACHE_FILE
echo "Compressed src to $(du -sh $CACHE_FILE | cut -f1 -d' ')"
cp ./$CACHE_FILE $CACHE_DRIVE/
- name: Persist Src Cache
if: ${{ steps.check-cache.outputs.cache_exists == 'false' && inputs.use-cache == 'true' }}
shell: bash
run: |
final_cache_path=$CACHE_DRIVE/$CACHE_FILE
# Upload to a run-unique temp name first so concurrent readers never
# observe a partially-written file, and an interrupted copy can't leave
# a truncated file at the final path. Orphaned temp files get swept by
# the clean-orphaned-cache-uploads workflow.
tmp_cache_path=$final_cache_path.upload-${GITHUB_RUN_ID}-${GITHUB_RUN_ATTEMPT}
echo "Uploading to temp path: $tmp_cache_path"
cp ./$CACHE_FILE $tmp_cache_path
echo "Using cache key: $DEPSHASH"
echo "Checking path: $final_cache_path"
if [ -f "$final_cache_path" ]; then
echo "Cache already persisted at $final_cache_path by a concurrent run; discarding ours"
rm -f $tmp_cache_path
else
mv -f $tmp_cache_path $final_cache_path
echo "Cache key persisted in $final_cache_path"
fi
if [ ! -f "$final_cache_path" ]; then
echo "Cache key not found"
exit 1
else
echo "Cache key persisted in $final_cache_path"
fi
- name: Wait for active SSH sessions
shell: bash

View File

@@ -27,6 +27,7 @@ runs:
python3 src/tools/clang/scripts/update.py
# Refs https://chromium-review.googlesource.com/c/chromium/src/+/6667681
python3 src/tools/clang/scripts/update.py --package objdump
python3 src/tools/clang/scripts/update.py --package clang-tidy
- name: Fix esbuild
if: ${{ inputs.target-platform != 'linux' }}
uses: ./src/electron/.github/actions/cipd-install

View File

@@ -7,7 +7,7 @@ runs:
shell: bash
id: yarn-cache-dir-path
run: echo "dir=$(node src/electron/script/yarn.js config get cacheFolder)" >> $GITHUB_OUTPUT
- uses: actions/cache@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v5.0.3
- uses: actions/cache@668228422ae6a00e4ad889ee87cd7109ec5666a7 # v5.0.4
id: yarn-cache
with:
path: ${{ steps.yarn-cache-dir-path.outputs.dir }}

View File

@@ -31,7 +31,7 @@ runs:
fi
mkdir temp-cache
tar -xf $cache_path -C temp-cache
zstd -d --long=30 -c $cache_path | tar -xf - -C temp-cache
echo "Unzipped cache is $(du -sh temp-cache/src | cut -f1)"
if [ -d "temp-cache/src" ]; then

View File

@@ -8,14 +8,14 @@ runs:
steps:
- name: Obtain SAS Key
continue-on-error: true
uses: actions/cache/restore@5a3ec84eff668545956fd18022155c47e93e2684 # v4.2.3
uses: actions/cache/restore@668228422ae6a00e4ad889ee87cd7109ec5666a7 # v5.0.4
with:
path: sas-token
key: sas-key-${{ inputs.target-platform }}-${{ github.run_number }}-1
enableCrossOsArchive: true
- name: Obtain SAS Key
continue-on-error: true
uses: actions/cache/restore@5a3ec84eff668545956fd18022155c47e93e2684 # v4.2.3
uses: actions/cache/restore@668228422ae6a00e4ad889ee87cd7109ec5666a7 # v5.0.4
with:
path: sas-token
key: sas-key-${{ inputs.target-platform }}-${{ github.run_number }}-${{ github.run_attempt }}
@@ -24,7 +24,7 @@ runs:
# The cache will always exist here as a result of the checkout job
# Either it was uploaded to Azure in the checkout job for this commit
# or it was uploaded in the checkout job for a previous commit.
uses: nick-fields/retry@7152eba30c6575329ac0576536151aca5a72780e # v3.0.0
uses: nick-fields/retry@ad984534de44a9489a53aefd81eb77f87c70dc60 # v4.0.0
with:
timeout_minutes: 30
max_attempts: 3
@@ -61,9 +61,9 @@ runs:
echo "Cache is empty - exiting"
exit 1
fi
mkdir temp-cache
- tar -xf $DEPSHASH.tar -C temp-cache
+ zstd -d --long=30 -c $DEPSHASH.tar | tar -xf - -C temp-cache
echo "Unzipped cache is $(du -sh temp-cache/src | cut -f1)"
if [ -d "temp-cache/src" ]; then
@@ -85,23 +85,21 @@ runs:
- name: Unzip and Ensure Src Cache (Windows)
if: ${{ inputs.target-platform == 'win' }}
- shell: powershell
+ shell: bash
run: |
- $src_cache = "$env:DEPSHASH.tar"
- $cache_size = $(Get-Item $src_cache).length
- Write-Host "Downloaded cache is $cache_size"
- if ($cache_size -eq 0) {
- Write-Host "Cache is empty - exiting"
+ echo "Downloaded cache is $(du -sh $DEPSHASH.tar | cut -f1)"
+ if [ `du $DEPSHASH.tar | cut -f1` = "0" ]; then
+ echo "Cache is empty - exiting"
exit 1
- }
+ fi
- $TEMP_DIR=New-Item -ItemType Directory -Path temp-cache
- $TEMP_DIR_PATH = $TEMP_DIR.FullName
- C:\ProgramData\Chocolatey\bin\7z.exe -y -snld20 x $src_cache -o"$TEMP_DIR_PATH"
+ mkdir temp-cache
+ zstd -d --long=30 -c $DEPSHASH.tar | tar -xf - -C temp-cache
+ rm -f $DEPSHASH.tar
- name: Move Src Cache (Windows)
if: ${{ inputs.target-platform == 'win' }}
- uses: nick-fields/retry@7152eba30c6575329ac0576536151aca5a72780e # v3.0.0
+ uses: nick-fields/retry@ad984534de44a9489a53aefd81eb77f87c70dc60 # v4.0.0
with:
timeout_minutes: 30
max_attempts: 3
@@ -112,9 +110,6 @@ runs:
Write-Host "Relocating Cache"
Remove-Item -Recurse -Force src
Move-Item temp-cache\src src
- Write-Host "Deleting zip file"
- Remove-Item -Force $src_cache
}
if (-Not (Test-Path "src\third_party\blink")) {
Write-Host "Cache was not correctly restored - exiting"


@@ -71,3 +71,11 @@ jobs:
uses: ./src/electron/.github/actions/checkout
with:
target-platform: linux
- name: Upload Patch Conflict Fix
if: ${{ failure() }}
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
with:
name: update-patches
path: patches/update-patches.patch
if-no-files-found: ignore
archive: false


@@ -431,3 +431,30 @@ jobs:
- name: GitHub Actions Jobs Done
run: |
echo "All GitHub Actions Jobs are done"
check-signed-commits:
name: Check signed commits in green PR
needs: gha-done
if: ${{ contains(github.event.pull_request.labels.*.name, 'needs-signed-commits')}}
runs-on: ubuntu-slim
permissions:
contents: read
pull-requests: write
steps:
- name: Check signed commits in PR
uses: 1Password/check-signed-commits-action@ed2885f3ed2577a4f5d3c3fe895432a557d23d52 # v1
with:
comment: |
⚠️ This PR contains unsigned commits. This repository enforces [commit signatures](https://docs.github.com/en/authentication/managing-commit-signature-verification)
for all incoming PRs. To get your PR merged, please sign those commits
(`git rebase --exec 'git commit -S --amend --no-edit -n' @{upstream}`) and force push them to this branch
(`git push --force-with-lease`)
For more information on signing commits, see GitHub's documentation on [Telling Git about your signing key](https://docs.github.com/en/authentication/managing-commit-signature-verification/telling-git-about-your-signing-key).
- name: Remove needs-signed-commits label
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
PR_URL: ${{ github.event.pull_request.html_url }}
run: |
gh pr edit $PR_URL --remove-label needs-signed-commits


@@ -0,0 +1,32 @@
name: Clean Orphaned Cache Uploads
# Description:
# Sweeps orphaned in-flight upload temp files left on the src-cache volumes
# by checkout/action.yml when its cp-to-share step dies before the rename.
# A successful upload finishes in minutes, so anything older than 4h is dead.
on:
schedule:
- cron: "0 */4 * * *"
workflow_dispatch:
permissions: {}
jobs:
clean-orphaned-uploads:
if: github.repository == 'electron/electron'
runs-on: electron-arc-centralus-linux-amd64-32core
permissions:
contents: read
container:
image: ghcr.io/electron/build:bc2f48b2415a670de18d13605b1cf0eb5fdbaae1
options: --user root
volumes:
- /mnt/cross-instance-cache:/mnt/cross-instance-cache
- /mnt/win-cache:/mnt/win-cache
steps:
- name: Remove Orphaned Upload Temp Files
shell: bash
run: |
find /mnt/cross-instance-cache -maxdepth 1 -type f -name '*.tar.upload-*' -mmin +240 -print -delete
find /mnt/win-cache -maxdepth 1 -type f -name '*.tar.upload-*' -mmin +240 -print -delete
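The temp files this workflow sweeps come from the checkout action's upload step. A minimal sketch of that publish pattern, with illustrative file names and run IDs rather than the actual action code:

```shell
# Illustrative sketch (not the actual checkout action): write to a
# run-unique temp name on the same filesystem, then rename, so readers
# never observe a partially copied archive.
set -eu
share=$(mktemp -d)                 # stand-in for /mnt/cross-instance-cache
final="$share/deadbeef.tar"
tmp="$final.upload-1234-1"         # *.tar.upload-<run_id>-<attempt>
printf 'archive-bytes' > "$tmp"    # the (interruptible) bulk copy
if [ -e "$final" ]; then
  rm -f "$tmp"                     # a concurrent run published first; keep theirs
else
  mv "$tmp" "$final"               # rename(2) is atomic within one filesystem
fi
cat "$final"
```

If the bulk copy dies before the `mv`, the leftover `*.tar.upload-*` file is exactly what the `find ... -mmin +240 -delete` sweep above removes.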


@@ -35,7 +35,7 @@ jobs:
- name: Generate DEPS Hash
run: |
node src/electron/script/generate-deps-hash.js
- DEPSHASH=v1-src-cache-$(cat src/electron/.depshash)
+ DEPSHASH=v2-src-cache-$(cat src/electron/.depshash)
echo "DEPSHASH=$DEPSHASH" >> $GITHUB_ENV
echo "CACHE_PATH=$DEPSHASH.tar" >> $GITHUB_ENV
- name: Restore src cache via AKS


@@ -46,7 +46,7 @@ jobs:
shell: bash
run: |
chromium_revision="$(grep -A1 chromium_version src/electron/DEPS | tr -d '\n' | cut -d\' -f4)"
- gn_version="$(curl -sL -b ~/.gitcookies "https://chromium.googlesource.com/chromium/src/+/${chromium_revision}/DEPS?format=TEXT" | base64 -d | grep gn_version | head -n1 | cut -d\' -f4)"
+ gn_version="$(curl -sL "https://raw.githubusercontent.com/chromium/chromium/refs/tags/${chromium_revision}/DEPS" | grep gn_version | head -n1 | cut -d\' -f4)"
cipd ensure -ensure-file - -root . <<-CIPD
\$ServiceURL https://chrome-infra-packages.appspot.com/
@@ -62,7 +62,7 @@ jobs:
chromium_revision="$(grep -A1 chromium_version src/electron/DEPS | tr -d '\n' | cut -d\' -f4)"
mkdir -p src/buildtools
- curl -sL -b ~/.gitcookies "https://chromium.googlesource.com/chromium/src/+/${chromium_revision}/buildtools/DEPS?format=TEXT" | base64 -d > src/buildtools/DEPS
+ curl -sL "https://raw.githubusercontent.com/chromium/chromium/refs/tags/${chromium_revision}/buildtools/DEPS" > src/buildtools/DEPS
gclient sync --spec="solutions=[{'name':'src/buildtools','url':None,'deps_file':'DEPS','custom_vars':{'process_deps':True},'managed':False}]"
- name: Add problem matchers


@@ -151,7 +151,7 @@ jobs:
- name: Generate DEPS Hash
run: |
node src/electron/script/generate-deps-hash.js
- DEPSHASH=v1-src-cache-$(cat src/electron/.depshash)
+ DEPSHASH=v2-src-cache-$(cat src/electron/.depshash)
echo "DEPSHASH=$DEPSHASH" >> $GITHUB_ENV
echo "CACHE_PATH=$DEPSHASH.tar" >> $GITHUB_ENV
- name: Restore src cache via AZCopy


@@ -81,7 +81,7 @@ jobs:
- name: Generate DEPS Hash
run: |
node src/electron/script/generate-deps-hash.js
- DEPSHASH=v1-src-cache-$(cat src/electron/.depshash)
+ DEPSHASH=v2-src-cache-$(cat src/electron/.depshash)
echo "DEPSHASH=$DEPSHASH" >> $GITHUB_ENV
echo "CACHE_PATH=$DEPSHASH.tar" >> $GITHUB_ENV
- name: Restore src cache via AZCopy


@@ -160,7 +160,7 @@ jobs:
- name: Generate DEPS Hash
run: |
node src/electron/script/generate-deps-hash.js
- DEPSHASH=v1-src-cache-$(cat src/electron/.depshash)
+ DEPSHASH=v2-src-cache-$(cat src/electron/.depshash)
echo "DEPSHASH=$DEPSHASH" >> $GITHUB_ENV
echo "CACHE_PATH=$DEPSHASH.tar" >> $GITHUB_ENV
- name: Restore src cache via AZCopy


@@ -43,6 +43,8 @@ env:
ELECTRON_OUT_DIR: Default
ELECTRON_RBE_JWT: ${{ secrets.ELECTRON_RBE_JWT }}
ACTIONS_STEP_DEBUG: ${{ secrets.ACTIONS_STEP_DEBUG }}
# @sentry/cli is only needed by release upload-symbols.py; skip the ~17MB CDN download on test jobs
SENTRYCLI_SKIP_DOWNLOAD: 1
jobs:
test:
@@ -289,7 +291,7 @@ jobs:
if: always() && !cancelled()
- name: Upload Test Artifacts
if: always()
- uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02
+ uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
with:
name: test_artifacts_${{ env.ARTIFACT_KEY }}_${{ matrix.shard }}
path: src/electron/spec/artifacts


@@ -36,6 +36,8 @@ env:
CHROMIUM_GIT_COOKIE: ${{ secrets.CHROMIUM_GIT_COOKIE }}
ELECTRON_OUT_DIR: Default
ELECTRON_RBE_JWT: ${{ secrets.ELECTRON_RBE_JWT }}
# @sentry/cli is only needed by release upload-symbols.py; skip the ~17MB CDN download on test jobs
SENTRYCLI_SKIP_DOWNLOAD: 1
jobs:
node-tests:


@@ -0,0 +1,35 @@
name: Pull Request Opened/Synchronized
on:
pull_request_target:
types: [opened, synchronize]
permissions: {}
jobs:
check-signed-commits:
name: Check signed commits in PR
if: ${{ !contains(github.event.pull_request.labels.*.name, 'needs-signed-commits')}}
runs-on: ubuntu-slim
permissions:
contents: read
pull-requests: write
steps:
- name: Check signed commits in PR
uses: 1Password/check-signed-commits-action@ed2885f3ed2577a4f5d3c3fe895432a557d23d52 # v1
with:
comment: |
⚠️ This PR contains unsigned commits. This repository enforces [commit signatures](https://docs.github.com/en/authentication/managing-commit-signature-verification)
for all incoming PRs. To get your PR merged, please sign those commits
(`git rebase --exec 'git commit -S --amend --no-edit -n' @{upstream}`) and force push them to this branch
(`git push --force-with-lease`)
For more information on signing commits, see GitHub's documentation on [Telling Git about your signing key](https://docs.github.com/en/authentication/managing-commit-signature-verification/telling-git-about-your-signing-key).
- name: Add needs-signed-commits label
if: ${{ failure() }}
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
PR_URL: ${{ github.event.pull_request.html_url }}
run: |
gh pr edit $PR_URL --add-label needs-signed-commits


@@ -9,4 +9,8 @@ npmMinimalAgeGate: 10080
npmPreapprovedPackages:
- "@electron/*"
httpProxy: "${HTTP_PROXY:-}"
httpsProxy: "${HTTPS_PROXY:-}"
yarnPath: .yarn/releases/yarn-4.12.0.cjs


@@ -79,7 +79,7 @@ $ ../../electron/script/git-import-patches ../../electron/patches/node
$ ../../electron/script/git-export-patches -o ../../electron/patches/node
```
- Note that `git-import-patches` will mark the commit that was `HEAD` when it was run as `refs/patches/upstream-head`. This lets you keep track of which commits are from Electron patches (those that come after `refs/patches/upstream-head`) and which commits are in upstream (those before `refs/patches/upstream-head`).
+ Note that `git-import-patches` will mark the commit that was `HEAD` when it was run as `refs/patches/upstream-head` (and a checkout-specific `refs/patches/upstream-head-<hash>` so that gclient worktrees sharing a `.git/refs` directory don't clobber each other). This lets you keep track of which commits are from Electron patches (those that come after `refs/patches/upstream-head`) and which commits are in upstream (those before `refs/patches/upstream-head`).
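The marker ref can be used directly to separate the two ranges; a minimal, self-contained illustration (the repository and commit messages here are hypothetical):

```shell
# Hypothetical illustration: with refs/patches/upstream-head marking the
# upstream tip, everything after it is a patch commit.
set -eu
repo=$(mktemp -d) && cd "$repo"
git init -q
git config user.email test@example.com && git config user.name test
git commit -q --allow-empty -m 'upstream commit'
git update-ref refs/patches/upstream-head HEAD
git commit -q --allow-empty -m 'electron patch'
# Count only the patch commits (those after the marker):
git rev-list --count refs/patches/upstream-head..HEAD   # -> 1
```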
#### Resolving conflicts


@@ -777,8 +777,7 @@ WebContents.prototype._init = function () {
const originCounts = new Map<string, number>();
const openDialogs = new Set<AbortController>();
this.on('-run-dialog', async (info, callback) => {
- const originUrl = new URL(info.frame.url);
- const origin = originUrl.protocol === 'file:' ? originUrl.href : originUrl.origin;
+ const origin = info.frame.origin === 'file://' ? info.frame.url : info.frame.origin;
if ((originCounts.get(origin) ?? 0) < 0) return callback(false, '');
const prefs = this.getLastWebPreferences();
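The distinction both versions of this code make: `URL.origin` is meaningful for web schemes but opaque for `file:` URLs, so file dialogs are keyed by the full URL. A standalone sketch of the underlying behavior — this is plain WHATWG URL semantics as implemented in Node.js, not Electron code (note Chromium's frame origin serializes file origins as `'file://'`, while the WHATWG `URL` class serializes them as `'null'`):

```javascript
// Plain WHATWG URL behavior, runnable in Node.js (not Electron code).
// http(s) URLs collapse to a useful origin...
const webOrigin = new URL('https://example.com/a/b?x=1').origin;
console.log(webOrigin); // 'https://example.com'
// ...but file: URLs have an opaque origin, serialized as the string 'null',
// which is why a per-origin counter must fall back to the full URL for them.
const fileOrigin = new URL('file:///home/user/index.html').origin;
console.log(fileOrigin); // 'null'
```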


@@ -17,11 +17,6 @@ export type WindowOpenArgs = {
features: string,
}
- const frameNamesToWindow = new Map<string, WebContents>();
- const registerFrameNameToGuestWindow = (name: string, webContents: WebContents) => frameNamesToWindow.set(name, webContents);
- const unregisterFrameName = (name: string) => frameNamesToWindow.delete(name);
- const getGuestWebContentsByFrameName = (name: string) => frameNamesToWindow.get(name);
/**
* `openGuestWindow` is called to create and setup event handling for the new
* window.
@@ -47,20 +42,6 @@ export function openGuestWindow ({ embedder, guest, referrer, disposition, postD
...overrideBrowserWindowOptions
};
- // To spec, subsequent window.open calls with the same frame name (`target` in
- // spec parlance) will reuse the previous window.
- // https://html.spec.whatwg.org/multipage/window-object.html#apis-for-creating-and-navigating-browsing-contexts-by-name
- const existingWebContents = getGuestWebContentsByFrameName(frameName);
- if (existingWebContents) {
- if (existingWebContents.isDestroyed()) {
- // FIXME(t57ser): The webContents is destroyed for some reason, unregister the frame name
- unregisterFrameName(frameName);
- } else {
- existingWebContents.loadURL(url);
- return;
- }
- }
if (createWindow) {
const webContents = createWindow({
webContents: guest,
@@ -72,7 +53,7 @@ export function openGuestWindow ({ embedder, guest, referrer, disposition, postD
throw new Error('Invalid webContents. Created window should be connected to webContents passed with options object.');
}
- handleWindowLifecycleEvents({ embedder, frameName, guest, outlivesOpener });
+ handleWindowLifecycleEvents({ embedder, guest, outlivesOpener });
}
return;
@@ -96,7 +77,7 @@ export function openGuestWindow ({ embedder, guest, referrer, disposition, postD
});
}
- handleWindowLifecycleEvents({ embedder, frameName, guest: window.webContents, outlivesOpener });
+ handleWindowLifecycleEvents({ embedder, guest: window.webContents, outlivesOpener });
embedder.emit('did-create-window', window, { url, frameName, options: browserWindowOptions, disposition, referrer, postData });
}
@@ -107,10 +88,9 @@ export function openGuestWindow ({ embedder, guest, referrer, disposition, postD
* too is the guest destroyed; this is Electron convention and isn't based in
* browser behavior.
*/
- const handleWindowLifecycleEvents = function ({ embedder, guest, frameName, outlivesOpener }: {
+ const handleWindowLifecycleEvents = function ({ embedder, guest, outlivesOpener }: {
embedder: WebContents,
guest: WebContents,
- frameName: string,
outlivesOpener: boolean
}) {
const closedByEmbedder = function () {
@@ -128,13 +108,6 @@ const handleWindowLifecycleEvents = function ({ embedder, guest, frameName, outl
embedder.once('current-render-view-deleted' as any, closedByEmbedder);
}
guest.once('destroyed', closedByUser);
- if (frameName) {
- registerFrameNameToGuestWindow(frameName, guest);
- guest.once('destroyed', function () {
- unregisterFrameName(frameName);
- });
- }
};
// Security options that child windows will always inherit from parent windows


@@ -14,6 +14,7 @@
"@electron/typescript-definitions": "^9.1.2",
"@octokit/rest": "^20.1.2",
"@primer/octicons": "^10.0.0",
"@sentry/cli": "1.72.0",
"@types/minimist": "^1.2.5",
"@types/node": "^22.7.7",
"@types/semver": "^7.5.8",
@@ -150,6 +151,9 @@
"spec/fixtures/native-addon/*"
],
"dependenciesMeta": {
"@sentry/cli": {
"built": true
},
"abstract-socket": {
"built": true
}


@@ -1 +1,2 @@
cherry-pick-a08731cf6d70.patch
cherry-pick-bf6dd974238b.patch


@@ -0,0 +1,376 @@
From 0000000000000000000000000000000000000000 Mon Sep 17 00:00:00 2001
From: Geoff Lang <geofflang@chromium.org>
Date: Wed, 11 Feb 2026 15:51:46 -0500
Subject: Optionally validate GL_MAX_*_UNIFORM_BLOCKS at compile time.
These were validated at link time but some drivers have compiler crashes
when compiling shaders with too many uniform blocks.
Bug: chromium:475877320
Change-Id: I4413ce06307b4fe9e27105d85f66f610c235a301
Reviewed-on: https://chromium-review.googlesource.com/c/angle/angle/+/7568089
Commit-Queue: Geoff Lang <geofflang@chromium.org>
Reviewed-by: Shahbaz Youssefi <syoussefi@chromium.org>
diff --git a/include/GLSLANG/ShaderLang.h b/include/GLSLANG/ShaderLang.h
index a4d90d43b6e300d6a1158e1b5a9b623e0123f7ce..c87dfa42d6f71fa5d8e92a7e9b31c566e87ee16a 100644
--- a/include/GLSLANG/ShaderLang.h
+++ b/include/GLSLANG/ShaderLang.h
@@ -26,7 +26,7 @@
// Version number for shader translation API.
// It is incremented every time the API changes.
-#define ANGLE_SH_VERSION 382
+#define ANGLE_SH_VERSION 383
enum ShShaderSpec
{
@@ -388,6 +388,10 @@ struct ShCompileOptions
uint64_t forceShaderPrecisionHighpToMediump : 1;
+ // Validate that the count of uniform blocks is within the GL_MAX_*_UNIFORM_BLOCKS limits. These
+ // limits must be supplied in the BuiltinResources.
+ uint64_t validatePerStageMaxUniformBlocks : 1;
+
// Ask compiler to generate Vulkan transform feedback emulation support code.
uint64_t addVulkanXfbEmulationSupportCode : 1;
@@ -587,6 +591,12 @@ struct ShBuiltInResources
int MinProgramTexelOffset;
int MaxProgramTexelOffset;
+ // GL_MAX_FRAGMENT_UNIFORM_BLOCKS
+ int MaxFragmentUniformBlocks;
+
+ // GL_MAX_VERTEX_UNIFORM_BLOCKS
+ int MaxVertexUniformBlocks;
+
// Extension constants.
// Value of GL_MAX_DUAL_SOURCE_DRAW_BUFFERS_EXT for OpenGL ES output context.
@@ -704,6 +714,9 @@ struct ShBuiltInResources
// maximum point size (higher limit from ALIASED_POINT_SIZE_RANGE)
float MaxPointSize;
+ // GL_MAX_COMPUTE_UNIFORM_BLOCKS
+ int MaxComputeUniformBlocks;
+
// EXT_geometry_shader constants
int MaxGeometryUniformComponents;
int MaxGeometryUniformBlocks;
@@ -727,6 +740,7 @@ struct ShBuiltInResources
int MaxTessControlImageUniforms;
int MaxTessControlAtomicCounters;
int MaxTessControlAtomicCounterBuffers;
+ int MaxTessControlUniformBlocks;
int MaxTessPatchComponents;
int MaxPatchVertices;
@@ -739,6 +753,7 @@ struct ShBuiltInResources
int MaxTessEvaluationImageUniforms;
int MaxTessEvaluationAtomicCounters;
int MaxTessEvaluationAtomicCounterBuffers;
+ int MaxTessEvaluationUniformBlocks;
// Subpixel bits used in rasterization.
int SubPixelBits;
diff --git a/include/platform/autogen/FeaturesGL_autogen.h b/include/platform/autogen/FeaturesGL_autogen.h
index c5fb020f74f23cb072732459f520823bfb97b82c..c492b8287a03b5cd3966caae4fc96ecf71317918 100644
--- a/include/platform/autogen/FeaturesGL_autogen.h
+++ b/include/platform/autogen/FeaturesGL_autogen.h
@@ -632,6 +632,12 @@ struct FeaturesGL : FeatureSetBase
&members,
};
+ FeatureInfo validateMaxPerStageUniformBlocksAtCompileTime = {
+ "validateMaxPerStageUniformBlocksAtCompileTime",
+ FeatureCategory::OpenGLWorkarounds,
+ &members,
+ };
+
};
inline FeaturesGL::FeaturesGL() = default;
diff --git a/include/platform/gl_features.json b/include/platform/gl_features.json
index f69426154880aacbdeb1be35749ee765b8790a6f..656133772e36d4a9be0d006e3298437216ae0044 100644
--- a/include/platform/gl_features.json
+++ b/include/platform/gl_features.json
@@ -820,6 +820,14 @@
"Some Adreno drivers assume incorrect glSampleCoverage if new FBO is bound with different sample count"
],
"issue": "https://crbug.com/408364831"
+ },
+ {
+ "name": "validate_max_per_stage_uniform_blocks_at_compile_time",
+ "category": "Workarounds",
+ "description": [
+ "Validate GL_MAX_*_UNIFORM_BLOCKS at compile time instead of link time to work around compiler bugs."
+ ],
+ "issue": "http://crbug.com/475877320"
}
]
}
diff --git a/src/compiler/translator/Compiler.cpp b/src/compiler/translator/Compiler.cpp
index bd5e6990fbde80f02faf42c2b914580737caa4b2..f966774d4e521527721b54fbe170595bf209600e 100644
--- a/src/compiler/translator/Compiler.cpp
+++ b/src/compiler/translator/Compiler.cpp
@@ -1633,6 +1633,8 @@ void TCompiler::setResourceString()
<< ":MaxFragmentInputVectors:" << mResources.MaxFragmentInputVectors
<< ":MinProgramTexelOffset:" << mResources.MinProgramTexelOffset
<< ":MaxProgramTexelOffset:" << mResources.MaxProgramTexelOffset
+ << ":MaxFragmentUniformBlocks:" << mResources.MaxFragmentUniformBlocks
+ << ":MaxVertexUniformBlocks:" << mResources.MaxVertexUniformBlocks
<< ":MaxDualSourceDrawBuffers:" << mResources.MaxDualSourceDrawBuffers
<< ":MaxViewsOVR:" << mResources.MaxViewsOVR
<< ":NV_draw_buffers:" << mResources.NV_draw_buffers
@@ -1682,6 +1684,7 @@ void TCompiler::setResourceString()
<< ":MaxFragmentAtomicCounterBuffers:" << mResources.MaxFragmentAtomicCounterBuffers
<< ":MaxCombinedAtomicCounterBuffers:" << mResources.MaxCombinedAtomicCounterBuffers
<< ":MaxAtomicCounterBufferSize:" << mResources.MaxAtomicCounterBufferSize
+ << ":MaxComputeUnformBlocks:" << mResources.MaxComputeUniformBlocks
<< ":MaxGeometryUniformComponents:" << mResources.MaxGeometryUniformComponents
<< ":MaxGeometryUniformBlocks:" << mResources.MaxGeometryUniformBlocks
<< ":MaxGeometryInputComponents:" << mResources.MaxGeometryInputComponents
@@ -1705,6 +1708,7 @@ void TCompiler::setResourceString()
<< ":MaxTessControlImageUniforms:" << mResources.MaxTessControlImageUniforms
<< ":MaxTessControlAtomicCounters:" << mResources.MaxTessControlAtomicCounters
<< ":MaxTessControlAtomicCounterBuffers:" << mResources.MaxTessControlAtomicCounterBuffers
+ << ":MaxTessControlUniformBlocks:" << mResources.MaxTessControlUniformBlocks
<< ":MaxTessPatchComponents:" << mResources.MaxTessPatchComponents
<< ":MaxPatchVertices:" << mResources.MaxPatchVertices
<< ":MaxTessGenLevel:" << mResources.MaxTessGenLevel
@@ -1714,7 +1718,9 @@ void TCompiler::setResourceString()
<< ":MaxTessEvaluationUniformComponents:" << mResources.MaxTessEvaluationUniformComponents
<< ":MaxTessEvaluationImageUniforms:" << mResources.MaxTessEvaluationImageUniforms
<< ":MaxTessEvaluationAtomicCounters:" << mResources.MaxTessEvaluationAtomicCounters
- << ":MaxTessEvaluationAtomicCounterBuffers:" << mResources.MaxTessEvaluationAtomicCounterBuffers;
+ << ":MaxTessEvaluationAtomicCounterBuffers:" << mResources.MaxTessEvaluationAtomicCounterBuffers
+ << ":MaxTessControlUniformBlocks:" << mResources.MaxTessControlUniformBlocks
+ ;
// clang-format on
mBuiltInResourcesString = strstream.str();
diff --git a/src/compiler/translator/ParseContext.cpp b/src/compiler/translator/ParseContext.cpp
index de788456a27399e0a25a57f737c2a7b5e73b848c..53a3f79a9302cbacd16e4080527ceb7f21feabdb 100644
--- a/src/compiler/translator/ParseContext.cpp
+++ b/src/compiler/translator/ParseContext.cpp
@@ -250,6 +250,37 @@ bool IsSamplerOrStructWithOnlySamplers(const TType *type)
{
return IsSampler(type->getBasicType()) || type->isStructureContainingOnlySamplers();
}
+
+unsigned int GetMaxUniformBlocksForShaderType(sh::GLenum shaderType,
+ const ShCompileOptions &options,
+ const ShBuiltInResources &resources)
+{
+ // If the validatePerStageMaxUniformBlocks workaround is disabled. Set a limit that will not be
+ // hit.
+ if (!options.validatePerStageMaxUniformBlocks)
+ {
+ return std::numeric_limits<unsigned int>::max();
+ }
+
+ switch (shaderType)
+ {
+ case GL_FRAGMENT_SHADER:
+ return resources.MaxFragmentUniformBlocks;
+ case GL_VERTEX_SHADER:
+ return resources.MaxVertexUniformBlocks;
+ case GL_COMPUTE_SHADER:
+ return resources.MaxComputeUniformBlocks;
+ case GL_GEOMETRY_SHADER:
+ return resources.MaxGeometryUniformBlocks;
+ case GL_TESS_CONTROL_SHADER:
+ return resources.MaxTessControlUniformBlocks;
+ case GL_TESS_EVALUATION_SHADER:
+ return resources.MaxTessEvaluationUniformBlocks;
+ default:
+ UNREACHABLE();
+ return 0;
+ }
+}
} // namespace
// This tracks each binding point's current default offset for inheritance of subsequent
@@ -341,6 +372,8 @@ TParseContext::TParseContext(TSymbolTable &symt,
mMaxAtomicCounterBufferSize(resources.MaxAtomicCounterBufferSize),
mMaxShaderStorageBufferBindings(resources.MaxShaderStorageBufferBindings),
mMaxPixelLocalStoragePlanes(resources.MaxPixelLocalStoragePlanes),
+ mMaxUniformBlocks(GetMaxUniformBlocksForShaderType(mShaderType, options, resources)),
+ mNumUniformBlocks(0),
mDeclaringFunction(false),
mDeclaringMain(false),
mIsMainDeclared(false),
@@ -5080,6 +5113,22 @@ TIntermDeclaration *TParseContext::addInterfaceBlock(
error(arraySizesLine, "geometry shader input blocks must be an array", "");
}
+ // Validate max uniform block limits
+ if (typeQualifier.qualifier == EvqUniform)
+ {
+ unsigned int blockCount =
+ arraySizes == nullptr || arraySizes->empty() ? 1 : (*arraySizes)[0];
+ if (mNumUniformBlocks + blockCount > mMaxUniformBlocks)
+ {
+ error(arraySizesLine,
+ "uniform block count greater than per stage maximum uniform blocks", "");
+ }
+ else
+ {
+ mNumUniformBlocks += blockCount;
+ }
+ }
+
checkIndexIsNotSpecified(typeQualifier.line, typeQualifier.layoutQualifier.index);
if (mShaderVersion < 310)
diff --git a/src/compiler/translator/ParseContext.h b/src/compiler/translator/ParseContext.h
index 5f6800df8c003aa8f930915222a0c04ff838cdad..fb13198d49df37be82cf4199cc05a0cb1337d00c 100644
--- a/src/compiler/translator/ParseContext.h
+++ b/src/compiler/translator/ParseContext.h
@@ -798,6 +798,12 @@ class TParseContext : angle::NonCopyable
int mMaxShaderStorageBufferBindings;
int mMaxPixelLocalStoragePlanes;
+ // Maximum number of uniform blocks allowed to be declared in this shader. Taken from the
+ // built-in resources and resolved to this shader type.
+ unsigned int mMaxUniformBlocks;
+ // Current count of declared uniform blocks.
+ unsigned int mNumUniformBlocks;
+
// keeps track whether we are declaring / defining a function
bool mDeclaringFunction;
diff --git a/src/compiler/translator/ShaderLang.cpp b/src/compiler/translator/ShaderLang.cpp
index fa67c01cc5bc4f0d3027aa06f4133fd3521b61ce..56bd95a7241f479bb63e2ec13838a6de184b24ab 100644
--- a/src/compiler/translator/ShaderLang.cpp
+++ b/src/compiler/translator/ShaderLang.cpp
@@ -254,6 +254,8 @@ void InitBuiltInResources(ShBuiltInResources *resources)
resources->MaxFragmentInputVectors = 15;
resources->MinProgramTexelOffset = -8;
resources->MaxProgramTexelOffset = 7;
+ resources->MaxFragmentUniformBlocks = 12;
+ resources->MaxVertexUniformBlocks = 12;
// Extensions constants.
resources->MaxDualSourceDrawBuffers = 0;
@@ -314,6 +316,8 @@ void InitBuiltInResources(ShBuiltInResources *resources)
resources->MaxUniformBufferBindings = 32;
resources->MaxShaderStorageBufferBindings = 4;
+ resources->MaxComputeUniformBlocks = 12;
+
resources->MaxGeometryUniformComponents = 1024;
resources->MaxGeometryUniformBlocks = 12;
resources->MaxGeometryInputComponents = 64;
@@ -335,6 +339,7 @@ void InitBuiltInResources(ShBuiltInResources *resources)
resources->MaxTessControlImageUniforms = 0;
resources->MaxTessControlAtomicCounters = 0;
resources->MaxTessControlAtomicCounterBuffers = 0;
+ resources->MaxTessControlUniformBlocks = 12;
resources->MaxTessPatchComponents = 120;
resources->MaxPatchVertices = 32;
@@ -347,6 +352,7 @@ void InitBuiltInResources(ShBuiltInResources *resources)
resources->MaxTessEvaluationImageUniforms = 0;
resources->MaxTessEvaluationAtomicCounters = 0;
resources->MaxTessEvaluationAtomicCounterBuffers = 0;
+ resources->MaxTessEvaluationUniformBlocks = 12;
resources->SubPixelBits = 8;
diff --git a/src/libANGLE/Compiler.cpp b/src/libANGLE/Compiler.cpp
index 00684c8ed08609a3a4d6ef6f36107756207ed72b..1893b6bddb33fde567162e2e9dbb12785dad538d 100644
--- a/src/libANGLE/Compiler.cpp
+++ b/src/libANGLE/Compiler.cpp
@@ -169,6 +169,8 @@ Compiler::Compiler(rx::GLImplFactory *implFactory, const State &state, egl::Disp
mResources.MaxFragmentInputVectors = caps.maxFragmentInputComponents / 4;
mResources.MinProgramTexelOffset = caps.minProgramTexelOffset;
mResources.MaxProgramTexelOffset = caps.maxProgramTexelOffset;
+ mResources.MaxFragmentUniformBlocks = caps.maxShaderUniformBlocks[gl::ShaderType::Fragment];
+ mResources.MaxVertexUniformBlocks = caps.maxShaderUniformBlocks[gl::ShaderType::Vertex];
// EXT_blend_func_extended
mResources.EXT_blend_func_extended = extensions.blendFuncExtendedEXT;
@@ -211,6 +213,7 @@ Compiler::Compiler(rx::GLImplFactory *implFactory, const State &state, egl::Disp
mResources.MaxCombinedImageUniforms = caps.maxCombinedImageUniforms;
mResources.MaxCombinedShaderOutputResources = caps.maxCombinedShaderOutputResources;
mResources.MaxUniformLocations = caps.maxUniformLocations;
+ mResources.MaxComputeUniformBlocks = caps.maxShaderUniformBlocks[gl::ShaderType::Compute];
for (size_t index = 0u; index < 3u; ++index)
{
@@ -280,6 +283,8 @@ Compiler::Compiler(rx::GLImplFactory *implFactory, const State &state, egl::Disp
mResources.MaxTessControlAtomicCounters = caps.maxShaderAtomicCounters[ShaderType::TessControl];
mResources.MaxTessControlAtomicCounterBuffers =
caps.maxShaderAtomicCounterBuffers[ShaderType::TessControl];
+ mResources.MaxTessControlUniformBlocks =
+ caps.maxShaderUniformBlocks[gl::ShaderType::TessControl];
mResources.MaxTessPatchComponents = caps.maxTessPatchComponents;
mResources.MaxPatchVertices = caps.maxPatchVertices;
@@ -297,6 +302,8 @@ Compiler::Compiler(rx::GLImplFactory *implFactory, const State &state, egl::Disp
caps.maxShaderAtomicCounters[ShaderType::TessEvaluation];
mResources.MaxTessEvaluationAtomicCounterBuffers =
caps.maxShaderAtomicCounterBuffers[ShaderType::TessEvaluation];
+ mResources.MaxTessEvaluationUniformBlocks =
+ caps.maxShaderUniformBlocks[gl::ShaderType::TessEvaluation];
// Subpixel bits.
mResources.SubPixelBits = static_cast<int>(caps.subPixelBits);
diff --git a/src/libANGLE/renderer/gl/ShaderGL.cpp b/src/libANGLE/renderer/gl/ShaderGL.cpp
index bef2feaee055bae36c24bd4edfc98c86ada25cc3..6b66d4e529933d90984ea972ed9375964e6a2ff5 100644
--- a/src/libANGLE/renderer/gl/ShaderGL.cpp
+++ b/src/libANGLE/renderer/gl/ShaderGL.cpp
@@ -272,6 +272,11 @@ std::shared_ptr<ShaderTranslateTask> ShaderGL::compile(const gl::Context *contex
options->pls = contextGL->getNativePixelLocalStorageOptions();
}
+ if (features.validateMaxPerStageUniformBlocksAtCompileTime.enabled)
+ {
+ options->validatePerStageMaxUniformBlocks = true;
+ }
+
return std::shared_ptr<ShaderTranslateTask>(
new ShaderTranslateTaskGL(functions, mShaderID, contextGL->hasNativeParallelCompile()));
}
diff --git a/src/libANGLE/renderer/gl/renderergl_utils.cpp b/src/libANGLE/renderer/gl/renderergl_utils.cpp
index b8b328b050ba1af54dbb02143d26456d827260dc..aaa928d6e595dbea220e95129be92461cd1eb280 100644
--- a/src/libANGLE/renderer/gl/renderergl_utils.cpp
+++ b/src/libANGLE/renderer/gl/renderergl_utils.cpp
@@ -2707,6 +2707,10 @@ void InitializeFeatures(const FunctionsGL *functions, angle::FeaturesGL *feature
// number of samples in currently bound FBO and require to reset sample
// coverage each time FBO changes.
ANGLE_FEATURE_CONDITION(features, resetSampleCoverageOnFBOChange, isQualcomm);
+
+ // IMG GL drivers crash while compiling shaders with more than the limit of uniform blocks.
+ ANGLE_FEATURE_CONDITION(features, validateMaxPerStageUniformBlocksAtCompileTime,
+ IsPowerVR(vendor));
}
void InitializeFrontendFeatures(const FunctionsGL *functions, angle::FrontendFeatures *features)
diff --git a/util/autogen/angle_features_autogen.cpp b/util/autogen/angle_features_autogen.cpp
index 7ff616edd4b5a61e544254e6bf1a0bc6c49293a4..f56164ad977ad19b7d58ccbf628f28c8fb24f0a2 100644
--- a/util/autogen/angle_features_autogen.cpp
+++ b/util/autogen/angle_features_autogen.cpp
@@ -473,6 +473,7 @@ constexpr PackedEnumMap<Feature, const char *> kFeatureNames = {{
{Feature::UseVkEventForBufferBarrier, "useVkEventForBufferBarrier"},
{Feature::UseVkEventForImageBarrier, "useVkEventForImageBarrier"},
{Feature::UseVmaForImageSuballocation, "useVmaForImageSuballocation"},
+ {Feature::ValidateMaxPerStageUniformBlocksAtCompileTime, "validateMaxPerStageUniformBlocksAtCompileTime"},
{Feature::VaryingsRequireMatchingPrecisionInSpirv, "varyingsRequireMatchingPrecisionInSpirv"},
{Feature::VerifyPipelineCacheInBlobCache, "verifyPipelineCacheInBlobCache"},
{Feature::VertexIDDoesNotIncludeBaseVertex, "vertexIDDoesNotIncludeBaseVertex"},
diff --git a/util/autogen/angle_features_autogen.h b/util/autogen/angle_features_autogen.h
index fd19b2b8f4eb7a0bf9973bf2baeaf68c106e11a8..55bf1b418c4f2f185e833c4c385094792f1bbf05 100644
--- a/util/autogen/angle_features_autogen.h
+++ b/util/autogen/angle_features_autogen.h
@@ -473,6 +473,7 @@ enum class Feature
UseVkEventForBufferBarrier,
UseVkEventForImageBarrier,
UseVmaForImageSuballocation,
+ ValidateMaxPerStageUniformBlocksAtCompileTime,
VaryingsRequireMatchingPrecisionInSpirv,
VerifyPipelineCacheInBlobCache,
VertexIDDoesNotIncludeBaseVertex,


@@ -151,3 +151,12 @@ graphite_handle_out_of_order_recording_errors.patch
ozone_wayland_treat_dnd_drop_performed_with_none_action_as_a.patch
cherry-pick-e045399a1ecb.patch
loaf_add_feature_to_enable_sourceurl_for_all_protocols.patch
cherry-pick-50b057660b4d.patch
cherry-pick-074d472db745.patch
validate_uniform_block_count_limits_at_compile_time_on_img.patch
feat_plumb_node_integration_in_worker_through_workersettings.patch
cherry-pick-45c5a70d984d.patch
cherry-pick-05e4b544803c.patch
cherry-pick-5efc7a0127a6.patch
cherry-pick-d8b01057f740.patch
cherry-pick-89b42d2d3326.patch


@@ -0,0 +1,204 @@
From 0000000000000000000000000000000000000000 Mon Sep 17 00:00:00 2001
From: Anders Hartvoll Ruud <andruud@chromium.org>
Date: Wed, 25 Feb 2026 03:25:14 -0800
Subject: Stringify CSSUnparsedValues via toString, as normal

CSSUnparsedValue exposes a special stringification function
ToUnparsedString() in addition to the regular toString().

The documentation says it returns "tokens without substituting
variables", but it's not clear what this means; we don't substitute
any variables in CSSStyleValue::toString() either.

This CL makes ToUnparsedString() private (and renames it).
Clients needing to serialize a CSSUnparsedValue can do so via
the normal toString() function. (If ToUnparsedString() existed
for performance reasons, that should have been documented.)

Also, the /**/-"fixup" pass over the value has been folded into
ToStringInternal(). This is to make it easy to find the canonical string
representation of this value within CSSUnparsedValue (without going
through a CSSValue).

The main point of this CL is to prepare for validating
the "argument grammar" of the value during the StyleValue-to-CSSValue
conversion in StylePropertyMap (which requires item (2) above).
We now jump through additional hoops to ultimately get a string
from the outside of CSSUnparsedValue, but there should otherwise
be no behavior change.

Bug: 484751092
Change-Id: I5db45ad85f780c67a2ea3ba8482c390ebab10068
Reviewed-on: https://chromium-review.googlesource.com/c/chromium/src/+/7600415
Commit-Queue: Anders Hartvoll Ruud <andruud@chromium.org>
Reviewed-by: Steinar H Gunderson <sesse@chromium.org>
Cr-Commit-Position: refs/heads/main@{#1590041}
diff --git a/third_party/blink/renderer/core/css/cssom/cross_thread_style_value_test.cc b/third_party/blink/renderer/core/css/cssom/cross_thread_style_value_test.cc
index dcc2eccbc84e6cd5710ab51cee2dab49661467c1..86d42c87a6bd10838a3e059c9227868e5bfc0798 100644
--- a/third_party/blink/renderer/core/css/cssom/cross_thread_style_value_test.cc
+++ b/third_party/blink/renderer/core/css/cssom/cross_thread_style_value_test.cc
@@ -19,12 +19,12 @@
#include "third_party/blink/renderer/core/css/cssom/css_keyword_value.h"
#include "third_party/blink/renderer/core/css/cssom/css_style_value.h"
#include "third_party/blink/renderer/core/css/cssom/css_unit_value.h"
-#include "third_party/blink/renderer/core/css/cssom/css_unparsed_value.h"
#include "third_party/blink/renderer/core/css/cssom/css_unsupported_color.h"
#include "third_party/blink/renderer/platform/scheduler/public/non_main_thread.h"
#include "third_party/blink/renderer/platform/scheduler/public/post_cross_thread_task.h"
#include "third_party/blink/renderer/platform/wtf/cross_thread_copier_std.h"
#include "third_party/blink/renderer/platform/wtf/cross_thread_functional.h"
+#include "third_party/blink/renderer/platform/wtf/wtf.h"
namespace blink {
@@ -152,8 +152,7 @@ TEST_F(CrossThreadStyleValueTest, CrossThreadUnparsedValueToCSSStyleValue) {
CSSStyleValue* style_value = value->ToCSSStyleValue();
EXPECT_EQ(style_value->GetType(),
CSSStyleValue::StyleValueType::kUnparsedType);
- EXPECT_EQ(static_cast<CSSUnparsedValue*>(style_value)->ToUnparsedString(),
- "Unparsed");
+ EXPECT_EQ(style_value->toString(), "Unparsed");
}
TEST_F(CrossThreadStyleValueTest, PassKeywordValueCrossThread) {
diff --git a/third_party/blink/renderer/core/css/cssom/css_unparsed_value.cc b/third_party/blink/renderer/core/css/cssom/css_unparsed_value.cc
index 67a4afde452f94ffb9ecbaeea104a3997c65b7b3..9c6bb62d044f804b0ce7bc8df398d77695cf950c 100644
--- a/third_party/blink/renderer/core/css/cssom/css_unparsed_value.cc
+++ b/third_party/blink/renderer/core/css/cssom/css_unparsed_value.cc
@@ -124,16 +124,26 @@ IndexedPropertySetterResult CSSUnparsedValue::AnonymousIndexedSetter(
}
const CSSValue* CSSUnparsedValue::ToCSSValue() const {
- String unparsed_string = ToUnparsedString();
- CSSParserTokenStream stream(unparsed_string);
+ String unparsed_string = ToStringInternal();
- if (stream.AtEnd()) {
+ if (unparsed_string.IsNull()) {
return MakeGarbageCollected<CSSUnparsedDeclarationValue>(
MakeGarbageCollected<CSSVariableData>());
}
- // The string we just parsed has /**/ inserted between every token
- // to make sure we get back the correct sequence of tokens.
+ // TODO(crbug.com/985028): We should probably propagate the CSSParserContext
+ // to here.
+ return MakeGarbageCollected<CSSUnparsedDeclarationValue>(
+ CSSVariableData::Create(unparsed_string, false /* is_animation_tainted */,
+ false /* is_attr_tainted */,
+ false /* needs_variable_resolution */));
+}
+
+String CSSUnparsedValue::ToStringInternal() const {
+ String serialized = SerializeSegments();
+
+ // The serialization above defensively inserted /**/ between segments
+ // to make sure that e.g. ['foo', 'bar'] does not collapse into 'foobar'.
// The spec mentions nothing of the sort:
// https://drafts.css-houdini.org/css-typed-om-1/#unparsedvalue-serialization
//
@@ -147,6 +157,10 @@ const CSSValue* CSSUnparsedValue::ToCSSValue() const {
// the original contents of any comments will be lost, but Typed OM does
// not have anywhere to store that kind of data, so it is expected.
StringBuilder builder;
+ CSSParserTokenStream stream(serialized);
+ if (stream.AtEnd()) {
+ return g_null_atom;
+ }
CSSParserToken token = stream.ConsumeRaw();
token.Serialize(builder);
while (!stream.Peek().IsEOF()) {
@@ -156,17 +170,10 @@ const CSSValue* CSSUnparsedValue::ToCSSValue() const {
token = stream.ConsumeRaw();
token.Serialize(builder);
}
- String original_text = builder.ReleaseString();
-
- // TODO(crbug.com/985028): We should probably propagate the CSSParserContext
- // to here.
- return MakeGarbageCollected<CSSUnparsedDeclarationValue>(
- CSSVariableData::Create(original_text, false /* is_animation_tainted */,
- false /* is_attr_tainted */,
- false /* needs_variable_resolution */));
+ return builder.ReleaseString();
}
-String CSSUnparsedValue::ToUnparsedString() const {
+String CSSUnparsedValue::SerializeSegments() const {
StringBuilder builder;
HeapHashSet<Member<const CSSUnparsedValue>> values_on_stack;
if (AppendUnparsedString(builder, values_on_stack)) {
diff --git a/third_party/blink/renderer/core/css/cssom/css_unparsed_value.h b/third_party/blink/renderer/core/css/cssom/css_unparsed_value.h
index 5d1961b170f14ae21ca8f69b3c3cd8af28f4478a..ec7e3ed708f406d7a61fdb370b2eed8a8297cffb 100644
--- a/third_party/blink/renderer/core/css/cssom/css_unparsed_value.h
+++ b/third_party/blink/renderer/core/css/cssom/css_unparsed_value.h
@@ -67,15 +67,9 @@ class CORE_EXPORT CSSUnparsedValue final : public CSSStyleValue {
CSSStyleValue::Trace(visitor);
}
- // Unlike CSSStyleValue::toString(), this returns tokens without
- // substituting variables. There are extra /**/ inserted between
- // every token to ensure there are no ambiguities, which is fine
- // because this value is never presented directly to the user
- // (ToCSSValue() will parse to a token range and then re-serialize
- // using extra /**/ only where needed).
- String ToUnparsedString() const;
-
private:
+ String ToStringInternal() const;
+ String SerializeSegments() const;
// Return 'false' if there is a cycle in the serialization.
bool AppendUnparsedString(
StringBuilder&,
diff --git a/third_party/blink/renderer/core/css/cssom/paint_worklet_style_property_map_test.cc b/third_party/blink/renderer/core/css/cssom/paint_worklet_style_property_map_test.cc
index f81fa39423a9235bc58e1600ca7a250affd3d9bb..2ee4dd7e591095b8460ca559b29b78e37ab71729 100644
--- a/third_party/blink/renderer/core/css/cssom/paint_worklet_style_property_map_test.cc
+++ b/third_party/blink/renderer/core/css/cssom/paint_worklet_style_property_map_test.cc
@@ -5,6 +5,7 @@
#include "third_party/blink/renderer/core/css/cssom/paint_worklet_style_property_map.h"
#include <memory>
+
#include "base/synchronization/waitable_event.h"
#include "base/task/single_thread_task_runner.h"
#include "testing/gtest/include/gtest/gtest.h"
@@ -13,7 +14,6 @@
#include "third_party/blink/renderer/core/css/cssom/css_keyword_value.h"
#include "third_party/blink/renderer/core/css/cssom/css_paint_worklet_input.h"
#include "third_party/blink/renderer/core/css/cssom/css_unit_value.h"
-#include "third_party/blink/renderer/core/css/cssom/css_unparsed_value.h"
#include "third_party/blink/renderer/core/css/cssom/css_unsupported_color.h"
#include "third_party/blink/renderer/core/css/properties/longhands/custom_property.h"
#include "third_party/blink/renderer/core/dom/element.h"
@@ -23,6 +23,7 @@
#include "third_party/blink/renderer/platform/scheduler/public/post_cross_thread_task.h"
#include "third_party/blink/renderer/platform/wtf/cross_thread_copier_base.h"
#include "third_party/blink/renderer/platform/wtf/cross_thread_functional.h"
+#include "third_party/blink/renderer/platform/wtf/wtf.h"
namespace blink {
@@ -66,8 +67,7 @@ class PaintWorkletStylePropertyMapTest : public PageTestBase {
CSSStyleValue* style_value = data.at("--x")->ToCSSStyleValue();
EXPECT_EQ(style_value->GetType(),
CSSStyleValue::StyleValueType::kUnparsedType);
- EXPECT_EQ(static_cast<CSSUnparsedValue*>(style_value)->ToUnparsedString(),
- "50");
+ EXPECT_EQ(style_value->toString(), "50");
waitable_event->Signal();
}
diff --git a/third_party/blink/renderer/core/css/properties/computed_style_utils.cc b/third_party/blink/renderer/core/css/properties/computed_style_utils.cc
index 1db0fd72478d708008f1b95a6aff206b28f60a6a..b52d8065c770aba822e9977c251c540643972629 100644
--- a/third_party/blink/renderer/core/css/properties/computed_style_utils.cc
+++ b/third_party/blink/renderer/core/css/properties/computed_style_utils.cc
@@ -4872,7 +4872,7 @@ ComputedStyleUtils::CrossThreadStyleValueFromCSSStyleValue(
To<CSSUnsupportedColor>(style_value)->Value());
case CSSStyleValue::StyleValueType::kUnparsedType:
return std::make_unique<CrossThreadUnparsedValue>(
- To<CSSUnparsedValue>(style_value)->ToUnparsedString());
+ To<CSSUnparsedValue>(style_value)->toString());
default:
return std::make_unique<CrossThreadUnsupportedValue>(
style_value->toString());
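The defensive /**/ insertion that this patch folds into ToStringInternal() exists so that adjacent segments cannot merge into a single token when concatenated (e.g. ['foo', 'bar'] must not serialize as 'foobar'). As a minimal standalone sketch of the join step (plain C++, hypothetical helper, not the Blink classes):

```cpp
#include <cassert>
#include <cstddef>
#include <string>
#include <vector>

// Join unparsed segments, inserting an empty comment between every pair of
// segments. The comment acts as a token boundary for a later CSS tokenizer,
// so "foo" followed by "bar" stays two tokens instead of collapsing into
// the single ident token "foobar".
std::string JoinSegments(const std::vector<std::string>& segments) {
  std::string out;
  for (std::size_t i = 0; i < segments.size(); ++i) {
    if (i != 0) {
      out += "/**/";
    }
    out += segments[i];
  }
  return out;
}
```

In the actual patch a second pass then re-tokenizes this string and re-serializes it, keeping a /**/ only where dropping it would change the token sequence.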


@@ -0,0 +1,296 @@
From 0000000000000000000000000000000000000000 Mon Sep 17 00:00:00 2001
From: Mikel Astiz <mastiz@chromium.org>
Date: Tue, 10 Mar 2026 13:22:17 -0700
Subject: [M146][base] Fix UAF in base::OnceCallbackList on re-entrant Notify()

Before this patch, `base::OnceCallbackList` was susceptible to a
heap-use-after-free when `Notify()` was called re-entrantly.

The UAF occurred because `OnceCallbackList::RunCallback()` immediately
spliced executed nodes out of `callbacks_` and into `null_callbacks_`.
If a nested `Notify()` executed a node that an outer `Notify()` loop was
already holding an iterator to, and that node's subscription was
subsequently destroyed during the re-entrant cycle, the node would be
physically erased from `null_callbacks_`. When control returned to the
outer loop, it would attempt to evaluate the now-dangling iterator.

This CL fixes the bug by deferring list mutations until the outermost
iteration completes:

1. `RunCallback()` no longer splices nodes during iteration.
2. Cancellation logic is pushed down to the subclasses via a new
   `CancelCallback()` hook, which is an extension to the pre-existing
   `CancelNullCallback()` with increased responsibilities and clearer
   semantics.
3. If a subscription is destroyed while `is_iterating` is true,
   `OnceCallbackList` resets the node and stashes its iterator in
   `pending_erasures_`.
4. A new `CleanUpNullCallbacksPostIteration()` phase runs at the end
   of the outermost `Notify()`, which safely splices executed nodes
   into `null_callbacks_` and physically erases the pending dead nodes.

As a side effect, the type-trait hack in `Notify()` based on
`is_instantiation<CallbackType, OnceCallback>` can be removed, because
this information is exposed directly by
`OnceCallbackList::CleanUpNullCallbacksPostIteration()`.

The newly-added unit-test
CallbackListTest.OnceCallbackListCancelDuringReentrantNotify reproduces
the scenario and crashed before this patch.

(cherry picked from commit 36acd49636845be2419269acbe9a5137da3d5d96)

Change-Id: I6b1e2bcb97be1bc8d6a15e5ca7511992e00e1772
Fixed: 489381399
Reviewed-on: https://chromium-review.googlesource.com/c/chromium/src/+/7627506
Commit-Queue: Mikel Astiz <mastiz@chromium.org>
Reviewed-by: Gabriel Charette <gab@chromium.org>
Cr-Original-Commit-Position: refs/heads/main@{#1594520}
Reviewed-on: https://chromium-review.googlesource.com/c/chromium/src/+/7653916
Bot-Commit: Rubber Stamper <rubber-stamper@appspot.gserviceaccount.com>
Cr-Commit-Position: refs/branch-heads/7680@{#2287}
Cr-Branched-From: 76b7d80e5cda23fe6537eed26d68c92e995c7f39-refs/heads/main@{#1582197}
diff --git a/base/callback_list.h b/base/callback_list.h
index 82cb11dc0ee02906b009cc383c41a056861199d0..d5f99cf685486f1ea74718b4e6b228a5d83f0c29 100644
--- a/base/callback_list.h
+++ b/base/callback_list.h
@@ -9,6 +9,7 @@
#include <list>
#include <memory>
#include <utility>
+#include <vector>
#include "base/auto_reset.h"
#include "base/base_export.h"
@@ -16,7 +17,6 @@
#include "base/functional/bind.h"
#include "base/functional/callback.h"
#include "base/memory/weak_ptr.h"
-#include "base/types/is_instantiation.h"
// OVERVIEW:
//
@@ -240,17 +240,14 @@ class CallbackListBase {
// Any null callbacks remaining in the list were canceled due to
// Subscription destruction during iteration, and can safely be erased now.
- const size_t erased_callbacks =
- std::erase_if(callbacks_, [](const auto& cb) { return cb.is_null(); });
-
- // Run |removal_callback_| if any callbacks were canceled. Note that we
- // cannot simply compare list sizes before and after iterating, since
- // notification may result in Add()ing new callbacks as well as canceling
- // them. Also note that if this is a OnceCallbackList, the OnceCallbacks
- // that were executed above have all been removed regardless of whether
- // they're counted in |erased_callbacks_|.
- if (removal_callback_ &&
- (erased_callbacks || is_instantiation<CallbackType, OnceCallback>)) {
+ const bool any_callbacks_erased = static_cast<CallbackListImpl*>(this)
+ ->CleanUpNullCallbacksPostIteration();
+
+ // Run |removal_callback_| if any callbacks were canceled or executed. Note
+ // that simply comparing list sizes before and after iterating cannot be
+ // done, since notification may result in Add()ing new callbacks as well as
+ // canceling them.
+ if (removal_callback_ && any_callbacks_erased) {
removal_callback_.Run(); // May delete |this|!
}
}
@@ -264,21 +261,9 @@ class CallbackListBase {
private:
// Cancels the callback pointed to by |it|, which is guaranteed to be valid.
void CancelCallback(const typename Callbacks::iterator& it) {
- if (static_cast<CallbackListImpl*>(this)->CancelNullCallback(it)) {
- return;
- }
-
- if (iterating_) {
- // Calling erase() here is unsafe, since the loop in Notify() may be
- // referencing this same iterator, e.g. if adjacent callbacks'
- // Subscriptions are both destroyed when the first one is Run(). Just
- // reset the callback and let Notify() clean it up at the end.
- it->Reset();
- } else {
- callbacks_.erase(it);
- if (removal_callback_) {
- removal_callback_.Run(); // May delete |this|!
- }
+ if (static_cast<CallbackListImpl*>(this)->CancelCallback(it, iterating_) &&
+ removal_callback_) {
+ removal_callback_.Run(); // May delete |this|!
}
}
@@ -304,23 +289,71 @@ class OnceCallbackList
// Runs the current callback, which may cancel it or any other callbacks.
template <typename... RunArgs>
void RunCallback(typename Traits::Callbacks::iterator it, RunArgs&&... args) {
- // OnceCallbacks still have Subscriptions with outstanding iterators;
- // splice() removes them from |callbacks_| without invalidating those.
- null_callbacks_.splice(null_callbacks_.end(), this->callbacks_, it);
+ // Do not splice here. Splicing during iteration breaks re-entrant Notify()
+ // by invalidating the outer loop's iterator. Splicing is deferred to
+ // CleanUpNullCallbacksPostIteration(), which is called when the outermost
+ // Notify() finishes.
// NOTE: Intentionally does not call std::forward<RunArgs>(args)...; see
// comments in Notify().
std::move(*it).Run(args...);
}
- // If |it| refers to an already-canceled callback, does any necessary cleanup
- // and returns true. Otherwise returns false.
- bool CancelNullCallback(const typename Traits::Callbacks::iterator& it) {
+ // Called during subscription destruction to cancel the callback. Returns true
+ // if the callback was removed from the active list and the generic removal
+ // callback should be executed. Returns false if the callback was already
+ // executed, or if the erasure is deferred due to active iteration.
+ bool CancelCallback(const typename Traits::Callbacks::iterator& it,
+ bool is_iterating) {
+ if (is_iterating) {
+ // During iteration, nodes cannot be safely erased from |callbacks_|
+ // without invalidating iterators. They also cannot be spliced into
+ // |null_callbacks_| right now. Thus, the node is reset and tracked for
+ // erasure in CleanUpNullCallbacksPostIteration().
+ it->Reset();
+ pending_erasures_.push_back(it);
+ return false;
+ }
+
if (it->is_null()) {
+ // The callback already ran, so it's safely sitting in |null_callbacks_|.
null_callbacks_.erase(it);
- return true;
+ return false;
}
- return false;
+
+ // The callback hasn't run yet, so it's still in |callbacks_|.
+ this->callbacks_.erase(it);
+ return true;
+ }
+
+ // Performs post-iteration cleanup. Successfully executed callbacks (which
+ // become null) are spliced into |null_callbacks_| to keep their
+ // Subscriptions' iterators valid. Callbacks explicitly canceled during
+ // iteration (tracked in |pending_erasures_|) are erased. Returns true if any
+ // callbacks were erased or spliced out.
+ bool CleanUpNullCallbacksPostIteration() {
+ bool any_spliced = false;
+ for (auto it = this->callbacks_.begin(); it != this->callbacks_.end();) {
+ if (it->is_null()) {
+ any_spliced = true;
+ auto next = std::next(it);
+ null_callbacks_.splice(null_callbacks_.end(), this->callbacks_, it);
+ it = next;
+ } else {
+ ++it;
+ }
+ }
+
+ bool any_erased = !pending_erasures_.empty();
+ for (auto pending_it : pending_erasures_) {
+ // Note: `pending_it` was originally an iterator into `callbacks_`, but
+ // the node it points to has just been spliced into `null_callbacks_`. The
+ // iterator itself remains valid and can now be used for erasure from
+ // `null_callbacks_`.
+ null_callbacks_.erase(pending_it);
+ }
+ pending_erasures_.clear();
+ return any_spliced || any_erased;
}
// Holds null callbacks whose Subscriptions are still alive, so the
@@ -328,6 +361,11 @@ class OnceCallbackList
// OnceCallbacks, since RepeatingCallbacks are not canceled except by
// Subscription destruction.
typename Traits::Callbacks null_callbacks_;
+
+ // Holds iterators for callbacks canceled during iteration.
+ // Erasure is deferred to CleanUpNullCallbacksPostIteration() when iteration
+ // completes to prevent invalidating iterators that an outer loop might hold.
+ std::vector<typename Traits::Callbacks::iterator> pending_erasures_;
};
template <typename Signature>
@@ -344,14 +382,29 @@ class RepeatingCallbackList
it->Run(args...);
}
- // If |it| refers to an already-canceled callback, does any necessary cleanup
- // and returns true. Otherwise returns false.
- bool CancelNullCallback(const typename Traits::Callbacks::iterator& it) {
- // Because at most one Subscription can point to a given callback, and
- // RepeatingCallbacks are only reset by CancelCallback(), no one should be
- // able to request cancellation of a canceled RepeatingCallback.
- DCHECK(!it->is_null());
- return false;
+ // Called during subscription destruction to cancel the callback. Returns true
+ // if the callback was removed from the active list and the generic removal
+ // callback should be executed. Returns false if the callback was already
+ // executed, or if the erasure is deferred due to active iteration.
+ bool CancelCallback(const typename Traits::Callbacks::iterator& it,
+ bool is_iterating) {
+ if (is_iterating) {
+ // During iteration, nodes cannot be safely erased from |callbacks_|
+ // without invalidating iterators. The node is reset and will be swept up
+ // by CleanUpNullCallbacksPostIteration().
+ it->Reset();
+ return false;
+ }
+
+ this->callbacks_.erase(it);
+ return true;
+ }
+
+ // Performs post-iteration cleanup by erasing all canceled callbacks. Returns
+ // true if any callbacks were erased.
+ bool CleanUpNullCallbacksPostIteration() {
+ return std::erase_if(this->callbacks_,
+ [](const auto& cb) { return cb.is_null(); }) > 0;
}
};
diff --git a/base/callback_list_unittest.cc b/base/callback_list_unittest.cc
index 7474278525e5efecc0de903809a54d366896d524..a855443fbae862befbc3a2a484ea335632136e94 100644
--- a/base/callback_list_unittest.cc
+++ b/base/callback_list_unittest.cc
@@ -10,6 +10,7 @@
#include "base/functional/bind.h"
#include "base/functional/callback_helpers.h"
#include "base/memory/raw_ptr.h"
+#include "base/test/bind.h"
#include "base/test/test_future.h"
#include "testing/gtest/include/gtest/gtest.h"
@@ -577,6 +578,30 @@ TEST(CallbackListTest, ReentrantNotify) {
EXPECT_EQ(1, d.total());
}
+// Regression test for crbug.com/489381399: Verifies Notify() can be called
+// reentrantly for OnceCallbackList even if a callback is canceled during the
+// reentrant notification.
+TEST(CallbackListTest, OnceCallbackListCancelDuringReentrantNotify) {
+ OnceClosureList cb_reg;
+ CallbackListSubscription sub_a, sub_b;
+
+ auto cb_a = base::BindLambdaForTesting([&]() {
+ // Re-entrant notification.
+ cb_reg.Notify();
+ // After re-entrant notification returns, sub_b has been run. Destroying it
+ // now should be a no-op.
+ sub_b = {};
+ });
+
+ auto cb_b = base::DoNothing();
+
+ sub_a = cb_reg.Add(std::move(cb_a));
+ sub_b = cb_reg.Add(std::move(cb_b));
+
+ // This should not crash.
+ cb_reg.Notify();
+}
+
TEST(CallbackListTest, ClearPreventsInvocation) {
Listener listener;
RepeatingClosureList cb_reg;
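The deferred-erasure scheme this patch introduces can be sketched outside of Chromium with a plain std::list of callbacks. The following is a simplified illustration of the idea only (a hypothetical OnceList class, not base::OnceCallbackList; it omits the null_callbacks_ holding list and the removal callback): while any Notify() is on the stack, cancellation merely nulls the node and records its iterator, and physical erasure waits for the outermost Notify() to return.

```cpp
#include <cassert>
#include <functional>
#include <list>
#include <vector>

// Sketch of deferred erasure for a run-once callback list that tolerates
// re-entrant Notify(). std::list iterators stay valid across insertions and
// erasures of *other* nodes, so the only hazard is erasing a node some outer
// Notify() loop still points at -- which is exactly what we defer.
class OnceList {
 public:
  using Iter = std::list<std::function<void()>>::iterator;

  Iter Add(std::function<void()> cb) {
    return callbacks_.insert(callbacks_.end(), std::move(cb));
  }

  void Cancel(Iter it) {
    if (iterating_) {
      // Unsafe to erase now: an outer Notify() loop may hold this iterator
      // or a neighbor. Null the node and defer the physical erase.
      *it = nullptr;
      pending_erasures_.push_back(it);
    } else {
      callbacks_.erase(it);  // Safe: no iteration in progress.
    }
  }

  void Notify() {
    const bool outermost = !iterating_;
    iterating_ = true;
    for (auto it = callbacks_.begin(); it != callbacks_.end(); ++it) {
      if (*it) {
        auto cb = std::move(*it);
        *it = nullptr;  // Run-once semantics: node is spent, not erased.
        cb();           // May re-enter Notify() or Cancel().
      }
    }
    if (outermost) {
      iterating_ = false;
      // Now nobody holds iterators; physically erase deferred cancels,
      // then sweep the nodes spent by execution.
      for (auto it : pending_erasures_) callbacks_.erase(it);
      pending_erasures_.clear();
      callbacks_.remove_if([](const auto& cb) { return !cb; });
    }
  }

 private:
  std::list<std::function<void()>> callbacks_;
  std::vector<Iter> pending_erasures_;
  bool iterating_ = false;
};
```

With the pre-patch behavior (erase immediately during Cancel), the regression scenario above — callback A re-entrantly calls Notify(), which runs B, then A cancels B's subscription — erases the node the outer loop's iterator neighbors, producing the dangling-iterator read; deferring the erase to the outermost Notify() avoids it.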


@@ -0,0 +1,199 @@
From 0000000000000000000000000000000000000000 Mon Sep 17 00:00:00 2001
From: Anders Hartvoll Ruud <andruud@chromium.org>
Date: Wed, 25 Feb 2026 03:24:19 -0800
Subject: Describe a vector of segments as "segments", not "tokens"

The specification uses the term "tokens" to refer to a sequence
of V8CSSUnparsedSegment objects, and CSSUnparsedValue has adopted
this terminology. While it is usually a good idea for Blink
to mirror the language used in specifications, "tokens" is very
confusing here, since it always means CSSParserTokens in every other
place in the style code.

Bug: 487117772
Change-Id: I2dc132c4e618e398e1f8bdabc03a8d2ab6c118e7
Reviewed-on: https://chromium-review.googlesource.com/c/chromium/src/+/7606599
Commit-Queue: Anders Hartvoll Ruud <andruud@chromium.org>
Reviewed-by: Steinar H Gunderson <sesse@chromium.org>
Cr-Commit-Position: refs/heads/main@{#1590040}
diff --git a/third_party/blink/renderer/core/css/cssom/css_unparsed_value.cc b/third_party/blink/renderer/core/css/cssom/css_unparsed_value.cc
index 3d43306dd902b3071637e5d5d0af26e5ee47f141..67a4afde452f94ffb9ecbaeea104a3997c65b7b3 100644
--- a/third_party/blink/renderer/core/css/cssom/css_unparsed_value.cc
+++ b/third_party/blink/renderer/core/css/cssom/css_unparsed_value.cc
@@ -24,12 +24,12 @@ String FindVariableName(CSSParserTokenStream& stream) {
V8CSSUnparsedSegment* VariableReferenceValue(
const StringView& variable_name,
- const HeapVector<Member<V8CSSUnparsedSegment>>& tokens) {
+ const HeapVector<Member<V8CSSUnparsedSegment>>& segments) {
CSSUnparsedValue* unparsed_value;
- if (tokens.size() == 0) {
+ if (segments.size() == 0) {
unparsed_value = nullptr;
} else {
- unparsed_value = CSSUnparsedValue::Create(tokens);
+ unparsed_value = CSSUnparsedValue::Create(segments);
}
CSSStyleVariableReferenceValue* variable_reference =
@@ -41,13 +41,13 @@ V8CSSUnparsedSegment* VariableReferenceValue(
HeapVector<Member<V8CSSUnparsedSegment>> ParserTokenStreamToTokens(
CSSParserTokenStream& stream) {
int nesting_level = 0;
- HeapVector<Member<V8CSSUnparsedSegment>> tokens;
+ HeapVector<Member<V8CSSUnparsedSegment>> segments;
StringBuilder builder;
while (stream.Peek().GetType() != kEOFToken) {
if (stream.Peek().FunctionId() == CSSValueID::kVar ||
stream.Peek().FunctionId() == CSSValueID::kEnv) {
if (!builder.empty()) {
- tokens.push_back(MakeGarbageCollected<V8CSSUnparsedSegment>(
+ segments.push_back(MakeGarbageCollected<V8CSSUnparsedSegment>(
builder.ReleaseString()));
}
@@ -57,7 +57,7 @@ HeapVector<Member<V8CSSUnparsedSegment>> ParserTokenStreamToTokens(
if (stream.Peek().GetType() == CSSParserTokenType::kCommaToken) {
stream.Consume();
}
- tokens.push_back(VariableReferenceValue(
+ segments.push_back(VariableReferenceValue(
variable_name, ParserTokenStreamToTokens(stream)));
} else {
if (stream.Peek().GetBlockType() == CSSParserToken::kBlockStart) {
@@ -73,10 +73,10 @@ HeapVector<Member<V8CSSUnparsedSegment>> ParserTokenStreamToTokens(
}
}
if (!builder.empty()) {
- tokens.push_back(
+ segments.push_back(
MakeGarbageCollected<V8CSSUnparsedSegment>(builder.ReleaseString()));
}
- return tokens;
+ return segments;
}
} // namespace
@@ -96,8 +96,8 @@ CSSUnparsedValue* CSSUnparsedValue::FromCSSVariableData(
V8CSSUnparsedSegment* CSSUnparsedValue::AnonymousIndexedGetter(
uint32_t index,
ExceptionState& exception_state) const {
- if (index < tokens_.size()) {
- return tokens_[index].Get();
+ if (index < segments_.size()) {
+ return segments_[index].Get();
}
return nullptr;
}
@@ -106,20 +106,20 @@ IndexedPropertySetterResult CSSUnparsedValue::AnonymousIndexedSetter(
uint32_t index,
V8CSSUnparsedSegment* segment,
ExceptionState& exception_state) {
- if (index < tokens_.size()) {
- tokens_[index] = segment;
+ if (index < segments_.size()) {
+ segments_[index] = segment;
return IndexedPropertySetterResult::kIntercepted;
}
- if (index == tokens_.size()) {
- tokens_.push_back(segment);
+ if (index == segments_.size()) {
+ segments_.push_back(segment);
return IndexedPropertySetterResult::kIntercepted;
}
exception_state.ThrowRangeError(
ExceptionMessages::IndexOutsideRange<unsigned>(
- "index", index, 0, ExceptionMessages::kInclusiveBound, tokens_.size(),
- ExceptionMessages::kInclusiveBound));
+ "index", index, 0, ExceptionMessages::kInclusiveBound,
+ segments_.size(), ExceptionMessages::kInclusiveBound));
return IndexedPropertySetterResult::kIntercepted;
}
@@ -182,14 +182,14 @@ bool CSSUnparsedValue::AppendUnparsedString(
return false; // Cycle.
}
values_on_stack.insert(this);
- for (unsigned i = 0; i < tokens_.size(); i++) {
+ for (unsigned i = 0; i < segments_.size(); i++) {
if (i) {
builder.Append("/**/");
}
- switch (tokens_[i]->GetContentType()) {
+ switch (segments_[i]->GetContentType()) {
case V8CSSUnparsedSegment::ContentType::kCSSVariableReferenceValue: {
const auto* reference_value =
- tokens_[i]->GetAsCSSVariableReferenceValue();
+ segments_[i]->GetAsCSSVariableReferenceValue();
builder.Append("var(");
builder.Append(reference_value->variable());
if (reference_value->fallback()) {
@@ -203,7 +203,7 @@ bool CSSUnparsedValue::AppendUnparsedString(
break;
}
case V8CSSUnparsedSegment::ContentType::kString:
- builder.Append(tokens_[i]->GetAsString());
+ builder.Append(segments_[i]->GetAsString());
break;
}
}
diff --git a/third_party/blink/renderer/core/css/cssom/css_unparsed_value.h b/third_party/blink/renderer/core/css/cssom/css_unparsed_value.h
index c9dab7a0b3ffeaeb6b5d2ab50d876d40c38a760e..5d1961b170f14ae21ca8f69b3c3cd8af28f4478a 100644
--- a/third_party/blink/renderer/core/css/cssom/css_unparsed_value.h
+++ b/third_party/blink/renderer/core/css/cssom/css_unparsed_value.h
@@ -26,8 +26,8 @@ class CORE_EXPORT CSSUnparsedValue final : public CSSStyleValue {
public:
static CSSUnparsedValue* Create(
- const HeapVector<Member<V8CSSUnparsedSegment>>& tokens) {
- return MakeGarbageCollected<CSSUnparsedValue>(tokens);
+ const HeapVector<Member<V8CSSUnparsedSegment>>& segments) {
+ return MakeGarbageCollected<CSSUnparsedValue>(segments);
}
// Blink-internal constructor
@@ -37,14 +37,14 @@ class CORE_EXPORT CSSUnparsedValue final : public CSSStyleValue {
static CSSUnparsedValue* FromCSSValue(const CSSUnparsedDeclarationValue&);
static CSSUnparsedValue* FromCSSVariableData(const CSSVariableData&);
static CSSUnparsedValue* FromString(const String& string) {
- HeapVector<Member<V8CSSUnparsedSegment>> tokens;
- tokens.push_back(MakeGarbageCollected<V8CSSUnparsedSegment>(string));
- return Create(tokens);
+ HeapVector<Member<V8CSSUnparsedSegment>> segments;
+ segments.push_back(MakeGarbageCollected<V8CSSUnparsedSegment>(string));
+ return Create(segments);
}
explicit CSSUnparsedValue(
- const HeapVector<Member<V8CSSUnparsedSegment>>& tokens)
- : tokens_(tokens) {}
+ const HeapVector<Member<V8CSSUnparsedSegment>>& segments)
+ : segments_(segments) {}
CSSUnparsedValue(const CSSUnparsedValue&) = delete;
CSSUnparsedValue& operator=(const CSSUnparsedValue&) = delete;
@@ -60,10 +60,10 @@ class CORE_EXPORT CSSUnparsedValue final : public CSSStyleValue {
V8CSSUnparsedSegment* segment,
ExceptionState& exception_state);
- wtf_size_t length() const { return tokens_.size(); }
+ wtf_size_t length() const { return segments_.size(); }
void Trace(Visitor* visitor) const override {
- visitor->Trace(tokens_);
+ visitor->Trace(segments_);
CSSStyleValue::Trace(visitor);
}
@@ -81,7 +81,7 @@ class CORE_EXPORT CSSUnparsedValue final : public CSSStyleValue {
StringBuilder&,
HeapHashSet<Member<const CSSUnparsedValue>>& values_on_stack) const;
- HeapVector<Member<V8CSSUnparsedSegment>> tokens_;
+ HeapVector<Member<V8CSSUnparsedSegment>> segments_;
FRIEND_TEST_ALL_PREFIXES(CSSUnparsedDeclarationValueTest, MixedList);
};


@@ -0,0 +1,149 @@
From 0000000000000000000000000000000000000000 Mon Sep 17 00:00:00 2001
From: Kai Ninomiya <kainino@chromium.org>
Date: Wed, 11 Mar 2026 14:52:44 -0700
Subject: [M146] Increment WebGL context generation number on context restore

Objects created while the context is lost should not be valid to use
after the context is restored.

- Replace number_of_context_losses_ with a "context generation number"
  which increments on both context loss and context restore.
  - Technically, it would make sense to increment it only on context
    restore, but just in case any logic is relying on the current
    behavior, increment it in both places.
  - It's uint64_t just in case someone figures out how to increment it 4
    billion times.
- Remove unused WebGLRenderingContextBase::number_of_context_losses_,
  left over from before it was moved into WebGLContextObjectSupport.

(cherry picked from commit c1433740f3ea902fd6b15d63c4865ad60a3761f9)

Bug: 485935305
Change-Id: I1007217c8e69cfb8de4f117e0b7845ca574579c4
Reviewed-on: https://chromium-review.googlesource.com/c/chromium/src/+/7630664
Reviewed-by: Kenneth Russell <kbr@chromium.org>
Commit-Queue: Kai Ninomiya <kainino@chromium.org>
Cr-Original-Commit-Position: refs/heads/main@{#1593726}
Reviewed-on: https://chromium-review.googlesource.com/c/chromium/src/+/7658823
Auto-Submit: Kai Ninomiya <kainino@chromium.org>
Bot-Commit: Rubber Stamper <rubber-stamper@appspot.gserviceaccount.com>
Commit-Queue: Rubber Stamper <rubber-stamper@appspot.gserviceaccount.com>
Cr-Commit-Position: refs/branch-heads/7680@{#2370}
Cr-Branched-From: 76b7d80e5cda23fe6537eed26d68c92e995c7f39-refs/heads/main@{#1582197}
diff --git a/third_party/blink/renderer/modules/webgl/webgl_context_object_support.cc b/third_party/blink/renderer/modules/webgl/webgl_context_object_support.cc
index 6a3b1416354e7993e7a9ebd25c4ca08593105d9a..83941f8163a5e9425f2df8fd3bb98e1fd37537ad 100644
--- a/third_party/blink/renderer/modules/webgl/webgl_context_object_support.cc
+++ b/third_party/blink/renderer/modules/webgl/webgl_context_object_support.cc
@@ -22,7 +22,10 @@ WebGLContextObjectSupport::WebGLContextObjectSupport(
void WebGLContextObjectSupport::OnContextLost() {
DCHECK(!is_lost_);
- number_of_context_losses_++;
+ // Invalidate all past objects.
+ // (It may not be strictly necessary to do this here, since it's also done in
+ // OnContextRestored, but we did it historically, and there's no harm in it.)
+ context_generation_++;
is_lost_ = true;
gles2_interface_ = nullptr;
extensions_enabled_.reset();
@@ -31,6 +34,8 @@ void WebGLContextObjectSupport::OnContextLost() {
void WebGLContextObjectSupport::OnContextRestored(
gpu::gles2::GLES2Interface* gl) {
DCHECK(is_lost_);
+ // Invalidate all past objects.
+ context_generation_++;
is_lost_ = false;
gles2_interface_ = gl;
}
diff --git a/third_party/blink/renderer/modules/webgl/webgl_context_object_support.h b/third_party/blink/renderer/modules/webgl/webgl_context_object_support.h
index 907866bb21acf9647d1c0ecd791e642e96b734fc..ba8b79f8bb9db12058614982a625baaff5546af7 100644
--- a/third_party/blink/renderer/modules/webgl/webgl_context_object_support.h
+++ b/third_party/blink/renderer/modules/webgl/webgl_context_object_support.h
@@ -33,10 +33,10 @@ class MODULES_EXPORT WebGLContextObjectSupport : public ScriptWrappable {
bool IsWebGL2() const { return is_webgl2_; }
bool IsLost() const { return is_lost_; }
- // How many context losses there were, to check whether a WebGLObject was
- // created since the last context resoration or before that (and hence invalid
- // to use).
- uint32_t NumberOfContextLosses() const { return number_of_context_losses_; }
+ // Which "generation" the context is on (essentially, how many times it has
+ // been restored), to check whether a WebGLObject was created since the last
+ // context restoration, or before that (and hence invalid to use).
+ uint64_t GetContextGeneration() const { return context_generation_; }
bool ExtensionEnabled(WebGLExtensionName name) const {
return extensions_enabled_.test(name);
@@ -65,7 +65,7 @@ class MODULES_EXPORT WebGLContextObjectSupport : public ScriptWrappable {
std::bitset<kWebGLExtensionNameCount> extensions_enabled_ = {};
raw_ptr<gpu::gles2::GLES2Interface> gles2_interface_ = nullptr;
- uint32_t number_of_context_losses_ = 0;
+ uint64_t context_generation_ = 0;
bool is_lost_ = true;
bool is_webgl2_;
};
diff --git a/third_party/blink/renderer/modules/webgl/webgl_object.cc b/third_party/blink/renderer/modules/webgl/webgl_object.cc
index 9d984de0073796f23a5038bfc0a51ec676179765..07e0a9a4aa3406a1298a677a3159edadc5f2cbb5 100644
--- a/third_party/blink/renderer/modules/webgl/webgl_object.cc
+++ b/third_party/blink/renderer/modules/webgl/webgl_object.cc
@@ -33,9 +33,9 @@ namespace blink {
WebGLObject::WebGLObject(WebGLContextObjectSupport* context)
: context_(context),
- cached_number_of_context_losses_(std::numeric_limits<uint32_t>::max()) {
+ context_generation_at_creation_(std::numeric_limits<uint64_t>::max()) {
if (context_) {
- cached_number_of_context_losses_ = context->NumberOfContextLosses();
+ context_generation_at_creation_ = context->GetContextGeneration();
}
}
@@ -46,7 +46,7 @@ bool WebGLObject::Validate(const WebGLContextObjectSupport* context) const {
// the objects they ever created, so there's no way to invalidate them
// eagerly during context loss. The invalidation is discovered lazily.
return (context == context_ && context_ != nullptr &&
- cached_number_of_context_losses_ == context->NumberOfContextLosses());
+ context_generation_at_creation_ == context->GetContextGeneration());
}
void WebGLObject::SetObject(GLuint object) {
@@ -71,7 +71,7 @@ void WebGLObject::DeleteObject(gpu::gles2::GLES2Interface* gl) {
return;
}
- if (context_->NumberOfContextLosses() != cached_number_of_context_losses_) {
+ if (context_->GetContextGeneration() != context_generation_at_creation_) {
// This object has been invalidated.
return;
}
diff --git a/third_party/blink/renderer/modules/webgl/webgl_object.h b/third_party/blink/renderer/modules/webgl/webgl_object.h
index bb56df0f99e8e8432e03442feb9302b8dde27d01..97caa90e34288911b1a827e60c2569544d2b8f69 100644
--- a/third_party/blink/renderer/modules/webgl/webgl_object.h
+++ b/third_party/blink/renderer/modules/webgl/webgl_object.h
@@ -123,9 +123,9 @@ class WebGLObject : public ScriptWrappable {
GLuint object_ = 0;
- // This was the number of context losses of the object's associated
- // WebGLContext at the time this object was created.
- uint32_t cached_number_of_context_losses_;
+ // The context generation number of the associated WebGLContext when the
+ // object was created, to prevent reuse in later generations.
+ uint64_t context_generation_at_creation_;
unsigned attachment_count_ = 0;
diff --git a/third_party/blink/renderer/modules/webgl/webgl_rendering_context_base.h b/third_party/blink/renderer/modules/webgl/webgl_rendering_context_base.h
index 48b5dd8f3baffed5ae898e029f7dd187ddf20ce3..33554fe5b9cdbbd703b6e02d6131104a4e591f4e 100644
--- a/third_party/blink/renderer/modules/webgl/webgl_rendering_context_base.h
+++ b/third_party/blink/renderer/modules/webgl/webgl_rendering_context_base.h
@@ -2064,8 +2064,6 @@ class MODULES_EXPORT WebGLRenderingContextBase
bool has_been_drawn_to_ = false;
- uint32_t number_of_context_losses_ = 0;
-
// Tracks if the context has ever called glBeginPixelLocalStorageANGLE. If it
// has, we need to start using the pixel local storage interrupt mechanism
// when we take over the client's context.


@@ -0,0 +1,201 @@
From 0000000000000000000000000000000000000000 Mon Sep 17 00:00:00 2001
From: Anders Hartvoll Ruud <andruud@chromium.org>
Date: Wed, 25 Feb 2026 12:17:00 -0800
Subject: Validate CSSUnparsedValues upon assignment
CSS Typed OM has a concept of a value "matching a grammar" (or not)
upon assignment to a property [1]. For CSSUnparsedValues, we currently
don't perform any significant validation, and as a consequence
we allow "invalid" CSSUnparsedDeclarationValues to be created
(causing DCHECKs later in the pipeline).
This CL makes sure values can be parsed using CSSVariableParser::
ConsumeUnparsedDeclaration before assignment.
We're still not handling the value in the context of the destination
property, which we probably should. This is also a problem with the
current state of things, however, so for now the goal is primarily
to avoid the DCHECKs in Issue 484751092.
Finally, I opened an issue against the specification [2], which
currently doesn't define any of this.
[1] https://drafts.css-houdini.org/css-typed-om-1/#create-an-internal-representation
[2] https://github.com/w3c/csswg-drafts/issues/13547
Fixed: 484751092
Change-Id: Id7f888a6df8c02ade24910900f5d01909cb2dfad
Reviewed-on: https://chromium-review.googlesource.com/c/chromium/src/+/7595347
Reviewed-by: Steinar H Gunderson <sesse@chromium.org>
Commit-Queue: Anders Hartvoll Ruud <andruud@chromium.org>
Cr-Commit-Position: refs/heads/main@{#1590110}
diff --git a/third_party/blink/renderer/build/scripts/core/css/templates/cssom_types.cc.tmpl b/third_party/blink/renderer/build/scripts/core/css/templates/cssom_types.cc.tmpl
index edfa73a57d30ebd4f9a7147702df42b836f7d82b..4442ba0872ca4c739596b546e6d3b600c5a31598 100644
--- a/third_party/blink/renderer/build/scripts/core/css/templates/cssom_types.cc.tmpl
+++ b/third_party/blink/renderer/build/scripts/core/css/templates/cssom_types.cc.tmpl
@@ -11,6 +11,7 @@
#include "third_party/blink/renderer/core/css/cssom/css_keyword_value.h"
#include "third_party/blink/renderer/core/css/cssom/css_numeric_value.h"
#include "third_party/blink/renderer/core/css/cssom/css_style_value.h"
+#include "third_party/blink/renderer/core/css/cssom/css_unparsed_value.h"
#include "third_party/blink/renderer/core/css/cssom/css_unsupported_style_value.h"
#include "third_party/blink/renderer/core/css/cssom/cssom_keywords.h"
#include "third_party/blink/renderer/core/css/properties/css_property.h"
@@ -105,8 +106,8 @@ bool CSSOMTypes::PropertyCanTake(CSSPropertyID id,
: CSSPropertyName(id);
return unsupported_style_value->IsValidFor(name);
}
- if (value.GetType() == CSSStyleValue::kUnparsedType) {
- return true;
+ if (auto* unparsed_value = DynamicTo<CSSUnparsedValue>(value)) {
+ return unparsed_value->IsValidDeclarationValue();
}
switch (id) {
diff --git a/third_party/blink/renderer/core/css/cssom/css_unparsed_value.cc b/third_party/blink/renderer/core/css/cssom/css_unparsed_value.cc
index 9c6bb62d044f804b0ce7bc8df398d77695cf950c..17a86e1292d76ff0b0c74b701750c0e972b4d885 100644
--- a/third_party/blink/renderer/core/css/cssom/css_unparsed_value.cc
+++ b/third_party/blink/renderer/core/css/cssom/css_unparsed_value.cc
@@ -4,11 +4,13 @@
#include "third_party/blink/renderer/core/css/cssom/css_unparsed_value.h"
+#include "css_style_value.h"
#include "third_party/blink/renderer/core/css/css_unparsed_declaration_value.h"
#include "third_party/blink/renderer/core/css/css_variable_data.h"
#include "third_party/blink/renderer/core/css/cssom/css_style_variable_reference_value.h"
#include "third_party/blink/renderer/core/css/parser/css_parser_token_stream.h"
#include "third_party/blink/renderer/core/css/parser/css_tokenizer.h"
+#include "third_party/blink/renderer/core/css/parser/css_variable_parser.h"
#include "third_party/blink/renderer/core/css_value_keywords.h"
#include "third_party/blink/renderer/platform/bindings/exception_messages.h"
#include "third_party/blink/renderer/platform/bindings/exception_state.h"
@@ -123,6 +125,10 @@ IndexedPropertySetterResult CSSUnparsedValue::AnonymousIndexedSetter(
return IndexedPropertySetterResult::kIntercepted;
}
+bool CSSUnparsedValue::IsValidDeclarationValue() const {
+ return IsValidDeclarationValue(ToStringInternal());
+}
+
const CSSValue* CSSUnparsedValue::ToCSSValue() const {
String unparsed_string = ToStringInternal();
@@ -131,12 +137,40 @@ const CSSValue* CSSUnparsedValue::ToCSSValue() const {
MakeGarbageCollected<CSSVariableData>());
}
+ CHECK(IsValidDeclarationValue(unparsed_string));
+ // The call to IsValidDeclarationValue() above also creates a CSSVariableData
+ // to carry out its check. It would be nice to use that here, but WPTs
+ // expect leading whitespace to be preserved, even though it's not possible
+ // to create such declaration values normally.
+ CSSVariableData* variable_data =
+ CSSVariableData::Create(unparsed_string,
+ /*is_animation_tainted=*/false,
+ /*is_attr_tainted=*/false,
+ /*needs_variable_resolution=*/false);
+
// TODO(crbug.com/985028): We should probably propagate the CSSParserContext
// to here.
- return MakeGarbageCollected<CSSUnparsedDeclarationValue>(
- CSSVariableData::Create(unparsed_string, false /* is_animation_tainted */,
- false /* is_attr_tainted */,
- false /* needs_variable_resolution */));
+ return MakeGarbageCollected<CSSUnparsedDeclarationValue>(variable_data);
+}
+
+bool CSSUnparsedValue::IsValidDeclarationValue(const String& string) {
+ CSSParserTokenStream stream(string);
+ bool important_unused;
+ // This checks that the value does not violate the "argument grammar" [1]
+ // of any substitution functions, and that it is a valid <declaration-value>
+ // otherwise.
+ //
+ // [1] https://drafts.csswg.org/css-values-5/#argument-grammar
+ //
+ // TODO(andruud): 'restricted_value' depends on the destination property.
+ return CSSVariableParser::ConsumeUnparsedDeclaration(
+ stream,
+ /*allow_important_annotation=*/false,
+ /*is_animation_tainted=*/false,
+ /*must_contain_variable_reference=*/false,
+ /*restricted_value=*/false,
+ /*comma_ends_declaration=*/false, important_unused,
+ *StrictCSSParserContext(SecureContextMode::kInsecureContext));
}
String CSSUnparsedValue::ToStringInternal() const {
diff --git a/third_party/blink/renderer/core/css/cssom/css_unparsed_value.h b/third_party/blink/renderer/core/css/cssom/css_unparsed_value.h
index ec7e3ed708f406d7a61fdb370b2eed8a8297cffb..7fd66aed677e31046a1bd206854b2cbeac07c25b 100644
--- a/third_party/blink/renderer/core/css/cssom/css_unparsed_value.h
+++ b/third_party/blink/renderer/core/css/cssom/css_unparsed_value.h
@@ -48,6 +48,14 @@ class CORE_EXPORT CSSUnparsedValue final : public CSSStyleValue {
CSSUnparsedValue(const CSSUnparsedValue&) = delete;
CSSUnparsedValue& operator=(const CSSUnparsedValue&) = delete;
+ // True if this CSSUnparsedValue can be converted into
+ // a CSSUnparsedDeclarationValue.
+ //
+ // We may want to ban some invalid values earlier, see:
+ // https://github.com/w3c/csswg-drafts/issues/13547
+ bool IsValidDeclarationValue() const;
+
+ // Requires IsValidDeclarationValue()==true.
const CSSValue* ToCSSValue() const override;
StyleValueType GetType() const override { return kUnparsedType; }
@@ -68,6 +76,7 @@ class CORE_EXPORT CSSUnparsedValue final : public CSSStyleValue {
}
private:
+ static bool IsValidDeclarationValue(const String&);
String ToStringInternal() const;
String SerializeSegments() const;
// Return 'false' if there is a cycle in the serialization.
diff --git a/third_party/blink/web_tests/external/wpt/css/css-typed-om/set-invalid-untyped-value-crash.html b/third_party/blink/web_tests/external/wpt/css/css-typed-om/set-invalid-untyped-value-crash.html
new file mode 100644
index 0000000000000000000000000000000000000000..ce618bf38fe651297b969ffdc16e212dee6a3688
--- /dev/null
+++ b/third_party/blink/web_tests/external/wpt/css/css-typed-om/set-invalid-untyped-value-crash.html
@@ -0,0 +1,39 @@
+<!DOCTYPE html>
+<title>Crash when setting invalid CSSUnparsedValue</title>
+<link rel="help" href="https://github.com/w3c/csswg-drafts/issues/13547">
+<div id=target></div>
+<script>
+ let examples = [
+ 'var()',
+ 'var(,)',
+ 'var(0)',
+ 'env()',
+ 'env(,)',
+ 'env(0)',
+ 'attr()',
+ 'attr(,)',
+ 'attr(0)',
+ 'if()',
+ 'if(,)',
+ 'if(0)',
+ '--f()',
+ '--f(,)',
+ '--f(0)',
+ 'thing!!!',
+ 'var(--x) !important',
+ ];
+ // Some of the above cases may be valid. That's fine; just don't crash.
+
+ for (let e of examples) {
+ try {
+ let value = new CSSUnparsedValue([e]);
+ target.attributeStyleMap.set('width', value);
+ // One of the two above statements should likely throw an exception.
+ // If they don't, then we should at least not crash on get():
+ target.attributeStyleMap.get('width');
+ } catch (e) {
+ // Intentionally empty.
+ }
+ target.offsetTop;
+ }
+</script>


@@ -0,0 +1,401 @@
From 0000000000000000000000000000000000000000 Mon Sep 17 00:00:00 2001
From: Alvin Ji <alvinji@chromium.org>
Date: Wed, 18 Mar 2026 21:55:14 -0700
Subject: WebUSB: Strengthen control transfer protection for protected classes
Strengthens control transfer permissions to block requests targeting
protected interface classes like HID or Mass Storage. Requests are now
scrutinized for protected interfaces via the wIndex field regardless of
recipient. Standard requests for device-level management remain allowed.
This behavior is gated by the kWebUsbProtectedClassControlTransferBlock
feature flag.
Bug: 489711638
Change-Id: Ic69c88f81e2a10abcaa80832b330d3789493f3b2
Reviewed-on: https://chromium-review.googlesource.com/c/chromium/src/+/7677820
Reviewed-by: Reilly Grant <reillyg@chromium.org>
Commit-Queue: Alvin Ji <alvinji@chromium.org>
Cr-Commit-Position: refs/heads/main@{#1601756}
diff --git a/services/device/public/cpp/device_features.cc b/services/device/public/cpp/device_features.cc
index 1345a85271139e702dd371f72f35f5e466630169..c290276590ec50535533d3569cd38c5710b42252 100644
--- a/services/device/public/cpp/device_features.cc
+++ b/services/device/public/cpp/device_features.cc
@@ -34,6 +34,12 @@ BASE_FEATURE(kWebUsbBlocklist,
"WebUSBBlocklist",
base::FEATURE_ENABLED_BY_DEFAULT);
+// When enabled, WebUSB control transfers are blocked if they target a
+// protected interface class, even if the recipient is not set to interface
+// or endpoint. This protects devices which ignore this field.
+BASE_FEATURE(kWebUsbProtectedClassControlTransferBlock,
+ base::FEATURE_ENABLED_BY_DEFAULT);
+
// When enabled, accessing the navigator.hid attribute does not prevent the
// frame from entering the back forward cache.
BASE_FEATURE(kWebHidAttributeAllowsBackForwardCache,
diff --git a/services/device/public/cpp/device_features.h b/services/device/public/cpp/device_features.h
index af05fafcbfdfb00e5a150e6c813e42f5903c270d..e69af38b0913b07cd23ed401183a3569164c852d 100644
--- a/services/device/public/cpp/device_features.h
+++ b/services/device/public/cpp/device_features.h
@@ -23,6 +23,8 @@ DEVICE_FEATURES_EXPORT BASE_DECLARE_FEATURE(
DEVICE_FEATURES_EXPORT BASE_DECLARE_FEATURE(kGenericSensorExtraClasses);
DEVICE_FEATURES_EXPORT BASE_DECLARE_FEATURE(kSerialPortConnected);
DEVICE_FEATURES_EXPORT BASE_DECLARE_FEATURE(kWebUsbBlocklist);
+DEVICE_FEATURES_EXPORT BASE_DECLARE_FEATURE(
+ kWebUsbProtectedClassControlTransferBlock);
DEVICE_FEATURES_EXPORT BASE_DECLARE_FEATURE(
kWebHidAttributeAllowsBackForwardCache);
#if BUILDFLAG(IS_WIN)
diff --git a/services/device/usb/mojo/BUILD.gn b/services/device/usb/mojo/BUILD.gn
index d2398c6ea481ca072af68a562c1be616d97c2427..5c2f1aba2f640f0af6ac037a3922123514f6e03b 100644
--- a/services/device/usb/mojo/BUILD.gn
+++ b/services/device/usb/mojo/BUILD.gn
@@ -17,6 +17,7 @@ source_set("mojo") {
deps = [
"//mojo/public/cpp/bindings",
"//net",
+ "//services/device/public/cpp:device_features",
"//services/device/public/cpp/usb",
"//services/device/public/mojom:usb",
"//services/device/public/mojom:usb_test",
diff --git a/services/device/usb/mojo/device_impl.cc b/services/device/usb/mojo/device_impl.cc
index fb11f7ff5eae436acf45001e656374f882c22708..bf18dbc6ca0f456046b2f2ea6636587bfceb3071 100644
--- a/services/device/usb/mojo/device_impl.cc
+++ b/services/device/usb/mojo/device_impl.cc
@@ -20,6 +20,7 @@
#include "base/memory/ptr_util.h"
#include "base/memory/ref_counted_memory.h"
#include "base/strings/stringprintf.h"
+#include "services/device/public/cpp/device_features.h"
#include "services/device/public/cpp/usb/usb_utils.h"
#include "services/device/usb/usb_device.h"
#include "third_party/blink/public/common/features.h"
@@ -156,31 +157,75 @@ void DeviceImpl::CloseHandle() {
}
bool DeviceImpl::HasControlTransferPermission(
+ mojom::UsbControlTransferType type,
UsbControlTransferRecipient recipient,
uint16_t index) {
DCHECK(device_handle_);
- if (recipient != UsbControlTransferRecipient::INTERFACE &&
- recipient != UsbControlTransferRecipient::ENDPOINT) {
+ // STANDARD requests to the DEVICE or OTHER recipients (e.g. GET_DESCRIPTOR)
+ // are fundamental for device discovery and management. These requests are
+ // always permitted because the USB 2.0 spec (Section 9.3) defines the usage
+ // of the `index` field (wIndex in the spec) for these types as either 0 or a
+ // Language ID. Since they are not used for interface-based routing, they
+ // are always allowed.
+ if (type == mojom::UsbControlTransferType::STANDARD &&
+ (recipient == UsbControlTransferRecipient::DEVICE ||
+ recipient == UsbControlTransferRecipient::OTHER)) {
return true;
}
const mojom::UsbConfigurationInfo* config = device_->GetActiveConfiguration();
- if (!config)
+ if (!config) {
return false;
+ }
+ // Identify the interface targeted by this request.
const mojom::UsbInterfaceInfo* interface = nullptr;
if (recipient == UsbControlTransferRecipient::ENDPOINT) {
+ // For the ENDPOINT recipient, the low byte of `index` is the endpoint
+ // address. We look up the interface that owns this endpoint.
interface = device_handle_->FindInterfaceByEndpoint(index & 0xff);
} else {
+ // For the INTERFACE recipient, the low byte of `index` is the interface
+ // number.
+ // For DEVICE and OTHER recipients, the USB spec allows `index` to be used
+ // arbitrarily by the vendor/class. We treat the low byte of `index` as a
+ // candidate interface ID to prevent routing bypasses.
auto interface_it =
std::ranges::find(config->interfaces, index & 0xff,
&mojom::UsbInterfaceInfo::interface_number);
- if (interface_it != config->interfaces.end())
+ if (interface_it != config->interfaces.end()) {
interface = interface_it->get();
+ }
}
- return interface != nullptr;
+ // If the request targets a protected interface class (e.g. HID, Mass
+ // Storage), it must be blocked. This prevents a site from communicating
+ // with a protected interface,
+ // 1. by explicitly targeting an INTERFACE or ENDPOINT recipient, or
+ // 2. VENDOR or CLASS requests to the DEVICE or OTHER recipient where
+ // index looks like an interface number in case the device will
+ // respond to these requests despite an incorrectly set recipient.
+ if (interface && base::FeatureList::IsEnabled(
+ features::kWebUsbProtectedClassControlTransferBlock)) {
+ for (const auto& alternate : interface->alternates) {
+ if (blocked_interface_classes_.contains(alternate->class_code)) {
+ return false;
+ }
+ }
+ }
+
+ // For requests explicitly targeting an INTERFACE or ENDPOINT, the interface
+ // must actually exist in the current configuration.
+ if (recipient == UsbControlTransferRecipient::INTERFACE ||
+ recipient == UsbControlTransferRecipient::ENDPOINT) {
+ return interface != nullptr;
+ }
+
+ // For DEVICE and OTHER recipients, if we reached here, it means either no
+ // interface was identified by wIndex, or the interface it identified is
+ // not protected. These requests are allowed for device-level management.
+ return true;
}
// static
@@ -343,7 +388,8 @@ void DeviceImpl::ControlTransferIn(UsbControlTransferParamsPtr params,
return;
}
- if (HasControlTransferPermission(params->recipient, params->index)) {
+ if (HasControlTransferPermission(params->type, params->recipient,
+ params->index)) {
auto buffer = base::MakeRefCounted<base::RefCountedBytes>(length);
device_handle_->ControlTransfer(
UsbTransferDirection::INBOUND, params->type, params->recipient,
@@ -366,7 +412,8 @@ void DeviceImpl::ControlTransferOut(UsbControlTransferParamsPtr params,
return;
}
- if (HasControlTransferPermission(params->recipient, params->index) &&
+ if (HasControlTransferPermission(params->type, params->recipient,
+ params->index) &&
(allow_security_key_requests_ ||
!IsAndroidSecurityKeyRequest(params, data))) {
auto buffer = base::MakeRefCounted<base::RefCountedBytes>(data);
diff --git a/services/device/usb/mojo/device_impl.h b/services/device/usb/mojo/device_impl.h
index ddcd3b49e6ae2ec8e08dbbd41c193db4d8a21434..8651ece839b6da653bcb7a7aba238ff37d15095e 100644
--- a/services/device/usb/mojo/device_impl.h
+++ b/services/device/usb/mojo/device_impl.h
@@ -51,6 +51,7 @@ class DeviceImpl : public mojom::UsbDevice, public device::UsbDevice::Observer {
// Checks interface permissions for control transfers.
bool HasControlTransferPermission(
+ mojom::UsbControlTransferType type,
mojom::UsbControlTransferRecipient recipient,
uint16_t index);
diff --git a/services/device/usb/mojo/device_impl_unittest.cc b/services/device/usb/mojo/device_impl_unittest.cc
index 403de59eb1d8bf2fe19a21e3b0c7ee9c172bd323..984a64f9242e0bca91efb5549f6c02e7206b0d89 100644
--- a/services/device/usb/mojo/device_impl_unittest.cc
+++ b/services/device/usb/mojo/device_impl_unittest.cc
@@ -27,11 +27,13 @@
#include "base/strings/stringprintf.h"
#include "base/task/sequenced_task_runner.h"
#include "base/test/bind.h"
+#include "base/test/scoped_feature_list.h"
#include "base/test/task_environment.h"
#include "base/test/test_future.h"
#include "mojo/public/cpp/bindings/receiver.h"
#include "mojo/public/cpp/bindings/remote.h"
#include "mojo/public/cpp/test_support/test_utils.h"
+#include "services/device/public/cpp/device_features.h"
#include "services/device/usb/mock_usb_device.h"
#include "services/device/usb/mock_usb_device_handle.h"
#include "services/device/usb/usb_descriptors.h"
@@ -814,7 +816,7 @@ TEST_F(USBDeviceImplTest, ClaimProtectedInterface) {
// The second interface implements a class which has been blocked above.
AddMockConfig(
- ConfigBuilder(/*value=*/1)
+ ConfigBuilder(/*configuration_value=*/1)
.AddInterface(/*interface_number=*/0, /*alternate_setting=*/0,
/*class_code=*/1, /*subclass_code=*/0,
/*protocol_code=*/0)
@@ -976,6 +978,187 @@ TEST_F(USBDeviceImplTest, ControlTransfer) {
EXPECT_CALL(mock_handle(), Close());
}
+// Test control transfers to an interface with a protected class only work for
+// STANDARD type, not VENDOR or CLASS.
+TEST_F(USBDeviceImplTest, ControlTransferProtectedClassBlock) {
+ // Block interface class 2.
+ mojo::Remote<mojom::UsbDevice> device =
+ GetMockDeviceProxyWithBlockedInterfaces(base::span_from_ref(uint8_t{2}));
+
+ EXPECT_CALL(mock_device(), OpenInternal(_));
+
+ {
+ base::test::TestFuture<mojom::UsbOpenDeviceResultPtr> future;
+ device->Open(future.GetCallback());
+ EXPECT_TRUE(future.Get()->is_success());
+ }
+
+ // Interface 7 has class 2 (blocked).
+ AddMockConfig(ConfigBuilder(/*configuration_value=*/1)
+ .AddInterface(/*interface_number=*/7,
+ /*alternate_setting=*/0,
+ /*class_code=*/2, /*subclass_code=*/0,
+ /*protocol_code=*/0)
+ .Build());
+
+ EXPECT_CALL(mock_handle(), SetConfigurationInternal(1, _));
+
+ {
+ base::RunLoop loop;
+ device->SetConfiguration(
+ 1, base::BindOnce(&ExpectResultAndThen, true, loop.QuitClosure()));
+ loop.Run();
+ }
+
+ {
+ // A VENDOR request to the DEVICE with index 7 (targeting the blocked
+ // interface) should be blocked.
+ auto params = mojom::UsbControlTransferParams::New();
+ params->type = UsbControlTransferType::VENDOR;
+ params->recipient = UsbControlTransferRecipient::DEVICE;
+ params->request = 5;
+ params->value = 6;
+ params->index = 7;
+ base::RunLoop loop;
+ device->ControlTransferIn(
+ std::move(params), 8, 0,
+ base::BindOnce(&ExpectTransferInAndThen,
+ mojom::UsbTransferStatus::PERMISSION_DENIED,
+ std::vector<uint8_t>(), loop.QuitClosure()));
+ loop.Run();
+ }
+
+ {
+ // A CLASS request to the DEVICE with index 7 (targeting the blocked
+ // interface) should be blocked.
+ auto params = mojom::UsbControlTransferParams::New();
+ params->type = UsbControlTransferType::CLASS;
+ params->recipient = UsbControlTransferRecipient::DEVICE;
+ params->request = 5;
+ params->value = 6;
+ params->index = 7;
+ base::RunLoop loop;
+ device->ControlTransferIn(
+ std::move(params), 8, 0,
+ base::BindOnce(&ExpectTransferInAndThen,
+ mojom::UsbTransferStatus::PERMISSION_DENIED,
+ std::vector<uint8_t>(), loop.QuitClosure()));
+ loop.Run();
+ }
+
+ {
+ // A STANDARD request to the DEVICE with index 7 should still be allowed
+ // even if index 7 matches a blocked interface.
+ std::vector<uint8_t> fake_data = {1, 2, 3};
+ AddMockInboundData(fake_data);
+
+ EXPECT_CALL(mock_handle(),
+ ControlTransferInternal(UsbTransferDirection::INBOUND,
+ UsbControlTransferType::STANDARD,
+ UsbControlTransferRecipient::DEVICE, 5,
+ 6, 7, _, 0, _));
+
+ auto params = mojom::UsbControlTransferParams::New();
+ params->type = UsbControlTransferType::STANDARD;
+ params->recipient = UsbControlTransferRecipient::DEVICE;
+ params->request = 5;
+ params->value = 6;
+ params->index = 7;
+ base::RunLoop loop;
+ device->ControlTransferIn(
+ std::move(params), static_cast<uint32_t>(fake_data.size()), 0,
+ base::BindOnce(&ExpectTransferInAndThen,
+ mojom::UsbTransferStatus::COMPLETED, fake_data,
+ loop.QuitClosure()));
+ loop.Run();
+ }
+
+ {
+ // A STANDARD request to the INTERFACE with index 7 (targeting the blocked
+ // interface) should be blocked.
+ auto params = mojom::UsbControlTransferParams::New();
+ params->type = UsbControlTransferType::STANDARD;
+ params->recipient = UsbControlTransferRecipient::INTERFACE;
+ params->request = 5;
+ params->value = 6;
+ params->index = 7;
+ base::RunLoop loop;
+ device->ControlTransferIn(
+ std::move(params), 8, 0,
+ base::BindOnce(&ExpectTransferInAndThen,
+ mojom::UsbTransferStatus::PERMISSION_DENIED,
+ std::vector<uint8_t>(), loop.QuitClosure()));
+ loop.Run();
+ }
+
+ EXPECT_CALL(mock_handle(), Close());
+}
+
+TEST_F(USBDeviceImplTest, ControlTransferProtectedClassBlockDisabled) {
+ base::test::ScopedFeatureList feature_list;
+ feature_list.InitAndDisableFeature(
+ features::kWebUsbProtectedClassControlTransferBlock);
+
+ // Block interface class 2.
+ mojo::Remote<mojom::UsbDevice> device =
+ GetMockDeviceProxyWithBlockedInterfaces(base::span_from_ref(uint8_t{2}));
+
+ EXPECT_CALL(mock_device(), OpenInternal(_));
+
+ {
+ base::test::TestFuture<mojom::UsbOpenDeviceResultPtr> future;
+ device->Open(future.GetCallback());
+ EXPECT_TRUE(future.Get()->is_success());
+ }
+
+ // Interface 7 has class 2 (blocked).
+ AddMockConfig(ConfigBuilder(/*configuration_value=*/1)
+ .AddInterface(/*interface_number=*/7,
+ /*alternate_setting=*/0,
+ /*class_code=*/2, /*subclass_code=*/0,
+ /*protocol_code=*/0)
+ .Build());
+
+ EXPECT_CALL(mock_handle(), SetConfigurationInternal(1, _));
+
+ {
+ base::RunLoop loop;
+ device->SetConfiguration(
+ 1, base::BindOnce(&ExpectResultAndThen, true, loop.QuitClosure()));
+ loop.Run();
+ }
+
+ {
+ // A VENDOR request to the DEVICE with index 7 targeting the blocked
+ // interface should be ALLOWED because
+ // `kWebUsbProtectedClassControlTransferBlock` is disabled.
+ std::vector<uint8_t> fake_data = {1, 2, 3};
+ AddMockInboundData(fake_data);
+
+ EXPECT_CALL(mock_handle(),
+ ControlTransferInternal(UsbTransferDirection::INBOUND,
+ UsbControlTransferType::VENDOR,
+ UsbControlTransferRecipient::DEVICE, 5,
+ 6, 7, _, 0, _));
+
+ auto params = mojom::UsbControlTransferParams::New();
+ params->type = UsbControlTransferType::VENDOR;
+ params->recipient = UsbControlTransferRecipient::DEVICE;
+ params->request = 5;
+ params->value = 6;
+ params->index = 7;
+ base::RunLoop loop;
+ device->ControlTransferIn(
+ std::move(params), static_cast<uint32_t>(fake_data.size()), 0,
+ base::BindOnce(&ExpectTransferInAndThen,
+ mojom::UsbTransferStatus::COMPLETED, fake_data,
+ loop.QuitClosure()));
+ loop.Run();
+ }
+
+ EXPECT_CALL(mock_handle(), Close());
+}
+
TEST_F(USBDeviceImplTest, GenericTransfer) {
mojo::Remote<mojom::UsbDevice> device = GetMockDeviceProxy();


@@ -0,0 +1,39 @@
From 0000000000000000000000000000000000000000 Mon Sep 17 00:00:00 2001
From: "Steinar H. Gunderson" <sesse@chromium.org>
Date: Fri, 20 Mar 2026 07:22:02 -0700
Subject: Fix another use-after-free with lazy style attributes.
This is a similar problem to the regular attribute checks, just for
the special case of input type="" (which is a similar but separate
path).
Style perftest and Speedometer3 are neutral.
Fixed: 493952652
Change-Id: I264503545c345325e6d21afa0726f524bb9394b8
Reviewed-on: https://chromium-review.googlesource.com/c/chromium/src/+/7686835
Reviewed-by: Anders Hartvoll Ruud <andruud@chromium.org>
Commit-Queue: Steinar H Gunderson <sesse@chromium.org>
Cr-Commit-Position: refs/heads/main@{#1602570}
diff --git a/third_party/blink/renderer/core/css/element_rule_collector.cc b/third_party/blink/renderer/core/css/element_rule_collector.cc
index 9c472ed289a51b828bc7718ffada1df1a68aec30..ebefc123c4fc0d511126b8bf223b34b7e15d6a3e 100644
--- a/third_party/blink/renderer/core/css/element_rule_collector.cc
+++ b/third_party/blink/renderer/core/css/element_rule_collector.cc
@@ -941,10 +941,14 @@ DISABLE_CFI_PERF bool ElementRuleCollector::CollectMatchingRulesInternal(
if (const AtomicString& input_type =
element.getAttribute(html_names::kTypeAttr);
!input_type.IsNull()) {
+ // Do not use input_type in the loop; the reference
+ // may be dangling if CollectMatchingRulesForList()
+ // adds lazy attributes.
+ AtomicString input_type_lower = input_type.LowerASCII();
for (const auto bundle : match_request.RuleSetsWithInputRules()) {
if (CollectMatchingRulesForList<stop_at_first_match>(
- bundle.rule_set->InputRules(input_type.LowerASCII()),
- match_request, bundle.rule_set, bundle.style_sheet_index,
+ bundle.rule_set->InputRules(input_type_lower), match_request,
+ bundle.rule_set, bundle.style_sheet_index,
checker, context.context) &&
stop_at_first_match) {
return true;


@@ -0,0 +1,73 @@
From 0000000000000000000000000000000000000000 Mon Sep 17 00:00:00 2001
From: Samuel Attard <sattard@anthropic.com>
Date: Sat, 7 Mar 2026 23:07:30 -0800
Subject: feat: plumb node_integration_in_worker through WorkerSettings
Copy the node_integration_in_worker flag from the initiating frame's
WebPreferences into WorkerSettings at dedicated worker creation time,
so the value is readable per-worker on the worker thread rather than
relying on a process-wide command line switch. The value is also
propagated to nested workers via WorkerSettings::Copy.
diff --git a/third_party/blink/renderer/core/workers/dedicated_worker.cc b/third_party/blink/renderer/core/workers/dedicated_worker.cc
index b6cd50a0005a69edcd2b5c059ffa35f82faac792..b74fd8f7e290075d37f10a2c216bef6de60f1052 100644
--- a/third_party/blink/renderer/core/workers/dedicated_worker.cc
+++ b/third_party/blink/renderer/core/workers/dedicated_worker.cc
@@ -37,6 +37,7 @@
#include "third_party/blink/renderer/core/frame/local_frame_client.h"
#include "third_party/blink/renderer/core/frame/web_frame_widget_impl.h"
#include "third_party/blink/renderer/core/frame/web_local_frame_impl.h"
+#include "third_party/blink/renderer/core/exported/web_view_impl.h"
#include "third_party/blink/renderer/core/inspector/inspector_trace_events.h"
#include "third_party/blink/renderer/core/inspector/main_thread_debugger.h"
#include "third_party/blink/renderer/core/loader/document_loader.h"
@@ -555,6 +556,12 @@ DedicatedWorker::CreateGlobalScopeCreationParams(
auto* frame = window->GetFrame();
parent_devtools_token = frame->GetDevToolsFrameToken();
settings = std::make_unique<WorkerSettings>(frame->GetSettings());
+ if (auto* web_local_frame = WebLocalFrameImpl::FromFrame(frame)) {
+ if (auto* web_view = web_local_frame->ViewImpl()) {
+ settings->SetNodeIntegrationInWorker(
+ web_view->GetWebPreferences().node_integration_in_worker);
+ }
+ }
agent_group_scheduler_compositor_task_runner =
execution_context->GetScheduler()
->ToFrameScheduler()
diff --git a/third_party/blink/renderer/core/workers/worker_settings.cc b/third_party/blink/renderer/core/workers/worker_settings.cc
index 45680c5f6ea0c7e89ccf43eb88f8a11e3318c02e..3fa3af62f4e7ba8186441c5e3184b1c04fe32d12 100644
--- a/third_party/blink/renderer/core/workers/worker_settings.cc
+++ b/third_party/blink/renderer/core/workers/worker_settings.cc
@@ -40,6 +40,8 @@ std::unique_ptr<WorkerSettings> WorkerSettings::Copy(
old_settings->strictly_block_blockable_mixed_content_;
new_settings->generic_font_family_settings_ =
old_settings->generic_font_family_settings_;
+ new_settings->node_integration_in_worker_ =
+ old_settings->node_integration_in_worker_;
return new_settings;
}
diff --git a/third_party/blink/renderer/core/workers/worker_settings.h b/third_party/blink/renderer/core/workers/worker_settings.h
index 45c60dd2c44b05fdd279f759069383479823c7f2..33a2a0337efb9a46293e11d0d09b3fc182ab9618 100644
--- a/third_party/blink/renderer/core/workers/worker_settings.h
+++ b/third_party/blink/renderer/core/workers/worker_settings.h
@@ -43,6 +43,11 @@ class CORE_EXPORT WorkerSettings {
return generic_font_family_settings_;
}
+ bool NodeIntegrationInWorker() const { return node_integration_in_worker_; }
+ void SetNodeIntegrationInWorker(bool value) {
+ node_integration_in_worker_ = value;
+ }
+
private:
void CopyFlagValuesFromSettings(Settings*);
@@ -54,6 +59,7 @@ class CORE_EXPORT WorkerSettings {
bool strict_mixed_content_checking_ = false;
bool allow_running_of_insecure_content_ = false;
bool strictly_block_blockable_mixed_content_ = false;
+ bool node_integration_in_worker_ = false;
GenericFontFamilySettings generic_font_family_settings_;
};

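The plumbing in the patch above follows one rule: snapshot the flag from the creating frame's preferences into the worker's own settings at creation time, and have the nested-worker `Copy` preserve it, so the worker thread never consults a process-wide switch. A toy model of that flow, assuming nothing about the real Blink API (class and field names here are stand-ins):

```python
class WorkerSettings:
    """Per-worker snapshot of preferences taken at creation time."""

    def __init__(self, node_integration_in_worker=False):
        self.node_integration_in_worker = node_integration_in_worker

    @classmethod
    def copy(cls, old):
        # Nested workers inherit the snapshot taken when the top-level
        # worker was created, mirroring WorkerSettings::Copy in the patch.
        return cls(old.node_integration_in_worker)


def create_worker_settings(frame_prefs):
    # Read the per-frame preference once, at worker creation, instead of
    # checking a process-wide command-line switch on the worker thread.
    return WorkerSettings(frame_prefs.get('nodeIntegrationInWorker', False))


top = create_worker_settings({'nodeIntegrationInWorker': True})
nested = WorkerSettings.copy(top)
print(nested.node_integration_in_worker)  # True
```

The snapshot-then-copy shape is what lets two in-process windows with different webPreferences hold different values at the same time.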
View File

@@ -0,0 +1,65 @@
From 0000000000000000000000000000000000000000 Mon Sep 17 00:00:00 2001
From: Keeley Hammond <vertedinde@electronjs.org>
Date: Mon, 23 Mar 2026 18:27:45 -0700
Subject: Validate uniform block count limits at compile time on IMG.
Normally these limits are validated at link time but the IMG compiler
has issues when these limits are exceeded. Validate at compile time
instead.
Bug: chromium:475877320
Change-Id: Ieeed6914b8cdd2b5e50242d06facae62badddefd
Reviewed-on: https://chromium-review.googlesource.com/c/chromium/src/+/7568129
Auto-Submit: Geoff Lang <geofflang@chromium.org>
Reviewed-by: Kyle Charbonneau <kylechar@chromium.org>
Commit-Queue: Kyle Charbonneau <kylechar@chromium.org>
Cr-Commit-Position: refs/heads/main@{#1586673}
diff --git a/gpu/command_buffer/service/gles2_cmd_decoder.cc b/gpu/command_buffer/service/gles2_cmd_decoder.cc
index 02078983a7b10588c7d68f5a3affa5536edb29d5..a7131ef3a34e86d44ce5af5bb8bfab8853c32a50 100644
--- a/gpu/command_buffer/service/gles2_cmd_decoder.cc
+++ b/gpu/command_buffer/service/gles2_cmd_decoder.cc
@@ -3664,6 +3664,9 @@ bool GLES2DecoderImpl::InitializeShaderTranslator() {
driver_bug_workarounds.dontUseLoopsToInitializeVariables = true;
if (workarounds().remove_dynamic_indexing_of_swizzled_vector)
driver_bug_workarounds.removeDynamicIndexingOfSwizzledVector = true;
+ if (workarounds().validate_max_per_stage_uniform_blocks_at_compile_time) {
+ driver_bug_workarounds.validatePerStageMaxUniformBlocks = true;
+ }
// Initialize uninitialized locals by default
driver_bug_workarounds.initializeUninitializedLocals = true;
diff --git a/gpu/config/gpu_driver_bug_list.json b/gpu/config/gpu_driver_bug_list.json
index deeb8fb1ea810a87bb19bcb85cc352fe802361ac..900b5fa1decf4c1895f32eb5ab05fd54a806669c 100644
--- a/gpu/config/gpu_driver_bug_list.json
+++ b/gpu/config/gpu_driver_bug_list.json
@@ -3772,6 +3772,17 @@
"features": [
"disable_d3d12_av1_multi_ref_encoding"
]
+ },
+ {
+ "id": 472,
+ "description": "Validate GL_MAX_*_UNIFORM_BLOCKS at compile time instead of link time to work around compiler bugs.",
+ "os": {
+ "type": "android"
+ },
+ "gl_vendor": "Imagination.*",
+ "features": [
+ "validate_max_per_stage_uniform_blocks_at_compile_time"
+ ]
}
]
}
diff --git a/gpu/config/gpu_workaround_list.txt b/gpu/config/gpu_workaround_list.txt
index 32b7797c2f19b3e597963d1d2c78da6066ff3bd3..300586499f496d25c571adcb90cb066b4e778899 100644
--- a/gpu/config/gpu_workaround_list.txt
+++ b/gpu/config/gpu_workaround_list.txt
@@ -127,6 +127,7 @@ use_first_valid_ref_for_av1_invalid_ref
use_gpu_driver_workaround_for_testing
use_non_zero_size_for_client_side_stream_buffers
use_virtualized_gl_contexts
+validate_max_per_stage_uniform_blocks_at_compile_time
wake_up_gpu_before_drawing
webgl_or_caps_max_texture_size_limit_4096
webgl_or_caps_max_texture_size_limit_8192

View File

@@ -32,7 +32,8 @@ async function main () {
}));
const hitRate = stats.CacheHit / (stats.Remote + stats.CacheHit + stats.LocalFallback);
console.log(`Effective cache hit rate: ${(hitRate * 100).toFixed(2)}%`);
const messagePrefix = process.env.GITHUB_ACTIONS ? '::notice title=Build Stats::' : '';
console.log(`${messagePrefix}Effective cache hit rate: ${(hitRate * 100).toFixed(2)}%`);
if (uploadStats) {
if (!process.env.DD_API_KEY) {

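The hit-rate line in the script above divides remote-cache hits by every compile action that could have hit the cache. The same arithmetic in isolation (the `stats` keys mirror the ones the script reads; the sample numbers are made up):

```python
def effective_hit_rate(stats):
    """Fraction of compile actions served from the remote cache."""
    total = stats['Remote'] + stats['CacheHit'] + stats['LocalFallback']
    return stats['CacheHit'] / total


stats = {'CacheHit': 900, 'Remote': 80, 'LocalFallback': 20}
print(f"Effective cache hit rate: {effective_hit_rate(stats) * 100:.2f}%")
```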
View File

@@ -6,6 +6,7 @@ Everything here should be project agnostic: it shouldn't rely on project's
structure, or make assumptions about the passed arguments or calls' outcomes.
"""
import hashlib
import io
import os
import posixpath
@@ -18,7 +19,14 @@ sys.path.append(SCRIPT_DIR)
from patches import PATCH_FILENAME_PREFIX, is_patch_location_line
UPSTREAM_HEAD='refs/patches/upstream-head'
# In gclient-new-workdir worktrees, .git/refs is symlinked back to the source
# checkout, so a single fixed ref name would be shared (and clobbered) across
# worktrees. Derive a per-checkout suffix from this script's absolute path so
# each worktree records its own upstream head in the shared refs directory.
_LEGACY_UPSTREAM_HEAD = 'refs/patches/upstream-head'
UPSTREAM_HEAD = (
_LEGACY_UPSTREAM_HEAD + '-' + hashlib.md5(SCRIPT_DIR.encode()).hexdigest()[:8]
)
def is_repo_root(path):
path_exists = os.path.exists(path)
@@ -83,6 +91,8 @@ def import_patches(repo, ref=UPSTREAM_HEAD, **kwargs):
"""same as am(), but we save the upstream HEAD so we can refer to it when we
later export patches"""
update_ref(repo=repo, ref=ref, newvalue='HEAD')
if ref != _LEGACY_UPSTREAM_HEAD:
update_ref(repo=repo, ref=_LEGACY_UPSTREAM_HEAD, newvalue='HEAD')
am(repo=repo, **kwargs)
@@ -102,19 +112,21 @@ def get_commit_count(repo, commit_range):
def guess_base_commit(repo, ref):
"""Guess which commit the patches might be based on"""
try:
upstream_head = get_commit_for_ref(repo, ref)
num_commits = get_commit_count(repo, upstream_head + '..')
return [upstream_head, num_commits]
except subprocess.CalledProcessError:
args = [
'git',
'-C',
repo,
'describe',
'--tags',
]
return subprocess.check_output(args).decode('utf-8').rsplit('-', 2)[0:2]
for candidate in (ref, _LEGACY_UPSTREAM_HEAD):
try:
upstream_head = get_commit_for_ref(repo, candidate)
num_commits = get_commit_count(repo, upstream_head + '..')
return [upstream_head, num_commits]
except subprocess.CalledProcessError:
continue
args = [
'git',
'-C',
repo,
'describe',
'--tags',
]
return subprocess.check_output(args).decode('utf-8').rsplit('-', 2)[0:2]
def format_patch(repo, since):

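The per-worktree ref naming above can be sketched on its own: the suffix is the first 8 hex digits of the MD5 of the script's directory, so checkouts at different paths get distinct refs while re-runs in the same checkout stay stable. A minimal sketch (the paths are illustrative):

```python
import hashlib


def upstream_head_ref(script_dir: str) -> str:
    """Derive a per-checkout upstream-head ref name from the checkout path."""
    suffix = hashlib.md5(script_dir.encode()).hexdigest()[:8]
    return f'refs/patches/upstream-head-{suffix}'


# Distinct checkouts map to distinct refs; the same checkout is stable.
print(upstream_head_ref('/work/a/src/electron/script'))
print(upstream_head_ref('/work/b/src/electron/script'))
```

Because `.git/refs` is shared between gclient worktrees, keying the ref on the checkout path is what stops one worktree from clobbering another's recorded upstream head.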
View File

@@ -1,21 +0,0 @@
import os
import subprocess
import sys
def npx(*npx_args):
npx_env = os.environ.copy()
npx_env['npm_config_yes'] = 'true'
call_args = [__get_executable_name()] + list(npx_args)
subprocess.check_call(call_args, env=npx_env)
def __get_executable_name():
executable = 'npx'
if sys.platform == 'win32':
executable += '.cmd'
return executable
if __name__ == '__main__':
npx(*sys.argv[1:])

View File

@@ -6,7 +6,7 @@ const path = require('node:path');
const BASE = path.resolve(__dirname, '../..');
const NAN_DIR = path.resolve(BASE, 'third_party', 'nan');
const NPX_CMD = process.platform === 'win32' ? 'npx.cmd' : 'npx';
const NODE_GYP_BIN = path.join(NAN_DIR, 'node_modules', 'node-gyp', 'bin', 'node-gyp.js');
const utils = require('./lib/utils');
const { YARN_SCRIPT_PATH } = require('./yarn');
@@ -19,14 +19,6 @@ const args = minimist(process.argv.slice(2), {
string: ['only']
});
const getNodeGypVersion = () => {
const nanPackageJSONPath = path.join(NAN_DIR, 'package.json');
const nanPackageJSON = JSON.parse(fs.readFileSync(nanPackageJSONPath, 'utf8'));
const { devDependencies } = nanPackageJSON;
const nodeGypVersion = devDependencies['node-gyp'];
return nodeGypVersion || 'latest';
};
async function main () {
const outDir = utils.getOutDir({ shouldLog: true });
const nodeDir = path.resolve(BASE, 'out', outDir, 'gen', 'node_headers');
@@ -34,8 +26,7 @@ async function main () {
npm_config_msvs_version: '2022',
...process.env,
npm_config_nodedir: nodeDir,
npm_config_arch: process.env.NPM_CONFIG_ARCH,
npm_config_yes: 'true'
npm_config_arch: process.env.NPM_CONFIG_ARCH
};
const clangDir = path.resolve(BASE, 'third_party', 'llvm-build', 'Release+Asserts', 'bin');
@@ -99,30 +90,26 @@ async function main () {
env.LDFLAGS = ldflags;
}
const nodeGypVersion = getNodeGypVersion();
const { status: buildStatus, signal } = cp.spawnSync(NPX_CMD, [`node-gyp@${nodeGypVersion}`, 'rebuild', '--verbose', '--directory', 'test', '-j', 'max'], {
const { status: installStatus, signal: installSignal } = cp.spawnSync(process.execPath, [YARN_SCRIPT_PATH, 'install'], {
env,
cwd: NAN_DIR,
stdio: 'inherit',
shell: process.platform === 'win32'
stdio: 'inherit'
});
if (installStatus !== 0 || installSignal != null) {
console.error('Failed to install nan node_modules');
return process.exit(installStatus !== 0 ? installStatus : installSignal);
}
const { status: buildStatus, signal } = cp.spawnSync(process.execPath, [NODE_GYP_BIN, 'rebuild', '--verbose', '--directory', 'test', '-j', 'max'], {
env,
cwd: NAN_DIR,
stdio: 'inherit'
});
if (buildStatus !== 0 || signal != null) {
console.error('Failed to build nan test modules');
return process.exit(buildStatus !== 0 ? buildStatus : signal);
}
const { status: installStatus, signal: installSignal } = cp.spawnSync(process.execPath, [YARN_SCRIPT_PATH, 'install'], {
env,
cwd: NAN_DIR,
stdio: 'inherit',
shell: process.platform === 'win32'
});
if (installStatus !== 0 || installSignal != null) {
console.error('Failed to install nan node_modules');
return process.exit(installStatus !== 0 ? installStatus : installSignal);
}
const onlyTests = args.only?.split(',');
const DISABLED_TESTS = new Set([

View File

@@ -212,10 +212,15 @@ new Promise<string>((resolve, reject) => {
});
})
.then((tarballPath) => {
// TODO: Remove NPX
const existingVersionJSON = childProcess.execSync(`npx npm@7 view ${rootPackageJson.name}@${currentElectronVersion} --json`).toString('utf-8');
// It's possible this is a re-run and we already have published the package, if not we just publish like normal
if (!existingVersionJSON) {
let versionAlreadyPublished = false;
try {
childProcess.execSync(`npm view ${rootPackageJson.name}@${currentElectronVersion} --json`, { stdio: 'pipe' });
versionAlreadyPublished = true;
} catch (e: any) {
if (!e.stdout?.toString().includes('E404')) throw e;
}
if (!versionAlreadyPublished) {
childProcess.execSync(`npm publish ${tarballPath} --tag ${npmTag} --otp=${process.env.ELECTRON_NPM_OTP}`);
}
})

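The re-run guard above treats an `E404` from `npm view` as "not yet published" and re-raises any other failure. A hedged sketch of that decision, with the command runner injected so the actual npm invocation stays outside the logic (the function name is hypothetical):

```python
import subprocess


def is_version_published(run_view):
    """run_view() should execute `npm view <pkg>@<version>` and raise
    CalledProcessError on failure. Returns True when the version exists,
    False on an E404 (version absent), and re-raises anything else."""
    try:
        run_view()
        return True
    except subprocess.CalledProcessError as err:
        out = err.stdout or b''
        if isinstance(out, bytes):
            out = out.decode('utf-8', 'replace')
        if 'E404' in out:
            return False
        raise  # network errors, auth failures, etc. should abort the publish
```

The caller then publishes only when this returns False, which makes a re-run of the release job idempotent.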
View File

@@ -31,9 +31,9 @@ PDB_LIST = [
PDB_LIST += glob.glob(os.path.join(RELEASE_DIR, '*.dll.pdb'))
NPX_CMD = "npx"
SENTRY_CLI = os.path.join(ELECTRON_DIR, 'node_modules', '.bin', 'sentry-cli')
if sys.platform == "win32":
NPX_CMD += ".cmd"
SENTRY_CLI += ".cmd"
def main():
@@ -48,11 +48,8 @@ def main():
for symbol_file in files:
print("Generating Sentry src bundle for: " + symbol_file)
npx_env = os.environ.copy()
npx_env['npm_config_yes'] = 'true'
subprocess.check_output([
NPX_CMD, '@sentry/cli@1.62.0', 'difutil', 'bundle-sources',
symbol_file], env=npx_env)
SENTRY_CLI, 'difutil', 'bundle-sources', symbol_file])
files += glob.glob(SYMBOLS_DIR + '/*/*/*.src.zip')

View File

@@ -317,6 +317,12 @@ void BaseWindow::OnWindowSheetEnd() {
Emit("sheet-end");
}
void BaseWindow::OnWindowIsKeyChanged(bool is_key) {
#if BUILDFLAG(IS_MAC)
window()->SetActive(is_key);
#endif
}
void BaseWindow::OnWindowEnterHtmlFullScreen() {
Emit("enter-html-full-screen");
}

View File

@@ -83,6 +83,7 @@ class BaseWindow : public gin_helper::TrackableObject<BaseWindow>,
void OnWindowRotateGesture(float rotation) override;
void OnWindowSheetBegin() override;
void OnWindowSheetEnd() override;
void OnWindowIsKeyChanged(bool is_key) override;
void OnWindowEnterFullScreen() override;
void OnWindowLeaveFullScreen() override;
void OnWindowEnterHtmlFullScreen() override;

View File

@@ -280,16 +280,22 @@ v8::Local<v8::Value> BrowserWindow::GetWebContents(v8::Isolate* isolate) {
}
void BrowserWindow::OnWindowShow() {
if (!web_contents_shown_) {
web_contents()->WasShown();
web_contents_shown_ = true;
}
BaseWindow::OnWindowShow();
}
void BrowserWindow::OnWindowHide() {
web_contents()->WasOccluded();
web_contents_shown_ = false;
BaseWindow::OnWindowHide();
}
void BrowserWindow::Show() {
web_contents()->WasShown();
web_contents_shown_ = true;
BaseWindow::Show();
}
@@ -298,6 +304,7 @@ void BrowserWindow::ShowInactive() {
if (IsModal())
return;
web_contents()->WasShown();
web_contents_shown_ = true;
BaseWindow::ShowInactive();
}

View File

@@ -80,6 +80,7 @@ class BrowserWindow : public BaseWindow,
// Helpers.
v8::Global<v8::Value> web_contents_;
bool web_contents_shown_ = false;
v8::Global<v8::Value> web_contents_view_;
base::WeakPtr<api::WebContents> api_web_contents_;

View File

@@ -151,7 +151,10 @@ void OnTraceBufferUsageAvailable(
gin_helper::Promise<gin_helper::Dictionary> promise,
float percent_full,
size_t approximate_count) {
auto dict = gin_helper::Dictionary::CreateEmpty(promise.isolate());
v8::Isolate* isolate = promise.isolate();
v8::HandleScope handle_scope(isolate);
auto dict = gin_helper::Dictionary::CreateEmpty(isolate);
dict.Set("percentage", percent_full);
dict.Set("value", approximate_count);

View File

@@ -1667,7 +1667,8 @@ bool WebContents::CheckMediaAccessPermission(
content::WebContents::FromRenderFrameHost(render_frame_host);
auto* permission_helper =
WebContentsPermissionHelper::FromWebContents(web_contents);
return permission_helper->CheckMediaAccessPermission(security_origin, type);
return permission_helper->CheckMediaAccessPermission(render_frame_host,
security_origin, type);
}
void WebContents::RequestMediaAccessPermission(

View File

@@ -51,8 +51,7 @@ bool ElectronSerialDelegate::CanRequestPortPermission(
auto* web_contents = content::WebContents::FromRenderFrameHost(frame);
auto* permission_helper =
WebContentsPermissionHelper::FromWebContents(web_contents);
return permission_helper->CheckSerialAccessPermission(
frame->GetLastCommittedOrigin());
return permission_helper->CheckSerialAccessPermission(frame);
}
bool ElectronSerialDelegate::HasPortPermission(

View File

@@ -12,6 +12,7 @@
#include <utility>
#include "base/base64.h"
#include "base/containers/fixed_flat_set.h"
#include "base/containers/span.h"
#include "base/memory/raw_ptr.h"
#include "base/metrics/histogram.h"
@@ -158,6 +159,13 @@ void OnOpenItemComplete(const base::FilePath& path, const std::string& result) {
constexpr base::TimeDelta kInitialBackoffDelay = base::Milliseconds(250);
constexpr base::TimeDelta kMaxBackoffDelay = base::Seconds(10);
constexpr auto kValidDockStates = base::MakeFixedFlatSet<std::string_view>(
{"bottom", "left", "right", "undocked"});
bool IsValidDockState(const std::string& state) {
return kValidDockStates.contains(state);
}
} // namespace
class InspectableWebContents::NetworkResourceLoader
@@ -392,7 +400,7 @@ void InspectableWebContents::SetDockState(const std::string& state) {
can_dock_ = false;
} else {
can_dock_ = true;
dock_state_ = state;
dock_state_ = IsValidDockState(state) ? state : "right";
}
}
@@ -557,7 +565,13 @@ void InspectableWebContents::LoadCompleted() {
pref_service_->GetDict(kDevToolsPreferences);
const std::string* current_dock_state =
prefs.FindString("currentDockState");
base::RemoveChars(*current_dock_state, "\"", &dock_state_);
if (current_dock_state) {
std::string sanitized;
base::RemoveChars(*current_dock_state, "\"", &sanitized);
dock_state_ = IsValidDockState(sanitized) ? sanitized : "right";
} else {
dock_state_ = "right";
}
}
#if BUILDFLAG(IS_WIN) || BUILDFLAG(IS_LINUX)
auto* api_web_contents = api::WebContents::From(GetWebContents());

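The sanitization in the patch above reduces to an allowlist with a safe default: strip stray quotes from the stored pref, then fall back to `"right"` for anything outside the known dock states. A standalone sketch (the function name is a stand-in):

```python
VALID_DOCK_STATES = {'bottom', 'left', 'right', 'undocked'}


def sanitize_dock_state(state: str) -> str:
    """Strip stray quote characters from a stored pref value and fall
    back to 'right' for anything outside the allowlist."""
    cleaned = state.replace('"', '')
    return cleaned if cleaned in VALID_DOCK_STATES else 'right'


print(sanitize_dock_state('"bottom"'))  # stored prefs may carry quotes
print(sanitize_dock_state('garbage'))   # unexpected input -> safe default
```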
View File

@@ -228,14 +228,14 @@ void WebContentsPermissionHelper::RequestPermission(
}
bool WebContentsPermissionHelper::CheckPermission(
content::RenderFrameHost* requesting_frame,
blink::PermissionType permission,
base::Value::Dict details) const {
auto* rfh = web_contents_->GetPrimaryMainFrame();
auto* permission_manager = static_cast<ElectronPermissionManager*>(
web_contents_->GetBrowserContext()->GetPermissionControllerDelegate());
auto origin = web_contents_->GetLastCommittedURL();
return permission_manager->CheckPermissionWithDetails(permission, rfh, origin,
std::move(details));
auto origin = requesting_frame->GetLastCommittedOrigin().GetURL();
return permission_manager->CheckPermissionWithDetails(
permission, requesting_frame, origin, std::move(details));
}
void WebContentsPermissionHelper::RequestFullscreenPermission(
@@ -313,6 +313,7 @@ void WebContentsPermissionHelper::RequestOpenExternalPermission(
}
bool WebContentsPermissionHelper::CheckMediaAccessPermission(
content::RenderFrameHost* requesting_frame,
const url::Origin& security_origin,
blink::mojom::MediaStreamType type) const {
base::Value::Dict details;
@@ -321,14 +322,16 @@ bool WebContentsPermissionHelper::CheckMediaAccessPermission(
auto blink_type = type == blink::mojom::MediaStreamType::DEVICE_AUDIO_CAPTURE
? blink::PermissionType::AUDIO_CAPTURE
: blink::PermissionType::VIDEO_CAPTURE;
return CheckPermission(blink_type, std::move(details));
return CheckPermission(requesting_frame, blink_type, std::move(details));
}
bool WebContentsPermissionHelper::CheckSerialAccessPermission(
const url::Origin& embedding_origin) const {
content::RenderFrameHost* requesting_frame) const {
base::Value::Dict details;
details.Set("securityOrigin", embedding_origin.GetURL().spec());
return CheckPermission(blink::PermissionType::SERIAL, std::move(details));
details.Set("securityOrigin",
requesting_frame->GetLastCommittedOrigin().GetURL().spec());
return CheckPermission(requesting_frame, blink::PermissionType::SERIAL,
std::move(details));
}
WEB_CONTENTS_USER_DATA_KEY_IMPL(WebContentsPermissionHelper);

View File

@@ -47,9 +47,11 @@ class WebContentsPermissionHelper
const GURL& url);
// Synchronous Checks
bool CheckMediaAccessPermission(const url::Origin& security_origin,
bool CheckMediaAccessPermission(content::RenderFrameHost* requesting_frame,
const url::Origin& security_origin,
blink::mojom::MediaStreamType type) const;
bool CheckSerialAccessPermission(const url::Origin& embedding_origin) const;
bool CheckSerialAccessPermission(
content::RenderFrameHost* requesting_frame) const;
private:
explicit WebContentsPermissionHelper(content::WebContents* web_contents);
@@ -61,7 +63,8 @@ class WebContentsPermissionHelper
bool user_gesture = false,
base::Value::Dict details = {});
bool CheckPermission(blink::PermissionType permission,
bool CheckPermission(content::RenderFrameHost* requesting_frame,
blink::PermissionType permission,
base::Value::Dict details) const;
// TODO(clavin): refactor to use the WebContents provided by the

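The core of the refactor above is which origin gets consulted: the requesting frame's last-committed origin rather than the top-level document's URL, so a cross-origin iframe is checked against the origin that actually issued the request. A toy model of that keying (dict shapes and names are stand-ins, not the Electron API):

```python
def check_permission(requesting_frame, permission, granted):
    # Key the check on the requesting frame's own origin, not the
    # top-level document's URL, so cross-origin iframes are evaluated
    # against the origin that issued the request.
    origin = requesting_frame['last_committed_origin']
    return (origin, permission) in granted


granted = {('https://embed.example/', 'media')}
main_frame = {'last_committed_origin': 'https://top.example/'}
iframe = {'last_committed_origin': 'https://embed.example/'}
print(check_permission(iframe, 'media', granted))      # True
print(check_permission(main_frame, 'media', granted))  # False
```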
View File

@@ -340,9 +340,6 @@ void WebContentsPreferences::AppendCommandLineSwitches(
command_line->AppendSwitchASCII(::switches::kDisableBlinkFeatures,
*disable_blink_features_);
if (node_integration_in_worker_)
command_line->AppendSwitch(switches::kNodeIntegrationInWorker);
// We are appending args to a webContents so let's save the current state
// of our preferences object so that during the lifetime of the WebContents
// we can fetch the options used to initially configure the WebContents

View File

@@ -245,7 +245,11 @@ gfx::Image Clipboard::ReadImage(gin_helper::Arguments* args) {
[](std::optional<gfx::Image>* image, base::RepeatingClosure cb,
const std::vector<uint8_t>& result) {
SkBitmap bitmap = gfx::PNGCodec::Decode(result);
image->emplace(gfx::Image::CreateFrom1xBitmap(bitmap));
if (bitmap.isNull()) {
image->emplace();
} else {
image->emplace(gfx::Image::CreateFrom1xBitmap(bitmap));
}
std::move(cb).Run();
},
&image, std::move(callback)));

View File

@@ -5,6 +5,8 @@
#ifndef ELECTRON_SHELL_COMMON_GIN_CONVERTERS_FILE_PATH_CONVERTER_H_
#define ELECTRON_SHELL_COMMON_GIN_CONVERTERS_FILE_PATH_CONVERTER_H_
#include <algorithm>
#include "base/files/file_path.h"
#include "gin/converter.h"
#include "shell/common/gin_converters/std_converter.h"
@@ -30,6 +32,11 @@ struct Converter<base::FilePath> {
base::FilePath::StringType path;
if (Converter<base::FilePath::StringType>::FromV8(isolate, val, &path)) {
bool has_control_chars = std::any_of(
path.begin(), path.end(),
[](base::FilePath::CharType c) { return c >= 0 && c < 0x20; });
if (has_control_chars)
return false;
*out = base::FilePath(path);
return true;
} else {

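This converter and the one in the next diff apply the same guard: reject any path containing an ASCII control character (bytes 0x00 through 0x1F). A minimal sketch of the check:

```python
def has_control_chars(path: str) -> bool:
    """True if the path contains an ASCII control character (< 0x20)."""
    return any(ord(c) < 0x20 for c in path)


# A newline smuggled into a filename is rejected; normal paths pass.
print(has_control_chars('/tmp/evil\nfile'))  # True
print(has_control_chars('/tmp/ok.txt'))      # False
```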
View File

@@ -607,6 +607,9 @@ bool Converter<scoped_refptr<network::ResourceRequestBody>>::FromV8(
const std::string* file = dict.FindString("filePath");
if (!file)
return false;
if (std::any_of(file->begin(), file->end(),
[](char c) { return c >= 0 && c < 0x20; }))
return false;
double modification_time =
dict.FindDouble("modificationTime").value_or(0.0);
int offset = dict.FindInt("offset").value_or(0);

View File

@@ -151,9 +151,12 @@ v8::Local<v8::Value> Converter<electron::OffscreenSharedTextureValue>::ToV8(
root.Set("textureInfo", ConvertToV8(isolate, dict));
auto root_local = ConvertToV8(isolate, root);
// Create a persistent reference of the object, so that we can check the
// monitor again when GC collects this object.
auto* tex_persistent = monitor->CreatePersistent(isolate, root_local);
// Create a weak persistent that tracks the release function rather than the
// texture object. The release function holds a raw pointer to |monitor| via
// its v8::External data, so |monitor| must outlive it. Since the texture
// keeps |release| alive via its property, this also covers the case where
// the texture itself is leaked without calling release().
auto* tex_persistent = monitor->CreatePersistent(isolate, releaser);
tex_persistent->SetWeak(
monitor,
[](const v8::WeakCallbackInfo<OffscreenReleaseHolderMonitor>& data) {

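The lifetime pattern in the patch above — weakly observing the release function rather than the texture object that owns it — can be mimicked with Python weakrefs: register a finalizer on the releaser, and since the texture is the only thing keeping the releaser alive, collection of either one triggers the cleanup. All names here are illustrative, not the Electron API:

```python
import gc
import weakref


class Monitor:
    """Stands in for the native monitor that must reclaim the texture."""

    def __init__(self):
        self.persistent_cleared = False

    def on_collected(self):
        self.persistent_cleared = True


monitor = Monitor()


def make_texture(monitor):
    def release():
        pass  # would hand the GPU texture back to the monitor

    texture = {'release': release}  # the texture keeps release alive
    # Watch the releaser, not the texture: the callback fires once
    # nothing (the texture included) references the release closure.
    weakref.finalize(release, monitor.on_collected)
    return texture


tex = make_texture(monitor)
del tex      # dropping the texture drops the only reference to release
gc.collect()
print(monitor.persistent_cleared)  # True
```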
View File

@@ -274,10 +274,6 @@ inline constexpr base::cstring_view kAppPath = "app-path";
// The command line switch versions of the options.
inline constexpr base::cstring_view kScrollBounce = "scroll-bounce";
// Command switch passed to renderer process to control nodeIntegration.
inline constexpr base::cstring_view kNodeIntegrationInWorker =
"node-integration-in-worker";
// Widevine options
// Path to Widevine CDM binaries.
inline constexpr base::cstring_view kWidevineCdmPath = "widevine-cdm-path";

View File

@@ -18,7 +18,7 @@
#include "shell/common/node_bindings.h"
#include "shell/common/node_includes.h"
#include "shell/common/node_util.h"
#include "shell/common/options_switches.h"
#include "shell/common/v8_util.h"
#include "shell/renderer/electron_render_frame_observer.h"
#include "shell/renderer/web_worker_observer.h"
#include "third_party/blink/public/common/web_preferences/web_preferences.h"
@@ -26,6 +26,8 @@
#include "third_party/blink/public/web/web_local_frame.h"
#include "third_party/blink/renderer/core/execution_context/execution_context.h" // nogncheck
#include "third_party/blink/renderer/core/frame/web_local_frame_impl.h" // nogncheck
#include "third_party/blink/renderer/core/workers/worker_global_scope.h" // nogncheck
#include "third_party/blink/renderer/core/workers/worker_settings.h" // nogncheck
#if BUILDFLAG(IS_LINUX) && (defined(ARCH_CPU_X86_64) || defined(ARCH_CPU_ARM64))
#define ENABLE_WEB_ASSEMBLY_TRAP_HANDLER_LINUX
@@ -207,44 +209,54 @@ void ElectronRendererClient::WillReleaseScriptContext(
electron_bindings_->EnvironmentDestroyed(env);
}
void ElectronRendererClient::WorkerScriptReadyForEvaluationOnWorkerThread(
v8::Local<v8::Context> context) {
namespace {
bool WorkerHasNodeIntegration(blink::ExecutionContext* ec) {
// We do not create a Node.js environment in service or shared workers
// owing to an inability to customize sandbox policies in these workers
// given that they're run out-of-process.
// Also avoid creating a Node.js environment for worklet global scope
// created on the main thread.
auto* ec = blink::ExecutionContext::From(context);
if (ec->IsServiceWorkerGlobalScope() || ec->IsSharedWorkerGlobalScope() ||
ec->IsMainThreadWorkletGlobalScope())
return false;
auto* wgs = blink::DynamicTo<blink::WorkerGlobalScope>(ec);
if (!wgs)
return false;
// Read the nodeIntegrationInWorker preference from the worker's settings,
// which were copied from the initiating frame's WebPreferences at worker
// creation time. This ensures that in-process child windows with different
// webPreferences get the correct per-frame value rather than a process-wide
// value.
auto* worker_settings = wgs->GetWorkerSettings();
return worker_settings && worker_settings->NodeIntegrationInWorker();
}
} // namespace
void ElectronRendererClient::WorkerScriptReadyForEvaluationOnWorkerThread(
v8::Local<v8::Context> context) {
auto* ec = blink::ExecutionContext::From(context);
if (!WorkerHasNodeIntegration(ec))
return;
// This won't be correct for in-process child windows with webPreferences
// that have a different value for nodeIntegrationInWorker
if (base::CommandLine::ForCurrentProcess()->HasSwitch(
switches::kNodeIntegrationInWorker)) {
auto* current = WebWorkerObserver::GetCurrent();
if (current)
return;
WebWorkerObserver::Create()->WorkerScriptReadyForEvaluation(context);
}
auto* current = WebWorkerObserver::GetCurrent();
if (current)
return;
WebWorkerObserver::Create()->WorkerScriptReadyForEvaluation(context);
}
void ElectronRendererClient::WillDestroyWorkerContextOnWorkerThread(
v8::Local<v8::Context> context) {
auto* ec = blink::ExecutionContext::From(context);
if (ec->IsServiceWorkerGlobalScope() || ec->IsSharedWorkerGlobalScope() ||
ec->IsMainThreadWorkletGlobalScope())
if (!WorkerHasNodeIntegration(ec))
return;
// TODO(loc): Note that this will not be correct for in-process child windows
// with webPreferences that have a different value for nodeIntegrationInWorker
if (base::CommandLine::ForCurrentProcess()->HasSwitch(
switches::kNodeIntegrationInWorker)) {
auto* current = WebWorkerObserver::GetCurrent();
if (current)
current->ContextWillDestroy(context);
}
auto* current = WebWorkerObserver::GetCurrent();
if (current)
current->ContextWillDestroy(context);
}
void ElectronRendererClient::SetUpWebAssemblyTrapHandler() {

View File

@@ -6853,6 +6853,54 @@ describe('BrowserWindow module', () => {
expect(w.webContents.frameRate).to.equal(30);
});
});
describe('shared texture', () => {
const v8Util = process._linkedBinding('electron_common_v8_util');
it('does not crash when release() is called after the texture is garbage collected', async () => {
const sw = new BrowserWindow({
width: 100,
height: 100,
show: false,
webPreferences: {
backgroundThrottling: false,
offscreen: {
useSharedTexture: true
}
}
});
const paint = once(sw.webContents, 'paint') as Promise<[any, Electron.Rectangle, Electron.NativeImage]>;
sw.loadFile(path.join(fixtures, 'api', 'offscreen-rendering.html'));
const [event] = await paint;
sw.webContents.stopPainting();
if (!event.texture) {
// GPU shared texture not available on this host; skip.
sw.destroy();
return;
}
// Keep only the release closure and drop the owning texture object.
const staleRelease = event.texture.release;
const weakTexture = new WeakRef(event.texture);
event.texture = undefined;
// Force GC until the texture object is collected.
let collected = false;
for (let i = 0; i < 30 && !collected; ++i) {
await setTimeout();
v8Util.requestGarbageCollectionForTesting();
collected = weakTexture.deref() === undefined;
}
expect(collected).to.be.true('texture should be garbage collected');
// This should return safely and not crash the main process.
expect(() => staleRelease()).to.not.throw();
sw.destroy();
});
});
});
describe('"transparent" option', () => {

View File

@@ -132,6 +132,36 @@ ifdescribe(!(['arm', 'arm64'].includes(process.arch)) || (process.platform !== '
});
});
describe('getTraceBufferUsage', function () {
this.timeout(10e3);
it('does not crash and returns valid usage data', async () => {
await app.whenReady();
await contentTracing.startRecording({
categoryFilter: '*',
traceOptions: 'record-until-full'
});
// Yield to the event loop so the JS HandleScope from this tick is gone.
// When the Mojo response arrives it fires OnTraceBufferUsageAvailable
// as a plain Chromium task — if that callback lacks its own HandleScope
// the process will crash with "Cannot create a handle without a HandleScope".
const result = await contentTracing.getTraceBufferUsage();
expect(result).to.have.property('percentage').that.is.a('number');
expect(result).to.have.property('value').that.is.a('number');
await contentTracing.stopRecording();
});
it('returns zero usage when no trace is active', async () => {
await app.whenReady();
const result = await contentTracing.getTraceBufferUsage();
expect(result).to.have.property('percentage').that.is.a('number');
expect(result.percentage).to.equal(0);
});
});
describe('captured events', () => {
it('include V8 samples from the main process', async function () {
this.timeout(60000);

View File

@@ -1764,6 +1764,60 @@ describe('session module', () => {
    expect(handlerDetails!.isMainFrame).to.be.false();
    expect(handlerDetails!.embeddingOrigin).to.equal('file:///');
  });

  it('provides iframe origin as requestingOrigin for media check from cross-origin subFrame', async () => {
    const w = new BrowserWindow({
      show: false,
      webPreferences: {
        partition: 'very-temp-permission-handler-media'
      }
    });
    const ses = w.webContents.session;
    const iframeUrl = 'https://myfakesite/';
    let capturedOrigin: string | undefined;
    let capturedIsMainFrame: boolean | undefined;
    let capturedRequestingUrl: string | undefined;
    let capturedSecurityOrigin: string | undefined;
    ses.protocol.interceptStringProtocol('https', (req, cb) => {
      cb('<html><body>iframe</body></html>');
    });
    ses.setPermissionCheckHandler((wc, permission, requestingOrigin, details) => {
      if (permission === 'media') {
        capturedOrigin = requestingOrigin;
        capturedIsMainFrame = details.isMainFrame;
        capturedRequestingUrl = details.requestingUrl;
        capturedSecurityOrigin = (details as any).securityOrigin;
      }
      return false;
    });
    try {
      await w.loadFile(path.join(fixtures, 'api', 'blank.html'));
      w.webContents.executeJavaScript(`
        var iframe = document.createElement('iframe');
        iframe.src = '${iframeUrl}';
        iframe.allow = 'camera; microphone';
        document.body.appendChild(iframe);
        null;
      `);
      const [,, frameProcessId, frameRoutingId] = await once(w.webContents, 'did-frame-finish-load');
      const frame = webFrameMain.fromId(frameProcessId, frameRoutingId)!;
      await frame.executeJavaScript(
        'navigator.mediaDevices.enumerateDevices().then(() => {}).catch(() => {});',
        true
      );
      expect(capturedOrigin).to.equal(iframeUrl);
      expect(capturedIsMainFrame).to.be.false();
      expect(capturedRequestingUrl).to.equal(iframeUrl);
      expect(capturedSecurityOrigin).to.equal(iframeUrl);
    } finally {
      ses.protocol.uninterceptProtocol('https');
      ses.setPermissionCheckHandler(null);
    }
  });
});

describe('ses.isPersistent()', () => {


@@ -1372,6 +1372,89 @@ describe('chromium features', () => {
    expect(data).to.equal('object function object function');
  });

  it('Worker does not have node integration when nodeIntegrationInWorker is disabled via setWindowOpenHandler', async () => {
    const w = new BrowserWindow({
      show: false,
      webPreferences: {
        nodeIntegration: true,
        nodeIntegrationInWorker: true,
        contextIsolation: false
      }
    });
    w.webContents.setWindowOpenHandler(() => ({
      action: 'allow',
      overrideBrowserWindowOptions: {
        show: false,
        webPreferences: {
          nodeIntegration: false,
          nodeIntegrationInWorker: false,
          contextIsolation: true
        }
      }
    }));
    await w.loadURL(`file://${fixturesPath}/pages/blank.html`);
    const childCreated = once(app, 'browser-window-created') as Promise<[any, BrowserWindow]>;
    w.webContents.executeJavaScript(`window.open(${JSON.stringify(`file://${fixturesPath}/pages/blank.html`)}); void 0;`);
    const [, child] = await childCreated;
    await once(child.webContents, 'did-finish-load');
    const data = await child.webContents.executeJavaScript(`
      const worker = new Worker('../workers/worker_node.js');
      new Promise((resolve) => { worker.onmessage = e => resolve(e.data); })
    `);
    expect(data).to.equal('undefined undefined undefined undefined');
  });

  it('Worker has node integration when nodeIntegrationInWorker is enabled via setWindowOpenHandler', async () => {
    const w = new BrowserWindow({
      show: false,
      webPreferences: {
        nodeIntegration: true,
        nodeIntegrationInWorker: false,
        contextIsolation: false
      }
    });
    w.webContents.setWindowOpenHandler(() => ({
      action: 'allow',
      overrideBrowserWindowOptions: {
        show: false,
        webPreferences: {
          nodeIntegration: true,
          nodeIntegrationInWorker: true,
          contextIsolation: false
        }
      }
    }));
    await w.loadURL(`file://${fixturesPath}/pages/blank.html`);
    // Parent's workers should NOT have node integration.
    const parentData = await w.webContents.executeJavaScript(`
      new Promise((resolve) => {
        const worker = new Worker('../workers/worker_node.js');
        worker.onmessage = e => resolve(e.data);
      })
    `);
    expect(parentData).to.equal('undefined undefined undefined undefined');
    const childCreated = once(app, 'browser-window-created') as Promise<[any, BrowserWindow]>;
    w.webContents.executeJavaScript(`window.open(${JSON.stringify(`file://${fixturesPath}/pages/blank.html`)}); void 0;`);
    const [, child] = await childCreated;
    await once(child.webContents, 'did-finish-load');
    // Child's workers should have node integration.
    const childData = await child.webContents.executeJavaScript(`
      new Promise((resolve) => {
        const worker = new Worker('../workers/worker_node.js');
        worker.onmessage = e => resolve(e.data);
      })
    `);
    expect(childData).to.equal('object function object function');
  });

  it('Worker has access to fetch-dependent interfaces with nodeIntegrationInWorker', async () => {
    const w = new BrowserWindow({
      show: false,


@@ -0,0 +1,7 @@
<html>
  <body>
    <script>
      window.open('javascript:alert()');
    </script>
  </body>
</html>


@@ -0,0 +1,22 @@
const { app, BrowserWindow } = require('electron');

process.on('uncaughtException', (err) => {
  console.error(err);
  process.exit(1);
});

process.on('unhandledRejection', (reason) => {
  console.error(reason);
  process.exit(1);
});

app.on('browser-window-created', (_, window) => {
  window.webContents.once('did-frame-navigate', () => {
    process.exit(0);
  });
});

app.whenReady().then(() => {
  const win = new BrowserWindow({ show: false });
  win.loadFile('index.html');
});


@@ -186,6 +186,39 @@ describe('webContents.setWindowOpenHandler', () => {
    await once(browserWindow.webContents, 'did-create-window');
  });

  it('reuses an existing window when window.open is called with the same frame name', async () => {
    let handlerCallCount = 0;
    browserWindow.webContents.setWindowOpenHandler(() => {
      handlerCallCount++;
      return { action: 'allow' };
    });
    const didCreateWindow = once(browserWindow.webContents, 'did-create-window') as Promise<[BrowserWindow, Electron.DidCreateWindowDetails]>;
    await browserWindow.webContents.executeJavaScript("window.open('about:blank?one', 'named-target', 'show=no') && true");
    const [childWindow] = await didCreateWindow;
    expect(handlerCallCount).to.equal(1);
    expect(childWindow.webContents.getURL()).to.equal('about:blank?one');
    browserWindow.webContents.on('did-create-window', () => {
      assert.fail('did-create-window should not fire when reusing a named window');
    });
    const didNavigate = once(childWindow.webContents, 'did-navigate');
    const sameWindow = await browserWindow.webContents.executeJavaScript(`
      (() => {
        const first = window.open('about:blank?one', 'named-target', 'show=no');
        const second = window.open('about:blank?two', 'named-target', 'show=no');
        return first === second;
      })()
    `);
    await didNavigate;
    expect(sameWindow).to.be.true('window.open with matching frame name should return the same window proxy');
    expect(handlerCallCount).to.equal(1, 'setWindowOpenHandler should not be called when Blink resolves the named target');
    expect(childWindow.webContents.getURL()).to.equal('about:blank?two');
    expect(BrowserWindow.getAllWindows()).to.have.lengthOf(2);
  });

  it('can change webPreferences of child windows', async () => {
    browserWindow.webContents.setWindowOpenHandler(() => ({ action: 'allow', overrideBrowserWindowOptions: { webPreferences: { defaultFontSize: 30 } } }));

yarn.lock

@@ -446,6 +446,7 @@ __metadata:
    "@electron/typescript-definitions": "npm:^9.1.2"
    "@octokit/rest": "npm:^20.1.2"
    "@primer/octicons": "npm:^10.0.0"
    "@sentry/cli": "npm:1.72.0"
    "@types/minimist": "npm:^1.2.5"
    "@types/node": "npm:^22.7.7"
    "@types/semver": "npm:^7.5.8"
@@ -492,6 +493,8 @@ __metadata:
    wrapper-webpack-plugin: "npm:^2.2.0"
    yaml: "npm:^2.8.1"
  dependenciesMeta:
    "@sentry/cli":
      built: true
    abstract-socket:
      built: true
  languageName: unknown
@@ -1503,6 +1506,22 @@ __metadata:
  languageName: node
  linkType: hard

"@sentry/cli@npm:1.72.0":
  version: 1.72.0
  resolution: "@sentry/cli@npm:1.72.0"
  dependencies:
    https-proxy-agent: "npm:^5.0.0"
    mkdirp: "npm:^0.5.5"
    node-fetch: "npm:^2.6.0"
    npmlog: "npm:^4.1.2"
    progress: "npm:^2.0.3"
    proxy-from-env: "npm:^1.1.0"
  bin:
    sentry-cli: bin/sentry-cli
  checksum: 10c0/ef850dc9938c009dec485224222c272c1765ee59da04ef0c334de214cf79afe49a456671f465f98c9b48ff4dfa8738f92e7d9988dea0df0e318fba6969e4c0a7
  languageName: node
  linkType: hard

"@sindresorhus/is@npm:^4.0.0":
  version: 4.6.0
  resolution: "@sindresorhus/is@npm:4.6.0"
@@ -2671,6 +2690,13 @@ __metadata:
  languageName: node
  linkType: hard

"ansi-regex@npm:^2.0.0":
  version: 2.1.1
  resolution: "ansi-regex@npm:2.1.1"
  checksum: 10c0/78cebaf50bce2cb96341a7230adf28d804611da3ce6bf338efa7b72f06cc6ff648e29f80cd95e582617ba58d5fdbec38abfeed3500a98bce8381a9daec7c548b
  languageName: node
  linkType: hard

"ansi-regex@npm:^3.0.0":
  version: 3.0.1
  resolution: "ansi-regex@npm:3.0.1"
@@ -2738,6 +2764,13 @@ __metadata:
  languageName: node
  linkType: hard

"aproba@npm:^1.0.3":
  version: 1.2.0
  resolution: "aproba@npm:1.2.0"
  checksum: 10c0/2d34f008c9edfa991f42fe4b667d541d38a474a39ae0e24805350486d76744cd91ee45313283c1d39a055b14026dd0fc4d0cbfc13f210855d59d7e8b5a61dc51
  languageName: node
  linkType: hard

"aproba@npm:^1.0.3 || ^2.0.0":
  version: 2.1.0
  resolution: "aproba@npm:2.1.0"
@@ -2755,6 +2788,16 @@ __metadata:
  languageName: node
  linkType: hard

"are-we-there-yet@npm:~1.1.2":
  version: 1.1.7
  resolution: "are-we-there-yet@npm:1.1.7"
  dependencies:
    delegates: "npm:^1.0.0"
    readable-stream: "npm:^2.0.6"
  checksum: 10c0/03cb45f2892767773c86a616205fc67feb8dfdd56685d1b34999cfa6c0d2aebe73ec0e6ba88a406422b998dea24138337fdb9a3f9b172d7c2a7f75d02f3df088
  languageName: node
  linkType: hard

"argparse@npm:^1.0.7":
  version: 1.0.10
  resolution: "argparse@npm:1.0.10"
@@ -3822,6 +3865,13 @@ __metadata:
  languageName: node
  linkType: hard

"code-point-at@npm:^1.0.0":
  version: 1.1.0
  resolution: "code-point-at@npm:1.1.0"
  checksum: 10c0/33f6b234084e46e6e369b6f0b07949392651b4dde70fc6a592a8d3dafa08d5bb32e3981a02f31f6fc323a26bc03a4c063a9d56834848695bda7611c2417ea2e6
  languageName: node
  linkType: hard

"coffeescript@npm:^2.4.1":
  version: 2.7.0
  resolution: "coffeescript@npm:2.7.0"
@@ -3981,7 +4031,7 @@ __metadata:
  languageName: node
  linkType: hard

"console-control-strings@npm:^1.0.0, console-control-strings@npm:^1.1.0":
"console-control-strings@npm:^1.0.0, console-control-strings@npm:^1.1.0, console-control-strings@npm:~1.1.0":
  version: 1.1.0
  resolution: "console-control-strings@npm:1.1.0"
  checksum: 10c0/7ab51d30b52d461412cd467721bb82afe695da78fff8f29fe6f6b9cbaac9a2328e27a22a966014df9532100f6dd85370460be8130b9c677891ba36d96a343f50
@@ -6236,6 +6286,22 @@ __metadata:
  languageName: node
  linkType: hard

"gauge@npm:~2.7.3":
  version: 2.7.4
  resolution: "gauge@npm:2.7.4"
  dependencies:
    aproba: "npm:^1.0.3"
    console-control-strings: "npm:^1.0.0"
    has-unicode: "npm:^2.0.0"
    object-assign: "npm:^4.1.0"
    signal-exit: "npm:^3.0.0"
    string-width: "npm:^1.0.1"
    strip-ansi: "npm:^3.0.1"
    wide-align: "npm:^1.1.0"
  checksum: 10c0/d606346e2e47829e0bc855d0becb36c4ce492feabd61ae92884b89e07812dd8a67a860ca30ece3a4c2e9f2c73bd68ba2b8e558ed362432ffd86de83c08847f84
  languageName: node
  linkType: hard

"get-caller-file@npm:^2.0.5":
  version: 2.0.5
  resolution: "get-caller-file@npm:2.0.5"
@@ -6710,7 +6776,7 @@ __metadata:
  languageName: node
  linkType: hard

"has-unicode@npm:^2.0.1":
"has-unicode@npm:^2.0.0, has-unicode@npm:^2.0.1":
  version: 2.0.1
  resolution: "has-unicode@npm:2.0.1"
  checksum: 10c0/ebdb2f4895c26bb08a8a100b62d362e49b2190bcfd84b76bc4be1a3bd4d254ec52d0dd9f2fbcc093fc5eb878b20c52146f9dfd33e2686ed28982187be593b47c
@@ -7374,6 +7440,15 @@ __metadata:
  languageName: node
  linkType: hard

"is-fullwidth-code-point@npm:^1.0.0":
  version: 1.0.0
  resolution: "is-fullwidth-code-point@npm:1.0.0"
  dependencies:
    number-is-nan: "npm:^1.0.0"
  checksum: 10c0/12acfcf16142f2d431bf6af25d68569d3198e81b9799b4ae41058247aafcc666b0127d64384ea28e67a746372611fcbe9b802f69175287aba466da3eddd5ba0f
  languageName: node
  linkType: hard

"is-fullwidth-code-point@npm:^3.0.0":
  version: 3.0.0
  resolution: "is-fullwidth-code-point@npm:3.0.0"
@@ -9308,16 +9383,7 @@ __metadata:
  languageName: node
  linkType: hard

"mkdirp@npm:^1.0.3":
  version: 1.0.4
  resolution: "mkdirp@npm:1.0.4"
  bin:
    mkdirp: bin/cmd.js
  checksum: 10c0/46ea0f3ffa8bc6a5bc0c7081ffc3907777f0ed6516888d40a518c5111f8366d97d2678911ad1a6882bf592fa9de6c784fea32e1687bb94e1f4944170af48a5cf
  languageName: node
  linkType: hard

"mkdirp@npm:~0.5.1":
"mkdirp@npm:^0.5.5, mkdirp@npm:~0.5.1":
  version: 0.5.6
  resolution: "mkdirp@npm:0.5.6"
  dependencies:
@@ -9328,6 +9394,15 @@ __metadata:
  languageName: node
  linkType: hard

"mkdirp@npm:^1.0.3":
  version: 1.0.4
  resolution: "mkdirp@npm:1.0.4"
  bin:
    mkdirp: bin/cmd.js
  checksum: 10c0/46ea0f3ffa8bc6a5bc0c7081ffc3907777f0ed6516888d40a518c5111f8366d97d2678911ad1a6882bf592fa9de6c784fea32e1687bb94e1f4944170af48a5cf
  languageName: node
  linkType: hard

"mocha-junit-reporter@npm:^1.18.0":
  version: 1.23.3
  resolution: "mocha-junit-reporter@npm:1.23.3"
@@ -9480,6 +9555,20 @@ __metadata:
  languageName: node
  linkType: hard

"node-fetch@npm:^2.6.0":
  version: 2.7.0
  resolution: "node-fetch@npm:2.7.0"
  dependencies:
    whatwg-url: "npm:^5.0.0"
  peerDependencies:
    encoding: ^0.1.0
  peerDependenciesMeta:
    encoding:
      optional: true
  checksum: 10c0/b55786b6028208e6fbe594ccccc213cab67a72899c9234eb59dba51062a299ea853210fcf526998eaa2867b0963ad72338824450905679ff0fa304b8c5093ae8
  languageName: node
  linkType: hard

"node-fetch@npm:^2.6.1":
  version: 2.6.8
  resolution: "node-fetch@npm:2.6.8"
@@ -9645,6 +9734,18 @@ __metadata:
  languageName: node
  linkType: hard

"npmlog@npm:^4.1.2":
  version: 4.1.2
  resolution: "npmlog@npm:4.1.2"
  dependencies:
    are-we-there-yet: "npm:~1.1.2"
    console-control-strings: "npm:~1.1.0"
    gauge: "npm:~2.7.3"
    set-blocking: "npm:~2.0.0"
  checksum: 10c0/d6a26cb362277c65e24a70ebdaff31f81184ceb5415fd748abaaf26417bf0794a17ba849116e4f454a0370b9067ae320834cc78d74527dbeadf6e9d19a959046
  languageName: node
  linkType: hard

"npmlog@npm:^5.0.1":
  version: 5.0.1
  resolution: "npmlog@npm:5.0.1"
@@ -9669,7 +9770,14 @@ __metadata:
  languageName: node
  linkType: hard

"object-assign@npm:^4.1.1":
"number-is-nan@npm:^1.0.0":
  version: 1.0.1
  resolution: "number-is-nan@npm:1.0.1"
  checksum: 10c0/cb97149006acc5cd512c13c1838223abdf202e76ddfa059c5e8e7507aff2c3a78cd19057516885a2f6f5b576543dc4f7b6f3c997cc7df53ae26c260855466df5
  languageName: node
  linkType: hard

"object-assign@npm:^4.1.0, object-assign@npm:^4.1.1":
  version: 4.1.1
  resolution: "object-assign@npm:4.1.1"
  checksum: 10c0/1f4df9945120325d041ccf7b86f31e8bcc14e73d29171e37a7903050e96b81323784ec59f93f102ec635bcf6fa8034ba3ea0a8c7e69fa202b87ae3b6cec5a414
@@ -10774,6 +10882,21 @@ __metadata:
  languageName: node
  linkType: hard

"readable-stream@npm:^2.0.6":
  version: 2.3.8
  resolution: "readable-stream@npm:2.3.8"
  dependencies:
    core-util-is: "npm:~1.0.0"
    inherits: "npm:~2.0.3"
    isarray: "npm:~1.0.0"
    process-nextick-args: "npm:~2.0.0"
    safe-buffer: "npm:~5.1.1"
    string_decoder: "npm:~1.1.1"
    util-deprecate: "npm:~1.0.1"
  checksum: 10c0/7efdb01f3853bc35ac62ea25493567bf588773213f5f4a79f9c365e1ad13bab845ac0dae7bc946270dc40c3929483228415e92a3fc600cc7e4548992f41ee3fa
  languageName: node
  linkType: hard

"readable-stream@npm:^3.0.2":
  version: 3.6.0
  resolution: "readable-stream@npm:3.6.0"
@@ -12102,7 +12225,7 @@ __metadata:
  languageName: node
  linkType: hard

"set-blocking@npm:^2.0.0":
"set-blocking@npm:^2.0.0, set-blocking@npm:~2.0.0":
  version: 2.0.0
  resolution: "set-blocking@npm:2.0.0"
  checksum: 10c0/9f8c1b2d800800d0b589de1477c753492de5c1548d4ade52f57f1d1f5e04af5481554d75ce5e5c43d4004b80a3eb714398d6907027dc0534177b7539119f4454
@@ -12621,6 +12744,17 @@ __metadata:
  languageName: node
  linkType: hard

"string-width@npm:^1.0.1":
  version: 1.0.2
  resolution: "string-width@npm:1.0.2"
  dependencies:
    code-point-at: "npm:^1.0.0"
    is-fullwidth-code-point: "npm:^1.0.0"
    strip-ansi: "npm:^3.0.0"
  checksum: 10c0/c558438baed23a9ab9370bb6a939acbdb2b2ffc517838d651aad0f5b2b674fb85d460d9b1d0b6a4c210dffd09e3235222d89a5bd4c0c1587f78b2bb7bc00c65e
  languageName: node
  linkType: hard

"string-width@npm:^4.1.0":
  version: 4.2.0
  resolution: "string-width@npm:4.2.0"
@@ -12812,6 +12946,15 @@ __metadata:
  languageName: node
  linkType: hard

"strip-ansi@npm:^3.0.0, strip-ansi@npm:^3.0.1":
  version: 3.0.1
  resolution: "strip-ansi@npm:3.0.1"
  dependencies:
    ansi-regex: "npm:^2.0.0"
  checksum: 10c0/f6e7fbe8e700105dccf7102eae20e4f03477537c74b286fd22cfc970f139002ed6f0d9c10d0e21aa9ed9245e0fa3c9275930e8795c5b947da136e4ecb644a70f
  languageName: node
  linkType: hard

"strip-ansi@npm:^4.0.0":
  version: 4.0.0
  resolution: "strip-ansi@npm:4.0.0"
@@ -14266,7 +14409,7 @@ __metadata:
  languageName: node
  linkType: hard

"wide-align@npm:^1.1.2":
"wide-align@npm:^1.1.0, wide-align@npm:^1.1.2":
  version: 1.1.5
  resolution: "wide-align@npm:1.1.5"
  dependencies: