feat: add sharedTexture module to import shared texture (#47317)
feat: add `sharedTexture` module.
BUILD.gn (+1)
@@ -480,6 +480,7 @@ source_set("electron_lib") {
     "//device/bluetooth",
     "//device/bluetooth/public/cpp",
     "//gin",
+    "//gpu/ipc/client",
     "//media/capture/mojom:video_capture",
     "//media/mojo/mojom",
     "//media/mojo/mojom:web_speech_recognition",
docs/api/shared-texture.md (new file, +58)
@@ -0,0 +1,58 @@
# sharedTexture

> Import shared textures into Electron and convert platform-specific handles into a [`VideoFrame`](https://developer.mozilla.org/en-US/docs/Web/API/VideoFrame). The resulting frames work with all web rendering APIs and can be transferred across Electron processes. Read the [module README](https://github.com/electron/electron/blob/main/shell/common/api/shared_texture/README.md) for more information.

Process: [Main](../glossary.md#main-process), [Renderer](../glossary.md#renderer-process)

## Methods

The `sharedTexture` module has the following methods:

**Note:** Experimental APIs are marked as such and could be removed in the future.

### `sharedTexture.importSharedTexture(options)` _Experimental_

* `options` Object - Options for importing a shared texture.
  * `textureInfo` [SharedTextureImportTextureInfo](structures/shared-texture-import-texture-info.md) - The information of the shared texture to import.
  * `allReferencesReleased` Function (optional) - Called when all references in all processes have been released. You should keep the imported texture valid until this callback is called.

Imports a shared texture described by the given options.

> [!NOTE]
> This method is only available in the main process.

Returns `SharedTextureImported` - The imported shared texture.
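
For example, a minimal main-process sketch of this flow. The texture info (in particular the platform handle inside `textureInfo.handle`) is assumed to come from your own native capture or rendering pipeline; `getTextureInfoFromNativeCapture()` and `destroyNativeTexture()` are hypothetical helpers.

```ts
// Main process — a sketch, not a complete pipeline.
import { sharedTexture } from 'electron'

// Hypothetical: builds a SharedTextureImportTextureInfo from native code.
const textureInfo = getTextureInfoFromNativeCapture()

const imported = sharedTexture.importSharedTexture({
  textureInfo,
  allReferencesReleased: () => {
    // Every process has released its reference; the native texture backing
    // `textureInfo.handle` can be destroyed now.
    destroyNativeTexture() // hypothetical
  }
})

// ...hand `imported` to a renderer with sendSharedTexture(), then drop the
// main process's own reference once it is no longer needed:
imported.release()
```
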
### `sharedTexture.sendSharedTexture(options, ...args)` _Experimental_

* `options` Object - Options for sending a shared texture.
  * `frame` [WebFrameMain](web-frame-main.md) - The target frame to transfer the shared texture to. For a `WebContents`, you can pass `webContents.mainFrame`. If you pass a `WebFrameMain` that is not a main frame, you need to enable `webPreferences.nodeIntegrationInSubFrames`, since this feature requires [IPC](https://www.electronjs.org/docs/latest/api/web-frame-main#frameipc-readonly) between the main process and the frame.
  * `importedSharedTexture` [SharedTextureImported](structures/shared-texture-imported.md) - The imported shared texture.
* `...args` any[] - Additional arguments to pass to the renderer process.

Sends the imported shared texture to a renderer process. You must register a receiver in the renderer process before calling this method. The call times out after 1000 ms, so ensure the receiver is set and the renderer process is alive before calling it.

> [!NOTE]
> This method is only available in the main process.

Returns `Promise<void>` - Resolves when the transfer is complete.

### `sharedTexture.setSharedTextureReceiver(callback)` _Experimental_

* `callback` Function\<Promise\<void\>\> - The function that receives the imported shared texture.
  * `receivedSharedTextureData` Object - The data received from the main process.
    * `importedSharedTexture` [SharedTextureImported](structures/shared-texture-imported.md) - The imported shared texture.
  * `...args` any[] - Additional arguments passed from the main process.

Sets a callback to receive imported shared textures from the main process. A usage sketch for the send/receive pair is shown below.

> [!NOTE]
> This method is only available in the renderer process.
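
A minimal sketch of the send/receive pair, assuming a `BrowserWindow` named `win` and an `imported` texture obtained from `importSharedTexture` as shown earlier. The `'first-frame'` label is an arbitrary extra argument.

```ts
// Main process — run inside an async function.
await sharedTexture.sendSharedTexture(
  { frame: win.webContents.mainFrame, importedSharedTexture: imported },
  'first-frame' // extra args are forwarded to the receiver
)
```

```ts
// Renderer process (preload or a renderer with Node integration) — register
// the receiver before the main process sends anything.
import { sharedTexture } from 'electron'

sharedTexture.setSharedTextureReceiver(async ({ importedSharedTexture }, label) => {
  const frame = importedSharedTexture.getVideoFrame()
  // ...render the VideoFrame (for example with canvas drawImage), then clean up:
  frame.close()
  importedSharedTexture.release()
})
```
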
## Properties

The `sharedTexture` module has the following properties:

### `sharedTexture.subtle` _Experimental_

A [`SharedTextureSubtle`](structures/shared-texture-subtle.md) object that provides the subtle (lower-level) APIs for interacting with shared textures, intended for advanced users.
docs/api/structures/shared-texture-import-texture-info.md (new file, +11)
@@ -0,0 +1,11 @@
# SharedTextureImportTextureInfo Object

* `pixelFormat` string - The pixel format of the texture.
  * `bgra` - 32bpp BGRA (byte-order), 1 plane.
  * `rgba` - 32bpp RGBA (byte-order), 1 plane.
  * `rgbaf16` - Half-float RGBA, 1 plane.
* `colorSpace` [ColorSpace](color-space.md) (optional) - The color space of the texture.
* `codedSize` [Size](size.md) - The full dimensions of the shared texture.
* `visibleRect` [Rectangle](rectangle.md) (optional) - A subsection of `[0, 0, codedSize.width, codedSize.height]`. In the common case it covers the full coded size.
* `timestamp` number (optional) - A timestamp in microseconds that is reflected in the resulting `VideoFrame`.
* `handle` [SharedTextureHandle](shared-texture-handle.md) - The shared texture handle.
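
As a sketch, a Windows-flavored `textureInfo` might look like the following. `ntHandleBuffer` is assumed to be a Buffer holding a pointer-sized NT shared handle produced by native code; the exact shape of `handle` is defined by [SharedTextureHandle](shared-texture-handle.md) (macOS uses `ioSurface`, Linux `nativePixmap`).

```ts
const textureInfo = {
  pixelFormat: 'bgra',
  codedSize: { width: 1920, height: 1080 },
  visibleRect: { x: 0, y: 0, width: 1920, height: 1080 },
  timestamp: 0,
  handle: { ntHandle: ntHandleBuffer } // hypothetical Buffer from native code
}
```
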
docs/api/structures/shared-texture-imported-subtle.md (new file, +9)
@@ -0,0 +1,9 @@
# SharedTextureImportedSubtle Object

* `getVideoFrame` Function\<[VideoFrame](https://developer.mozilla.org/en-US/docs/Web/API/VideoFrame)\> - Creates a `VideoFrame` that uses the imported shared texture in the current process. You can call `VideoFrame.close()` once you've finished using the object; the underlying resources wait for the GPU to finish internally.
* `release` Function - Releases the resources. If you transferred the texture and obtained multiple `SharedTextureImported` objects, you have to `release` every one of them; the resource in the GPU process is destroyed when the last one is released.
  * `callback` Function (optional) - Called when the GPU command buffer has finished using this shared texture. It provides a precise point at which dependent resources can be released safely. For example, if this object was created by `finishTransferSharedTexture`, you can use this callback to safely release the original object on which `startTransferSharedTexture` was called in another process. You can also safely release the source shared texture that was passed to `importSharedTexture`.
* `startTransferSharedTexture` Function\<[SharedTextureTransfer](shared-texture-transfer.md)\> - Creates a `SharedTextureTransfer` that can be serialized and transferred to other processes.
* `getFrameCreationSyncToken` Function\<[SharedTextureSyncToken](shared-texture-sync-token.md)\> - For advanced users. Typically called after `finishTransferSharedTexture`; pass the returned token to the object on which `startTransferSharedTexture` was called, so that the source object does not release the underlying resource before the target object has asynchronously acquired its reference in the GPU process. A sketch of this handshake is shown after this list.
* `setReleaseSyncToken` Function - For advanced users. If used, this object's underlying resource will not be released until the given sync token has been fulfilled in the GPU process. With sync tokens, users do not need release callbacks for lifetime management.
  * `syncToken` [SharedTextureSyncToken](shared-texture-sync-token.md) - The sync token to set.
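
A sketch of the sync-token handshake described by `getFrameCreationSyncToken` and `setReleaseSyncToken`, assuming you deliver `transfer` and `syncToken` between processes over your own channel (the high-level `sendSharedTexture` / `setSharedTextureReceiver` pair does this for you).

```ts
// Sending process: describe the imported texture for transfer.
const source = sharedTexture.subtle.importSharedTexture(textureInfo)
const transfer = source.startTransferSharedTexture()
// ...deliver `transfer` to the receiving process over your own IPC.

// Receiving process: reconstruct the texture and return the creation token.
const target = sharedTexture.subtle.finishTransferSharedTexture(transfer)
const syncToken = target.getFrameCreationSyncToken()
// ...send `syncToken` back to the sending process.

// Sending process, once `syncToken` arrives: the source can now be released
// without waiting for a release callback.
source.setReleaseSyncToken(syncToken)
source.release()
```
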
docs/api/structures/shared-texture-imported.md (new file, +6)
@@ -0,0 +1,6 @@
# SharedTextureImported Object

* `textureId` string - The unique identifier of the imported shared texture.
* `getVideoFrame` Function\<[VideoFrame](https://developer.mozilla.org/en-US/docs/Web/API/VideoFrame)\> - Creates a `VideoFrame` that uses the imported shared texture in the current process. You can call `VideoFrame.close()` once you've finished using the object; the underlying resources wait for the GPU to finish internally.
* `release` Function - Releases this object's reference to the imported shared texture. The underlying resource stays alive until every reference has been released.
* `subtle` [SharedTextureImportedSubtle](shared-texture-imported-subtle.md) - Provides the subtle (lower-level) APIs to interact with the imported shared texture, intended for advanced users.
docs/api/structures/shared-texture-subtle.md (new file, +6)
@@ -0,0 +1,6 @@
# SharedTextureSubtle Object

* `importSharedTexture` Function\<[SharedTextureImportedSubtle](shared-texture-imported-subtle.md)\> - Imports a shared texture described by the given texture info and returns the imported shared texture.
  * `textureInfo` [SharedTextureImportTextureInfo](shared-texture-import-texture-info.md) - The information of the shared texture to import.
* `finishTransferSharedTexture` Function\<[SharedTextureImportedSubtle](shared-texture-imported-subtle.md)\> - Finishes the transfer of a shared texture and returns the imported shared texture reconstructed from the transfer object.
  * `transfer` [SharedTextureTransfer](shared-texture-transfer.md) - The transfer object of the shared texture.
docs/api/structures/shared-texture-sync-token.md (new file, +3)
@@ -0,0 +1,3 @@
# SharedTextureSyncToken Object

* `syncToken` string - The opaque sync token data.
docs/api/structures/shared-texture-transfer.md (new file, +10)
@@ -0,0 +1,10 @@
# SharedTextureTransfer Object

* `transfer` string _Readonly_ - The opaque transfer data of the shared texture. It can be transferred across Electron processes.
* `syncToken` string _Readonly_ - The opaque sync token data for frame creation.
* `pixelFormat` string _Readonly_ - The pixel format of the texture being transferred.
* `codedSize` [Size](size.md) _Readonly_ - The full dimensions of the shared texture.
* `visibleRect` [Rectangle](rectangle.md) _Readonly_ - A subsection of `[0, 0, codedSize.width, codedSize.height]`. In the common case it covers the full coded size.
* `timestamp` number _Readonly_ - A timestamp in microseconds that is reflected in the resulting `VideoFrame`.

Use `sharedTexture.subtle.finishTransferSharedTexture` to turn this object back into a [`SharedTextureImportedSubtle`](shared-texture-imported-subtle.md).
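
The fields are plain strings, numbers, and simple objects, so a sketch of the round trip could be as simple as the following, assuming any string-capable channel between the two processes:

```ts
// Process A: serialize the transfer for delivery.
const transfer = importedSubtle.startTransferSharedTexture()
const payload = JSON.stringify(transfer)

// Process B: rebuild the texture from the received payload.
const received = JSON.parse(payload)
const importedAgain = sharedTexture.subtle.finishTransferSharedTexture(received)
```
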
@@ -52,6 +52,7 @@ auto_filenames = {
     "docs/api/service-workers.md",
     "docs/api/session.md",
     "docs/api/share-menu.md",
+    "docs/api/shared-texture.md",
     "docs/api/shell.md",
     "docs/api/structures",
     "docs/api/system-preferences.md",
@@ -145,6 +146,12 @@ auto_filenames = {
     "docs/api/structures/shared-dictionary-info.md",
     "docs/api/structures/shared-dictionary-usage-info.md",
     "docs/api/structures/shared-texture-handle.md",
+    "docs/api/structures/shared-texture-import-texture-info.md",
+    "docs/api/structures/shared-texture-imported-subtle.md",
+    "docs/api/structures/shared-texture-imported.md",
+    "docs/api/structures/shared-texture-subtle.md",
+    "docs/api/structures/shared-texture-sync-token.md",
+    "docs/api/structures/shared-texture-transfer.md",
     "docs/api/structures/shared-worker-info.md",
     "docs/api/structures/sharing-item.md",
     "docs/api/structures/shortcut-details.md",
@@ -176,6 +183,7 @@ auto_filenames = {
     "lib/renderer/api/context-bridge.ts",
     "lib/renderer/api/crash-reporter.ts",
     "lib/renderer/api/ipc-renderer.ts",
+    "lib/renderer/api/shared-texture.ts",
     "lib/renderer/api/web-frame.ts",
     "lib/renderer/api/web-utils.ts",
     "lib/renderer/common-init.ts",
@@ -257,6 +265,7 @@ auto_filenames = {
     "lib/browser/api/service-worker-main.ts",
     "lib/browser/api/session.ts",
     "lib/browser/api/share-menu.ts",
+    "lib/browser/api/shared-texture.ts",
     "lib/browser/api/system-preferences.ts",
     "lib/browser/api/touch-bar.ts",
     "lib/browser/api/tray.ts",
@@ -312,6 +321,7 @@ auto_filenames = {
     "lib/renderer/api/exports/electron.ts",
     "lib/renderer/api/ipc-renderer.ts",
     "lib/renderer/api/module-list.ts",
+    "lib/renderer/api/shared-texture.ts",
     "lib/renderer/api/web-frame.ts",
     "lib/renderer/api/web-utils.ts",
     "lib/renderer/common-init.ts",
@@ -352,6 +362,7 @@ auto_filenames = {
     "lib/renderer/api/exports/electron.ts",
     "lib/renderer/api/ipc-renderer.ts",
     "lib/renderer/api/module-list.ts",
+    "lib/renderer/api/shared-texture.ts",
     "lib/renderer/api/web-frame.ts",
     "lib/renderer/api/web-utils.ts",
     "lib/renderer/ipc-renderer-bindings.ts",
@@ -563,6 +563,7 @@ filenames = {
     "shell/common/api/electron_api_native_image.cc",
     "shell/common/api/electron_api_native_image.h",
     "shell/common/api/electron_api_net.cc",
+    "shell/common/api/electron_api_shared_texture.cc",
     "shell/common/api/electron_api_shell.cc",
     "shell/common/api/electron_api_testing.cc",
     "shell/common/api/electron_api_url_loader.cc",
@@ -31,6 +31,7 @@ export const browserModuleList: ElectronInternal.ModuleEntry[] = [
   { name: 'screen', loader: () => require('./screen') },
   { name: 'ServiceWorkerMain', loader: () => require('./service-worker-main') },
   { name: 'session', loader: () => require('./session') },
+  { name: 'sharedTexture', loader: () => require('./shared-texture') },
   { name: 'ShareMenu', loader: () => require('./share-menu') },
   { name: 'systemPreferences', loader: () => require('./system-preferences') },
   { name: 'TouchBar', loader: () => require('./touch-bar') },
lib/browser/api/shared-texture.ts (new file, +191)
@@ -0,0 +1,191 @@
|
||||
import ipcMain from '@electron/internal/browser/api/ipc-main';
|
||||
import * as ipcMainInternalUtils from '@electron/internal/browser/ipc-main-internal-utils';
|
||||
import { IPC_MESSAGES } from '@electron/internal/common/ipc-messages';
|
||||
|
||||
import { randomUUID } from 'crypto';
|
||||
|
||||
const transferTimeout = 1000;
|
||||
const sharedTextureNative = process._linkedBinding('electron_common_shared_texture');
|
||||
const managedSharedTextures = new Map<string, SharedTextureImportedWrapper>();
|
||||
|
||||
type AllReleasedCallback = (imported: Electron.SharedTextureImported) => void;
|
||||
|
||||
type SharedTextureImportedWrapper = {
|
||||
texture: Electron.SharedTextureImported;
|
||||
allReferencesReleased: AllReleasedCallback | undefined;
|
||||
mainReference: boolean;
|
||||
rendererFrameReferences: Map<number, { count: number, reference: Electron.WebFrameMain }>;
|
||||
}
|
||||
|
||||
ipcMain.handle(IPC_MESSAGES.IMPORT_SHARED_TEXTURE_RELEASE_RENDERER_TO_MAIN, (event: Electron.IpcMainInvokeEvent, textureId: string) => {
|
||||
const frameTreeNodeId = event.frameTreeNodeId ?? event.sender.mainFrame.frameTreeNodeId;
|
||||
wrapperReleaseFromRenderer(textureId, frameTreeNodeId);
|
||||
});
|
||||
|
||||
let checkManagedSharedTexturesInterval: NodeJS.Timeout | null = null;
|
||||
|
||||
function scheduleCheckManagedSharedTextures () {
|
||||
if (checkManagedSharedTexturesInterval === null) {
|
||||
checkManagedSharedTexturesInterval = setInterval(checkManagedSharedTextures, 1000);
|
||||
}
|
||||
}
|
||||
|
||||
function unscheduleCheckManagedSharedTextures () {
|
||||
if (checkManagedSharedTexturesInterval !== null) {
|
||||
clearInterval(checkManagedSharedTexturesInterval);
|
||||
checkManagedSharedTexturesInterval = null;
|
||||
}
|
||||
}
|
||||
|
||||
function checkManagedSharedTextures () {
|
||||
const texturesToRemoveTracking = new Set<string>();
|
||||
for (const [, wrapper] of managedSharedTextures) {
|
||||
for (const [frameTreeNodeId, entry] of wrapper.rendererFrameReferences) {
|
||||
const frame = entry.reference;
|
||||
if (!frame || frame.isDestroyed()) {
|
||||
console.error(`The imported shared texture ${wrapper.texture.textureId} is referenced by a destroyed WebContents/WebFrameMain; this means an imported shared texture in a renderer process was not released before the process exited. Releasing that dangling reference now.`);
|
||||
wrapper.rendererFrameReferences.delete(frameTreeNodeId);
|
||||
}
|
||||
}
|
||||
|
||||
if (wrapper.rendererFrameReferences.size === 0 && !wrapper.mainReference) {
|
||||
texturesToRemoveTracking.add(wrapper.texture.textureId);
|
||||
wrapper.texture.subtle.release(() => {
|
||||
wrapper.allReferencesReleased?.(wrapper.texture);
|
||||
});
|
||||
}
|
||||
}
|
||||
|
||||
for (const id of texturesToRemoveTracking) {
|
||||
managedSharedTextures.delete(id);
|
||||
}
|
||||
|
||||
if (managedSharedTextures.size === 0) {
|
||||
unscheduleCheckManagedSharedTextures();
|
||||
}
|
||||
}
|
||||
|
||||
function wrapperReleaseFromRenderer (id: string, frameTreeNodeId: number) {
|
||||
const wrapper = managedSharedTextures.get(id);
|
||||
if (!wrapper) {
|
||||
throw new Error(`Shared texture with id ${id} not found`);
|
||||
}
|
||||
|
||||
const entry = wrapper.rendererFrameReferences.get(frameTreeNodeId);
|
||||
if (!entry) {
|
||||
throw new Error(`Shared texture ${id} is not referenced by renderer frame ${frameTreeNodeId}`);
|
||||
}
|
||||
|
||||
entry.count -= 1;
|
||||
if (entry.count === 0) {
|
||||
wrapper.rendererFrameReferences.delete(frameTreeNodeId);
|
||||
} else {
|
||||
wrapper.rendererFrameReferences.set(frameTreeNodeId, entry);
|
||||
}
|
||||
|
||||
// Actually release the texture if no one is referencing it
|
||||
if (wrapper.rendererFrameReferences.size === 0 && !wrapper.mainReference) {
|
||||
managedSharedTextures.delete(id);
|
||||
wrapper.texture.subtle.release(() => {
|
||||
wrapper.allReferencesReleased?.(wrapper.texture);
|
||||
});
|
||||
}
|
||||
}
|
||||
|
||||
function wrapperReleaseFromMain (id: string) {
|
||||
const wrapper = managedSharedTextures.get(id);
|
||||
if (!wrapper) {
|
||||
throw new Error(`Shared texture with id ${id} not found`);
|
||||
}
|
||||
|
||||
// Actually release the texture if no one is referencing it
|
||||
wrapper.mainReference = false;
|
||||
if (wrapper.rendererFrameReferences.size === 0) {
|
||||
managedSharedTextures.delete(id);
|
||||
wrapper.texture.subtle.release(() => {
|
||||
wrapper.allReferencesReleased?.(wrapper.texture);
|
||||
});
|
||||
}
|
||||
}
|
||||
|
||||
async function sendSharedTexture (options: Electron.SendSharedTextureOptions, ...args: any[]) {
|
||||
const imported = options.importedSharedTexture;
|
||||
const transfer = imported.subtle.startTransferSharedTexture();
|
||||
|
||||
let timeoutHandle: NodeJS.Timeout | null = null;
|
||||
const timeoutPromise = new Promise<never>((resolve, reject) => {
|
||||
timeoutHandle = setTimeout(() => {
|
||||
reject(new Error(`transfer shared texture timed out after ${transferTimeout}ms, ensure you have registered a receiver in the renderer process.`));
|
||||
}, transferTimeout);
|
||||
});
|
||||
|
||||
const targetFrame: Electron.WebFrameMain | undefined = options.frame;
|
||||
if (!targetFrame) {
|
||||
throw new Error('`frame` should be provided');
|
||||
}
|
||||
|
||||
const invokePromise: Promise<Electron.SharedTextureSyncToken> = ipcMainInternalUtils.invokeInWebFrameMain<Electron.SharedTextureSyncToken>(
|
||||
targetFrame,
|
||||
IPC_MESSAGES.IMPORT_SHARED_TEXTURE_TRANSFER_MAIN_TO_RENDERER,
|
||||
transfer,
|
||||
imported.textureId,
|
||||
...args
|
||||
);
|
||||
|
||||
try {
|
||||
const syncToken = await Promise.race([invokePromise, timeoutPromise]);
|
||||
imported.subtle.setReleaseSyncToken(syncToken);
|
||||
|
||||
const wrapper = managedSharedTextures.get(imported.textureId);
|
||||
if (!wrapper) {
|
||||
throw new Error(`Shared texture with id ${imported.textureId} not found`);
|
||||
}
|
||||
|
||||
const key = targetFrame.frameTreeNodeId;
|
||||
const existing = wrapper.rendererFrameReferences.get(key);
|
||||
if (existing) {
|
||||
existing.count += 1;
|
||||
wrapper.rendererFrameReferences.set(key, existing);
|
||||
} else {
|
||||
wrapper.rendererFrameReferences.set(key, { count: 1, reference: targetFrame });
|
||||
}
|
||||
} finally {
|
||||
if (timeoutHandle) {
|
||||
clearTimeout(timeoutHandle);
|
||||
}
|
||||
}
|
||||
|
||||
// Schedule a check to see if any texture is referenced by any dangling renderer
|
||||
scheduleCheckManagedSharedTextures();
|
||||
}
|
||||
|
||||
function importSharedTexture (options: Electron.ImportSharedTextureOptions) {
|
||||
const id = randomUUID();
|
||||
const imported = sharedTextureNative.importSharedTexture(Object.assign(options.textureInfo, { id }));
|
||||
const ret: Electron.SharedTextureImported = {
|
||||
textureId: id,
|
||||
subtle: imported,
|
||||
getVideoFrame: imported.getVideoFrame,
|
||||
release: () => {
|
||||
wrapperReleaseFromMain(id);
|
||||
}
|
||||
};
|
||||
|
||||
const wrapper: SharedTextureImportedWrapper = {
|
||||
texture: ret,
|
||||
allReferencesReleased: options.allReferencesReleased,
|
||||
mainReference: true,
|
||||
rendererFrameReferences: new Map()
|
||||
};
|
||||
managedSharedTextures.set(id, wrapper);
|
||||
|
||||
return ret;
|
||||
}
|
||||
|
||||
const sharedTexture = {
|
||||
subtle: sharedTextureNative,
|
||||
importSharedTexture,
|
||||
sendSharedTexture
|
||||
};
|
||||
|
||||
export default sharedTexture;
|
||||
@@ -36,3 +36,27 @@ export function invokeInWebContents<T> (sender: Electron.WebContents, command: s
     sender._sendInternal(command, requestId, ...args);
   });
 }
+
+export function invokeInWebFrameMain<T> (sender: Electron.WebFrameMain, command: string, ...args: any[]) {
+  return new Promise<T>((resolve, reject) => {
+    const requestId = ++nextId;
+    const channel = `${command}_RESPONSE_${requestId}`;
+    const frameTreeNodeId = sender.frameTreeNodeId;
+    ipcMainInternal.on(channel, function handler (event, error: Error, result: any) {
+      if (event.type === 'frame' && event.frameTreeNodeId !== frameTreeNodeId) {
+        console.error(`Reply to ${command} sent by unexpected WebFrameMain (${event.frameTreeNodeId})`);
+        return;
+      }
+
+      ipcMainInternal.removeListener(channel, handler);
+
+      if (error) {
+        reject(error);
+      } else {
+        resolve(result);
+      }
+    });
+
+    sender._sendInternal(command, requestId, ...args);
+  });
+}
@@ -25,4 +25,7 @@ export const enum IPC_MESSAGES {
   INSPECTOR_CONFIRM = 'INSPECTOR_CONFIRM',
   INSPECTOR_CONTEXT_MENU = 'INSPECTOR_CONTEXT_MENU',
   INSPECTOR_SELECT_FILE = 'INSPECTOR_SELECT_FILE',
+
+  IMPORT_SHARED_TEXTURE_TRANSFER_MAIN_TO_RENDERER = 'IMPORT_SHARED_TEXTURE_TRANSFER_MAIN_TO_RENDERER',
+  IMPORT_SHARED_TEXTURE_RELEASE_RENDERER_TO_MAIN = 'IMPORT_SHARED_TEXTURE_RELEASE_RENDERER_TO_MAIN',
 }
@@ -4,6 +4,7 @@ export const rendererModuleList: ElectronInternal.ModuleEntry[] = [
   { name: 'contextBridge', loader: () => require('./context-bridge') },
   { name: 'crashReporter', loader: () => require('./crash-reporter') },
   { name: 'ipcRenderer', loader: () => require('./ipc-renderer') },
+  { name: 'sharedTexture', loader: () => require('./shared-texture') },
   { name: 'webFrame', loader: () => require('./web-frame') },
   { name: 'webUtils', loader: () => require('./web-utils') }
 ];
lib/renderer/api/shared-texture.ts (new file, +45)
@@ -0,0 +1,45 @@
import { IPC_MESSAGES } from '@electron/internal/common/ipc-messages';
import ipcRenderer from '@electron/internal/renderer/api/ipc-renderer';
import { ipcRendererInternal } from '@electron/internal/renderer/ipc-renderer-internal';

const sharedTextureNative = process._linkedBinding('electron_common_shared_texture');
const transferChannelName = IPC_MESSAGES.IMPORT_SHARED_TEXTURE_TRANSFER_MAIN_TO_RENDERER;

type SharedTextureReceiverCallback = (data: Electron.ReceivedSharedTextureData, ...args: any[]) => Promise<void>;
let sharedTextureReceiverCallback: SharedTextureReceiverCallback | null = null;

// Receive a transfer pushed from the main process: rebuild the imported
// texture, reply with its frame-creation sync token, then hand it to the
// registered receiver callback.
ipcRendererInternal.on(transferChannelName, async (event, requestId, ...args) => {
  const replyChannel = `${transferChannelName}_RESPONSE_${requestId}`;
  try {
    const transfer = args[0] as Electron.SharedTextureTransfer;
    const textureId = args[1] as string;
    const imported = sharedTextureNative.finishTransferSharedTexture(Object.assign(transfer, { id: textureId }));
    const syncToken = imported.getFrameCreationSyncToken();
    event.sender.send(replyChannel, null, syncToken);

    const wrapper: Electron.SharedTextureImported = {
      textureId,
      subtle: imported,
      getVideoFrame: imported.getVideoFrame,
      release: () => {
        // Releasing in the renderer also notifies the main process to drop
        // this frame's reference.
        imported.release(async () => {
          await ipcRenderer.invoke(IPC_MESSAGES.IMPORT_SHARED_TEXTURE_RELEASE_RENDERER_TO_MAIN, textureId);
        });
      }
    };

    const data: Electron.ReceivedSharedTextureData = { importedSharedTexture: wrapper };
    await sharedTextureReceiverCallback?.(data, ...args.slice(2));
  } catch (error) {
    event.sender.send(replyChannel, error);
  }
});

const sharedTexture = {
  subtle: sharedTextureNative,
  setSharedTextureReceiver: (callback: SharedTextureReceiverCallback) => {
    sharedTextureReceiverCallback = callback;
  }
};

export default sharedTexture;
@@ -15,6 +15,10 @@ export const moduleList: ElectronInternal.ModuleEntry[] = [
     name: 'nativeImage',
     loader: () => require('@electron/internal/common/api/native-image')
   },
+  {
+    name: 'sharedTexture',
+    loader: () => require('@electron/internal/renderer/api/shared-texture')
+  },
   {
     name: 'webFrame',
     loader: () => require('@electron/internal/renderer/api/web-frame')
@@ -210,6 +210,15 @@ OffScreenRenderWidgetHostView::OffScreenRenderWidgetHostView(
   compositor_->SetDelegate(this);
   compositor_->SetRootLayer(root_layer_.get());
 
+  // For offscreen rendering with the rgbaf16 format, set the correct display
+  // color spaces on the compositor, otherwise it won't support HDR.
+  if (offscreen_use_shared_texture_ &&
+      offscreen_shared_texture_pixel_format_ == "rgbaf16") {
+    gfx::DisplayColorSpaces hdr_display_color_spaces(
+        gfx::ColorSpace::CreateSRGBLinear(), viz::SinglePlaneFormat::kRGBA_F16);
+    compositor_->SetDisplayColorSpaces(hdr_display_color_spaces);
+  }
+
   ResizeRootLayer(false);
 
   render_widget_host_->SetView(this);
shell/common/api/electron_api_shared_texture.cc (new file, +810)
@@ -0,0 +1,810 @@
|
||||
// Copyright (c) 2025 Reito <reito@chromium.org>
|
||||
// Use of this source code is governed by the MIT license that can be
|
||||
// found in the LICENSE file.
|
||||
|
||||
#include "shell/common/api/electron_api_shared_texture.h"
|
||||
|
||||
#include "base/base64.h"
|
||||
#include "base/command_line.h"
|
||||
#include "base/numerics/byte_conversions.h"
|
||||
#include "components/viz/common/resources/shared_image_format_utils.h"
|
||||
#include "content/browser/compositor/image_transport_factory.h" // nogncheck
|
||||
#include "gpu/command_buffer/client/context_support.h"
|
||||
#include "gpu/ipc/client/client_shared_image_interface.h"
|
||||
#include "gpu/ipc/client/gpu_channel_host.h"
|
||||
#include "gpu/ipc/common/exported_shared_image.mojom-shared.h"
|
||||
#include "gpu/ipc/common/exported_shared_image_mojom_traits.h"
|
||||
#include "media/base/format_utils.h"
|
||||
#include "media/base/video_frame.h"
|
||||
#include "media/mojo/mojom/video_frame_mojom_traits.h"
|
||||
#include "shell/common/gin_converters/blink_converter.h"
|
||||
#include "shell/common/gin_converters/callback_converter.h"
|
||||
#include "shell/common/gin_converters/gfx_converter.h"
|
||||
#include "shell/common/gin_helper/dictionary.h"
|
||||
#include "shell/common/gin_helper/error_thrower.h"
|
||||
#include "shell/common/node_includes.h"
|
||||
#include "shell/common/node_util.h"
|
||||
#include "third_party/blink/renderer/modules/webcodecs/video_frame.h" // nogncheck
|
||||
#include "third_party/blink/renderer/platform/graphics/gpu/shared_gpu_context.h" // nogncheck
|
||||
#include "ui/compositor/compositor.h"
|
||||
|
||||
#if BUILDFLAG(IS_LINUX)
|
||||
#include "base/posix/eintr_wrapper.h"
|
||||
#include "base/strings/string_number_conversions.h"
|
||||
#endif
|
||||
|
||||
namespace {
|
||||
|
||||
bool IsBrowserProcess() {
|
||||
static int is_browser_process = -1;
|
||||
if (is_browser_process == -1) {
|
||||
// Browser process does not specify a type.
|
||||
is_browser_process = base::CommandLine::ForCurrentProcess()
|
||||
->GetSwitchValueASCII("type")
|
||||
.empty();
|
||||
}
|
||||
|
||||
return is_browser_process == 1;
|
||||
}
|
||||
|
||||
gpu::ContextSupport* GetContextSupport() {
|
||||
if (IsBrowserProcess()) {
|
||||
auto* factory = content::ImageTransportFactory::GetInstance();
|
||||
return factory->GetContextFactory()
|
||||
->SharedMainThreadRasterContextProvider()
|
||||
->ContextSupport();
|
||||
} else {
|
||||
return blink::SharedGpuContext::ContextProviderWrapper()
|
||||
->ContextProvider()
|
||||
.ContextSupport();
|
||||
}
|
||||
}
|
||||
|
||||
gpu::SharedImageInterface* GetSharedImageInterface() {
|
||||
if (IsBrowserProcess()) {
|
||||
auto* factory = content::ImageTransportFactory::GetInstance();
|
||||
return factory->GetContextFactory()
|
||||
->SharedMainThreadRasterContextProvider()
|
||||
->SharedImageInterface();
|
||||
} else {
|
||||
return blink::SharedGpuContext::SharedImageInterfaceProvider()
|
||||
->SharedImageInterface();
|
||||
}
|
||||
}
|
||||
|
||||
std::string GetBase64StringFromSyncToken(gpu::SyncToken& sync_token) {
|
||||
if (!sync_token.verified_flush()) {
|
||||
auto* sii = GetSharedImageInterface();
|
||||
sii->VerifySyncToken(sync_token);
|
||||
}
|
||||
|
||||
auto sync_token_data = base::Base64Encode(UNSAFE_BUFFERS(
|
||||
base::span(reinterpret_cast<uint8_t*>(sync_token.GetData()),
|
||||
sizeof(gpu::SyncToken))));
|
||||
|
||||
return sync_token_data;
|
||||
}
|
||||
|
||||
gpu::SyncToken GetSyncTokenFromBase64String(
|
||||
const std::string& sync_token_data) {
|
||||
if (sync_token_data.empty()) {
|
||||
LOG(ERROR) << "Sync token data is empty.";
|
||||
return {};
|
||||
}
|
||||
|
||||
auto sync_token_bytes = base::Base64Decode(sync_token_data);
|
||||
if (!sync_token_bytes.has_value()) {
|
||||
LOG(ERROR) << "Failed to decode sync token from base64 string.";
|
||||
return {};
|
||||
}
|
||||
|
||||
if (sync_token_bytes->size() != sizeof(gpu::SyncToken)) {
|
||||
LOG(ERROR) << "Invalid sync token size: " << sync_token_bytes->size();
|
||||
return {};
|
||||
}
|
||||
|
||||
base::span<const uint8_t> sync_token_span = UNSAFE_BUFFERS(
|
||||
base::span(sync_token_bytes->data(), sync_token_bytes->size()));
|
||||
auto* sync_token_source =
|
||||
reinterpret_cast<const gpu::SyncToken*>(sync_token_span.data());
|
||||
gpu::SyncToken sync_token(sync_token_source->namespace_id(),
|
||||
sync_token_source->command_buffer_id(),
|
||||
sync_token_source->release_count());
|
||||
|
||||
if (sync_token_source->verified_flush()) {
|
||||
sync_token.SetVerifyFlush();
|
||||
}
|
||||
|
||||
return sync_token;
|
||||
}
|
||||
|
||||
std::string TransferVideoPixelFormatToString(media::VideoPixelFormat format) {
|
||||
switch (format) {
|
||||
case media::PIXEL_FORMAT_ARGB:
|
||||
return "bgra";
|
||||
case media::PIXEL_FORMAT_ABGR:
|
||||
return "rgba";
|
||||
case media::PIXEL_FORMAT_RGBAF16:
|
||||
return "rgbaf16";
|
||||
default:
|
||||
NOTREACHED();
|
||||
}
|
||||
}
|
||||
|
||||
struct ImportedSharedTexture
|
||||
: base::RefCountedThreadSafe<ImportedSharedTexture> {
|
||||
// Metadata
|
||||
gfx::Size coded_size;
|
||||
gfx::Rect visible_rect;
|
||||
int64_t timestamp;
|
||||
media::VideoPixelFormat pixel_format;
|
||||
|
||||
// Holds a reference to prevent it from being destroyed.
|
||||
scoped_refptr<gpu::ClientSharedImage> client_shared_image;
|
||||
gpu::SyncToken frame_creation_sync_token;
|
||||
base::Lock release_sync_token_lock_;
|
||||
gpu::SyncToken release_sync_token GUARDED_BY(release_sync_token_lock_);
|
||||
base::OnceClosure release_callback;
|
||||
|
||||
// Texture id for printing warnings like GC check.
|
||||
std::string id;
|
||||
|
||||
void UpdateReleaseSyncToken(const gpu::SyncToken& token);
|
||||
void SetupReleaseSyncTokenCallback();
|
||||
|
||||
// Transfer to other Chromium processes.
|
||||
v8::Local<v8::Value> StartTransferSharedTexture(v8::Isolate* isolate);
|
||||
|
||||
// Get the creation sync token for the shared image. This is called after
// |finishTransferSharedTexture|; users need to pass the sync token back to
// the source object and call |setReleaseSyncToken| there, to prevent the
// resource from being released before the target object has actually
// acquired it.
|
||||
v8::Local<v8::Value> GetFrameCreationSyncToken(v8::Isolate* isolate);
|
||||
|
||||
// Set a release sync token for this shared texture. This is set when
// |finishTransferSharedTexture| is called, and prevents the source object
// from releasing the underlying resource before the target object has
// finished acquiring the resource in the GPU process.
// Setting it is optional: if it is not set, users can still call
// `release()` and use its callback to wait for the signal that the source
// object is safe to release further. If it is set, users can call
// `release()` on the source object without worrying about whether the
// target object has finished acquiring the resource in the GPU process.
|
||||
void SetReleaseSyncToken(v8::Isolate* isolate, v8::Local<v8::Value> options);
|
||||
|
||||
// The cleanup happens at destructor.
|
||||
private:
|
||||
friend class base::RefCountedThreadSafe<ImportedSharedTexture>;
|
||||
~ImportedSharedTexture();
|
||||
};
|
||||
|
||||
// Wraps the structure so that it can be ref counted.
|
||||
struct ImportedSharedTextureWrapper {
|
||||
// Make the import shared texture wrapper ref counted, so that it can be
|
||||
// held by multiple VideoFrames.
|
||||
scoped_refptr<ImportedSharedTexture> ist;
|
||||
|
||||
// Monitor garbage collection.
|
||||
std::unique_ptr<v8::Persistent<v8::Value>> persistent_;
|
||||
void ResetPersistent() const { persistent_->Reset(); }
|
||||
v8::Persistent<v8::Value>* CreatePersistent(v8::Isolate* isolate,
|
||||
v8::Local<v8::Value> value) {
|
||||
persistent_ = std::make_unique<v8::Persistent<v8::Value>>(isolate, value);
|
||||
return persistent_.get();
|
||||
}
|
||||
|
||||
// Create VideoFrame from the shared image.
|
||||
v8::Local<v8::Value> CreateVideoFrame(v8::Isolate* isolate);
|
||||
|
||||
// Release the shared image.
|
||||
bool IsReferenceReleased() const { return ist.get() == nullptr; }
|
||||
void ReleaseReference();
|
||||
};
|
||||
|
||||
void ImportedSharedTextureWrapper::ReleaseReference() {
|
||||
if (IsReferenceReleased()) {
|
||||
LOG(ERROR) << "This imported shared texture is already released.";
|
||||
return;
|
||||
}
|
||||
|
||||
// Drop the reference. If at least one VideoFrame is still holding it,
// the final cleanup will wait for them.
|
||||
ist.reset();
|
||||
}
|
||||
|
||||
// This function will be called when the VideoFrame is destructed.
|
||||
void OnVideoFrameMailboxReleased(
|
||||
const scoped_refptr<ImportedSharedTexture>& ist,
|
||||
const gpu::SyncToken& sync_token) {
|
||||
ist->UpdateReleaseSyncToken(sync_token);
|
||||
}
|
||||
|
||||
v8::Local<v8::Value> ImportedSharedTextureWrapper::CreateVideoFrame(
|
||||
v8::Isolate* isolate) {
|
||||
auto* current_script_state = blink::ScriptState::ForCurrentRealm(isolate);
|
||||
auto* current_execution_context =
|
||||
blink::ToExecutionContext(current_script_state);
|
||||
|
||||
auto si = ist->client_shared_image;
|
||||
auto cb = base::BindOnce(OnVideoFrameMailboxReleased, ist);
|
||||
|
||||
scoped_refptr<media::VideoFrame> raw_frame =
|
||||
media::VideoFrame::WrapSharedImage(
|
||||
ist->pixel_format, si, ist->frame_creation_sync_token, std::move(cb),
|
||||
ist->coded_size, ist->visible_rect, ist->coded_size,
|
||||
base::Microseconds(ist->timestamp));
|
||||
|
||||
raw_frame->set_color_space(si->color_space());
|
||||
|
||||
blink::VideoFrame* frame = blink::MakeGarbageCollected<blink::VideoFrame>(
|
||||
raw_frame, current_execution_context);
|
||||
return blink::ToV8Traits<blink::VideoFrame>::ToV8(current_script_state,
|
||||
frame);
|
||||
}
|
||||
|
||||
v8::Local<v8::Value> ImportedSharedTexture::StartTransferSharedTexture(
|
||||
v8::Isolate* isolate) {
|
||||
auto exported = client_shared_image->Export();
|
||||
|
||||
// Use mojo to serialize the exported shared image.
|
||||
mojo::Message message(0, 0, MOJO_CREATE_MESSAGE_FLAG_UNLIMITED_SIZE, 0);
|
||||
mojo::internal::MessageFragment<
|
||||
gpu::mojom::internal::ExportedSharedImage_Data>
|
||||
data(message);
|
||||
data.Allocate();
|
||||
mojo::internal::Serializer<gpu::mojom::ExportedSharedImageDataView,
|
||||
gpu::ExportedSharedImage>::Serialize(exported,
|
||||
data);
|
||||
|
||||
auto encoded = base::Base64Encode(UNSAFE_BUFFERS(
|
||||
base::span(message.payload(), message.payload_num_bytes())));
|
||||
gin_helper::Dictionary root(isolate, v8::Object::New(isolate));
|
||||
root.SetReadOnly("transfer", encoded);
|
||||
|
||||
auto sync_token = GetBase64StringFromSyncToken(frame_creation_sync_token);
|
||||
root.SetReadOnly("syncToken", sync_token);
|
||||
|
||||
root.SetReadOnly("pixelFormat",
|
||||
TransferVideoPixelFormatToString(pixel_format));
|
||||
root.SetReadOnly("codedSize", coded_size);
|
||||
root.SetReadOnly("visibleRect", visible_rect);
|
||||
root.SetReadOnly("timestamp", timestamp);
|
||||
|
||||
return gin::ConvertToV8(isolate, root);
|
||||
}
|
||||
|
||||
v8::Local<v8::Value> ImportedSharedTexture::GetFrameCreationSyncToken(
|
||||
v8::Isolate* isolate) {
|
||||
gin::Dictionary root(isolate, v8::Object::New(isolate));
|
||||
|
||||
auto sync_token = GetBase64StringFromSyncToken(frame_creation_sync_token);
|
||||
root.Set("syncToken", sync_token);
|
||||
|
||||
return gin::ConvertToV8(isolate, root);
|
||||
}
|
||||
|
||||
void ImportedSharedTexture::SetReleaseSyncToken(v8::Isolate* isolate,
|
||||
v8::Local<v8::Value> options) {
|
||||
std::string sync_token_data;
|
||||
gin::Dictionary dict(isolate, options.As<v8::Object>());
|
||||
dict.Get("syncToken", &sync_token_data);
|
||||
|
||||
auto sync_token = GetSyncTokenFromBase64String(sync_token_data);
|
||||
UpdateReleaseSyncToken(sync_token);
|
||||
}
|
||||
|
||||
ImportedSharedTexture::~ImportedSharedTexture() {
|
||||
// When nothing holds this, 1) all VideoFrames have been destructed and the
// release_sync_token has been updated; 2) the user called `release()`
// explicitly. This object is destructed and the final cleanup starts.
|
||||
SetupReleaseSyncTokenCallback();
|
||||
client_shared_image.reset();
|
||||
}
|
||||
|
||||
void ImportedSharedTexture::UpdateReleaseSyncToken(
|
||||
const gpu::SyncToken& token) {
|
||||
base::AutoLock locker(release_sync_token_lock_);
|
||||
|
||||
auto* sii = GetSharedImageInterface();
|
||||
if (release_sync_token.HasData()) {
|
||||
// If we already have a release sync token, we need to wait for it
|
||||
// to be signaled before we can set the new one.
|
||||
sii->WaitSyncToken(release_sync_token);
|
||||
}
|
||||
|
||||
// Set the new release sync token to use at last.
|
||||
release_sync_token = token;
|
||||
}
|
||||
|
||||
void ImportedSharedTexture::SetupReleaseSyncTokenCallback() {
|
||||
base::AutoLock locker(release_sync_token_lock_);
|
||||
|
||||
auto* sii = GetSharedImageInterface();
|
||||
if (!release_sync_token.HasData()) {
|
||||
release_sync_token = sii->GenUnverifiedSyncToken();
|
||||
}
|
||||
|
||||
client_shared_image->UpdateDestructionSyncToken(release_sync_token);
|
||||
|
||||
if (release_callback) {
|
||||
GetContextSupport()->SignalSyncToken(release_sync_token,
|
||||
std::move(release_callback));
|
||||
}
|
||||
}
|
||||
|
||||
void PersistentCallbackPass1(
|
||||
const v8::WeakCallbackInfo<ImportedSharedTextureWrapper>& data) {
|
||||
auto* wrapper = data.GetParameter();
|
||||
// The |wrapper->ist| must be valid here, as we are a holder of it.
|
||||
if (!wrapper->IsReferenceReleased()) {
|
||||
// Warn when the user did not manually release the texture.
|
||||
LOG(ERROR) << "The imported shared texture " << wrapper->ist->id
|
||||
<< " was garbage collected before calling `release()`. You have "
|
||||
"to manually release the resource once you're done with it.";
|
||||
|
||||
// Release it for user here.
|
||||
wrapper->ReleaseReference();
|
||||
}
|
||||
// We are responsible for resetting the persistent handle.
|
||||
wrapper->ResetPersistent();
|
||||
// Finally, release the import monitor.
|
||||
delete wrapper;
|
||||
}
|
||||
|
||||
void ImportedTextureGetVideoFrame(
|
||||
const v8::FunctionCallbackInfo<v8::Value>& info) {
|
||||
auto* isolate = info.GetIsolate();
|
||||
auto* wrapper = static_cast<ImportedSharedTextureWrapper*>(
|
||||
info.Data().As<v8::External>()->Value());
|
||||
|
||||
if (wrapper->IsReferenceReleased()) {
|
||||
gin_helper::ErrorThrower(isolate).ThrowTypeError(
|
||||
"The shared texture has been released.");
|
||||
return;
|
||||
}
|
||||
|
||||
if (IsBrowserProcess()) {
|
||||
gin_helper::ErrorThrower(isolate).ThrowTypeError(
|
||||
"The VideoFrame cannot be created at current process.");
|
||||
return;
|
||||
}
|
||||
|
||||
auto ret = wrapper->CreateVideoFrame(isolate);
|
||||
info.GetReturnValue().Set(ret);
|
||||
}
|
||||
|
||||
void ImportedTextureStartTransferSharedTexture(
|
||||
const v8::FunctionCallbackInfo<v8::Value>& info) {
|
||||
auto* isolate = info.GetIsolate();
|
||||
auto* wrapper = static_cast<ImportedSharedTextureWrapper*>(
|
||||
info.Data().As<v8::External>()->Value());
|
||||
|
||||
if (wrapper->IsReferenceReleased()) {
|
||||
gin_helper::ErrorThrower(isolate).ThrowTypeError(
|
||||
"The shared texture has been released.");
|
||||
return;
|
||||
}
|
||||
|
||||
auto ret = wrapper->ist->StartTransferSharedTexture(isolate);
|
||||
info.GetReturnValue().Set(ret);
|
||||
}
|
||||
|
||||
void ImportedTextureRelease(const v8::FunctionCallbackInfo<v8::Value>& info) {
|
||||
auto* wrapper = static_cast<ImportedSharedTextureWrapper*>(
|
||||
info.Data().As<v8::External>()->Value());
|
||||
|
||||
auto cb = info[0];
|
||||
if (cb->IsFunction()) {
|
||||
auto* isolate = info.GetIsolate();
|
||||
gin::ConvertFromV8(isolate, cb, &wrapper->ist->release_callback);
|
||||
}
|
||||
|
||||
// Release the shared texture, so that future frames can be generated.
|
||||
wrapper->ReleaseReference();
|
||||
|
||||
// Release of the wrapper happens at GC persistent callback.
|
||||
// Release of the |ist| happens when nothing holds a reference to it.
|
||||
}
|
||||
|
||||
void ImportedTextureGetFrameCreationSyncToken(
|
||||
const v8::FunctionCallbackInfo<v8::Value>& info) {
|
||||
auto* isolate = info.GetIsolate();
|
||||
auto* wrapper = static_cast<ImportedSharedTextureWrapper*>(
|
||||
info.Data().As<v8::External>()->Value());
|
||||
|
||||
if (wrapper->IsReferenceReleased()) {
|
||||
gin_helper::ErrorThrower(isolate).ThrowTypeError(
|
||||
"The shared texture has been released.");
|
||||
return;
|
||||
}
|
||||
|
||||
auto ret = wrapper->ist->GetFrameCreationSyncToken(isolate);
|
||||
info.GetReturnValue().Set(ret);
|
||||
}
|
||||
|
||||
void ImportedTextureSetReleaseSyncToken(
|
||||
const v8::FunctionCallbackInfo<v8::Value>& info) {
|
||||
auto* isolate = info.GetIsolate();
|
||||
auto* wrapper = static_cast<ImportedSharedTextureWrapper*>(
|
||||
info.Data().As<v8::External>()->Value());
|
||||
|
||||
if (wrapper->IsReferenceReleased()) {
|
||||
gin_helper::ErrorThrower(isolate).ThrowTypeError(
|
||||
"The shared texture has been released.");
|
||||
return;
|
||||
}
|
||||
|
||||
if (info.Length() < 1 || !info[0]->IsObject()) {
|
||||
gin_helper::ErrorThrower(isolate).ThrowTypeError(
|
||||
"Expected an options object with a syncToken property.");
|
||||
return;
|
||||
}
|
||||
|
||||
wrapper->ist->SetReleaseSyncToken(isolate, info[0].As<v8::Object>());
|
||||
}
|
||||
|
||||
v8::Local<v8::Value> CreateImportedSharedTextureFromSharedImage(
|
||||
v8::Isolate* isolate,
|
||||
ImportedSharedTexture* imported) {
|
||||
auto* wrapper = new ImportedSharedTextureWrapper();
|
||||
wrapper->ist = base::WrapRefCounted(imported);
|
||||
|
||||
auto imported_wrapped = v8::External::New(isolate, wrapper);
|
||||
gin::Dictionary root(isolate, v8::Object::New(isolate));
|
||||
|
||||
auto releaser = v8::Function::New(isolate->GetCurrentContext(),
|
||||
ImportedTextureRelease, imported_wrapped)
|
||||
.ToLocalChecked();
|
||||
|
||||
auto get_video_frame =
|
||||
v8::Function::New(isolate->GetCurrentContext(),
|
||||
ImportedTextureGetVideoFrame, imported_wrapped)
|
||||
.ToLocalChecked();
|
||||
|
||||
auto start_transfer =
|
||||
v8::Function::New(isolate->GetCurrentContext(),
|
||||
ImportedTextureStartTransferSharedTexture,
|
||||
imported_wrapped)
|
||||
.ToLocalChecked();
|
||||
|
||||
auto get_frame_creation_sync_token =
|
||||
v8::Function::New(isolate->GetCurrentContext(),
|
||||
ImportedTextureGetFrameCreationSyncToken,
|
||||
imported_wrapped)
|
||||
.ToLocalChecked();
|
||||
|
||||
auto set_release_sync_token =
|
||||
v8::Function::New(isolate->GetCurrentContext(),
|
||||
ImportedTextureSetReleaseSyncToken, imported_wrapped)
|
||||
.ToLocalChecked();
|
||||
|
||||
root.Set("release", releaser);
|
||||
root.Set("getVideoFrame", get_video_frame);
|
||||
root.Set("startTransferSharedTexture", start_transfer);
|
||||
root.Set("getFrameCreationSyncToken", get_frame_creation_sync_token);
|
||||
root.Set("setReleaseSyncToken", set_release_sync_token);
|
||||
|
||||
auto root_local = gin::ConvertToV8(isolate, root);
|
||||
auto* persistent = wrapper->CreatePersistent(isolate, root_local);
|
||||
|
||||
persistent->SetWeak(wrapper, PersistentCallbackPass1,
|
||||
v8::WeakCallbackType::kParameter);
|
||||
|
||||
return root_local;
|
||||
}
|
||||
|
||||
struct ImportSharedTextureInfoPlane {
|
||||
// The strides and offsets in bytes to be used when accessing the buffers
|
||||
// via a memory mapping. One per plane per entry. Size in bytes of the
|
||||
// plane is necessary to map the buffers.
|
||||
uint32_t stride;
|
||||
uint64_t offset;
|
||||
uint64_t size;
|
||||
|
||||
// File descriptor for the underlying memory object (usually dmabuf).
|
||||
int fd = 0;
|
||||
};
|
||||
|
||||
struct ImportSharedTextureInfo {
|
||||
// Texture id for printing warnings like GC check.
|
||||
std::string id;
|
||||
|
||||
// The pixel format of the shared texture, RGBA or BGRA depends on platform.
|
||||
media::VideoPixelFormat pixel_format;
|
||||
|
||||
// The full dimensions of the video frame data.
|
||||
gfx::Size coded_size;
|
||||
|
||||
// A subsection of [0, 0, coded_size.width(), coded_size.height()].
// In the OSR case, it is expected to cover the full coded area.
|
||||
gfx::Rect visible_rect;
|
||||
|
||||
// The color space of the video frame.
|
||||
gfx::ColorSpace color_space = gfx::ColorSpace::CreateSRGB();
|
||||
|
||||
// The capture timestamp, microseconds since capture start
|
||||
int64_t timestamp = 0;
|
||||
|
||||
#if BUILDFLAG(IS_WIN)
|
||||
// On Windows, it must be a NT HANDLE (CreateSharedHandle) to the shared
|
||||
// texture, it can't be a deprecated non-NT HANDLE (GetSharedHandle). This
|
||||
// must be a handle already duplicated for the current process and can be
|
||||
// owned by this.
|
||||
uintptr_t nt_handle = 0;
|
||||
#elif BUILDFLAG(IS_APPLE)
|
||||
// On macOS, it is an IOSurfaceRef, this must be a valid IOSurface at the
|
||||
// current process.
|
||||
uintptr_t io_surface = 0;
|
||||
#elif BUILDFLAG(IS_LINUX)
|
||||
// On Linux, to be implemented.
|
||||
std::vector<ImportSharedTextureInfoPlane> planes;
|
||||
uint64_t modifier = gfx::NativePixmapHandle::kNoModifier;
|
||||
bool supports_zero_copy_webgpu_import = false;
|
||||
#endif
|
||||
};
|
||||
|
||||
} // namespace
|
||||
|
||||
namespace gin {
|
||||
|
||||
template <>
|
||||
struct Converter<ImportSharedTextureInfo> {
|
||||
static bool FromV8(v8::Isolate* isolate,
|
||||
v8::Local<v8::Value> val,
|
||||
ImportSharedTextureInfo* out) {
|
||||
if (!val->IsObject())
|
||||
return false;
|
||||
gin::Dictionary dict(isolate, val.As<v8::Object>());
|
||||
|
||||
std::string pixel_format_str;
|
||||
if (dict.Get("pixelFormat", &pixel_format_str)) {
|
||||
if (pixel_format_str == "bgra")
|
||||
out->pixel_format = media::PIXEL_FORMAT_ARGB;
|
||||
else if (pixel_format_str == "rgba")
|
||||
out->pixel_format = media::PIXEL_FORMAT_ABGR;
|
||||
else if (pixel_format_str == "rgbaf16")
|
||||
out->pixel_format = media::PIXEL_FORMAT_RGBAF16;
|
||||
else
|
||||
return false;
|
||||
}
|
||||
|
||||
dict.Get("codedSize", &out->coded_size);
|
||||
if (!dict.Get("visibleRect", &out->visible_rect)) {
|
||||
out->visible_rect = gfx::Rect(out->coded_size);
|
||||
}
|
||||
dict.Get("colorSpace", &out->color_space);
|
||||
dict.Get("timestamp", &out->timestamp);
|
||||
dict.Get("id", &out->id);
|
||||
|
||||
gin::Dictionary shared_texture(isolate, val.As<v8::Object>());
|
||||
if (!dict.Get("handle", &shared_texture)) {
|
||||
return false;
|
||||
}
|
||||
|
||||
#if BUILDFLAG(IS_WIN) || BUILDFLAG(IS_APPLE)
|
||||
auto GetNativeHandle = [&](const std::string& property_key,
|
||||
uintptr_t* output) {
|
||||
v8::Local<v8::Value> handle_buf;
|
||||
if (shared_texture.Get(property_key, &handle_buf) &&
|
||||
node::Buffer::HasInstance(handle_buf)) {
|
||||
char* data = node::Buffer::Data(handle_buf);
|
||||
if (node::Buffer::Length(handle_buf) == sizeof(uintptr_t)) {
|
||||
*output = *reinterpret_cast<uintptr_t*>(data);
|
||||
}
|
||||
}
|
||||
};
|
||||
#endif
|
||||
|
||||
#if BUILDFLAG(IS_WIN)
|
||||
GetNativeHandle("ntHandle", &out->nt_handle);
|
||||
#elif BUILDFLAG(IS_APPLE)
|
||||
GetNativeHandle("ioSurface", &out->io_surface);
|
||||
#elif BUILDFLAG(IS_LINUX)
|
||||
v8::Local<v8::Object> native_pixmap;
|
||||
if (shared_texture.Get("nativePixmap", &native_pixmap)) {
|
||||
gin::Dictionary v8_native_pixmap(isolate, native_pixmap);
|
||||
v8::Local<v8::Array> v8_planes;
|
||||
if (v8_native_pixmap.Get("planes", &v8_planes)) {
|
||||
out->planes.clear();
|
||||
for (uint32_t i = 0; i < v8_planes->Length(); ++i) {
|
||||
v8::Local<v8::Value> v8_item =
|
||||
v8_planes->Get(isolate->GetCurrentContext(), i).ToLocalChecked();
|
||||
gin::Dictionary v8_plane(isolate, v8_item.As<v8::Object>());
|
||||
ImportSharedTextureInfoPlane plane;
|
||||
v8_plane.Get("stride", &plane.stride);
|
||||
v8_plane.Get("offset", &plane.offset);
|
||||
v8_plane.Get("size", &plane.size);
|
||||
v8_plane.Get("fd", &plane.fd);
|
||||
out->planes.push_back(plane);
|
||||
}
|
||||
}
|
||||
std::string modifier_str;
|
||||
if (v8_native_pixmap.Get("modifier", &modifier_str)) {
|
||||
base::StringToUint64(modifier_str, &out->modifier);
|
||||
}
|
||||
v8_native_pixmap.Get("supportsZeroCopyWebGpuImport",
|
||||
&out->supports_zero_copy_webgpu_import);
|
||||
}
|
||||
#endif
|
||||
|
||||
return true;
|
||||
}
|
||||
};
|
||||
|
||||
} // namespace gin
|
||||
|
||||
namespace electron::api::shared_texture {
|
||||
|
||||
v8::Local<v8::Value> ImportSharedTexture(v8::Isolate* isolate,
|
||||
v8::Local<v8::Value> options) {
|
||||
ImportSharedTextureInfo shared_texture{};
|
||||
if (!gin::ConvertFromV8(isolate, options, &shared_texture)) {
|
||||
gin_helper::ErrorThrower(isolate).ThrowTypeError(
|
||||
"Invalid shared texture info object");
|
||||
return v8::Null(isolate);
|
||||
}
|
||||
|
||||
gfx::GpuMemoryBufferHandle gmb_handle;
|
||||
#if BUILDFLAG(IS_WIN)
|
||||
if (shared_texture.nt_handle == 0) {
|
||||
gin_helper::ErrorThrower(isolate).ThrowTypeError("Invalid ntHandle value");
|
||||
return v8::Null(isolate);
|
||||
}
|
||||
|
||||
auto handle = reinterpret_cast<HANDLE>(shared_texture.nt_handle);
|
||||
|
||||
HANDLE dup_handle;
|
||||
// Duplicate the handle so the scoped handle can close it.
|
||||
if (!DuplicateHandle(GetCurrentProcess(), handle, GetCurrentProcess(),
|
||||
&dup_handle, 0, FALSE, DUPLICATE_SAME_ACCESS)) {
|
||||
gin_helper::ErrorThrower(isolate).ThrowTypeError(
|
||||
"Unable to duplicate handle.");
|
||||
return v8::Null(isolate);
|
||||
}
|
||||
|
||||
auto dxgi_handle = gfx::DXGIHandle(base::win::ScopedHandle(dup_handle));
|
||||
gmb_handle = gfx::GpuMemoryBufferHandle(std::move(dxgi_handle));
|
||||
#elif BUILDFLAG(IS_APPLE)
|
||||
if (shared_texture.io_surface == 0) {
|
||||
gin_helper::ErrorThrower(isolate).ThrowTypeError("Invalid ioSurface value");
|
||||
return v8::Null(isolate);
|
||||
}
|
||||
|
||||
// Retain the io_surface reference to increase the reference count.
|
||||
auto io_surface = reinterpret_cast<IOSurfaceRef>(shared_texture.io_surface);
|
||||
auto io_surface_scoped = base::apple::ScopedCFTypeRef<IOSurfaceRef>(
|
||||
io_surface, base::scoped_policy::RETAIN);
|
||||
gmb_handle = gfx::GpuMemoryBufferHandle(std::move(io_surface_scoped));
|
||||
#elif BUILDFLAG(IS_LINUX)
|
||||
gfx::NativePixmapHandle pixmap;
|
||||
pixmap.modifier = shared_texture.modifier;
|
||||
pixmap.supports_zero_copy_webgpu_import =
|
||||
shared_texture.supports_zero_copy_webgpu_import;
|
||||
|
||||
for (const auto& plane : shared_texture.planes) {
|
||||
gfx::NativePixmapPlane plane_info;
|
||||
plane_info.stride = plane.stride;
|
||||
plane_info.offset = plane.offset;
|
||||
plane_info.size = plane.size;
|
||||
|
||||
// Duplicate fd, otherwise the process may already have ownership.
|
||||
int checked_dup = HANDLE_EINTR(dup(plane.fd));
|
||||
plane_info.fd = base::ScopedFD(checked_dup);
|
||||
|
||||
pixmap.planes.push_back(std::move(plane_info));
|
||||
}
|
||||
|
||||
gmb_handle = gfx::GpuMemoryBufferHandle(std::move(pixmap));
|
||||
#endif
|
||||
|
||||
gfx::Size coded_size = shared_texture.coded_size;
|
||||
media::VideoPixelFormat pixel_format = shared_texture.pixel_format;
|
||||
gfx::ColorSpace color_space = shared_texture.color_space;
|
||||
|
||||
auto buffer_format = media::VideoPixelFormatToGfxBufferFormat(pixel_format);
|
||||
if (!buffer_format.has_value()) {
|
||||
gin_helper::ErrorThrower(isolate).ThrowTypeError(
|
||||
"Invalid shared texture buffer format");
|
||||
return v8::Null(isolate);
|
||||
}
|
||||
|
||||
auto* sii = GetSharedImageInterface();
|
||||
gpu::SharedImageUsageSet shared_image_usage =
|
||||
#if BUILDFLAG(IS_WIN) || BUILDFLAG(IS_APPLE)
|
||||
gpu::SHARED_IMAGE_USAGE_GLES2_READ | gpu::SHARED_IMAGE_USAGE_GLES2_WRITE |
|
||||
gpu::SHARED_IMAGE_USAGE_RASTER_READ |
|
||||
gpu::SHARED_IMAGE_USAGE_DISPLAY_READ |
|
||||
gpu::SHARED_IMAGE_USAGE_WEBGPU_READ |
|
||||
gpu::SHARED_IMAGE_USAGE_WEBGPU_WRITE;
|
||||
#else
|
||||
gpu::SHARED_IMAGE_USAGE_GLES2_READ | gpu::SHARED_IMAGE_USAGE_GLES2_WRITE |
|
||||
gpu::SHARED_IMAGE_USAGE_RASTER_READ |
|
||||
gpu::SHARED_IMAGE_USAGE_DISPLAY_READ;
|
||||
#endif
|
||||
|
||||
auto si_format = viz::GetSharedImageFormat(buffer_format.value());
|
||||
auto si =
|
||||
sii->CreateSharedImage({si_format, coded_size, color_space,
|
||||
shared_image_usage, "SharedTextureVideoFrame"},
|
||||
std::move(gmb_handle));
|
||||
|
||||
ImportedSharedTexture* imported = new ImportedSharedTexture();
|
||||
imported->pixel_format = shared_texture.pixel_format;
|
||||
imported->coded_size = shared_texture.coded_size;
|
||||
imported->visible_rect = shared_texture.visible_rect;
|
||||
imported->timestamp = shared_texture.timestamp;
|
||||
imported->frame_creation_sync_token = si->creation_sync_token();
|
||||
imported->client_shared_image = std::move(si);
|
||||
imported->id = shared_texture.id;
|
||||
|
||||
return CreateImportedSharedTextureFromSharedImage(isolate, imported);
|
||||
}

v8::Local<v8::Value> FinishTransferSharedTexture(v8::Isolate* isolate,
                                                 v8::Local<v8::Value> options) {
  ImportSharedTextureInfo partial{};
  gin::ConvertFromV8(isolate, options, &partial);

  std::string id;
  std::string transfer;
  std::string sync_token_data;

  gin::Dictionary dict(isolate, options.As<v8::Object>());
  dict.Get("id", &id);
  dict.Get("transfer", &transfer);
  dict.Get("syncToken", &sync_token_data);

  auto transfer_data = base::Base64Decode(transfer);

  // Use mojo to deserialize the exported shared image.
  mojo::Message message(transfer_data.value(), {});
  mojo::internal::MessageFragment<
      gpu::mojom::internal::ExportedSharedImage_Data>
      data(message);
  data.Claim(message.mutable_payload());

  gpu::ExportedSharedImage exported;
  mojo::internal::Serializer<gpu::mojom::ExportedSharedImageDataView,
                             gpu::ExportedSharedImage>::Deserialize(data.data(),
                                                                    &exported,
                                                                    &message);
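
  // Re-import the shared image into this process's SharedImageInterface and
  // wait on the exporter's sync token, so the producing context's GPU work
  // completes before the texture is used here.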

  auto* sii = GetSharedImageInterface();
  auto si = sii->ImportSharedImage(std::move(exported));

  auto source_st = GetSyncTokenFromBase64String(sync_token_data);
  sii->WaitSyncToken(source_st);

  ImportedSharedTexture* imported = new ImportedSharedTexture();
  imported->pixel_format = partial.pixel_format;
  imported->coded_size = partial.coded_size;
  imported->visible_rect = partial.visible_rect;
  imported->timestamp = partial.timestamp;
  imported->frame_creation_sync_token = sii->GenUnverifiedSyncToken();
  imported->client_shared_image = std::move(si);
  imported->id = id;

  return CreateImportedSharedTextureFromSharedImage(isolate, imported);
}

}  // namespace electron::api::shared_texture

namespace {

void Initialize(v8::Local<v8::Object> exports,
                v8::Local<v8::Value> unused,
                v8::Local<v8::Context> context,
                void* priv) {
  v8::Isolate* const isolate = v8::Isolate::GetCurrent();
  gin_helper::Dictionary dict(isolate, exports);
  dict.SetMethod("importSharedTexture",
                 &electron::api::shared_texture::ImportSharedTexture);
  dict.SetMethod("finishTransferSharedTexture",
                 &electron::api::shared_texture::FinishTransferSharedTexture);
}

}  // namespace

NODE_LINKED_BINDING_CONTEXT_AWARE(electron_common_shared_texture, Initialize)
23
shell/common/api/electron_api_shared_texture.h
Normal file
@@ -0,0 +1,23 @@
// Copyright (c) 2025 Reito <reito@chromium.org>
// Use of this source code is governed by the MIT license that can be
// found in the LICENSE file.

#ifndef ELECTRON_SHELL_COMMON_API_ELECTRON_API_SHARED_TEXTURE_H_
#define ELECTRON_SHELL_COMMON_API_ELECTRON_API_SHARED_TEXTURE_H_

#include <string>
#include <vector>

#include "v8/include/v8-forward.h"

namespace electron::api::shared_texture {

v8::Local<v8::Value> ImportSharedTexture(v8::Isolate* isolate,
                                         v8::Local<v8::Value> options);

v8::Local<v8::Value> FinishTransferSharedTexture(v8::Isolate* isolate,
                                                 v8::Local<v8::Value> options);

}  // namespace electron::api::shared_texture

#endif  // ELECTRON_SHELL_COMMON_API_ELECTRON_API_SHARED_TEXTURE_H_
@@ -455,6 +455,12 @@ bool Converter<gfx::ColorSpace>::FromV8(v8::Isolate* isolate,

  // Get primaries
  if (dict.Get("primaries", &primaries_str)) {
    if (primaries_str == "custom") {
      gin_helper::ErrorThrower(isolate).ThrowTypeError(
          "'custom' not supported.");
      return false;
    }

    if (primaries_str == "bt709")
      primaries = gfx::ColorSpace::PrimaryID::BT709;
    else if (primaries_str == "bt470m")
@@ -485,18 +491,18 @@ bool Converter<gfx::ColorSpace>::FromV8(v8::Isolate* isolate,
      primaries = gfx::ColorSpace::PrimaryID::WIDE_GAMUT_COLOR_SPIN;
    else if (primaries_str == "ebu-3213-e")
      primaries = gfx::ColorSpace::PrimaryID::EBU_3213_E;

    if (primaries_str == "custom") {
      gin_helper::ErrorThrower(isolate).ThrowTypeError(
          "'custom' not supported.");
      return false;
    } else {
    else
      primaries = gfx::ColorSpace::PrimaryID::INVALID;
    }
  }

  // Get transfer
  if (dict.Get("transfer", &transfer_str)) {
    if (transfer_str == "custom" || transfer_str == "custom-hdr") {
      gin_helper::ErrorThrower(isolate).ThrowTypeError(
          "'custom', 'custom-hdr' not supported.");
      return false;
    }

    if (transfer_str == "bt709")
      transfer = gfx::ColorSpace::TransferID::BT709;
    else if (transfer_str == "bt709-apple")
@@ -541,14 +547,8 @@ bool Converter<gfx::ColorSpace>::FromV8(v8::Isolate* isolate,
      transfer = gfx::ColorSpace::TransferID::LINEAR_HDR;
    else if (transfer_str == "scrgb-linear-80-nits")
      transfer = gfx::ColorSpace::TransferID::SCRGB_LINEAR_80_NITS;

    if (transfer_str == "custom" || transfer_str == "custom-hdr") {
      gin_helper::ErrorThrower(isolate).ThrowTypeError(
          "'custom', 'custom-hdr' not supported.");
      return false;
    } else {
      primaries = gfx::ColorSpace::PrimaryID::INVALID;
    }
    else
      transfer = gfx::ColorSpace::TransferID::INVALID;
  }

  // Get matrix

@@ -99,6 +99,7 @@
  V(electron_common_environment) \
  V(electron_common_features) \
  V(electron_common_native_image) \
  V(electron_common_shared_texture) \
  V(electron_common_shell) \
  V(electron_common_v8_util)

281
spec/api-shared-texture-spec.ts
Normal file
@@ -0,0 +1,281 @@
import { BaseWindow } from 'electron';

import { expect } from 'chai';

import { randomUUID } from 'node:crypto';
import * as path from 'node:path';

import { closeWindow } from './lib/window-helpers';

const fixtures = path.resolve(__dirname, 'fixtures');

describe('sharedTexture module', () => {
  const {
    nativeImage
  } = require('electron');

  const debugSpec = false;
  const dirPath = path.join(fixtures, 'api', 'shared-texture');
  const osrPath = path.join(dirPath, 'osr.html');
  const imagePath = path.join(dirPath, 'image.png');
  const targetImage = nativeImage.createFromPath(imagePath);
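  // Reference image: osr.html renders image.png, and both tests compare the
  // receiving window's captured output against it.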

  describe('import shared texture produced by osr', () => {
    const {
      app,
      BrowserWindow,
      sharedTexture,
      ipcMain
    } = require('electron');

    afterEach(async () => {
      ipcMain.removeAllListeners();
      for (const w of BaseWindow.getAllWindows()) {
        await closeWindow(w);
      }
    });

    it('successfully imported and rendered with subtle api', (done) => {
      type CapturedTextureHolder = {
        importedSubtle: Electron.SharedTextureImportedSubtle,
        texture: Electron.OffscreenSharedTexture
      }

      const capturedTextures = new Map<string, CapturedTextureHolder>();
      const preloadPath = path.join(dirPath, 'subtle', 'preload.js');
      const htmlPath = path.join(dirPath, 'subtle', 'index.html');

      const createWindow = () => {
        const win = new BrowserWindow({
          width: 256,
          height: 256,
          show: debugSpec,
          webPreferences: {
            preload: preloadPath
          }
        });

        const osr = new BrowserWindow({
          width: 128,
          height: 128,
          show: debugSpec,
          webPreferences: {
            offscreen: {
              useSharedTexture: true
            }
          }
        });

        osr.webContents.setFrameRate(1);
        osr.webContents.on('paint', (event: any) => {
          // Step 1: Input source of shared texture handle.
          const texture = event.texture;

          if (!texture) {
            console.error('No texture, GPU may be unavailable, skipping.');
            done();
            return;
          }

          // Step 2: Import as SharedTextureImported
          console.log(texture.textureInfo);
          const importedSubtle = sharedTexture.subtle.importSharedTexture(texture.textureInfo);

          // Step 3: Prepare for transfer to another process (win's renderer)
          const transfer = importedSubtle.startTransferSharedTexture();
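          // The transfer handle is plain serializable data, so it can be sent
          // over IPC; the underlying GPU resource stays referenced here until
          // importedSubtle.release() runs later.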

          const id = randomUUID();
          capturedTextures.set(id, { importedSubtle, texture });

          // Step 4: Send the shared texture to the renderer process (goto preload.js)
          win.webContents.send('shared-texture', id, transfer);
        });

        ipcMain.on('shared-texture-done', (event: any, id: string) => {
          // Step 12: Release the shared texture resources at main process
          const data = capturedTextures.get(id);
          if (data) {
            capturedTextures.delete(id);
            const { importedSubtle, texture } = data;

            // Step 13: Release the imported shared texture
            importedSubtle.release(() => {
              // Step 14: Release the shared texture once GPU is done
              texture.release();
            });

            // Step 15: Wait briefly, then capture a screenshot from the main process
            setTimeout(async () => {
              // Step 16: Compare the captured image with the target image
              const captured = await win.webContents.capturePage({
                x: 16,
                y: 16,
                width: 128,
                height: 128
              });

              // Step 17: Resize the target image to match the captured image size, in case dpr != 1
              const target = targetImage.resize({ ...captured.getSize() });

              // Step 18: nativeImage can misreport pixel differences when color spaces differ,
              // so send both images to the renderer and compare them using a canvas.
              win.webContents.send('verify-captured-image', {
                captured: captured.toDataURL(),
                target: target.toDataURL()
              });
            }, 300);
          }
        });

        ipcMain.on('verify-captured-image-done', (event: any, result: { difference: number, total: number }) => {
          // Step 22: Verify the result from renderer process
          try {
            // macOS may introduce tiny color differences during rendering,
            // and colors can shift slightly when resizing at a device pixel ratio other than 1.
            // The accumulated error should not exceed 1% of the whole image.
            const ratio = result.difference / result.total;
            console.log('image difference: ', ratio);
            expect(ratio).to.be.lessThan(0.01);
            done();
          } catch (e) {
            done(e);
          }
        });

        ipcMain.on('webgpu-unavailable', () => {
          console.error('WebGPU is not available, skipping.');
          done();
        });

        win.loadFile(htmlPath);
        osr.loadFile(osrPath);
      };

      app.whenReady().then(() => {
        createWindow();
      });
    }).timeout(debugSpec ? 100000 : 10000);

    const runSharedTextureManagedTest = (done: Mocha.Done, iframe: boolean) => {
      const preloadPath = path.join(dirPath, 'managed', 'preload.js');
      const htmlPath = path.join(dirPath, 'managed', iframe ? 'frame.html' : 'index.html');

      const createWindow = () => {
        const win = new BrowserWindow({
          width: 256,
          height: 256,
          show: debugSpec,
          webPreferences: {
            preload: preloadPath,
            nodeIntegrationInSubFrames: iframe
          }
        });

        const osr = new BrowserWindow({
          width: 128,
          height: 128,
          show: debugSpec,
          webPreferences: {
            offscreen: {
              useSharedTexture: true
            }
          }
        });

        osr.webContents.setFrameRate(1);
        osr.webContents.on('paint', async (event: any) => {
          const targetFrame = iframe ? win.webContents.mainFrame.frames[0] : win.webContents.mainFrame;
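          // Sub-frame targets require nodeIntegrationInSubFrames (enabled in
          // webPreferences above) so the preload can register a receiver in
          // the iframe.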
          if (!targetFrame) {
            done(new Error('Target frame not found'));
            return;
          }

          // Step 1: Input source of shared texture handle.
          const texture = event.texture;

          if (!texture) {
            console.error('No texture, GPU may be unavailable, skipping.');
            done();
            return;
          }

          // Step 2: Import as SharedTextureImported
          console.log(texture.textureInfo);
          const imported = sharedTexture.importSharedTexture({
            textureInfo: texture.textureInfo,
            allReferencesReleased: () => {
              // Release the shared texture source once GPU is done.
              // Will be called when all processes have finished using the shared texture.
              texture.release();

              // Wait briefly, then capture a screenshot from the main process
              setTimeout(async () => {
                // Compare the captured image with the target image
                const captured = await win.webContents.capturePage({
                  x: 16,
                  y: 16,
                  width: 128,
                  height: 128
                });

                // Resize the target image to match the captured image size, in case dpr != 1
                const target = targetImage.resize({ ...captured.getSize() });

                // nativeImage can misreport pixel differences when color spaces differ,
                // so send both images to the renderer and compare them using a canvas.
                targetFrame.send('verify-captured-image', {
                  captured: captured.toDataURL(),
                  target: target.toDataURL()
                });
              }, 300);
            }
          });

          // Step 3: Transfer to another process (win's renderer)
          await sharedTexture.sendSharedTexture({
            frame: iframe ? targetFrame : win.webContents.mainFrame,
            importedSharedTexture: imported
          });

          // Step 4: Release the imported and wait for the signal to release the source
          imported.release();
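          // Releasing here drops this process's reference; the source texture
          // itself is only freed via allReferencesReleased above, once the
          // renderer has released its copy too.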
        });

        ipcMain.on('verify-captured-image-done', (event: any, result: { difference: number, total: number }) => {
          // Verify the result from renderer process
          try {
            // macOS may introduce tiny color differences during rendering,
            // and colors can shift slightly when resizing at a device pixel ratio other than 1.
            // The accumulated error should not exceed 1% of the whole image.
            const ratio = result.difference / result.total;
            console.log('image difference: ', ratio);
            expect(ratio).to.be.lessThan(0.01);
            done();
          } catch (e) {
            done(e);
          }
        });

        ipcMain.on('webgpu-unavailable', () => {
          console.error('WebGPU is not available, skipping.');
          done();
        });

        win.loadFile(htmlPath);
        osr.loadFile(osrPath);
      };

      app.whenReady().then(() => {
        createWindow();
      });
    };

    it('successfully imported and rendered with managed api, without iframe', (done) => {
      runSharedTextureManagedTest(done, false);
    }).timeout(debugSpec ? 100000 : 10000);

    it('successfully imported and rendered with managed api, with iframe', (done) => {
      runSharedTextureManagedTest(done, true);
    }).timeout(debugSpec ? 100000 : 10000);
  });
});
168
spec/fixtures/api/shared-texture/common.js
vendored
Normal file
@@ -0,0 +1,168 @@
window.verifyCapturedImage = (images, result) => {
  const { captured, target } = images;
  // Compare the captured image with the target image
  const capturedImage = new Image();
  capturedImage.src = captured;
  capturedImage.onload = () => {
    const targetImage = new Image();
    targetImage.src = target;
    targetImage.onload = () => {
      const canvas = document.createElement('canvas');
      canvas.width = capturedImage.width;
      canvas.height = capturedImage.height;

      const ctx = canvas.getContext('2d');
      ctx.drawImage(capturedImage, 0, 0);
      const capturedData = ctx.getImageData(0, 0, canvas.width, canvas.height).data;

      ctx.clearRect(0, 0, canvas.width, canvas.height);
      ctx.drawImage(targetImage, 0, 0);
      const targetData = ctx.getImageData(0, 0, canvas.width, canvas.height).data;

      // Compare the pixel data
      let difference = 0;
      for (let i = 0; i < capturedData.length; i += 4) {
        difference += Math.abs(capturedData[i] - targetData[i]);
        difference += Math.abs(capturedData[i + 1] - targetData[i + 1]);
        difference += Math.abs(capturedData[i + 2] - targetData[i + 2]);
        difference += Math.abs(capturedData[i + 3] - targetData[i + 3]);
      }
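      // Normalize by the maximum possible difference: four channels per pixel,
      // each up to 255, so `difference / total` is the mean absolute error as
      // a fraction of full scale.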

      // Send the result back to the main process
      result({ difference, total: capturedData.length * 255 });
      canvas.remove();
      capturedImage.remove();
      targetImage.remove();
    };
  };
};

window.initWebGpu = async () => {
  // Init WebGPU
  const canvas = document.createElement('canvas');
  canvas.width = 128;
  canvas.height = 128;
  canvas.style.width = '128px';
  canvas.style.height = '128px';
  canvas.style.position = 'absolute';
  canvas.style.top = '16px';
  canvas.style.left = '16px';
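  // The canvas sits at (16, 16) with a 128x128 size, matching the region the
  // spec captures with capturePage({ x: 16, y: 16, width: 128, height: 128 }).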

  document.body.appendChild(canvas);
  const context = canvas.getContext('webgpu');

  // Configure WebGPU context
  const adapter = await navigator.gpu.requestAdapter();
  const device = await adapter.requestDevice();
  const format = navigator.gpu.getPreferredCanvasFormat();
  context.configure({ device, format });

  window.renderFrame = async (frame) => {
    try {
      // Create external texture
      const externalTexture = device.importExternalTexture({ source: frame });

      // Create bind group layout, correctly specifying the external texture type
      const bindGroupLayout = device.createBindGroupLayout({
        entries: [
          {
            binding: 0,
            visibility: window.GPUShaderStage.FRAGMENT,
            externalTexture: {}
          },
          {
            binding: 1,
            visibility: window.GPUShaderStage.FRAGMENT,
            sampler: {}
          }
        ]
      });

      // Create pipeline layout
      const pipelineLayout = device.createPipelineLayout({
        bindGroupLayouts: [bindGroupLayout]
      });

      // Create render pipeline
      const pipeline = device.createRenderPipeline({
        layout: pipelineLayout,
        vertex: {
          module: device.createShaderModule({
            code: `
              @vertex
              fn main(@builtin(vertex_index) VertexIndex : u32) -> @builtin(position) vec4<f32> {
                var pos = array<vec2<f32>, 6>(
                  vec2<f32>(-1.0, -1.0),
                  vec2<f32>(1.0, -1.0),
                  vec2<f32>(-1.0, 1.0),
                  vec2<f32>(-1.0, 1.0),
                  vec2<f32>(1.0, -1.0),
                  vec2<f32>(1.0, 1.0)
                );
                return vec4<f32>(pos[VertexIndex], 0.0, 1.0);
              }
            `
          }),
          entryPoint: 'main'
        },
        fragment: {
          module: device.createShaderModule({
            code: `
              @group(0) @binding(0) var extTex: texture_external;
              @group(0) @binding(1) var mySampler: sampler;

              @fragment
              fn main(@builtin(position) fragCoord: vec4<f32>) -> @location(0) vec4<f32> {
                let texCoord = fragCoord.xy / vec2<f32>(${canvas.width}.0, ${canvas.height}.0);
                return textureSampleBaseClampToEdge(extTex, mySampler, texCoord);
              }
            `
          }),
          entryPoint: 'main',
          targets: [{ format }]
        },
        primitive: { topology: 'triangle-list' }
      });

      // Create bind group
      const bindGroup = device.createBindGroup({
        layout: bindGroupLayout,
        entries: [
          {
            binding: 0,
            resource: externalTexture
          },
          {
            binding: 1,
            resource: device.createSampler()
          }
        ]
      });

      // Create command encoder and render pass
      const commandEncoder = device.createCommandEncoder();
      const textureView = context.getCurrentTexture().createView();
      const renderPass = commandEncoder.beginRenderPass({
        colorAttachments: [
          {
            view: textureView,
            clearValue: { r: 0.0, g: 0.0, b: 0.0, a: 1.0 },
            loadOp: 'clear',
            storeOp: 'store'
          }
        ]
      });

      // Set pipeline and bind group
      renderPass.setPipeline(pipeline);
      renderPass.setBindGroup(0, bindGroup);
      renderPass.draw(6); // Draw a rectangle composed of two triangles
      renderPass.end();

      // Submit commands
      device.queue.submit([commandEncoder.finish()]);
    } catch (error) {
      console.error('Rendering error:', error);
    }
  };
};
BIN
spec/fixtures/api/shared-texture/image.png
vendored
Normal file
Binary file not shown.
After Width: | Height: | Size: 873 B
12
spec/fixtures/api/shared-texture/managed/frame.html
vendored
Normal file
@@ -0,0 +1,12 @@
<!DOCTYPE html>
<html lang="en">

<head>
  <meta charset="UTF-8">
</head>

<body style="margin: 0;">
  <iframe src="index.html" style="width: 100%; height: 100%; border: none;"></iframe>
</body>

</html>
14
spec/fixtures/api/shared-texture/managed/index.html
vendored
Normal file
@@ -0,0 +1,14 @@
<!DOCTYPE html>
<html lang="en">

<head>
  <meta charset="UTF-8">
  <title>Hello World!</title>
  <script src="../common.js" defer></script>
  <script src="renderer.js" defer></script>
</head>

<body>
</body>

</html>
28
spec/fixtures/api/shared-texture/managed/preload.js
vendored
Normal file
@@ -0,0 +1,28 @@
const { sharedTexture } = require('electron');
const { ipcRenderer, contextBridge } = require('electron/renderer');

contextBridge.exposeInMainWorld('textures', {
  onSharedTexture: (cb) => {
    // Step 0: Register the receiver for transferred shared texture
    sharedTexture.setSharedTextureReceiver(async (data) => {
      // Step 5: Receive the imported shared texture
      const { importedSharedTexture: imported } = data;
      await cb(imported);

      // Release the imported shared texture since we're done.
      // No callback is needed here; the utility automatically manages
      // the lifetime via sync tokens.
      imported.release();
    });
  },
  webGpuUnavailable: () => {
    ipcRenderer.send('webgpu-unavailable');
  },
  verifyCapturedImage: (verify) => {
    ipcRenderer.on('verify-captured-image', (e, images) => {
      verify(images, (result) => {
        ipcRenderer.send('verify-captured-image-done', result);
      });
    });
  }
});
21
spec/fixtures/api/shared-texture/managed/renderer.js
vendored
Normal file
@@ -0,0 +1,21 @@
window.initWebGpu().catch((err) => {
  console.error('Failed to initialize WebGPU:', err);
  window.textures.webGpuUnavailable();
});

window.textures.onSharedTexture(async (imported) => {
  try {
    // Step 6: Get VideoFrame from the imported texture
    const frame = imported.getVideoFrame();
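    // The VideoFrame is backed by the imported GPU texture, so WebGPU can
    // consume it directly via importExternalTexture() in renderFrame().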

    // Step 7: Render using WebGPU
    await window.renderFrame(frame);

    // Step 8: Release the VideoFrame as we no longer need it
    frame.close();
  } catch (error) {
    console.error('Error getting VideoFrame:', error);
  }
});

window.textures.verifyCapturedImage(window.verifyCapturedImage);
13
spec/fixtures/api/shared-texture/osr.html
vendored
Normal file
@@ -0,0 +1,13 @@
<!DOCTYPE html>
<html lang="en">

<head>
  <meta charset="UTF-8">
  <title>Hello World!</title>
</head>

<body>
  <img src="image.png" style="width: 128px; height: 128px; position: absolute; left: 0; top: 0;">
</body>

</html>
14
spec/fixtures/api/shared-texture/subtle/index.html
vendored
Normal file
@@ -0,0 +1,14 @@
<!DOCTYPE html>
<html lang="en">

<head>
  <meta charset="UTF-8">
  <title>Hello World!</title>
  <script src="../common.js" defer></script>
  <script src="renderer.js" defer></script>
</head>

<body>
</body>

</html>
30
spec/fixtures/api/shared-texture/subtle/preload.js
vendored
Normal file
@@ -0,0 +1,30 @@
const { sharedTexture } = require('electron');
const { ipcRenderer, contextBridge } = require('electron/renderer');

contextBridge.exposeInMainWorld('textures', {
  onSharedTexture: (cb) => {
    ipcRenderer.on('shared-texture', async (e, id, transfer) => {
      // Step 5: Get the shared texture from the transfer
      const importedSubtle = sharedTexture.subtle.finishTransferSharedTexture(transfer);
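      // finishTransferSharedTexture() re-imports the shared image in this
      // renderer; the main process keeps its own reference until it calls
      // release() on its side.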

      // Step 6: Let the renderer render using WebGPU
      await cb(id, importedSubtle);

      // Step 10: Release the shared texture with a callback
      importedSubtle.release(() => {
        // Step 11: When the GPU command buffer is done, notify the main process to release
        ipcRenderer.send('shared-texture-done', id);
      });
    });
  },
  webGpuUnavailable: () => {
    ipcRenderer.send('webgpu-unavailable');
  },
  verifyCapturedImage: (verify) => {
    ipcRenderer.on('verify-captured-image', (e, images) => {
      verify(images, (result) => {
        ipcRenderer.send('verify-captured-image-done', result);
      });
    });
  }
});
21
spec/fixtures/api/shared-texture/subtle/renderer.js
vendored
Normal file
@@ -0,0 +1,21 @@
window.initWebGpu().catch((err) => {
  console.error('Failed to initialize WebGPU:', err);
  window.textures.webGpuUnavailable();
});

window.textures.onSharedTexture(async (id, importedSubtle) => {
  try {
    // Step 7: Get VideoFrame from the imported texture
    const frame = importedSubtle.getVideoFrame();

    // Step 8: Render using WebGPU
    await window.renderFrame(frame);

    // Step 9: Release the VideoFrame as we no longer need it
    frame.close();
  } catch (error) {
    console.error('Error getting VideoFrame:', error);
  }
});

window.textures.verifyCapturedImage(window.verifyCapturedImage);
1
typings/internal-ambient.d.ts
vendored
@@ -223,6 +223,7 @@ declare namespace NodeJS {
    _linkedBinding(name: 'electron_common_environment'): EnvironmentBinding;
    _linkedBinding(name: 'electron_common_features'): FeaturesBinding;
    _linkedBinding(name: 'electron_common_native_image'): { nativeImage: typeof Electron.NativeImage };
    _linkedBinding(name: 'electron_common_shared_texture'): Electron.SharedTextureSubtle;
    _linkedBinding(name: 'electron_common_net'): NetBinding;
    _linkedBinding(name: 'electron_common_shell'): Electron.Shell;
    _linkedBinding(name: 'electron_common_v8_util'): V8UtilBinding;