- From: Adam Pietrasiak via GitHub <sysbot+gh@w3.org>
- Date: Mon, 09 Jan 2023 20:13:48 +0000
- To: public-webrtc-logs@w3.org
My use case is rendering a canvas-based project frame by frame.
In general the flow is: prepare the WebGL stage and render it, capture the frame, advance to the next frame, repeat.
Currently I use the readPixels API, but it is the biggest bottleneck in the rendering pipeline, as it requires pixels to be sent from the GPU to the CPU. It takes ~80% of the render time.
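For context, here is roughly what the readPixels path looks like (a simplified sketch; the surrounding encode step is my own code and not shown):

```ts
// Simplified sketch of the readPixels-based capture I am trying to replace.
function captureFrameWithReadPixels(gl: WebGL2RenderingContext): Uint8Array {
  const width = gl.drawingBufferWidth;
  const height = gl.drawingBufferHeight;
  const pixels = new Uint8Array(width * height * 4);

  // This call blocks until the GPU finishes and the pixels are copied to CPU
  // memory - this transfer is where ~80% of my render time goes.
  gl.readPixels(0, 0, width, height, gl.RGBA, gl.UNSIGNED_BYTE, pixels);

  return pixels;
}
```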
So I was looking for another method that avoids that transfer, and https://stackoverflow.com/questions/58907270/record-at-constant-fps-with-canvascapturemediastream-even-on-slow-computers/58969196#58969196 looked very promising.
I created my recorder like this:
```ts
import { waitForElementEvent } from "@s/shared/dom/events";
import { wait } from "@s/shared/time";
const waitForRecorderEvent = waitForElementEvent<keyof MediaRecorderEventMap>;

export function createCanvasRecorder(source: HTMLCanvasElement, fps: number) {
  // Mirror the source canvas onto a 2D canvas so frames are pushed only on demand.
  const target = source.cloneNode() as HTMLCanvasElement;
  const ctx = target.getContext("2d")!;
  ctx.drawImage(source, 0, 0);

  // frameRate of 0 means frames are emitted only when requestFrame() is called.
  const stream = target.captureStream(0);
  const track = stream.getVideoTracks()[0] as CanvasCaptureMediaStreamTrack;

  const recorder = new MediaRecorder(stream, {
    mimeType: "video/webm;codecs=H264",
  });

  const dataChunks: Blob[] = [];
  recorder.ondataavailable = (evt) => dataChunks.push(evt.data);

  // Start immediately, then pause - the recorder only runs while a frame is being captured.
  recorder.start();
  recorder.pause();

  return {
    async captureFrame() {
      const timer = wait(1000 / fps);

      recorder.resume();
      console.log("did resume");

      ctx.clearRect(0, 0, target.width, target.height);
      ctx.drawImage(source, 0, 0);
      track.requestFrame();

      // Keep the recorder running for one frame's worth of wall-clock time so
      // the captured frame ends up with the right duration in the output.
      await timer;

      recorder.pause();
      console.log("did pause");
    },
    async finish() {
      recorder.stop();
      stream.getTracks().forEach((track) => track.stop());
      await waitForRecorderEvent(recorder, "stop");
      return new Blob(dataChunks);
    },
  };
}
```
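Roughly how I drive it from my export loop (a simplified sketch; `renderFrame` stands in for my actual WebGL render step):

```ts
// Hypothetical render step - in reality this is my WebGL pipeline.
declare function renderFrame(frameIndex: number): void;

async function exportVideo(source: HTMLCanvasElement, totalFrames: number, fps: number) {
  const recorder = createCanvasRecorder(source, fps);

  for (let frame = 0; frame < totalFrames; frame++) {
    renderFrame(frame); // takes far less than 1/fps
    await recorder.captureFrame(); // but always waits a full 1/fps of wall-clock time
  }

  return recorder.finish(); // Blob with the webm data
}
```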
This seems to work. The catch is that I have to wait one frame's worth of time (e.g. 1/60th of a second) before moving on to the next frame, and that wait is now the biggest bottleneck: WebGL rendering takes far less than one frame time, so technically I should be able to export videos faster than their actual duration.
Since I cannot tell MediaRecorder 'capture this frame and make it 1/60 s long in the final video', I have to wait 1/60 s just to feed the recorder for long enough, even though my frame is ready and the next one could already be rendered.
As a result, when exporting a 2-minute video, if I have to wait for each frame the export can never take less than 2 minutes, whereas without that wait WebGL can iterate over all the frames and render them in about 10 seconds, which is 12x faster.
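What I would like to be able to express is something like the following (purely hypothetical - a `duration` option is not part of any current spec, it just illustrates the missing control):

```ts
// Hypothetical sketch only - requestFrame() takes no arguments today.
// The control I am missing: stamp the captured frame with an explicit
// duration so the export loop never has to wait in real time.
type TimedCaptureTrack = CanvasCaptureMediaStreamTrack & {
  requestFrame(options?: { duration?: number }): void; // hypothetical overload
};

declare const track: TimedCaptureTrack; // the canvas capture track from above

track.requestFrame({ duration: 1000 / 60 }); // "this frame lasts 1/60 s in the output"
```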
--
GitHub Notification of comment by pie6k
Please view or discuss this issue at https://github.com/w3c/mediacapture-fromelement/issues/28#issuecomment-1376253766 using your GitHub account