- From: Jan-Ivar Bruaroey <notifications@github.com>
- Date: Mon, 13 Dec 2021 15:52:40 -0800
- To: whatwg/streams <streams@noreply.github.com>
- Cc: Subscribed <subscribed@noreply.github.com>
- Message-ID: <whatwg/streams/pull/1193/review/830772787@github.com>
@jan-ivar commented on this pull request.
Content looks great! Just a question on mentioning WebIDL, and nits. I also think we should show workers in the example, since that's what we've spec'ed (see suggestion).
> +* Performing realtime transformations of `VideoFrame` objects, for instance taking a camera `MediaStreamTrack` and applying
+ a background blur effect as a `TransformStream` on each `VideoFrame` of the `MediaStreamTrack`.
+
+## End-user benefit
+
+* `VideoFrame` needs specific management and must be closed as quickly as possible, without relying on garbage collection.
+ This is important to not create hangs/stutters in the processing pipeline. By building support for safe patterns
+ directly in streams, this will allow web developers to optimize `VideoFrame` management, and allow the user experience
+ to be more consistent across devices.
+
+## Principles
+
+The envisioned changes to the streams specification could look like the following:
+* Add a new 'transfer' value that can be passed to `ReadableStream` type, `WritableStream` type and `TransformStream` readableType/writableType.
+ For streams that do not use the 'transfer' type, nothing changes.
+* Streams of the 'transfer' type can only manipulate that are marked both as Transferable and Serializable.
```suggestion
* Streams of the 'transfer' type can only manipulate objects that are marked both as Transferable and Serializable.
```
I was also wondering if a `s/object/chunk/` throughout would work for context?
> +* `VideoFrame` needs specific management and must be closed as quickly as possible, without relying on garbage collection.
+ This is important to not create hangs/stutters in the processing pipeline. By building support for safe patterns
+ directly in streams, this will allow web developers to optimize `VideoFrame` management, and allow the user experience
+ to be more consistent across devices.
+
+## Principles
+
+The envisioned changes to the streams specification could look like the following:
+* Add a new 'transfer' value that can be passed to `ReadableStream` type, `WritableStream` type and `TransformStream` readableType/writableType.
+ For streams that do not use the 'transfer' type, nothing changes.
+* Streams of the 'transfer' type can only manipulate that are marked both as Transferable and Serializable.
+* If an object that is either not Transferable or not Serializable is enqueued or written, the object is ignored as if it was never enqueued/written.
+* If a Transferable and Serializable object is enqueued/written in a 'transfer' type `ReadableStreamDefaultController`, `TransformStreamDefaultController`
+ or `WritableStreamDefaultWriter`, create a transferred version of the object using StructuredSerializeWithTransfer/StructuredDeserializeWithTransfer.
+ Proceed with the regular stream algorithm by using the transferred object instead of the object itself.
+* Introduce a WhatWG streams 'close-able' concept. An object that is 'close-able' defines closing steps.
Didn't we talk about "Introduce a WebIDL concept" last meeting?
Also, "closeable" or "closable"? 😬 Google [verbatim](https://www.google.com/search?q=closeable&tbs=li:1) thinks the latter
```suggestion
* Introduce a WebIDL [Closable] concept. An object that is [Closable] defines closing steps.
```
<img width="849" alt="image" src="https://user-images.githubusercontent.com/3136226/145897562-e894eb22-6764-45e8-8d97-9db90b6c2a07.png">
> @@ -0,0 +1,124 @@
+# Transfering Ownership Streams Explained
```suggestion
# Transferring Ownership Streams Explained
```
> + // At this point, videoFrame has been transfered within controller.enqueue call. frameCountTransform cannot mutate it.
+ if (!(++frameCount % 30) && frameCountTransform.onEach30Frame)
+ frameCountTransform.onEach30Frame(frameCount);
+ } catch {
+ videoFrame.close();
+ }
It cannot mutate it, because it cannot even access it, is that right?
Also, don't we need to rethrow in the catch, to not encourage overriding default error handling?
I also worry this risks calling `close` on a transferred object, or is that benign? Seems confusing given the comments. How about:
```suggestion
} catch (e) {
videoFrame.close();
throw e;
}
// At this point, videoFrame has been transferred with controller.enqueue. frameCountTransform cannot access it.
if (!(++frameCount % 30))
frameCountTransform.onEach30Frame?.(frameCount);
```
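For readers without the full diff handy, a sketch of the surrounding transform with that rethrow applied (the onEach30Frame hook and frame counting are from the explainer; the rest is illustrative):

```javascript
// Illustrative sketch, not the explainer's exact code.
let frameCount = 0;
const frameCountTransform = new TransformStream({
  readableType: 'transfer',
  writableType: 'transfer',
  transform(videoFrame, controller) {
    try {
      controller.enqueue(videoFrame);
    } catch (e) {
      videoFrame.close();            // enqueue failed, so we still own the frame
      throw e;                       // preserve default error handling
    }
    // videoFrame has been transferred by controller.enqueue; it cannot be accessed here.
    if (!(++frameCount % 30))
      frameCountTransform.onEach30Frame?.(frameCount);
  }
});
```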
> +If A does not close `VideoFrame`, it is the responsibility of the rest of the pipeline to close it.
+There is a need to clearly identify who owns these objects and who is responsible for closing them at any point in time.
+
+The proposed solution is to transfer ownership of an object to the stream when it gets written or enqueued to the stream.
+By doing so, the processing unit that enqueues/writes objects will not be able to mutate objects manipulated by the stream and is relieved of the lifetime management of these objects.
+Conversely, processing units take ownership of objects when they receive them from a stream.
+
+Transferring ownership should be opt-in. For that purpose, a new streams type, named 'transfer' in this document, would be added.
+
+## Example
+
+Below is an example of JavaScript that shows how this can be used.
+The example creates a processing pipeline starting with a camera stream and applying two transforms, one that does some processing every 30 frames, and one that applies background blur.
+
+```javascript
+// JS transform
I think it's important we lead with worker examples from the get-go.
```suggestion
// worker.js
```
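And for completeness, a rough sketch of what backgroundBlurTransform might look like inside worker.js (applyBackgroundBlur is a made-up helper; the 'transfer' options are the proposal from the Principles section):

```javascript
// Illustrative sketch: applyBackgroundBlur is hypothetical, and the 'transfer'
// readableType/writableType options are only proposed, not specced.
const backgroundBlurTransform = new TransformStream({
  readableType: 'transfer',
  writableType: 'transfer',
  async transform(videoFrame, controller) {
    const blurredFrame = await applyBackgroundBlur(videoFrame); // produces a new VideoFrame
    videoFrame.close();                // this transform owns the input frame; close it promptly
    controller.enqueue(blurredFrame);  // ownership of the new frame moves to the stream
  }
});
```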
> +const cameraStream = await navigator.mediaDevices.getUserMedia({ video : true });
+const videoFrameStream = getReadableStreamFromTrack(cameraStream.getVideoTracks()[0]);
+const blurredVideoFrameStream = videoFrameStream.pipeThrough(backgroundBlurTransform)
+ .pipeThrough(frameCountTransform);
+const blurredStream = new MediaStream([getTrackFromReadableStream(blurredVideoFrameStream)]);
+// Make use of blurredStream.
Use workers part 2. Also, mediacapture-transform is (almost as of this writing) adopted, so I think we can be specific now:
```suggestion
self.onmessage = async ({data: {cameraTrack}}) => {
const {readable} = new MediaStreamTrackProcessor({track: cameraTrack});
const {writable, track} = new VideoTrackGenerator();
self.postMessage({track}, [track]);
await readable.pipeThrough(backgroundBlurTransform).pipeThrough(frameCountTransform).pipeTo(writable);
}
// main.js
const [cameraTrack] = (await navigator.mediaDevices.getUserMedia({video: true})).getVideoTracks();
const worker = new Worker('worker.js');
worker.postMessage({cameraTrack}, [cameraTrack]);
const {data} = await new Promise(r => worker.onmessage = r);
const blurredStream = new MediaStream([data.track]);
video.srcObject = blurredStream;
```
--
You are receiving this because you are subscribed to this thread.
Reply to this email directly or view it on GitHub:
https://github.com/whatwg/streams/pull/1193#pullrequestreview-830772787
Received on Monday, 13 December 2021 23:52:54 UTC