Re: Mediastream Recording implementation

On Tue, Jul 16, 2013 at 3:54 PM, Robert O'Callahan <robert@ocallahan.org> wrote:

> On Wed, Jul 17, 2013 at 10:33 AM, Greg Billock <gbillock@google.com> wrote:
>
>> For access to streaming output of the encoder, ArrayBuffer seems to me
>> more appropriate. It's compatible with XHR2, File, and direct manipulation.
>>
>
> So are Blobs.
>

I think a bigger issue is what else you can do with Blobs, like
createObjectURL. What should that return for a Blob which is a slice of
encoded stream data? How will clients detect that such Blob URLs are
suitable for some uses but not others? It's currently very easy to go from
ArrayBuffer to Blob, but more involved to do the reverse.
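
For concreteness, here's roughly what that asymmetry looks like in script
today, assuming `buffer` is an ArrayBuffer the app already holds (the
'video/webm' type is just an illustrative choice):

    // ArrayBuffer -> Blob: one synchronous constructor call.
    var blob = new Blob([buffer], { type: 'video/webm' });

    // Blob -> ArrayBuffer: an asynchronous FileReader round trip.
    var reader = new FileReader();
    reader.onload = function () {
      var bytes = reader.result;  // ArrayBuffer
    };
    reader.readAsArrayBuffer(blob);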


> I actually think Blobs (memory-backed) can be more efficient than
> ArrayBuffers for streaming (via Websockets for example), because we can
> share the Blob's memory buffer between processes without copying or
> breaking semantics.
>

That's a good point. If the goal is to stream these slices out somewhere,
this could be important. But if the app is handling the streaming data, it
seems to me that low-latency access to that data is the most important
thing, and while some apps may then want Blob semantics, those are just a
new Blob([arrayBuffer]) away.
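
To sketch what I mean (the recorder event shape below is just the
ArrayBuffer-delivery model I'm arguing for, with invented names, not
settled API):

    var chunks = [];
    recorder.ondataavailable = function (event) {
      chunks.push(event.data);        // ArrayBuffer slice of encoded data
      handleLowLatency(event.data);   // hypothetical app-level consumer
    };
    recorder.onstop = function () {
      // Blob semantics on demand, only when the app wants them:
      var blob = new Blob(chunks, { type: 'video/webm' });
    };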


> I also think that it's simpler to have a single data delivery event that
> produces a Blob, than to have separate events one of which produces a Blob
> and the other which produces an ArrayBuffer.
>

But that's an artifact of the API design. There could just as easily be two
methods covering these two use cases. If the use cases are fairly
different, then JS parameters are not a very strong overloading signal; if
they're fairly similar, then a single event is a good solution. So I think
this just restates the nub of the problem: finding the right intuition
about how the API should present full-content versus sliced encoded data.
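
A hypothetical two-method shape might look like this (every name here is
invented for illustration, not taken from the draft):

    // Full-content use case: one complete, playable Blob at the end.
    recorder.record();
    recorder.onstop = function (event) {
      player.src = URL.createObjectURL(event.data);  // full container, safe
    };

    // Sliced use case: each timeslice delivered as an ArrayBuffer.
    recorder.stream(100, function (chunk) {  // 100 ms per slice
      websocket.send(chunk);                 // raw encoded bytes
    });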

Looking elsewhere in WebRTC, you can call createObjectURL on a MediaStream,
then go through a video element and a canvas to get bytes out as
ArrayBuffer-backed types. Web Audio's decodeAudioData method takes an
ArrayBuffer, which you're recommended to get from XHR using
responseType="arraybuffer", and its buffers are organized around
ArrayBuffer. Ultimately, a lot rides on the perception of whether encoded
byte slices are "raw" -- they certainly aren't mutable in anything like the
way raw video frames or audio slices are.
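
For reference, a sketch of those existing paths, assuming `stream` came
from getUserMedia, a video and canvas element on the page, and a served
audio clip (the element selectors and clip URL are placeholders):

    var video = document.querySelector('video');
    var canvas = document.querySelector('canvas');
    var context = canvas.getContext('2d');

    video.src = URL.createObjectURL(stream);  // webkitURL in some builds
    video.play();

    function grabFrame() {
      context.drawImage(video, 0, 0, canvas.width, canvas.height);
      // getImageData gives a Uint8ClampedArray view over an ArrayBuffer
      return context.getImageData(0, 0, canvas.width, canvas.height).data.buffer;
    }

    // Web Audio: encoded bytes in, via XHR with responseType="arraybuffer".
    var audioContext = new AudioContext();  // webkitAudioContext in some builds
    var xhr = new XMLHttpRequest();
    xhr.open('GET', 'clip.ogg', true);
    xhr.responseType = 'arraybuffer';
    xhr.onload = function () {
      audioContext.decodeAudioData(xhr.response, function (decoded) {
        // decoded channels are Float32Arrays backed by ArrayBuffers
      });
    };
    xhr.send();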

Received on Tuesday, 16 July 2013 23:44:03 UTC