
Re: [Bug 23220] Add 'zoom' constraint to VideoStreamTrack

From: Rob Manson <roBman@mob-labs.com>
Date: Wed, 02 Oct 2013 15:13:51 +1000
Message-ID: <524BAB8F.5050708@mob-labs.com>
To: public-media-capture@w3.org
Hi Harald,

> Just a nit - I'd strongly object to using the name "Stream". The term
> has been horribly abused in so many ways, starting with Unix System 7's
> "Stream" model (the "solution for which Berkeley Sockets was a quick hack").

Fair point.


> Pairing "Stream" with some modifier preserves a modicum of sanity.

Where "some" is tiny and eroding 8)


> (I know of one effort that is using "Stream" as "stream of bytes", aka
> "the TCP model", and I think there's also one using "Stream" as "stream
> of records", aka "the OSI Transport model". Neither is a media stream
> (or a MediaStream), of course. A plague on both their houses' names, I say.)

Heh.

There's also this[1], but I'm really not clear on how, or whether, it is 
actually being pursued.  Does anyone here have any thoughts/comments on 
that (yes, I know it's a webapps thing, and I have discussed it with 
people on that list)?

This related discussion is evolving too[2].

I'm not really too concerned at the moment with the name of this 
concept.  But I think the key point is that some type of underlying 
"stream of bits" architecture would be really useful.

These new types of applications and use cases are pushing the existing 
browser architectures to their limits and at some point I think it would 
be good to step back and look at this with fresh eyes and with these 
realtime binary flows in mind.

For me one of the key things that might allow us to optimise these 
horrifically inefficient pipelines[3] is the use of ArrayBuffers.  Then 
we can just use Typed Arrays/Array Buffer Views to manipulate the 
underlying bit streams without expensive copies (kind of related to the 
transferable discussions/optimisations).

But as one of the implementors pointed out, depending upon the device 
you may want the ImageData/Texture info to live on either the GPU or the 
CPU, and if this isn't handled correctly a lot of the benefits evaporate.

Any thoughts anyone has on this are very welcome.

roBman


[1] http://www.w3.org/TR/streams-api/
[2] http://lists.w3.org/Archives/Public/public-webapps/2013AprJun/0706.html
[3] https://github.com/buildar/getting_started_with_webrtc/#image_processing_pipelinehtml
NOTE: We also have a version of this that works on the GPU using WebGLSL 
that we'll be publishing soon.
Received on Wednesday, 2 October 2013 05:14:19 UTC
