
Re: Rendering on a <canvas>

From: Charles Pritchard <chuck@jumis.com>
Date: Mon, 30 Apr 2012 21:03:32 -0600
Message-Id: <4B620193-B986-464C-8F03-5F17059837F2@jumis.com>
Cc: Paul Neave <paul.neave@gmail.com>, Anant Narayanan <anant@mozilla.com>, "public-media-capture@w3.org" <public-media-capture@w3.org>
To: Charles Pritchard <chuck@jumis.com>
Little addendum -- I may be off in my WebKit observation. Chromium has pushed the Canvas GPU path, and it sounds like it has a four-frame lag.



On Apr 30, 2012, at 8:50 PM, Charles Pritchard <chuck@jumis.com> wrote:

> It seems that in all cases the issue is getting the frames via a callback.
> 
> Again, the media stream processing and workers from RoC set up some ideas.
> 
> From what I've seen, WebKit has heavily pushed Canvas into GPU, despite the hazards. So, we're nearly there -- a drawImage call as well as a texture reference should be within reach.
> 
> But for us Canvas folk, we still need a way in to do our draws and blends.
> 
> Shaders and processing workers are fine for pixel-tweaks, but we need 2d for text and path overlays.
> 
> Also, keep in mind that multiple streams may be a real use case; I think we just need that onframeready callback on video. Everything else is already in the GPU pipeline.
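A sketch of the per-frame callback being asked for here. Note that `onframeready` is not specified anywhere; the name and shape are purely illustrative of how a 2D text/path overlay could be drawn as each frame arrives.

```javascript
// Hypothetical API sketch: nothing like onframeready exists in any spec.
// It only illustrates the per-frame hook Charles is describing, which lets
// a 2D context composite overlays on top of each decoded video frame.
function attachOverlay(video, ctx) {
  video.onframeready = () => {         // hypothetical per-frame handler
    ctx.drawImage(video, 0, 0);        // copy the newly decoded frame
    ctx.strokeStyle = '#fff';
    ctx.strokeText('REC', 10, 20);     // 2D text overlay on top of the frame
  };
}
```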
> 
> 
> 
> On Apr 30, 2012, at 2:57 PM, Paul Neave <paul.neave@gmail.com> wrote:
> 
>> Hi Anant,
>> 
>> In principle this sounds great. Do you have any idea what performance increase you would expect to gain? Perhaps the browser could interpret a drawImage(video…) call and make it more efficient by inferring that a video element is being drawn, and instead access the video's stream rather than drawing the element the standard way.
>> 
>> Also, would it be possible to stream a video into a WebGL context as well as a 2d context?
>> 
>> I already have a WebGL web app that uses getUserMedia and updates the context via a requestAnimationFrame loop: http://neave.com/webcam/html5/
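The pattern Paul describes can be sketched roughly as below, here with a 2D context for simplicity. The promise-based `navigator.mediaDevices.getUserMedia` form is a modernization; in 2012 the call was vendor-prefixed (e.g. `navigator.webkitGetUserMedia`). The `fitContain` helper is an illustrative addition for preserving the webcam's aspect ratio.

```javascript
// Pure helper: scale a source into a destination while preserving aspect
// ratio ("contain" fit), returning the centered draw rectangle.
function fitContain(srcW, srcH, dstW, dstH) {
  const scale = Math.min(dstW / srcW, dstH / srcH);
  const w = srcW * scale, h = srcH * scale;
  return { x: (dstW - w) / 2, y: (dstH - h) / 2, w, h };
}

// Browser-only wiring (assumes a <canvas> element is supplied by the page):
// pump webcam frames into the canvas on every display refresh.
function startWebcam(canvas) {
  const ctx = canvas.getContext('2d');
  const video = document.createElement('video');
  return navigator.mediaDevices.getUserMedia({ video: true })
    .then((stream) => {
      video.srcObject = stream;
      return video.play();           // metadata is loaded once this resolves
    })
    .then(() => {
      (function pump() {
        const r = fitContain(video.videoWidth, video.videoHeight,
                             canvas.width, canvas.height);
        ctx.drawImage(video, r.x, r.y, r.w, r.h);
        requestAnimationFrame(pump); // redraw on each display refresh
      })();
    });
}
```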
>> 
>> Paul.
>> 
>> 
>> On Monday, 30 April 2012 at 18:57, Anant Narayanan wrote:
>> 
>>> Hi all,
>>> 
>>> As we try to resolve the <video> assignment issue, I'd like to begin the  
>>> <canvas> discussion in parallel.
>>> 
>>> I propose that we allow direct assignment of a MediaStream to a <canvas>  
>>> object, like so:
>>> 
>>> let stream = [MediaStream obtained by some means]
>>> let ctx = document.getElementById('canvas').getContext('2d');
>>> ctx.stream = stream;
>>> 
>>> If we decide to go with the URL approach for <video> we can change this  
>>> API to match. The key point though, is to allow a <canvas> to be a  
>>> *direct* recipient of video data from a MediaStream.
>>> 
>>> It is possible to do this without explicit support from the getUserMedia  
>>> spec:
>>> 
>>> let canvas, video; [DOM objects preset]
>>> canvas.getContext('2d').drawImage(video, x, y, w, h, offsets...);
>>> 
>>> However, the developer will have to call drawImage on a DOM timer. It is  
>>> much more efficient and cleaner for the MediaStream to manipulate the  
>>> <canvas> directly.
>>> 
>>> It is a little weird to have a canvas in the DOM changing constantly (at  
>>> the frame rate of the MediaStream), but I think the benefits outweigh  
>>> the drawbacks.
>>> 
>>> Look forward to your feedback!
>>> 
>>> Regards,
>>> -Anant
>> 
>> 
>> 
>> 
> 
Received on Tuesday, 1 May 2012 03:03:59 GMT

This archive was generated by hypermail 2.3.1 : Tuesday, 26 March 2013 16:14:59 GMT