- From: Harald Alvestrand <harald@alvestrand.no>
- Date: Thu, 15 Mar 2012 18:03:24 +0100
- To: public-media-capture@w3.org
- Message-ID: <4F6220DC.5080208@alvestrand.no>
On 03/15/2012 12:40 AM, Robert O'Callahan wrote:
> On Tue, Mar 13, 2012 at 5:01 PM, Randell Jesup <randell-ietf@jesup.org> wrote:
>
> On 3/13/2012 4:17 AM, Anant Narayanan wrote:
>
> I'm trying to avoid a dependency on MediaStreams for the
> particular case where all the web page wants is a single image
> from the user's camera. Profile pictures, QR codes, there
> might be more…
>
>
> But you almost always want a preview, and that often needs to be
> app-specific, even for profile pictures and QR codes, so I think
> MediaStreams are usually needed anyway.
>
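For the preview case, that's essentially the plain getUserMedia pattern; a minimal
sketch, assuming the callback form of getUserMedia in the current draft and that the
resulting stream can be attached to a <video> element via createObjectURL:

    // Ask for a camera stream and show a live preview so the user can
    // frame the shot before the page grabs a still image.
    navigator.getUserMedia({ video: true }, function (stream) {
        var preview = document.querySelector("video");
        preview.src = URL.createObjectURL(stream);
        preview.play();
    }, function (error) {
        console.log("getUserMedia failed: " + error);
    });
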
> I don't think we want to propagate events along MediaStreams. That
> doesn't seem necessary.
>
> I think it would make sense to have VideoMediaStreamTrack (like
> AudioMediaStreamTrack already), and APIs on video tracks to request
> setting of various camera parameters (resolution, focus mode, etc)
> (async of course), and an API to request a snapshot with/without flash
> and various other parameters set momentarily (also async). The
> snapshot API should probably trigger a callback passing a Blob which
> can be used as an <img> source.
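To make sure I'm reading that right, a rough sketch of the calling side; every method
and option name below is made up, only to illustrate the shape roc describes (async
parameter setting on the video track, then an async snapshot that hands the page a
Blob it can use as an <img> source):

    // "stream" obtained from getUserMedia earlier; all names here are hypothetical.
    var track = stream.videoTracks[0];    // a VideoMediaStreamTrack

    // Ask the camera to apply some settings; completion is asynchronous.
    track.setCameraParameters({ focusMode: "auto", width: 1280, height: 720 },
        function () {
            // Grab one frame, with the flash fired just for this shot,
            // and get the result back as a Blob.
            track.takeSnapshot({ flash: true }, function (blob) {
                document.querySelector("img").src = URL.createObjectURL(blob);
            });
        });
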
I kind of dislike talking about the flash property of a
VideoMediaStreamTrack. It's another one of those things that don't make
sense in the 90% case (a remote stream, a stream from a file, a stream from a
video camera without a flash).
If we adopt Anant's proposal with a specific "head of chain" object, I
think the flash belongs up there.
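Very roughly, and only to illustrate where I think that knob belongs (again, all
names made up), the split would be something like:

    // Hypothetical: hardware-only controls such as flash live on the
    // source ("head of chain") object from Anant's proposal.
    camera.setFlashMode("auto", onSuccess, onError);

    // The track-level snapshot then works the same for any source:
    // local camera, remote stream, or playback from a file.
    track.takeSnapshot(function (blob) { /* ... */ });
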
Harald