
Re: Rationalizing new/start/end/mute/unmute/enabled/disabled

From: Stefan Håkansson LK <stefan.lk.hakansson@ericsson.com>
Date: Tue, 9 Apr 2013 12:49:58 +0200
Message-ID: <5163F256.40803@ericsson.com>
To: public-media-capture@w3.org
On 4/9/13 12:26 PM, Robert O'Callahan wrote:
> On Tue, Apr 9, 2013 at 8:02 PM, Harald Alvestrand <harald@alvestrand.no
> <mailto:harald@alvestrand.no>> wrote:
>
>     On 04/09/2013 12:16 AM, Robert O'Callahan wrote:
>>     On Tue, Apr 9, 2013 at 12:43 AM, Stefan Håkansson LK
>>     <stefan.lk.hakansson@ericsson.com
>>     <mailto:stefan.lk.hakansson@ericsson.com>> wrote:
>>
>>
>>             All tracks that we can decode. So e.g. if you play a
>>             resource with a
>>             video track in an <audio> element and capture that to a
>>             MediaStream, the
>>             MediaStream contains the video track.
>>
>>
>>         What if there are two video tracks? Only one of them is
>>         selected/played naturally, but in principle both could be
>>         decoded. (What I am saying is that we need to spec this up).
>>
>>
>>     Definitely. Yes, I think we should decode them both.
>     Not sure I get where this is coming from....
>
>     I see absolutely no reason to decode a video stream until we know
>     where it's going.
>     The destination might be another PeerConnection with a compatibly
>     negotiated codec, or a hardware device with special codec support,
>     or a Recorder willing to store the bytes as previously encoded.
>
>     I think we should carefully *avoid* specifying exactly where and
>     when decoding takes place. Only the observable result should be in
>     the standard.
>
>
> Sorry, yes, I totally agree. I didn't mean to suggest that data should
> be decompressed. I just meant to say that both video tracks should be
> obtained from the resource and be present in the MediaStream.
>
>     I don't want to go there at this time.
>
>     We seriously run the risk of defining a toolkit that is able to do a
>     set of tasks that nobody wants done (because they're not satisfied
>     with the result), placing a heavy burden on browser implementors for
>     very marginal benefit.
>
>
>     We already support putting a video on a Canvas (in a somewhat
>     cumbersome way, true). Should we focus on getting the video back out
>     of the Canvas, and let people play with that toolkit before we start
>     in on defining smaller toolkits that might not be able to do what
>     people really want?
>
>
> I'm not sure what you mean here. I prototyped ProcessedMediaStream (as
> in MSP) a while back (minus the canvas support) and it worked well.
> Trying to manipulate video by drawImage'ing to a canvas and re-encoding
> the result to a MediaStream is quite likely to mean "they're not
> satisfied with the result" (since it's subject to main-thread latency,
> among other problems). Also, at least for us the simple
> ProcessedMediaStream API I just suggested is not a heavy burden, it's a
> very small feature compared to almost everything else being proposed in
> this group.
>
> Anyway, we don't have to specify ProcessedMediaStream or anything like
> it right now, but I think it's helpful to have some shared idea of where
> APIs might go because that informs the decisions we make now. In
> particular people have been talking about whether and how the source of
> a track might change, and features like ProcessedMediaStream are
> relevant to that.
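
[For reference, a rough sketch of the canvas round-trip Rob describes. A captureStream()-style call on the canvas is assumed here for illustration; no such API existed at the time of this thread, and the element/variable names are made up. The main-thread requestAnimationFrame copy loop is exactly the latency problem Rob points out.]

```javascript
// Pure helper: frame interval (ms) for a target frame rate. Kept
// separate so it can be checked outside a browser.
function frameIntervalMs(fps) {
  if (!(fps > 0)) throw new RangeError("fps must be positive");
  return 1000 / fps;
}

if (typeof document !== "undefined") {
  const video = document.querySelector("video");
  const canvas = document.createElement("canvas");
  const ctx = canvas.getContext("2d");

  function draw() {
    // Every frame is copied on the main thread -- subject to jank.
    ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
    requestAnimationFrame(draw);
  }
  requestAnimationFrame(draw);

  // Hypothetical at the time of writing: turn the canvas back into a
  // MediaStream (an API of this shape was later standardized as
  // canvas.captureStream()).
  const processed = canvas.captureStream(30 /* fps */);
}
```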

I agree, we don't need to specify this right now, but it is good to keep 
in mind. As this would (presumably) be an extension to MediaStream, it 
could be a document of its own.

And it also seems that we have a solution for the most urgent "switch 
source" use case: muzak to a legacy end-point (where we can use the Web 
Audio API).
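
[A sketch of that "muzak to a legacy end-point" path via the Web Audio API: route an <audio> element into a MediaStreamAudioDestinationNode, whose .stream can then be fed to a PeerConnection. Element and variable names are illustrative only; the crossfade helper is my own addition, not anything from the spec.]

```javascript
// Pure helper: equal-power crossfade gains for position t in [0, 1],
// usable when swapping what feeds the gain node below.
function crossfadeGains(t) {
  const clamped = Math.min(1, Math.max(0, t));
  return {
    from: Math.cos((clamped * Math.PI) / 2), // outgoing source fades out
    to: Math.sin((clamped * Math.PI) / 2),   // incoming source fades in
  };
}

if (typeof AudioContext !== "undefined") {
  const ctx = new AudioContext();
  const muzak = document.querySelector("#muzak"); // an <audio> element
  const src = ctx.createMediaElementSource(muzak);
  const gain = ctx.createGain();
  const dest = ctx.createMediaStreamDestination();

  src.connect(gain);
  gain.connect(dest);
  // dest.stream is a MediaStream the legacy endpoint's PeerConnection
  // can consume; reconnecting what feeds `gain` switches the source.
}
```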

My outstanding question is whether we should do something about 
"media-element.captureUntilEnded" now, or whether that can also wait.

Stefan

>
> Rob
> --
> "If you love those who love you, what credit is that to you? Even
> sinners love those who love them. And if you do good to those who are
> good to you, what credit is that to you? Even sinners do that."
Received on Tuesday, 9 April 2013 10:50:19 UTC