- From: Ian Hickson <ian@hixie.ch>
- Date: Fri, 15 Jul 2011 06:55:37 +0000 (UTC)
- To: "Timothy B. Terriberry" <tterriberry@mozilla.com>
- cc: "public-webrtc@w3.org" <public-webrtc@w3.org>
On Thu, 14 Jul 2011, Timothy B. Terriberry wrote:
> Ian Hickson wrote:
> > environment to be transmitted. This differs from knowing what kind of
> > audio is expected in that the page rarely knows the latter. You use
> > the same video conferencing app for music as for chatting. You don't
> > tend to
>
> On the contrary, I imagine you'd want to use a very different app for
> something like a garageband web page or for things like a live DJ app.
> Even if a regular conferencing app _might_ work for distributed music
> performance, having things like a metronome available if the network
> latency is too high (i.e., over 25 ms) or other features specific to
> such performances would be highly desirable, and I expect there to be
> sites that provide them, as well as plenty of other things neither you
> nor I have thought of. For things where the ultimate audio source is not
> a microphone, the UA _might_ be able to detect that and work out what
> the right thing to do is, but only if the MediaStream graph is fully
> configured when codec negotiation first happens.

Are those use cases you want to address in the first attempt at this
API? Or are they things we can address in subsequent iterations?

--
Ian Hickson               U+1047E                )\._.,--....,'``.    fL
http://ln.hixie.ch/       U+263A                /,   _.. \   _\  ;`._ ,.
Things that are impossible just take longer.   `._.-(,_..'--(,_..'`-.;.'
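[Editorial note: the "what kind of audio is expected" question debated above was later addressed by the `contentHint` attribute on `MediaStreamTrack`. The sketch below shows how a page-supplied hint could drive codec configuration; the hint values `"speech"`/`"music"` match the later-standardized attribute, but `pickOpusConfig`, the `OpusConfig` shape, and the specific parameter values are hypothetical assumptions for illustration only.]

```typescript
// Hypothetical mapping from a page-supplied audio content hint to
// encoder parameters. Hint strings mirror MediaStreamTrack.contentHint;
// everything else here is an illustrative assumption, not a spec.
type AudioContentHint = "" | "speech" | "music";

interface OpusConfig {
  application: "voip" | "audio"; // Opus encoder application mode
  stereo: boolean;
  maxAverageBitrate: number;     // bits per second (assumed values)
}

function pickOpusConfig(hint: AudioContentHint): OpusConfig {
  switch (hint) {
    case "speech":
      // Conversation: favor intelligibility and low latency.
      return { application: "voip", stereo: false, maxAverageBitrate: 32000 };
    case "music":
      // Music performance: favor fidelity with stereo and more bits.
      return { application: "audio", stereo: true, maxAverageBitrate: 128000 };
    default:
      // No hint: a general-purpose middle ground.
      return { application: "audio", stereo: false, maxAverageBitrate: 64000 };
  }
}
```

This is exactly the kind of signal Terriberry's "garageband web page" would set explicitly, rather than relying on the UA to guess from the MediaStream graph.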
Received on Friday, 15 July 2011 06:56:03 UTC