[MEDIA_PIPELINE_TF] WebRTC Coordination

------- Forwarded message -------
From: "Stefan Hakansson LK" <stefan.lk.hakansson@ericsson.com<mailto:stefan.lk.hakansson@ericsson.com>>
To: public-web-and-tv@w3.org<mailto:public-web-and-tv@w3.org>
Cc: "Harald Alvestrand" <hta@google.com<mailto:hta@google.com>>
Subject: [MEDIA_PIPELINE_TF] Any overlap with webrtc WG?
Date: Tue, 20 Mar 2012 10:52:24 +0100
Hi all in the Web and TV IG, Media Pipeline TF,
I have been given the task by the webrtc WG ([1]) to investigate the
commonalities between the two groups' definitions and use of MediaStreams
and tracks. (I also note that in [2] there is a reference to the webrtc WG.)
Having looked into the documentation at [2] and compared it to the webrtc
definitions ([3]), I find that there is not much overlap.
Where the Media Pipeline TF seems to focus on enabling JS to handle rate
control and on content protection, the concept in the webrtc WG is rather
to let the UA handle all of this, with the application being able to
control the routing of the media (by use of MediaStream objects).
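To illustrate that routing model, here is a minimal TypeScript sketch (my
addition, not from the draft itself), assuming the current promise-based
getUserMedia and RTCPeerConnection.addTrack rather than the callback form
and addStream() in the 2012 editor's draft:

    // Minimal sketch: the application only routes media via MediaStream
    // objects; encoding, rate adaptation and transport are left to the UA.
    const pc = new RTCPeerConnection();
    const preview = document.querySelector('video') as HTMLVideoElement;

    navigator.mediaDevices.getUserMedia({ audio: true, video: true })
      .then((stream: MediaStream) => {
        // Route local media to a preview element; the UA renders it.
        preview.srcObject = stream;
        // Route the same tracks into the peer connection; the UA handles
        // rate control and protection of the media on the wire.
        stream.getTracks().forEach(track => pc.addTrack(track, stream));
      });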
The main area of overlap may be the media elements, particularly the
ability to handle several tracks (and tracks coming and going during
playout); the bugs filed against the HTML document may very well be
relevant for webrtc as well.
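As a sketch of what "tracks coming and going during playout" means on the
API surface (again my addition; the helper name watchTracks is illustrative,
and the 'addtrack'/'removetrack' events are those of the current MediaStream
interface):

    // Minimal sketch: react to tracks being added to or removed from a
    // MediaStream while a media element is playing it out.
    function watchTracks(stream: MediaStream, video: HTMLVideoElement): void {
      video.srcObject = stream;  // the UA plays whatever tracks the stream holds
      stream.addEventListener('addtrack', (e: MediaStreamTrackEvent) => {
        console.log('track added during playout:', e.track.kind, e.track.id);
      });
      stream.addEventListener('removetrack', (e: MediaStreamTrackEvent) => {
        console.log('track removed during playout:', e.track.kind, e.track.id);
      });
    }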
Does this analysis make sense to you?
Br,
Stefan Håkansson (co-chair, webrtc WG)
[1] http://www.w3.org/2011/04/webrtc/track/actions/18
[2] http://www.w3.org/2011/webtv/wiki/MPTF
[3] http://dev.w3.org/2011/webrtc/editor/webrtc.html#stream-api
--
