- From: Stefan Håkansson LK <stefan.lk.hakansson@ericsson.com>
- Date: Tue, 8 Nov 2011 11:57:46 +0100
- To: public-webrtc@w3.org
On 11/08/2011 01:15 AM, Harald Alvestrand wrote:
> Discharging a task taken on at the TPAC meeting, some possible words on
> what a media stream, a media stream track or a channel is....
>
> This is based on the introduction section in section 3.1 of the current
> API editors' draft.
>
> The MediaStream <http://dev.w3.org/2011/webrtc/editor/webrtc.html#mediastream>
> interface is used to represent streams of media data, typically (but not
> necessarily) of audio and/or video content, e.g. from a local camera or
> a remote site. The data from a MediaStream
> <http://dev.w3.org/2011/webrtc/editor/webrtc.html#mediastream> object
> does not necessarily have a canonical binary form; for example, it could
> just be "the video currently coming from the user's video camera". This
> allows user agents to manipulate media streams in whatever fashion is
> most suitable on the user's platform.
>
> Each MediaStream <http://dev.w3.org/2011/webrtc/editor/webrtc.html#mediastream>
> object can represent zero or more tracks, in particular audio and video tracks.
> Tracks can contain multiple channels of parallel data; for example a
> single audio track could have nine channels of audio data to represent a
> 7.2 surround sound audio track.
>
> <new text below>
>
> All tracks in a MediaStream are presumed to be synchronized at some
> level. Different MediaStreams may or may not be synchronized.
>
> Each track represented by a MediaStream
> <http://dev.w3.org/2011/webrtc/editor/webrtc.html#mediastream> object
> has a corresponding MediaStreamTrack
> <http://dev.w3.org/2011/webrtc/editor/webrtc.html#mediastreamtrack> object.

Nit: why do you say "represented by a MediaStream" in this section but "in a MediaStream" in the previous one? "In a" sounds better to me.
> A MediaStreamTrack represents content comprising one or more channels,
> where the channels have a defined, well-known relationship to each other
> (such as a stereo or 5.1 audio signal), and may be encoded together for
> transmission as, for instance, an RTP payload type.
>
> A channel is the smallest unit considered in this API specification.
>
> <end new text>
>
> Would including this text help add any clarity after our discussions at
> TPAC?

I think it helps. I am not convinced that the channels must be exposed in any of the WebRTC APIs. The Audio API that Chris presented enables access to the channels (in audio tracks) if there is a need to.

> Query: What other examples of embedded channels would be useful to add?
> Are there good examples of audio or non-audio data embedded in video tracks?
>
> Harald
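For reference, the containment hierarchy under discussion (stream → tracks → channels) can be sketched with plain objects. This is only an illustrative model of the proposed text, not the actual MediaStream browser API, and the field names below are made up for the example:

```javascript
// Illustrative model of the proposed hierarchy: a MediaStream
// holds zero or more tracks; each track holds one or more channels
// with a well-known relationship (e.g. stereo left/right).
// Plain objects, not the real MediaStream API.

// A hypothetical stereo audio track: two related channels.
const audioTrack = { kind: "audio", channels: ["left", "right"] };

// A video track, treated as a single channel for simplicity.
const videoTrack = { kind: "video", channels: ["default"] };

// A MediaStream groups tracks that are presumed synchronized.
const stream = { tracks: [audioTrack, videoTrack] };

// The channel is the smallest unit: count them across the stream.
const channelCount = stream.tracks.reduce(
  (n, t) => n + t.channels.length, 0);

console.log(channelCount); // → 3
```

A nine-channel 7.2 surround track would, in this model, simply be one audio track whose `channels` array has nine entries.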
Received on Tuesday, 8 November 2011 10:58:25 UTC