Re: Proposal: Media streams, media stream tracks and channels

>
>    Yeah, I understand that. I just wanted to point out that the semantics
of the identifiers you've chosen intersect with what we deal with in other
standards (the MPEG container format, for instance) and with real-life
analogues. So the hierarchy of Stream, Track and Channel is unclear without
reading the doc (which is sometimes a good thing). Personally I would prefer
a Stream->Substream->SubstreamChannel hierarchy or similar. For instance,
the MPEG-4 container has a notion of tracks (the trak box). In the audio
world, tracks have a default meaning as well.

>
>
> On Mon, Nov 7, 2011 at 5:17 PM, Harald Alvestrand <harald@alvestrand.no>wrote:
>
>>  On 11/08/2011 01:37 AM, Alex wrote:
>>
>> The relationship between Stream, Track and Channel seems a bit confusing
>> to me. Why not call MediaStreamTrack a MediaSubstream? :)
>>
>>
>> Just because it's been called Track or MediaStreamTrack for a while.
>> Tradition's set in.
>>
>> "Track" and "Channel" have apparently been used for similar entities in
>> other contexts too; if we can define them with reasonable precision here
>> while keeping them similar to those other contexts, that might be a win.
>>
>>
>>
>> On Mon, Nov 7, 2011 at 4:15 PM, Harald Alvestrand <harald@alvestrand.no>wrote:
>>
>>>  Discharging a task taken on at the TPAC meeting, some possible words on
>>> what a media stream, a media stream track or a channel is....
>>>
>>> This is based on the introduction section in section 3.1 of the current
>>> API editors' draft.
>>>
>>>  The MediaStream<http://dev.w3.org/2011/webrtc/editor/webrtc.html#mediastream>
>>>  interface is used to represent streams of media data, typically (but
>>> not necessarily) of audio and/or video content, e.g. from a local camera or
>>> a remote site. The data from a MediaStream<http://dev.w3.org/2011/webrtc/editor/webrtc.html#mediastream>
>>>  object does not necessarily have a canonical binary form; for example,
>>> it could just be "the video currently coming from the user's video camera".
>>> This allows user agents to manipulate media streams in whatever fashion is
>>> most suitable on the user's platform.
>>>
>>> Each MediaStream<http://dev.w3.org/2011/webrtc/editor/webrtc.html#mediastream>
>>>  object can represent zero or more tracks, in particular audio and
>>> video tracks. Tracks can contain multiple channels of parallel data; for
>>> example a single audio track could have nine channels of audio data to
>>> represent a 7.2 surround sound audio track.
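[Editor's illustration: the Stream -> Track -> Channel containment described
above can be sketched with plain objects. This is a hypothetical model for
clarity only, not the draft API surface; the `kind` and per-channel `index`
fields are assumptions.]

```javascript
// Hypothetical plain-object model of the Stream -> Track -> Channel
// hierarchy; not the actual MediaStream draft API.
function makeTrack(kind, channelCount) {
  return {
    kind,
    channels: Array.from({ length: channelCount }, (_, i) => ({ index: i })),
  };
}

// A stream with one video track and one 7.2 surround audio track
// (7 full-range channels + 2 LFE channels = 9 channels).
const stream = {
  tracks: [makeTrack('video', 1), makeTrack('audio', 9)],
};

// Channels are the smallest unit, so counting them means walking tracks.
const audioChannels = stream.tracks
  .filter((t) => t.kind === 'audio')
  .reduce((n, t) => n + t.channels.length, 0);
console.log(audioChannels); // 9
```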
>>>
>>> <new text below>
>>>
>>> All tracks in a MediaStream are presumed to be synchronized at some
>>> level. Different MediaStreams may or may not be synchronized.
>>>
>>> Each track represented by a MediaStream<http://dev.w3.org/2011/webrtc/editor/webrtc.html#mediastream>
>>>  object has a corresponding MediaStreamTrack<http://dev.w3.org/2011/webrtc/editor/webrtc.html#mediastreamtrack>
>>>  object.
>>>
>>> A MediaStreamTrack represents content comprising one or more channels,
>>> where the channels have a defined, well-known relationship to each other
>>> (such as a stereo or 5.1 audio signal) and may be encoded together for
>>> transmission as, for instance, an RTP payload type.
>>>
>>> A channel is the smallest unit considered in this API specification.
>>>
>>> <end new text>
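[Editor's illustration: a concrete reading of the channel definition in the
new text above is that each well-known audio layout fixes a channel count.
The mapping below is a sketch; the layout names are illustrative and not
part of the draft.]

```javascript
// Hypothetical mapping from well-known audio layouts to channel counts;
// the layout names are illustrative, not draft terminology.
const layoutChannels = {
  mono: 1,
  stereo: 2, // left + right
  '5.1': 6,  // 5 full-range + 1 LFE
  '7.2': 9,  // 7 full-range + 2 LFE
};

// A channel is the smallest unit in the API, so a track carrying a
// stereo signal comprises exactly this many channels:
console.log(layoutChannels['stereo']); // 2
```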
>>>
>>> Would including this text help add any clarity after our discussions at
>>> TPAC?
>>>
>>> Query: What other examples of embedded channels would be useful to add?
>>> Are there good examples of audio or non-audio data embedded in video tracks?
>>>
>>>                 Harald
>>>
>>>
>>>
>>
>>
>>  --
>> ------------------------------
>> Regards, Alex
>> www.pictures2.com
>>
>>
>>
>
>



-- 
------------------------------
Regards, Alex
www.pictures2.com

Received on Sunday, 13 November 2011 20:35:53 UTC