Re: Wiki update

On Mon, Nov 3, 2014 at 10:33 AM, Silvia Pfeiffer
<silviapfeiffer1@gmail.com> wrote:
> On Mon, Nov 3, 2014 at 9:12 AM, Cyril Concolato
> <cyril.concolato@telecom-paristech.fr> wrote:
>> On 02/11/2014 21:30, Silvia Pfeiffer wrote:
>>>
>>>
>>>
>>> On 3 Nov 2014 03:42, "Cyril Concolato"
>>> <cyril.concolato@telecom-paristech.fr> wrote:
>>> >
>>> > Hi Bob, Silvia,
>>> >
>>> > On 02/11/2014 16:57, Bob Lund wrote:
>>> >>
>>> >>
>>> >> On 11/2/14, 3:45 AM, "Silvia Pfeiffer" <silviapfeiffer1@gmail.com> wrote:
>>> >>
>>> >>> On Wed, Oct 29, 2014 at 8:48 AM, Cyril Concolato
>>> >>> <cyril.concolato@telecom-paristech.fr> wrote:
>>> >>>>
>>> >>>> Yes, I do think HLS is another format that we should target. It has
>>> >>>> some commonalities with existing formats that we target: DASH for the
>>> >>>> use of manifests and adaptive streaming aspects; and MPEG-2 TS as a
>>> >>>> segment format. The basics of exposing tracks from TS should be
>>> >>>> reusable. It might need some tweaking compared to what we started
>>> >>>> specifying, for instance to expose ID3 Tag PES streams. In an informal
>>> >>>> discussion with Eric (in copy), I discovered that WebKit already
>>> >>>> exposes them through an API, see:
>>> >>>>
>>> >>>> http://trac.webkit.org/browser/trunk/LayoutTests/http/tests/media/track-in-band-hls-metadata.html
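
For illustration, a minimal sketch of how a page could read such an in-band
metadata track, assuming the behaviour exercised by that layout test (ID3
frames surfaced as cues on a TextTrack of kind "metadata"; the "type"/"value"
cue fields are a WebKit DataCue extension rather than standard HTML):

    // Listen for in-band metadata tracks that the UA exposes on an HLS stream.
    const video = document.querySelector('video') as HTMLVideoElement;

    video.textTracks.addEventListener('addtrack', (e: TrackEvent) => {
      const track = e.track as TextTrack;
      if (track.kind !== 'metadata') return;  // only the timed-metadata (ID3) track
      track.mode = 'hidden';                  // receive cues without rendering them

      track.addEventListener('cuechange', () => {
        for (const cue of Array.from(track.activeCues ?? [])) {
          // In WebKit, ID3 frames arrive as DataCue-like cues; "type" and
          // "value" are assumptions based on the layout test, not standard.
          console.log((cue as any).type, (cue as any).value);
        }
      });
    });
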
>>> >>>
>>> >>> For the record: I do have some reservations about adding DASH and HLS
>>> >>> support, since browsers do not typically support these formats
>>> >>> natively.
>>> >
>>> > [CC] I've heard that WebKit/GTK has some native support for DASH, but I
>>> > couldn't verify it. I would expect that in the future some browsers will
>>> > have native support for DASH or HLS. But I agree with you that support
>>> > for DASH/HLS is different from direct support for TS/MP4/OGG...
>>> >
>>> >>> Actually, HLS is supported in Safari, so it has some excuse,
>>> >>> but DASH is only supported via Media Source Extensions. I have been
>>> >>> worried about that a bit.
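
(As an aside, a quick way to check what a given browser supports natively
versus through MSE; the MIME types below are the conventional ones, and this
is a sketch rather than a normative test:)

    // Native playback support is advertised through canPlayType():
    // Safari answers "maybe" for HLS, most other browsers answer "".
    const video = document.createElement('video');
    const nativeHls = video.canPlayType('application/vnd.apple.mpegurl');
    const nativeDash = video.canPlayType('application/dash+xml');

    // The DASH-via-MSE path instead depends on segment-level support:
    const mseMp4 = 'MediaSource' in window &&
        MediaSource.isTypeSupported('video/mp4; codecs="avc1.42E01E, mp4a.40.2"');

    console.log({ nativeHls, nativeDash, mseMp4 });
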
>>> >>
>>> >> There has been text added to the spec for DASH using MSE. MSE behavior
>>> >> is that the UA sources tracks based on information in Initialization
>>> >> Segments. The application may specify default track attributes for
>>> >> those tracks, which the UA will use if those same attributes are not
>>> >> sourced from Initialization Segment data. It seems useful to me for the
>>> >> sourcing spec to describe this.
>>> >>
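
For context, a rough sketch of the MSE behaviour described above: the UA
creates the track objects, and sources their attributes, when an
Initialization Segment is appended. The MIME type, codec string and
"init.mp4" URL are placeholders, and videoTracks is not exposed by every
browser:

    const video = document.querySelector('video') as HTMLVideoElement;
    const mediaSource = new MediaSource();
    video.src = URL.createObjectURL(mediaSource);

    mediaSource.addEventListener('sourceopen', async () => {
      const sb = mediaSource.addSourceBuffer('video/mp4; codecs="avc1.42E01E, mp4a.40.2"');

      // videoTracks/audioTracks are only exposed by some browsers (and are
      // missing from some TS DOM typings), hence the defensive cast.
      const tracks = (video as any).videoTracks;
      if (tracks) {
        tracks.addEventListener('addtrack', (e: any) => {
          // id/kind/label/language are sourced from the Initialization
          // Segment where present.
          console.log(e.track.id, e.track.kind, e.track.label, e.track.language);
        });
      }

      // Appending the Initialization Segment is what triggers track creation.
      const init = await (await fetch('init.mp4')).arrayBuffer();
      sb.appendBuffer(init);
    });
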
>>> >> On a related note, I plan to submit an MSE bug so that it references
>>> >> the sourcing spec for sourcing tracks as described above.
>>> >
>>> > What about adding a diagram like this one to the introduction:
>>> > http://concolato.wp.mines-telecom.fr/files/2014/11/inband-sourcing.png
>>> > and maybe indicating that some implementations may use one path or the
>>> > other.
>>> >
>>>
>>> Adding a diagram like this is useful, but I don't understand the one you
>>> made. In particular, all the media format parsing is happening in the UAs
>>> and not in an app.
>>>
>> In the diagram, I meant that Browser + MSE + HTML Media + Other APIs
>> together form the UA. Maybe that's not obvious. Can you suggest any change
>> here?
>> Then I tried to convey two options in this diagram:
>> - when processing DASH (or HLS), the manifest is fetched and parsed by the
>> Web App, and the media data is then fetched using XHR (or other APIs) and
>> passed to the MSE part of the Browser. Where the path goes through the MSE
>> API in the diagram, this is meant to say that some parsing (e.g. of the
>> MPD, or of the ISOBMFF sidx ...) is done by the Web App and the rest is
>> done in the browser (in the MSE part); a minimal sketch of this path
>> follows below.
>> - it is also perfectly possible to process the media entirely in the Web
>> App; that is exactly what I do in mp4box.js. Other projects such as [2] do
>> it for TS.
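
To make the first path concrete, a rough sketch of the split described above,
with placeholder URLs and MIME type; parseMpd() stands in for whatever
manifest parsing the Web App does itself (it is not an existing library
function), while demuxing and track exposure stay inside the browser's MSE
implementation:

    // Hypothetical app-side manifest parser (placeholder).
    declare function parseMpd(xml: string): {
      mimeType: string; initUrl: string; segmentUrls: string[];
    };

    async function playDashLikeStream(video: HTMLVideoElement, manifestUrl: string) {
      const mpd = parseMpd(await (await fetch(manifestUrl)).text()); // app-side parsing
      const mediaSource = new MediaSource();
      video.src = URL.createObjectURL(mediaSource);
      await new Promise(r => mediaSource.addEventListener('sourceopen', r, { once: true }));

      const sb = mediaSource.addSourceBuffer(mpd.mimeType);
      const append = (buf: ArrayBuffer) => new Promise(r => {
        sb.addEventListener('updateend', r, { once: true });
        sb.appendBuffer(buf);
      });

      // The browser's MSE implementation demuxes the appended segments and
      // exposes the resulting tracks through the HTML media APIs.
      await append(await (await fetch(mpd.initUrl)).arrayBuffer());
      for (const url of mpd.segmentUrls) {
        await append(await (await fetch(url)).arrayBuffer());
      }
      mediaSource.endOfStream();
    }
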
>
>
> OK. It might be better to draw a separate (parallel) path for each of the
> different routes that a media resource can take, from being picked up by
> the browser all the way to rendering, and then identify which APIs are used
> along each one.
>
> I'm not very good at drawing - do you want to try this?

In the end I gave it a try myself; see attached.

The red arrows are what we are defining. I've tried to explain the
different paths through the UA APIs.

Hope that makes sense?

Cheers,
Silvia.

Received on Monday, 3 November 2014 00:09:03 UTC