- From: Mark Watson <watsonm@netflix.com>
- Date: Thu, 17 Feb 2011 09:41:57 -0800
- To: Jan Lindquist <jan.lindquist@ericsson.com>
- CC: Glenn Adams <glenn@skynav.com>, Philippe Le Hegaret <plh@w3.org>, "Ali C. Begen (abegen)" <abegen@cisco.com>, Silvia Pfeiffer <silviapfeiffer1@gmail.com>, Jean-Claude Dufourd <jean-claude.dufourd@telecom-paristech.fr>, "Richard Maunder (rmaunder)" <rmaunder@cisco.com>, "public-web-and-tv@w3.org" <public-web-and-tv@w3.org>
- Message-ID: <EF90BF63-BCD7-46E0-BFEA-E11CAF7FF5CB@netflix.com>
On Feb 17, 2011, at 4:59 AM, Jan Lindquist wrote:

Hi,

If I may piggyback on this thread: I did a comparison between the video element and what we have done in the OIPF DAE specification. Here are some observations and differences. A copy of the DAE spec can be found at:

http://www.oipf.org/docs/Release2/OIPF-T1-R2-Specification-Volume-5-Declarative-Application-Environment-v2_0-2010-09-07.pdf

The purpose of sharing this information is to compare notes and try to find gaps that can be filled in HTML5. The focus is on HTTP adaptive streaming.

Short comparison:

1. The support of media and text tracks seems to be equivalent to what we have defined as AV Components (refer to section 7.16.5 in the DAE spec). The media tracks follow a different logic than what was done in DAE, so several attributes of the track element are not applicable. In DAE the components are treated as one source, while the video element allows them to be separated into individual "streams". It may not be clear how a media element's readyState supersedes the states of its track elements.

2. The configuration of the terminal's default settings is not available (refer to section 7.3.2.1 in the DAE spec): the properties preferredLanguage and preferredSubtitleLanguage. This is more of a device-management issue than a video element issue.

3. Support of HTTP adaptive streaming (DASH or equivalent) is included in DAE; unfortunately it is not yet published and available to the general public (it will be available in 2 months). In order to share it, a liaison has to be initiated towards OIPF with the request.

- At a high level we have a means of reporting changes in quality for a representation or a new period. The quality is in the form of bandwidth according to the manifest in 3GPP HAS (I have not checked the DASH equivalent).

Stream bitrate on its own is not very useful unless you specify how it is calculated (average, peak, over what time window, etc.). There may also be multiple streams with the same bitrate which differ in some other respect (codec, resolution, fps, etc.).
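The point that "bitrate" is meaningless without a definition can be illustrated with a minimal sketch. Everything below is hypothetical (the sample shape, function names, and numbers are invented for illustration, not from any spec): the same sequence of segment downloads yields very different "bitrates" depending on whether you take the peak or an average over some time window.

```javascript
// Hypothetical sketch: "measured bitrate" depends entirely on its definition.
// Each sample is one segment download: { bits, seconds, endTime }.
function peakBitrate(samples) {
  return Math.max(...samples.map(s => s.bits / s.seconds));
}

// Average over only the samples that finished within the last `windowSeconds`.
function windowedAverageBitrate(samples, now, windowSeconds) {
  const recent = samples.filter(s => now - s.endTime <= windowSeconds);
  const bits = recent.reduce((sum, s) => sum + s.bits, 0);
  const seconds = recent.reduce((sum, s) => sum + s.seconds, 0);
  return seconds > 0 ? bits / seconds : 0;
}

const samples = [
  { bits: 4_000_000, seconds: 2, endTime: 0 },  // 2 Mbit/s burst
  { bits: 1_000_000, seconds: 2, endTime: 10 }, // 0.5 Mbit/s
  { bits: 1_000_000, seconds: 2, endTime: 12 }, // 0.5 Mbit/s
];

// Same downloads, three different "bitrates":
console.log(peakBitrate(samples));                   // 2000000
console.log(windowedAverageBitrate(samples, 12, 5)); // 500000 (recent window only)
console.log(windowedAverageBitrate(samples, 12, 60)); // 1000000 (long window)
```

This is why any reporting API would need to say which of these it means, and over what window.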
In DASH every representation has an @id attribute. It would make sense to report this at stream changes. What to do about Periods (high-level divisions of the content in time) requires some discussion, as they are a DASH-specific concept. Perhaps there is some way to map them to a generic timed-metadata reporting mechanism on the video tag?

- There is a means of retrieving the available representations (from min to max) and the current representation.

I'm not sure why this would be needed?

...Mark

Regards,
JanL

________________________________
From: public-web-and-tv-request@w3.org [mailto:public-web-and-tv-request@w3.org] On Behalf Of Mark Watson
Sent: 16 February 2011 07:11
To: Glenn Adams
Cc: Philippe Le Hegaret; Ali C. Begen (abegen); Silvia Pfeiffer; Jean-Claude Dufourd; Richard Maunder (rmaunder); public-web-and-tv@w3.org
Subject: Re: HTML5 Last Call May 2011 & DASH/Adaptive Streaming

What I think would be useful, though, would be to specify that *if* DASH (say) is supported by an HTML5 video implementation, *then* it should be done in a specific way. Otherwise we have the prospect of multiple different implementations supporting different things. For the examples you give below, it is at least clear what you support if you support a certain version of ECMAScript, say.

For DASH used in HTML5, a minimal set of things that should be nailed down are:

- that a URL for a DASH manifest can be provided in the @src attribute or a <source> element
- what error should be reported if the manifest is invalid, or if the media it points to cannot be found
- how to label DASH tracks so that the different kinds of track map to the Multi-Track API (or whatever is agreed for ISSUE-152), and a single way to do the mapping
- what event to use to report automatic bitrate changes

This is pretty basic stuff, but it would help if everyone who does it, does it the same way.
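To make the last two bullets concrete, here is a rough sketch of the kind of switch notification being discussed: one that carries the DASH Representation @id (and its declared @bandwidth from the manifest) rather than a bare measured bitrate. Every name here is hypothetical; no such event exists in HTML5, and this is only an illustration of the shape such reporting could take.

```javascript
// Hypothetical sketch of a "representationchange" notification that reports
// the DASH Representation @id at automatic switches. None of these names
// come from HTML5 or from any DASH API.
function makeSwitchReporter(onChange) {
  let currentId = null;
  return function reportSwitch(representation) {
    // representation: { id, bandwidth } as declared in the MPD
    if (representation.id === currentId) return; // no actual switch occurred
    currentId = representation.id;
    onChange({
      type: "representationchange",                // hypothetical event name
      representationId: representation.id,         // DASH @id, unambiguous
      declaredBandwidth: representation.bandwidth, // from the manifest, not measured
    });
  };
}

const events = [];
const report = makeSwitchReporter(e => events.push(e));
report({ id: "video-720p", bandwidth: 2_000_000 });
report({ id: "video-720p", bandwidth: 2_000_000 }); // same representation, ignored
report({ id: "video-360p", bandwidth: 600_000 });
console.log(events.map(e => e.representationId)); // [ 'video-720p', 'video-360p' ]
```

Reporting the @id sidesteps the ambiguity noted above: two representations can share a bitrate yet differ in codec, resolution, or frame rate, while the @id identifies exactly one entry in the manifest.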
...Mark

On Feb 15, 2011, at 5:33 PM, Glenn Adams wrote:

For that matter, HTML5 does not require a UA to support either the HTML syntax of HTML5 or the XML syntax of HTML5. A compliant HTML5 UA could support neither (albeit with little utility). HTML5 does *not* specify:

* which top-level document types must be supported
* which version of ECMAScript must be supported
* which version/modules of CSS must be supported
* which version/modules of DOM must be supported
* which URI schemes (protocols) must be supported
* which non-streaming media types must be supported
* which streaming media types must be supported

HTML5 is a technology framework specification that requires other specifications (profiles) to fill in these gaps. In the context of certain TV standardization activities, such work to define one or more profiles is already underway.

N.B. I am not criticizing HTML5 for not making such choices. In fact, I think the editor and the group have taken the best approach in that regard.

G.

On Tue, Feb 15, 2011 at 5:59 PM, Philippe Le Hegaret <plh@w3.org> wrote:

On Tue, 2011-02-15 at 18:40 -0500, Ali C. Begen (abegen) wrote:
> I think folks need to agree on the container format, not the codec type. A good container format will be good for several codecs that exist today and that are yet to come.

My understanding is that the IP issues surrounding the codec types also surround the container formats and the streaming technologies. So I'd be surprised if any agreement were reached within the HTML Working Group on those topics. I can't imagine a conclusion different from the H.264/Theora discussion at this point.

In any case, as Glenn alluded to, HTML has been technology neutral since the beginning. Unless I'm mistaken, we don't require implementations to support a specific image format.

Philippe
Received on Thursday, 17 February 2011 17:45:49 UTC