- From: Eric Winkelman <E.Winkelman@cablelabs.com>
- Date: Mon, 21 Mar 2011 16:35:33 -0600
On Friday, March 18, 2011 9:23 PM, Silvia Pfeiffer [mailto:silviapfeiffer1 at gmail.com] wrote:

> > Use Case:
> >
> > Many video streams contain in-band metadata for application signaling and
> > other uses. By using this metadata, a web page can synchronize an
> > application with the delivered video, or provide other synchronized services.
> >
> > An example of this type of metadata is EISS
> > (http://www.cablelabs.com/specifications/OC-SP-ETV-AM1.0-I06-110128.pdf),
> > which is used to control applications that are synchronized with a
> > television broadcast.
> >
> > In general, a media stream can be expected to carry several types of
> > metadata, and the types of metadata may vary in time.
> >
> > Problem:
> >
> > For in-band metadata tracks, there is neither a standard way to represent
> > the type of metadata in the HTMLTrackElement interface nor a standard way
> > to represent multiple different types of metadata tracks.
> >
> > Proposal:
> >
> > For TimedTextTracks with kind=metadata, the @label attribute should
> > contain a MIME type for the metadata, and a track should only contain Cues
> > created from metadata of that MIME type.
> >
> > This implies that streams with multiple types of metadata require the
> > creation of multiple metadata track objects, one for each MIME type.
>
> I don't understand. Are you saying that right now all tracks that are of
> kind=metadata are made available through a single TextTrack? Cause I don't
> think that's the case.

No, I'm not saying that, but as far as I can tell from the spec, it is
undefined how the user agent should map in-band data to metadata tracks. I am
proposing that the algorithm should put different types of data into different
Timed Text Tracks, and that each track's @label should reflect the type.

> Or are you worried about text track files that contain more than one type of
> metadata?
>
> If the latter, then how is the browser to know how to separate
> out the individual cues from a single track into multiple?

I assume you mean out-of-band tracks here, and they don't concern me. The
easiest way to deal with multiple types of out-of-band metadata is through
multiple tracks, one for each type of metadata. You would simply associate the
appropriate handler with each track.

What I'm proposing is that an in-band metadata Timed Text Track follow this
same logic and only contain metadata of the type identified by the label. The
@kind attribute addresses this for the three types identified: captions,
subtitles, and descriptive text. My proposal is that @label provide this
distinction for @kind=metadata.

> Can you clarify?

Perhaps an example will make this clearer: an MPEG transport stream could
contain three metadata streams, each using a separate packet ID (PID). The
three metadata streams are: EISS (which synchronizes applications with the
video), SCTE-35 (which controls ad insertion), and content advisories (which
support parental controls).

What I'm proposing is that three separate text tracks be created for the three
different types of metadata, and that the @label attribute contain the type
information for that track's cues. This way an application running on the User
Agent would be able to determine that a particular track contained the type of
data it was designed to work with. (An application that enforced parental
controls would look at new tracks to see if they contained content advisories
and ignore the ones that didn't.)
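To make the application side concrete, here is a rough, untested sketch of how
a parental-control script might locate its track under this proposal. The
applyParentalControls function and the "application/x-contentadvisory" label
are just placeholders for this example, and the exact TextTrack API details
(track modes, how the cue payload is exposed) may differ:

    // Sketch only: find the metadata track whose @label matches the MIME
    // type this script understands, and ignore all other metadata tracks.
    var video = document.querySelector('video');

    function handleAdvisoryCue(event) {
      var track = event.target;
      // How the advisory payload is carried on each cue is up to the
      // relevant specification; applyParentalControls is hypothetical.
      for (var i = 0; i < track.activeCues.length; i++) {
        applyParentalControls(track.activeCues[i]);
      }
    }

    function scanTracks() {
      for (var i = 0; i < video.textTracks.length; i++) {
        var track = video.textTracks[i];
        if (track.kind === 'metadata' &&
            track.label === 'application/x-contentadvisory') {
          track.mode = 'hidden';  // receive cue events without rendering
          track.addEventListener('cuechange', handleAdvisoryCue);
        }
        // Other metadata tracks (e.g. labeled application/x-scte35) are
        // simply left alone by this script.
      }
    }

    video.textTracks.addEventListener('addtrack', scanTracks);
    scanTracks();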
Recent updates to the spec, section 4.8.10.12.2
(http://www.whatwg.org/specs/web-apps/current-work/multipage/video.html#sourcing-in-band-text-tracks),
appear to address my concern in step 2:

"2. Set the new text track's kind, label, and language based on the semantics
of the relevant data, as defined by the relevant specification."

Provided that the relevant specification defines the metadata type encoding to
be put in the label, e.g. application/x-eiss, application/x-scte35,
application/x-contentadvisory, etc.

Does this make sense?

Eric