- From: Mark Watson <watsonm@netflix.com>
- Date: Mon, 20 Jun 2011 02:31:57 -0700
On Jun 20, 2011, at 10:42 AM, Silvia Pfeiffer wrote:

On Mon, Jun 20, 2011 at 6:29 PM, Mark Watson <watsonm at netflix.com> wrote:

On Jun 9, 2011, at 4:32 PM, Eric Carlson wrote:

On Jun 9, 2011, at 12:02 AM, Silvia Pfeiffer wrote:

On Thu, Jun 9, 2011 at 4:34 PM, Simon Pieters <simonp at opera.com> wrote:

On Thu, 09 Jun 2011 03:47:49 +0200, Silvia Pfeiffer <silviapfeiffer1 at gmail.com> wrote:

For commercial video providers, the tracks in a live stream change all the time; this is not limited to audio and video tracks but would include text tracks as well. OK, all this indicates to me that we probably want a "metadatachanged" event to indicate there has been a change and that JS may need to check some of its assumptions.

We already have durationchange. Duration is metadata. If we want to support changes to width/height, and the script is interested in when that happens, maybe there should be a dimensionchange event (but what's the use case for changing width/height mid-stream?). Does the spec support changes to text tracks mid-stream?

It's not about what the spec supports, but what real-world streams provide. I don't think it makes sense to put an event on every single type of metadata that can change. Most of the time, when you have a stream change, many variables change together, so a single event means far fewer events to raise. It's an event that signifies that the media framework has reset the video/audio decoding pipeline and loaded a whole bunch of new stuff. You should imagine it as a concatenation of different media resources. And yes, they can have different track constitution, different audio sampling rate (which the audio API will care about), etc.

In addition, it is possible for a stream to lose or gain an audio track. In this case the dimensions won't change, but a script may want to react to the change in audioTracks. The TrackList object has an onchanged event, which I assumed would fire when any of the information in the TrackList changes (e.g. tracks added or removed). But actually the spec doesn't state when this event fires (as far as I could tell - unless it is implied by some general definition of events called onchanged). Should there be some clarification here?

I understood that to relate to a change of cues only, since it is on the tracklist, i.e. it's an aggregate event from the oncuechange event of a cue inside the track. I didn't think it would relate to a change of existence of that track. Note that the event is attached to the TrackList, not the TrackList[], so it cannot be raised when a track is added or removed, only when something inside the TrackList changes.

Are we talking about the same thing? There is no TrackList array, and TrackList is only used for audio/video, not text, so I don't understand the comment about cues. I'm talking about http://www.whatwg.org/specs/web-apps/current-work/multipage/the-iframe-element.html#tracklist which is the base class for MultipleTrackList and ExclusiveTrackList, used to represent all the audio and video tracks (respectively). One instance of the object represents all the tracks, so I would assume that a change in the number of tracks is a change to this object.

I agree with Silvia, a more generic "metadata changed" event makes more sense.

Yes, and it should support the case in which text tracks are added/removed too.

Yes, it needs to be an event on the MediaElement.
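To make the "check some of its assumptions" idea concrete, here is a rough TypeScript sketch of what a page might do if a single coarse-grained event existed on the media element. The "metadatachanged" event name is the one proposed above and is not something implementations currently fire; the snapshot helper and its field names are purely illustrative, and audioTracks is the MultipleTrackList under discussion, which a given implementation may not expose.

    // Sketch only: "metadatachanged" is the proposed (not implemented) event;
    // the snapshot helper is illustrative.
    const video = document.querySelector("video")!;

    interface MediaSnapshot {
      duration: number;
      videoWidth: number;
      videoHeight: number;
      audioTrackCount: number;
      textTrackCount: number;
    }

    function snapshot(v: HTMLVideoElement): MediaSnapshot {
      // audioTracks (the MultipleTrackList discussed above) may not exist,
      // hence the guard via an untyped lookup.
      const audioTracks = (v as any).audioTracks;
      return {
        duration: v.duration,
        videoWidth: v.videoWidth,
        videoHeight: v.videoHeight,
        audioTrackCount: audioTracks ? audioTracks.length : 0,
        textTrackCount: v.textTracks ? v.textTracks.length : 0,
      };
    }

    let last = snapshot(video);

    // One coarse-grained handler re-checks every assumption the page cares
    // about, instead of wiring up one event per property.
    video.addEventListener("metadatachanged", () => {
      const now = snapshot(video);
      if (now.audioTrackCount !== last.audioTrackCount) {
        // e.g. rebuild the audio-track selection menu
      }
      if (now.videoWidth !== last.videoWidth || now.videoHeight !== last.videoHeight) {
        // e.g. reposition overlays sized relative to the video
      }
      last = now;
    });

The point is simply that a single event lets the page diff one snapshot against the next, rather than registering a separate listener for every piece of metadata that might change.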
Also, as Eric (C) pointed out, one of the things which can change is which of several available versions of the content is being rendered (for adaptive bitrate cases). This doesn't necessarily change any of the metadata currently exposed on the video element, but nevertheless it's information that the application may need. It would be nice to expose some kind of identifier for the currently rendered stream and have an event when this changes. I think that a stream-format-supplied identifier would be sufficient.

I don't know about the adaptive streaming situation. I think that is more about statistics/metrics rather than about a change of resource. All the alternatives in an adaptive streaming "resource" should provide the same number of tracks and the same video dimensions, just at different bitrate/quality, no? I think of the different adaptive versions on a per-track basis (i.e. the alternatives are *within* each track), not a bunch of alternatives each of which contains several tracks. Both are possible, of course.

It's certainly possible (indeed common) for different bitrate video encodings to have different resolutions - there are video encoding reasons to do this. Of course the aspect ratio should not change, and nor should the dimensions on the screen (both would be a little peculiar for the user). Now, the videoWidth and videoHeight attributes of HTMLVideoElement are not the same as the resolution (for a start, they are in CSS pixels, which are square), but I think it quite likely that if the resolution of the video changes then videoWidth and videoHeight might change. I'd be interested to hear how existing implementations relate resolution to videoWidth and videoHeight.

Different video dimensions should be provided through the <source> element and @media attribute, but within an adaptive stream the alternatives should be consistent because the target device won't change. I guess this is a discussion for another thread... :-)

Possibly ;-) The device knows much better than the page author what capabilities it has, and so what resolutions are suitable for it. So it is better to provide all the alternatives as a single resource and have the device work out which subset it can support. Or at least, the list should be provided all at the same level - there is no rationale for a hierarchy of alternatives.

Cheers,
Silvia.
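For reference, here is a rough sketch of what a page has to do today to notice that the rendered representation's videoWidth/videoHeight have changed, in the absence of any event for representation switches. The function name, callback, and polling interval are illustrative; nothing here is in the spec.

    // Sketch only: with no event for representation switches, a page can only
    // poll videoWidth / videoHeight. Names and interval are illustrative.
    function watchRenderedResolution(
      v: HTMLVideoElement,
      onChange: (width: number, height: number) => void,
      intervalMs = 1000
    ): number {
      let lastW = v.videoWidth;
      let lastH = v.videoHeight;
      return window.setInterval(() => {
        if (v.videoWidth !== lastW || v.videoHeight !== lastH) {
          lastW = v.videoWidth;
          lastH = v.videoHeight;
          onChange(lastW, lastH);
        }
      }, intervalMs);
    }

    // Usage: log whenever the adaptive streaming stack switches to an
    // encoding with a different intrinsic size.
    const vid = document.querySelector("video")!;
    watchRenderedResolution(vid, (w, h) => {
      console.log("rendered size is now " + w + "x" + h);
    });

A stream-format-supplied identifier plus a change event, as suggested above, would make this polling unnecessary and would also cover switches between encodings that happen to share the same resolution.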
Received on Monday, 20 June 2011 02:31:57 UTC