Re: Buffered bytes for media elements

Some background behind the request:
It is common for web sites displaying video (such as YouTube) to control the
buffering and playback of video in order to optimize the user experience.
One example is deciding when sufficient data has been buffered to begin
playback - while the spec currently includes the "canPlayThrough" event, it
isn't really flexible enough for all use cases (a rough sketch of this kind
of logic follows below).  Users often do not watch entire videos, so
delaying playback until complete playthrough is possible waits longer than
necessary.  It's often appropriate to play *some* video as soon as possible,
and to modify buffering behavior during playback as the user's intentions
become clearer.
Another example is that web sites may have multiple versions of a video, and
want to make their own determination of when to switch from one to another,
rather than leaving that up to the user agent.
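
To make the first example concrete, here is a rough sketch in TypeScript of
the kind of startup heuristic a site might implement, assuming the attributes
behave as currently drafted (a ByteRanges-style object with start()/end()
accessors, and totalBytes returning 0 when unknown).  The interface
declarations, the 5% threshold, and the polling interval are illustrative
choices of mine, not anything the spec requires:

  // Hypothetical shape of the drafted attributes; they are not part of the
  // standard HTMLMediaElement interface, so the sketch declares them itself.
  interface ByteRangesLike {
    readonly length: number;
    start(index: number): number; // byte offset where a buffered range begins
    end(index: number): number;   // byte offset where that range ends
  }

  interface MediaElementWithBytes extends HTMLVideoElement {
    readonly bufferedBytes: ByteRangesLike;
    readonly totalBytes: number; // 0 when unknown or infinite, per the draft
  }

  // Begin playback once roughly 5% of the file (an arbitrary, site-chosen
  // threshold) has been buffered from the start of the resource, rather
  // than waiting for the user agent's own canPlayThrough heuristic.
  function maybeStartPlayback(video: MediaElementWithBytes): void {
    if (video.totalBytes === 0 || video.bufferedBytes.length === 0) {
      return; // total length unknown, or nothing buffered yet
    }
    const buffered = video.bufferedBytes.end(0) - video.bufferedBytes.start(0);
    if (buffered / video.totalBytes >= 0.05) {
      void video.play();
    }
  }

  // Poll until playback has started, then stop.
  const startupTimer = window.setInterval(() => {
    const video =
      document.querySelector("video") as MediaElementWithBytes | null;
    if (video && !video.paused) {
      window.clearInterval(startupTimer);
    } else if (video) {
      maybeStartPlayback(video);
    }
  }, 250);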

While the buffered time ranges provide an approximation of this, they can be
very far off the mark for some types of content.  Similarly, there are
scenarios where buffered byte counts are also incorrect, but the
overwhelming majority of the time the video will be a static file served by
a standard server, such that it's trivial for the user agent to provide that
data.
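
As a further illustration of why bytes are the more useful measure (again
TypeScript, reusing the MediaElementWithBytes interface assumed in the sketch
above): with variable-bitrate content, a byte-based estimate of whether the
download is keeping ahead of playback can differ substantially from anything
derived from the buffered time ranges alone.  The rate-sampling bookkeeping
below is purely illustrative:

  // Track the download rate from the growth of the buffered byte ranges.
  // sampleDownloadRate would be called periodically, e.g. from a "progress"
  // event handler.  (MediaElementWithBytes is declared in the earlier sketch.)
  let lastBytes = 0;
  let lastSampleMs = Date.now();
  let bytesPerSecond = 0;

  function sampleDownloadRate(video: MediaElementWithBytes): void {
    const ranges = video.bufferedBytes;
    const bytesNow = ranges.length > 0 ? ranges.end(ranges.length - 1) : 0;
    const now = Date.now();
    const elapsedSeconds = (now - lastSampleMs) / 1000;
    if (elapsedSeconds > 0) {
      bytesPerSecond = (bytesNow - lastBytes) / elapsedSeconds;
    }
    lastBytes = bytesNow;
    lastSampleMs = now;
  }

  // True if, at the measured rate, the rest of the file should finish
  // downloading before the remaining media finishes playing back.
  function downloadKeepsAhead(video: MediaElementWithBytes): boolean {
    if (video.totalBytes === 0 || bytesPerSecond <= 0 ||
        !isFinite(video.duration)) {
      return false; // not enough information for a byte-based judgement
    }
    const secondsToFinishDownload =
      (video.totalBytes - lastBytes) / bytesPerSecond;
    const secondsLeftToPlay = video.duration - video.currentTime;
    return secondsToFinishDownload < secondsLeftToPlay;
  }

A site could use a check like this both for the startup decision above and
for deciding when to switch to a higher- or lower-bitrate version of the
video.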

-John


On Mon, Oct 13, 2008 at 6:06 PM, Ian Hickson <ian@hixie.ch> wrote:

> On Thu, 18 Sep 2008, Dave Singer wrote:
> >
> > <http://www.w3.org/html/wg/html5/#media>
> >
> > From the spec.:
> >
> # The bufferedBytes attribute must return a static normalized ByteRanges
> # object that represents the ranges of the media resource, if any, that
> # the user agent has buffered, at the time the attribute is evaluated.
> #
> # The totalBytes attribute must return the length of the media resource,
> # in bytes, if it is known and finite. If it is not known, is infinite
> # (e.g. streaming radio), or if no media data is available, the attribute
> # must return 0.
> >
> > We don't think these are well-defined for a whole host of cases:
> > -- live streams
> > -- SMIL files that reference several media files
> > -- media files like MOV and MP4 that reference media files, or even MP4
> > or MOV files that are not interleaved in time order
> > -- streaming protocols in general (non-buffering)
> >
> > It's by no means clear to us what these attributes are for -- what the
> > use cases are.  We think they should be removed, or supported with use
> > cases that are able to show why they are useful despite all these cases
> > where either their meaning or utility (or both) are unclear...
>
> These attributes were added primarily in response to a request from
> YouTube, if I recall correctly. The problem they solve is that there can
> be a great difference between the amount of time buffered and the number
> of bytes buffered, and a clever author would find information about the
> bytes buffered to be far more useful than the amount of time available.
>
> I agree that for many use cases, these are underdefined. I would be
> interested in hearing feedback on what would be the best way to resolve
> this problem. If we remove the feature altogether, we should have an
> alternative of some kind, though.
>
> --
> Ian Hickson               U+1047E                )\._.,--....,'``.    fL
> http://ln.hixie.ch/       U+263A                /,   _.. \   _\  ;`._ ,.
> Things that are impossible just take longer.   `._.-(,_..'--(,_..'`-.;.'
>

Received on Friday, 17 October 2008 01:05:12 UTC