[Bug 18615] Define how SourceBuffer.buffered maps to HTMLMediaElement.buffered

https://www.w3.org/Bugs/Public/show_bug.cgi?id=18615

--- Comment #9 from Aaron Colwell <acolwell@chromium.org> ---
(In reply to comment #7)

I have a few concerns with this proposal, especially in relation to demuxed
content where different SourceBuffers are used for audio & video.

> Our proposal is as follows:
> 
> * The HTMLMediaElement.buffered range is the union of the time ranges of all
> the active audio and video ranges in the activeSourceBuffers collection.

My problem with using the union is that it hides the fact that we may not have
data buffered for all active streams. If the audio source buffer has data
buffered for [0-30] and the video source buffer has data buffered for [0-10]
and [20-30], the union would mislead the web application into thinking that it
has all the data needed to start playback at 15. In this situation a seek to 15
should stall, in my opinion.
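To make the difference concrete, here is a sketch in Python of the two range operations for that example (the helper names are illustrative, and ranges are modeled as (start, end) tuples rather than real TimeRanges objects):

```python
def union_ranges(a, b):
    """Merge two sorted range lists into their union."""
    merged = []
    for start, end in sorted(a + b):
        if merged and start <= merged[-1][1]:
            merged[-1] = (merged[-1][0], max(merged[-1][1], end))
        else:
            merged.append((start, end))
    return merged

def intersect_ranges(a, b):
    """Keep only the spans covered by BOTH sorted range lists."""
    out, i, j = [], 0, 0
    while i < len(a) and j < len(b):
        start = max(a[i][0], b[j][0])
        end = min(a[i][1], b[j][1])
        if start < end:
            out.append((start, end))
        if a[i][1] <= b[j][1]:
            i += 1
        else:
            j += 1
    return out

def contains(ranges, t):
    """True if time t falls inside one of the buffered ranges."""
    return any(start <= t <= end for start, end in ranges)

audio = [(0, 30)]            # audio buffered for [0-30]
video = [(0, 10), (20, 30)]  # video has a gap over [10-20]

union = union_ranges(audio, video)       # [(0, 30)]
common = intersect_ranges(audio, video)  # [(0, 10), (20, 30)]
```

A seek to 15 looks safe under the union (`contains(union, 15)` is true) even though the intersection correctly shows there is no video there (`contains(common, 15)` is false).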

> 
> * If there is video content but not audio then fill with silence.
> 
> * If there is audio content but not video content then fill with the final
> frame of video before the gap and play the audio.
> 
> * If there is no content for a range then stall the playback waiting for the
> application to fill it.

For muxed content I don't really have a problem with this, but when using
demuxed content, where the audio & video are in different source buffers, I
think these rules could lead to unexpected behavior. Say I only append data to
the video source buffer. If I understand these rules correctly, playback would
start immediately and silence would be output until the audio data actually
gets appended. That doesn't seem like a good user experience. Perhaps I'm
misunderstanding something here.

Another scenario where these rules seem problematic is when playback
encounters a gap in one of the SourceBuffers, like the example above. If
playback starts at 0, I'd expect it to stall when it reaches 10, since video
data is missing for the range [10-20]. If I understand these rules correctly,
you'd want playback to continue over this gap rather than stall.

> 
> * We don't think this should change based on endOfStream.

The reason I have this behavior changing based on endOfStream is that until
endOfStream() is called, there is no way to differentiate whether we are
actually at the end of the content or whether we are waiting for more content
to be appended. In the former case, it seems like we could follow the rules you
suggest, but in the latter case, where we don't know whether more content is
going to be appended, it seems better to stall when the playback position
reaches an area where data is missing. That is why I was suggesting the use of
the intersection instead of the union: the intersection reflects the ranges
where playback won't stall.
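A sketch of what I mean, in Python (the `ended` flag and helper names are illustrative, not from any spec text, and ranges are modeled as (start, end) tuples):

```python
from functools import reduce

def intersect_ranges(a, b):
    """Keep only the spans covered by BOTH sorted range lists."""
    out, i, j = [], 0, 0
    while i < len(a) and j < len(b):
        start, end = max(a[i][0], b[j][0]), min(a[i][1], b[j][1])
        if start < end:
            out.append((start, end))
        if a[i][1] <= b[j][1]:
            i += 1
        else:
            j += 1
    return out

def media_element_buffered(per_buffer_ranges, ended):
    """Map each SourceBuffer's ranges to HTMLMediaElement.buffered.

    Before endOfStream(): plain intersection -- report only spans where
    every active stream has data, so playback stalls at real gaps.
    After endOfStream(): no more data is coming, so extend each stream's
    final range to the highest buffered end time before intersecting;
    a stream that simply ends a little early no longer truncates the result.
    """
    ranges = [list(r) for r in per_buffer_ranges]
    if ended:
        highest_end = max(r[-1][1] for r in ranges if r)
        for r in ranges:
            if r:
                r[-1] = (r[-1][0], highest_end)
    return reduce(intersect_ranges, ranges)

audio = [(0, 30)]
video = [(0, 10), (20, 28)]  # video ends slightly before the audio

media_element_buffered([audio, video], ended=False)  # [(0, 10), (20, 28)]
media_element_buffered([audio, video], ended=True)   # [(0, 10), (20, 30)]
```

Before endOfStream() the reported ranges stop wherever any stream is missing data; after it, the trailing difference between the streams no longer shortens the final range.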

Received on Tuesday, 29 January 2013 18:45:43 UTC