[media-source] Is HTMLVideoElement.canplay event expected to be dispatched following MediaSource.appendBuffer()?

guest271314 has just created a new issue for https://github.com/w3c/media-source:

== Is HTMLVideoElement.canplay event expected to be dispatched following MediaSource.appendBuffer()? ==
According to the [specification](https://html.spec.whatwg.org/multipage/media.html#event-media-canplay)

> Event name: `canplay`
>
> Interface: `Event`
>
> Fired when: The user agent can resume playback of the media data, but estimates that if playback were to be started now, the media resource could not be rendered at the current playback rate up to its end without having to stop for further buffering of content.
>
> Preconditions: `readyState` newly increased to `HAVE_FUTURE_DATA` or greater.


[`HAVE_FUTURE_DATA` (numeric value 3)](https://html.spec.whatwg.org/multipage/media.html#dom-media-have_future_data)

> Data for the immediate current playback position is available, as well as enough data for the user agent to advance the current playback position in the direction of playback at least a little without immediately reverting to the HAVE_METADATA state, and the text tracks are ready. For example, in video this corresponds to the user agent having data for at least the current frame and the next frame when the current playback position is at the instant in time between the two frames, or to the user agent having the video data for the current frame and audio data to keep playing at least a little when the current playback position is in the middle of a frame. The user agent cannot be in this state if playback has ended, as the current playback position can never advance in this case.
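
For reference, the precondition above can be observed directly by logging `readyState` as the readiness events fire. A minimal sketch, assuming `video` is the `<video>` element already attached to the `MediaSource`:

```javascript
// Log readyState transitions so the HAVE_FUTURE_DATA precondition for
// `canplay` can be observed directly; `video` is assumed to be the
// <video> element already attached to the MediaSource.
const video = document.querySelector("video");
for (const type of ["loadedmetadata", "loadeddata", "canplay", "canplaythrough"]) {
  video.addEventListener(type, () => {
    console.log(
      type,
      "readyState:", video.readyState,
      ">= HAVE_FUTURE_DATA:", video.readyState >= HTMLMediaElement.HAVE_FUTURE_DATA
    );
  });
}
```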

When an `ArrayBuffer` is appended to a `SourceBuffer` of a `MediaSource` instance using `appendBuffer()`, `canplay` is fired in Chromium 60, though not in Firefox 55.

Is `canplay` expected to be fired in Firefox when an `ArrayBuffer` is appended to a `MediaSource` instance?
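
A minimal, self-contained sketch of the case being asked about follows; the media URL and codec string are placeholders rather than values from the original report:

```javascript
// Self-contained repro sketch; the URL and codec string are placeholders.
const video = document.querySelector("video");
const mimeCodec = 'video/webm; codecs="vp8, vorbis"'; // assumed codec
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener("sourceopen", async () => {
  const sourceBuffer = mediaSource.addSourceBuffer(mimeCodec);
  // Placeholder: fetch roughly one second of media as an ArrayBuffer.
  const mediaBuffer = await (await fetch("media-fragment.webm")).arrayBuffer();
  video.addEventListener("canplay", () => {
    // Fires in Chromium 60 after the append; reportedly not in Firefox 55.
    console.log("canplay, readyState:", video.readyState);
  });
  sourceBuffer.addEventListener("updateend", () => {
    console.log("updateend, readyState:", video.readyState);
  }, { once: true });
  sourceBuffer.appendBuffer(mediaBuffer);
}, { once: true });
```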

When the `autoplay` attribute is set, why is `video.paused` `true` in Firefox 55, though not in Chromium 60, when the duration of the appended media is approximately one second?

Why are the implementations inconsistent between Chromium and Firefox?

Not sure if the issue is a Blink or Gecko bug.
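
In the meantime, one possible way to sidestep the divergence is to not rely on `autoplay` or `canplay` at all and instead start playback explicitly once the append completes, handling the promise returned by `play()`. A sketch, assuming `video`, `sourceBuffer`, and `mediaBuffer` are set up as in the code below:

```javascript
// Start playback explicitly after the append completes instead of relying
// on `autoplay` or `canplay`; `video`, `sourceBuffer`, and `mediaBuffer`
// are assumed to exist as in the code below.
sourceBuffer.addEventListener("updateend", async () => {
  try {
    await video.play(); // resolves once playback has actually started
    console.log("playing", video.readyState, video.paused);
  } catch (err) {
    // play() returns a promise that can reject, e.g. when autoplay is blocked
    console.error("play() rejected:", err);
  }
}, { once: true });
sourceBuffer.appendBuffer(mediaBuffer);
```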

Relevant code

HTML

`<video preload="auto" autoplay width="320" height="280" controls></video>`

JavaScript

```javascript
// ..
const sourceBuffer = mediaSource.addSourceBuffer(mimeCodec);
sourceBuffer.appendWindowStart = 0;
sourceBuffer.appendWindowEnd = mediaDuration;

// Register the handlers before appending, then append the buffer below.
sourceBuffer.onupdateend = (e) => {
  sourceBuffer.onupdateend = null;
  console.log(mediaDuration, mediaSource.duration);
  video.currentTime = 0;
  // Chromium 60 renders the expected result;
  // Firefox 55 never reaches the `canplay` event.
  // Logged values are readyState, paused, ended, currentTime:
  //   Chromium 60: 1 false false 0
  //   Firefox 55:  1 true  false 0   <-- Firefox issue here
  console.log(video.readyState, video.paused, video.ended, video.currentTime);
  video.oncanplay = async (e) => {
    video.oncanplay = null;
    console.log(e);
    console.log("media source playing", video.readyState);
    // Firefox issue here; playback needs to continue
    const play = await video.play();
    // ..
  };
};

sourceBuffer.appendBuffer(mediaBuffer);
// ..
```
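
One detail worth checking: the snippet above never signals end of stream. Per the Media Source Extensions spec, calling `mediaSource.endOfStream()` after the final append tells the user agent that no further data is coming, which may be what allows `readyState` to advance in some implementations. A sketch of that variant, under the same assumptions as the code above:

```javascript
// Variant: signal end of stream after the final append so the user agent
// knows no further data is coming. Same assumptions as the snippet above.
sourceBuffer.addEventListener("updateend", () => {
  if (!sourceBuffer.updating && mediaSource.readyState === "open") {
    mediaSource.endOfStream();
  }
  console.log("after endOfStream, readyState:", video.readyState);
}, { once: true });
sourceBuffer.appendBuffer(mediaBuffer);
```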

Please view or discuss this issue at https://github.com/w3c/media-source/issues/196 using your GitHub account

Received on Monday, 4 September 2017 19:56:01 UTC