[media-source] How to correctly append buffers to play multiple media files in sequence?

guest271314 has just created a new issue for https://github.com/w3c/media-source:

== How to correctly append buffers to play multiple media files in sequence? ==
I have been attempting to implement, for lack of a more descriptive term, an "offline media context". The basic concept is to use only the tools available in recent browsers to record or request media fragments capable of independent playback, and to concatenate those discrete media fragments into a single stream of media playback at an `HTMLMediaElement`. A brief summary of the progression of the proof of concept is at [Proposal: Implement OfflineMediaContext #2824](https://github.com/guest271314/OfflineMediaContext).
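
For reference, attaching a `MediaSource` to an `HTMLMediaElement` is the mechanism all of the attempts below build on; a minimal sketch, assuming a `<video>` element and a WebM codec string (both assumptions, not the actual test files):

    // Minimal sketch: attach a MediaSource to a media element.
    // The element selector and MIME/codec string are assumptions.
    const video = document.querySelector('video');
    const mediaSource = new MediaSource();
    video.src = URL.createObjectURL(mediaSource);
    mediaSource.addEventListener('sourceopen', () => {
      const sourceBuffer = mediaSource.addSourceBuffer('video/webm; codecs="vp8, vorbis"');
      // ArrayBuffers from the discrete files are appended to `sourceBuffer` here
    });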

From the outset I have had the sense that `MediaSource` could possibly be used to achieve part, if not all, of the requirement. However, I had neither located an existing pattern nor worked out an appropriate one in my own testing to realize the concatenation of discrete files using `MediaSource`.

I found this question and answer, [How do i append two video files data to a source buffer using media source api?](https://stackoverflow.com/questions/14108536/how-do-i-append-two-video-files-data-to-a-source-buffer-using-media-source-api/), which appeared to indicate that setting the `.timestampOffset` property of a `SourceBuffer` could result in sequenced media playback of discrete buffers appended to that `SourceBuffer`. Following the question led to a 2012 Editor's Draft, [Media Source Extensions, W3C Editor's Draft 8 October 2012](https://rawgit.com/w3c/media-source/416b646/media-source.html), which states at [2.11. Applying Timestamp Offsets](https://rawgit.com/w3c/media-source/416b646/media-source.html#source-buffer-timestamp-offsets):


> Here is a simple example to clarify how timestampOffset can be used. Say I have two sounds I want to play in sequence. The first sound is 5 seconds long and the second one is 10 seconds. Both sound files have timestamps that start at 0. First append the initialization segment and all media segments for the first sound. Now set timestampOffset to 5 seconds. Finally append the initialization segment and media segments for the second sound. This will result in a 15 second presentation that plays the two sounds in sequence.
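
Read literally, that example maps to something like the following sketch; the `<audio>` element, the Opus/WebM codec string, and the hard-coded 5-second offset are assumptions taken only from the quoted text:

    // Hypothetical sketch of the quoted 2012 example: a 5 s sound followed
    // by a 10 s sound, both with timestamps starting at 0.
    const audio = document.querySelector('audio');
    const mediaSource = new MediaSource();
    audio.src = URL.createObjectURL(mediaSource);
    mediaSource.addEventListener('sourceopen', () => {
      const sourceBuffer = mediaSource.addSourceBuffer('audio/webm; codecs="opus"');
      // 1. Append the initialization segment and all media segments for the
      //    first sound, waiting for `updateend` between appends.
      // 2. Then offset the second sound by the duration of the first:
      sourceBuffer.timestampOffset = 5;
      // 3. Append the initialization segment and media segments for the
      //    second sound; the result should be a 15 s presentation.
    });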

I tried that approach dozens of times using different patterns over the past several days. Interestingly, all attempts using `.webm` video files failed, generally resulting in the following being logged at the `console` at [plnkr](https://plnkr.co):

> `Uncaught DOMException: Failed to execute 'appendBuffer' on 'SourceBuffer': This SourceBuffer has been removed from the parent media source.`

> `Uncaught DOMException: Failed to set the 'timestampOffset' property on 'SourceBuffer': This SourceBuffer has been removed from the parent media source.`
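
As the messages state, both exceptions indicate the `SourceBuffer` is no longer attached to its parent `MediaSource` at the time of the call. A small diagnostic guard along these lines (the function and variable names are mine) at least surfaces the state when the call fails:

    // Diagnostic sketch (assumed names): only touch the SourceBuffer while it
    // is still attached and no append is in flight.
    function safeAppend(mediaSource, sourceBuffer, buffer) {
      if (mediaSource.readyState === 'closed') {
        // A closed MediaSource has had its SourceBuffers removed.
        console.error('MediaSource is closed; SourceBuffer has been removed');
        return;
      }
      if (sourceBuffer.updating) {
        console.error('Previous append still pending; wait for "updateend"');
        return;
      }
      sourceBuffer.appendBuffer(buffer);
    }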

All attempts using `.mp4` video files failed save for a single `.mp4` file, a downloaded copy of the "Big Buck Bunny" trailer. I am not entirely sure where the file was downloaded from during testing, though it may have been "https://nickdesaulniers.github.io/netfix/demo/frag_bunny.mp4". Is this fact related to what is noted in the [FFmpeg FAQ](http://www.ffmpeg.org/faq.html#How-can-I-concatenate-video-files)

> A few multimedia containers (MPEG-1, MPEG-2 PS, DV) allow one to concatenate video by merely concatenating the files containing them.

?
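
As a side check while investigating the container question, `MediaSource.isTypeSupported()` reports whether a given MIME/codec string is accepted at all by the browser's implementation; the strings below are assumptions, not necessarily those of the files actually tested:

    // Probe which MIME/codec strings this MediaSource implementation accepts.
    // The example strings are assumptions; substitute those of the real files.
    [
      'video/webm; codecs="vp8, vorbis"',
      'video/webm; codecs="vp9, opus"',
      'video/mp4; codecs="avc1.42E01E, mp4a.40.2"'
    ].forEach(type => console.log(type, MediaSource.isTypeSupported(type)));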

I made a copy of the original file in the same directory, used `<input type="file">` with the `multiple` attribute set to upload the files, converted the `File` objects to `ArrayBuffer`s using `FileReader`, and used, in pertinent part, this pattern:

    // code within a loop, within `Promise` constructor
    // `reader` is a `FileReader` instance
    reader.onload = () => {
      reader.onload = null;
      sourceBuffer.appendBuffer(reader.result);
      sourceBuffer.onupdateend = e => {
        sourceBuffer.onupdateend = null;
        // `chunk.mediaDuration` is the `.duration` of the media, retrieved from a `loadedmetadata` event
        sourceBuffer.timestampOffset += chunk.mediaDuration;
        // read next `File` object
        resolve();
      };
    };
    reader.readAsArrayBuffer(chunk.mediaBuffer);
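
For completeness, here is a sketch of the fuller shape of what was attempted around that snippet; `chunks`, `mediaBuffer`, `mediaDuration`, and the codec string are my own assumed names, not part of any specification:

    // Sketch: read each File, append it, wait for `updateend`, then advance
    // timestampOffset by that file's duration before appending the next file.
    // `chunks` is assumed to be an array of { mediaBuffer, mediaDuration }.
    const video = document.querySelector('video');
    const mediaSource = new MediaSource();
    video.src = URL.createObjectURL(mediaSource);
    mediaSource.addEventListener('sourceopen', async () => {
      const sourceBuffer = mediaSource.addSourceBuffer('video/webm; codecs="vp8, vorbis"');
      for (const chunk of chunks) {
        // Read the File into an ArrayBuffer.
        const buffer = await new Promise(resolve => {
          const reader = new FileReader();
          reader.onload = () => resolve(reader.result);
          reader.readAsArrayBuffer(chunk.mediaBuffer);
        });
        // Append it and wait for the append to complete.
        await new Promise(resolve => {
          sourceBuffer.onupdateend = () => {
            sourceBuffer.onupdateend = null;
            resolve();
          };
          sourceBuffer.appendBuffer(buffer);
        });
        // Offset the next file by this file's duration.
        sourceBuffer.timestampOffset += chunk.mediaDuration;
      }
      mediaSource.endOfStream();
    });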

The questions I have for authors of and contributors to the specification are:

1. What is the correct code pattern (as clear and definitive as possible) to use to append array buffers from discrete files or media fragments to one or more `SourceBuffer`s of a `MediaSource`, such that the `HTMLMediaElement` renders playback of each of the files or media fragments in sequence?

2. Why were the single `.mp4` file and its copy the only two files for which `MediaSource` correctly set the `.duration` to the total time of the two files and rendered playback?

Please view or discuss this issue at https://github.com/w3c/media-source/issues/190 using your GitHub account

Received on Friday, 25 August 2017 16:04:21 UTC