Re: Streaming with Media Source

Hi Aymeric,

Comments inline...

On Wed, Mar 26, 2014 at 8:24 AM, Aymeric Vitte <vitteaymeric@gmail.com> wrote:

> Thanks, yes we can share experiences (I thought it was not implemented in
> FF; from which version?). I saw your tutorial already, very well explained.
> What I want to do is a bit different and very simple:
>
> Usually you use something like <video src='movie.mp4' />, then you get the
> chunks via HTTP and the video is displayed.
>
> I want to do exactly the same thing, I receive the chunks (from the Peersm
> protocol via WebSockets or WebRTC Data Channels) and MSE displays the movie.
>

MSE was not designed for this. To support this in the generic case, the code
fetching the data needs to understand the file format so it can properly
fetch the necessary chunks for playback and make tradeoffs about the size
of the chunk to fetch vs the number of bytes actually needed in that chunk.


>
> I have read from some Aaron posts that MSE only works with fragmented
> files, in addition you have to submit the fragments as defined in the
> manifest file (or at least the initialization segment? I have not tested
> this for now).
>

The manifest file isn't required by MSE, but that is typically how specs
like DASH describe the layout & location of the fragments.


>
> It does not work for mp3 (as far as I saw, the Web Audio API does not
> allow me to do what I want either).
>

I plan on submitting an MSE byte stream format specification for MP3 and
AAC in ADTS in the next day or two. I'm just putting the final touches on
the language. Chrome has experimental support for both of these formats
right now.
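For anyone wanting to probe this from script, a small sketch using `MediaSource.isTypeSupported`; the MIME/codec strings below are illustrative assumptions, not normative:

```javascript
// Return the first MIME type the browser's MSE implementation accepts,
// or null when none match (or when MediaSource is unavailable).
function pickSupportedType(candidates) {
  if (typeof MediaSource === 'undefined' ||
      typeof MediaSource.isTypeSupported !== 'function') {
    return null;
  }
  for (var i = 0; i < candidates.length; i++) {
    if (MediaSource.isTypeSupported(candidates[i])) return candidates[i];
  }
  return null;
}

var preferred = pickSupportedType([
  'video/mp4; codecs="avc1.42E01E, mp4a.40.2"', // fragmented MP4 (placeholder codecs)
  'video/webm; codecs="vp8, vorbis"',           // WebM
  'audio/mpeg'                                  // MP3 (experimental in Chrome at the time)
]);
```

A check like this is how a player can decide at runtime whether to fall back to a different container or codec.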


>
> Formats available inside browsers are not numerous, and now the API is
> restricted to fragmented files, so I find it extremely limited.
>

I'm sorry you feel this way. Fragmented files allow segments of media to be
self-contained and only contain local references. This makes dealing with
chunked media much easier. For the generic case you have to deal with
absolute file references and arbitrary file layout, which is much more
difficult to work with. The API would look very different and would not be
able to provide many of the features that MSE does.


>
> I have a hard time believing that no API envisioned that you might want
> to stream video and audio based on the received chunks; is that the case?
>

Again, your use case is different from what MSE's goals were. It appears that
you simply want to replace the UA's HTTP fetching code with your own P2P
code. That is fine, but not what MSE was designed to solve.


>
> Do IE and FF only implement it too for fragmented files?
>

Yes. They conform to the MSE spec
<https://dvcs.w3.org/hg/html-media/raw-file/default/media-source/media-source.html>
and implement the requirements laid out in the byte stream format specifications
<https://dvcs.w3.org/hg/html-media/raw-file/default/media-source/byte-stream-format-registry.html>
they have chosen to support. For MP4 in particular, this means fragmented files
only.

I hope this helps you understand,

Aaron


>
> Regards
>
> Aymeric
>
> On 25/03/2014 at 17:46, Jay Munro wrote:
>
>  How are you preparing your source files? Do they have the initialization
>> box/packet (normally on any mp4 or webm file) for at least the first
>> segment?
>>
>> I've created an MSE player for DASH mp4 files. I'm not an expert, but
>> I've gotten it to work on a single stream on IE and Firefox. I would like
>> to get it to work on Chrome, so maybe we can share experiences.
>>
>> I don't know if this will help, but here's a blog post I wrote on one way
>> to do it (there seems to be quite a few):
>> http://blogs.msdn.com/b/mediastuff/archive/2014/02/19/video-streaming-without-plug-ins-or-special-servers.aspx
>>
>> -Jay
>>
>>
>> -----Original Message-----
>> From: Aymeric Vitte [mailto:vitteaymeric@gmail.com]
>> Sent: Tuesday, March 25, 2014 4:23 AM
>> To: public-html-media@w3.org
>> Cc: acolwell@chromium.org
>> Subject: Streaming with Media Source
>>
>> Hi,
>>
>> For [1] I have started to implement streaming for audio and video using
>> the Media Source API; the app retrieves chunks and appends them to the
>> "source" of audio/video tags.
>>
>> This was supposed to be simple, but it is not: the API (or at least the
>> Chrome implementation) seems to handle only adaptive-rate/manifest
>> structures for both mp4 and webm, and it's unclear whether the audio tag
>> is supported.
>>
>> I am not an expert in video/audio formats, but I don't quite see what the
>> issue is with appending chunks to a destination, or with turning them into
>> a stream (since I am participating in the Streams API, I know it's not
>> there yet), outside of other interfaces such as WebRTC.
>>
>> For the final phase of the project, WebRTC will be used, but with
>> DataChannels, so the issue remains.
>>
>> I have looked at all the existing APIs and, unless I am missing something,
>> I still don't see how to achieve this simple thing. How can it be done?
>>
>> Regards,
>>
>> Aymeric
>>
>> [1] http://www.peersm.com
>>
>> --
>> Peersm : http://www.peersm.com
>> node-Tor : https://www.github.com/Ayms/node-Tor
>> GitHub : https://www.github.com/Ayms
>>
>>
>>
> --
> Peersm : http://www.peersm.com
> node-Tor : https://www.github.com/Ayms/node-Tor
> GitHub : https://www.github.com/Ayms
>
>

Received on Wednesday, 26 March 2014 16:32:48 UTC