Re: Streaming with Media Source

From: Aaron Colwell <acolwell@chromium.org>
Date: Tue, 25 Mar 2014 10:22:44 -0700
Message-ID: <CAA0c1bD_i8es5b5AAT-N1VRd5E8sS3g+RF31hi=2Zu6eOGn7tg@mail.gmail.com>
To: Jay Munro <jaymunro@microsoft.com>
Cc: Aymeric Vitte <vitteaymeric@gmail.com>, "public-html-media@w3.org" <public-html-media@w3.org>

Hi Jay,

I believe your demo should work if you specify the codecs you are using in
the mimetype you pass to addSourceBuffer(). It looks like your code only
passes in "video/mp4". Chrome requires that the codecs also be specified, so
you should pass in 'video/mp4;codecs="avc1.4d0020,mp4a.40.2"', since those
appear to be the codecs specified in the DASH manifest.
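A minimal sketch of that fix (the codec IDs are the ones quoted from the DASH manifest in this thread; the MediaSource setup in the comments is a generic MSE pattern, not Jay's actual code):

```javascript
// Build the full MIME type string Chrome expects for addSourceBuffer().
// The codec IDs are the ones from the DASH manifest discussed in this thread.
function mimeWithCodecs(container, codecs) {
  return container + ';codecs="' + codecs.join(',') + '"';
}

const type = mimeWithCodecs('video/mp4', ['avc1.4d0020', 'mp4a.40.2']);
// type === 'video/mp4;codecs="avc1.4d0020,mp4a.40.2"'

// In the browser (sketch): feature-detect, then create the SourceBuffer
// once the MediaSource has opened.
// const mediaSource = new MediaSource();
// video.src = URL.createObjectURL(mediaSource);
// mediaSource.addEventListener('sourceopen', () => {
//   if (MediaSource.isTypeSupported(type)) {
//     const sourceBuffer = mediaSource.addSourceBuffer(type);
//     // append the initialization segment first, then media segments,
//     // via sourceBuffer.appendBuffer(arrayBuffer)
//   }
// });
```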


On Tue, Mar 25, 2014 at 9:46 AM, Jay Munro <jaymunro@microsoft.com> wrote:

> How are you preparing your source files? Do they have the initialization
> box/packet (normally in any mp4 or webm file) for at least the first
> segment?
>
> I've created an MSE player for DASH mp4 files. I'm not an expert, but I've
> gotten it to work on a single stream in IE and Firefox. I would like to get
> it to work in Chrome, so maybe we can share experiences.
>
> I don't know if this will help, but here's a blog post I wrote on one way
> to do it (there seem to be quite a few):
> http://blogs.msdn.com/b/mediastuff/archive/2014/02/19/video-streaming-without-plug-ins-or-special-servers.aspx
>
> -Jay
> -----Original Message-----
> From: Aymeric Vitte [mailto:vitteaymeric@gmail.com]
> Sent: Tuesday, March 25, 2014 4:23 AM
> To: public-html-media@w3.org
> Cc: acolwell@chromium.org
> Subject: Streaming with Media Source
> Hi,
>
> For [1] I have started to implement streaming for audio and video using
> the Media Source API; the app retrieves chunks and appends them to the
> "source" of audio/video tags.
>
> This was supposed to be simple, but it is not: the API (or at least the
> Chrome implementation) seems to handle only adaptive-rate/manifest
> structures, both for mp4 and webm, and it's unclear whether the audio tag
> is supported.
>
> I am not an expert in video/audio formats, but I don't quite see what the
> issue is with appending chunks to a destination, or with turning it into a
> stream (since I am participating in the Streams API, I know it's not there
> yet), outside of other interfaces such as WebRTC.
>
> For the final phase of the project WebRTC will be used, but with
> DataChannels, so the issue remains.
>
> I have looked at all existing APIs and, unless I am missing something, I
> still don't see how to achieve this simple thing. How can it be done?
>
> Regards,
>
> Aymeric
>
> [1] http://www.peersm.com
> --
> Peersm : http://www.peersm.com
> node-Tor : https://www.github.com/Ayms/node-Tor
> GitHub : https://www.github.com/Ayms
Received on Tuesday, 25 March 2014 17:23:13 UTC

This archive was generated by hypermail 2.3.1 : Tuesday, 6 January 2015 20:33:02 UTC