Re: Streaming with Media Source

Hi Aaron,

Aaron Colwell <acolwell@chromium.org>, 2014-03-25 10:22 -0700:

> Hi Jay,
> 
> I believe your demo should work if you specify the codecs you are using in
> the MIME type you pass to addSourceBuffer(). It looks like your code only
> passes in "video/mp4". Chrome requires that the codecs also be specified, so
> you should pass in 'video/mp4;codecs="avc1.4d0020,mp4a.40.2"', since those
> appear to be the codecs specified in the DASH manifest.

It seems like a lot of developers who are trying to work with MSE in Chrome
are likely to stumble over that undocumented Chrome requirement, especially
since it's not a strict requirement in other places -- e.g., with .canPlayType.
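
Here's roughly the asymmetry I mean (just a sketch; the codec string is the
one from Aaron's example, not something I've verified against the actual
manifest):

    // canPlayType() is happy to answer for a bare container type:
    const video = document.createElement('video');
    video.canPlayType('video/mp4');  // "maybe" -- no codecs needed
    video.canPlayType('video/mp4; codecs="avc1.4d0020,mp4a.40.2"');  // typically "probably"

    // ...but Chrome's addSourceBuffer() wants the codecs spelled out:
    const mediaSource = new MediaSource();
    video.src = URL.createObjectURL(mediaSource);
    mediaSource.addEventListener('sourceopen', () => {
      // passing just 'video/mp4' here is what trips people up in Chrome
      const sourceBuffer =
        mediaSource.addSourceBuffer('video/mp4; codecs="avc1.4d0020,mp4a.40.2"');
    });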

Has there ever been any discussion about making it an actual requirement in
the MSE spec that the codecs parameter must be specified in the argument to
addSourceBuffer()? That way the spec could require it to throw if the
codecs param isn't specified in the MIME type.

(Incidentally, does Blink currently throw in the case that the codecs
parameter isn't in the MIME type that's passed to addSourceBuffer()? If
not, how else are web developers supposed to know what's going on when they
don't specify a codecs param and it doesn't work?)
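
About the only defensive pattern I can think of in the meantime is to
feature-detect the full type string up front and to wrap the call, something
like this (a sketch only -- I haven't checked what Blink actually reports for
the bare "video/mp4" type):

    const type = 'video/mp4; codecs="avc1.4d0020,mp4a.40.2"';

    // Ask up front rather than waiting for appends to silently misbehave.
    if (!('MediaSource' in window) || !MediaSource.isTypeSupported(type)) {
      console.error('MSE, or this container/codec combination, is unsupported:', type);
    }

    const video = document.querySelector('video');
    const mediaSource = new MediaSource();
    video.src = URL.createObjectURL(mediaSource);
    mediaSource.addEventListener('sourceopen', () => {
      try {
        const sourceBuffer = mediaSource.addSourceBuffer(type);
      } catch (e) {
        // The spec's addSourceBuffer() algorithm throws NotSupportedError for
        // an unsupported type; the open question is whether an under-specified
        // type (no codecs) takes that path too.
        console.error('addSourceBuffer() failed:', e);
      }
    });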

  --Mike

> On Tue, Mar 25, 2014 at 9:46 AM, Jay Munro <jaymunro@microsoft.com> wrote:
> 
> > How are you preparing your source files? Do they have the initialization
> > box/packet (normally on any mp4 or webm file) for at least the first
> > segment?
> >
> > I've created an MSE player for DASH mp4 files. I'm not an expert, but I've
> > gotten it to work on a single stream on IE and Firefox. I would like to get
> > it to work on Chrome, so maybe we can share experiences.
> >
> > I don't know if this will help, but here's a blog post I wrote on one way
> > to do it (there seem to be quite a few):
> > http://blogs.msdn.com/b/mediastuff/archive/2014/02/19/video-streaming-without-plug-ins-or-special-servers.aspx
> >
> > -Jay
> >
> >
> > -----Original Message-----
> > From: Aymeric Vitte [mailto:vitteaymeric@gmail.com]
> > Sent: Tuesday, March 25, 2014 4:23 AM
> > To: public-html-media@w3.org
> > Cc: acolwell@chromium.org
> > Subject: Streaming with Media Source
> >
> > Hi,
> >
> > For [1] I have started to implement streaming for audio and video using
> > the Media Source API; the app retrieves chunks and appends them to the
> > "source" of audio/video tags.
> >
> > This was supposed to be simple, but it is not: the API (or at least the
> > Chrome implementation) seems to handle only adaptive-rate/manifest
> > structures, both for mp4 and webm, and it's unclear whether the audio tag
> > is supported.
> >
> > I am not an expert in video/audio formats, but I don't really see what the
> > issue is with appending chunks to a destination, or with turning it into a
> > stream (since I am participating in the Streams API, I know it's not there
> > yet), outside of other interfaces such as WebRTC.
> >
> > For the final phase of the project, WebRTC will be used but with
> > DataChannels, so the issue remains.
> >
> > I have looked at all the existing APIs and, unless I am missing something,
> > I still don't see how to achieve this simple thing. How can it be done?
> >
> > Regards,
> >
> > Aymeric
> >
> > [1] http://www.peersm.com
> >

-- 
Michael[tm] Smith http://people.w3.org/mike

Received on Wednesday, 26 March 2014 09:40:11 UTC