Re: Streaming with Media Source

Hi Mike,

Comments inline...


On Wed, Mar 26, 2014 at 2:40 AM, Michael[tm] Smith <mike@w3.org> wrote:

> Hi Aaron,
>
> Aaron Colwell <acolwell@chromium.org>, 2014-03-25 10:22 -0700:
>
> > Hi Jay,
> >
> > I believe your demo should work if you specify the codecs you are using
> > in the mimetype you pass to addSourceBuffer(). It looks like your code
> > only passes in "video/mp4". Chrome requires that the codecs also be
> > specified, so you should pass in 'video/mp4;codecs="avc1.4d0020,mp4a.40.2"'
> > since those appear to be the codecs specified in the DASH manifest.
>
> It seems like a lot of developers who are trying to work with MSE in Chrome
> are likely to stumble over that undocumented Chrome requirement, especially
> since it's not a strict requirement in other places -- e.g., with
> .canPlayType().
>
> Has there ever been any discussion about making it an actual requirement in
> the MSE spec that the codecs parameter must be specified in the argument to
> addSourceBuffer()? That way the spec could require it to throw if the
> codecs param isn't specified in the mime type.
>

I could add a note that encourages specifying codecs and indicates that some
UAs may reject a mimetype if they aren't present. I suspect making the codecs
a normative requirement would result in pushback though.
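For illustration, a minimal sketch of the defensive pattern such a note would encourage: probing support with MediaSource.isTypeSupported() before calling addSourceBuffer(). The codec string is the one from this thread and is just an example; the withCodecs helper is hypothetical, not part of any API.

```javascript
// Build a full mimetype with the codecs parameter; Chrome's
// addSourceBuffer() rejects bare container types like "video/mp4".
function withCodecs(container, codecs) {
  return container + ';codecs="' + codecs + '"';
}

const mimeType = withCodecs('video/mp4', 'avc1.4d0020,mp4a.40.2');

// Browser-only portion, guarded so the helper above stays usable elsewhere.
if (typeof MediaSource !== 'undefined') {
  if (MediaSource.isTypeSupported(mimeType)) {
    const mediaSource = new MediaSource();
    mediaSource.addEventListener('sourceopen', () => {
      // Only add the buffer once the MediaSource is open.
      const sourceBuffer = mediaSource.addSourceBuffer(mimeType);
    });
  }
}
```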


>
> (Incidentally, does Blink currently throw in the case that the codecs
> parameter isn't in the mimetype that's passed to addSourceBuffer()? If
> not, how else are web developers supposed to know what's going on when they
> don't specify a codecs param and it doesn't work?)
>

Yes. Blink throws the NotSupportedError as the spec indicates. The error
text could be made a little more helpful so developers realize they need to
specify codecs, though.
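A hedged sketch of how a page can surface that error itself today, by catching the NotSupportedError that addSourceBuffer() throws for an unsupported (or codec-less) mimetype. The isNotSupported helper is hypothetical, introduced here just for illustration.

```javascript
// Classify an exception from addSourceBuffer(); "NotSupportedError" is
// the name the MSE spec mandates for an unsupported mimetype.
function isNotSupported(err) {
  return err && err.name === 'NotSupportedError';
}

// Browser-only: attempt the add and report a codec hint on failure.
if (typeof MediaSource !== 'undefined') {
  const mediaSource = new MediaSource();
  mediaSource.addEventListener('sourceopen', () => {
    try {
      mediaSource.addSourceBuffer('video/mp4'); // no codecs parameter
    } catch (err) {
      if (isNotSupported(err)) {
        console.error('mimetype rejected; did you include codecs="..."?');
      } else {
        throw err;
      }
    }
  });
}
```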

In general, there are a lot of things that can go wrong in MSE that are
difficult to inform the web developer about. In Chrome, at least, we've added
some extra logging in chrome://media-internals to help web developers figure
out what is wrong with the data they are appending. It doesn't log
everything, but I've been trying to make sure that it provides messages for
the common mistakes people make. Unfortunately, in this particular case,
chrome://media-internals wouldn't have helped either. I'm planning on fixing
that too. Stay tuned. :)

Aaron


>
>   --Mike
>
> > On Tue, Mar 25, 2014 at 9:46 AM, Jay Munro <jaymunro@microsoft.com>
> wrote:
> >
> > > How are you preparing your source files? Do they have the
> > > initialization box/packet (normally on any mp4 or webm file) for at
> > > least the first segment?
> > >
> > > I've created an MSE player for DASH mp4 files. I'm not an expert, but
> > > I've gotten it to work on a single stream on IE and Firefox. I would
> > > like to get it to work on Chrome, so maybe we can share experiences.
> > >
> > > I don't know if this will help, but here's a blog post I wrote on one
> > > way to do it (there seem to be quite a few):
> > > http://blogs.msdn.com/b/mediastuff/archive/2014/02/19/video-streaming-without-plug-ins-or-special-servers.aspx
> > >
> > > -Jay
> > >
> > >
> > > -----Original Message-----
> > > From: Aymeric Vitte [mailto:vitteaymeric@gmail.com]
> > > Sent: Tuesday, March 25, 2014 4:23 AM
> > > To: public-html-media@w3.org
> > > Cc: acolwell@chromium.org
> > > Subject: Streaming with Media Source
> > >
> > > Hi,
> > >
> > > For [1] I have started to implement streaming for audio and video using
> > > the Media Source API; the app retrieves chunks and appends them to
> > > the "source" of audio/video tags.
> > >
> > > This was supposed to be simple, but it is not: the API (or at least the
> > > Chrome implementation) seems to handle only adaptive-rate/manifest
> > > structures, both for mp4 and webm, and it's unclear if the audio tag is
> > > supported.
> > >
> > > I am not an expert in video/audio formats, but I don't quite see what
> > > the issue is with appending chunks to a destination, or with turning it
> > > into a stream (since I am participating in the Streams API work, I know
> > > it's not there yet), outside of other interfaces such as WebRTC.
> > >
> > > For the final phase of the project, WebRTC will be used but with
> > > DataChannels, so the issue remains.
> > >
> > > I have looked at all the existing APIs and, unless I am missing
> > > something, I still don't see how to achieve this simple thing. How can
> > > it be done?
> > >
> > > Regards,
> > >
> > > Aymeric
> > >
> > > [1] http://www.peersm.com
> > >
>
> --
> Michael[tm] Smith http://people.w3.org/mike
>

Received on Wednesday, 26 March 2014 16:11:51 UTC