- From: Aymeric Vitte <vitteaymeric@gmail.com>
- Date: Wed, 26 Mar 2014 16:24:17 +0100
- To: Jay Munro <jaymunro@microsoft.com>
- CC: "public-html-media@w3.org" <public-html-media@w3.org>, "acolwell@chromium.org" <acolwell@chromium.org>, mike@w3.org
Thanks, yes we can share experiences (I thought it was not implemented in FF; from which version?). I saw your tutorial already, very well explained. What I want to do is a bit different and very simple: usually you use something like <video src='movie.mp4' />, the browser gets the chunks via HTTP and the video is displayed. I want to do exactly the same thing, except that I receive the chunks myself (from the Peersm protocol via WebSockets or WebRTC Data Channels) and MSE displays the movie.

I have read in some of Aaron's posts that MSE only works with fragmented files; in addition you have to submit the fragments as defined in the manifest file (or at least the initialization segment? I have not tested this yet). It does not work for mp3 (and as far as I can see the Web Audio API does not allow what I want to do either). The formats available inside browsers are not numerous, and now the API is restricted to fragmented files, so I find it extremely limited.

I have a hard time believing that no API envisioned that you might want to stream video and audio based on the received chunks; is that the case? Do IE and FF also implement it only for fragmented files?

Regards

Aymeric

On 25/03/2014 17:46, Jay Munro wrote:
> How are you preparing your source files? Do they have the initialization box/packet (normally in any mp4 or webm file) for at least the first segment?
>
> I've created an MSE player for DASH mp4 files. I'm not an expert, but I've gotten it to work on a single stream on IE and Firefox. I would like to get it to work on Chrome, so maybe we can share experiences.
> I don't know if this will help, but here's a blog post I wrote on one way to do it (there seem to be quite a few): http://blogs.msdn.com/b/mediastuff/archive/2014/02/19/video-streaming-without-plug-ins-or-special-servers.aspx
>
> -Jay
>
> -----Original Message-----
> From: Aymeric Vitte [mailto:vitteaymeric@gmail.com]
> Sent: Tuesday, March 25, 2014 4:23 AM
> To: public-html-media@w3.org
> Cc: acolwell@chromium.org
> Subject: Streaming with Media Source
>
> Hi,
>
> For [1] I have started to implement streaming for audio and video using the Media Source API; the app retrieves chunks and appends them to the "source" of audio/video tags.
>
> This was supposed to be simple but it is not: the API (or at least the Chrome implementation) seems to handle only adaptive-rate/manifest structures, both for mp4 and webm, and it's unclear whether the audio tag is supported.
>
> I am not an expert in video/audio formats, but I don't see what the issue is with appending chunks to a destination, or with turning them into a stream (since I am participating in the Streams API, I know that's not there yet) outside of other interfaces such as WebRTC.
>
> For the final phase of the project, WebRTC will be used, but with DataChannels, so the issue remains.
>
> I have looked at all existing APIs and, unless I am missing something, I still don't see how to achieve this simple thing. How can it be done?
>
> Regards,
>
> Aymeric
>
> [1] http://www.peersm.com
>
> --
> Peersm : http://www.peersm.com
> node-Tor : https://www.github.com/Ayms/node-Tor
> GitHub : https://www.github.com/Ayms

--
Peersm : http://www.peersm.com
node-Tor : https://www.github.com/Ayms/node-Tor
GitHub : https://www.github.com/Ayms
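[Editor's note: the glue the thread describes, handing received chunks to MSE, can be sketched as below. This is a minimal sketch, not code from the thread: the function name is mine, and the mime/codec string and wiring in the comments are illustrative assumptions. The one non-obvious part is that SourceBuffer.appendBuffer throws an InvalidStateError if called while a previous append is still processing (while `updating` is true), so incoming chunks must be queued and flushed on 'updateend'.]

```javascript
// Sketch: feed received chunks (e.g. from a WebSocket 'message' event)
// into an MSE SourceBuffer. Chunks arriving while an append is in
// progress are queued and flushed one at a time on 'updateend'.
// `sourceBuffer` is anything exposing { updating, appendBuffer(), addEventListener() }.
function makeAppender(sourceBuffer) {
  const queue = [];
  sourceBuffer.addEventListener('updateend', () => {
    if (queue.length > 0 && !sourceBuffer.updating) {
      sourceBuffer.appendBuffer(queue.shift());
    }
  });
  return function push(chunk) {
    if (sourceBuffer.updating || queue.length > 0) {
      queue.push(chunk); // an append is in flight; defer this chunk
    } else {
      sourceBuffer.appendBuffer(chunk);
    }
  };
}

// In a browser the wiring would look roughly like this (illustrative,
// and subject to the fragmented-MP4/WebM constraint discussed above):
//   const ms = new MediaSource();
//   video.src = URL.createObjectURL(ms);
//   ms.addEventListener('sourceopen', () => {
//     const push = makeAppender(ms.addSourceBuffer('video/mp4; codecs="avc1.42E01E, mp4a.40.2"'));
//     ws.onmessage = (e) => push(new Uint8Array(e.data));
//   });
```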
Received on Wednesday, 26 March 2014 15:24:51 UTC