MediaStream processing on the Augmented Web Platform

Hi,

Here's some info on the presentation I gave at a WebRTC meetup 
earlier this week.

The presentation topic is:

   MediaStream processing on the Augmented Web Platform

In this presentation I call out what I think is the major technical 
innovation that will differentiate the Augmented Web.

   The www, or Web 1.0, was enabled by the "href" attribute that
   made links between pages work.

   Web 2.0 was enabled by "xhr" (XMLHttpRequest), which delivered
   AJAX, etc.

   And I believe the Augmented Web will be most significantly
   enabled by the ArrayBuffer.

It may sound a little arcane, but I remember the early discussions 
around xhr sounding similar. Anyway, have a look at my presentation 
and see what you think.
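To make the ArrayBuffer point a bit more concrete, here's a minimal sketch of the kind of per-pixel frame processing I mean. It assumes the usual browser pipeline (a getUserMedia video drawn to a canvas, with getImageData returning a Uint8ClampedArray view over an ArrayBuffer); the toGrayscale function and the sample pixel data below are just illustrative, not code from my repo.

```javascript
// Sketch: convert an RGBA frame buffer to grayscale in place.
// In the browser, `data` would come from
// ctx.getImageData(0, 0, w, h).data after drawing a getUserMedia
// video frame onto a canvas. Here we just operate on the typed
// array directly, which is all the pipeline stage ever sees.
function toGrayscale(data) {
  // data is a Uint8ClampedArray viewing an ArrayBuffer of RGBA bytes.
  for (let i = 0; i < data.length; i += 4) {
    // Luma approximation (ITU-R BT.601 weights).
    const y = 0.299 * data[i] + 0.587 * data[i + 1] + 0.114 * data[i + 2];
    data[i] = data[i + 1] = data[i + 2] = y; // alpha left untouched
  }
  return data;
}

// Example: a single red pixel (255, 0, 0, 255).
const frame = new Uint8ClampedArray([255, 0, 0, 255]);
toGrayscale(frame);
// frame is now [76, 76, 76, 255]
```

The point is that once the frame is just bytes in an ArrayBuffer, any stage (grayscale, thresholding, feature detection) is plain array arithmetic you can chain into a pipeline, or hand off to a Web Worker.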

The slides from my presentation are online here:
http://www.slideshare.net/robman/mediastream-processing-pipelines-on-the-augmented-web-platform

The code is online here:
https://github.com/buildar/getting_started_with_webrtc#image_processing_pipelinehtml

And a video of my presentation is online here (my talk starts at 
00h:11m:30s):
http://www.ustream.tv/recorded/38491979


I'd be interested to hear your thoughts and feedback.


roBman

Received on Wednesday, 11 September 2013 02:55:18 UTC