W3C home > Mailing lists > Public > whatwg@whatwg.org > May 2011

[whatwg] HTML5 based Eduvid Slidecasting Demo

From: Narendra Sisodiya <narendra@narendrasisodiya.com>
Date: Sun, 22 May 2011 20:46:02 +0530
Message-ID: <BANLkTikgt7M1t5s74x8HNvSF4iVmznTX9w@mail.gmail.com>
On Sun, May 22, 2011 at 7:19 PM, Odin <odin.omdal at gmail.com> wrote:

> On Sun, May 22, 2011 at 3:03 PM, Narendra Sisodiya
> <narendra at narendrasisodiya.com> wrote:
> > In fact, I too want to do the same.
>
> Cool!
>
> > Basically you want to send slides (via WebSocket or Comet) and sync them
> > with the video.
>
> Yes. My old system (early Errantia) did that, using comet.
>
> > Here is the mistake: sending slides in real time has no use. Please
> > do not take me wrong. In almost any conference or workshop, the
> > presenter already has the slides ready. So anything that is ready can
> > be pushed to the user directly; only the video/audio needs to be sent
> > in real time.
>
> You misunderstand :-) The slides are no problem at all: I upload them
> to the server, and when that is done it sends a Comet PUSH to the
> clients telling them that a new slide resides at
> http://example.com/images/test-conference/14.jpg . So that part is
> working swell.
>
> Also, I send the time when the slide appeared. However, syncing to
> the video/audio is the impossible part, because there is no way for
> the browser to know where the video is in time **in a live
> setting**. With archived video you can do this very easily: just
> read currentTime.
>
> We had a discussion about this, and that was why startOffsetTime made
> it into the spec:
>
>
> http://lists.whatwg.org/htdig.cgi/whatwg-whatwg.org/2010-May/thread.html#26342
>
>
>
> As you see, there is a lot of video buffering, so different people
> will be at different places in the real-time video.
>
> So I'm watching "live" and am only one minute off, whilst a friend
> watching via wireless internet is a full 10 minutes off because of
> his buffering, and so on.
>
> In order to sync slides to the video/audio in such cases (when people
> connect 10 minutes into the video and get a new currentTime = 0 at
> that point), you need a static reference point, but as of now all
> time is still relative. Getting startOffsetTime will give us a static
> time for knowing when to show the new slides (each of which will also
> carry a datetime field for syncing).
>
>
Yes, I got your point. I am sorry, I misunderstood your email. I was also
not aware of the problems with live video.
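The scheme described above, a static startOffsetTime plus the relative
currentTime compared against each slide's datetime, might be sketched like
this in JavaScript (illustrative only; the slide objects and field names
are assumptions drawn from this discussion, not code from either system):

```javascript
// Wall-clock time (ms) that the viewer's playback position corresponds to:
// the stream's start-of-timeline date plus however far into it we are.
function playbackWallClock(startOffsetTimeMs, currentTimeSec) {
  return startOffsetTimeMs + currentTimeSec * 1000;
}

// A slide should be shown once the viewer's playback has reached the
// wall-clock moment at which the slide appeared. `slides` is assumed to
// be sorted by its (hypothetical) `datetime` field, in milliseconds.
function currentSlide(slides, startOffsetTimeMs, currentTimeSec) {
  const now = playbackWallClock(startOffsetTimeMs, currentTimeSec);
  let visible = null;
  for (const slide of slides) {
    if (slide.datetime <= now) visible = slide; // latest slide already due
  }
  return visible;
}
```

With this, a viewer who is 10 minutes behind still gets the slide matching
his playback position, because the comparison uses the playback wall-clock
time rather than the viewer's local clock.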

> > So, no need to send slides in real time. Send all slides at once, and
> > then keep sending the configuration updates.
> > I will also try to work on this...
>
> Well, for some conferences this won't work, and I already have code to
> do it live, so I don't need to send slides afterwards. Anyway, both
> ways work. But you need to know where people are in the video in order
> to sync the slides, and that's where startOffsetTime comes in.
>
> Alternatively you might try to control the buffers/caches, but that's
> not always possible. I've tried before and can't really get it tight
> enough; there are too many variables, and Icecast might not be able to
> tweak itself to achieve really low-latency, low-buffering live
> streaming.
>
> --
> Best regards,
> Odin Hørthe Omdal <odin.omdal at gmail.com>
> http://velmont.no
>
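The buffering problem Odin describes, where currentTime alone cannot
identify a common point in a live stream, can be illustrated with two
hypothetical viewers (all values below are invented for the sketch, and
startOffsetTimeMs stands in for the startOffsetTime the thread discusses):

```javascript
// Map a viewer's relative playback position to wall-clock milliseconds.
function wallClock(viewer) {
  return viewer.startOffsetTimeMs + viewer.currentTimeSec * 1000;
}

// Viewer A joined at the start of the broadcast (wall-clock 0 for the
// sketch) and is 900 s in. Viewer B joined 600 s late, so B's timeline
// restarted and currentTime reads only 300 s at the same real moment.
const viewerA = { startOffsetTimeMs: 0,          currentTimeSec: 900 };
const viewerB = { startOffsetTimeMs: 600 * 1000, currentTimeSec: 300 };
```

Both viewers map to 900 000 ms of broadcast time even though their
currentTime values differ by 600 s; that shared absolute anchor is exactly
the static reference point the slide sync needs.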



-- 
Narendra Sisodiya
http://narendrasisodiya.com
Received on Sunday, 22 May 2011 08:16:02 UTC
