
Scenarios and Shared Motion

From: Njaal Borch <njaal.borch@norut.no>
Date: Fri, 19 Dec 2014 14:21:30 +0000
Message-ID: <CAOc996u52uDNKvfSVx_Cq7iNDFgL_-Z9MTWQsZBmR6v3QObMzQ@mail.gmail.com>
To: public-web-and-tv@w3.org
Hi all,

My colleague Ingar has already presented our ideas for timing related
issues as well as Shared Motion for multi device synchronization.  I’ve
gone through the “New Scenarios” on the wiki (
http://www.w3.org/2011/webtv/wiki/New_Ideas) and will try to come with some
suggestions as to how we would address them.

Note: The demos included are multi-device demos, so their effect is most
evident if you use different devices for the different URLs. If you only
have one computer in front of you, open the links in different browser
windows or tabs, and make sure that you can see them all.  Be aware that
the Chrome browser refuses to play media in two tabs at the same time (at
least in this Chrome user's experience).  If you want two tabs showing
video, you can open the second tab in an incognito window.

The first two UCs are about content identification via audio.  We are not
really doing anything to address these particular UCs.
UC2-3 Identical Media Stream Synchronization

   - The ability for two or more identical media streams to play in sync on
   separate devices. This can either be used to create an immersive
   experience or to allow multiple people to enjoy the same content
   simultaneously. For example, Alice and Bob are sitting on a train. They
   want to watch a streaming video over the web on their separate devices.
   They both go to the same web page with a unique ID. Once Alice clicks
   play, Bob's video should start playing immediately, or if there's a
   delay in playing, the video should jump forward to the position of
   Alice's video.


UC2-3 is exactly the kind of thing Shared Motion was designed to do. We've
created a small demonstration that lets you experiment a bit.  Due to a
lack of standardisation, we of course have to do various trickery to make
the media elements slave to our shared motion.  As such, we recommend
Chrome or Firefox on PCs and Macs, or Firefox on Android.  Chrome on
Android is currently non-functional for synchronization. If you use any
other browser you should be able to experience lip-sync, but you will
likely get an echo. At http://mcorp.no/examples/ted/video.html you can see
a video of Sting at the TED conference.  You must log in with a Google
account, which we use to ensure that you get your own Shared Motions on
all your devices (and for access control purposes).  One could of course
also share the URL of the Shared Motion in a session object or similar, to
allow Alice and Bob to watch it together.  This demonstration is also
useful for later scenarios, but let's do them in order.
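Since media elements cannot yet be slaved natively, the trickery boils
down to a control loop: compute the motion's deterministic position,
compare it with the element's currentTime, and either seek or nudge
playbackRate. A minimal sketch (the motion model, names, and thresholds
here are illustrative assumptions, not the actual Shared Motion API):

```javascript
// A motion vector: position p (seconds), velocity v (1 = normal
// playback), and the local clock time t0 (seconds) at which it was
// sampled. Position at any later time follows deterministically.
function motionPosition(motion, now) {
  return motion.p + motion.v * (now - motion.t0);
}

// One step of the sync loop: seek on large skew, nudge playbackRate on
// small skew, otherwise just play at the motion's velocity.
function syncStep(media, motion, now) {
  const target = motionPosition(motion, now);
  const skew = media.currentTime - target;
  if (Math.abs(skew) > 1.0) {
    media.currentTime = target;              // hard seek
    media.playbackRate = motion.v;
  } else if (Math.abs(skew) > 0.05) {
    // speed up or slow down slightly to catch up without a visible jump
    media.playbackRate = motion.v - skew * 0.5;
  } else {
    media.playbackRate = motion.v;
  }
  return skew;
}
```

Seeking on large errors and adjusting playbackRate on small ones is what
keeps corrections invisible; it is also why browsers with imprecise
currentTime or playbackRate implementations only reach lip-sync rather
than echo-free playback.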

UC2-4 Related Media Stream Synchronization

   - The ability for two or more related media streams to play in sync.
   This could be between multiple media elements on a single web page or on
   separate devices. For example, Eve is watching a figure skating video.
   The video stream is available with multiple camera angles. She'd like to
   see an overall view of the event in one media element and a close-up of
   the skater in another media element. Both streams should be
   synchronised.


The Norwegian public broadcaster NRK does on occasion film multi-angle
videos of train rides and the like, which are made publicly available
under Creative Commons licenses.  You can see one of these demonstrations
at http://mcorp.no/examples/holmenkollbanen/ - this particular view has
three videos that are timed (one is actually 4 seconds off from the
others; the media sync wrapper has its skew parameter set to ensure that
it is in sync).  The three videos play in sync, but none of them
communicate with each other; they are all directly connected to the Shared
Motion.  There is also a map icon in the top right - if you click it, a
map opens in a new tab.  This map is also connected to the Shared Motion
and displays the position of the train at all times.  You can of course
click the track to move to that point - making the map a kind of custom
remote control for this content.
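The skew correction amounts to a per-video offset on the shared target
position: each video is slaved independently, and the shifted one simply
aims 4 seconds away from the others. A sketch of how such a wrapper could
look (the names and structure are assumptions, not the actual media sync
wrapper's API):

```javascript
// Illustrative per-video sync wrapper with a skew offset. Each wrapper
// slaves one media element to the shared motion on its own; a video
// whose timeline is shifted gets a non-zero skew so that its target
// position lines up with the others.
function makeSyncWrapper(media, options = {}) {
  const skew = options.skew || 0;
  return {
    // Drive the element toward the motion's current position (seconds).
    // Returns the remaining error after applying this element's skew.
    update(motionPos) {
      const target = motionPos + skew;
      const delta = media.currentTime - target;
      if (Math.abs(delta) > 1.0) {
        media.currentTime = target; // hard seek on large error
      }
      return delta;
    },
  };
}
```

The three videos above would each get their own wrapper on the same
motion; only the offset one needs a non-zero skew.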

This demo has also been extended with multi-device adaptation by partners
in the MediaScape project (IRT and Vicomtech). A recording is available at
https://www.youtube.com/watch?v=zmGhuyg-gr4&feature=youtu.be



UC2-5 Triggered Interactive Overlay

   1. John is watching the live streaming of a San Francisco Giants v.
   Detroit Tigers baseball game.
   2. An overlay is triggered to suggest a movie (The Bachelor) to John.
   For example, it may show short clips of the trailer in the overlay with
   a suggestion such as "Press OK to pause programming and watch the
   trailer, and get options for purchasing the movie".
   3. John sees the Giants are winning, so he decides to watch the movie.
   He presses the OK button and watches the trailer.
   4. A second overlay is triggered on top of the trailer, suggesting that
   John purchase the movie on demand.
   5. John purchases the movie and starts watching. The overlay is gone.


We don't have a running demo of this exact thing yet, but we do have one
that pauses an experience for "branching out", as it were.  You can
experience it at http://mcorp.no/examples/Google/ which is a talk by Sam
Dutton at Google about WebRTC, to which we've added (their own)
interactive slides.  If you click the video button, or open
http://mcorp.no/examples/Google/video.html, you can see the video in sync
with the slides.  The natural thing would be to use a mobile or tablet for
the slides, but they are not rendered very nicely on mobiles (not my
fault). If you click a link on one of the slides, the video will
automatically pause. This should translate quite nicely into the scenario
above.
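The reason one click can pause everything is that components never talk
to each other; they only observe the shared motion, and "pause" is an
update to the motion itself (velocity 0). A sketch of that idea (the
query/update/on names are assumptions for illustration, not the real
motion API):

```javascript
// Minimal shared-motion object: deterministic position plus listeners.
// Pausing means re-anchoring the motion with velocity 0, which every
// connected component (video, slides, overlay) reacts to at once.
function makeMotion(p, v, t0) {
  let state = { p, v, t0 };
  const listeners = [];
  return {
    // Position at local time `now` (seconds), computed, not stored.
    query(now) {
      return state.p + state.v * (now - state.t0);
    },
    // Re-anchor the motion and notify everyone connected to it.
    update(p, v, now) {
      state = { p, v, t0: now };
      listeners.forEach((fn) => fn(state));
    },
    on(fn) {
      listeners.push(fn);
    },
  };
}

// A slide link's click handler would then freeze the experience with:
//   motion.update(motion.query(now), 0, now);
```

In the overlay scenario above, "Press OK" would do exactly this update,
and resuming is the same call with velocity 1.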

UC2-6 Clean Audio

   - Bob enjoys going to the movies, but as he's grown older he has lost
   more and more of his hearing. Understanding movie dialog amidst all the
   other background sounds has become more and more difficult for him. He
   could rely on captions, but he'd really like to use his hearing whenever
   possible. He discovered that some TV programs provide an alternative
   audio track called "Clean Audio
   <http://www.w3.org/TR/media-accessibility-reqs/#clean-audio>" and he now
   has an app on his mobile phone that allows him to select this alternate
   audio. His hearing therapist has also equalized the audio in a way that
   emphasizes the frequencies where his hearing works well, and
   de-emphasizes those where it doesn't work well. Now, he's able to listen
   to this alternative audio track on his mobile, on a headphone attached
   to his mobile, or even on his living room speakers by sending this
   alternative audio to his TV's speakers. He's also discovered that Clean
   Audio is available for certain movies at the cinema and that he can
   select that alternative track for listening over headphones connected
   to his mobile phone using the same app that he uses at home.


This is one of my favourite scenarios, and I'm thrilled to see it
explained so well.  If you haven't already tried the short URL / QR code
on the Sting demo, this would be the time to do so.  It should indeed
provide you with an almost exact implementation of this, with the
exception of the added personal sound filters, which in themselves should
not add synchronization issues.  Again, due to the browsers' varying and
not particularly coherent implementations of currentTime and playbackRate,
we suggest Firefox on Android for this, although Safari on iOS should
provide lip-sync as well.
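The difference between "lip-sync" and "echo" here is just the magnitude
of the remaining skew between devices. As a rough guide (the thresholds
below are illustrative assumptions, not measured values):

```javascript
// Rough classification of what a given skew (seconds) between two
// devices playing the same audio feels like: within roughly one video
// frame an echo is inaudible; under ~80 ms the audio still matches the
// picture well enough for lip-sync; beyond that a second audio source
// is heard as an echo.
function syncQuality(skewSeconds) {
  const s = Math.abs(skewSeconds);
  if (s < 0.025) return "echo-free";
  if (s < 0.08) return "lip-sync";
  return "echo";
}
```

Browsers with precise currentTime and playbackRate can hold skew in the
first band, which is what makes the alternate-audio-on-mobile scenario
pleasant rather than merely tolerable.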

Best regards and happy holidays,
Njål
Received on Friday, 19 December 2014 14:28:15 UTC
