Re: Shared Motion - multi-device synchronization and media control for the Web.

Hi Ingar,

This is an interesting technical area. Thanks for bringing this to the IG.

I’d like to outline the different paths within the W3C for getting this into a W3C spec. W3C specs are written only by Working Groups (WGs). Your area would seem to involve modifications to the media support in the HTML5 spec, which is managed by the HTML WG. There are a few different paths to getting a modification to the HTML5 spec adopted by the HTML WG:

1. Bug -> HTML WG
You could submit a bug to the HTML WG bug system (https://www.w3.org/Bugs/Public/). Both large and small issues are entered as bugs. The advantage is that it would be considered immediately by the HTML WG. Many changes from the Web & TV IG have been initiated as bugs.

2. Web&TV IG -> HTML WG
The Web & TV IG (Interest Group), like all IGs, is limited to writing requirements; we cannot write specs. The IG can generate requirements and present them to the HTML WG. We have done this successfully in the past for larger proposals, but the HTML WG now prefers to receive developed specifications as extension proposals. Many of those come from Community Groups (CGs).

3. CG -> HTML WG
Anyone can start and participate in a CG, which is designed to draft specification proposals (http://www.w3.org/community/). If you already have specifications, it may make sense to start there directly.

4. Web&TV IG -> CG -> HTML WG
In this path, we could work to generate a requirements document and then start a CG from that. We have started a few CGs from this IG previously. The advantage to this somewhat longer path is that we have a standing group of folks interested in media issues, but the restriction is that we can’t draft specifications.

Anyone can participate fully in a CG. Anyone can participate in IG and WG public mailing lists and bug systems, but participation in IG and WG phone calls and F2F meetings requires W3C membership.

I hope that helps!

Thanks,
mav


On Dec 17, 2014, at 12:57 AM, Ingar Mæhlum Arntzen <ingar.arntzen@gmail.com> wrote:


Hi Sangwhan

You are correct, seekTo operations introduce asynchrony/errors. Even worse, in a distributed system it is hard to ensure that seekTo operations occur at the same time. Shared Motion could help with this, though: by delaying operations until a specific point on a shared clock (i.e. a motion), you could mask the effects of differences in network latency among devices and approximate simultaneous execution of the seekTo operation.
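
To sketch that idea in rough JavaScript (illustrative only - the names here are not our actual API), assuming a motion object that exposes the shared clock position in seconds as motion.pos:

// "video" is an ordinary HTMLMediaElement.
// Every device receives the same scheduledPos, so the seeks happen at
// (roughly) the same moment regardless of when the command arrived.
function scheduleSeek(motion, video, targetOffset, scheduledPos) {
  var check = function () {
    var remaining = scheduledPos - motion.pos;  // seconds left on the shared clock
    if (remaining <= 0) {
      video.currentTime = targetOffset;         // perform the seek now
    } else {
      setTimeout(check, Math.min(remaining * 1000, 100));  // poll until due
    }
  };
  check();
}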

However, we are not doing it like that, as it would introduce unnecessary latency in user interaction, thereby hurting the end-user experience.

Instead, we perform seekTo operations on each device as soon as the command is available there. This leaves the media visibly out of sync for a brief moment (after the seekTo).

Crucially though, the presence of an ideal media clock (shared motion) means that our JavaScript sync wrappers know locally exactly what the error is. This error is then quickly compensated by temporarily tuning the playbackRate, or by adjusting the playback offset (if variable playback rate is not available, or if the error is above some limit).

Monitoring errors and adjusting for them is something one has to do continuously anyway, as media element playback drifts at different rates. So "recovering" after a seekTo doesn't require any logic that wasn't already needed.
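
To illustrate the kind of monitoring/compensation loop this implies (a simplified sketch - thresholds and names are illustrative, not the actual wrapper code), again assuming a motion object exposing the ideal media clock position in seconds as motion.pos:

// "video" is an ordinary HTMLMediaElement.
var SEEK_THRESHOLD = 1.0;   // seconds; above this, adjust the offset directly
var RATE_GAIN = 0.5;        // how aggressively playbackRate is skewed

function monitor(motion, video) {
  var error = motion.pos - video.currentTime;   // positive: video is behind
  if (Math.abs(error) > SEEK_THRESHOLD) {
    // Large error (e.g. right after a seekTo): adjust the playback offset.
    video.currentTime = motion.pos;
    video.playbackRate = 1.0;
  } else {
    // Small error: temporarily skew the playback rate to catch up or slow down.
    video.playbackRate = 1.0 + RATE_GAIN * error;
  }
  setTimeout(function () { monitor(motion, video); }, 200);
}

A real wrapper would also fall back to offset adjustment when variable playback rate is not supported, as mentioned above.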

Best,

Ingar

2014-12-17 9:32 GMT+01:00 Sangwhan Moon <smoon@opera.com>:


On Wed, Dec 17, 2014 at 5:31 PM, Sangwhan Moon <smoon@opera.com> wrote:


On Fri, Dec 12, 2014 at 9:50 PM, Ingar Mæhlum Arntzen <ingar.arntzen@gmail.com> wrote:

Dear IG Members


We would like to present ourselves to this forum, as we share your interest in improving the Web as a platform for broadcast and multi-device media, and because we have some contributions which you might find relevant.

My colleague (Njål Borch) and I (Ingar Arntzen) are researchers at NORUT (Northern Research Institute), Tromsø, Norway. Over the last couple of years we have focused on timing, synchronization and media control in multi-device media. Currently NORUT is in charge of the work package that deals with this topic in MediaScape, an FP7 EU project aiming to provide a foundation for multi-device Web applications. The consortium includes BBC R&D, Vicomtech, IRT, NEC, NORUT, BR and W3C.

To the point: we have invented and developed the concept of "Shared Motion", a generic mechanism for synchronization and media control in time-sensitive, multi-device Web applications. This mechanism has already been included as a fundamental component in the multi-device architecture explored within the MediaScape project.

To give you a rough idea what this is about:
- Shared Motion synchronizes globally, so multi-device synchronization is not limited to an intranet or a specific network carrier.
- Shared Motion synchronizes across the Internet with errors < 10 ms, and works fine even under poor network conditions (e.g. EDGE), although a modest reduction in precision may be expected.
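
As a rough mental model (a simplification, not a description of the actual implementation), a shared motion can be thought of as a small vector - position, velocity and a timestamp on the shared clock - distributed to all devices, from which each device computes the current position deterministically:

// Illustrative only: vector = { position: p0, velocity: v, timestamp: t0 },
// where timestamp is read from the shared clock.
function currentPosition(vector, now) {
  return vector.position + vector.velocity * (now - vector.timestamp);
}

// Example: playback started at position 10 s with velocity 1.0 at shared
// clock time 100; at shared clock time 102.5 the computed position is 12.5.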

This is an "out of plain curiosity" question, but it wasn't clear from the paper how the < 10 ms synchronization
holds up across devices using different codecs - would it be possible to share how that bit works?

While I am not an expert in this area, in my experience, if a synchronized seek request happens between
two devices that are decoding using different codecs, there is a high chance of the streams having
I-frames at different positions. Each decoder then has to advance to the next I-frame, which could end up
making the two devices go out of sync.

Stream 1: IBBPBBPBBPBBPBBPIBBPBBPBBPBBPBBPI
Stream 2: IBBPBBPIBBPBBPBPBBPIBPBBPBBPBBBPI
              ^
         Seek request
                 ^
       Stream 2 Decode Starts
                          ^
                Stream 1 Decode Starts



That should have looked like this: (Silly rich text mail clients.)

Stream 1: IBBPBBPBBPBBPBBPIBBPBBPBBPBBPBBPI
Stream 2: IBBPBBPIBBPBBPBPBBPIBPBBPBBPBBBPI
              ^
         Seek request
                 ^
       Stream 2 Decode Starts
                          ^
                Stream 1 Decode Starts


--
Sangwhan Moon [Opera Software ASA]
Software Engineer | Tokyo, Japan
