
RE: Shared Motion - multi-device synchronization and media control for the Web.

From: GAUSMAN, PAUL <pg2483@att.com>
Date: Tue, 16 Dec 2014 22:31:44 +0000
To: Giuseppe Pascale <giuseppep@opera.com>, Ingar Mæhlum Arntzen <ingar.arntzen@gmail.com>
CC: public-web-and-tv <public-web-and-tv@w3.org>, Njål Borch <njaal.borch@gmail.com>, Dominique Hazael-Massieux <dom@w3.org>, François Daoust <fd@w3.org>
Message-ID: <F403326A8484704DAAAD9BA5687DDE870AA78B75@MISOUT7MSGUSRCD.ITServices.sbc.com>
I find this proposed solution and this topic in general to be very exciting with tremendous potential for greatly enhancing user experiences of all types with HTML5.

Over a year ago, I asked the HTML5 interest group if there were any plans to establish methods and standards within HTML5 for distributed temporal controls relating to various events, particularly media related. The response at the time was basically that temporal controls could be addressed with current facilities and JavaScript. I don’t recall that the MediaController functions were defined at that time.

Temporal controls, enabling broad distributed user experiences of all types, would need to:

- Synchronize media, triggers, messaging and I/O across:
  - Different apps on the same device
  - Different or the same app on different devices
  - Client apps and server apps in the cloud or elsewhere
  - Apps, servers and potentially cloud virtual network components
- Be resistant to distance effects on timing
- Handle temporal media (e.g. linear) and static media (e.g. images, web pages)
- Be aware of and responsive to I/O capabilities of all kinds (e.g. user inputs, M2M events, etc.)
- Span service provider ecosystems
- Be capable of integrating with intelligent controller functions
- (Probably many more capabilities)

These kinds of capabilities can propel web-based user experiences (both locally and around the globe) from being trapped in a screen to being integral with real-world environments in a natural way (or unnatural, as requirements dictate). They would further the ability of the digital user experience to merge seamlessly with the physical world. This would evolve the concept of Virtual Reality from its current confines to become an integrated Augmented Reality experience engine based on an HTML5 framework of digital entities.

This interest group is comprised of smart visionary people. Please embrace the potential of distributed temporal controls!

Thanks! And Happy New Year!

-Paul

From: Giuseppe Pascale [mailto:giuseppep@opera.com]
Sent: Tuesday, December 16, 2014 4:06 AM
To: Ingar Mæhlum Arntzen
Cc: public-web-and-tv; Njål Borch; Dominique Hazael-Massieux; François Daoust
Subject: Re: Shared Motion - multi-device synchronization and media control for the Web.

On Mon, Dec 15, 2014 at 11:23 PM, Ingar Mæhlum Arntzen <ingar.arntzen@gmail.com<mailto:ingar.arntzen@gmail.com>> wrote:

Giuseppe


Ingar


Indeed, we have been looking closely at the mediacontroller, and it is the main target for our proposal. I’ll come back to this towards the end.


First, I want to address your concern that, since this already works, it is not clear why standardization is needed.


To be clear, my "challenge" is only meant to stimulate a discussion. There are cases where it is worth standardizing and cases where it is better to keep things in libraries. I don't have a strong opinion yet on this case, hence I am interested in hearing your ideas (and those of other people on this list).

Also, only once we know exactly what we want to standardize can we figure out which group is best to approach, and whether what you need belongs to W3C or to another group.



It is true that this works, and that Motion Corporation already offers distributed media control. However, not standardizing this would mean that future motion providers would design their own similar, but different, protocols to do essentially the same thing. The result would be that the ability to synchronize different media would depend on the choice of motion provider. This is an unnecessary dependency, and it would work against interoperability, extensibility and flexible composition, hallmarks of the Web.


On the other hand, a clear benefit of standardization would be that browser providers could optimise their implementations of media elements and controllers, thereby actively supporting a wide range of applications requiring precise, multi-device coordination of linear media. In contrast, multi-device synchronization is currently a nightmare because browsers all implement media elements a little differently, and none of them are appropriately optimised for external media control.


Further evidence of the great value of standardizing media control comes from the music industry. MIDI (Musical Instrument Digital Interface) has served as the standard protocol for synchronization over the last three decades. In that time it has continuously ensured interoperability among synthesizers, samplers, drum machines, computers, and even stage lighting and events in theatrical productions.


With Shared Motion we are able to escape the great limitation of MIDI: that it only works for devices in close proximity. This is good for the music industry, because it means that the Web can now offer global MIDI (based on Shared Motion). More importantly, it unlocks the value of interoperability through synchronization and precisely coordinated execution for the entire Web. This is no small thing.


In short, we think this represents a historic opportunity to define a common concept of shared media control for the Web, even before any of the big players have started investing in their own concepts. This, we argue, could secure interoperability of all kinds of linear media and make the Web an even greater platform for online media.



Back to the media controller - and what exactly we think is missing.


The media controller is the HTML5 way of synchronizing multiple media elements. However, crucially, its scope is limited to synchronizing media elements in the same DOM. We are effectively suggesting to extend this scope to the Internet, thus enabling multi-device playback, globally.
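For reference, the same-DOM case that the media controller covers today can be expressed declaratively with the mediagroup attribute from the HTML5 spec: media elements sharing a mediagroup value are slaved to one implicit MediaController, but only within a single document. (The file names below are placeholders.)

```html
<!-- Two media elements kept in sync by one implicit MediaController.
     This mechanism stops at the document boundary: there is no way to
     point a second device at the same controller. -->
<video src="camera-angle-1.webm" mediagroup="broadcast"></video>
<video src="camera-angle-2.webm" mediagroup="broadcast"></video>
<script>
  // The shared controller drives playback for both elements at once.
  var controller = document.querySelector("video").controller;
  controller.play();
</script>
```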


We have designed Shared Motion specifically with this in mind, to be a multi-device media controller for the Web. Like the current HTML media controller, Shared Motion supports play, pause, rewind, etc. So, in terms of standardization, our suggestion is (at least conceptually) very simple: we would like to see the media controller extended with a motion attribute specifying a source URL for the online synchronization point. Secondly, we suggest standardizing Shared Motion itself, as it has a vast number of uses that are not related to media elements.
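A rough sketch of the timing model that makes such a motion attribute workable (the names and the exact state vector here are my assumptions for illustration, not the Shared Motion API): a motion can be described by a small vector of position p, velocity v, and the timestamp t at which that state was sampled. Any device holding the same vector, however it was delivered by the motion provider, can compute the current position deterministically, which is what allows playback to be coordinated without continuous messaging:

```javascript
// Hypothetical sketch: compute the current position of a shared motion
// from its last known state vector { p, v, t } and the current clock time.
// All times are in seconds on a clock the devices agree on.
function currentPosition(motion, nowSec) {
  // p(now) = p + v * (now - t)
  return motion.p + motion.v * (nowSec - motion.t);
}

// A paused motion: velocity 0, position frozen at 12 s.
const paused = { p: 12, v: 0, t: 100 };
// A motion playing at normal speed from position 12 s, sampled at t = 100 s.
const playing = { p: 12, v: 1, t: 100 };

console.log(currentPosition(paused, 105));   // 12
console.log(currentPosition(playing, 105));  // 17
```

A media element slaved to such a motion would then only need to seek or adjust playback rate when its own currentTime drifts from the computed position.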


For the protocol itself, though, I'm not sure W3C is the right place, is it?
(haven't looked at the details of your protocol so I may be missing something)


There is a lot more to discuss here, but this short version should work as a starting point. We are currently putting the finishing touches on a paper that gives a more complete presentation. We would be happy to share this as a basis for further discussion.


feel free to share once ready

/g
Received on Tuesday, 16 December 2014 22:32:46 UTC
