
Re: HTML and Multimedia Synchronization

From: Silvia Pfeiffer <silviapfeiffer1@gmail.com>
Date: Thu, 3 Jan 2013 22:17:13 +1100
Message-ID: <CAHp8n2noM9FbmWuaWmdm199JYyRf_k0eg6mOWq+qtjA=ZaDxTA@mail.gmail.com>
To: Adam Sobieski <adamsobieski@hotmail.com>
Cc: "public-html@w3.org" <public-html@w3.org>

Hi Adam,

I am beginning to wonder what your emails to the list are meant to
achieve. Are they suggestions for new features to be added to HTML?

I am confused because each of your emails broadly surveys existing
technologies or research fields. None of them provides a basis for
defining new HTML features: they do not explain use cases (what user
problem are you trying to solve?), they do not state why existing Web
technologies don't already solve those problems, and they do not identify
what in particular is broken on the current Web and needs fixing.

In the email below, for example, you mention 3D graphics, audio overlays,
object motion, and multimedia synchronization - those alone are four
technology areas, each with many use cases, some of which are already
supported by current Web technologies.

You need to be more concrete if you would like to see a discussion.

For example, let's focus on multimedia synchronization (which is the title
of the email that starts this thread).

Right now, HTML5 supports the synchronization of multiple audio, video and
text tracks that may each originate from a different file. Therefore, audio
overlays for video are already supported. However, you also mention that
you would like to see SSML documents being synthesized and synchronized
with audio or video files. What are your use cases for this? Why support
SSML and not a different speech synthesis format? What is the difference,
for example, between an audio description file in WebVTT, synchronized
with a <video> through <track>, and what you require? But above all: what
is the use case - the user problem - that you'd like to see resolved?
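For concreteness, this is the kind of markup HTML5 already supports today; a minimal sketch (the file names are hypothetical):

```html
<!-- A video with a time-synchronized WebVTT descriptions track.
     "video.webm" and "descriptions.vtt" are placeholder file names. -->
<video src="video.webm" controls>
  <!-- kind="descriptions" marks a textual description track; the browser
       keeps its cues in sync with the video's timeline. -->
  <track kind="descriptions" src="descriptions.vtt"
         srclang="en" label="Audio descriptions">
</video>
```

A cue in descriptions.vtt fires at its start time and can be rendered or spoken by assistive technology, which already gives you synchronized descriptive overlays without any new markup.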

HTH.

Regards,
Silvia.


On Sun, Dec 30, 2012 at 4:28 AM, Adam Sobieski <adamsobieski@hotmail.com> wrote:

> HTML Working Group,
>
> Greetings. In *Some HTML 5.1 Discussion Topics*,
> http://lists.w3.org/Archives/Public/public-html/2012Dec/0064.html, several
> topics were broached: XHTML and a SMIL subset; the multimedia
> synchronization of hypertext, audio and video overlays, 3D graphics, and
> speech synthesis; SMIL and WebGL; and SMIL and X3DOM.
>
> *Multimedia Synchronization and 3D Graphics, WebGL:*
>
> Audio overlay or speech synthesis with 3D graphics camera and object
> motions and events.
> Example: Discussion of a math function's graph.
>
> Interactive 3D graphics and multimedia synchronizations.
> Example: Navigation of a 3D model or scene with dynamic content.
>
> *Multimedia Synchronization and Speech Synthesis:*
>
> The functionality for audio overlays and multimedia synchronization can
> correlate with the functionality for the speech synthesis of hypertext
> documents and HTML+SSML documents.
>
> Some hyperlinks on the topics of hypertext documents and multimedia
> synchronization:
>
> *SMIL 3.0 Timing and Synchronization*
> http://www.w3.org/TR/SMIL3/smil-timing.html
>
> *SMIL Timesheets 1.0* http://www.w3.org/TR/timesheets/
>
> *Timesheets: XML Timing Language* http://www.w3.org/Submission/xml-timing/
>
> *timesheets.js* http://wam.inrialpes.fr/timesheets/
>
> *Timed Interactive Multimedia Extensions for HTML (HTML+TIME)*
> http://www.w3.org/TR/NOTE-HTMLplusTIME
>
> *XHTML+SMIL Profile* http://www.w3.org/TR/XHTMLplusSMIL/
>
>
>
> Kind regards,
>
> Adam Sobieski
>
Received on Thursday, 3 January 2013 11:18:02 GMT
