RE: Why use time as a unit of measurement? (was: Proposal 0.0)

Glenn wrote:

> As an FYI, SMIL 2.0 already supports what you describe below via
> the syncMaster attribute. For example,

> <par>
>   <video src="..." syncBehavior="locked" syncMaster="true"/>
>   <audio src="..." syncBehavior="locked"/>
>   <textstream src="..." syncBehavior="locked" syncTolerance="0.5s"/>
> </par>
 
> In this example, the video media serves as the sync master, while
> the audio and text streams are locked to the video. The text stream
> may slip up to +/- 0.5s from sync without resync.

If my understanding of the SMIL 2.0 documentation is correct, this allows
you to specify the timing tolerance between streams, and to specify which
stream is the master reference for timing.

Useful, but I don't see how this allows SMIL 2.0 to work 
for the practical subtitling of broadcast video. 

I'll try to explain why.

A subtitle file is created from a master tape of a video program.
That master tape is timecoded, typically (but not necessarily) starting
from 10:00:00. The master tape has a duration of, say, 1.5 hrs, so the
first timecode is 10:00:00 and the last 11:30:00.
The subtitle file uses those timecodes as frame-accurate references to the
exact positions in the master tape at which the subtitles / captions are
to be presented.
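
For illustration only (the cue times and text here are invented), such a
file essentially holds rows of in-timecode, out-timecode and subtitle text,
the timecodes being hours:minutes:seconds:frames positions on the master
tape:

  10:00:05:00  10:00:08:12  Good evening.
  10:00:09:05  10:00:12:20  Welcome to the programme.
  ...
  11:29:55:00  11:29:58:10  Goodnight.

Note that these are absolute tape positions, not offsets from the start of
playback.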

When played by Movie Channel X, a section of the tape may not be played
(i.e. the master tape is edited to produce the playout master) - to remove
a sex scene, for example. But the first timecode is still 10:00:00 and the
last is still 11:30:00; there is a discontinuity in the middle.

Now, the subtitle file would **not** be edited [editing subtitle files is
expensive]. Because the subtitle file uses timecode references, when the
playout tape is used it can still align the subtitles correctly.
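
A minimal sketch of why this works - purely my own illustration, not SMIL
and not any product's behaviour: the cue list is keyed to absolute tape
timecode, so a cut in the playout master simply means that some timecodes
never come off tape and the cues covering them never fire. The timecode
values, cues and frame rate below are all hypothetical.

FPS = 25  # PAL assumed, purely for the example

def tc_to_frames(tc):
    """Convert 'HH:MM:SS:FF' to an absolute frame count on the tape."""
    h, m, s, f = (int(x) for x in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * FPS + f

# Hypothetical cues from the (unedited) subtitle file: (in, out, text).
CUES = [(tc_to_frames(i), tc_to_frames(o), text) for i, o, text in [
    ("10:00:05:00", "10:00:08:12", "Good evening."),
    ("10:42:00:00", "10:42:03:00", "A cue inside the section that gets cut."),
    ("11:29:55:00", "11:29:58:10", "Goodnight."),
]]

def subtitle_for(tc):
    """Return the cue that should be on screen at this tape timecode."""
    now = tc_to_frames(tc)
    for cue_in, cue_out, text in CUES:
        if cue_in <= now < cue_out:
            return text
    return None

print(subtitle_for("10:00:06:00"))   # Good evening.
print(subtitle_for("11:29:56:00"))   # Goodnight.
# The playout master jumps from (say) 10:40:00:00 straight to 10:45:00:00;
# timecodes in that range never come off tape, so the 10:42 cue is simply
# never shown - and nothing else moves.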

When the same video program is played by a different channel - or after
the 9.00pm watershed - the full [or a different amount of] content may be
played. The **same** subtitle file still works; it may be re-used for all
playouts that are derived from the original master tape.

This relationship between the tape timecode and the file also allows
interruptions to the 
subtitled video program for news and advert insertion - without affecting
the synchronisation
of the subtitles to the video program.

I guess the question I am asking is: where in SMIL 2.0 does it describe how
you lock a stream to an **external** clock - one that may be stopped,
moving at a non-1:1 rate, or going backwards, **and that may jump**?
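
To show the behaviour I mean (again purely an illustration of my own, not
SMIL and not any product's API): if every timecode sample received from the
playout chain is treated as authoritative and resolved from scratch, the
feed can freeze, run at a non-1:1 rate, go backwards or jump, and the
subtitles still follow it. All names and values below are hypothetical.

# Zero-padded "HH:MM:SS:FF" strings of equal length sort in time order,
# so plain string comparison is enough for this sketch.
CUES = [
    ("10:00:05:00", "10:00:08:12", "Good evening."),
    ("11:29:55:00", "11:29:58:10", "Goodnight."),
]

def on_timecode_sample(tc):
    """Called for each timecode value received from the playout chain."""
    for cue_in, cue_out, text in CUES:
        if cue_in <= tc < cue_out:
            return text          # this cue should be on screen now
    return None                  # screen should be clear

# A feed that freezes (repeated value), jumps forward (an edit or an ad
# break), then steps backwards (shuttle). No special handling is needed,
# because no monotonic local clock is assumed.
samples = ["10:00:05:10", "10:00:06:00", "10:00:06:00",
           "11:29:56:00", "10:00:07:00", "09:59:00:00"]
for tc in samples:
    print(tc, "->", on_timecode_sample(tc))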

Is this intended to be left as an issue for the UA? If so, surely there
should be some markup to indicate that this is the intended behaviour?

I would say that in video broadcast it is the timecode timeline that is the
controlling synchronisation source; everything is synchronised by the
timecode. In SMIL 2.0, from the documentation I've read and the examples
I've seen, there appears to be an underlying assumption that video plays
forward linearly, and that you can reference positions using the played
duration of the video. This works fine for the presentation scenario, but
does not appear to allow the kind of synchronisation **through the
editing / distribution process** that broadcast subtitling requires.

SMIL 2.0 does not appear to offer mechanisms to make SMIL-based documents
resilient to changes in the media streams they use as master
synchronisation sources.

As usual, all comments solely wrt broadcast subtitling, and all IMHO :-)

regards 
John Birch

The views and opinions expressed are the author's own and do not necessarily
reflect the views and opinions of Screen Subtitling Systems Limited.
