
RE: Why use time as a unit of measurement? (was: Proposal 0.0)

From: Glenn A. Adams <glenn@xfsi.com>
Date: Wed, 19 Feb 2003 07:12:04 -0500
Message-ID: <7249D02C4D2DFD4D80F2E040E8CAF37C01FBD0@longxuyen.xfsi.com>
To: <Johnb@screen.subtitling.com>
Cc: <public-tt@w3.org>, <cogit@ludicrum.org>
As far as I understand it, the semantics of synchronizing timelines
between locked continuous media, as shown in my video+audio+text
example below, accounts for both pauses and seeks in the syncmaster,
and would also account for discontinuities in the timeline of the
syncmaster.
 
SMIL doesn't specify how a specific continuous medium's internal
timeline is formulated; it merely assumes that there is some internal
timeline, and it is up to the media-specific decoder to interpret
that timeline in a manner that can serve as a syncmaster for other
continuous media that are locked to it.
 
So, for example, if each of the locked continuous media used a
PCR-based clock with a reconstructed STC, or used another internal
time code, such as SMPTE 12M, then discontinuities in the sync
master's timeline would cause seeks in any dependent media
element's timeline.
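As a rough illustration of that behaviour, here is a minimal Python sketch of a player propagating syncmaster timeline jumps to locked dependents. All names (SyncController, LockedMedia) are hypothetical and not part of SMIL 2.0; this is only a model of the resync/seek semantics described above.

```python
# Hypothetical sketch: propagate syncmaster timeline jumps to locked media.
# Class and method names are illustrative, not part of SMIL 2.0.

class LockedMedia:
    def __init__(self, name, tolerance=0.0):
        self.name = name
        self.tolerance = tolerance  # allowed slip in seconds (cf. syncTolerance)
        self.position = 0.0         # current media time in seconds

    def seek(self, t):
        self.position = t

class SyncController:
    """Keeps locked media aligned to the master's reported media time."""
    def __init__(self, dependents):
        self.dependents = dependents

    def on_master_time(self, master_time):
        # Any dependent that has slipped beyond its tolerance is seeked
        # back onto the master's timeline -- the same action a player
        # would take after a PCR/STC discontinuity in the master.
        for d in self.dependents:
            if abs(d.position - master_time) > d.tolerance:
                d.seek(master_time)

audio = LockedMedia("audio")                 # locked, zero tolerance
text = LockedMedia("text", tolerance=0.5)    # may slip up to 0.5 s
ctl = SyncController([audio, text])

ctl.on_master_time(10.0)        # normal playback tick
text.position = 10.3            # text drifts 0.3 s: within tolerance
ctl.on_master_time(10.4)
assert text.position == 10.3    # not resynced
ctl.on_master_time(72.0)        # discontinuity: master jumps ahead
assert audio.position == 72.0 and text.position == 72.0
```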
 
Note that SMIL 2.0 also defines seek semantics for hyperlinking
in time and accounts for synchronization between locked continuous
media in this context.
 
Please read SMIL 2.0 Timing and Synchronization [1], specifically
the sections on syncBehavior, syncTolerance, syncMaster, and hyperlink
seek behaviors.
 
Perhaps Patrick Schmitz can comment on the accuracy of this
interpretation.
 
Regards,
Glenn
 
[1] http://www.w3.org/TR/smil20/smil-timing.html

	-----Original Message-----
	From: Johnb@screen.subtitling.com [mailto:Johnb@screen.subtitling.com] 
	Sent: Wednesday, February 19, 2003 5:00 AM
	To: Glenn A. Adams; cyril.concolato@enst.fr; J.Glauert@sys.uea.ac.uk; singer@apple.com
	Cc: public-tt@w3.org
	Subject: RE: Why use time as a unit of measurement? (was: Proposal 0.0)
	
	

	Glenn wrote: 

	> As an FYI, SMIL 2.0 already supports what you describe below via 
	> the syncMaster attribute. For example, 

	> <par> 
	>   <video src="..." syncBehavior="locked" syncMaster="true"/> 
	>   <audio src="..." syncBehavior="locked"/> 
	>   <textstream src="..." syncBehavior="locked" syncTolerance="0.5s"/> 
	> </par> 
	  
	> In this example, the video media serves as the sync master, while 
	> the audio and text streams are locked to the video. The text stream 
	> may slip up to +/- 0.5s from sync without resync. 

	If I am correct in my understanding of the SMIL 2.0 documentation, this allows you to specify 
	the tolerance of timing between streams, and to specify which stream is the master 
	reference for timing. 

	Useful, but I don't see how this allows SMIL 2.0 to work 
	for the practical subtitling of broadcast video. 

	I'll try to explain why. 

	A subtitle file is created from a master tape of a video program. 
	That master tape is timecoded, probably (but not necessarily) with timing typically starting from 10.00 hrs. 
	The master tape has a duration of say 1.5 hrs, so the first timecode is 10.00 and the last 11.30. 
	The subtitle file uses the timecodes as a reference to the exact position (frame accurate) 
	in the master tape for the presentation timing of the subtitles / captions. 
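To make the frame-accurate reference concrete, here is a sketch of SMPTE-style timecode arithmetic at 25 fps non-drop-frame (as used in PAL broadcast). The cue structure is purely illustrative, not a real subtitle file format.

```python
# Sketch: SMPTE-style timecode as a frame-accurate reference.
# Assumes 25 fps non-drop-frame (PAL); illustrative only.

FPS = 25

def tc_to_frames(tc):
    """'HH:MM:SS:FF' -> absolute frame count."""
    hh, mm, ss, ff = (int(p) for p in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * FPS + ff

def frames_to_tc(n):
    """Absolute frame count -> 'HH:MM:SS:FF'."""
    ff = n % FPS; n //= FPS
    ss = n % 60; n //= 60
    mm = n % 60; hh = n // 60
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

# A subtitle cue is tied to tape timecode, not to elapsed playout time:
cue = {"in": "10:00:05:12", "out": "10:00:08:00", "text": "Hello"}
assert tc_to_frames(cue["in"]) == ((10 * 3600 + 5) * 25) + 12
assert frames_to_tc(tc_to_frames("11:30:00:00")) == "11:30:00:00"
```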

	When played by the Movie Channel X - a section of the tape may not be played 
	(i.e. the master tape is edited to produce the playout master). 
	This might occur to remove a sex scene for example. But the start timecode is still 10.00 
	and the last is still 11.30. There is a discontinuity in the middle. 

	Now the subtitle file would **not** be edited. [Editing subtitle files is expensive]. 
	The subtitle file uses timecode references - so when the playout tape is used it can still 
	align the subtitles correctly. 

	When the same video program is played by a different channel - or after the 9.00pm watershed 
	- the full [or a different amount of] content may be played. The **same** subtitle file still works. 
	The **same** subtitle file may be re-used for all playouts that are derived from the original master tape. 

	This relationship between the tape timecode and the file also allows interruptions to the 
	subtitled video program for news and advert insertion - without affecting the synchronisation 
	of the subtitles to the video program. 
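The reason the same file survives edits and interruptions can be sketched as follows: presentation is a pure lookup keyed by the tape's own timecode, so a removed section simply means its cues are never reached. The cue data and function names here are hypothetical, not a real subtitling format.

```python
# Sketch of why a timecode-keyed subtitle file survives edits: lookup
# is by the tape's own timecode, so a cut section's cues just never
# fire. Illustrative only; frame numbers stand in for timecodes.

def active_cue(cues, tc_frames):
    """Return the text of the cue whose [in, out) interval contains tc_frames."""
    for cue in cues:
        if cue["in"] <= tc_frames < cue["out"]:
            return cue["text"]
    return None

cues = [
    {"in": 1000, "out": 1100, "text": "scene 1 line"},
    {"in": 5000, "out": 5100, "text": "cut scene line"},
    {"in": 9000, "out": 9100, "text": "scene 3 line"},
]

# Playout master edited: timecodes 4000-6000 removed. The player still
# reads timecode off the tape, so lookups before and after the cut work
# unchanged, and the cue inside the cut simply never fires.
assert active_cue(cues, 1050) == "scene 1 line"
assert active_cue(cues, 9050) == "scene 3 line"
assert active_cue(cues, 5050) == "cut scene line"  # reached only on the full master
```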

	I guess the question I am asking is: where in SMIL 2.0 does it describe how you lock a 
	stream to an **external** clock, one that may be stopped, 
	running at a rate other than 1:1, or going backwards, **and that may jump**? 
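One way to model such a lock, sketched below with hypothetical names, is to recompute presentation state purely from each sampled clock value, with no assumption that time advances forward or linearly; pauses, rate changes, reverse play, and jumps then all behave identically.

```python
# Sketch: driving presentation state purely from an external clock sample.
# Hypothetical API, not SMIL; frame numbers stand in for timecodes.

class ExternalClockFollower:
    def __init__(self, cues):
        self.cues = cues
        self.current = None

    def tick(self, clock_frames):
        # State is a function of the clock value alone: no assumption
        # that the clock runs forward, at 1:1 rate, or continuously.
        self.current = None
        for cue in self.cues:
            if cue["in"] <= clock_frames < cue["out"]:
                self.current = cue["text"]
                break

f = ExternalClockFollower([{"in": 100, "out": 200, "text": "hi"}])
f.tick(150); assert f.current == "hi"
f.tick(150); assert f.current == "hi"   # clock stopped: state unchanged
f.tick(120); assert f.current == "hi"   # clock ran backwards
f.tick(999); assert f.current is None   # clock jumped past the cue
```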

	Is this intended to be left as an issue for the UA? If so, surely there should 
	be some markup to indicate that this is the intended behaviour? 

	I would say that in video broadcast, it is the timecode timeline that is the 
	controlling synchronisation source; everything is synchronised by the timecode. In SMIL 2.0, 
	from the documentation I've read and the examples I've seen, there appears to be an underlying 
	assumption that video plays forward linearly, and that you can reference positions using the 
	played duration of the video. 
	This works fine for the presentation scenario - but does not appear to allow the kind of 
	synchronisation **through the editing / distribution process** that broadcast subtitling requires. 

	SMIL 2.0 does not appear to offer mechanisms to make SMIL based documents 
	resilient to changes in the media streams they use as master synchronisation sources. 

	As usual, all comments solely wrt broadcast subtitling, and all IMHO :-) 

	regards 
	John Birch 

	The views and opinions expressed are the author's own and do not necessarily 
	reflect the views and opinions of Screen Subtitling Systems Limited. 
Received on Wednesday, 19 February 2003 07:12:08 GMT
