Re: Why use time as a unit of measurement? (was: Proposal 0.0)

Johnb@screen.subtitling.com wrote:
> Please explain why you see the timecodes as needing to change? 

If a timecode for the start of a movie is 10:00, as I saw in a previous 
email, I understand that as: this movie shall start at 10:00pm tonight.
If I want to play it at another time, I need to do some retiming.

> I see a major difference between PC media playback and TV playback
> in that in PCs the presentation may run slower than real time
> if problems occur, whereas in TV it 'drops'. 
> That is - in broadcast you can never get behind (wrt to the timecode).

Again, this is a matter of the player's specification (or a bug in the 
player). You can specify that the player play all frames, whatever the 
cost in time slips, or that it drop frames to keep up.

Whenever you have video lipsync with audio, even on PCs, the audio 
renderer clock will drive the whole system and video frames will get 
dropped if necessary.
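As a minimal sketch of that audio-master policy (function and parameter names are mine, not from any particular player; it assumes the player can poll the audio renderer's clock):

```python
def render_video(frames, audio_clock, late_threshold=0.040):
    """Drop video frames that are already late wrt the audio clock.

    frames: iterable of (presentation_time_s, frame) in display order.
    audio_clock: callable returning the audio renderer's current time in s.
    late_threshold: how late (in s) a frame may be before it is dropped.
    """
    shown, dropped = [], []
    for pts, frame in frames:
        now = audio_clock()
        if pts < now - late_threshold:
            dropped.append(pts)   # too late: drop it to preserve lipsync
        else:
            shown.append(pts)     # a real player would block until pts
    return shown, dropped

# Toy run: audio clock already at 0.1 s; the 0.0 s frame is dropped.
shown, dropped = render_video([(0.0, "f0"), (0.2, "f1")], lambda: 0.1)
```

The point is only that the audio clock, not the video stream's own timing, decides which frames survive.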

> The timecodes within slaved synchronised streams do not need to change 
> - they are semantically equivalent to mediaMarkers. The problem 
> with mediaMarking is that it is too verbose and an extra 
> level of indirection above what is IMHO required.

Does this mean that the 10:00 does not refer to actual wall-clock time?

> If parts of a stream use absolute times that reference a
> defined (within the stream) synchronisation master, what has to change?

I do not understand this question, but I am not sure it is important.

> The stream players (if you assume that each stream is played by a 
> different component e.g. audio, video, text..) need to be
> kept in sync to the defined syncMaster reference, rather than just told of
> starts and stops.

Indeed, but these are two different mechanisms.
In MPEG-4, the CTS and DTS of each access unit in each stream allow the 
system to keep the lipsync, while the starts and stops are dealt with at 
the level of the scene (MediaControl node).
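As a sketch of the CTS/DTS distinction (field names are mine, not the MPEG-4 Systems syntax): each access unit carries a decoding timestamp and a composition timestamp, and decode order can differ from presentation order, e.g. with B-frames:

```python
from dataclasses import dataclass

@dataclass
class AccessUnit:
    stream: str   # e.g. "video", "audio", "text"
    dts: int      # decoding timestamp, in clock ticks
    cts: int      # composition (presentation) timestamp, in clock ticks

units = [
    AccessUnit("video", dts=0, cts=2),   # I-frame: decoded first, shown later
    AccessUnit("video", dts=1, cts=0),   # B-frame: decoded later, shown first
    AccessUnit("audio", dts=0, cts=0),
]

decode_order = sorted(units, key=lambda u: u.dts)
present_order = sorted(units, key=lambda u: u.cts)
```

Lipsync falls out of presenting every stream's access units against the same clock; scene-level start/stop is a separate, coarser mechanism.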

> I see no implementation difference between synchronising streams 
> against an arbitrary internal page clock (which is what I perceive 
> as the current situation in SMIL) and synchronising streams with 
> an explicitly provided external referenced clock (or a discontinuous 
> clock extracted from another stream - see note below).

Probably there are implementation differences ;-), but nothing 
conceptually different.

> What **may** be required in SMIL is the ability to define 
> in the markup the offset of the timecode from the 
> internal page clock. So you can say that syncMaster timecode 
> 10.00.00:000 is equal to begin 0. 
> A further implication is then that if the timecode
> **jumps** from 10.10.00:000 to 10.11.00.000 
> then the internal page clock jumps forward 
> from begin 10 minutes to begin 11 minutes (i.e. A seek forward occurs).

It is a lot easier to implement the management of an edit list, because 
you can anticipate, rather than just listening to timecodes and 
detecting a jump. A jump can be detected only quite "late", whereas an 
edit list can be sent long in advance.

>>OK, this means another requirement:
>>the TT stream maybe sent separately from the rest, even if it has to 
>>stay ABSOLUTELY synchronized, whatever editing is done on the rest of 
>>the movie
> 
> For TT to be more universally accepted, I feel it is
> essential for this requirement to be met.

This is a very strong requirement, and it means the edit list *has* to 
be sent with the movie, either as the timecodes themselves or as a 
higher-level list of segments (of the original, full movie) to play, 
the timecodes being just a discrete way of representing the edit list.
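A sketch of that mapping (segment values invented): given the edit list as kept segments of the original movie, a separately delivered TT cue stamped with original-movie timecodes can be placed on the edited timeline, or discarded if its material was cut.

```python
def original_to_edited(t, segments):
    """Map an original-movie time t to the edited presentation time.

    segments: list of (orig_start, orig_end) intervals kept by the
    edit, played back to back. Returns None if t was edited out.
    """
    out = 0.0
    for start, end in segments:
        if start <= t < end:
            return out + (t - start)
        out += end - start
    return None

# Original minutes 0-10 and 11-20 kept; minute 10-11 cut:
segments = [(0.0, 600.0), (660.0, 1200.0)]
original_to_edited(700.0, segments)   # cue at 11:40 original -> 10:40 edited
original_to_edited(630.0, segments)   # cue inside the cut -> dropped
```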

Best regards
JC
--
Jean-Claude Dufourd       @======================================@
ENST, Dept COMELEC             The wing, over the big rock...
46, rue Barrault          @======================================@
75013 Paris                Tel: +33145817807    Fax: +33145804036

Received on Friday, 21 February 2003 11:39:06 UTC