
Re: Why use time as a unit of measurement? (was: Proposal 0.0)

From: Jean-Claude Dufourd <Jean-Claude.Dufourd@enst.fr>
Date: Fri, 21 Feb 2003 11:54:13 +0100
Message-ID: <3E560555.7060400@enst.fr>
To: public-tt@w3.org

Dear John,

Johnb@screen.subtitling.com wrote:
> These two are actually restatements of the same root requirement, are 
> they not?

They are conceptually different: 1 is a requirement on the specification 
of the content, while 2 is a requirement on the player of that content. 
They concern the same content, yes, but they sit at different levels 
to me.

> Absolutely. SMIL is IMHO a valid direction to go in, but currently IMHO 
> suffers
> from 'tunnel vision'.

I agree with the 'tunnel vision' ;-))

> I find myself leaning towards a view that TT is more of a 'profile'
> (if that is the correct term) describing how to use XML, CSS and SMIL 
> for TT.

A few more elements may be required.

> Requirement 2 is what creates requirement 1. The process of editing AVT 
> material,
> a cycle of creation, revision and review, means that a simple manner of
> preserving the sync relationship between streams is desirable. This is 
> the root of my
> dislike of relative from start 'begin', it is unwieldy in the editing 
> process.

Just as the absolute timecode seems horrible to us (PC people), because 
it seems you have to change the timecodes every time you play the movie.
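To make the editing burden concrete, here is a hypothetical sketch (the function name and cue representation are my own, not from any TT proposal) of what happens to cues timed relative to the start of the programme when a segment is cut: every cue after the cut must be re-timed, which is exactly what makes relative-from-start 'begin' values unwieldy in a create/revise/review cycle.

```python
# Hypothetical sketch: subtitle cues as (begin, end, text) tuples, timed
# in seconds relative to the start of the programme. Cutting a segment
# forces every later cue to be shifted; cues overlapping the cut are
# simply dropped here for brevity.

def retime_cues(cues, cut_start, cut_end):
    """Re-time cues after removing the interval [cut_start, cut_end)."""
    cut_len = cut_end - cut_start
    retimed = []
    for begin, end, text in cues:
        if end <= cut_start:
            # Entirely before the cut: untouched.
            retimed.append((begin, end, text))
        elif begin >= cut_end:
            # Entirely after the cut: shifted earlier by the cut length.
            retimed.append((begin - cut_len, end - cut_len, text))
        # Cues overlapping the cut are dropped.
    return retimed

cues = [(1.0, 3.0, "Hello"), (10.0, 12.0, "World"), (20.0, 22.0, "Bye")]
print(retime_cues(cues, 5.0, 15.0))
# → [(1.0, 3.0, 'Hello'), (10.0, 12.0, 'Bye')]
```

The same cut leaves cues that are synchronised to the surviving video frames (e.g. burnt-in captions) correct by construction, which is the asymmetry being discussed.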

> In the broadcast environment - the majority of audio/video is still 
> stored and
> manipulated in an uncompressed format - allowing edits to occur at any 
> frame boundary.
> FYI The problems associated with compressed streams and subtitles 
> originate from the
> requirement to pre-send subtitles (due to bandwidth limitations). 
> Without a priori knowledge
> of when an edit is going to occur this can (and sometimes does) cause 
> artifacts in the resultant
> presentation. Be aware that in many circumstances in broadcast an edit 
> list is not available; examples
> are Newsflashes, Advert insertion and local censorship. Whilst captions 
> are generally pre-burnt
> into the material - and thus are intrinsically synchronised, subtitles - 
> for language translation -
> are typically inserted over an incoming broadcast from another region - 
> and must follow that
> incoming broadcast.

OK, this means another requirement:
the TT stream may be sent separately from the rest, even if it has to 
stay ABSOLUTELY synchronized, whatever editing is done on the rest of 
the movie.

Best regards
JC
--
Jean-Claude Dufourd       @======================================@
ENST, Dept COMELEC             The wing, over the big rock...
46, rue Barrault          @======================================@
75013 Paris                Tel: +33145817807    Fax: +33145804036
Received on Friday, 21 February 2003 06:02:38 GMT
