Re: Media--Technical Implications of Our User Requirements

On Jul 13, 2010, at 8:51 PM, Janina Sajka wrote:

> Colleagues:
> 
> The following is a first-pass, high level description of the content types and API mechanisms our user requirements expose.  It is meant to be a list of
> things we need to support in our eventual change proposal. It is intentionally agnostic
> regarding any particular technology.
> 


>          + 2.5 Content Navigation by Content Structure
> 
> A structured data file.
> 
> NOTE: Data in this file is used to synchronize all media representations
> available for a given content publication, i.e. whatever audio, video, and
> text document--default and alternative--versions may be provided.
> 
  To me, "synchronize all media representations" suggests that the data file will be used to control *playback* of the media. Is that what you intended, or is the file meant to support *navigation* within the media? I have sketched the navigation reading below.
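
  To make my question concrete, here is a rough sketch of the *navigation* reading (TypeScript; every name here is my own invention, not from the requirements document). Each structural point carries one time offset per representation, which lets a user agent jump to the same point in any version without itself driving playback:

    // Hypothetical shape for the structured data file; all field
    // names are invented for illustration.
    interface NavigationPoint {
      label: string;    // e.g. "Chapter 3"
      level: number;    // nesting depth within the content structure
      // One time offset (in seconds) per media representation, so the
      // same structural point can be located in the default video, the
      // described video, the audio-only version, and so on.
      offsets: { [representationId: string]: number };
    }

    interface StructuredDataFile {
      representations: string[];  // e.g. ["video-default", "audio-only"]
      points: NavigationPoint[];
    }

  Under the *playback* reading, the same file would instead be input to a scheduler that starts and stops each representation. Knowing which reading you mean changes what we would need to propose.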


>          + 3.1 Access to interactive controls / menus
> 
> An API providing access to:
> 
> Stop/Start
> Pause
> Fast Forward and Rewind (time based)
> time-scale modification control
> volume (for each available audio track)
> pan location (for each available audio track)
> pitch-shift control
> audio filters

  You said this is meant to be a list of "things we need to support in our eventual change proposal"; when did programmatic access to real-time audio filters become a requirement? CA-4 says only "Potentially support pre-emphasis filters, pitch shifting, and other audio processing algorithms". I have sketched the API surface the list implies below.
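
  For concreteness, the list above seems to imply an API surface roughly like this (TypeScript; the method names are mine, and the last method is exactly the part I am questioning):

    // Hypothetical control surface implied by the list above.
    interface MediaControlAPI {
      play(): void;
      pause(): void;
      stop(): void;
      seek(offsetSeconds: number): void;       // time-based fast forward / rewind
      setPlaybackRate(rate: number): void;     // time-scale modification
      setVolume(trackId: string, level: number): void;  // 0.0 to 1.0, per audio track
      setPan(trackId: string, pan: number): void;       // -1.0 (left) to 1.0 (right)
      setPitchShift(semitones: number): void;
      // Real-time filter access: this is the item that appears to go
      // beyond what CA-4 requires.
      applyAudioFilter(trackId: string, filter: unknown): void;
    }

  Everything down to setPitchShift follows straightforwardly from the user requirements; applyAudioFilter does not, as far as I can tell.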

> Viewport content selection, on screen location and sizing control

  I don't understand what this means.

> Extended descriptions and extended captions configuration/control

  How does one configure descriptions and captions?
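
  If what you have in mind is per-user preference settings, here is a sketch of the sort of thing I would expect (again TypeScript, with names of my own invention):

    // Hypothetical user preferences for extended descriptions and
    // extended captions; "extended" here means the primary media may
    // be paused so that longer text can be rendered in full.
    interface TimedTextPreferences {
      enabled: boolean;
      language: string;               // e.g. "en"
      extended: boolean;              // allow pausing primary media for long cues
      minimumDisplaySeconds?: number;
    }

  If you mean something richer than static preferences, e.g. run-time toggling during playback, that belongs in the 3.1 API above instead.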

> Ancillary content configuration/control
> 
  I still don't see a definition of "ancillary" in the requirements document.  
