Re: video use-case

At 11:13  +1100 7/10/08, Silvia Pfeiffer wrote:
>
>OK, I may be misunderstanding something. For a UA to ask for byte
>ranges when it only knows about time ranges, it would need a default
>mapping for a media type from time to byte. Are you saying that
>independent of the sampling rate and framerate, you will always have a
>direct mapping for any MOV file from time to byte offset? Can you
>explain?


OK: MP4 and MOV files store all the frame-size, timestamp, and 
location information NOT interleaved with the actual compressed 
data, but in run-length compacted tables in a separate area.  In 
that area there is a table that indexes from time to frame number, 
another that maps frame number to frame size, and another that 
effectively gives you absolute file offsets for each frame 
(actually, frames are collected into chunks, but that's not 
terribly important).
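As a rough illustration, here is a minimal Python sketch of the 
lookup those tables enable.  The table contents and names below are 
invented for the example -- in a real file these come from parsing 
the actual run-length sample tables inside the moov -- but the 
time -> frame -> offset chain is the same:

```python
# Hypothetical, simplified sample tables (illustrative data only):
#   run-length pairs of (frame count, ticks per frame)  -- time to frame
#   per-frame sizes in bytes                            -- frame to size
#   absolute file offset of each frame                  -- frame to offset
time_to_frame_runs = [(3, 1000), (2, 2000)]  # 3 frames of 1000 ticks, then 2 of 2000
frame_sizes = [500, 400, 450, 900, 850]
frame_offsets = [48, 548, 948, 1398, 2298]

def frame_for_time(t):
    """Walk the run-length table to find the frame covering tick t."""
    frame = 0
    for count, duration in time_to_frame_runs:
        run_ticks = count * duration
        if t < run_ticks:
            return frame + t // duration
        t -= run_ticks
        frame += count
    raise ValueError("time is past the end of the track")

def byte_range_for_time(t):
    """Map a time (in ticks) to an inclusive byte range, HTTP-style."""
    f = frame_for_time(t)
    start = frame_offsets[f]
    return start, start + frame_sizes[f] - 1

print(byte_range_for_time(3500))  # frame 3 -> (1398, 2297)
```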

So, once you have these tables -- and they are all together in one 
place -- you have all you need to map times to byte ranges, and you 
can go ahead and issue byte-range requests to any HTTP 1.1 server.
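The request itself is then just a standard Range header.  A sketch 
(the URL is made up; any HTTP 1.1 server that honours Range will 
answer with 206 Partial Content):

```python
from urllib.request import Request

def range_request(url, start, end):
    """Build a request for an inclusive byte range [start, end]."""
    req = Request(url)
    req.add_header("Range", "bytes=%d-%d" % (start, end))
    return req

# Fetch just the bytes of one frame (offsets from the sample tables):
req = range_request("http://example.com/movie.mov", 1398, 2297)
# urllib.request.urlopen(req) would then transfer only those 900 bytes
```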

Finding the tables is not all that hard.  They are all in the moov 
atom, which is at the top level of the file; in any file that's 
been properly laid out, it's either first, or right after a very 
small 'file type' atom.  So, read a decent chunk of the beginning 
of the file, and you'll probably get it.  The atom carries a size 
indicator (its first 32-bit word), and if that size exceeds what 
you read, you can issue a second read for the rest.
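A sketch of that scan, assuming the usual atom header of a 32-bit 
big-endian size followed by a four-character type (the 64-bit-size 
and "size 0 = rest of file" cases are left out for brevity):

```python
import struct

def find_moov(buf):
    """Scan top-level atoms in buf; return (offset, declared size) of
    the 'moov' atom, or None if it isn't in this chunk.  The declared
    size may exceed len(buf), in which case a second read is needed."""
    pos = 0
    while pos + 8 <= len(buf):
        size, kind = struct.unpack_from(">I4s", buf, pos)
        if kind == b"moov":
            return pos, size
        if size < 8:
            break  # 64-bit / open-ended sizes omitted in this sketch
        pos += size
    return None

# e.g. a tiny 16-byte 'ftyp' atom followed by a (truncated) 'moov':
head = struct.pack(">I4s8x", 16, b"ftyp") + struct.pack(">I4s", 4096, b"moov")
print(find_moov(head))  # (16, 4096) -- moov at offset 16, 4096 bytes long
```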
-- 
David Singer
Apple/QuickTime

Received on Tuesday, 7 October 2008 00:44:16 UTC