
Re: video use-case

From: Dave Singer <singer@apple.com>
Date: Mon, 6 Oct 2008 17:42:44 -0700
Message-Id: <p0624081ec5105dc1cc9f@[]>
To: Media Fragment <public-media-fragment@w3.org>

At 11:13  +1100 7/10/08, Silvia Pfeiffer wrote:
>OK, I may be misunderstanding something. For a UA to ask for byte
>ranges when it only knows about time ranges, it would need a default
>mapping for a media type from time to byte. Are you saying that
>independent of the sampling rate and framerate, you will always have a
>direct mapping for any MOV file from time to byte offset? Can you

OK, the MP4 and MOV files store all the frame size, timestamp, and 
location information NOT interleaved with the actual compressed data, 
but in run-length compacted tables in a separate area.  In there, 
there is a table that indexes from time to frame number, and another 
table that indexes frame number to frame size, and another that 
effectively gives you absolute file-offsets for each frame (actually, 
frames are collected into chunks, but that's not terribly important 
here).
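
To make that concrete, here is a minimal sketch in Python.  It 
assumes the tables have already been decoded from the file into 
plain lists, uses the usual table names (stts for time-to-sample, 
stsz for sample sizes, stsc for sample-to-chunk, stco for chunk 
offsets), and simplifies stsc to (first_chunk, samples_per_chunk) 
pairs; none of this code comes from the message itself.

    def frame_for_time(stts, t, timescale):
        # stts is a list of (sample_count, sample_delta) runs.
        # Returns the index of the frame covering media time t (seconds).
        target = int(t * timescale)
        frame, elapsed = 0, 0
        for count, delta in stts:
            if elapsed + count * delta > target:
                return frame + (target - elapsed) // delta
            elapsed += count * delta
            frame += count
        return frame - 1                # past the end: clamp to last frame

    def frame_offsets(stsz, stsc, stco):
        # Expand the chunking tables into an absolute file offset per frame.
        # stsz: per-frame sizes; stsc: (first_chunk, samples_per_chunk) runs
        # with 1-based chunk numbers; stco: file offset of each chunk.
        offsets, frame = [], 0
        for chunk_index, chunk_offset in enumerate(stco, start=1):
            per_chunk = 1
            for first_chunk, samples_per_chunk in stsc:
                if chunk_index >= first_chunk:
                    per_chunk = samples_per_chunk
            pos = chunk_offset
            for _ in range(per_chunk):
                if frame >= len(stsz):
                    return offsets
                offsets.append(pos)
                pos += stsz[frame]
                frame += 1
        return offsets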

So, once you have these tables -- and they are all together in one 
place -- you have all you need to map time to byte-ranges, and you 
can go ahead and do byte-range requests to any HTTP/1.1 server.
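
Continuing that sketch, and reusing the two helpers above: the 
time-to-byte mapping plus the actual HTTP/1.1 Range request come to 
a few lines.  The URL, timescale, and table contents here are 
made-up example values.

    import urllib.request

    def byte_range_for_times(t_start, t_end, timescale,
                             stts, stsz, stsc, stco):
        # Frames are assumed to be stored in file order, so the bytes
        # covering [t_start, t_end] run from the first frame's offset
        # through the end of the last frame.
        offsets = frame_offsets(stsz, stsc, stco)
        first = frame_for_time(stts, t_start, timescale)
        last = frame_for_time(stts, t_end, timescale)
        return offsets[first], offsets[last] + stsz[last] - 1

    # Toy tables: four frames of 40 ticks each at timescale 1000,
    # all in one chunk that starts at byte 2048.
    stts = [(4, 40)]
    stsz = [500, 400, 450, 420]
    stsc = [(1, 4)]
    stco = [2048]

    start, end = byte_range_for_times(0.0, 0.1, 1000,
                                      stts, stsz, stsc, stco)
    req = urllib.request.Request(
        "http://example.com/movie.mp4",
        headers={"Range": "bytes=%d-%d" % (start, end)})
    with urllib.request.urlopen(req) as resp:
        clip_bytes = resp.read()    # a compliant server replies 206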

Finding the tables is not all that hard.  They are all in the 'moov' 
atom, which is at the top level of the file, and in any file that's 
been properly laid out it's either first, or comes right after a very 
small 'file type' atom.  So, read a decent chunk of the beginning of 
the file, and you'll probably get it.  The atom has a size indicator 
(its first word), and if that size exceeds what you read, you can 
issue a second read for the rest.
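
As a rough illustration of that last step (again in Python, fetching 
with HTTP Range requests; an atom header is a 32-bit big-endian size 
followed by a four-character type, and 64-bit sizes are skipped here 
for brevity; the URL and the 256 KB initial read are just example 
choices):

    import struct, urllib.request

    def fetch(url, start, end):
        # Fetch the inclusive byte range [start, end] from an HTTP/1.1 server.
        req = urllib.request.Request(
            url, headers={"Range": "bytes=%d-%d" % (start, end)})
        return urllib.request.urlopen(req).read()

    def read_moov(url, initial=256 * 1024):
        head = fetch(url, 0, initial - 1)
        pos = 0
        while pos + 8 <= len(head):
            size, kind = struct.unpack(">I4s", head[pos:pos + 8])
            if size < 8:
                break               # zero or 64-bit sizes: not handled here
            if kind == b"moov":
                if pos + size <= len(head):
                    return head[pos:pos + size]
                # issue a second read for the part we did not get
                return head[pos:] + fetch(url, len(head), pos + size - 1)
            pos += size             # skip 'ftyp' or other leading atoms
        return None                 # 'moov' not near the start of the file
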
David Singer
Received on Tuesday, 7 October 2008 00:44:16 UTC
