Re: video use-case

From: Silvia Pfeiffer <silviapfeiffer1@gmail.com>
Date: Tue, 7 Oct 2008 12:07:23 +1100
Message-ID: <2c0e02830810061807t5f69297ep4f9650b390399ef6@mail.gmail.com>
To: "Dave Singer" <singer@apple.com>
Cc: "Media Fragment" <public-media-fragment@w3.org>

On Tue, Oct 7, 2008 at 11:42 AM, Dave Singer <singer@apple.com> wrote:
>
> At 11:13  +1100 7/10/08, Silvia Pfeiffer wrote:
>>
>> OK, I may be misunderstanding something. For a UA to ask for byte
>> ranges when it only knows about time ranges, it would need a default
>> mapping for a media type from time to byte. Are you saying that
>> independent of the sampling rate and framerate, you will always have a
>> direct mapping for any MOV file from time to byte offset? Can you
>> explain?
>
>
> OK, the MP4 and MOV files store all the frame size, timestamp, and location
> information NOT interleaved with the actual compressed data, but in
> run-length compacted tables in a separate area.  In there, there is a table
> that indexes from time to frame number, and another table that indexes frame
> number to frame size, and another that effectively gives you absolute
> file-offsets for each frame (actually, they are collected into chunks of
> frames, but that's not terribly important).
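
A minimal sketch of the lookup these three tables enable, assuming simplified, uncompacted versions of them (real MOV/MP4 files run-length compact this data in the 'stts', 'stsz', and 'stco'/'stsc' boxes); the table names and sample values here are illustrative only:

```python
import bisect

def time_to_byte_range(t, frame_times, frame_sizes, frame_offsets):
    """Map a presentation time to the (start, end) byte range of its frame.

    frame_times[i]   -- start time of frame i in seconds (sorted ascending)
    frame_sizes[i]   -- size of frame i in bytes
    frame_offsets[i] -- absolute file offset of frame i
    """
    # Table 1: time -> frame number (binary search over frame start times)
    i = bisect.bisect_right(frame_times, t) - 1
    if i < 0:
        raise ValueError("time before first frame")
    # Table 2: frame number -> frame size
    # Table 3: frame number -> absolute file offset
    start = frame_offsets[i]
    end = start + frame_sizes[i] - 1  # inclusive, as in an HTTP Range header
    return start, end

# Fabricated example: three frames starting at 0.0s, 0.04s, 0.08s
times = [0.0, 0.04, 0.08]
sizes = [1000, 800, 900]
offsets = [4096, 5096, 5896]
print(time_to_byte_range(0.05, times, sizes, offsets))  # (5096, 5895)
```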
>
> So, once you have these tables -- and they are all together in one place --
> you have all you need to map time to byte-ranges, and you can go ahead and
> do byte-range requests to any http 1.1 server.
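
The byte-range request step itself can be sketched as below, against any HTTP/1.1 server; the host and path are hypothetical, and the (start, end) offsets would come from the tables in the moov atom:

```python
import http.client

def range_header(start, end):
    # HTTP/1.1 Range header value: inclusive byte offsets
    return "bytes=%d-%d" % (start, end)

def fetch_byte_range(host, path, start, end):
    """GET `path` from `host`, asking only for bytes start..end."""
    conn = http.client.HTTPConnection(host)
    conn.request("GET", path, headers={"Range": range_header(start, end)})
    resp = conn.getresponse()  # a server that honours the range returns
    body = resp.read()         # 206 Partial Content
    conn.close()
    return resp.status, body

print(range_header(5096, 5895))  # bytes=5096-5895
```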
>
> Finding the table is not all that hard.  They are all in the moov atom which
> is at top-level of the file, and in any file that's been properly laid out,
> it's either first, or after a very small 'file type' atom.  So, read a
> decent chunk of the beginning of the file, and you'll probably get it.  It
> has a size indicator (second word of it), and if it exceeds what you read,
> you can issue a second read for the rest.
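
That scan for the moov atom can be sketched as follows, assuming the standard box layout (a 32-bit big-endian size followed by a 4-character type) and ignoring the 64-bit 'largesize' and size-0 ("to end of file") cases a real parser must also handle; the sample bytes are fabricated:

```python
import struct

def find_moov(data):
    """Return (offset, size) of the top-level 'moov' atom within the
    bytes read so far, or None if it was not found."""
    pos = 0
    while pos + 8 <= len(data):
        size, kind = struct.unpack_from(">I4s", data, pos)
        if kind == b"moov":
            return pos, size
        if size < 8:    # malformed (64-bit sizes ignored in this sketch)
            return None
        pos += size     # skip to the next top-level atom
    return None

# Fabricated example: a tiny 'ftyp' atom followed by a (truncated) 'moov'
sample = (struct.pack(">I4s", 16, b"ftyp") + b"\x00" * 8
          + struct.pack(">I4s", 1024, b"moov"))
print(find_moov(sample))  # (16, 1024)
```

If the reported size exceeds what was read, a second request for the remainder (as described above) completes the tables.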

OK, so the tables are in the MOV file and therefore not available to
the client without talking to the server. So you need at least one
round of communication between client and server to get the
information on how to map time to byte-ranges. That is exactly the
process I talked about earlier: the server has to tell the client.

It is true, however, that the server can either tell the client
generically how to map times to byte offsets for a specific video, or
it can just provide the byte ranges directly, in which case the client
remains dependent on the server for any further byte-range mappings. I
assume that is what you were trying to tell me?

Cheers,
Silvia.
Received on Tuesday, 7 October 2008 01:07:59 GMT