
Re: video use-case

From: Silvia Pfeiffer <silviapfeiffer1@gmail.com>
Date: Wed, 8 Oct 2008 10:39:00 +1100
Message-ID: <2c0e02830810071639j6854a62br1da02ef27e2dd3e2@mail.gmail.com>
To: "Raphaël Troncy" <Raphael.Troncy@cwi.nl>
Cc: "Dave Singer" <singer@apple.com>, "Media Fragment" <public-media-fragment@w3.org>

On Tue, Oct 7, 2008 at 11:35 PM, Raphaël Troncy <Raphael.Troncy@cwi.nl> wrote:
> Dear Silvia,
>
> [...]
>
>>> For any protocol?  What is the 'media type' in RTSP?  The only things
>>> that come back with any kind of recognizable types are (a) the
>>> description format (e.g. SDP) and (b) the individual streams
>>> (video/mpeg4, for example, is the MIME name of the RTP mpeg4 payload
>>> format).  Which one 'owns' the fragment on the bundle that the URL
>>> represents?
>>
>> The media type is the MIME type of the resource that is returned. In
>> RTSP we don't actually have real "video" resources; we only have codec
>> streams. So the meaning of "fragment" in RTSP is not the same as for
>> ftp or http, for example. However, I think RTSP already has mechanisms
>> for doing fragment addressing, since that was a main part of the
>> standardisation of RTSP. I don't think we need to change that; we
>> should learn from it and, in the end, consider means of harmonising.
>
> That would be great if we could have further information explaining how the
> fragment addressing mechanism works in RTSP. Like you said, we will need to
> learn from their experience. Do you have any pointer to start with? Would
> you like to study that further and report your findings during one of our
> next telecon?

I won't have the time to go into details, but I studied it 7 years ago
when we were defining temporal URIs. Essentially, the PLAY request has
a Range header as part of the protocol parameters of RTSP (see
http://www.ietf.org/rfc/rfc2326.txt and search for "Range"). It is the
equivalent of what Dave and I have been discussing: the use of byte
ranges to deliver the data. In HTTP, however, we need a mechanism to
communicate the temporal segment (or, more generally, the media
fragment) between a user, a user agent, and a server, such that the
server can help convert the fragment to a byte range, which the UA can
then request from the server and which can be cached by proxying
intermediaries.
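To make the comparison concrete, here is a minimal sketch (in Python, just formatting the wire messages) of the two requests being contrasted. The RTSP PLAY/Range form is taken from RFC 2326; the HTTP byte-range form is the ordinary Range request a UA could issue once a server has somehow mapped the time segment to byte offsets. The URLs, session id, and byte offsets are made-up illustrative values; how the time-to-byte mapping is communicated in HTTP is exactly the open question being discussed.

```python
def rtsp_play_request(url, session, start_sec, end_sec):
    """An RTSP PLAY request selecting a time segment via the Range
    header (npt = normal play time, in seconds), per RFC 2326."""
    return (
        f"PLAY {url} RTSP/1.0\r\n"
        f"CSeq: 3\r\n"
        f"Session: {session}\r\n"
        f"Range: npt={start_sec}-{end_sec}\r\n"
        "\r\n"
    )

def http_byte_range_request(path, host, first_byte, last_byte):
    """The plain HTTP byte-range request a UA could send after the
    time segment has been converted to byte offsets; byte ranges are
    cacheable by ordinary intermediaries (HTTP Range header)."""
    return (
        f"GET {path} HTTP/1.1\r\n"
        f"Host: {host}\r\n"
        f"Range: bytes={first_byte}-{last_byte}\r\n"
        "\r\n"
    )

# Hypothetical example values: seconds 10-20 of a video, which some
# server-side mapping has resolved to bytes 102400-409600.
print(rtsp_play_request("rtsp://example.com/video", "12345678", 10, 20))
print(http_byte_range_request("/video.ogv", "example.com", 102400, 409600))
```

The point of the contrast: in RTSP the temporal range travels natively in the protocol, while in HTTP only the byte range does, so the fragment-to-bytes conversion has to happen somewhere before the cacheable request can be made.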

I'll make a picture of it, since this seems to be difficult to
understand. There are only a few possible ways of doing this
communication, because of the way video resources, HTTP, and URIs
work. I don't have much time today to get onto it, but I will do what
I can.

Cheers,
Silvia.
Received on Tuesday, 7 October 2008 23:39:37 GMT
