Re: [XHR] support for streaming data

Hi Charles,

Le 10/08/2011 23:19, Charles Pritchard a écrit :
> On 8/9/2011 1:00 AM, Cyril Concolato wrote:
>> Hi Charles,
>>> I believe that GPAC seeks through large SVG files via offsets and small buffers, from what I understood at SVG F2F.
>>> The technique is similar to what PDF has in its spec.
>> I don't know what you're referring to.
> PDF Cross-Reference Stream Data
> PDF supports byte offsets, links and SMIL.
Thanks for the reference.

> I suppose I was referring more to the MP4Box work than GPAC, though they do work in harmony.
> MP4 has chunk offsets, and GPAC includes SVG <discard> support.
> I believe that MP4Box stores, and GPAC reads fragments of a large SVG file
> throughout the MP4 stream, in a limited manner, similar to how a PDF processes streams.
> They both allow someone to seek and render portions of a large file,
> without loading it all into memory.
>  From the article:
> "We have applied the proposed method to fragment SVG content into SVG streams on
> long-running animated vector graphics cartoons, resulting from
> the transcoding of Flash content... NHML descriptions were generated
> automatically by the cartoon or subtitle transcoders."
> "... the smallest amount of memory [consumed] is the 'Streaming and Progressive Rendering'. The
> memory consumption peak is reduced by 64%"
>>> SVG does not have byte offset hints, but GPAC expects
>>> data to be processed by an authoring tool and otherwise works with transcoding, much as VLC (VideoLan) does.
>> The details of how we do it are here:
>> Basically, for long-running SVG animations (e.g. automatic translation from Flash to SVG), it is interesting to load SVG parts only when they are needed and to discard them (using the SVG Tiny 1.2 <discard> element) when they are no longer needed. For that, we use an auxiliary file that indicates how to fragment the SVG file into a stream, giving a timestamp to each SVG fragment. That auxiliary file is then used to store the SVG fragments as regular access units in MP4 files; we use MP4Box for that. The manipulation of those fragments for storage and playback is then similar to what you would do for audio/video streams. We don't transcode the SVG fragments, but individual gzip encoding of each fragment is possible, for instance.
>> I think an interesting use case for XHR would be to be able to request data with some synchronization, i.e. with a clock reference and a timestamp for each piece of response data.
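To make the fragmentation idea above concrete, here is a rough sketch in Python. The timestamp table, fragment contents, and the access-unit dictionary format are all made up for illustration; the real pipeline uses an NHML description and MP4Box rather than anything like this.

```python
# Auxiliary description: (timestamp in seconds, SVG fragment).
# Each fragment adds the elements needed from that time onward and
# discards elements that are no longer needed (SVG Tiny 1.2 <discard>).
fragments = [
    (0.0, '<g id="scene1">...</g>'),
    (5.0, '<discard xlink:href="#scene1"/><g id="scene2">...</g>'),
    (10.0, '<discard xlink:href="#scene2"/><g id="scene3">...</g>'),
]

def access_units(frags):
    """Turn (timestamp, fragment) pairs into access units that a
    muxer could store as samples in an MP4 track, one per fragment."""
    return [{"dts": int(ts * 1000),          # decode time in milliseconds
             "data": frag.encode("utf-8")}   # fragment payload
            for ts, frag in frags]

aus = access_units(fragments)
```

Once the fragments are regular timestamped samples like this, storage and playback tooling can treat them the same way it treats audio/video samples.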
> Some part of that could be handled via custom HTTP headers; though it's certainly a bit of extra work,
> much as implementing "seek" over HTTP can be work.
Custom HTTP headers would work, or other HTTP streaming solutions (e.g. MPEG DASH). That's the benefit of storing the SVG as fragments in an MP4 file. At the time we wrote the paper, we were able to stream the SVG with an unmodified Darwin Streaming Server using the RTP protocol. I believe there would be no problem streaming the SVG in an MP4 with an unmodified HTTP server using the DASH approach, though I haven't tried it.
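As a sketch of the custom-header idea: each HTTP response carrying an SVG fragment could also carry a clock identifier and a timestamp, letting the client schedule the fragment's presentation. The header names (X-Clock-Id, X-Timestamp) are invented here for illustration; no standard header is implied.

```python
def schedule_fragment(headers, body, clocks):
    """Compute when to present a fragment, given per-stream clocks.

    `clocks` maps a clock id to the wall-clock time (in seconds)
    at which that clock's timestamp 0 occurred.
    """
    clock_id = headers["X-Clock-Id"]                # hypothetical header
    timestamp = float(headers["X-Timestamp"])       # media time, seconds
    presentation_time = clocks[clock_id] + timestamp
    return presentation_time, body

# Clock "cartoon" started at t=100s wall time; a fragment stamped 5.0s
# into that clock should therefore be presented at t=105s.
clocks = {"cartoon": 100.0}
when, frag = schedule_fragment(
    {"X-Clock-Id": "cartoon", "X-Timestamp": "5.0"},
    "<g>...</g>",
    clocks,
)
```

The same pairing of clock id and timestamp could equally be carried in a DASH manifest or in the MP4 timing structures rather than in headers.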

> I'll keep thinking about the case you brought up. I do believe timestamps are currently
> available on events, relating to when the event was raised.
> What do you mean by a clock reference?
That's a general concept when synchronizing multiple media streams: the streams are not necessarily all synchronized together, so you need to group them by their common clock reference.
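A minimal sketch of that grouping, with made-up stream and clock names: streams sharing a clock reference can be synchronized against each other directly, while streams on different clocks need separate clock mappings.

```python
def group_by_clock(streams):
    """Group stream names by their clock reference id.

    Streams in the same group share a timeline and can be
    synchronized against each other directly.
    """
    groups = {}
    for s in streams:
        groups.setdefault(s["clock"], []).append(s["name"])
    return groups

# Hypothetical example: video and audio share clock "A", while an
# independently timed SVG overlay runs on its own clock "B".
streams = [
    {"name": "video", "clock": "A"},
    {"name": "audio", "clock": "A"},
    {"name": "svg-overlay", "clock": "B"},
]
groups = group_by_clock(streams)
```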

Cyril Concolato
Maître de Conférences/Associate Professor
Groupe Multimedia/Multimedia Group
Telecom ParisTech
46 rue Barrault
75 013 Paris, France

Received on Thursday, 11 August 2011 08:34:57 UTC