
Re: Which is more efficient in the pipe, a whole feed or a sequence of entries?

From: Norman Walsh <ndw@nwalsh.com>
Date: Tue, 21 Apr 2009 21:18:31 -0400
To: XProc Dev <xproc-dev@w3.org>
Message-ID: <m2mya9qye0.fsf@nwalsh.com>
"Philip Fennell" <Philip.Fennell@bbc.co.uk> writes:
> It is quite straightforward to construct a pipeline that takes content
> from the file system (or a zip file), wraps it in an Atom Entry and
> then PUTs or POSTs it to the store. However, I was wondering whether
> creating a sequence of Entry documents would be more efficient,
> internally, within the pipeline processor than a single Feed document.
> I've found, when using Saxon for XSLT processing of large document
> collections, that the saxon:discard-document function is very useful
> in keeping memory usage under control.

I don't think it would make very much difference in XML Calabash
today. Looking forward, I can imagine that it would be easier to
manage them as separate documents.
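For illustration, here's a minimal, untested sketch of the entry-at-a-time
approach in XProc 1.0. The store URI, the choice of POST, and the port
names are placeholders, and the step that actually produces the sequence
of entries is left out:

  <p:declare-step xmlns:p="http://www.w3.org/ns/xproc"
                  xmlns:c="http://www.w3.org/ns/xproc-step"
                  version="1.0" name="main">

    <!-- A sequence of atom:entry documents; how they get here
         (p:directory-list + p:load, unzipping, etc.) is omitted. -->
    <p:input port="source" sequence="true"/>
    <p:output port="result" sequence="true"/>

    <p:for-each name="post-entry">
      <p:output port="result" sequence="true">
        <p:pipe step="submit" port="result"/>
      </p:output>

      <!-- Wrap the current entry in a c:body/c:request envelope ... -->
      <p:wrap match="/*" wrapper="c:body"/>
      <p:add-attribute match="/c:body"
                       attribute-name="content-type"
                       attribute-value="application/atom+xml"/>
      <p:wrap match="/c:body" wrapper="c:request"/>
      <p:add-attribute match="/c:request"
                       attribute-name="method" attribute-value="post"/>
      <p:add-attribute match="/c:request"
                       attribute-name="href"
                       attribute-value="http://store.example.org/entries"/>

      <!-- ... and submit it; each response appears as one document
           in the pipeline's output sequence. -->
      <p:http-request name="submit"/>
    </p:for-each>

  </p:declare-step>

In principle, a processor only ever needs the current entry (and its
response) in hand for each iteration, though as I said, Calabash doesn't
take particular advantage of that today.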

> There doesn't appear to be any indication of 'streaming' going on
> here, but should I expect any difference in the way memory is released
> when Calabash deals with whole feed documents or sequences of entries?

No, my immediate focus for XML Calabash is correctness and
completeness.

                                        Be seeing you,
                                          norm

-- 
Norman Walsh <ndw@nwalsh.com> | The way to get things done is not to
http://nwalsh.com/            | mind who gets the credit of doing
                              | them.--Benjamin Jowett

