Re: Which is more efficient in the pipe, a whole feed or a sequence of entries?

"Philip Fennell" <Philip.Fennell@bbc.co.uk> writes:
> It is quite straightforward to construct a pipeline that takes content
> from the file system (or a zip file), wraps it in an Atom Entry and then
> PUTs|POSTs it to the store. However, I was wondering whether creating a
> sequence of Entry documents would be more efficient, internally, within
> the pipeline processor than a single Feed document. I've found, when
> using Saxon for XSLT processing of large document collections, that the
> saxon:discard-document function is very useful in keeping memory usage
> under control.
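For readers following along, the sort of pipeline Philip describes might be sketched in XProc 1.0 roughly as below. The steps (p:for-each, p:wrap, p:insert, p:http-request) are standard XProc steps, but the store URI, content type, and overall shape are assumptions, not Philip's actual pipeline:

    <p:declare-step xmlns:p="http://www.w3.org/ns/xproc"
                    xmlns:c="http://www.w3.org/ns/xproc-step"
                    xmlns:atom="http://www.w3.org/2005/Atom"
                    version="1.0">
      <p:input port="source" sequence="true"/>
      <p:output port="result" sequence="true"/>

      <p:for-each>
        <!-- Wrap the current document in an atom:entry -->
        <p:wrap name="entry" match="/*" wrapper="atom:entry"/>

        <!-- Build a c:request around the entry; the href and
             content type here are placeholders -->
        <p:insert match="c:body" position="first-child">
          <p:input port="source">
            <p:inline>
              <c:request method="post" href="http://example.org/store">
                <c:body content-type="application/atom+xml"/>
              </c:request>
            </p:inline>
          </p:input>
          <p:input port="insertion">
            <p:pipe step="entry" port="result"/>
          </p:input>
        </p:insert>

        <!-- POST each entry to the store -->
        <p:http-request/>
      </p:for-each>
    </p:declare-step>

Processing one entry per p:for-each iteration, rather than one large feed document, is what raises the memory question below.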

I don't think it would make very much difference in XML Calabash
today. Looking forward, I can imagine that it would be easier to
manage them as separate documents.

> There doesn't appear to be any indication of 'streaming' going on here,
> but should I expect any difference in the way memory is released when
> Calabash deals with whole feed documents versus sequences of entries?

No; my immediate focus for XML Calabash is correctness and
completeness rather than streaming or memory optimizations.

                                        Be seeing you,
                                          norm

-- 
Norman Walsh <ndw@nwalsh.com> | The way to get things done is not to
http://nwalsh.com/            | mind who gets the credit of doing
                              | them.--Benjamin Jowett

Received on Wednesday, 22 April 2009 01:19:16 UTC