
Question about performance when handling many files

From: Geert Josten <geert.josten@dayon.nl>
Date: Fri, 11 Nov 2011 16:11:35 +0100
Message-ID: <e7e7ec361412469e6d396abf80caaa9c@mail.gmail.com>
To: XProc Dev <xproc-dev@w3.org>

Consider the case where I have an input folder with thousands, or perhaps
even millions, of documents, and I would like to load them, apply an XSLT
stylesheet to each one individually, and store the results. In pseudo-code it
could look something like:


<p:for-each select="c:file">
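
For concreteness, a fuller sketch of what I have in mind (the paths and the
stylesheet name are placeholders, and I am assuming the standard XProc 1.0
p:directory-list, p:load, p:xslt and p:store steps):

```xml
<p:declare-step xmlns:p="http://www.w3.org/ns/xproc"
                xmlns:c="http://www.w3.org/ns/xproc-step"
                version="1.0">
  <p:output port="result" sequence="true"/>

  <!-- List the files in the input folder (path is a placeholder) -->
  <p:directory-list path="file:///data/input"/>

  <!-- Iterate over each c:file entry in the c:directory listing -->
  <p:for-each>
    <p:iteration-source select="//c:file"/>
    <!-- Remember the file name of the current entry -->
    <p:variable name="name" select="/c:file/@name"/>

    <!-- Load the document itself -->
    <p:load>
      <p:with-option name="href"
                     select="concat('file:///data/input/', $name)"/>
    </p:load>

    <!-- Apply the stylesheet (transform.xsl is a placeholder) -->
    <p:xslt>
      <p:input port="stylesheet">
        <p:document href="transform.xsl"/>
      </p:input>
      <p:input port="parameters">
        <p:empty/>
      </p:input>
    </p:xslt>

    <!-- Store the transformed result under the same name -->
    <p:store>
      <p:with-option name="href"
                     select="concat('file:///data/output/', $name)"/>
    </p:store>
  </p:for-each>
</p:declare-step>
```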

How well would that behave? Particularly the p:directory-list step: if it
generates the whole list first, and the for-each only starts after the list
is complete, that does not sound very efficient.

Are there any track records of how XML Calabash and other processors behave
in such a case?

Kind regards,

drs. G.P.H. (Geert) Josten
Senior Developer

Dayon B.V.
Delftechpark 37b
2628 XJ Delft

T +31 (0)88 26 82 570
M +31 (0)6 5438 1359


The information sent in or with this e-mail message originates from
Dayon BV and is intended solely for the addressee. If you have received this
message unintentionally, we kindly request that you delete it. No rights can
be derived from this message.
Received on Friday, 11 November 2011 15:12:42 UTC
