- From: Philip Fennell <Philip.Fennell@marklogic.com>
- Date: Mon, 10 May 2010 00:21:37 -0700
- To: XProc Dev <xproc-dev@w3.org>
Nic,

Just a suggestion, but, depending on which version of Calabash, and therefore Saxon, you are using, you may have access to the Saxon extension functions. If so, you could try weaving the saxon:discard-document() function into your transforms. I have used it in the past when transforming many large documents. I know this doesn't treat the underlying problem, but it may act as a workaround for now. There are a couple of rough sketches of what I mean after your quoted message below.

Regards

Philip Fennell

________________________________________
From: xproc-dev-request@w3.org [xproc-dev-request@w3.org] On Behalf Of Nic Gibson [nicg@corbas.net]
Sent: 07 May 2010 11:22
To: XProc Dev
Subject: Memory usage

We're seeing an XProc script, run through Calabash, that shows increasing memory usage over time. I suspect that this is to be expected under the circumstances, but I wanted to check and see if anyone can suggest a mitigating action.

The script takes an XML file containing (basically) a list of file URLs. Each of these URLs is a directory on the local filesystem. All XML files in each directory are read using p:load, then transformed using several XSLT pipelines. The whole script is basically two big nested p:for-each loops (one to read directories and a nested one to read and process the files found).

As this runs, the memory usage goes up for each file loaded and, eventually, the JVM kills the process with a heap exhaustion error. I suspect that there is nothing in the script described above that might indicate to Calabash that any file can be discarded, so each one is held in memory until the end of the script. Is that likely? I'm not exactly a skilled Java programmer, so I'm not in a position to read the code.

Can anyone see any sensible approach that might allow us to run this script over several thousand XML files when it currently dies after around nine?

cheers

nic
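For context, the shape of the pipeline Nic describes would be roughly as follows. This is a reconstruction rather than his actual code; the 'directory' element name, the URI resolution, and the single transform.xsl are all guesses:

<p:declare-step xmlns:p="http://www.w3.org/ns/xproc"
                xmlns:c="http://www.w3.org/ns/xproc-step"
                version="1.0">

  <p:input port="source"/>  <!-- the XML list of directory URLs -->
  <p:output port="result" sequence="true"/>

  <!-- Outer loop: one iteration per directory in the list. -->
  <p:for-each>
    <p:iteration-source select="//directory"/>
    <p:directory-list>
      <p:with-option name="path" select="/directory/@href"/>
    </p:directory-list>

    <!-- Inner loop: load and transform every file the listing reports. -->
    <p:for-each>
      <p:iteration-source select="//c:file"/>
      <p:load>
        <p:with-option name="href"
                       select="resolve-uri(/c:file/@name, base-uri(/c:file))"/>
      </p:load>
      <p:xslt>
        <p:input port="stylesheet">
          <p:document href="transform.xsl"/>
        </p:input>
      </p:xslt>
    </p:for-each>
  </p:for-each>

</p:declare-step>

If Calabash holds on to every document p:load produces until the pipeline finishes, as Nic suspects, then a structure like this will grow the heap with every file regardless of what the transforms do.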
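The workaround would then be to stop loading the files with p:load and instead let the stylesheet pull each one in through document() wrapped in saxon:discard-document(). That function returns the document unchanged but drops it from Saxon's document pool, so each file can be garbage collected once the transform has moved on. A minimal sketch, assuming a file list of the form <files><file href="..."/></files> (the element names and the 'doc' mode are invented, and the function's availability depends on your Saxon edition):

<xsl:stylesheet version="2.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
                xmlns:saxon="http://saxon.sf.net/"
                exclude-result-prefixes="saxon">

  <xsl:template match="/">
    <results>
      <xsl:for-each select="/files/file">
        <!-- discard-document() hands the parsed file straight through,
             but releases it from the document pool after this iteration. -->
        <xsl:apply-templates
            select="saxon:discard-document(document(@href))/*"
            mode="doc"/>
      </xsl:for-each>
    </results>
  </xsl:template>

  <!-- Placeholder for the real per-document processing. -->
  <xsl:template match="*" mode="doc">
    <processed root="{name()}"/>
  </xsl:template>

</xsl:stylesheet>

The inner p:for-each and the p:load would then disappear from the pipeline: the stylesheet takes the file list as its source document and does the iteration itself, which keeps the memory for each file's tree reclaimable.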
Received on Monday, 10 May 2010 07:22:12 UTC