Subject: Re: [xsl] Transforming large XML docs in small amounts of memory
From: "Andrew Welch" <andrew.j.welch@xxxxxxxxx>
Date: Mon, 30 Apr 2007 10:34:21 +0100
On 4/30/07, Ronan Klyne <ronan.klyne@xxxxxxxxxxx> wrote:
> Hi all,
>
> I am trying to find ways of reducing the memory requirements of our
> transforms.
> The main factor driving memory usage up is the size of the input
> document (up to about 150MB), but this is out of our control at this
> point.
> So, the question: is there anything that can be done (or avoided) in
> the XSL to decrease the amount of memory used in the transform?
>
> (I appreciate that this question is very abstract, and I apologise - I'm
> mostly fishing for ideas, or for confirmation of my suspicion that not
> much can be done...)

Much can be done, but your options depend on the processor and environment you're running in, and on how flexible you can be: do you need a pure XSLT 1.0/2.0 solution, or can you use extensions or modify the processing pipeline?
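If modifying the pipeline is an option, one common trick is to put a SAX
filter between the parser and the XSLT processor, so that subtrees the
stylesheet never looks at are discarded before the source tree is built;
the processor then only holds the pruned document in memory. Below is a
minimal JAXP sketch, not a drop-in solution: the element name bulkData,
the file names and the stylesheet are placeholders, and it assumes the
transform genuinely doesn't need the pruned content.

import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.sax.SAXSource;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;
import org.xml.sax.Attributes;
import org.xml.sax.InputSource;
import org.xml.sax.SAXException;
import org.xml.sax.helpers.XMLFilterImpl;
import org.xml.sax.helpers.XMLReaderFactory;

/* Suppresses every <bulkData> subtree (a placeholder name) so the
   XSLT processor never builds it into its source tree. */
class PruningFilter extends XMLFilterImpl {
    private int depth = 0; // > 0 while inside a pruned subtree

    public void startElement(String uri, String local, String qName,
                             Attributes atts) throws SAXException {
        if (depth > 0 || "bulkData".equals(local)) { depth++; return; }
        super.startElement(uri, local, qName, atts);
    }

    public void endElement(String uri, String local, String qName)
            throws SAXException {
        if (depth > 0) { depth--; return; }
        super.endElement(uri, local, qName);
    }

    public void characters(char[] ch, int start, int len)
            throws SAXException {
        if (depth == 0) super.characters(ch, start, len);
    }
}

public class FilteredTransform {
    public static void main(String[] args) throws Exception {
        PruningFilter filter = new PruningFilter();
        filter.setParent(XMLReaderFactory.createXMLReader());

        Transformer t = TransformerFactory.newInstance()
                .newTransformer(new StreamSource("transform.xsl"));
        // The transformer pulls its input through the filter.
        t.transform(new SAXSource(filter, new InputSource("big-input.xml")),
                    new StreamResult("output.xml"));
    }
}

The saving is proportional to how much of the document you can prune, so
this only helps when the stylesheet really does ignore large parts of the
input.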

Also, you'll need to let us know:

- Is the input uniform chunks of data in a single file? (This is likely
if it's a "data-centric" XML file.) Or does the processing require access
to the whole input for the whole transform? (If it's chunks, see the
sketch after these questions.)

- What is your current memory usage? What's the limit, and what would be
an acceptable bound?

- How are you measuring memory usage?  Is it simply the input XML that
is using up all available memory, or do other parts of the pipeline
use a lot of memory too?
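
If the input is uniform chunks, the usual answer is to split: compile the
stylesheet once, then run it against each record as a small standalone
document, so peak memory is bounded by the largest record rather than by
the whole file. Here's a sketch using StAX (JAXP 1.4 / Java 6); the
element name "record", the file names and the per-record stylesheet are
placeholders, and it assumes your TransformerFactory accepts a StAXSource
(the JDK 6 default does):

import java.io.FileInputStream;
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamReader;
import javax.xml.transform.OutputKeys;
import javax.xml.transform.Templates;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stax.StAXSource;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

public class ChunkedTransform {
    public static void main(String[] args) throws Exception {
        // Compile the stylesheet once and reuse it for every chunk.
        Templates templates = TransformerFactory.newInstance()
                .newTemplates(new StreamSource("per-record.xsl"));

        XMLStreamReader in = XMLInputFactory.newInstance()
                .createXMLStreamReader(new FileInputStream("big-input.xml"));
        StreamResult out = new StreamResult(System.out);

        while (in.hasNext()) {
            if (in.next() == XMLStreamConstants.START_ELEMENT
                    && "record".equals(in.getLocalName())) {
                // Each <record> is handed to the transformer as its own
                // small source; only one record's tree is in memory at once.
                Transformer t = templates.newTransformer();
                t.setOutputProperty(OutputKeys.OMIT_XML_DECLARATION, "yes");
                t.transform(new StAXSource(in), out);
            }
        }
        in.close();
    }
}

Obviously this only works when each record can be transformed without
reference to the rest of the document, which is exactly why the first
question above matters.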

cheers
andrew
