Subject: RE: [xsl] Transforming large XML docs in small amounts of memory
From: "Michael Kay" <mike@xxxxxxxxxxxx>
Date: Mon, 30 Apr 2007 11:04:52 +0100

It depends very much on the nature of the transformation.

Some transformations are amenable to the approach described at

http://www.saxonica.com/documentation/sourcedocs/serial.html

Some can benefit from inserting a SAX filter into the pipeline before the
transformation proper, simply to remove the parts of the input document that
aren't needed. 
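
For example, here is a minimal sketch of such a filter using the standard
SAX XMLFilterImpl, wired into a JAXP transformation. The element name
"audit" and the file names are only placeholders for whatever parts of
your own document aren't needed.

import javax.xml.parsers.SAXParserFactory;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.sax.SAXSource;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;
import org.xml.sax.Attributes;
import org.xml.sax.InputSource;
import org.xml.sax.SAXException;
import org.xml.sax.XMLReader;
import org.xml.sax.helpers.XMLFilterImpl;

// Drops every <audit> element (placeholder name) and its subtree before
// the events reach the transformer, so the tree it builds never holds them.
public class DropElementFilter extends XMLFilterImpl {
    private int depth = 0;   // > 0 while inside a dropped subtree

    public DropElementFilter(XMLReader parent) { super(parent); }

    public void startElement(String uri, String localName, String qName,
                             Attributes atts) throws SAXException {
        if (depth > 0 || "audit".equals(localName)) { depth++; return; }
        super.startElement(uri, localName, qName, atts);
    }

    public void endElement(String uri, String localName, String qName)
            throws SAXException {
        if (depth > 0) { depth--; return; }
        super.endElement(uri, localName, qName);
    }

    public void characters(char[] ch, int start, int len) throws SAXException {
        if (depth == 0) super.characters(ch, start, len);
    }

    public static void main(String[] args) throws Exception {
        SAXParserFactory spf = SAXParserFactory.newInstance();
        spf.setNamespaceAware(true);
        XMLReader filtered =
            new DropElementFilter(spf.newSAXParser().getXMLReader());
        Transformer t = TransformerFactory.newInstance()
                .newTransformer(new StreamSource("transform.xsl"));
        t.transform(new SAXSource(filtered, new InputSource("big-input.xml")),
                    new StreamResult("output.xml"));
    }
}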

Be aware also that tree models differ considerably in the amount of space they
use. The Saxon TinyTree is generally 4-5 times the raw data size, whereas a DOM
is often 10 times the raw data size.
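
To get that benefit you only need to let Saxon read the source itself, for
example by passing a StreamSource through plain JAXP; building a DOM first and
passing a DOMSource forfeits the saving. A minimal sketch (file names are
placeholders):

import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

public class TinyTreeTransform {
    public static void main(String[] args) throws Exception {
        // Saxon's JAXP factory: a StreamSource input is built as a TinyTree,
        // whereas a pre-built DOMSource would typically need far more memory.
        TransformerFactory factory = new net.sf.saxon.TransformerFactoryImpl();
        Transformer t =
            factory.newTransformer(new StreamSource("transform.xsl"));
        t.transform(new StreamSource("big-input.xml"),
                    new StreamResult("output.xml"));
    }
}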

Michael Kay
http://www.saxonica.com/


> -----Original Message-----
> From: Ronan Klyne [mailto:ronan.klyne@xxxxxxxxxxx] 
> Sent: 30 April 2007 10:19
> To: xsl-list@xxxxxxxxxxxxxxxxxxxxxx
> Subject: [xsl] Transforming large XML docs in small amounts of memory
> 
> Hi all,
> 
> I am trying to find ways of reducing the memory requirements 
> of our transforms.
> The main factor driving the memory usage up is the size of 
> the input document (up to about 150MB), but this is out of 
> our control at this point.
> So, the question: Is there anything which can be done (or 
> avoided) in the XSL to decrease the amount of memory used in 
> the transform?
> 
> (I appreciate that this question is very abstract, and I 
> apologise - I'm mostly fishing for ideas, or a confirmation 
> of my suspicion that not much can be done...)
> 
> 	# r
> 
> --
> Ronan Klyne
> Business Collaborator Developer
> Tel: +44 (0)870 163 2555
> ronan.klyne@xxxxxxxxxxx
> www.groupbc.com
