RE: [xsl] Reducing memory overheads with xsl

Subject: RE: [xsl] Reducing memory overheads with xsl
From: Jarno.Elovirta@xxxxxxxxx
Date: Thu, 17 Apr 2003 11:49:59 +0300
Hi,

> I currently have the problem that my system is running out of
> memory when processing my XSLT.  I have set the max memory to
> 526MB (over the 512 I have) in a vain attempt to get the XSL
> processor to use every ounce of memory.  But it is still keeling
> over at the point when I need to work on about 40,000+ tags.

[snip]

> The whole final file should only be around the 6MB mark, and
> I'm creating it in half a gig of RAM, so I have a couple of
> questions.
> 
> 1)  Why does it run out of memory if the file size is 1/64
> of max memory?

Because the processor needs to build the XPath tree in memory, and the objects used in the tree take up more space than the tags in the file (OK, it may not *have to* build the whole tree in memory, but most, if not all, processors do). Still, that shouldn't be a problem if the document is ~6MB. Do you use xsl:sort, or generate RTFs (result tree fragments) in your stylesheet?
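As a rough illustration of why the in-memory tree dwarfs the file on disk, here is a sketch in Python with ElementTree standing in for the processor's internal model. Only the 40,000-element figure comes from the post above; the element names and the size-estimation helper are invented for the demo, and the count is a crude lower bound (it ignores parent links, namespaces, and other per-node bookkeeping a real XSLT processor carries):

```python
import sys
import xml.etree.ElementTree as ET

def rough_tree_bytes(elem):
    # Very rough lower bound on the in-memory size of a subtree:
    # the node object itself, its tag string, text, and attribute dict.
    total = sys.getsizeof(elem) + sys.getsizeof(elem.tag)
    if elem.text:
        total += sys.getsizeof(elem.text)
    total += sys.getsizeof(elem.attrib)
    for child in elem:
        total += rough_tree_bytes(child)
    return total

# Build a document with 40,000 small elements, like the poster's input.
root = ET.Element("root")
for _ in range(40_000):
    ET.SubElement(root, "item").text = "x"

file_bytes = len(ET.tostring(root))  # size of the serialized file
tree_bytes = rough_tree_bytes(root)  # rough in-memory footprint

print(file_bytes, tree_bytes)
```

Even this undercount comes out several times larger than the serialized document, which is why a ~6MB file can balloon once it is a tree of objects.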

> 2)  Is there a way to generate the file in a more
> memory-efficient way?

STX <http://stx.sourceforge.net/>, or write your own SAX filter--XSLT works for transformations like the one mentioned above, but it might not be the best hammer.
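For the SAX-filter route, a minimal sketch (Python's xml.sax here purely for brevity; the element names are invented). It streams events straight to the output, so it never holds more than the current event in memory, no matter how many tags the input has:

```python
import io
import xml.sax
import xml.sax.saxutils

class RenameFilter(xml.sax.ContentHandler):
    """Streams input to output, renaming <item> to <entry> on the fly.
    Attributes are dropped here to keep the sketch short."""

    def __init__(self, out):
        super().__init__()
        self.out = out

    def startElement(self, name, attrs):
        self.out.write("<%s>" % ("entry" if name == "item" else name))

    def endElement(self, name):
        self.out.write("</%s>" % ("entry" if name == "item" else name))

    def characters(self, content):
        self.out.write(xml.sax.saxutils.escape(content))

out = io.StringIO()
xml.sax.parseString(b"<root><item>one</item><item>two</item></root>",
                    RenameFilter(out))
print(out.getvalue())  # <root><entry>one</entry><entry>two</entry></root>
```

For a real job you would parse from a file rather than a byte string, but the point stands: memory use is constant in the document size, unlike a tree-building XSLT processor.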

Cheers,

Jarno - VNV Nation: Genesis

 XSL-List info and archive:  http://www.mulberrytech.com/xsl/xsl-list

