Re: [xsl] Reducing memory overheads with xsl

Subject: Re: [xsl] Reducing memory overheads with xsl
From: "Simon Kelly" <kelly@xxxxxxxxxx>
Date: Thu, 17 Apr 2003 10:59:19 +0200

Thanks Jarno, I'll give it a go :-)

Simon



----- Original Message -----
From: <Jarno.Elovirta@xxxxxxxxx>
To: <xsl-list@xxxxxxxxxxxxxxxxxxxxxx>
Sent: Thursday, April 17, 2003 10:49 AM
Subject: RE: [xsl] Reducing memory overheads with xsl


> Hi,
>
> > I currently have the problem that my system is running out of
> > memory when processing my XSLT.  I have set the max memory to
> > 526MB (over the 512 I have) in a vain attempt to get the XSL
> > processor to use every ounce of memory.  But it is still keeling
> > over at the point when I need to work on about 40,000+ tags.
>
> [snip]
>
> > The whole final file should only be around the 6MB mark, and I'm
> > creating it in half a gig of RAM, so I have a couple of questions.
> >
> > 1)  Why does it run out of memory if the file sizes are 1/64 of
> > max memory?
>
> Because the processor needs to build the XPath tree in memory, and the
> objects used in the tree take up more space than the tags in a file
> (OK, it may not *have to* build the whole tree in memory, but most, if
> not all, processors do). Still, that shouldn't be a problem if the
> document is ~6MB. Do you use xsl:sort, or generate RTFs (result tree
> fragments) in your stylesheet?
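>
> For illustration, something like the following (a hypothetical sketch;
> the element name "item" and attribute "id" are invented) would force
> an XSLT 1.0 processor to materialise nodes in memory: copying into a
> variable creates an RTF, and xsl:sort has to read the whole node set
> before it can emit the first result:
>
>   <xsl:variable name="rtf">
>     <!-- copying nodes into a variable builds an RTF in memory -->
>     <xsl:copy-of select="//item"/>
>   </xsl:variable>
>
>   <xsl:apply-templates select="//item">
>     <!-- sorting needs every matching node before output starts -->
>     <xsl:sort select="@id"/>
>   </xsl:apply-templates>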
>
> > 2)  Is there a way to generate the file in a more memory-efficient
> > way??
>
> STX <http://stx.sourceforge.net/>, or write your own SAX filter. XSLT
> works for transformations like the one mentioned above, but it might
> not be the best hammer.
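>
> A minimal STX sketch of the streaming idea (untested and from memory;
> the element names "record" and "entry" are invented): templates fire
> on SAX events as the input streams past, so the processor never has
> to hold the whole document tree:
>
>   <stx:transform xmlns:stx="http://stx.sourceforge.net/2002/ns"
>                  version="1.0">
>     <!-- rename each record as it streams by, keeping its content -->
>     <stx:template match="record">
>       <entry>
>         <stx:process-children/>
>       </entry>
>     </stx:template>
>   </stx:transform>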
>
> Cheers,
>
> Jarno - VNV Nation: Genesis
>
>  XSL-List info and archive:  http://www.mulberrytech.com/xsl/xsl-list
>
>


 XSL-List info and archive:  http://www.mulberrytech.com/xsl/xsl-list

