[xsl] Reducing memory overheads with xsl

Subject: [xsl] Reducing memory overheads with xsl
From: "Simon Kelly" <kelly@xxxxxxxxxx>
Date: Thu, 17 Apr 2003 09:29:19 +0200
Hi all,

I currently have the problem that my system is running out of memory
when processing my XSLT.  I have set the max memory to 526MB (over the
512MB I physically have) in a vain attempt to give the XSLT processor
every ounce of memory, but it is still keeling over at the point where
it has to work on about 40,000+ tags.

I get this structure from the db:

<rowset>
  <row>
    <timestamp>1</timestamp>
    <sig1>1</sig1>
    <sig2>2</sig2>
    ....
    <sign>n</sign>
  </row>
  ....
  <row>
    <timestamp>N</timestamp>
    <sig1>1</sig1>
    <sig2>2</sig2>
    ....
    <sign>n</sign>
  </row>
</rowset>

and what I need to end up with is a measuring-sequence wrapper element
containing one of the following for each row:

<measuringdata>
  <timestamp>1</timestamp>
  <sigval>
    <label>sig1</label>
    <value>1</value>
  </sigval>
  ....
  <sigval>
    <label>sign</label>
    <value>n</value>
  </sigval>
</measuringdata>
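
For reference, the straight row-by-row mapping I have in mind looks
something like this minimal sketch (element names are taken from the
samples above; the <measuringsequence> wrapper name is only a stand-in
for the real one):

<xsl:stylesheet version="1.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform">

  <!-- wrap all rows; the wrapper element name is a placeholder -->
  <xsl:template match="/rowset">
    <measuringsequence>
      <xsl:apply-templates select="row"/>
    </measuringsequence>
  </xsl:template>

  <!-- one measuringdata per row; every child except the timestamp
       becomes a sigval, with its element name as the label -->
  <xsl:template match="row">
    <measuringdata>
      <xsl:copy-of select="timestamp"/>
      <xsl:for-each select="*[not(self::timestamp)]">
        <sigval>
          <label><xsl:value-of select="name()"/></label>
          <value><xsl:value-of select="."/></value>
        </sigval>
      </xsl:for-each>
    </measuringdata>
  </xsl:template>

</xsl:stylesheet>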

The whole final file should only be around the 6MB mark, and I'm
creating it in half a gig of RAM, so I have a couple of questions.

1)  Why does it run out of memory when the file size is 1/64 of the
maximum memory?
2)  Is there a way to generate the file in a more memory-efficient way?

Your help, as always, would be most appreciated.

Cheers

Simon


Institut fuer
Prozessdatenverarbeitung
und Elektronik,
Forschungszentrum Karlsruhe GmbH,
Postfach 3640,
D-76021 Karlsruhe,
Germany.

Tel: (+49)/7247 82-4042
E-mail : kelly@xxxxxxxxxx



