
Subject: RE: [xsl] performance issues saxon
From: "Michael Kay" <michael.h.kay@xxxxxxxxxxxx>
Date: Mon, 17 Feb 2003 12:34:10 -0000
As I replied to another post on this subject today, some of your options are:

(a) buy lots of memory (at least 10 times the source file size), and
carefully configure the JVM to make sure it is being used

(b) use a SAX filter to break the document up into small pieces before
transformation

(c) use saxon:preview to transform the document one piece at a time

(d) load the data into an XML or SQL database

(e) use STX
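
For option (a), the JVM's heap ceiling is raised with the non-standard
-Xmx flag (one of the -X options mentioned in the original post below).
A minimal sketch, assuming Saxon 6 is on the classpath; the filenames
are placeholders:

```shell
# Give the JVM a 1 GB heap, then run Saxon 6's command-line interface.
# source.xml and style.xsl are hypothetical filenames.
java -Xmx1024m com.icl.saxon.StyleSheet source.xml style.xsl > result.xml
```

Following the rule of thumb in (a), allow a heap of at least 10 times
the source file size.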

Many people have found that saxon:preview works well in this situation.
It's not a feature I'm very fond of (it's rather fragile if you try to
do anything too clever with it), but it does enable you to process large
documents using small amounts of memory, without learning how to write
in Java.
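
Option (b) can be sketched in plain JAXP/SAX, independent of Saxon. The
class below (written in modern Java for brevity) streams the document
and hands each depth-1 element to a callback as a standalone XML string,
so each piece can be transformed on its own. The element names in the
example are hypothetical, and the sketch omits re-escaping of character
data (so it is not safe for content containing & or <):

```java
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;
import javax.xml.parsers.SAXParserFactory;
import org.xml.sax.Attributes;
import org.xml.sax.InputSource;
import org.xml.sax.helpers.DefaultHandler;

// Streams a document and emits each depth-1 element ("record") as its
// own small XML string, so each piece can be transformed separately
// without ever building a tree for the whole document.
public class RecordSplitter extends DefaultHandler {
    private final Consumer<String> onRecord;
    private final StringBuilder buf = new StringBuilder();
    private int depth = 0;

    public RecordSplitter(Consumer<String> onRecord) {
        this.onRecord = onRecord;
    }

    @Override
    public void startElement(String uri, String local, String qName,
                             Attributes atts) {
        depth++;
        if (depth >= 2) { // inside a record: rebuild its markup
            buf.append('<').append(qName);
            for (int i = 0; i < atts.getLength(); i++) {
                buf.append(' ').append(atts.getQName(i)).append("=\"")
                   .append(atts.getValue(i)).append('"');
            }
            buf.append('>');
        }
    }

    @Override
    public void characters(char[] ch, int start, int len) {
        if (depth >= 2) buf.append(ch, start, len); // NB: no re-escaping
    }

    @Override
    public void endElement(String uri, String local, String qName) {
        if (depth >= 2) buf.append("</").append(qName).append('>');
        if (depth == 2) { // a complete record: hand it over and reset
            onRecord.accept(buf.toString());
            buf.setLength(0);
        }
        depth--;
    }

    public static void split(String xml, Consumer<String> onRecord)
            throws Exception {
        SAXParserFactory.newInstance().newSAXParser()
            .parse(new InputSource(new StringReader(xml)),
                   new RecordSplitter(onRecord));
    }

    public static void main(String[] args) throws Exception {
        List<String> pieces = new ArrayList<>();
        split("<root><rec id=\"1\"><v>a</v></rec>"
            + "<rec id=\"2\"><v>b</v></rec></root>", pieces::add);
        System.out.println(pieces.size()); // 2
        System.out.println(pieces.get(0)); // <rec id="1"><v>a</v></rec>
    }
}
```

Each emitted piece would then be passed to the transformer (or written
to a temporary file) one at a time, keeping memory use proportional to
the largest record rather than the whole document.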

Michael Kay
Software AG
home: Michael.H.Kay@xxxxxxxxxxxx
work: Michael.Kay@xxxxxxxxxxxxxx 

> -----Original Message-----
> From: owner-xsl-list@xxxxxxxxxxxxxxxxxxxxxx 
> [mailto:owner-xsl-list@xxxxxxxxxxxxxxxxxxxxxx] On Behalf Of 
> Vasu Chakkera
> Sent: 17 February 2003 11:37
> To: xsl-list@xxxxxxxxxxxxxxxxxxxxxx
> Subject: [xsl] performance issues saxon
> Hi all,
> I have a bit of a problem running Saxon on my XML, which is as
> huge as 250MB (monster markup language :) ). The
> transformer fails as it runs out of memory. Are there any
> suggestions for situations like this? The XML is designed by a
> different team, and I want to look into it to see if
> there are any ways of optimising. I have also looked at the -X
> options of Java to get round the out-of-memory exception. While I
> do this, it would be quite helpful if the gurus here could give me
> some tips on how to deal with situations like this. I am looking
> at ways to reduce the size of the file at the moment.
> Thanks a lot, Vasu