Subject: RE: Improving performance for huge files with XSLT
From: Kay Michael <Michael.Kay@xxxxxxx>
Date: Wed, 13 Sep 2000 15:00:22 +0100

Yes, it's a problem. Every XSLT processor builds a tree representation of the
source document in memory, and this tree will often occupy about 100 bytes per
node, so a document with 100,000 nodes can easily need on the order of 10 MB
just for the tree.

One solution, if you only need to access part of the data, is to write a SAX
filter that subsets the document on its way into the XSLT processor.
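
To make the idea concrete, a minimal sketch of such a filter in Java (SAX plus
the TrAX/JAXP transformation API) might look roughly like this. The element
name "history", the file names, and the stylesheet are placeholder assumptions,
not anything from the original question: the filter simply drops whole subtrees
before they reach the processor, so the in-memory tree only contains the nodes
the stylesheet actually needs.

import javax.xml.parsers.SAXParserFactory;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.sax.SAXSource;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;
import org.xml.sax.Attributes;
import org.xml.sax.InputSource;
import org.xml.sax.SAXException;
import org.xml.sax.XMLReader;
import org.xml.sax.helpers.XMLFilterImpl;

/** SAX filter that discards every subtree rooted at a named element,
 *  so the XSLT processor never has to build those nodes. */
public class SubsetFilter extends XMLFilterImpl {
    private final String skipElement;   // local name of subtrees to discard
    private int skipDepth = 0;          // > 0 while inside a discarded subtree

    public SubsetFilter(XMLReader parent, String skipElement) {
        super(parent);
        this.skipElement = skipElement;
    }

    @Override
    public void startElement(String uri, String localName, String qName,
                             Attributes atts) throws SAXException {
        if (skipDepth > 0 || localName.equals(skipElement)) {
            skipDepth++;                       // entering or nested in a skipped subtree
        } else {
            super.startElement(uri, localName, qName, atts);
        }
    }

    @Override
    public void endElement(String uri, String localName, String qName)
            throws SAXException {
        if (skipDepth > 0) {
            skipDepth--;
        } else {
            super.endElement(uri, localName, qName);
        }
    }

    @Override
    public void characters(char[] ch, int start, int length) throws SAXException {
        if (skipDepth == 0) {
            super.characters(ch, start, length);   // only forward text we keep
        }
    }

    public static void main(String[] args) throws Exception {
        SAXParserFactory spf = SAXParserFactory.newInstance();
        spf.setNamespaceAware(true);
        XMLReader reader = spf.newSAXParser().getXMLReader();

        // "history" stands in for whatever part of the document the
        // stylesheet does not need.
        SubsetFilter filter = new SubsetFilter(reader, "history");

        Transformer t = TransformerFactory.newInstance()
                .newTransformer(new StreamSource("transform.xsl"));
        t.transform(new SAXSource(filter, new InputSource("big-input.xml")),
                    new StreamResult("output.xml"));
    }
}

The same approach works with any TrAX-compliant processor, because the
subsetting happens before the source tree is ever built.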

Saxon has an extension, <saxon:preview>, which processes the document one
subtree at a time. This is rather messy, but it can sometimes help.

Sebastian Rahtz has published some performance comparisons for various
processors on large XML files.

Mike Kay

> -----Original Message-----
> From: Ornella Piva [mailto:Ornella.Piva@xxxxxx]
> Sent: 13 September 2000 08:14
> To: XSL-List@xxxxxxxxxxxxxxxx
> Subject: Improving performance for huge files with XSLT
> 
> 
> Hi,
> I'm using XSLT to convert XML files into other XML files. I have to
> convert huge XML files (containing, e.g., 50,000/100,000 nodes), but
> performance is becoming a real problem: it takes more or less 20 minutes
> to convert a file with 100,000 nodes.
> Are there any general methods to improve performance with huge XML
> files? Has anybody encountered the same problem? How did you solve it?
> 
> Thanks,
> Ornella Piva
> 


 XSL-List info and archive:  http://www.mulberrytech.com/xsl/xsl-list

