Subject: RE: Improving performance for huge files with XSLT
From: Thorbjørn Ravn Andersen <TRA@xxxxxxxx>
Date: Wed, 13 Sep 2000 10:54:07 +0200

> -----Original Message-----
> From: Ornella Piva [mailto:Ornella.Piva@xxxxxx]
> Sent: Wednesday, 13 September, 2000 09:14
> To: XSL-List@xxxxxxxxxxxxxxxx
> Subject: Improving performance for huge files with XSLT
> 
> 
> Hi,
> I'm using XSLT to convert XML files into other XML files. I have to
> convert huge XML files (containing, e.g., 50,000 to 100,000 nodes),
> but performance is becoming a real problem: it takes more or less
> 20 minutes to convert a file with 100,000 nodes.
> Are there any general methods to improve performance with huge XML
> files? Did somebody encounter the same problem? How did you solve it?

It depends on where the bottleneck is.  Most XSLT processors build the whole document tree in memory and therefore need *lots* of it, so first check whether your computer is swapping.
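For example, on a Unix box you can watch swap activity with vmstat while the transformation runs, and give a Java-based processor a bigger heap with the -Xmx flag.  This is only a sketch: flag spellings vary between JVMs, and "MyDriver" stands in for whatever driver class your processor actually uses:

  vmstat 5
  java -Xmx256m MyDriver input.xml style.xsl output.xml

If a bigger heap makes the 20 minutes collapse, memory was your problem.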

If you have sufficient memory, see whether faster alternatives are available.  For Java-based transformations, James Clark's "XT" (www.jclark.com) is most likely the fastest, though not 100% compliant.  I have yet to find a reasonable processor written in C, although Xalan-C++ might work well for you.
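If you try XT, it is invoked from the command line roughly like this (the driver class name is taken from the XT documentation; check the README that ships with your copy):

  java com.jclark.xsl.sax.Driver input.xml style.xsl output.xml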

Could you perhaps provide more details?

-- 
  Thorbjørn Ravn Andersen             "...and...Tubular Bells!"
  http://bigfoot.com/~thunderbear


 XSL-List info and archive:  http://www.mulberrytech.com/xsl/xsl-list

