RE: Improving performance for huge files with XSLT

From: Don Bruey <dbruey@xxxxxxxxxxxxxxxxxxxxx>
Date: Fri, 15 Sep 2000 07:18:31 -0400
>I'm using XSLT to convert XML files into other XML files. I have to
>convert huge XML files (containing, e.g., 50,000-100,000 nodes), but
>performance is becoming a real problem: it takes roughly 20 minutes
>to convert a file with 100,000 nodes.
>Are there any general methods to improve performance with huge XML
>files? Has anybody encountered the same problem? How did you solve it?

If you have control over the contents of the XML (you may not), I have found
that eliminating "pretty" output such as tabs and newlines, and shortening
tag names, has saved me many megabytes in some output.  For example, changing
<Description> to <Dsc> and squashing all the elements together with no spaces
between them cut one output file down from 87M to 37M.  That's 50M of
whitespace and unnecessary data that the parser doesn't have to deal with.
If you don't have control over your XML data, perhaps you could find a
utility to strip the whitespace first, or do replace operations on the long
tag names; a simple text preprocessor will usually run faster than a full
XML parse.
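If you have an XSLT processor handy for the preprocessing step, a minimal
identity transform can do the whitespace stripping for you. This is just a
sketch of the standard XSLT 1.0 idiom, not anything specific to the poster's
setup:

```xml
<?xml version="1.0"?>
<!-- Whitespace-stripping identity transform (XSLT 1.0 sketch). -->
<xsl:stylesheet version="1.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <!-- Drop whitespace-only text nodes from every element. -->
  <xsl:strip-space elements="*"/>
  <!-- Emit the result without any pretty-printing. -->
  <xsl:output method="xml" indent="no"/>
  <!-- Identity rule: copy everything else through unchanged. -->
  <xsl:template match="@*|node()">
    <xsl:copy>
      <xsl:apply-templates select="@*|node()"/>
    </xsl:copy>
  </xsl:template>
</xsl:stylesheet>
```

Run the big file through this once, then feed the smaller output to your real
stylesheet.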
Using keys has, in some instances, made multiple passes through large XML
files a lot faster.  I have one stylesheet where I do multiple passes
through the same data, and keys helped a great deal with sorting.
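As a sketch of the technique (the element and attribute names here are made
up for illustration), declare a key once and use key() for lookups instead of
rescanning the whole tree with a // path on every match:

```xml
<?xml version="1.0"?>
<!-- Sketch: index item elements by their id attribute (names illustrative). -->
<xsl:stylesheet version="1.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <!-- Build an index over the document once, up front. -->
  <xsl:key name="item-by-id" match="item" use="@id"/>

  <xsl:template match="ref">
    <!-- Key lookup, instead of //item[@id = current()/@target],
         which would walk the entire tree for every ref. -->
    <xsl:value-of select="key('item-by-id', @target)"/>
  </xsl:template>
</xsl:stylesheet>
```

With 100,000 nodes, turning each cross-reference lookup from a full-tree scan
into a key lookup is often the single biggest win.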