Re: [xsl] Processing a huge XML document

Subject: Re: [xsl] Processing a huge XML document
From: cagle@xxxxxxxxx (Kurt Cagle)
Date: Sun, 29 Jul 2001 20:34:08 -0700

The answer really depends upon how much memory you have. If you can load it
all into a DOM and minimize any other processes, then the DOM calls will
be much faster, and the XSLT easier to manage. On the other hand, you need
some heavy-duty memory (figure 256 MB system RAM minimum) to avoid paging.
If the process is forced to page to disk, then I suspect that SAX calls might
be faster, because you won't be thrashing your virtual memory.

Your best solution, however, might be to split the XML so that it is
spread across multiple documents. You could then process them sequentially
through a series of document() calls within the XSLT itself. This does
involve multiple XML file opens and closes, but it will still be
faster (and easier to code) than trying to replicate XSLT's behavior with
raw SAX calls.
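As a rough sketch of that split-document approach (the file names parts.xml, part1.xml, etc. are hypothetical, not from the original post): a small index document lists the chunks, and the stylesheet pulls each one in via document(), which resolves each relative URI against the index document's base URI:

```xml
<!-- parts.xml: hypothetical index listing the split chunks -->
<parts>
  <part href="part1.xml"/>
  <part href="part2.xml"/>
</parts>

<!-- stylesheet sketch: iterate over the index, loading each chunk -->
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:template match="/parts">
    <result>
      <!-- document(@href) loads each referenced file as its own
           source tree, so only one chunk's templates run at a time -->
      <xsl:for-each select="part">
        <xsl:apply-templates select="document(@href)/*"/>
      </xsl:for-each>
    </result>
  </xsl:template>
</xsl:stylesheet>
```

Note that most XSLT 1.0 processors still build a full tree per document() call, so the win comes from each chunk being small enough to fit in RAM, not from streaming.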

-- Kurt Cagle
----- Original Message -----
From: "Mahesh V Kondwilkar" <maheshk@xxxxxxxxxxx>
To: <xsl-list@xxxxxxxxxxxxxxxxxxxxxx>
Sent: Sunday, July 29, 2001 4:28 PM
Subject: [xsl] Processing a huge XML document

> Hi,
> I am quite puzzled as to which method would be faster to transform
> a 50 MB XML file: XSLT, or SAX callbacks.
> Can someone help me please? Currently the whole process takes me
> about half an hour.
> Thanks
> Regards,
> Mahesh
> --
> Mahesh Kondwilkar (maheshk@xxxxxxxxxxx,
> Kansas State University)
>  XSL-List info and archive:
