Re: [xsl] Processing large XML Documents [> 50MB]

From: Mike Odling-Smee <mike.odlingsmee@xxxxxxxxxxxxxx>
Date: Wed, 24 Feb 2010 13:20:43 +0000
>> a) assemble around 30-40 XML documents [each with a common
>> header and its own lines] into one single XML document, with
>> the common header and all the lines
>> b) Update the assembled document in specific locations
>> c) generate multiple XML document fragments from the huge XML
>> document based on query criteria. Each XML fragment is created
>> by mapping specific fields in the big document. Each document
>> is created for a specific key element value in the huge document.

You could take a multi-pass approach which would not be too dissimilar
to what you do currently:

First pass
a) A variation on your step a): create a single, more lightweight XML
file that contains only the necessary meta-data and resolved
relationships from your 30-40 XML documents (equivalent to a database
view, if you like)
b) As before, but only briefly load the XML that is pertinent to each
update, using the results from step a) as a map
c) Again as before, but only load the XML that is required, or
alternatively just use the more lightweight XML created in step a) if
it contains all the information you need
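The "first pass" above might look something like the following sketch. The element names (orders/order/header/line) are hypothetical placeholders for whatever your real vocabulary is:

```xml
<!-- Sketch only: extracts a lightweight "index" view from the full
     document. Element names here are assumptions, not your schema. -->
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:template match="/">
    <index>
      <!-- Keep only the keys and counts needed to locate data later -->
      <xsl:for-each select="/orders/order">
        <entry key="{header/@id}" lines="{count(line)}"/>
      </xsl:for-each>
    </index>
  </xsl:template>
</xsl:stylesheet>
```

Later passes would then consult this small index document to decide which of the 30-40 source files actually need to be loaded, rather than assembling everything into one huge document up front.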

Whether or not this approach will work for you depends on your
detailed requirements.

Alternatively (if you have not already tried it) you might want to try
to optimise the performance of your existing XSLT - for instance by
using xsl:key.
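For instance, a cross-document scan with // on every lookup is often the bottleneck; declaring an xsl:key builds the index once and makes each lookup cheap. A minimal sketch, assuming hypothetical element and attribute names:

```xml
<!-- Index every line element by its header-id attribute, once -->
<xsl:key name="lines-by-header" match="line" use="@header-id"/>

<xsl:template match="header">
  <!-- key() replaces a repeated //line[@header-id = current()/@id] scan -->
  <xsl:apply-templates select="key('lines-by-header', @id)"/>
</xsl:template>
```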

Kind regards,

