[xsl] Optimising multiple document() calls

Subject: [xsl] Optimising multiple document() calls
From: "Hondros, Constantine" <Constantine.Hondros@xxxxxxxxxxxxxxxx>
Date: Mon, 11 Jul 2005 10:43:31 +0200
I am pre-processing batches of about 1000 XML files at a time using Saxon.
Part of the pre-process involves aggregating linked XML documents into the
current document. Naturally, I use the document() function for this:

	<xsl:template match="table-inclusion">
	  <!-- Do some cunning index/key lookup to get the
	       inclusion's path (not shown) -->
	  <xsl:apply-templates select="document($inclusionpath)//incltable" />
	</xsl:template>

The node-set returned by document() then gets normalised by a general
identity-template rule:

	<xsl:template match="@*|node()">
	  <xsl:copy>
		<xsl:apply-templates select="@*|node()"/>
	  </xsl:copy>
	</xsl:template>

The problem is that this runs so slowly it is jeopardising the whole
processing pipeline. Some of the documents have five or six inclusions to
process, and since the inclusions themselves are not massive, I assume the
bottleneck is in how I am pulling them in (i.e. the document() function).

How would you optimise this? Would a deep copy with <xsl:copy-of> be faster?
Or am I better off writing my own processor for this aggregation step (easy
enough)?
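
For reference, the deep-copy variant I have in mind would look something
like this (just a sketch; it assumes the same $inclusionpath lookup as
above, and that the included tables can be copied verbatim rather than
being pushed through the identity template):

	<xsl:template match="table-inclusion">
	  <!-- Same key lookup for $inclusionpath as above (not shown) -->
	  <!-- Deep-copy the included tables verbatim, skipping the
	       node-by-node identity transform -->
	  <xsl:copy-of select="document($inclusionpath)//incltable" />
	</xsl:template>

The trade-off, as I understand it, is that <xsl:copy-of> cannot apply any
per-node template logic to the copied content.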

Thanks in advance.

