Subject: Re: [xsl] Processing node-sets in batches
From: Liam R E Quin <liam@xxxxxx>
Date: Mon, 08 Mar 2010 15:40:21 -0500
On Sun, 2010-03-07 at 15:50 -0800, Jeff Hooker wrote:

> I'm trying to create a reference table for a ridiculously large
> document. The resulting table is sufficiently huge that it's causing
> java memory shortages when I try to churn it into XML because of the
> degree of recursion in the table processing scripts, so I'm trying to
> read all of the nodes into a node-set() and process them out into a
> series of 100-row tables.

Sounds like you are, for some reason, using XSLT 1 here. When XSLT 1
was being developed in the 1990s, 50 megabytes was considered a
large XML document...
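
For what it's worth, XSLT 2.0 handles this kind of batching directly with
xsl:for-each-group. A rough sketch, assuming the rows are literally called
row and that each batch of 100 should become its own table:

    <xsl:for-each-group select="row"
        group-adjacent="(position() - 1) idiv 100">
        <table>
            <!-- current-group() is one batch of up to 100 rows -->
            <xsl:apply-templates select="current-group()"/>
        </table>
    </xsl:for-each-group>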

> I'm sure that there's a simple way of doing this; I'm also sure that
> I've abjectly failed to find it. Are there any good samples of
> processing node-sets() in batches out there? The main issue that I'm
> having is closing off a table at regular intervals and starting a new
> one; all of my tables currently end up nested within each other.

You need to think not in terms of opening and closing tags, but in
terms of grouping a set of items together, and processing all the
items in that group. Maybe that's what you mean by the node-set(),
although just using a for-each can work too. You could even do:
    <xsl:for-each select="row[position() mod 100 = 1]">
        <!-- this row plus the next 99 siblings make one batch of 100 -->
        <xsl:apply-templates
            select=". | following-sibling::row[position() &lt; 100]" />
    </xsl:for-each>
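
If the sticking point is getting each batch into its own table rather than
a nest of tables, the table wrapper can go inside the for-each. A rough
sketch, assuming an HTML-style table element (adjust for whatever table
model you are generating) and that all the rows are siblings:

    <xsl:for-each select="row[position() mod 100 = 1]">
        <table>
            <!-- each batch of 100 rows gets its own table -->
            <xsl:apply-templates
                select=". | following-sibling::row[position() &lt; 100]" />
        </table>
    </xsl:for-each>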

Liam


-- 
Liam Quin - XML Activity Lead, W3C, http://www.w3.org/People/Quin/
Pictures from old books: http://fromoldbooks.org/
Ankh: irc.sorcery.net irc.gnome.org www.advogato.org
