Subject: Re: [xsl] slow xsltproc XInclude processing w/complex document?
From: Mike Trotman <mike.trotman@xxxxxxxxxxxxx>
Date: Tue, 06 Jul 2004 23:42:37 +0100
5854 <item href="out/tm_3407901268.xml"/>
5855 <item href="out/tm_3408001268.xml"/>
5856 </LIST>
Windows Version

U:\jobs\ISIS\DataTranslator>xsltproc -V
Using libxml 20610, libxslt 10107 and libexslt 805
xsltproc was compiled against libxml 20610, libxslt 10107 and libexslt 805
libxslt 10107 was compiled against libxml 20610
libexslt 805 was compiled against libxml 20610
Linux Versions

[miket@dl02 DataTranslator]$ xsltproc -V
Using libxml 20507, libxslt 10024 and libexslt 715
xsltproc was compiled against libxml 20501, libxslt 10024 and libexslt 715
libxslt 10024 was compiled against libxml 20501
libexslt 715 was compiled against libxml 20501
[miket@dl02 DataTranslator]$
cpan> m XML::LibXML
Module id = XML::LibXML
    DESCRIPTION  Interface to the libxml library
    CPAN_USERID  PHISH (Christian Glahn <christian.glahn@xxxxxxxxxx>)
    CPAN_VERSION 1.58
    CPAN_FILE    P/PH/PHISH/XML-LibXML-1.58.tar.gz
    DSLI_STATUS  RmhO (released,mailing-list,hybrid,object-oriented)
    MANPAGE      XML::LibXML - Perl Binding for libxml2
    INST_FILE    /usr/lib/perl5/site_perl/5.6.1/i386-linux/XML/LibXML.pm
    INST_VERSION 1.58
cpan> m XML::LibXSLT
Module id = XML::LibXSLT
    CPAN_USERID  MSERGEANT (Matt Sergeant <matt@xxxxxxxxxxxx>)
    CPAN_VERSION 1.57
    CPAN_FILE    M/MS/MSERGEANT/XML-LibXSLT-1.57.tar.gz
    MANPAGE      XML::LibXSLT - Interface to the gnome libxslt library
    INST_FILE    /usr/lib/perl5/site_perl/5.6.1/i386-linux/XML/LibXSLT.pm
    INST_VERSION 1.57
Hi Paul,
I've been running some tests on a document that includes nested
XInclude directives. The document is complex: upwards of 1500 files,
nested to a depth of up to 4 levels. Total size of content is about
4.8MB.
As you probably know, an XSLT processor holds the documents that it loads in memory. You haven't said how much memory your machines have, but it might simply be that holding all those files (plus the flattened result document) in memory is causing problems.
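To make that point concrete, here is a hypothetical sketch (not from the thread; file names invented) using Python's standard library, which resolves XInclude the same way xsltproc does: the entire flattened tree is built in memory, so 1500 included files mean 1500 files' worth of tree in RAM at once.

```python
# Hypothetical illustration, not code from the thread: flatten an
# xi:include document in memory with Python's stdlib ElementInclude.
import os
import tempfile
import xml.etree.ElementTree as ET
from xml.etree import ElementInclude

with tempfile.TemporaryDirectory() as d:
    # one "leaf" file, and a LIST document that xi:includes it
    with open(os.path.join(d, "tm_0001.xml"), "w") as f:
        f.write("<tm>payload</tm>")
    with open(os.path.join(d, "list.xml"), "w") as f:
        f.write('<LIST xmlns:xi="http://www.w3.org/2001/XInclude">'
                '<xi:include href="tm_0001.xml"/></LIST>')

    root = ET.parse(os.path.join(d, "list.xml")).getroot()
    # resolve hrefs relative to the temp directory; the default loader
    # resolves them relative to the current working directory instead
    ElementInclude.include(
        root,
        loader=lambda href, parse, encoding=None:
            ET.parse(os.path.join(d, href)).getroot())

flattened = ET.tostring(root, encoding="unicode")
print(flattened)  # the xi:include element is replaced by <tm>payload</tm>
```

With thousands of leaves the `root` tree grows to hold every included document at once, which is exactly the memory pressure described above.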
Another thing is that you're traversing every single node in those documents. Every node visit takes time, because the processor has to work out what to do with the node, so cutting down the node visits would be good. You could try, for example, changing the identity template that you're using at the moment for:
<xsl:template match="node()">
  <xsl:copy-of select="." />
</xsl:template>
and:

<xsl:template match="*[.//xi:include]">
  <xsl:copy>
    <xsl:copy-of select="@*" />
    <xsl:apply-templates />
  </xsl:copy>
</xsl:template>
Overall, though, if speed is an issue, you would be much better off using a SAX Filter to do the transformation: that way you wouldn't be storing the documents in memory, each node would have to be visited only once, and the output can stream out (if that helps). Mind you, you say that the XInclude resolution is only a test, so perhaps you can't do your real transformation using SAX...
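For what it's worth, the streaming idea can be sketched as follows (hypothetical code, not from the thread; file and element names invented): a SAX filter that replaces each xi:include element with the parsed events of its target file, so the flattened document is streamed out without ever being built as a tree.

```python
# Hypothetical sketch of a SAX filter that resolves xi:include on the
# fly. Each include element is swallowed and the referenced file's
# events are spliced into the stream in its place, recursively.
import io
import os
import tempfile
import xml.sax
from xml.sax import handler
from xml.sax.saxutils import XMLFilterBase, XMLGenerator

XI_NS = "http://www.w3.org/2001/XInclude"

class XIncludeFilter(XMLFilterBase):
    """Forward SAX events unchanged, except that each xi:include element
    is replaced by the (recursively filtered) events of its target."""

    def __init__(self, parent, base_dir="."):
        super().__init__(parent)
        self._base_dir = base_dir
        self._skip = 0  # element depth inside an xi:include being replaced

    def _splice(self, href):
        path = os.path.join(self._base_dir, href)
        sub = xml.sax.make_parser()
        sub.setFeature(handler.feature_namespaces, True)
        inner = _InnerFilter(sub, base_dir=os.path.dirname(path) or ".")
        inner.setContentHandler(self.getContentHandler())
        inner.parse(path)

    def startElementNS(self, name, qname, attrs):
        if self._skip:                    # already inside a skipped include
            self._skip += 1
            return
        if name == (XI_NS, "include"):
            self._splice(attrs.get((None, "href")))
            self._skip = 1
            return
        super().startElementNS(name, qname, attrs)

    def endElementNS(self, name, qname):
        if self._skip:
            self._skip -= 1
        else:
            super().endElementNS(name, qname)

    def characters(self, content):
        if self._skip == 0:
            super().characters(content)

class _InnerFilter(XIncludeFilter):
    """Drop startDocument/endDocument so an included file's events splice
    cleanly into the middle of the outer document's event stream."""
    def startDocument(self):
        pass
    def endDocument(self):
        pass

if __name__ == "__main__":
    with tempfile.TemporaryDirectory() as d:
        with open(os.path.join(d, "leaf.xml"), "w") as f:
            f.write('<tm id="3407901268">leaf text</tm>')
        with open(os.path.join(d, "main.xml"), "w") as f:
            f.write('<LIST xmlns:xi="http://www.w3.org/2001/XInclude">'
                    '<item><xi:include href="leaf.xml"/></item></LIST>')
        out = io.StringIO()
        parser = xml.sax.make_parser()
        parser.setFeature(handler.feature_namespaces, True)
        filt = XIncludeFilter(parser, base_dir=d)
        filt.setContentHandler(XMLGenerator(out))
        filt.parse(os.path.join(d, "main.xml"))
        print(out.getvalue())  # flattened document streamed to `out`
```

Only the current chain of open files is active at any moment, so memory stays roughly proportional to nesting depth (4 levels here) rather than to the 4.8MB total. A real XInclude processor would also handle xi:fallback, xpointer, text includes and base-URI fixup, which this sketch deliberately ignores.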
Cheers,
Jeni
--
Jeni Tennison
http://www.jenitennison.com/
--~------------------------------------------------------------------
XSL-List info and archive: http://www.mulberrytech.com/xsl/xsl-list
To unsubscribe, go to: http://lists.mulberrytech.com/xsl-list/
or e-mail: <mailto:xsl-list-unsubscribe@xxxxxxxxxxxxxxxxxxxxxx>
--~--
--
Datalucid Limited
8 Eileen Road, South Norwood, London SE25 5EJ, United Kingdom
tel: 0208-239-6810   mob: 0794-725-9760
email: mike.trotman@xxxxxxxxxxxxx
UK Co. Reg: 4383635   VAT Reg.: 798 7531 60