[no subject]

From: Bill Cohagan <bill.cohagan@xxxxxxxxxxxxx>
Date: Sun, 8 Sep 2002 06:56:27 -0500
Juggy writes:
>>>>>>>>>>>>>>>>>>>>
Date:	Sun, 08 Sep 2002 12:39:28 +0200
From:	juggy@xxxxxxx
Subject:	[xsl] speed questions

Hi there,

I have an XML dictionary file with about 95,000 entries, 20 megabytes in
size. Due to its nature I need to search on different criteria
(languages, substring matching, ...) and I intend to use XSL for it.
Now, judging from my latest experiments, I wonder whether XML/XSL is a
good choice for implementing such a thing, since, as far as I understand
it, the file is read in its entirety each time I invoke the XSLT
processor. And since this file is so big, I wonder whether that is
efficient. I also thought about generating separate, smaller XML files
holding additional statistical data, preprocessed with another
stylesheet, in order to save some time, but I am not sure whether this
would be useful.
<snip>
>>>>>>>>>>>>>>>>>>>>>>>

We use msxml with a VB app as the "driver", and in that environment it is
NOT necessary to reload the XML input file each time. We just load it once
and then reuse the DOM across multiple XSLT transforms.
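
A minimal sketch of that pattern in VBScript, assuming MSXML 4.0 (the
file names dictionary.xml and search.xsl, and the stylesheet parameter
"query", are made up for illustration):

  ' Load the large source document once and keep the DOM around.
  Dim source
  Set source = CreateObject("MSXML2.DOMDocument.4.0")
  source.async = False
  source.load "dictionary.xml"

  ' Compile the stylesheet once; XSLTemplate needs a free-threaded DOM.
  Dim style
  Set style = CreateObject("MSXML2.FreeThreadedDOMDocument.4.0")
  style.async = False
  style.load "search.xsl"

  Dim template
  Set template = CreateObject("MSXML2.XSLTemplate.4.0")
  Set template.stylesheet = style

  ' Each search creates a cheap processor over the already-loaded DOM;
  ' neither the big document nor the stylesheet is parsed again.
  Dim proc
  Set proc = template.createProcessor()
  proc.input = source
  proc.addParameter "query", "example"   ' assumes an xsl:param named query
  proc.transform
  WScript.Echo proc.output

The expensive parse of the 20MB file is paid once at startup; each
subsequent search pays only for the transform itself.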

My experience is that the single biggest contributor to speed is the use of
xsl:key. We've seen speedups on the order of 50x by defining keys. My guess
is that, with a 20MB XML file, you will likely see some long processing
times if you don't use xsl:key.
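
For illustration, a stylesheet fragment along these lines (the element
names entry and headword are invented; substitute whatever your
dictionary vocabulary actually uses):

  <xsl:stylesheet version="1.0"
      xmlns:xsl="http://www.w3.org/1999/XSL/Transform">

    <xsl:param name="query"/>

    <!-- Index every entry by its headword. The processor builds the
         index once per document, so repeated lookups are cheap. -->
    <xsl:key name="by-headword" match="entry" use="headword"/>

    <!-- key() jumps straight to the matching entries instead of
         scanning all 95,000 of them the way an expression like
         //entry[headword = $query] would. -->
    <xsl:template match="/">
      <xsl:copy-of select="key('by-headword', $query)"/>
    </xsl:template>

  </xsl:stylesheet>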

Regards,
 Bill


 XSL-List info and archive:  http://www.mulberrytech.com/xsl/xsl-list

