Subject: Re: Is XSL suitable for batch processing?
From: Paul Tchistopolskii <paul@xxxxxxx>
Date: Fri, 18 Aug 2000 11:10:26 -0700
> I'm currently writing an application that has to write settlement files for
> various banks, the input data for each can easily be expressed as an XML
> file but the format of file for each bank is slightly different. This sounds
> like a situation where having a different XSL file for each bank could work.
> The only thing is I'm not sure that XSL will scale to large files. 

Standard XSL will not scale to large files, because typically:

1. the entire XML file is loaded into memory.
2. the entire result tree is constructed in memory.
3. the result tree is dumped out.

> In its full glory the output tree can be created in a piecemeal fashion, my
> transformations are so simple that they could easily work in a sequential
> way. Does anyone have any experience of using XSL with large files? 

> Should I  try the XSL route, or should I limit myself to writing a hook into a SAX
> parser for each bank.
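To make the SAX alternative concrete, here is a minimal sketch of such a hook in Python's xml.sax. The input format (`<transactions>`/`<txn>` elements, `account` and `amount` attributes) and the output line format are invented for illustration; the point is that each record is written out as it is parsed, so memory use stays constant no matter how large the file is.

```python
# Sketch of the 'SAX hook' approach: stream the input and emit one
# settlement line per record, without ever building a full tree.
# Element/attribute names here are hypothetical.
import io
import xml.sax

class SettlementHandler(xml.sax.ContentHandler):
    def __init__(self, out):
        self.out = out

    def startElement(self, name, attrs):
        if name == "txn":
            # Each bank's slightly different format would be handled here.
            self.out.write("%s;%s\n" % (attrs["account"], attrs["amount"]))

sample = (b'<transactions>'
          b'<txn account="123" amount="10.50"/>'
          b'<txn account="456" amount="7.00"/>'
          b'</transactions>')
out = io.StringIO()
xml.sax.parse(io.BytesIO(sample), SettlementHandler(out))
print(out.getvalue())
```

One handler class per bank (or one class parameterized per bank) keeps the per-bank differences in one place, much like one XSL file per bank would.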

Before dropping the 'XSL route', I suggest you try SAXON - it has an 
extension element which allows processing the XML 
file 'section-by-section'. This workaround is not standard XSLT, 
of course; it is a hack, done by Michael. 

So the answer is actually simple. If for some religious 
reasons you refuse to use 'non-conformant tools' 
and 'non-conformant solutions' - you can *not* use 
XSLT for this task. And there is no sign that the next 
versions of the XSLT specification will do better here.

If you care about getting the job done - take SAXON 
( the only tool which supports 'section-by-section' 
XML processing ) - and do it in 'SAXON XSLT'.
 
Rgds.Paul.

PS. Pipes have absolutely *nothing* to do with this 
'everything should be in memory' limitation of XSLT.




 XSL-List info and archive:  http://www.mulberrytech.com/xsl/xsl-list