Subject: RE: [xsl] Large XML Files
From: "Michael Kay" <michael.h.kay@xxxxxxxxxxxx>
Date: Mon, 7 Jan 2002 16:05:40 -0000
> Does anyone know of an XSLT processor which will not read in all of the
> XML input at once? I read somewhere in the archives that Saxon has the
> ability to read in only a subtree at a time.

Saxon's <saxon:preview> element basically allows you to transform a sub-tree as soon as it has been read, and then discard it from memory.

> Are there any other XSLT processors that can do this?

Not directly, but what you can do is write a SAX filter application that sits between the XML parser and the XSLT processor, so that the filter effectively breaks up the large document into lots of small ones and transforms each small document as soon as it has been read.

> The W3C Candidate Recommendation called XML Fragment Interchange at
> <http://www.w3.org/TR/xml-fragment> addresses this issue.

I don't think that proposal is relevant (or at least, I haven't understood its relevance!)

Mike Kay

XSL-List info and archive: http://www.mulberrytech.com/xsl/xsl-list
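[A minimal sketch of the SAX-filter idea described above, in Python's standard-library `xml.sax` for brevity; the same pattern applies with a Java SAX filter in front of Saxon. The element name `record` and the `process` callback are assumptions for illustration — in practice `process` would hand each small document to an XSLT processor. Attributes and namespaces are ignored to keep the sketch short.]

```python
import xml.sax
import xml.sax.saxutils

class RecordFilter(xml.sax.ContentHandler):
    """Collects the SAX events for each <record> subtree, rebuilds it as a
    small standalone document, and hands it to `process` as soon as the
    subtree's end tag is seen -- so only one subtree is in memory at a time.
    (Illustrative only: attributes and namespaces are not handled.)"""

    def __init__(self, process):
        self.process = process   # called once per completed small document
        self.parts = None        # None = we are outside any <record>
        self.depth = 0

    def startElement(self, name, attrs):
        if name == "record":     # assumed element name marking each fragment
            self.parts = []
            self.depth = 0
        if self.parts is not None:
            self.depth += 1
            self.parts.append("<%s>" % name)

    def characters(self, content):
        if self.parts is not None:
            self.parts.append(xml.sax.saxutils.escape(content))

    def endElement(self, name):
        if self.parts is not None:
            self.parts.append("</%s>" % name)
            self.depth -= 1
            if self.depth == 0:  # the <record> subtree is complete
                self.process("".join(self.parts))
                self.parts = None  # discard it from memory

# Usage: each <record> arrives as its own small document.
results = []
xml.sax.parseString(
    b"<log><record><id>1</id></record><record><id>2</id></record></log>",
    RecordFilter(results.append),
)
```

After parsing, `results` holds two independent documents (`<record><id>1</id></record>` and `<record><id>2</id></record>`), each of which could be transformed and discarded before the next one is read.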