Subject: RE: [xsl] Splitting file into N-sized chunks
From: "Chris Cosentino (ccosenti)" <ccosenti@xxxxxxxxx>
Date: Tue, 4 Aug 2009 16:57:30 -0400
PHP has a bunch of XML parsers; SimpleXML is a good one.

But, how can you ensure that the XHTML will be valid in 300k chunks?
It's possible that an element could contain greater than 300k of text.
If that's the case, you can't have both valid XHTML and a 300k file.

If you know that element <h2> will always contain less than 300k, then
you can do a redirect-write (or result-document) for that element and
all of the <h2> contained content will be in a separate file.
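As a rough illustration of that approach, here is a minimal XSLT 2.0 sketch (not from the original post; the element and output names are illustrative) that starts a new chunk at each <h2> and wraps each chunk in its own html/head/body so the output stays valid XHTML:

```xml
<!-- Hypothetical sketch: split an XHTML <body> into one result document
     per <h2> section, using xsl:for-each-group + xsl:result-document. -->
<xsl:stylesheet version="2.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns="http://www.w3.org/1999/xhtml"
    xpath-default-namespace="http://www.w3.org/1999/xhtml">

  <xsl:template match="/html/body">
    <!-- Each group starts at an <h2> and runs up to the next <h2>. -->
    <xsl:for-each-group select="*" group-starting-with="h2">
      <xsl:result-document href="chunk-{position()}.html" method="xhtml">
        <html>
          <head><title><xsl:value-of select="current-group()[1]"/></title></head>
          <body>
            <!-- Copying the whole group under a fresh wrapper keeps
                 every chunk a complete, valid XHTML document. -->
            <xsl:copy-of select="current-group()"/>
          </body>
        </html>
      </xsl:result-document>
    </xsl:for-each-group>
  </xsl:template>
</xsl:stylesheet>
```

Note this chunks by structure, not by size, so it only meets a 300k limit if each <h2> section is known to stay under it.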

Just curious: what spec are you outputting to? IDPF .epub? I've never
heard of a 300k limit (I've only been working with newer devices, though).


> -----Original Message-----
> From: Martynas Jusevicius [mailto:martynas.jusevicius@xxxxxxxxx]
> Sent: Tuesday, August 04, 2009 4:44 PM
> To: xsl-list@xxxxxxxxxxxxxxxxxxxxxx
> Subject: Re: [xsl] Splitting file into N-sized chunks
> Maybe I didn't make it totally clear... But each of the resulting
> files still has to be valid XHTML.
> How do you see that achieved in PHP? Maybe with DOM somehow, but I'm
> using XSLT in my workflow so that would be much easier.
> On Tue, Aug 4, 2009 at 10:37 PM, Chris Cosentino
> (ccosenti)<ccosenti@xxxxxxxxx> wrote:
> > Xalan uses <xsl:redirect-write ... >
> >
> > XSLT 2.0 uses <xsl:result-document ... >
> >
> > But Martynas, you may have an easier time just using something like
> > to chunk it out into 300k-sized files.
