Re: [xsl] Splitting file into N-sized chunks

Subject: Re: [xsl] Splitting file into N-sized chunks
From: Martynas Jusevicius <martynas.jusevicius@xxxxxxxxx>
Date: Tue, 4 Aug 2009 23:09:50 +0200
Well, it doesn't have to be exactly 300K; it could be less, of course :)
But each of the resulting chunks still needs to be valid XHTML.
So, any ideas on (roughly) calculating the size? Counting characters, perhaps?
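
One rough idea (an untested XSLT 2.0 sketch; the "chapter" divs and
the /html/body structure are just assumptions about my input):
string-length() only counts the text content, ignoring markup and
multi-byte UTF-8, so the number is a lower bound on the serialized
size -- but it might be close enough for a soft 300K limit.

  <!-- Untested sketch: report chapters whose text content alone
       already approaches the limit. -->
  <xsl:template match="/html/body"
                xpath-default-namespace="http://www.w3.org/1999/xhtml">
    <xsl:for-each select="div[@class = 'chapter']">
      <xsl:variable name="chars" select="string-length(string(.))"/>
      <xsl:if test="$chars gt 300000">
        <xsl:message>
          <xsl:value-of select="concat('Chapter ', position(), ': ~',
                                       $chars, ' characters of text')"/>
        </xsl:message>
      </xsl:if>
    </xsl:for-each>
  </xsl:template>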

Yes, my output is ePub. The Sony PRS-505 has this limitation -- or maybe
it's Digital Editions, though the file works fine on the PC. The reader
gives a "Page error" if any of the files is too big:
http://www.mobileread.com/forums/archive/index.php?t-27818.html

On Tue, Aug 4, 2009 at 10:57 PM, Chris Cosentino
(ccosenti)<ccosenti@xxxxxxxxx> wrote:
> PHP has a bunch of XML parsers. SimpleXML is a good one:
> http://us3.php.net/simplexml
>
> But how can you ensure that the XHTML will be valid in 300k chunks?
> It's possible that an element could contain more than 300k of text.
> If that's the case, you can't have both valid XHTML and a 300k file.
>
> If you know that the <h2> element will always contain less than 300k,
> then you can do a redirect-write (or result-document) for that element,
> and all of the content it contains will end up in a separate file.
>
> Just curious: what spec are you outputting to? IDPF .epub? I've never
> heard of a 300k limit (I've only been working with newer devices, though).
>
> -Chris
>
>> -----Original Message-----
>> From: Martynas Jusevicius [mailto:martynas.jusevicius@xxxxxxxxx]
>> Sent: Tuesday, August 04, 2009 4:44 PM
>> To: xsl-list@xxxxxxxxxxxxxxxxxxxxxx
>> Subject: Re: [xsl] Splitting file into N-sized chunks
>>
>> Maybe I didn't make it totally clear... but each of the resulting
>> files still has to be valid XHTML.
>> How do you see that achieved in PHP? Maybe with DOM somehow, but I'm
>> already using XSLT in my workflow, so doing it there would be much easier.
>>
>> On Tue, Aug 4, 2009 at 10:37 PM, Chris Cosentino
>> (ccosenti)<ccosenti@xxxxxxxxx> wrote:
>> > Xalan has the redirect extension: <redirect:write ... >
>> >
>> > XSLT 2.0 uses <xsl:result-document ... >
>> >
>> > But Martynas, you may have an easier time just using something like
>> > PHP to chunk it out into 300k-sized files.
