Re: [xsl] Splitting file into N-sized chunks

From: Stefan Krause <stf@xxxxxxxx>
Date: Mon, 10 Aug 2009 02:07:17 +0200
Michael Kay wrote:
> I suspect that level of accuracy isn't needed. A heuristic that says 500Kb
> of serialized XHTML = 250K characters in text nodes is probably quite
> adequate for the purpose.

Indeed, I've produced .epubs with thousands of minimal-size chunks
(less than 5 kBytes each), and they work well. (Smaller chunks speed up
page turning, and the end of a chunk forces a page break.)

A real challenge was keeping these chunks synchronized with the EPUB's
.opf and .ncx files, where every chunk must be registered, and handling
internal links (consider <a href="#something">...</a>) and other
references such as footnotes. I'm afraid there is no simple answer to
how a file should be split for EPUB, because it depends on the input
and affects other parts of the workflow.
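The internal-link problem can at least be stated concretely: once ids are scattered across chunk files, every fragment-only `href="#id"` must be rewritten to name the file that now contains the target. A hedged Python sketch (the `chunkNNNN.xhtml` naming and both function names are hypothetical):

```python
import xml.etree.ElementTree as ET

def build_id_map(chunks):
    """Map every id attribute to the chunk file that holds it."""
    id_map = {}
    for n, chunk in enumerate(chunks):
        fname = f"chunk{n:04d}.xhtml"
        for el in chunk.iter():
            if "id" in el.attrib:
                id_map[el.attrib["id"]] = fname
    return id_map

def rewrite_links(chunk, id_map):
    """Prefix fragment-only hrefs with the owning chunk's file name."""
    for a in chunk.iter("a"):
        href = a.get("href", "")
        if href.startswith("#"):
            a.set("href", id_map[href[1:]] + href)

chunk0 = ET.fromstring('<div><p id="fn1">a footnote</p></div>')
chunk1 = ET.fromstring('<div><a href="#fn1">see note</a></div>')
id_map = build_id_map([chunk0, chunk1])
rewrite_links(chunk1, id_map)
# the link in chunk1 now points at chunk0000.xhtml#fn1
```

The same id-to-file map is what the .opf manifest and .ncx navMap entries have to be generated from, which is why the splitting step can't be isolated from the rest of the workflow.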
