
Subject: Re: [xsl] Design of XML so that it may be efficiently stream-processed
From: Hank Ratzesberger <xml@xxxxxxxxxxxx>
Date: Wed, 27 Nov 2013 11:03:48 -0800
Hi Tim,

On Wed, Nov 27, 2013 at 1:43 AM, Timothy W. Cook <tim@xxxxxxxxx> wrote:
> On Fri, Nov 22, 2013 at 9:40 AM, Michael Kay <mike@xxxxxxxxxxxx> wrote:
>> I would add some more important design criteria. Put metadata and reference information (stuff that's needed for reference throughout document processing) at the start of the document rather than the end, or in a separate document.
> In regards to metadata. Is it really sensible, in real world
> applications, to have metadata for identical instance data in every
> instance?  Let's say you have 10,000 or 100,000 different instances of
> a document that references the same metadata.  The instances may be
> 100 lines alone and the metadata is an additional 30 lines.  Doesn't
> this justify a separate document for metadata?

Well, agreed, there may be diminishing returns when so many documents
share the same metadata. In those cases, the metadata could be a
permanent URL to a separate document rather than a repetition of the
same content. Processors could load the external document into a
variable. AFAIK, that does not violate any streaming principle. If
every document loads the same external metadata, then hopefully your
processor or system will have a cached copy.
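To make that concrete, here is a minimal XSLT 3.0 sketch of the idea
(the metadata URL, element names, and paths are all invented for
illustration): the shared metadata is loaded once into a global
variable with doc(), which reads a document other than the streamed
input and so does not interfere with streaming the instances.

```xml
<xsl:stylesheet xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
                version="3.0">

  <!-- Load the shared metadata document once; it is a separate,
       fully-built tree, not the streamed input. -->
  <xsl:variable name="meta"
                select="doc('http://example.org/shared-metadata.xml')"/>

  <xsl:mode streamable="yes"/>

  <!-- 'reading' and the metadata paths below are hypothetical. -->
  <xsl:template match="reading">
    <reading scale="{$meta/metadata/sensitivity/@scale}">
      <xsl:value-of select="."/>
    </reading>
  </xsl:template>

</xsl:stylesheet>
```

With a stable URL, a processor (or an XML catalog / HTTP cache in
front of it) can serve the same metadata document from cache across
all 100,000 instances.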

Not so different from keeping a local copy of DTD files.


[possibly nothing to do with your issue...]

But in so many instances, this is the pattern that makes XML such a
good replacement for binary/proprietary files: the document becomes
self-contained. For example, when I worked with a seismologist, all
the data was just time-series points of acceleration. Only once you
add the instrument, sensitivity/scale, and geo-location can it be
usefully integrated with other records for the same event.
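As a sketch of what such a self-contained record might look like
(every element name and value here is invented for illustration, not
taken from any real seismic format):

```xml
<seismic-record>
  <!-- Without this metadata the samples below are just numbers -->
  <instrument id="STA01" type="accelerometer"/>
  <sensitivity scale="1.0e-6" units="m/s2 per count"/>
  <location lat="34.41" lon="-119.84"/>
  <samples rate="100">12 -8 21 19 -3</samples>
</seismic-record>
```

Note the metadata sits before the bulk time-series data, in keeping
with the put-reference-information-first advice earlier in the
thread.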

Hank Ratzesberger
