Re: [xsl] Design of XML so that it may be efficiently stream-processed

From: "Timothy W. Cook" <tim@xxxxxxxxx>
Date: Wed, 27 Nov 2013 18:23:47 -0200
On Wed, Nov 27, 2013 at 5:03 PM, Hank Ratzesberger <xml@xxxxxxxxxxxx> wrote:
> Hi Tim,
> Well, agreed, there may be diminishing returns on so many documents
> sharing the same metadata. In those cases, the metadata could be a
> permanent URL to a document rather than a repetition of the same.
> Processors could load the external document as a variable. AFAIK, that
> does not violate any streaming principle. If every document loads the
> same external metadata, then hopefully your processor or system will
> have a cached copy. Not so different from keeping a local copy of DTD
> files.

Great.  Because this is the approach I am using in healthcare.
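The pattern Hank describes (a streamed input combined with a shared, cached metadata document bound to a variable) could be sketched in XSLT 3.0 along these lines. This is only a hypothetical illustration: the element names, the metadata URL, and the structure of the metadata document are all invented for the example, and it assumes a streaming-capable processor such as Saxon-EE.

<!-- Sketch only: names and URL are illustrative, not from the thread. -->
<xsl:stylesheet xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
                version="3.0">

  <!-- The shared metadata is loaded once via doc(); a global variable
       is fully grounded, so referring to it inside a streamable
       template is motionless and does not break streamability. -->
  <xsl:variable name="metadata"
                select="doc('http://example.org/shared-metadata.xml')"/>

  <xsl:mode streamable="yes" on-no-match="shallow-copy"/>

  <!-- Each streamed data element is enriched with the cached metadata. -->
  <xsl:template match="reading">
    <calibrated scale="{$metadata/instrument/@scale}">
      <xsl:value-of select="."/>
    </calibrated>
  </xsl:template>
</xsl:stylesheet>

The key point, as in Hank's suggestion, is that only the large data document is streamed; the small metadata document is an ordinary in-memory tree, and the processor (or an XML catalog / HTTP cache) can avoid re-fetching it for every input.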

> [possibly nothing to do with your issue...]
> But in so many instances, this is the pattern that makes XML such a
> good replacement for binary/proprietary files, because the document
> becomes self-contained. For example, when I worked with a
> seismologist, all the data was just time-series points of
> acceleration. Only when you add the instrument, sensitivity/scale,
> and geo-location can it be usefully integrated with other records for
> the same event.

Self-contained sounds good.  However, since an XML document can point
to another document, such as a schema, doesn't it make sense for the
syntactic and semantic parameters to be defined in one place?  I am
"assuming" that many, many data files are created from one
instrument, sensitivity/scale, geo-location, etc.
