Re: [xsl] In Search of People Who Know About and/or Use the, Document Function

From: "M. David Peterson" <m.david@xxxxxxxxxxxxx>
Date: Fri, 07 Sep 2007 11:59:21 -0600
On Fri, 07 Sep 2007 11:03:52 -0600, Andrew Welch
<andrew.j.welch@xxxxxxxxx> wrote:

> Still this side of the line (er, the Force?) - given the choice I'd
> much prefer running the same transforms using Saxon on the server.  I
> hate cross browser stuff, and worrying about the intricacies of
> different processors on top is too much, especially when its merits
> are debatable.

Ahh, see, this is where life gets both exciting and interesting. With client-side XSLT you can initially serve up static XML files in rapid fire and let each one determine which additional files it needs to render the content properly for that particular browser's specific needs.
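In case it helps to picture it: the static file is just XML with an xml-stylesheet processing instruction, so the browser's own XSLT engine does the rendering work. A minimal sketch (the file names and namespace here are illustrative, not the actual atomictalk setup):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="/xsl/page.xsl"?>
<!-- hypothetical static data file; the browser fetches page.xsl
     and runs the transform locally, pulling in whatever extra
     resources the stylesheet decides it needs -->
<page xmlns="http://example.org/page">
  <title>Hello</title>
</page>
```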

For example, you can take,

<advice:browser compare="xsl:vendor"
compare-with="Microsoft">ie</advice:browser>
<advice:browser compare="xsl:vendor"
compare-with="Transformiix">mozilla</advice:browser>
<advice:browser compare="xsl:vendor"
compare-with="libxslt">safari</advice:browser>
<advice:browser compare="xsl:vendor"
compare-with="Opera">opera</advice:browser>
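The matching these advice:browser entries describe can be done in plain XSLT 1.0 by testing the processor's vendor string with system-property(). A minimal sketch of that branching (not the actual implementation behind the advice vocabulary):

```xml
<!-- sketch: pick a browser token from the XSLT processor's vendor
     string, roughly what the advice:browser entries above encode -->
<xsl:variable name="vendor" select="system-property('xsl:vendor')"/>
<xsl:variable name="browser">
  <xsl:choose>
    <xsl:when test="contains($vendor, 'Microsoft')">ie</xsl:when>
    <xsl:when test="contains($vendor, 'Transformiix')">mozilla</xsl:when>
    <xsl:when test="contains($vendor, 'libxslt')">safari</xsl:when>
    <xsl:when test="contains($vendor, 'Opera')">opera</xsl:when>
    <xsl:otherwise>unknown</xsl:otherwise>
  </xsl:choose>
</xsl:variable>
```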

and couple it with these,

<advice:base-uri>@@protocol@@@@domain@@|$$test:@@port@@IfTrue:@@colon@@@@port@@IfFalse:@@empty@@$$|</advice:base-uri>
<advice:static>@@base-uri@@</advice:static>
<advice:static-css-browser>@@static@@/css/@@browser@@</advice:static-css-browser>
<advice:static-js-browser>@@static@@/js/@@browser@@</advice:static-js-browser>

and this,

<page:output>
  <page:head>
    <head:title>@@page-title@@</head:title>
    <head:include fileType="css" href="@@static@@/css/base.css" />
    <head:include fileType="css" href="@@static@@/css/transparency.css" />
    <head:include fileType="css" href="@@static-css-browser@@.css" />
    <head:include fileType="javascript" src="@@static-js-browser@@.js" />
  </page:head>
  <page:body>
    <body:html>
      <ul>
        <li>@@hello-world@@</li>
      </ul>
    </body:html>
  </page:body>
</page:output>
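I won't show the real machinery here, but the @@token@@ substitution the template above relies on can be done in XSLT 1.0 with a recursive named template. A hedged sketch handling one token at a time (the real pipeline presumably iterates over all known tokens):

```xml
<!-- hypothetical helper: replace every occurrence of one @@token@@
     in a string; recursion stands in for a loop, as usual in XSLT 1.0 -->
<xsl:template name="expand-token">
  <xsl:param name="text"/>
  <xsl:param name="token"/>  <!-- e.g. '@@static@@' -->
  <xsl:param name="value"/>
  <xsl:choose>
    <xsl:when test="contains($text, $token)">
      <xsl:value-of select="substring-before($text, $token)"/>
      <xsl:value-of select="$value"/>
      <xsl:call-template name="expand-token">
        <xsl:with-param name="text" select="substring-after($text, $token)"/>
        <xsl:with-param name="token" select="$token"/>
        <xsl:with-param name="value" select="$value"/>
      </xsl:call-template>
    </xsl:when>
    <xsl:otherwise>
      <xsl:value-of select="$text"/>
    </xsl:otherwise>
  </xsl:choose>
</xsl:template>
```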

... to then create an experience customized specifically for the
browser in question.  For example, a Mozilla-based browser,

<html>
  <head>
  <meta content="text/html; charset=UTF-8" http-equiv="Content-Type">
  <title>atomictalk.org::September 7th, 2007</title>
    <style type="text/css">
    @import "http://test.atomictalk.org/css/base.css";
    @import "http://test.atomictalk.org/css/transparency.css";
    @import "http://test.atomictalk.org/css/mozilla.css";
    </style>
    <script src="http://test.atomictalk.org/js/mozilla.js"
type="text/javascript">
    <!--/* hack to ensure browser compatibility */-->
    </script>
  </head>
  <body>
    <ul>
      <li>Hello, World!</li>
    </ul>
  </body>
</html>

... the result of which is that you are no longer required to hack at the
same CSS and JavaScript files, piling on every cross-browser hack known to
man in hopes of gaining compatibility.  The result is smaller CSS and
JavaScript files and therefore a faster, more enjoyable end-user
experience.  And if you were to ask me, trading in the cross-browser CSS
and JavaScript incompatibilities for a handful of corner-case XSLT
incompatibilities is a no-brainer.

Of course one might argue: "Yeah, but I can generate those same custom
HTML files on the server," and it's true, you can.  And if the content you
serve is the same for each and every visitor, then you might be better off
doing just that.

But take a site like MySpace, for example, where each page is completely
customized to each user's preferences and the content on each page is
updated on a regular basis.  Your server ends up regenerating that content
on nearly every request (unless you have a good server-side caching
solution such as memcached, in which case you only regenerate when at
least one of the data sources changes -- but that's still a lot of
processing regardless).  Whereas if you offload that content generation to
the client, serving static data files (each raw data file updated when its
data source changes) via document() function requests, your server can
focus its cycles on serving more requests -- faster -- while at the same
time reducing the data sent to each client on each request to only the
data that has changed, rather than resending an entire pre-generated HTML
page because nothing more than a single character on it was updated.
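The client-side half of that is just the document() function doing the fetching. A sketch of what a stylesheet might do to pull a static data file and render only its items (the URI and element names are illustrative):

```xml
<!-- sketch: fetch a static data file with document() so only the
     changed data travels over the wire, not a whole HTML page -->
<xsl:variable name="data"
    select="document(concat($base-uri, '/data/latest.xml'))"/>
<ul>
  <xsl:for-each select="$data//item">
    <li><xsl:value-of select="."/></li>
  </xsl:for-each>
</ul>
```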

--
/M:D

M. David Peterson
http://mdavid.name | http://www.oreillynet.com/pub/au/2354 |
http://dev.aol.com/blog/3155
