Subject: Re: [xsl] XSL 2.0 and .NET and VB
From: "M. David Peterson" <m.david@xxxxxxxxxxxxx>
Date: Fri, 29 Jun 2007 16:11:58 -0600
On Fri, 29 Jun 2007 14:16:16 -0600, Jirka Kosek <jirka@xxxxxxxx> wrote:

> I don't think that current browsers are able to incrementally display a
> page that was generated client-side from XML using XSLT. You have to
> wait until the XML is downloaded and the DOM is constructed, then you
> have to wait until XSLT transforms that DOM into another DOM, and only
> after that can you see anything rendered. For slower connections this could be very

Sorry, Jirka, but you haven't shown any evidence that it is slower to render a document via XSLT than via the SGML parser. You just seem to assume that parsing the XML and the XSLT, and then processing the XML with the XSLT, is somehow going to result in slower processing of the page.

Is this correct? If yes, then I'm sorry, but you are wrong.

> Moreover, if you invoke XSLT by using the <?xml-stylesheet?> PI, there
> is a known problem in IE, which strips all whitespace nodes from the input
> document. You can overcome this problem by invoking the transformation from
> JavaScript, but there is no cross-platform JavaScript API for XSLT, so
> you have to use a bridging library like Sarissa. And there are still
> user agents that don't do XSLT -- search engines and some minor, but
> still used, browsers. So it is not that simple, and producing HTML on the
> server means that you don't have to cope with many annoying browser quirks.

Please read through my entire post again and note that I specifically mentioned that this is only 50% of the total solution. If you're not interested in doing a little more work to ensure your visitors have a faster, more enjoyable experience, then so be it. If you are, then that extra work will result in a better, faster, more reliable experience.

Something else to consider: there is a massive base of developers relying on AJAX-invoked data rendering who trot out the "yeah, but Google can't find my content if I use client-side XSLT" argument, seemingly neglecting the fact that Google isn't going to render their horrific mass of JavaScript goo just to gain access to the same content they claim is invisible on an XSLT-based web page. I'm not suggesting you are that type of developer, so please don't take it as such. What I am suggesting is that there is a massive base of developers out there who need to pull their heads out and realize they don't know jack about how to build a high-performance web site. What they need to do is shut their traps, open up a book, and realize that fewer bytes sent out over the wire, combined with less server-side processing, means they will be able to serve more requests in less time, and as such build better, faster, more reliable, and cheaper web-based applications than they *EVER* could by attempting to keep 100% of the HTML generation on the server.

Think of it this way:

* If a single HTML page is 100k, and I have roughly 1.5 megabytes per second of bandwidth capacity, I am able to serve up ~15 requests every second. If instead of serving up HTML I serve up an XML file that contains only the data specific to that page, and include in that XML a PI that points to an XSLT file, then if my data file is 25k, after the first request (because the browser caches the stylesheet) I am going to be able to serve four times as many requests as I otherwise could.
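The back-of-the-envelope arithmetic above can be sketched as follows (the page sizes and bandwidth figure are the illustrative values from the bullet, not measurements):

```python
# Back-of-the-envelope throughput comparison: full HTML vs. data-only XML.
# All figures are illustrative, taken from the discussion above.

BANDWIDTH_BYTES_PER_SEC = 1.5 * 1024 * 1024  # assume ~1.5 MB/s of capacity
HTML_PAGE_BYTES = 100 * 1024                 # full pre-rendered HTML page
XML_DATA_BYTES = 25 * 1024                   # data-only XML; XSLT cached client-side

def requests_per_second(page_bytes, bandwidth=BANDWIDTH_BYTES_PER_SEC):
    """How many whole responses of this size fit through the pipe per second."""
    return bandwidth / page_bytes

html_rps = requests_per_second(HTML_PAGE_BYTES)
xml_rps = requests_per_second(XML_DATA_BYTES)

print(f"HTML: {html_rps:.1f} req/s")        # ~15 req/s
print(f"XML:  {xml_rps:.1f} req/s")         # ~61 req/s
print(f"ratio: {xml_rps / html_rps:.0f}x")  # 4x
```

The ratio is what matters: a 25k data file versus a 100k page is a 4x throughput gain regardless of the absolute bandwidth figure you plug in.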

* If with each request I quickly check the headers and determine ahead of time what the requesting client is, I can then determine whether or not that client supports XSLT. In the case of Google, I would send a prerendered (or potentially dynamically rendered) HTML file. In the case of IE, Mozilla, Safari, and Opera (which collectively form well over 99.9% of the web browser market) I can send a static XML file (that same static file can be regenerated by a background process each time the data for that page changes), which will be rendered *FASTER* than its HTML counterpart, which took 4 times longer to arrive and *AT LEAST* 4 times longer to parse. An SGML processor is by its very nature *SLOWER* than an XML parser, so you will be lucky if the HTML parse is only 5-6 times slower than its XML counterpart.
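The header check described above might be sketched like this (a hypothetical helper; the User-Agent substrings and the two-representation scheme are illustrative assumptions, not a complete detection strategy):

```python
# Decide per-request whether to serve data-only XML (for XSLT-capable
# browsers) or prerendered HTML (for crawlers and everything else).
# The substring markers below are illustrative, not exhaustive.

CRAWLER_MARKERS = ("Googlebot", "bingbot", "Slurp")
XSLT_CAPABLE_MARKERS = ("MSIE", "Gecko", "AppleWebKit", "Opera")

def pick_representation(user_agent: str) -> str:
    """Return 'html' for crawlers/unknown agents, 'xml' for XSLT-capable browsers."""
    if any(marker in user_agent for marker in CRAWLER_MARKERS):
        return "html"  # crawlers get the prerendered HTML
    if any(marker in user_agent for marker in XSLT_CAPABLE_MARKERS):
        return "xml"   # browsers get the small data file + cached stylesheet
    return "html"      # safe fallback for anything unrecognized

# Example:
# pick_representation("Mozilla/5.0 (Windows; U) Gecko/20070309 Firefox/2.0")
#   -> "xml"
# pick_representation("Googlebot/2.1 (+http://www.google.com/bot.html)")
#   -> "html"
```

Falling back to HTML for unrecognized agents is the conservative choice: an agent that can't run the transform still gets usable content.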

Please do the math and then let me know if you would like to continue forward with your argument that sending pre-rendered HTML to each and every requesting client is still going to be faster. Of course, we haven't even started to consider the fact that if any JavaScript is involved in processing the HTML once it arrives, we are now looking at a process that is up to 100 times slower, but let's start with the basics and see where that leads us.


M. David Peterson
