Subject: Re: [xsl] metrics for evaluating xsl-t?
From: Andrew Franz <afranz0@xxxxxxxxxxxxxxxx>
Date: Thu, 24 Aug 2006 06:29:13 +1000
Think of XSLT as a specification language. Does anyone ever quantify/estimate the resources required to maintain office documents?
And then it has the 'side effect' that it's executable. So we can derive this estimate:
Analysis: less than the benchmark (office documents)
Programming: $nil

Seriously, this idea of metrics is fraught with the risk of illogical derivations.

First, the bulk of the effort is in understanding the problem (see above). People who don't understand it are doomed to keep solving it. How do you quantify business analysis? Are there metrics for meetings?

Second, the kinds of hands-off managers who ask these sorts of questions are usually the same ones who solve the same problems over & over. Usually, their measure is 'completion' of a project regardless of whether it solves the *business* problem or not. Metrics should be *business* metrics, not lines of code; e.g., what does it matter how many loc are used if a one-off automation P saves N people X hours per day?
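The loc-versus-savings point can be made concrete with a back-of-envelope calculation. All figures below are hypothetical, just to show why line count is irrelevant next to the business value:

```python
# Back-of-envelope comparison: lines of code vs. business value.
# Every number here is an assumed, illustrative figure.
loc = 2000                 # size of the one-off automation "P"
people_saved = 5           # N people who no longer do the task by hand
hours_per_day = 2          # X hours each person saves per day
hourly_rate = 60           # assumed loaded cost per person-hour
work_days_per_year = 220   # assumed working days

yearly_saving = people_saved * hours_per_day * hourly_rate * work_days_per_year
print(f"{loc} loc -> ${yearly_saving:,} saved per year")
```

Whether the transform took 200 lines or 2000, the business metric is the saving, not the count.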

Third, the idea of quantifying a 'unit of work' in software development is antithetical to good Software Engineering. If I do the same thing twice, I start looking for commonalities. Three times and I abstract it. Four times and I build an engine. After that, any more instances of whatever you're building require minimal effort. Thus work-per-instance is an *outcome* of abstraction/reusability, not an input into whether or not you should do it. Abstraction/problem definition should *always* be done.

Fourth, effort required to 'maintain' is a function of the rate of change in the environment *and* how well it was written. If you want metrics, keep track of bugs, requests and improvements weighted by effort. When requests dominate, you have a fast-changing environment or poor definition. When bugs dominate, you have poor code. In addition, good programmers will often *not* maintain poor code; they will more likely throw it away and start over, i.e. see #1 and #2. Tools such as JIRA allow you to track and report such metrics.
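That effort-weighted breakdown is easy to sketch. This is not a JIRA API call, just a minimal illustration of the metric with assumed categories, weights and sample data; real input would come from your tracker's export:

```python
# Effort-weighted bug/request/improvement breakdown, as described above.
# Sample data and effort figures are hypothetical.
from collections import defaultdict

tickets = [
    # (category, effort in person-days)
    ("bug", 0.5), ("bug", 2.0),
    ("request", 3.0), ("request", 1.0), ("request", 4.0),
    ("improvement", 1.5),
]

effort = defaultdict(float)
for category, days in tickets:
    effort[category] += days

total = sum(effort.values())
for category, days in sorted(effort.items()):
    print(f"{category:12s} {days:5.1f} days  {100 * days / total:5.1f}%")
```

Reading the output per the post: requests dominating suggests a fast-changing environment or poor definition; bugs dominating suggests poor code.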

What does this have to do with XSLT? Nothing. XSLT is a language and a tool.
Beware of idiot middle managers who try to hogtie you with artificial metrics they learned at business school. Michael wrote earlier about a project manager confused by programmers who produced negative lines of code. Where's the metric for that? Answer: not in XSLT; it is in the business KPIs.

Tony Graham wrote:

"bryan rasmussen" <rasmussen.bryan@xxxxxxxxx> writes:

The other thing is determining the complexity of the transformation
task, pertinent to the individual transformation and its place within
a transformation framework.

With that kind of metric one can then determine the amount of time for
writing transform etc.

I got onto the metric idea from the other end: determining what it
would take to understand and maintain an existing stylesheet if I got
work where there are stylesheets already in use.


of course, as noted, this is different than analysing the complexity
of an actual transform.

I don't think there's anything like that, although Sean McGrath had a blog post once about the true cost of XML; the cost includes the number of namespaces to be understood, etc.

I think we all have a built-in metric for saying how complex we feel a
particular transform is likely to be.

I'd like a simple measure of how complex a particular transform was.

For the record, I got started on the XSLT metric idea after reading
about Rick Jelliffe's XML complexity metric:

The future conference paper that I'd like to see (and present) is the
one with lots of manager-friendly graphs relating schema complexity
and XSLT complexity from a survey of real projects.  If that existed,
it would be easier to explain to your manager why a new stylesheet for
a new, complex schema could take longer than a week to write.


