CSI – that’s Continual Service Improvement, by the way, not Crime Scene Investigation – is, to my mind, the single most important stage in the ITIL service life cycle. It evaluates what has gone before, identifies areas for improvement, and aids in implementing those improvements. In an ideal situation, CSI informs all the other stages, and is the main driver for the service life cycle.
-
Managing expectations
Some years ago, I wrote about Tom Peters’ Formula for Success, which hinges on doing two things: promising less than you can deliver, and delivering more than you have promised. I can’t foresee a future in which this will not be true; however, I think it bears expanding on. Simply put, by telling the user what to expect, we not only set their expectations, we manage them too, particularly if the user’s expectations are wildly unrealistic.
-
Incidents, Requests and what separates them
To the front-line technician, the two most important ITIL processes are Incident and Service Request Management. These are the bread and butter of front-line work, and most tickets handled by a support desk will fall into one of these categories.
-
Critical incidents: the aftermath
For many technicians, a critical incident will trigger something akin to an adrenaline response. With experience, this will give you focus and clarity of thought as the incident unfolds. However, the response can only be sustained for a limited amount of time, and once it is over, you will likely experience some tangible aftereffects.
-
Understanding Incidents: Urgency, Impact and Priority
It is part of the nature of IT service and support that you will, from time to time, be called upon to handle a high-priority, high-urgency incident. In most production systems these are mercifully rare, but it is still important that you understand how to identify them.
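The relationship between urgency, impact and priority is often expressed as a priority matrix, where priority is derived from the other two. As a rough sketch, here is one such matrix in Python; the specific levels and mapping are my own illustrative assumptions, as every organisation defines its own grid:

```python
# A minimal sketch of an ITIL-style priority matrix. Priority is
# derived from impact (how widely the incident is felt) and urgency
# (how quickly it must be resolved). The 3x3 grid below is an
# illustrative assumption, not a standard-mandated mapping.

PRIORITY_MATRIX = {
    # (impact, urgency): priority, where 1 is the highest priority
    ("high",   "high"):   1,  # the critical incidents discussed above
    ("high",   "medium"): 2,
    ("high",   "low"):    3,
    ("medium", "high"):   2,
    ("medium", "medium"): 3,
    ("medium", "low"):    4,
    ("low",    "high"):   3,
    ("low",    "medium"): 4,
    ("low",    "low"):    5,
}

def priority(impact: str, urgency: str) -> int:
    """Return the priority (1 = highest) for a given impact and urgency."""
    return PRIORITY_MATRIX[(impact, urgency)]

# A server outage affecting all users, needing immediate attention:
print(priority("high", "high"))  # -> 1
# A cosmetic glitch affecting one user, with no deadline pressure:
print(priority("low", "low"))    # -> 5
```

The point of encoding the matrix explicitly, rather than leaving prioritisation to gut feeling, is that two technicians given the same incident should arrive at the same priority.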
-
ITIL adoption numbers – a critical review of one interpretation
A while back, one of my fellow students questioned to what extent ITIL has been adopted outside of the UK. He cited a source that claimed adoption was very low. This assertion was based on a single statistic: the global TSO book sales figures. The original blog post can be found here.
I immediately wondered at the truth of the assertion, as I know that ITIL has been widely adopted in Norway. I decided to perform a PROMPT analysis:
Presentation:
The statistics are presented in a clear and concise way. However, there is no information about what metric is used, and whether it corrects for anything at all.
Relevance:
There is reason to question the relevance of the statistics cited, for a few reasons:
It measures a single metric; total sales of books about ITIL from a single publisher
The data appear to be presented as percentage of total sales, and do not appear to correct for the size of the different markets, relative to each other
The data do not take into account that there are several publishers of literature about ITIL
Objectivity:
The language is neutral and measured. I find no issue with objectivity.
Method:
There is reason to question the method. There is no information about how the data was collected, which means that we cannot verify whether the methodology was appropriate. In addition, we cannot know whether the data is representative, which I find reason to question, as the statistics only cite the sales of one publisher and do not appear to correct for relative market size. There is also the possibility that the sales listed as UK include UK vendors shipping out of the UK. Lastly, there is no information about whether the figures cover print-only, electronic-only, or combined print and electronic sales.
Provenance:
There is reason to question the provenance. The author of the article itself is clearly identified, but that’s where the provenance stops. The author does cite a source (ITSMF UK), but he does not give a citation for the actual statistics, which means that we cannot verify the numbers or the metrics they represent. We can neither identify the authors of the information nor verify where it was published.
Timeliness:
There is reason to question the timeliness. There is no information about when the data was published. However, the source does cite the periods the data relates to, and the information appears to be recent enough to still be relevant. Still, more up-to-date information seems likely to be available.
In conclusion, the blog post clearly fails five out of the six PROMPT criteria. In particular, I would argue that book sales over a relatively short period of time is a poor metric for how much or little impact any given technology has had in any market, as printed books are not the only way to learn about a given technology.