The recently published Hornbill research paper once again shows how poorly designed and targeted research can produce “answers” and “statistics” that are highly questionable.
In this case, the whole emphasis seems to be wrong – focused on ITIL, rather than ITSM. How many times does one have to say that organizations shouldn’t be trying to “implement ITIL”, but trying to develop truly effective and sustainable service management solutions that enable them to deliver quality services to the customer?
Here are a few of the aspects of the study that cause me concern:
There is no reference to ISO/IEC 20000, yet it is the industry standard for assessing quality management in this sphere. It also has embedded within it the concept of continual improvement, which is what should be driving most organizations.
Some of the questions are loosely worded and open to interpretation. For example, what does ‘already implemented ITIL v2’ mean exactly? Does ‘How frequently do IT and the business meet’ apply to any interaction, or to interaction at a specific level in relation to specific concerns? And what does ‘upgrading to v3’ mean – if there is such a thing?
There is no information about how anything is measured and what the various findings therefore mean. For example, how is process maturity measured by the respondents? Is it measured the same way for each of them? Or is this just a subjective opinion from the individuals, such that one person’s level 2 could actually be higher than another’s level 3? And does ‘measured in terms of its contribution to the business’ really mean ‘measured’ – and, if so, how? And what does one make of this: 66% of the adopters ‘saw an improvement in service quality’, but ‘few have metrics in place to back up and report on the results’. Just how meaningful is that?
Once again, this peddles the old nonsense that v2 consisted of the 10 processes defined in 2 of the 10 books and that there are therefore lots of new processes. Two examples stand out. Organizations have been performing Event Management for decades – they may not have labeled and partitioned it as the v3 book describes it, but they have been doing it! Similarly, any organization operating a Service Desk worth anything at all will have been dealing with Incidents and Requests, and even if they have been doing nothing more than classifying them as such, they will have been following much of the spirit of Request Fulfillment. If organizations make changes to these aspects, I believe that they are merely applying the principles of CSI. Organizations with any real degree of service management maturity will recognize this, and indeed such organizations probably already had solutions that were closer to the holistic spirit of ITIL than to the narrow, distorted view of Service Support and Service Delivery that appears throughout this survey.
As for some of the conclusions, perhaps the point is being missed in some cases.
Of course ‘cherry picking’ is going on; organizations will usually tackle those areas where they are either feeling most pain or where they can derive most benefit or achieve quick wins. In some cases, the organization may have consciously decided not to pursue certain areas for a variety of business reasons. Perhaps it would have been more interesting therefore to try to understand why they had implemented some processes/functions and not others?
So, cost is the ‘major influence in the selection of an ITIL tool, followed by ease of customization and use’. Yet the next paragraph states the major reason for dissatisfaction with their tool is ‘lack of appropriate functionality’? So maybe the tools were not chosen very well!
Can a CMS/CMDB ever be 100% accurate? Highly unlikely. Perhaps organizations need to be focused on defining exactly what they are seeking from Configuration Management and the question should then be about whether it meets their requirements.
Maybe the reason why so many are ‘implementing elements from the v3 Service Transition and Service Operation books’ has nothing to do with those books being ‘easier to digest’ but with the fact that they are the obvious and main areas of process failure! But equally, there will be far fewer people engaged in strategy and design than there will be in operational processes, so maybe the answer reflects who the respondents were – it might have been more illuminating to see their job roles rather than the groupings given.
I don’t know where the statement ‘to realize the top drivers, IT needs to focus on its “shop window”, the service desk’ comes from; it certainly isn’t evidenced by anything in the survey. Surely we should be seeking a mature relationship between the whole of IT and the whole of the business, with suitable engagement at all levels to ensure that true integration is achieved. This possibly also explains why ‘40% don’t believe they have the ability currently to define, capture and report on Service Quality’. Could it be because they haven’t got together with their customers to define it as part of creating a mature SLM process?
Respondents were asked what they would do differently if they had a chance, but apparently not whether they had actually changed anything as a result of their experience! After all, most of them clearly haven’t reached the (never actually achievable) state of perfection, which is why continuous improvement is so important.
Overall, I feel that this is another opportunity missed to get people to focus on what’s really important, and instead has engaged in some more navel gazing.
Any feedback and comments are always welcome!
23rd September 2009
I don’t think it is nonsense that ITIL was the two books and 10 processes. I have read the other books and have even used some of them but the fact remains that for nearly all ITIL students, it was 10 processes and one function. The concept of the full V2 library was just dreamware.
2nd October 2009
If there are faults in the report, then I must accept responsibility, as I led the team responsible for designing the questionnaire and reviewing the paper. As the title of the paper suggests, the survey was focused on ITIL and the progress organisations are making with it, not on the broader aspects of ITSM. Although ITIL is only one approach, we cannot ignore its global penetration and the fact that many organisations consider themselves to be “implementing ITIL”. At the start of their ITIL journey organisations typically don’t have mature processes and the full framework (particularly v3) can appear daunting. What the research points out is that many IT organisations simply implement a few of the Service Operation and Service Transition processes. Taking this ‘bite-size’ approach at the outset is absolutely sensible, as it enables IT operations to mature a few processes, prove value and then take the next steps.
As a Service Management software vendor, Hornbill gets to see hundreds of service desk implementations and many different levels of ITIL maturity. We have several customers that have achieved ISO 20000 certification, one that managed to do so within a few months. Although I agree that ISO 20000 establishes a focus on continual improvement, not every organisation wants to be certified against a standard. In any event, the survey was intentionally specific to ITIL, so we avoided any reference to other standards or frameworks.
With regard to your criticism of the questions being open to interpretation, I can assure you that the survey questionnaire was quite specific in the areas you criticised – the detail was such that it took participants in excess of 40 minutes to complete. If anything was lost in translation, it was only with the best intention of balancing detail against keeping the resulting report to a digestible length. The heading “How frequently do IT and the business meet?” relates to the survey question “How frequently are IT and Business planning meetings held within your organisation?”, to which the explanatory text refers. The questions on process maturity were divided into 5 levels, each enabling the respondent to select whether the process exists but is not fully documented, is well documented, is standardised, is monitored for compliance, or is continually improved – nowhere near as open as your comment suggests. Nor did we ask whether the “CMDB was 100% accurate”; the question was whether respondents “considered their CMDB to be accurate”, which again is a different question.
I don’t agree with your comment that the survey has “engaged in some more navel-gazing”. Since its release, we’ve had excellent feedback from a number of independent sources, including practitioners, industry bodies and analysts, who have all said that the research agrees with their view of the world. I sincerely hope that the document is a useful resource for organisations that are considering ITIL and are struggling with questions such as which version, which processes, and where to start. The intention of the research was to gain insight into what others have done on their ITIL journey, what benefits they realised and the lessons learned along the way, and I believe it has achieved that.