DIY Process Assessment Planning – Assessment Model, Schedule and Deliverables, Risks and Constraints

In the previous post, we discussed some process assessment planning considerations such as the problem definition, the scope (both organizationally and process-wise), and the analysis of the stakeholders. In this post, we continue and conclude the discussion with additional planning considerations on the assessment model, the schedule and deliverables, as well as the risks and constraints.

Assessment Model (Best Practices, Evaluation Process Integration, Organizational Context)

Using the information resources mentioned previously, such as ITIL, COBIT, CMMI/ISO 15504, and ISO 20000, you should by now have constructed the best practice models and evaluation criteria to assess the processes and organization in scope. While an assessment spends a great deal of its effort benchmarking and answering the question “Where are we now?”, it is just as important to start formulating the response to the question “Where do we want to be?” In addition to coming up with the best practice model for the processes you plan to assess, I think it is also important to include the following criteria in your assessment model.

  1. Integration between Processes: These days, ITSM processes rarely get executed effectively in total isolation. Processes often need to interoperate effectively with other related processes in order to reach a certain level of maturity. For example, I would be surprised to see a high level of maturity for the availability and capacity management processes if the change management process has been determined to be weak in the same organization. For the processes you plan to assess, be sure to take the inter-process relationships into account and assess process effectiveness truly end-to-end.
  2. Organization Considerations: While understanding how mature a process is and where you can improve is important, it is just as important to understand how the maturity score fits into the overall organizational context. For example, if the problem management process has been assessed as highly mature but the organization places only limited importance on the process, the good maturity score will have only limited significance. Hopefully, from the problem definition exercise, you have already discovered what is really important to the organization, so you can target your improvement plan accordingly.
  3. Benchmarking and Comparison: Benchmarking is the way to compare how your organization is doing relative to other similar organizations or environments. In-depth benchmarking data are often not readily available; access to such data is one of the value-add features of engaging external consultants for the assessment. For a DIY assessment effort, you may or may not feel compelled to compare your results to those of similar organizations, depending on your problem definition. At a minimum, the assessment results can always serve as the baseline to compare against when you perform another assessment in the future.
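The baseline idea in point 3 lends itself to a simple illustration. The sketch below is a minimal, hypothetical example (the process names and maturity scores are invented, not from any real assessment): it records scores from two assessment rounds and reports how each process moved.

```python
# Compare a new round of maturity scores against a saved baseline.
# Process names and scores are illustrative only.

baseline = {"incident": 3, "change": 2, "problem": 1}   # first assessment
current = {"incident": 3, "change": 3, "problem": 2}    # follow-up assessment

def maturity_deltas(baseline, current):
    """Return {process: (old, new, delta)} for processes rated in both rounds."""
    return {
        proc: (baseline[proc], current[proc], current[proc] - baseline[proc])
        for proc in baseline
        if proc in current
    }

for proc, (old, new, delta) in sorted(maturity_deltas(baseline, current).items()):
    trend = "improved" if delta > 0 else ("declined" if delta < 0 else "unchanged")
    print(f"{proc}: {old} -> {new} ({trend})")
```

Even a structure this simple is enough to make the "compare against your own baseline" point concrete: the value of the first assessment compounds once a second one exists.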

Schedule and Deliverables

Just like any project worthy of an organization’s time and effort, the assessment effort should be run and managed like a formal project. The schedule and deliverables should be carefully planned and spelled out during the planning phase. Because the assessment will likely require participation from a broad array of departments and individuals, being able to communicate the timeline (of what will happen when and with whom) is something the sponsor and participants should expect to have upfront.

When constructing the schedule or formulating the deliverables, keep the following in mind as you plan:

  • What assessment techniques do you plan to deploy to gather the data? Will you use questionnaires or surveys? Workshops or focus groups? One-on-one interviews? Or a combination of two or more techniques?
  • The project should have a firm start date and an end date, with clearly identified milestones. The milestones should include at least the kick-off event, the beginning, reminder, and closing dates for the survey, as well as the presentation date.
  • Identify all potential meetings and schedule them in advance as much as possible. The assessment project will involve a lot of meeting invitations, attendee tracking, and calendar management activities.
  • Formulate a communication strategy and start laying out various communication artifacts and templates that will be used to target various stakeholders and roles within the project.
  • Identify the ITSM education needs and account for the workshops and training classes in the schedule. For most stakeholders, such as the sponsor and survey participants, ITIL Foundation level knowledge or awareness should be sufficient. For certain key participants who may need to provide more specialized information for the assessment, ITIL Intermediate level knowledge may be necessary. For the assessor (you, in this case, since we are talking about DIY) or anyone helping the assessor construct the surveys and analyze the results, an advanced understanding of ITIL and how the various processes fit with one another will be required in order to perform the assessment tasks effectively.
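The milestone points above can be kept as simple structured data, which makes the timeline easy to communicate and sanity-check. A minimal sketch follows; the milestone names and dates are placeholders I made up for illustration.

```python
from datetime import date

# Illustrative assessment milestones; names and dates are placeholders.
milestones = [
    ("kick-off event", date(2012, 4, 2)),
    ("survey opens", date(2012, 4, 9)),
    ("survey reminder", date(2012, 4, 16)),
    ("survey closes", date(2012, 4, 23)),
    ("findings presented", date(2012, 5, 7)),
]

def validate_schedule(milestones):
    """Confirm the milestones are listed in chronological order."""
    dates = [when for _, when in milestones]
    return all(a <= b for a, b in zip(dates, dates[1:]))

assert validate_schedule(milestones)
for name, when in milestones:
    print(f"{when.isoformat()}  {name}")
```

A table like this, printed or pasted into the kick-off communication, covers the "what will happen when" expectation the sponsor and participants should have upfront.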

Risks and Constraints (Time, Resources, Tools, Politics)

The assessment team should also identify any constraints or risks that could materially impact the assessment, negatively or positively. One typical constraint/risk is the key participants’ schedule and availability during the assessment phase. Considering the assessment’s results are heavily dependent on the quality and quantity of the input provided by the participants, not having the people you need at the time when you need them can pose a noticeable risk to the outcome of your project. It will pay dividends to figure out who you will need and what mitigation options you will have if those people are unavailable and you have to work around their schedules. Similar risks should be identified, with mitigation measures planned out accordingly, during this phase.

After all these considerations are factored in, you should have sufficient information to put together a statement of work or an assessment project charter. The assessment charter will guide your effort and serve as the foundational governance piece as you move forward with the project. In future posts, we will discuss kicking off the assessment; various approaches to collecting, validating, and analyzing the assessment data; and presenting the findings and following up.

DIY Process Assessment Planning – Problem Definition, Scope, and Stakeholder Analysis

In the previous post, we talked about some potential information resources you can use to set up an internal ITSM process assessment effort. In this post, we will discuss some of the planning considerations that should be taken into account.

Problem Definition (Purpose and Opportunity)

It is always important to answer the question of “Why bother?” In order for an assessment to produce meaningful results, it needs to address one or more business problems at hand. Document the drivers for conducting the assessment. Document why it is important to address those drivers now. Clearly identify what the organization or the sponsor intends to do with the assessment results. Once the purposes have been articulated, with the agreement and support of the sponsor, the assessment team can use them as part of future communications to the stakeholders and participants.

Scope (Process and Organization)

The scope usually comes in two aspects, process scope and organization scope. An assessment can cover one or more ITSM processes, and the assessment purposes defined in the previous section should help to guide the scope definition.

The organization scope also needs to be defined for the assessment. Ideally, an ITSM assessment should cover the entire IT organization, but that may or may not be needed, depending on the business problem you are trying to solve. Defining the organization boundary for the assessment can also be tricky, partly because people and political considerations inevitably get involved. It is possible that the assessment will cover only the portion of the organization over which the sponsor has more control (infrastructure operations rather than application development, for example). When that happens, it is important to define clearly just how applicable the assessment’s scope is organizationally. That way, the assessment results will not be misinterpreted, and the follow-up action plans can be appropriately planned and executed.

For the process scope, the assessment should focus on assessing a process’ end-to-end effectiveness, or customer experience from the business perspective. It is quite possible that an overall assessment for a particular process will yield a score that differs from what an individual team or organization expects. When that happens, it is important that the assessment can provide data to back up its finding or to explain the difference in expectation.

Also, don’t lose sight of the following. One reason to conduct assessment is to baseline what your organization is currently doing. Take advantage of the assessment opportunity to document the baseline performance of your processes, even for those which have had very little formal implementation or performance record.

Analysis of Stakeholders and Participants

Identifying and understanding the interactions stakeholders may have with the assessment is a critical aspect of assessment planning. The stakeholders or participants of the assessment project can include a number of individuals, such as the process owners, the IT staff, the senior IT management team, key suppliers, and business customers. There are a number of ways of mapping and understanding the stakeholders’ impact on the project. For most assessment projects, I would suggest thinking about the following factors. Customize the list further if your organization requires it.

  1. The stakeholder’s view of the assessment effort: Some participants feel strongly about the assessment for whatever reasons, and some feel less so. Try to gain a better understanding of the participants’ views towards the assessment.
  2. The stakeholder’s power to influence the assessment: Understand what role a participant will play in the assessment and how much influence they may have over the direction and execution of the project.
  3. The stakeholder’s work or responsibility impacted by the assessment: The assessment will have a varying degree of impact on the participants’ work or functional areas. Taking this factor together with the view and influence considerations, you can gauge more effectively what impact a stakeholder may have on your project.
  4. The stakeholder’s need for communication or training: Understand and map out the communication strategy based on the information or reporting needs of the stakeholders. Some stakeholders will require periodic updates (say weekly) from the assessment team. Some participants may require more frequent or less frequent communication. Also understand whether the participants will require some form of awareness training or education prior to the actual start of the assessment activities.
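One common way to combine factors 1 through 3 is a simple power/interest-style grid. The sketch below is a rough illustration, not a prescribed method: the stakeholder names, the 1–5 scores, and the threshold are all assumptions I made up. It scores each stakeholder on interest (their view of the effort) and influence, then maps the pair to a communication approach.

```python
# Map stakeholders to a communication approach from two 1-5 scores.
# Names, scores, and the threshold below are illustrative assumptions.

stakeholders = {
    "sponsor": {"interest": 5, "influence": 5},
    "process owner": {"interest": 4, "influence": 3},
    "service desk staff": {"interest": 2, "influence": 2},
    "key supplier": {"interest": 2, "influence": 4},
}

def engagement_strategy(interest, influence, threshold=3):
    """Classic power/interest grid: high interest + high influence -> manage closely."""
    high_interest = interest >= threshold
    high_influence = influence >= threshold
    if high_interest and high_influence:
        return "manage closely (frequent updates, workshops)"
    if high_influence:
        return "keep satisfied (targeted briefings)"
    if high_interest:
        return "keep informed (periodic updates)"
    return "monitor (minimal communication)"

for name, s in stakeholders.items():
    print(f"{name}: {engagement_strategy(s['interest'], s['influence'])}")
```

The output of an exercise like this feeds factor 4 directly: it tells you who needs weekly updates, who needs an occasional briefing, and who only needs awareness material.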

In the next post, we will discuss additional planning factors such as the assessment model, the schedule and deliverables, as well as the risks and constraints.

Fresh Links Sundae – March 25, 2012 Edition

Fresh Links Sundae encapsulates some pieces of information I have come across during the past week. They may be ITSM related, or not entirely. Often they are from people whose work I admire, and I hope you will find something of value.

Robert Stroud talked about how the role of IT has changed from a pure operational role to a more strategic and business-oriented one. Transformation of the role of the CIO (CA on Service Management)

Rob England discussed that many organizations have not managed or governed their IT function effectively and can do more. Organisations have failed their IT like bad parents (The IT Skeptic)

Charles Betz examined a recent survey on IT demand management and discussed some preliminary findings. IT cannot prioritize (Charles Betz)

Jeff Wayman made some suggestions on how service desk can help its users to combat hacking and protect themselves. 5 Ways a Help Desk Can Stop a Hacker (ITSM Lens)

Don Tennant discussed a research finding where leaders and employees having very different views on who is helping to promote innovation and who is not helping. Research Shows Company Leaders Are Stifling Innovation (From Under the Rug)

Damon Edwards discussed how reducing batch sizes can improve the effectiveness of DevOps within your organization. DevOps Lessons from Lean: Small Batches Improve Flow (- Blog – dev2ops)

Brad Power discussed two examples of IT playing a leadership role in helping two organizations drive competitive advantage. Look to IT for Process Innovation? (Brad Power – Harvard Business Review)

Bob Sutton talked about how effective leaders practice both leadership and management principles together. Hollow Visions, Bullshit, Lies and Leadership Vs. Management (Work Matters)

Marshall Goldsmith discussed a simple, yet effective system for getting better at providing positive recognition. How Do I Provide Meaningful Recognition? (Marshall Goldsmith)

Finally, a simple yet informative explanation of social media for a simple-minded individual like me. Social media explained with donuts

Process Assessment Using a DIY Approach – Potential Resources

Conducting an ITSM process assessment can be both important and beneficial. The assessment exercise is important because it can provide the baseline information needed in order to improve the processes that were assessed. The improvement in the efficiency and effectiveness gained usually leads to better IT services, which is always a benefit for our customers. Conducting a process assessment is also not a trivial exercise. It takes a good deal of planning and preparation upfront before the first survey goes out to the stakeholders.

Many organizations utilize the help of external consultants to perform the assessment. Others would rather roll their own assessment exercises. This post goes into some suggested resources if you are thinking about performing a self-assessment of your ITSM processes today. Consider the following…

One or more of the core ITIL documents: Since we are talking about assessing ITSM processes, the ITIL documentation seems to be the most logical baseline material to have. Get the entire set of five manuals, preferably the 2011 update edition, from Best Management Practice or another source. There have been a number of books written on ITIL, and they do provide good information. Have the official ITIL manuals handy nevertheless.

A Process Maturity Model: For each process you assess, you will need to be able to assign a numeric rating that denotes the maturity level of the process. I would suggest adopting the approach of either Capability Maturity Model Integration (CMMI) for Services or ISO/IEC 15504. Both CMMI and ISO/IEC 15504 use a numeric ranking system of 0–5 to measure process maturity, and the ways the two standards label the maturity levels are also very similar. However, there is a cost consideration involved: CMMI can be downloaded free from the Software Engineering Institute (SEI), while the ISO/IEC 15504 documents need to be purchased from ISO. If you have access to ISO/IEC 15504 at your organization, go for it. If not, download and use CMMI from SEI. At the end of the day, I believe either maturity model definition will do just fine for DIY assessments.

In addition to the maturity levels, you will need a list of best practices or control objectives for each process. The best practices or control objectives contain the finer details that show what a mature process will look like. Take the Problem Management (PM) process for example; a mature PM process should have something like…

  1. A procedure to identify, record, classify, and track problems
  2. A procedure to perform root cause analysis and closure for a problem
  3. A mandate that problems requiring changes to configuration items for their resolution must be handled via the Change Management process
  4. A Known Error repository for situations where the root cause has been identified but the problem cannot be fully resolved for legitimate reasons
  5. A procedure to monitor and measure the effectiveness of the PM process
  6. A well-defined set of interfacing procedures or protocols between PM and other related processes such as Incident, Service Request, and Change Management
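A control-objective checklist like the one above can be turned into a crude scoring aid. The sketch below is illustrative only: the criteria are abbreviated, the ratings are invented, and the level labels loosely follow the ISO/IEC 15504-style 0–5 capability scale. It rates each criterion and reports the lowest rating as the limiting factor, on the reasoning that a process is only as mature as its weakest control.

```python
# Rate each Problem Management control objective 0-5 and summarize.
# Ratings are invented; labels loosely follow the ISO/IEC 15504 scale.

LEVELS = ["Incomplete", "Performed", "Managed", "Established",
          "Predictable", "Optimizing"]

pm_ratings = {
    "identify/record/classify/track problems": 3,
    "root cause analysis and closure": 2,
    "resolution changes go through Change Management": 3,
    "known error repository": 1,
    "monitor and measure PM effectiveness": 1,
    "interfaces to Incident/Request/Change": 2,
}

def summarize(ratings):
    """Report the weakest control objective as the limiting maturity."""
    weakest = min(ratings.values())
    return weakest, LEVELS[weakest]

level, label = summarize(pm_ratings)
print(f"Limiting maturity: level {level} ({label})")
```

Whether you take the minimum, an average, or something weighted is a design choice for your own model; the minimum is deliberately conservative and tends to point straight at the improvement targets.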

To come up with a list of best practices or control objectives, I suggest researching, distilling, and integrating information from the following sources…

  1. The ITIL Core Documentation.
  2. COBIT 4.1: For our PM example, process DS10 (Manage Problems) has control objectives which can be used for assessing the PM process. COBIT also contains a number of control objectives that can be mapped to other ITIL processes as well. COBIT can be downloaded from ISACA (registration required).
  3. CMMI for Services, Version 1.3: The section “Incident Resolution and Prevention” on page 171 contains some worthwhile ideas on what maturity level 3 will look like for Problem Management. CMMI for Services can be downloaded from SEI.
  4. ISO/IEC 20000-1 and 20000-2: ISO 20K and ITIL v3 share a number of similarities. For example, section 8.2 of ISO 20000 deals specifically with Problem Management and spells out details on what constitutes compliance with the ISO standard. I would suggest purchasing the two ISO 20000 documents, with the latest updates from 2011 and 2012 if possible. ISO/IEC standards can be purchased from ISO or ANSI.

Combining the four sources, ITIL, COBIT, CMMI, and ISO 20000, you should be able to come up with a list of assessment criteria for a particular process. Coming up with that list is probably one of the most time-consuming exercises of the process assessment. This is where external consulting can help or come into the picture: the consultants will likely already have those assessment criteria on hand and ready to go. Since we are talking about the DIY approach, building a robust list of assessment criteria will be well worth the one-time start-up effort. Afterwards, the assessment criteria can be fine-tuned and reused for subsequent assessments.

Other Supplementary Materials: I suggest obtaining the following documents because they contain valuable information that can help you immensely as you sort through various documents to formulate your assessment model.

  1. Aligning COBIT® 4.1, ITIL® V3 and ISO/IEC 27002 for Business Benefit: This document provides a detailed mapping of COBIT 4.1 processes to ITIL V3 and vice-versa.
  2. COBIT Mapping: Mapping of ISO/IEC 20000 With COBIT 4.1: Another helpful mapping document, but available to ISACA members only. Non-members can purchase the e-book from ISACA.

With COBIT 5’s planned release in April 2012, we will see some changes in how COBIT presents the process assessment and control objective information. If you need to start planning the process assessment effort for your organization, there is no need to wait for COBIT 5, in my opinion. Between ITIL V3, COBIT 4.1, CMMI Version 1.3, and ISO 20000, you have more than enough pieces to put together your own maturity assessment model picture. In future posts, we will go into more assessment how-to’s such as surveying the stakeholders, analyzing the results, and presenting the findings.

SACM and CMDB Tools – Strategy and Roadmap Example

Partly due to my track chair duties with the Fusion 12 conference, today’s post will be a bit light. I was going to leave the SACM and CMDB posts as they were, but I went ahead and put together a strategy and roadmap example deck. You can use the information presented in this deck as a starting point to formulate your own CMDB strategy and roadmap.


As we discussed in the previous posts, the SACM process can enable and affect a number of other ITSM processes, so planning for it can seem overwhelming at times. My recommendation is to initially start with a well-controlled scope and focus on the value-added activities that can bring the quick wins. As usual, please feel free to comment or discuss anything directly with me if you like.

Fresh Links Sundae – March 18, 2012 Edition

Fresh Links Sundae encapsulates some pieces of information I have come across during the past week. They may be ITSM related, or not entirely. Often they are from people whose work I admire, and I hope you will find something of value.

David Ratcliffe discussed what he saw as IT 2.0 coming of age – what risks to understand and what opportunities to seize for the IT leaders and professionals. The Reason Doris Day Is Not An IT Leader (PinkPresident)

Stephen Mann discussed how awareness for ITIL has gone beyond the I&O professional group and suggested ways to leverage the “outside” awareness to the advantage of the I&O professionals. The Cult Of ITIL: It Has More Followers Than You Think (Forrester Blogs)

Jeff Wayman outlined five important activities that IT can do to proactively manage the cloud computing and outsourcing trends. Combating the Modern Outsourcing of IT (ITSM Lens)

Aprill Allen discussed how to proactively do knowledge management so everyone in the organization benefits while reducing risks. Hoarding for headcount (Knowledge Bird)

Eric Feldman outlined steps to combat “zombie” services or applications that take up resources with little or no return in value. The Looming Zombie Apocalypse in your Data Center (CA on Service Management)

Simon Morris analyzed the different approaches to supporting VIPs in an organization and suggested ways to improve the overall user experience. How to Provide Support for VIPs (The ITSM Review)

Evolven presented outlooks on IT spending in 2012 from various IT analysts such as Gartner, IDC, Forrester, etc. 2012 Predictions on IT Spending from Experts and Industry Analysts (Evolven Blog)

McKinsey Quarterly presented a list of winning entries from the “The Beyond Bureaucracy Challenge” contest that it recently ran. Listening to employees: The ‘Beyond Bureaucracy’ winners (McKinsey Quarterly)

One particular article from McKinsey’s report caught my attention. Managing Beyond the Organizational Hierarchy with Communities and Social Networks at Electronic Arts (Management Innovation eXchange)

Lastly, and certainly not least, we present the problem management practice in its most advanced form for your weekly enlightenment. Jedi Problem Management (Real ITSM)


SACM and CMDB Tools – Implementation Considerations – Part 4

This post is part four (and the concluding part) of a series where we discuss the planning and implementation of a SACM and CMDB solution. Previously, in Part One, we discussed the fundamental considerations that should go into deciding whether to implement a CMDB solution for your organization. Part Two and Part Three discussed some of the planning considerations that will impact the quality of the CMDB solution. In this post, I will wrap it all up with some additional suggestions on the implementation approach.

Coming up with a CMDB implementation approach will be highly organization-dependent. If you need something to start the planning process, I would suggest examining the following high-level approach and refining it with more detailed steps or activities.

  1. Do the homework. Examine the planning factors we discussed in the previous posts. Have a clear idea of what the organization hopes to accomplish with the CMDB data. Refine the scope. Determine how much data is really enough by balancing the information need/availability with the resources and effort needed. Depending on the size of the effort and your organization’s requirements, you may or may not need to formalize this as a funded project.
  2. Line up the right people for the CMDB effort. Explicitly funded or not, implementing CMDB should be treated as a formal project with activities and resources clearly identified and planned. Get the right people involved at the various stages. Help the people acquire the necessary CMDB knowledge before diving into the work.
  3. Work on the requirements and translate them into data model designs. Collaboration with other teams is the key to success here. Don’t do the design in isolation. Work with your Enterprise Architecture team or person to collaborate on the design. The data model needs to have the necessary details so that a number of things, such as processes and tools, can be designed around it.
  4. Select the tools that can handle your design or come very close to it. You will use the tools to construct your CMDB and figure out what support processes your team will need to maintain the CMDB. The tools will also help you populate the data into the CMDB the very first time and facilitate the on-going additions and changes.
  5. Towards the end of the implementation effort, you will need to train your CMDB administrators and the users who will interact with the tools. Once the tools go live, you will need to start gathering usage statistics and measuring the effectiveness of the tools and processes. Periodically check the results and validate your assumptions about how the tools and processes should work or behave. Look for risks to mitigate and opportunities to extend the usefulness of the CMDB to support other processes and business activities.
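The data-model point in step 3 can be made concrete with a toy example. The sketch below is not a real CMDB schema: the CI names, types, and relationship vocabulary are illustrative assumptions. It stores configuration items with attributes and typed relationships, then answers a simple one-hop impact question.

```python
from collections import defaultdict

# A toy CMDB: configuration items with attributes and typed relationships.
# CI names, types, and the relationship vocabulary are illustrative only.

class Cmdb:
    def __init__(self):
        self.items = {}                # CI name -> attribute dict
        self.rels = defaultdict(list)  # CI name -> [(relation, target CI)]

    def add_ci(self, name, **attrs):
        self.items[name] = attrs

    def relate(self, source, relation, target):
        self.rels[source].append((relation, target))

    def dependents(self, target):
        """CIs that directly 'depend_on' the given CI (one hop only)."""
        return sorted(src for src, rels in self.rels.items()
                      if ("depends_on", target) in rels)

cmdb = Cmdb()
cmdb.add_ci("web-server-01", type="server", os="linux")
cmdb.add_ci("order-app", type="application", owner="app team")
cmdb.add_ci("order-service", type="service")
cmdb.relate("order-app", "depends_on", "web-server-01")
cmdb.relate("order-service", "depends_on", "order-app")

print(cmdb.dependents("web-server-01"))  # which CIs sit directly on this server?
```

Even this toy shows why the data model deserves early collaboration: decisions about CI granularity, attribute sets, and the relationship vocabulary determine which questions (impact, dependency, ownership) the CMDB can later answer.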

In addition, I would recommend another book, “Step-by-Step Guide to Building a CMDB” (ISBN-13: 978-0977811939), for implementation and reference purposes. The book goes into many details of planning and implementing a CMDB solution. I have outlined its high-level steps below, and the book is something to consider.

Stage 1: Assemble the Project Team and Define the Project

  • Step 1: Assemble Project Team
  • Step 2: Obtain CMDB Knowledge
  • Step 3: Create and Agree on CMDB Goals and Mission Statement
  • Step 4: Review and Define Benefits
  • Step 5: Build a Business Case

Stage 2: Define Requirements and Create IT Service Model Blueprint

  • Step 6: Identify & Review Governance Requirements
  • Step 7: Review and Select Supporting Best Practices
  • Step 8: Identify Requirements to Address Potential Problems
  • Step 9: Identify Inventory & Asset Requirements
  • Step 10: Define Service Catalog Requirements
  • Step 11: Define CMDB Requirements to Support Other Processes
  • Step 12: Define Configuration Item Level & IT Service Model
  • Step 13: Define Configuration Item Relationships
  • Step 14: Define Configuration Item Attributes
  • Step 15: Design IT Service Model Blueprint

Stage 3: Select CMDB Solution and Tools

  • Step 16: Select CMDB Solution
  • Step 17: Plan the CMDB Population
  • Step 18: Select Tools to Automate CMDB Population
  • Step 19: Calculate Project ROI

Stage 4: Construct and Maintain Your CMDB

  • Step 20: Construct Your CMDB
  • Step 21: Create Configuration Item Lifecycle Management Processes
  • Step 22: Build Support Processes
  • Step 23: Populate Your CMDB
  • Step 24: Train the CMDB Team and Users

Stage 5: Driving Ongoing Value

  • Step 25: Implement Measures and Metrics
  • Step 26: Create a Continual Service Improvement Program


Fresh Links Sundae – March 11, 2012 Edition

Fresh Links Sundae encapsulates some pieces of information I have come across during the past week. They may be ITSM related, or not entirely. Often they are from people whose work I admire, and I hope you will find something of value.

Robert Stroud commented on a recently released white paper on an ITIL study from APMG and asked the question. Is ITIL still delivering the value? What the data tells us (The post also contains the link to download the study whitepaper) (CA on Service Management)

Jeff Wayman at ITSM Lens discussed and highlighted the fact that an ITSM initiative is very much a people endeavor, with the process and tools, while still important, playing more of a secondary role. 10 Reasons Why Your ITIL Implementation Will Fail (ITSM Lens)

Troy DuMoulin authored an excellent whitepaper on “Top 10 Considerations For Successful ITSM Programs” Check out Pink Elephant’s PinkLINK newsletter to download the whitepaper. (Pink Elephant)

Stephen Mann talked about mobile device management and how it relates to his message of “Support the people not the technology.” Enabling Customer Mobility: Why Current Mobile Device Management Thinking Is Flawed (Stephen Mann’s Blog)

Bob Sutton gave an example of when a good boss would intervene when something goes wrong. FUBAR, SNAFU, Fast Company, and Good Bosses (Bob Sutton)

George Colony, Forrester CEO, wrote about his observation from attending the World Economic Forum in Davos, Switzerland. 14 Things I Learned At Davos 2012 (The Counterintuitive CEO)

Finally, nine links in one post and full of experts’ opinions on what’s to come. I listed three just for fun, and you can find more in the post. Big Data Predictions and Opportunities for 2012 from the Experts (Evolven Blog)
