How the assessment works
- Scope and planning: define the services, practices, timelines, and interview list.
- Document review: examine policies, processes, templates, reports, metrics, records, and tool configuration.
- Interviews and evidence gathering: speak with stakeholders and compare what’s documented with what’s happening in practice.
- Scoring and analysis: convert findings into maturity results across the relevant ITSM areas.
- Report and recommendations: deliver findings, quick wins, and an improvement path.
For a closer look at the model itself, see the ITIL maturity model levels, assessments, and how it works. And to understand why many organisations schedule assessments at the start of the year, see why your financial year should start with an ITIL maturity assessment.
The ITIL maturity model levels
The ITIL maturity model describes five stages of practice capability and organisational maturity. It’s used to show how reliably services are delivered, measured, and improved over time.
- Level 1 (Initial or ad hoc): work is informal, uneven, and often dependent on individual effort.
- Level 2 (Repeatable): a basic set of activities exists, and the practice can be followed more consistently.
- Level 3 (Defined and documented): the process has shape, language, and structure, with integrated inputs from other practices.
- Level 4 (Managed and measured): performance is monitored, reviewed, and managed within the wider service management system.
- Level 5 (Optimised and continually improving): improvement becomes part of the operating habit, not a one-off project.
What you receive after the assessment
A good assessment leaves more than a score on the table. You’ll receive:
- A maturity score across key ITSM capabilities.
- A detailed assessment report with current findings.
- Recommendations that point to practical improvement.
- A roadmap for strengthening IT service management over time (see example report).
Whether your organisation is in Australia, New Zealand, or the UK, your Word-format report also includes:
- Assessment method and scope
- Graphical representations of findings
- Per-practice conclusions
- SVS component results (where applicable)
Optional add-on: Service Improvement Roadmap workshop. Note that the report itself provides findings and recommendations; the roadmap is a separate workshop that turns those recommendations into a detailed improvement plan.
Assessment options (choose what fits)
Option 1: Comprehensive Capability Assessment (SVS + practices)
- Includes all SVS components: Guiding Principles, Governance, Service Value Chain, Practices, Continual Improvement, plus at least 7 practices (Continual Improvement must be included).
- Best for: organisations wanting a full SVS maturity view and a strong baseline for transformation
- Output: detailed report with per-practice findings, SVS component scoring, quick wins, and recommendations
- Optional: independent validation and Maturity Level certificate (PeopleCert audit)
Option 2: Selected Practices Capability Assessment (practices only)
- Assesses the capability of the practices you select (any number).
- Best for: targeted uplift (e.g., service desk, change, incident, knowledge)
- Output: per-practice capability scoring, findings, and recommendations
- Example practice scope: Incident Management; Problem Management; Change Enablement; Knowledge Management; Asset Management; Service Catalogue Management; Service Desk; Service Request Management; Continual Improvement
Why does improving ITIL maturity matter?
As maturity rises, service management becomes more deliberate and predictable, and IT services align more closely with organisational goals.
In practice, that can mean:
- More consistent service delivery.
- Faster incident resolution.
- Better governance and accountability.
- Clearer alignment between IT and business priorities.
FAQ
What’s the difference between a capability assessment and a maturity assessment?
Capability assesses how well specific ITSM practices achieve their purpose. Maturity assesses the broader Service Value System (governance + management system) and produces the maturity rating used for improvement planning.
Do we need to assess all practices?
No. With the Selected Practices Capability Assessment you can choose any number of practices; the Comprehensive Capability Assessment requires at least 7, and Continual Improvement must be one of them.
Who should be involved from our side?
What evidence do you need?
Policies, processes, templates, metrics/reports, tool configuration, records (incidents/changes/requests), meeting minutes, audit outputs, whatever is relevant and permissible. If documents can’t be shared, we can review them via screen share.
Is the assessment done onsite or remotely?
How long does it take?
A typical engagement runs about 21–23 days in total:
- Documentation review: 5 days
- Interviews & evidence: 5 days
- Analysis & report writing: 10–12 days
- Final review & submission: 1 day
