Many institutions make use of key performance indicators (KPIs) as part of their strategic planning activity. KPIs can be defined as:
'...financial and non-financial measures or metrics used to help an organization define and evaluate how successful it is, typically in terms of making progress towards its long-term organizational goals'
One of the key roles of KPIs is to give substance to the high-level aspirations outlined in the organisation’s strategic documents, making them more tangible both to those who must make progress towards them and to those whose job it is to measure this progress.
As such, it is important that the KPIs identified stem directly from these other strategic plans and statements and do not operate separately from, and in parallel to, them. After all, there are innumerable aspects of the institution’s performance that could be measured; unless a clear line can be drawn between a KPI and the strategic objective to which it relates, the value of gathering and analysing the data required to record progress is questionable.
Worse still, it may risk diluting the focus on those KPIs which are directly relevant to helping the institution achieve its goals.
Six “prior conditions” for implementing KPIs
- A proper governance context, including an appropriate separation of roles and a strategic planning process in which governors play a meaningful role
- Clear integration with other key processes, so that the KPIs discussion is seen as assisting, rather than adding another layer to, the existing governors’ agenda (Most obviously, links to strategic planning and risk management)
- Recognition that KPIs will never be “perfect” or “finished”. This is an 85% activity, where governors need a reasonable set of indicators as soon as possible, but these will continue to develop as strategy also develops
- Willingness to be selective. Institutions should not feel any obligation to make the KPIs comprehensive, or to choose everything off the “menu” in the Committee of University Chairs (CUC) guide. An alternative approach is to have KPIs which cover areas of particular concern to governors at any time. These will probably change over time, and the board may adopt a different mechanism for reviewing other areas of university performance
- A link to performance so something really happens (“what gets measured gets done” to quote one chair of a case study university)
- The support and buy-in of the chair and of at least some key governors
Taken from the Committee of University Chairs (CUC) report on the implementation of key performance indicators: case study experience June 2008.
Number and grouping
There is no ideal number of KPIs to aim for. Instead, focus on defining as many as it takes to capture and record progress against the substantive goals articulated in your strategic plans.
In many respects it is better to measure a few key indicators well than to attempt to measure almost everything, which risks diverting resources unnecessarily and losing sight of the most important trends to watch. Focusing on indicators which help answer a specific question, rather than simply including those you know you can measure, should help in this regard.
Most institutions find it useful to group their KPIs under subject headings reflecting the major areas of institutional engagement; in a university, for example, these may be teaching and learning, research, knowledge transfer, finance and so on. Some also find it useful to distinguish between a small number of primary groups, such as those just listed, and a secondary set of supporting indicators relating to areas such as sustainability, internationalisation, and facilities and estates.
Under each of these headings it is commonplace to find somewhere between one and five KPIs, but again the focus should be on completeness and relevance rather than arbitrary targets.
Expressing your KPIs
KPIs can be viewed as specific, measurable, attainable, relevant and timely (SMART) targets, and adhering to these principles will help you obtain the maximum benefit from them.
Each KPI usually comprises multiple elements: the performance indicator itself (ie, the area of activity to be monitored); a statement of how performance is to be measured (eg, an increase in full-time students, improved research assessment exercise (RAE) or Ofsted inspection results, or improvement in National Student Survey or Framework for Excellence performance); and a statement of the target level of achievement (eg, to be ranked within the top 10 research universities, or to achieve a student satisfaction rating above 90%).
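The three-part structure described above can be sketched as a simple record. This is a minimal illustration only; the field names and the example KPI are assumptions for the purpose of the sketch, not taken from the CUC guidance.

```python
from dataclasses import dataclass

@dataclass
class KPI:
    indicator: str  # the area of activity to be monitored
    measure: str    # how performance is to be measured
    target: str     # the target level of achievement

# Hypothetical example KPI, combining the three elements
kpi = KPI(
    indicator="Student satisfaction",
    measure="National Student Survey overall satisfaction score",
    target="Satisfaction rating above 90%",
)
```

Keeping the indicator, measure and target as separate fields makes it easy to spot a KPI that names an area of activity but lacks a measurable target.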
Most commentators agree that if KPIs are to have any real value they must be measurable; otherwise there is no means of establishing whether, and to what degree, you are making progress towards the targets you have set yourself. It is the difference between a KPI which confirms the intention to ‘be in the top 10 institutions in the student satisfaction survey’ and one which simply aims to ‘increase the quality of the student experience’.
However, that is not to say that all KPIs must be quantitative. Capturing and including qualitative information can also be extremely beneficial, providing valuable evidence of progress and achievement in pursuit of strategic objectives; for example, awards or prizes won for teaching achievement can form part of the KPIs illustrating improving teaching quality.
Of course, the measurement of progress is only possible if you have relevant and reliable baseline data in the required areas in the first place, and the ability to generate accurate progress data on an ongoing basis. This reinforces the importance of the principles outlined in the environmental scanning and business intelligence stage.
It is also extremely useful to be able to benchmark your institution’s own performance against that of other institutions – especially those you view as direct competitors. This enables you to review your performance in a more meaningful context and may well put a completely different slant on what the KPIs appear to be saying. For example, a decline in student numbers which, in isolation, may appear entirely negative may be viewed somewhat differently if your rate of decline is far slower than that of all the other institutions against whom you compare your performance.
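The student-numbers example above amounts to comparing year-on-year rates of change rather than absolute figures. A minimal sketch of that comparison follows; all of the numbers are invented for illustration.

```python
def pct_change(previous: float, current: float) -> float:
    """Year-on-year percentage change between two headcounts."""
    return (current - previous) / previous * 100

# Hypothetical headcounts: own institution and two comparators
own = pct_change(10_000, 9_700)      # a 3% decline
peers = [
    pct_change(8_000, 7_400),        # a 7.5% decline
    pct_change(12_000, 10_800),      # a 10% decline
]

# In isolation a 3% decline looks negative; benchmarked against the
# comparator group, it is the slowest rate of decline of the three.
slower_than_all_peers = all(own > p for p in peers)
```

The point of the sketch is that the same KPI figure can tell a very different story once it is expressed relative to a chosen comparator group.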
Benchmarking your performance against that of others, in ways which you choose, can be a far more useful exercise than simply comparing your relative positions in national league tables, as it allows you to focus on the enabling factors and trends relevant to specific agendas which may not otherwise be apparent in national surveys.
External benchmarking may well have a bearing on what KPIs you choose to track and how you choose to express them in order to make your data comparable with that generated by and available for other institutions.
In order to ensure that your KPIs are fit for purpose it is important that your proposed indicators are agreed by those people within the institution who have a direct interest in monitoring the data they generate.
Members of the senior management team and internal auditors should all have a view on which indicators are included, how they are expressed, what metrics will be used to identify success and how this will be measured. Likewise, it is wise not to ignore the views of those members of staff who will be responsible for generating the data, to ensure that collecting it is feasible and sensible and that any practical issues are addressed.
One of the most important groups of stakeholders when it comes to the capture, and especially the monitoring, of KPIs is your board of governors.
Accurate and clearly presented KPIs are now recognised as an essential element of the information which governors require to fulfil their appointed role in monitoring the strategic direction of the institution and assessing its performance against the objectives it has defined.
It is in recognition of this fact that the Committee of University Chairs has produced an invaluable guidance document on the implementation of KPIs, based on the experience of its members and specific case studies. Although the report is HE-specific in its remit, it contains much generic good practice which all institutions would find useful in this area.
'At the heart of the process, governors need to be well-enough informed and independent enough to provide a distinctive view on institutional goals and performance. But they need to do this in a way which is also acknowledged to be useful to the executive management of the university, and which can actually lead to an impact on performance.'
Taken from the Committee of University Chairs (CUC) report on the implementation of key performance indicators: case study experience, June 2008.
Consult with your governors
"Engaging the governing body in the process of strategy development has proven useful; the impact of this will only be seen longitudinally in the next phase of the planning cycle. It is our recommendation to involve governance as early in the process as possible."
Rohan Slaughter, head of technology, Beaumont College