"Our performance indicators stem directly from […] strategic plans and statements and do not operate separately and in parallel."
Alexis Cornish, director of planning and deputy secretary, The University of Edinburgh
Formulating and maintaining key performance indicators for your institution, based on the guidance outlined in the previous section, should contribute positively to your strategic activity in several regards. Of the outcomes suggested earlier that strategic activity should deliver, the successful use of KPIs should directly contribute to the following:
- A means for ensuring local (ie, faculty, departmental, team and/or individual) planning is consistent with these high-level strategic objectives and will contribute positively to achieving them
- A means for ensuring that overall progress towards the agreed objectives can be measured
- A means for ensuring that the right people within the institution are kept up to date with progress.
The strengths of KPIs lie chiefly in measuring progress and in communicating and embedding strategic objectives throughout the institution.
Communication and embedding
"Consider the nature of your institution, the progress you have made down the ‘strategic vision’ path, and any possible technical issues, when selecting the consultation method(s) you will use."
Alexis Cornish, director of planning and deputy secretary, The University of Edinburgh
A focused and clearly articulated set of KPIs will help make the strategic objectives of the institution more tangible: it gives them form and makes them recognisable, relevant and understandable to staff. As such they can start to influence people’s behaviour and shape their actions in ways which contribute positively to achieving the stated aims.
Naturally, it is important that the targets strike an appropriate balance between ambition and achievability. Assuming that balance is struck, staff should be able to trace a clear connection between their own work and the successful realisation of targets that seem (and indeed are) real and achievable, as opposed to remote and theoretical.
By ensuring that your KPIs are measurable, and by regularly comparing progress against baseline performance data (either your own or your competitors'), your institution can accurately chart its progress and communicate it to all relevant stakeholders.
Key performance indicators lend themselves well to being summarised in various ways to illustrate progress, including graphically by means of a ‘traffic light system’, where red, amber and green indicate, respectively, indicators that are missing their target, showing warning signs of missing it, or proceeding according to plan.
Some institutions advocate the inclusion of a fourth level (for example: red, amber, green, gold; or: red, amber, amber-green, green), the benefit being that this discourages people from using the middle ‘amber’ setting as a default.
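To make this concrete, here is a minimal sketch of how a four-level rating might be derived from a KPI's current value and its target. The thresholds, function name and example figures are illustrative assumptions, not drawn from the source, and the scheme assumes a ‘higher is better’ indicator.

```python
# Minimal illustrative sketch: deriving a four-level traffic-light rating
# from a KPI's current value and its target. The thresholds below are
# hypothetical examples, and the scheme assumes 'higher is better'.

def rag_status(actual: float, target: float) -> str:
    """Return a four-level rating (red, amber, amber-green, green)."""
    progress = actual / target  # proportion of the target achieved
    if progress >= 1.0:
        return "green"        # on or ahead of plan
    if progress >= 0.9:
        return "amber-green"  # broadly on track, minor shortfall
    if progress >= 0.75:
        return "amber"        # warning signs of missing the target
    return "red"              # missing the target

# Example: a KPI targeting 500 postgraduate enrolments, currently at 460
print(rag_status(460, 500))  # -> "amber-green"
```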
But variations on the traffic light system are only one means of visualising the data being produced. Some institutions are now using technology to help senior managers and others interpret the data by means of KPI dashboards. These enable the user to combine KPI data with other relevant information and statistics about the institution’s performance, and to ‘slice and dice’ this data as required to provide multiple views of the institution or to answer specific questions.
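By way of illustration, the following sketch shows the kind of ‘slice and dice’ a dashboard performs behind the scenes, here using pandas; the dataset, column names and figures are entirely hypothetical.

```python
# Illustrative sketch of 'slicing and dicing' KPI data; all figures and
# column names are hypothetical.
import pandas as pd

kpi_data = pd.DataFrame({
    "kpi":     ["student_satisfaction"] * 4,
    "faculty": ["Arts", "Arts", "Science", "Science"],
    "year":    [2004, 2005, 2004, 2005],
    "value":   [78, 81, 83, 80],
    "target":  [80, 80, 80, 80],
})

# One view: variance against target, per faculty and year
kpi_data["variance"] = kpi_data["value"] - kpi_data["target"]
print(kpi_data.pivot_table(index="faculty", columns="year", values="variance"))

# Another view of the same data: institution-wide average per year
print(kpi_data.groupby("year")["value"].mean())
```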
An overview of one approach to producing a KPI dashboard can be found at the University of Exeter.
Alternatively (or in addition) some institutions use a balanced scorecard approach to tie their KPIs to their organisation’s strategy.
Such ‘at a glance’ summaries can then be made available to specific stakeholders who have an obligation to monitor the performance of the institution, such as governors, as well as to the institution at large.
The value of monitoring KPIs to the work of the board of governors is evident from the survey results generated as part of the Committee of University Chairs (CUC) project on the implementation of key performance indicators. The survey suggested that 35% of governing bodies reviewed performance data at each meeting, with 27% reviewing annually and 14% doing so twice a year.
Balanced scorecard - identifying the learning and growth, business process, customer and financial perspectives
The balanced scorecard is a strategic management approach developed in the early 1990s by Dr Robert Kaplan of Harvard Business School and Dr David Norton.
Drs Kaplan and Norton describe the approach:
"The balanced scorecard retains traditional financial measures. But financial measures tell the story of past events, an adequate story for industrial age companies for which investments in long-term capabilities and customer relationships were not critical for success. These financial measures are inadequate, however, for guiding and evaluating the journey that information age companies must make to create future value through investment in customers, suppliers, employees, processes, technology, and innovation."
The balanced scorecard identifies four perspectives from which to view an organisation. These are:
- The learning and growth perspective
- The business process perspective
- The customer perspective
- The financial perspective
Further information on the balanced scorecard can be found at The Balanced Scorecard Institute.
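As a simple illustration of how the four perspectives might frame an institution's KPIs, here is a minimal sketch; the example indicators are hypothetical and not taken from the source.

```python
# Minimal sketch: grouping KPIs under the four balanced scorecard
# perspectives. The example indicators are hypothetical.
from dataclasses import dataclass

@dataclass
class Perspective:
    name: str
    kpis: list[str]

scorecard = [
    Perspective("Learning and growth", ["staff development days per FTE"]),
    Perspective("Business process", ["average time to process applications"]),
    Perspective("Customer", ["student satisfaction score"]),
    Perspective("Financial", ["surplus as a percentage of turnover"]),
]

for p in scorecard:
    print(f"{p.name}: {', '.join(p.kpis)}")
```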
Liverpool John Moores University, in collaboration with Oracle, is developing an executive dashboard - a strategic management tool based on the balanced scorecard approach. John Townsend of the university reported in a paper to EUNIS 2005 that:
"The balanced scorecard approach and the associated Oracle technology provide a very powerful tool for managing change – and force you to ask hard questions with inevitable consequences: “Where are we going?…How will we know when we get there?… How will we get there?… Are we there yet? … How can we tell where we are now?… The data must be accurate and meaningful… Who gets the data?… Change management… means changes in the way we work… but this doesn’t happen of its own accord and there is a major communication and marketing effort to go with any such initiative…
This change is not a foregone conclusion… What about people?"
Lastly, it is possible to extend the KPI framework by cascading it down to as many levels, and in as much detail, as desired – all within a consistent and coherent framework. Supporting sets of KPIs can be defined at a local level to measure performance and to help co-ordinate departmental activity with the strategic objectives of the institution as a whole.
For example, an institution may have identified a 25% reduction in CO2 emissions by 2010 as one of its environmental KPIs. This may then be further reflected and elaborated upon within individual departments as they articulate the changes they would need to make at a local level to enable the organisation as a whole to meet this target.
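A minimal sketch of how such a cascade might work arithmetically is shown below; the departmental baseline figures are hypothetical, and the scheme (an equal proportional cut for every department) is only the simplest of several possible apportionments.

```python
# Illustrative sketch: cascading an institution-wide KPI (a 25% cut in CO2
# emissions) into departmental targets. Baseline figures are hypothetical.

REDUCTION = 0.25  # institution-wide target: 25% cut against baseline

baseline_tonnes = {
    "Estates": 12000,
    "Science laboratories": 8000,
    "IT services": 3000,
}

# Simplest scheme: every department inherits the same proportional cut,
# so the departmental targets sum exactly to the institutional target.
targets = {dept: tonnes * (1 - REDUCTION) for dept, tonnes in baseline_tonnes.items()}

for dept, target in targets.items():
    saving = baseline_tonnes[dept] - target
    print(f"{dept}: reduce to {target:.0f} tonnes (a saving of {saving:.0f})")
```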
KPIs: key questions to address
In summary, institutions may find it useful to ask themselves the following questions in relation to the use of KPIs as part of their strategic planning processes:
- What questions are you hoping to answer through your KPIs?
- Do these questions link directly to the strategic objectives outlined in your strategic plan?
- Are you collecting the data required to answer these questions, or focusing simply on the data you know you can easily collect?
- Are you collecting data unnecessarily?
- Have you consulted widely when selecting your KPIs?
- Have your KPIs been approved by senior management and the board of governors?
- How and how regularly are you going to monitor progress against your KPIs?
- Have you assigned appropriate owners for each KPI and are they aware of their responsibilities in this regard?
- Have you considered a mechanism for reviewing the KPIs themselves and changing them, removing them or adding new ones?
- Have you selected a means of displaying progress against the KPIs that suits the needs of all relevant stakeholders?
- Have you considered what other data streams (derived from both within and outside the institution) you may wish to integrate with your KPI data?
- Are you collecting data in a format which may make it possible to integrate with other data streams?
- Are you comfortable with the idea of publicly flagging ‘red’ areas (ie, areas where you appear to be failing to meet agreed KPIs)?
- Have you considered what happens when areas are reported as being at ‘amber’ or ‘red’? For example, are there links between this reporting process and your institution’s issue and risk reporting procedures?