The Benchmarking in European Higher Education project has underlined that every higher education institution will have a different starting point, depending inter alia on:
- Institutional profile
- Institutional capacity
- Organisational climate
- Focus on improvement
- Willingness to change
- Available resources
- Available data
- Degree of autonomy (of institution within sector or units within institution)
The HESA-commissioned report from PA Consulting, ‘International benchmarking in UK higher education’, identifies maturity as a key concept in international benchmarking in higher education. Maturity is also a key concept in our business intelligence guide, where it is used by institutions to assess their current position and measure progress.
In terms of the findings from the HESA benchmarking project, it is clear that:
- Each institution has varying levels of understanding of, and gives varying priority to, benchmarking (often linked to mission and strategy)
- Each institution has varying capacity and capability to undertake benchmarking (often a factor of size)
- Within institutions, specific areas may be more or less advanced in benchmarking
A maturity framework for benchmarking can be drawn from the success factors and case studies shown within this infoKit. It provides a useful means of self-appraisal, as well as a route for enhancing the competence and capability needed to benefit from benchmarking. The framework addresses leadership and governance, alignment with corporate strategy, resources, comparator groups, types of benchmarking, technology and source data.
| Dimension | Level 1 | Level 2 | Level 3 |
| --- | --- | --- | --- |
| Leadership and governance | Benchmarking undertaken by specialists/analysts. Results viewed by individual enthusiasts at middle management level. No appreciation of value of benchmarking at more senior levels. | Individual senior managers are advocates of benchmarking and may promote use of the technique within their departments on an ad hoc basis. | Senior management team members are advocates of benchmarking and review outcomes of benchmarking in setting strategic objectives. Benchmarking analyses part of suite of information routinely shared with head of institution and senior management team. |
| Alignment with corporate strategy | Results of benchmarking reviewed by individuals in particular departments with no explicit relationship with corporate or departmental strategy. Analyses undertaken on ad hoc basis to investigate particular issues. | Benchmarking analyses used as context for departmental strategy but not on a systematic basis. No explicit link with overall corporate strategy. | All benchmarking activity fully aligned with corporate strategy across institution. Development of corporate strategy informed where relevant by benchmarking analyses. |
| Resources | Lone individuals undertaking benchmarking when time and resources permit. | Resources made available to support benchmarking but on an infrequent and ad hoc basis. | Benchmarking considered an intrinsic part of strategic planning with staffing and resources allocated appropriately. |
| Comparator groups (external benchmarking) | Mission groups or other historic groups used for all benchmarking analyses. | Different comparator groups used for different types of benchmarking, but composition not updated regularly using latest evidence. | Different comparator groups for different types of benchmarking selected through evidence. Groups updated at a frequency that aligns with the strategic planning cycle of the institution. Selection designed to include correct level of challenge and diversity of type. Aspirational as well as current comparator groups. |
| Types of benchmarking used | Simple comparisons based on metric benchmarking (internal or external). Provides more focused questions for further exploration. | More sophisticated comparisons using metric benchmarking with diagnostic approaches to provide depth to the analysis. Targets areas for further investigation but also starts to generate insights and context. | Sophisticated metric benchmarking, perhaps using techniques to normalise data by institutional characteristics. Diagnostic techniques used to focus and shape subsequent collaborative process benchmarking. Provides valuable and in-depth insights and gives practical depth to identification of good practice. |
| Technology | Any benchmarking comparisons undertaken largely through use of standard spreadsheet applications. | Use of specialist software applications providing more sophisticated benchmarking/dashboard functionality. | Use of an enterprise-wide business intelligence system based on a comprehensive data warehouse application. |
| Source data | Source data compiled on a one-off and ad hoc basis. Questionable quality and comparability of data. Localised sources of data held within departments that are not accessible to all staff and are not trusted across the institution. | Data feeds taken from good quality sources at regular intervals. Comparability ensured to a high degree. Visibility and sharing of data between departments. Increasing trust in the data arising from developing consistency and transparency of data gathering processes. | Quality-assured internal and external data maintained as a central institutional resource. Integrated and coordinated approach to data gathering and update promoting timely and consistent data – “one version of the truth” that is trusted across the HEI. Adherence to data standards ensuring comparability and stability over time. |
It is important to understand that, although level three in each of the above dimensions represents the highest level of maturity, this may not be attainable, or even desirable, for every institution.
Application of benchmarking must reflect the mission, characteristics and resources of each institution, and so decisions must be taken as to what the desired maturity level is within each dimension, together with the steps that are required to achieve it.