Please refer to the University’s Glossary of Terms for policies and procedures. The terms and definitions identified below are specific to these procedures and are critical to their effectiveness:
Benchmarking: a quality process used to evaluate performance by comparing institutional practices to identified good practices across the sector. Benchmarking can be undertaken collaboratively with another department or institution, as a desktop exercise using publicly available information about another organisation, or against external standards.
1. Purpose of procedures
1.1 This procedure provides a method for Benchmarking, including desktop benchmarking and benchmarking conducted in partnership with another institution.
2. Principles
2.1 The following principles apply in the conduct of Benchmarking activities:
|Principle|Description|
|---|---|
|Effective|Enables the benchmarking of university activities against external reference points or other higher education providers. Supports both the quality enhancement and quality assurance of programs, courses and university operations.|
|Efficient and sustainable|Provides a streamlined, efficient and sustainable process for Benchmarking that can be operationalised and used routinely by USC and partner institutions.|
|Transparent|Engages multiple perspectives and facilitates critical discussion among USC staff and with staff from other institutions.|
|Capacity building|Contributes to the professional development of participating staff and the formation of networks and contacts with staff from other institutions.|
3. The Benchmarking Process
3.1 Phase 1: Concept and Scope
(a) This phase is about deciding what type of benchmarking to use, how to undertake it and with whom. The first step is to identify and document:
- the purpose of the benchmarking activity, in terms of the intended benefits or improvements
- the proposed leader and benchmarking team members for USC
- whether the benchmarking will be undertaken collaboratively or as a desktop exercise
- the proposed partner(s) for collaborative benchmarking; or the relevant institution(s) or standards to be used for desktop benchmarking
- the proposed timescale
- to whom reports will be made regarding progress and outcomes
(b) To encourage collective buy-in for the project, ensure that all proposed USC benchmarking team members are included in the Phase 1 discussions.
(c) Identify potential benchmarking partners by considering:
- those institutions which are most comparable to USC in terms of size, profile, and any other relevant factors
- those institutions which are perceived as having good practices or which achieve at a higher level in the area identified for benchmarking
(d) Obtain approval from the Head of School or relevant line manager to begin informal discussions with contacts from potential partner organisations.
(e) Establish a shared commitment and understanding of the benchmarking goals with proposed partners, ensuring that each partner will gain some benefit from the project.
(f) Make a formal approach to potential partners, either by email to senior staff or via the Peer Review Portal.
3.2 Phase 2: Plan and Design
(a) In this phase, the benchmarking project is planned in detail, with agreement reached regarding:
- the project team, with members from all partner organisations
- which organisation will lead the project
- the project scope (including what is not in scope)
- whether the Peer Review Portal will be used to facilitate the project
- use of a Memorandum of Understanding to be signed by all partner organisations
- several specific questions to investigate
- a schedule of action (including timelines and who will do what)
- how performance will be measured (performance indicators) and what data/information is required
- a reference group which will be consulted at each organisation
- a self-review meeting at each organisation to review the data
- a peer review workshop to discuss the findings with partner organisations
- the deliverables (e.g. a 2-page benchmarking report including recommendations, a 1-page action plan to implement those recommendations, and a 2-page progress report delivered within 9 months of the initial report)
- a communication plan, with the above reports provided to team members, the portfolio manager and any relevant university committees
(b) Arrange for the project team to meet (in person or via videoconference) to develop an understanding of how each organisation operates, including the resources used, processes employed and variations between the organisations.
(c) Develop performance indicators in the form of statements against which performance can be measured.
- Performance indicators can include inputs and outcomes, and indicators for what good performance or practice looks like.
- One performance indicator is developed for each issue or question identified.
- Ensure that agreement is reached between the partner organisations on what evidence needs to be collected against each indicator (e.g. policies, procedures, documented strategies, monitoring or review reports, survey findings and statistical data).
3.3 Phase 3: Collect and Analyse Data
(a) Collect the relevant evidence, ensuring that people in the reference group are consulted appropriately.
(b) Ensure that all members of the USC project team meet to identify everything that contributes to the current performance outcomes (whether positive, negative or neutral in their impact).
(c) Make an assessment of achievement against each performance indicator. This could be undertaken using a four-point or five-point scale:
- A four-point scale, with achievement levels of: Basic, Satisfactory, Good, Excellent
- A five-point scale, with achievement levels of: Not at all, Somewhat, Adequately, Very well, Completely
3.4 Phase 4: Peer review
(a) Hold a peer review workshop with all of the partner organisations to compare processes and outcomes, with the aim of identifying:
- areas of good practice
- weaknesses and gaps in performance or process
- reasons for differences between the partners
- areas for sharing and collaboration
(b) Ensure that:
- each member of the project team has prior permission to use and share information from their institution
- other institutions’ information is not disclosed beyond the project team
(c) This may be the final stage of the collaborative benchmarking process, with each institution independently developing a report that outlines the key findings for its own institution.
(d) Alternatively, the project team may choose to rate and score each institution against the whole set of indicators and criteria, and to develop a report accordingly.
3.5 Phase 5: Communicate and implement improvements
(a) Develop a brief report which:
- outlines the findings of importance and relevance to USC
- identifies the priority areas for USC to address – including weaknesses, gaps and areas of strength to build upon
- includes recommendations for action
- does not include any information designated as confidential that was provided by any benchmarking partner
(b) Develop a brief action plan to address each recommendation in the report. The plan and actions must be capable of being implemented in a reasonable timeframe.
- Ensure that the plan identifies specific actions, who is responsible for each action and the target dates for completion of each action.
- The plan should be supported by the benchmarking team and all those who have specific responsibilities within the plan.
- Submit the plan to the relevant portfolio manager (e.g. HOS or Director) or university committee to gain approval.
(c) Wherever possible, include the action plan or components of the plan in operational plans for the various areas of the University that have responsibilities for specific actions.
3.6 Phase 6: Evaluate and review
(a) Meet to discuss and evaluate the effectiveness of the benchmarking exercise, including how the process could be improved.
4. Reporting
4.1 An annual report on benchmarking activity will be provided by the Office of the Pro Vice-Chancellor (International and Quality) to the Academic Board.
5. Records management
5.1 Schools are responsible for creating detailed records of all benchmarking undertaken, including the method used, comparators employed, conclusions drawn, and the outcomes or improvements made.
5.2 Schools are responsible for ensuring all records are captured on the relevant program file in an approved records management system, in accordance with the University’s Information Management Framework – Governing Policy.