In November 2017, the UK government asked HEFCE (now UKRI / Research England) to introduce a “Knowledge Exchange Framework” [KEF] to measure effective collaboration and knowledge exchange with industry and business. It is designed to complement the already established REF[1] and TEF[2] and to evaluate the contribution universities make to the exploitation of knowledge.
After a consultation and pilots run with 21 universities in 2019, the KEF decisions report and metrics were published on 16 January 2020, followed by the Clustering and Narrative Templates report on 2 March 2020. Detailed information about the metrics and procedures can be found in these reports. The main thing to be aware of is that, unlike the REF and the TEF, it is NOT an excellence framework that measures quality. It is a quantitative ranking exercise that, certainly at first, will be purely informative.
The first iteration of the KEF will be launched in the current academic year (2019/2020), with Higher Education Institutions [HEIs] submitting their narratives between now and the end of May 2020 and results to be published in summer 2020. Like the TEF, the KEF will take a metrics-driven approach, though with a narrative component consisting of three brief statements covering institutional context, local growth and regeneration, and public and community engagement. The metrics are based on existing data sources available to UKRI and do not have to be submitted by the universities; they will be calculated automatically but will only be published for institutions that opt to participate (i.e. submit a narrative). In the first year, participation is not compulsory, but it is highly likely that full participation will become a condition of Research England funding in future.
The key perspectives and metrics used will be:
- Research Partnerships (Contribution to collaborative research and Co-authorship with non-academic partners)
- Working with business (HE-BCI[3] Contract research income with SME and non-SME business and HE-BCI Consultancy income with SME and non-SME business)
- Working with the public and third sector (HE-BCI Contract research income with the public and third sector and HE-BCI Consultancy income with the public and third sector)
- Skills, enterprise and entrepreneurship (HE-BCI CPD/CE income, HE-BCI CPD/CE learner days delivered and HE-BCI Graduate start-ups rate)
- Local growth and regeneration (Regeneration and development income from all sources and additional narrative)
- IP and Commercialisation (Estimated current turnover of all active firms, Average external investment, and Licensing and other IP income)
- Public and community engagement (Provisional score based on self-assessment developed with NCCPE[4] and additional narrative)
For most metrics, a three-year average will be used.
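The averaging itself is straightforward. Below is a minimal sketch, using entirely hypothetical income figures for a single institution; the actual KEF data sources, calculation and any normalisation are defined in the decisions report.

```python
# Minimal sketch of the three-year average used for most KEF metrics.
# The income figures are entirely hypothetical and for illustration only;
# the real data sources and calculation are defined in the KEF decisions report.

# Hypothetical HE-BCI contract research income (GBP) for one institution
income_by_year = {2017: 1_200_000, 2018: 950_000, 2019: 1_400_000}

three_year_average = sum(income_by_year.values()) / len(income_by_year)
print(f"Three-year average: £{three_year_average:,.0f}")  # -> £1,183,333
```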
To make the data comparable, HEIs will be split into clusters, with individual benchmarks for each cluster. The following clusters will be used in the initial year:
- Cluster E: Large universities with a broad discipline portfolio across both STEM and non-STEM, generating excellent research across all disciplines
- Cluster J: Mid-sized universities with more of a teaching focus (although research is still in evidence) and academic activities across STEM and non-STEM disciplines
- Cluster M: Smaller universities, often with a teaching focus
- Cluster V: Very large, broad-discipline universities with very high research intensity, undertaking significant amounts of excellent research
- Cluster X: Large, broad-discipline universities with high research intensity, undertaking a significant amount of excellent research
- Arts specialists: Specialist institutions covering arts, music and drama
- STEM specialists: Specialist institutions covering science, technology, engineering and mathematics
For the publishing industry, the metric “Co-authorship with non-academic partners” will be the only relevant one; interestingly, it is also the only metric for which no data source has yet been identified. And how could there be one? No one is collecting this data, and there is certainly no central place where it could be compared. It therefore remains to be seen whether this metric will survive or whether it will simply become part of the narrative, and therefore be based on anecdotal evidence.
As with any metrics-based system, the data can be read in different ways and interpretations will vary. The KEF will create a numeric ranking of universities’ interaction with business and the public, but how meaningful this will be, and what exactly it will tell us, is unclear.
[Written by Annika Bennett, Gold Leaf]
[1] Research Excellence Framework
[2] Teaching Excellence Framework
[3] Higher Education Business & Community Interaction (HE-BCI) survey
[4] National Co-ordinating Centre for Public Engagement