| Research Excellence Framework | |
|---|---|
| Name | Research Excellence Framework |
| Established | 2014 |
| Country | United Kingdom |
| Administered by | UK Research and Innovation |
The Research Excellence Framework (REF) is a periodic national assessment that evaluates research quality in United Kingdom higher education institutions. It informs funding allocations from bodies such as UK Research and Innovation, influences institutional strategy at universities including the University of Oxford, the University of Cambridge, and University College London, and shapes reputational rankings published by Times Higher Education and QS World University Rankings. The exercise connects to policy instruments administered by the Department for Business, Innovation and Skills and to advisory reports from the Higher Education Funding Council for England and devolved funding councils such as the Scottish Funding Council.
The framework assesses outputs, impact, and environment across units of assessment within institutions including Imperial College London, the London School of Economics, the University of Edinburgh, the University of Manchester, and King's College London. It is overseen by panels drawn from subject bodies such as the Royal Society and the British Academy, and from professional organizations including the Royal Society of Chemistry and the Institute of Physics. Results are used by funders, chiefly Research England for England, with parallel roles formerly played by predecessor bodies under Research Councils UK, and inform performance-related research funding distributed to providers such as Queen Mary University of London and the University of Warwick.
The framework evolved from an earlier series of assessments, the Research Assessment Exercise (RAE), conducted in 1992, 1996, 2001, and 2008. Policy changes were shaped by inquiries such as the Wakeham Review and the Browne Review, and by political decisions in the UK Parliament and ministries including the Department for Education. Key milestones include pilot exercises involving panels from institutions such as the University of Glasgow and reports commissioned by the Higher Education Funding Council for England, which led to the first full implementation cycle replacing the RAE.
Panels, chaired by senior academics from institutions such as the University of Oxford and the University of Cambridge, evaluate submissions using criteria that weight research outputs, impact case studies, and research environment statements. These criteria interact with bibliometric data from Web of Science and Scopus and with grant records from the European Research Council and national funders such as the Medical Research Council and the Arts and Humanities Research Council. Outputs are assessed across subject panels corresponding to learned societies such as the Royal Historical Society, the Royal College of Surgeons, and the Royal Geographical Society. The methodology rests on peer review processes shaped by advice from bodies including the Committee of University Chairs and, in some advisory capacities, external auditors from organizations such as KPMG.
Outcomes influence resource allocation to universities including the University of Birmingham, the University of Leeds, the University of Bristol, Durham University, and Lancaster University. Institutional responses have included hiring initiatives targeting fellows of organisations such as the Wellcome Trust, and collaborations with partners such as NHS England and the National Institute for Health and Care Research, as well as international partners like the Max Planck Society and the Institut Pasteur. The framework has driven strategic planning at collegiate systems such as the University of London, contributed to movements in league tables built on publication data tracked by Clarivate Analytics, and influenced student recruitment patterns mediated by agencies including UCAS.
Critiques have been raised by academics affiliated with the University of Sussex, Goldsmiths, University of London, and SOAS University of London, and by trade unions such as the University and College Union, citing effects on workload and employment practices, including casualization debates linked to cases at the University of Leicester and the University of Portsmouth. Concerns over metrics-driven assessment implicate bibliometrics providers such as Elsevier and have featured in regulatory debates before House of Commons Select Committee inquiries. High-profile controversies include disputes over impact case study assessments involving researchers connected to the Wellcome Trust Sanger Institute, and disagreements about panel composition raised at meetings of the Russell Group.
Reform proposals have been advanced by advisory groups including panels convened by UK Research and Innovation, recommendations from think tanks such as the Higher Education Policy Institute, and reviews led by figures associated with the University of Oxford and the University of Cambridge. Proposed future directions include integration of research information systems such as PURE and Symplectic, greater use of responsibly framed metrics of the kind promoted by collaborators at Leiden University, and stronger alignment with the funding mechanisms of European Research Area partners. Debates continue in assemblies of representatives from institutions including the Open University and consortia such as the Cathedrals Group about balancing peer review, bibliometrics, and recognition of interdisciplinary work across centres like the Alan Turing Institute.