LLMpedia: The first transparent, open encyclopedia generated by LLMs

The Metric Tide

Generated by GPT-5-mini
Note: This article was automatically generated by a large language model (LLM) from purely parametric knowledge (no retrieval). It may contain inaccuracies or hallucinations. This encyclopedia is part of a research project currently under review.
Article Genealogy
Expansion Funnel Raw 66 → Dedup 0 → NER 0 → Enqueued 0
The Metric Tide
Name: The Metric Tide
Subject: Research assessment, bibliometrics, altmetrics
Published: 2015
Authors: Independent review panel chaired by James Wilsdon
Publisher: Higher Education Funding Council for England; Digital Science; Wellcome Trust
Language: English


The Metric Tide was an independent review of the role of quantitative indicators in research assessment and management, commissioned by the Higher Education Funding Council for England (HEFCE) to evaluate the use of bibliometrics, altmetrics, and related quantitative tools across United Kingdom research institutions such as the University of Oxford, the University of Cambridge, and University College London. Chaired by James Wilsdon, the review examined interactions among researchers, funders such as the Wellcome Trust and the UK research councils, publishers such as Elsevier and Springer Nature, and infrastructure providers including Crossref and ORCID. The report informed debates involving HEFCE and the Research Excellence Framework.

Background and origins

The origins of the review trace to discussions among stakeholders including Research Councils UK, the Higher Education Funding Council for England, the Wellcome Trust, and technology firms such as Digital Science, following controversies around the Research Excellence Framework and the use of citation indicators drawn from databases such as Web of Science and Scopus. High-profile incidents involving institutions such as the University of Warwick and individuals associated with King's College London prompted scrutiny of metric-driven incentives, with input from scholars at the London School of Economics, the University of Edinburgh, and Goldsmiths, University of London. The review convened advisory members from organizations including Jisc, the Committee on Publication Ethics, and the Royal Society to assess how indicators used by public and philanthropic funders affected researcher behavior.
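The citation indicators at issue in these debates are typically simple aggregates of publication-level citation counts. As an illustration (not an example from the report itself), the widely used h-index, the largest h such that h of a researcher's papers each have at least h citations, can be computed in a few lines:

```python
def h_index(citations):
    """Return the largest h such that h papers have at least h citations each."""
    cites = sorted(citations, reverse=True)  # most-cited first
    h = 0
    for i, c in enumerate(cites):
        if c >= i + 1:  # the (i+1)-th paper still has enough citations
            h = i + 1
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # 4: four papers have at least 4 citations
```

The simplicity of such indicators is precisely what the review found problematic: they are easy to compute and compare, but insensitive to field, career stage, and database coverage.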

The Metric Tide review (2015)

Published in 2015 and chaired by James Wilsdon, the review combined qualitative case studies from institutions such as the University of Manchester and the University of Bristol with quantitative analysis drawing on data from Web of Science, Scopus, and repositories such as arXiv. Contributors included academics affiliated with the University of Sussex, the University of Strathclyde, and the University of Sheffield, and the review drew on international comparisons with assessment systems in the United States, Australia, and the Netherlands. The process featured consultation with publishers (Nature Publishing Group, Taylor & Francis), metrics vendors (Elsevier), and scholarly infrastructure projects including DataCite and Crossref.

Key findings and recommendations

The review found that indicators such as journal impact factors from Journal Citation Reports and citation counts from Google Scholar were being misapplied in hiring and promotion at universities such as Imperial College London and the University of Glasgow. It articulated a framework of "responsible metrics" built on five dimensions, robustness, humility, transparency, diversity, and reflexivity, consonant with the San Francisco Declaration on Research Assessment (DORA), and recommended wider adoption of persistent identifiers such as those from ORCID and DataCite. The report urged funders including the Wellcome Trust and Research Councils UK to support responsible metrics, to press for greater transparency from vendors such as Clarivate Analytics, and to invest in training via organizations such as Jisc and the Royal Society of Edinburgh.
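The recommendation of persistent identifiers can be made concrete: ORCID iDs end in a check character computed with the ISO 7064 MOD 11-2 algorithm, so malformed iDs can be rejected locally before any lookup. A minimal validation sketch (the example iD is a commonly cited ORCID sandbox identifier):

```python
def orcid_checksum_ok(orcid: str) -> bool:
    """Validate the final check character of an ORCID iD (ISO 7064 MOD 11-2)."""
    digits = orcid.replace("-", "")
    if len(digits) != 16:
        return False
    total = 0
    for ch in digits[:-1]:  # first 15 characters must be digits
        if not ch.isdigit():
            return False
        total = (total + int(ch)) * 2
    result = (12 - total % 11) % 11
    expected = "X" if result == 10 else str(result)
    return digits[-1] == expected

print(orcid_checksum_ok("0000-0002-1825-0097"))  # True
```

A valid checksum only confirms that the iD is well formed, not that it is registered; resolving it still requires the ORCID registry.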

Reception and impact on research assessment

Responses came from bodies including the Higher Education Funding Council for England, Research England, Universities UK, and learned societies such as the British Academy and the Royal Society. University leadership at the University of Birmingham and the University of Leeds incorporated elements of the review into strategic planning, while publishers including Elsevier and Springer Nature engaged with follow-up consultations. International actors such as the European Commission and funders such as the National Science Foundation monitored the implications for grant evaluation, and advocacy groups tied to the San Francisco Declaration on Research Assessment and the European University Association promoted metric literacy.

Implementation and policy responses

Policymakers at Research England and funders including the Wellcome Trust and UK Research and Innovation adopted guidance inspired by the review, encouraging the use of qualitative narratives alongside indicators in frameworks such as the Research Excellence Framework and in institutional assessment at the University of Southampton and the University of Exeter. Commercial vendors such as Clarivate Analytics and Elsevier launched transparency initiatives, while infrastructure projects including ORCID and Crossref expanded services to support attribution. Training programs were developed in partnership with Jisc, the Royal Society, and the Higher Education Academy.

Criticisms and controversies

Critics at institutions such as the University of Oxford, along with commentators in outlets such as Times Higher Education, argued that the review did not go far enough to constrain commercial metrics firms such as Clarivate Analytics and Elsevier. Some scholars at the London School of Economics and the University of Copenhagen contended that the recommendations favored existing infrastructures and failed to address power imbalances linked to publishers such as Nature Research and databases such as Web of Science. Debate continued over the role of altmetrics promoted by companies such as Altmetric, and over the potential for gaming highlighted by analysts at the University of California, Berkeley and Harvard University.

Category:Research assessment