LLMpedia: the first transparent, open encyclopedia generated by LLMs

Clarivate Highly Cited Researchers

Generated by GPT-5-mini
Note: This article was automatically generated by a large language model (LLM) from purely parametric knowledge (no retrieval). It may contain inaccuracies or hallucinations. This encyclopedia is part of a research project currently under review.
Article Genealogy
Expansion Funnel: Raw 151 → Dedup 4 → NER 3 → Enqueued 3
1. Extracted: 151
2. After dedup: 4
3. After NER: 3 (rejected: 1, not a named entity)
4. Enqueued: 3
Name: Clarivate Highly Cited Researchers
Formation: 2014 (annual list)
Headquarters: Philadelphia

The Clarivate Highly Cited Researchers list is an annual compilation identifying scientists and social scientists whose publications rank in the top 1% by citations for field and publication year, derived from Clarivate's bibliometric databases. It is widely used alongside resources such as the Science Citation Index and the Web of Science to signal influence among researchers at institutions like Harvard University, Stanford University, and the Chinese Academy of Sciences. The list intersects with awardees and affiliates of bodies including the Nobel Prize, the Royal Society, and the National Academy of Sciences.

Overview

The list highlights individuals across fields connected to journals indexed in the Web of Science Core Collection, drawing attention to researchers whose influence is comparable to that of laureates of the Nobel Prize in Physiology or Medicine and the Nobel Prize in Chemistry, and to recipients of honors such as the Breakthrough Prize and the Lasker Award. Many listed researchers hold appointments at institutions such as the Massachusetts Institute of Technology, University of Oxford, University of Cambridge, California Institute of Technology, University of California, Berkeley, Johns Hopkins University, Yale University, Columbia University, Princeton University, Imperial College London, Peking University, Tsinghua University, University of Toronto, ETH Zurich, Max Planck Society, Seoul National University, University of Tokyo, Australian National University, McGill University, University of Melbourne, Karolinska Institutet, National University of Singapore, Indian Institute of Science, Federal University of Rio de Janeiro, University of São Paulo, University of Hong Kong, King's College London, University of Edinburgh, KU Leuven, Sorbonne University, University of Copenhagen, University of Amsterdam, University of Michigan, University of Chicago, Northwestern University, Purdue University, University of California, San Diego, University of Washington, University of Illinois Urbana–Champaign, University of Pennsylvania, Cornell University, Duke University, Brown University, University of California, Los Angeles, Rice University, University of Texas at Austin, University of British Columbia, Monash University, University of Zurich, University of Basel, Sechenov University, Moscow State University, Tel Aviv University, Weizmann Institute of Science, Technion – Israel Institute of Technology, King Abdullah University of Science and Technology, and Saudi Aramco.

Selection Criteria and Methodology

Clarivate identifies researchers whose papers rank among the most frequently cited using citation thresholds drawn from the Web of Science and the Essential Science Indicators, which track highly cited papers (the top 1% by citations for field and publication year) over a rolling window of roughly eleven years. The methodology parallels approaches used by Scopus and metrics such as the Eigenfactor and the h-index. The selection maps citation clusters to subject categories akin to taxonomies used by the Institute for Scientific Information and cross-references affiliations much as the Times Higher Education and QS World University Rankings handle institutional addresses. Data cleaning echoes practices from the Open Researcher and Contributor ID (ORCID) initiative and name-disambiguation efforts built on the Digital Object Identifier system. The approach invites methodological comparison with bibliometric analyses by the Organisation for Economic Co-operation and Development (OECD) and the European Research Council.
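The core selection step, a per-field, per-year top-1% citation cutoff, can be sketched in a few lines. This is an illustrative simplification, not Clarivate's actual implementation; the `field`, `year`, and `citations` keys are an assumed schema:

```python
from collections import defaultdict

def top_percentile_thresholds(papers, pct=0.01):
    """For each (field, year) group, return the citation count a paper
    must reach to sit in the top `pct` fraction of that group.
    `papers` is a list of dicts with 'field', 'year', and 'citations'
    keys (a hypothetical schema for illustration)."""
    groups = defaultdict(list)
    for p in papers:
        groups[(p["field"], p["year"])].append(p["citations"])
    thresholds = {}
    for key, cites in groups.items():
        cites.sort(reverse=True)
        # index of the last paper still inside the top pct (at least one)
        cutoff = max(1, int(len(cites) * pct))
        thresholds[key] = cites[cutoff - 1]
    return thresholds

def highly_cited(papers, thresholds):
    """Return papers meeting or exceeding their group's threshold."""
    return [p for p in papers
            if p["citations"] >= thresholds[(p["field"], p["year"])]]
```

Normalizing within each field-year group is what makes the metric comparable across disciplines with very different citation cultures, since a chemistry paper and a mathematics paper are each measured only against their own cohort.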

Annual lists, published since 2014, have shown trends such as growing representation from institutions in China, India, Brazil, and South Korea, reflecting broader shifts tracked by the United Nations Educational, Scientific and Cultural Organization (UNESCO) and by reports from the World Bank. The presence of researchers affiliated with corporations mirrors patterns seen at Google Research, Microsoft Research, IBM Research, Baidu Research, Tencent AI Lab, Pfizer, Roche, Novartis, GlaxoSmithKline, and Samsung Research. Scientists who have appeared on the lists include figures associated with the Human Genome Project, contributors to the Intergovernmental Panel on Climate Change, leaders of CERN collaborations, and investigators tied to the Large Hadron Collider and LIGO. Shifts in subject-area representation echo developments highlighted in Royal Society reports on artificial intelligence and biotechnology and in policy analyses from the European Commission.

Impact and Significance

In academic hiring, promotion, and funding contexts, the list is invoked alongside credentials such as membership in the National Academy of Engineering, election to the American Academy of Arts and Sciences, or prizes like the MacArthur Fellowship. Institutions cite listings in promotional material much as universities publicize gifts from funders such as the Gates Foundation or collaborations with the Wellcome Trust. The list shapes perceptions relevant to grant panels at agencies such as the National Institutes of Health, the European Research Council, the National Science Foundation, and the Japan Society for the Promotion of Science. It also factors into bibliometric studies comparing output from consortia such as the Human Cell Atlas and networks such as the Allen Institute for Brain Science.

Controversies and Criticisms

Critiques of the list align with broader debates about citation metrics raised by signatories of the San Francisco Declaration on Research Assessment (DORA) and by commentators at organizations such as University College London, Cambridge University Press, and Elsevier. Concerns include gaming of author affiliations, as reported for some researchers at institutions such as King Fahd University of Petroleum and Minerals, and accusations of "honorary" affiliations paralleling controversies over corporate appointments at Alphabet Inc. subsidiaries. Critics compare the list's limitations to issues documented by analysts at The Lancet, Nature, and Science, and by commentators in The New York Times and The Guardian. Debates reference policy responses from funders including the Wellcome Trust and the Gates Foundation.

Country and Institutional Rankings

Analyses of country and institutional counts derived from the list are often cited in national science assessments, such as those by China's Ministry of Science and Technology, India's Department of Science and Technology, UK Research and Innovation, and the Organisation for Economic Co-operation and Development. Rankings frequently place United States universities at the top, followed by growing tallies for China, the United Kingdom, Germany, Canada, Australia, Japan, South Korea, France, Switzerland, the Netherlands, Sweden, Italy, Spain, Belgium, Israel, Singapore, Brazil, India, and Russia. Institutional competition recalls long-standing rivalries among Harvard University, Stanford University, the Massachusetts Institute of Technology, the University of Oxford, and the University of Cambridge. The list has prompted policy discussion in venues such as the Chinese Academy of Sciences Annual Conference, meetings of the European Research Council, and ministry briefings in capitals including Washington, D.C., Beijing, London, and New Delhi.
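The country and institutional tallies described above reduce to a simple aggregation. A minimal sketch, assuming a hypothetical record layout with `country` and `institution` keys (not Clarivate's actual data format):

```python
from collections import Counter

def ranking(entries, key):
    """Rank countries or institutions by number of list entries.

    `entries` is a list of dicts with, e.g., 'country' and 'institution'
    keys (an illustrative schema). Returns (name, count) pairs sorted
    from largest to smallest tally.
    """
    return Counter(e[key] for e in entries).most_common()
```

Real analyses must also decide how to credit researchers holding multiple or dual affiliations, a choice that materially affects country-level tallies.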

Category:Bibliometrics