LLMpedia: The first transparent, open encyclopedia generated by LLMs

InCites

Generated by GPT-5-mini
Note: This article was automatically generated by a large language model (LLM) from purely parametric knowledge (no retrieval). It may contain inaccuracies or hallucinations. This encyclopedia is part of a research project currently under review.
Article Genealogy
Expansion Funnel: Raw 103 → Dedup 0 → NER 0 → Enqueued 0
InCites
Name: InCites
Developer: Clarivate
Type: Bibliometric analytics
Released: 2012

InCites is a bibliometric analytics platform designed to evaluate research output, citation impact, and institutional performance. It is used by universities, research funders, libraries, and government agencies to benchmark publications, analyze collaboration patterns, and inform strategic planning. The platform aggregates large-scale citation data, proprietary indices, and institutional metadata to support decision-making across research-intensive organizations.

Overview

InCites provides metrics-driven reporting for research entities such as Harvard University, University of Oxford, Massachusetts Institute of Technology, Stanford University, and University of Cambridge, and for funders such as the National Institutes of Health, the European Commission, and the Wellcome Trust. It integrates citation indices historically associated with Web of Science, allowing users to compare outputs from institutions such as University of California, Berkeley, Princeton University, Yale University, Columbia University, and California Institute of Technology. Administrators from Imperial College London, ETH Zurich, Peking University, Tsinghua University, and National University of Singapore often use its dashboards to complement benchmarking against consortia like the Association of American Universities and global rankings such as the Times Higher Education World University Rankings.

Data Sources and Coverage

The platform draws on bibliographic databases and citation indexes related to sources like Science Citation Index Expanded, Social Sciences Citation Index, and Arts & Humanities Citation Index, and aggregates metadata from publishers such as Elsevier, Springer Nature, Wiley-Blackwell, Taylor & Francis Group, and SAGE Publications. Coverage spans journals included in titles collated by organizations like Clarivate, cross-referenced with identifiers issued by CrossRef and author identifiers such as ORCID. It catalogues outputs tied to institutions including University of Tokyo, Seoul National University, University of Melbourne, McGill University, and University of Toronto and incorporates document types used by repositories such as PubMed Central, arXiv, SSRN, RePEc, and bioRxiv.

Features and Tools

Key tools include benchmarking dashboards used by leaders at Johns Hopkins University, University of Michigan, University of Washington, Cornell University, and Duke University; collaboration maps used by consortia like CERN and research networks such as Human Genome Project collaborators; and funding attribution modules relevant to agencies such as the National Science Foundation and the European Research Council. Analytics features mirror metrics familiar from the Journal Citation Reports, offering field normalizations comparable to indicators championed by organizations such as the Centre for Science and Technology Studies (CWTS) at Leiden University and techniques referenced by the Royal Society. Visualization options enable map-based views employed by institutions such as Australian National University, University College London, McMaster University, University of Hong Kong, and University of São Paulo.
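The field-normalized indicators mentioned above can be illustrated with a toy calculation. The sketch below approximates a Category Normalized Citation Impact (CNCI)-style metric: each paper's citations are divided by the expected (mean) citations for papers of the same subject category and publication year. The function name and sample data are hypothetical, and a real baseline would be computed from the full citation database for each category, year, and document type, not from the analyzed set itself.

```python
# Toy sketch of a CNCI-style field-normalized indicator.
# Illustrative only: real baselines come from the entire database,
# not from the paper set being analyzed.
from collections import defaultdict

def cnci(papers):
    """papers: list of (category, year, citations) tuples.
    A paper's score is citations / mean citations of papers sharing
    its (category, year); the set's CNCI is the mean of those scores."""
    totals = defaultdict(lambda: [0, 0])  # (category, year) -> [citation sum, paper count]
    for cat, year, cites in papers:
        totals[(cat, year)][0] += cites
        totals[(cat, year)][1] += 1
    scores = []
    for cat, year, cites in papers:
        total, count = totals[(cat, year)]
        expected = total / count
        scores.append(cites / expected if expected else 0.0)
    return sum(scores) / len(scores)

papers = [("Oncology", 2020, 12), ("Oncology", 2020, 4),
          ("Physics", 2021, 30), ("Physics", 2021, 10)]
print(round(cnci(papers), 3))  # → 1.0
```

A value of 1.0 indicates performance at the world average for the matched category and year; values above 1.0 indicate above-expected citation impact. This normalization is what allows the cross-field institutional comparisons described above, since raw citation counts differ widely between disciplines.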

Users and Use Cases

Primary users are research administrators at institutions such as Brown University, Vanderbilt University, University of Illinois Urbana–Champaign, University of Pennsylvania, and University of Chicago; librarians at organizations like the British Library and the Library of Congress; and metrics analysts in ministries such as the Ministry of Education (China), Department for Education (UK), and U.S. Department of Education. Use cases include strategic planning by boards of trustees at institutions like Rutgers University and Ohio State University, grant portfolio analysis for agencies like the Gates Foundation, and national performance assessments similar to programs run by Research Councils UK and the Canadian Institutes of Health Research.

Criticisms and Limitations

Critiques often reference debates involving figures and bodies such as Jorge Hirsch, Eugene Garfield, and the San Francisco Declaration on Research Assessment (DORA) when discussing reliance on citation-based metrics. Concerns voiced by scholars at universities such as Goldsmiths, University of London, Leiden University, and University of Amsterdam include database coverage gaps highlighted in analyses by teams from the Centre for Science and Technology Studies (CWTS), the Max Planck Society, and the Royal Netherlands Academy of Arts and Sciences. Methodological limitations are compared to alternative datasets from Scopus and to bibliometric evaluations advocated by groups like COAR and SPARC; critics from think tanks such as the Pew Research Center and policy units at the OECD have likewise underscored issues of transparency and field normalization.

History and Development

Launched in the early 2010s by a team associated with Thomson Reuters' research analytics division, the platform evolved through corporate transitions involving firms such as Clarivate Analytics and draws on legacy assets linked to the work of pioneers like Eugene Garfield and services associated with the Institute for Scientific Information. Development discussions involved partnerships with academic centers including the Centre for Science and Technology Studies (CWTS) at Leiden University and policy stakeholders such as the European University Association and national research offices in countries like Germany, France, Japan, South Korea, and Brazil. Over time, feature enhancements paralleled shifts in scholarly communication exemplified by initiatives from COPE, repositories like Zenodo, and open science policies advanced by funders including the Wellcome Trust and the Bill & Melinda Gates Foundation.

Category:Bibliometrics