| SCImago Research Group | |
|---|---|
| Name | SCImago Research Group |
| Type | Research evaluation |
| Founded | 2007 |
| Headquarters | Spain |
SCImago Research Group is an independent research organization established in 2007 that produces bibliometric indicators and ranking tools used by universities, research institutes, and policy bodies. It is best known for creating the SCImago Journal & Country Rank, a platform relied upon by scholars, librarians, and funding agencies to assess publication influence. The group interacts with academic institutions, bibliographic databases, and standards bodies across international networks.
The group was founded in 2007, during a period when initiatives associated with Thomson Reuters, Elsevier, Google Scholar, Harvard University, and the University of Granada were reshaping academic indexing and metrics (the indexing business later spun off as Clarivate was then part of Thomson Reuters). Early work drew on datasets and projects linked to Scopus, Web of Science, PubMed, CrossRef, and DOAJ, aligning with contemporaneous efforts by the Centre for Science and Technology Studies at Leiden University, the European Commission, and UNESCO to formalize research indicators. The group then released the SCImago Journal & Country Rank (SJR) platform, which built on methodologies discussed in forums involving the OECD, the World Bank, the Wellcome Trust, the European Research Council, and national ministries of science.
The group's leadership has included researchers and developers affiliated with institutions such as the University of Granada, Instituto de Salud Carlos III, and Consejo Superior de Investigaciones Científicas, along with collaborations with analysts from RAND Corporation, Brookings Institution, Max Planck Society, and CNRS. Advisory interactions have occurred with figures and bodies connected to Journal Citation Reports, the Institute for Scientific Information, the Association of University Presses, and consultancies used by the Australian Research Council, Research Councils UK, the National Science Foundation, and the National Institutes of Health. Governance has combined academic directors, technical staff, and policy advisors with ties to the European University Association and regional agencies.
The SCImago Journal & Country Rank is the group's flagship product and is used alongside rival metrics such as Journal Citation Reports, the Eigenfactor Project, Microsoft Academic, and Google Scholar Metrics. SJR provides indicators for journals and countries and is consulted by stakeholders at the University of Oxford, Massachusetts Institute of Technology, Stanford University, University of Cambridge, Yale University, and national evaluation agencies including CNPq, CAPES, ANVUR, and CONACYT. The platform's rankings are compared in studies by researchers at Columbia University, University of California, Berkeley, University of Michigan, and Imperial College London.
SJR's methodology employs citation-weighting algorithms and normalization procedures that echo approaches discussed in the literature by Eugene Garfield, Derek de Solla Price, the Alfred P. Sloan Foundation, and proponents of the Leiden Manifesto. The SJR indicator transfers "prestige" between journals through citations, in the spirit of eigenvector-centrality measures such as PageRank, so that citations from highly cited journals count for more than citations from obscure ones. Indicators include the SJR score, H-index variants, and subject-area classifications mapped against schemes used by Medical Subject Headings, the Library of Congress, European Research Council panels, and standards promoted by CrossRef and ORCID. Methodological debates reference work from PLOS, Nature, Science (journal), and analytic teams at CWTS Leiden, INSEAD, and the National Bureau of Economic Research.
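The prestige-transfer idea behind citation-weighted indicators, and the H-index mentioned above, can be illustrated with a toy sketch. This is not the actual SJR algorithm: the damping factor, convergence threshold, citation matrix, and function names below are illustrative assumptions, and real implementations add size normalization, self-citation caps, and subject-field adjustments.

```python
def prestige_scores(citations, damping=0.85, tol=1e-9, max_iter=1000):
    """Toy PageRank-style iteration over a journal citation matrix.

    citations[i][j] = number of citations from journal i to journal j.
    Returns one prestige score per journal; NOT the real SJR formula.
    """
    n = len(citations)
    scores = [1.0 / n] * n
    out_totals = [sum(row) for row in citations]  # citations given by each journal
    for _ in range(max_iter):
        new = []
        for j in range(n):
            # Prestige flows in proportionally to each citing journal's score.
            inflow = sum(
                scores[i] * citations[i][j] / out_totals[i]
                for i in range(n)
                if out_totals[i]
            )
            new.append((1 - damping) / n + damping * inflow)
        if max(abs(a - b) for a, b in zip(new, scores)) < tol:
            return new
        scores = new
    return scores


def h_index(citation_counts):
    """Largest h such that h papers each have at least h citations."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, c in enumerate(counts, start=1):
        if c >= rank:
            h = rank
    return h


# Two journals citing each other equally end up with equal prestige:
print(prestige_scores([[0, 1], [1, 0]]))  # → [0.5, 0.5]
print(h_index([10, 8, 5, 4, 3]))          # → 4
```

The key design point the sketch captures is recursion: a journal's score depends on the scores of the journals citing it, so the scores must be found by iterating to a fixed point rather than by a single pass over raw citation counts.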
The group bases its indicators on bibliographic data derived from large-scale indices such as Scopus, supplemented by metadata from CrossRef, PubMed, DOAJ, open repositories such as HAL and arXiv, and university presses including Oxford University Press and Cambridge University Press. Collaborative and comparative engagements have involved Elsevier data scientists, librarians from the Library of Congress, and consortia such as Knowledge Unlatched and OpenAIRE. External evaluations and methodological papers reference collaborations with organizations such as UNESCO, the European Commission, the World Health Organization, and national statistics offices.
SJR has been cited in bibliometric research published in venues such as Scientometrics, Journal of Informetrics, Research Policy, and PLOS ONE, and has been evaluated alongside metrics from Journal Citation Reports and Eigenfactor. Critiques have come from scholars associated with Leiden University, Humboldt-Universität zu Berlin, and the University of Chicago, and from proponents of the San Francisco Declaration on Research Assessment (DORA), who question reliance on citation-based metrics. Debates reference alternative assessment frameworks promoted by DORA, CoARA, the Open Science Framework, and advocacy groups influencing policy at the European Research Council and national funders.
SJR is used by universities, libraries, and research managers at institutions such as Harvard University, University of Toronto, National University of Singapore, and University of Sydney, and by agencies such as the National Science Foundation and the European Commission, for benchmarking, collection development, and assessment. It informs decisions in promotion committees, grant evaluations, and strategic planning alongside tools from Clarivate Analytics, Elsevier Research Intelligence, and open platforms such as Dimensions. Applications are discussed in reports by the OECD and the World Bank, and in national research assessment exercises including the UK Research Excellence Framework (REF) and comparable frameworks in Italy, Spain, and Brazil.