LLMpedia: the first transparent, open encyclopedia generated by LLMs

Altmetrics Manifesto

Generated by GPT-5-mini
Note: This article was automatically generated by a large language model (LLM) from purely parametric knowledge (no retrieval). It may contain inaccuracies or hallucinations. This encyclopedia is part of a research project currently under review.
Article Genealogy
Parent: Digital Science (hop 6)
Expansion funnel: 109 extracted → 0 after dedup → 0 after NER → 0 enqueued
Altmetrics Manifesto
Title: Altmetrics: a manifesto
Date: 2010
Authors: Jason Priem, Dario Taraborelli, Paul Groth, Cameron Neylon
Subject: Research impact, scholarly communication, research metrics

Altmetrics Manifesto

The Altmetrics Manifesto is a 2010 proclamation advocating new ways to measure research influence beyond traditional citation-based metrics. It emerged from cross-disciplinary debates involving Jason Priem, Heather Piwowar, Paul Groth, and groups associated with PLOS, Duke University, and Mendeley, and sought alternatives to the Journal Impact Factor and the Journal Citation Reports published by Thomson Reuters (later Clarivate). The Manifesto influenced discussions at venues such as the Association for Computing Machinery, the Public Library of Science, the Society for Scholarly Publishing, and OpenAIRE, as well as policy forums involving the National Institutes of Health and the Wellcome Trust.

Background and Origins

The Manifesto arose amid controversies over the Journal Impact Factor, debates at altmetrics workshops and meetings hosted by SPARC, the Internet Archive, and the Mozilla Foundation, and responses to policy shifts at funders such as the National Science Foundation and the European Commission. Its authors drew on earlier work by scholars associated with Harvard University, Stanford University, the University of Oxford, the University of Cambridge, and the University of California, Berkeley, and with research platforms including arXiv, SSRN, and ResearchGate. Inspiration also came from innovations at Google Scholar, from social platforms such as Twitter, Facebook, and LinkedIn, and from reference managers like CiteULike and Zotero.

Key Concepts and Definitions

Central concepts include capturing attention, engagement, and influence through alternative indicators such as social media mentions, online bookmarking, blog coverage, news coverage, and policy citations. The Manifesto proposed indicators analogous to the citation counts behind products from Thomson Reuters and Scopus, but derived from sources like Mendeley, F1000, Reddit, YouTube, and mainstream outlets such as The New York Times, The Guardian, and Nature. It framed these metrics as complements to, rather than replacements for, the measures used by the Institute for Scientific Information, Elsevier, Springer Nature, and academic libraries at institutions like the University of Michigan and Columbia University.
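The idea of combining heterogeneous attention signals into a single indicator can be made concrete with a toy example. The event types and weights below are invented purely for illustration; they are not the Manifesto's proposal or any provider's actual scoring formula.

```python
# Toy composite altmetric score. The weights are illustrative
# assumptions only, not any real provider's formula.
WEIGHTS = {
    "news": 8.0,      # mainstream news coverage
    "policy": 9.0,    # citation in a policy document
    "blog": 5.0,      # scholarly blog coverage
    "tweet": 0.25,    # social media mention
    "bookmark": 0.5,  # reference-manager save (e.g. Mendeley)
}

def composite_score(counts: dict[str, int]) -> float:
    """Weighted sum of event counts; unknown event types score zero."""
    return sum(WEIGHTS.get(kind, 0.0) * n for kind, n in counts.items())

print(composite_score({"tweet": 40, "news": 2, "bookmark": 10}))  # 31.0
```

A real aggregator would also have to deduplicate events, attribute them to a persistent identifier such as a DOI, and weight sources by reliability, which is where most of the practical difficulty lies.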

Development and Implementation

Implementation efforts involved collaborations among startups, publishers, and academic projects, including Altmetric.com, Impactstory, PlumX, PLOS ALM, and aggregators that partnered with Crossref, DataCite, ORCID, and DOAJ. Pilot studies appeared in journals such as PLOS ONE, Nature, Science, and PNAS, and on platforms supported by organizations like the Wellcome Trust and the Bill & Melinda Gates Foundation. Workshops and working groups convened at MIT, Stanford University, University College London, and the University of Toronto to standardize definitions and APIs.

Reception and Criticism

Reception ranged from enthusiastic uptake among open science advocates at the Open Science Framework and Creative Commons to skepticism from proponents of established metrics at Clarivate Analytics and Elsevier. Critics raised concerns, echoed in analyses by researchers at Johns Hopkins University, the Max Planck Society, and the European Research Council and by commentators in The Lancet and BMJ, about gaming, lack of normalization, and disciplinary bias. Debates occurred at international scientometrics conferences and in policy statements from bodies such as the Council of Australian University Librarians and national academies including the Royal Society.

Impact on Scholarly Communication

The Manifesto catalyzed the integration of altmetrics into institutional dashboards at universities such as the University of Oxford, the University of California, Los Angeles, and Yale University, and into research assessment debates around the UK's Research Excellence Framework. Publishers including Elsevier, Springer Nature, Wiley, and Taylor & Francis incorporated alternative indicators into article pages, while funders such as the NIH, the European Research Council, and the Wellcome Trust considered broader impact narratives. It also informed policy dialogues at UNESCO, the OECD, and national agencies including UK Research and Innovation.

Methodologies and Data Sources

Methodological approaches combine web mining, API harvesting, and bibliometrics, drawing on datasets from Crossref Event Data, the Twitter API, the Facebook Graph API, the Mendeley API, news aggregators such as LexisNexis and Factiva, and archives like the Wayback Machine. Techniques build on standards and identifiers from the DOI infrastructure, ORCID, and protocols used by Crossref and DataCite. Validation studies by teams at the University of Amsterdam, Leiden University, Indiana University, and the University of Montreal compared altmetric signals with citation indices from Scopus and Web of Science.
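The API-harvesting step can be sketched as follows. This is a minimal illustration assuming the public Crossref Event Data query endpoint and its `obj-id` filter; the DOI and the sample event records are fabricated for the example, and a real harvester would page through live responses instead.

```python
from urllib.parse import urlencode

# Crossref Event Data query endpoint (parameter names assumed
# from its public documentation).
EVENT_DATA = "https://api.eventdata.crossref.org/v1/events"

def events_url(doi: str, mailto: str = "editor@example.org", rows: int = 100) -> str:
    """Build a query URL for events whose cited object is the given DOI."""
    params = {"obj-id": f"https://doi.org/{doi}", "mailto": mailto, "rows": rows}
    return f"{EVENT_DATA}?{urlencode(params)}"

def count_by_source(events: list[dict]) -> dict[str, int]:
    """Tally harvested events per source (twitter, wikipedia, newsfeed, ...)."""
    counts: dict[str, int] = {}
    for ev in events:
        src = ev.get("source_id", "unknown")
        counts[src] = counts.get(src, 0) + 1
    return counts

# Fabricated sample records, shaped like Event Data items.
sample = [{"source_id": "twitter"}, {"source_id": "wikipedia"}, {"source_id": "twitter"}]
print(count_by_source(sample))  # {'twitter': 2, 'wikipedia': 1}
```

Tallies like these per-source counts are what validation studies then correlate against citation indices from Scopus or Web of Science.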

Future Directions and Challenges

Future work emphasizes interoperability with persistent identifier systems like ORCID and ROR; improved normalization across disciplines, a concern raised at venues such as ICMJE meetings; mitigation of the gaming risks highlighted by auditors at KPMG and Deloitte; and ethical considerations raised by privacy advocates associated with the Electronic Frontier Foundation and by data protection authorities such as the European Data Protection Board. Ongoing efforts involve standardization initiatives at NISO, governance conversations at the World Economic Forum, and sustained engagement from research communities at AAAS, ACM, and IEEE to ensure robustness, equity, and usefulness across institutions, including UNESCO and national research councils.
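One common remedy for disciplinary bias is to rank a work's altmetric score against other works in the same field rather than comparing raw counts across fields. A minimal sketch, with invented numbers:

```python
def field_percentile(score: float, field_scores: list[float]) -> float:
    """Percentage of works in the same field scoring at or below `score`."""
    at_or_below = sum(1 for s in field_scores if s <= score)
    return 100.0 * at_or_below / len(field_scores)

# The same raw score means different things in different fields:
print(field_percentile(5, [1, 2, 3, 5, 10]))   # 80.0 (high for this field)
print(field_percentile(5, [4, 6, 9, 12, 20]))  # 20.0 (low for this field)
```

Production-grade normalization must additionally control for publication year and source coverage, since social media attention decays and accumulates on very different timescales than citations.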

Category:Scholarly communication