| Impactstory | |
|---|---|
| Name | Impactstory |
| Type | Nonprofit |
| Founded | 2011 |
| Founders | Heather Piwowar; Jason Priem |
| Headquarters | Portland, Oregon |
| Focus | Open access; scholarly communication; research metrics |
Impactstory
Impactstory is a nonprofit organization that developed open-source tools for measuring the online attention and reuse of scholarly outputs. Founded by advocates of open access and alternative metrics, it aimed to give researchers, institutions, and funders transparent, researcher-centered indicators that complement traditional citation counts. Its tools intersected with contemporary initiatives in scholarly communication promoted by organizations such as SPARC and with debates around the San Francisco Declaration on Research Assessment.
Impactstory was established in 2011 by advocates in the open science and digital scholarship movements, including founders who had previously contributed to platforms tied to the Public Library of Science and to early altmetrics experiments. Early development drew on community conversations at meetings such as Association of Research Libraries gatherings and conferences such as FORCE11 and OpenCon. Initial pilots focused on integrating social-media signals from platforms such as Twitter, along with readership and reuse data from Mendeley and GitHub, alongside data from aggregators such as Crossref and indexing services such as Web of Science and Scopus. Over the 2010s, the project evolved in response to critiques from stakeholders including representatives of the journals Nature and Science and national research funders such as the National Institutes of Health and the Wellcome Trust.
Impactstory's stated mission emphasized transparency, researcher control, and broader recognition of diverse research outputs beyond traditional journal articles. Services included researcher profiles, article-level metrics dashboards, and badges indicating open availability, which connected to initiatives by organizations such as the DOAJ and Creative Commons. The project aimed to supplement evaluation practices promoted by signatories of the San Francisco Declaration on Research Assessment and to provide interoperable outputs usable by repositories such as Zenodo and by institutional systems at universities such as Harvard University and the University of Oxford. It also collaborated with infrastructure providers, including ORCID and Crossref, to align identifiers and metadata.
Impactstory aggregated a range of indicators: social attention from platforms including Twitter, Facebook, and Reddit; reference-manager readership from Mendeley; code-reuse signals from GitHub; and citation data where available from services such as Crossref and Google Scholar. Methodological choices prioritized transparent provenance, exposing the sources and procedures used to compute each metric, consistent with recommendations from the Committee on Publication Ethics and standards discussed at OpenAIRE workshops. The project experimented with altmetric-normalization approaches similar to those debated in the literature on the Eigenfactor and critiques of the h-index, seeking to contextualize counts by discipline and time. It also produced derived indicators, such as openness badges referencing Creative Commons licensing and availability in repositories compliant with ROARMAP.
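The provenance-first aggregation described above can be sketched as follows. This is a minimal illustration, not Impactstory's actual code: the class, field, and function names are assumptions, and the idea shown is simply rolling counts up per indicator while retaining the per-source breakdown so each total can be traced back to where it came from.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class MetricEvent:
    """One observed signal from a single source (names are illustrative)."""
    source: str      # e.g. "twitter", "mendeley", "crossref"
    indicator: str   # e.g. "mentions", "readers", "citations"
    count: int

def aggregate_with_provenance(events: List[MetricEvent]) -> Dict[str, dict]:
    """Sum events per indicator while keeping the per-source breakdown."""
    summary: Dict[str, dict] = {}
    for e in events:
        entry = summary.setdefault(e.indicator, {"total": 0, "sources": {}})
        entry["total"] += e.count
        entry["sources"][e.source] = entry["sources"].get(e.source, 0) + e.count
    return summary

events = [
    MetricEvent("twitter", "mentions", 12),
    MetricEvent("reddit", "mentions", 3),
    MetricEvent("mendeley", "readers", 40),
    MetricEvent("crossref", "citations", 7),
]
print(aggregate_with_provenance(events)["mentions"])
# → {'total': 15, 'sources': {'twitter': 12, 'reddit': 3}}
```

Keeping the `sources` map alongside each total mirrors the transparency goal: a consumer can audit which platform contributed which share of a count rather than seeing only an opaque aggregate.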
Responses to Impactstory varied across communities. Advocates in the open access and open science movements cited its role in popularizing article-level metrics and influencing discussions at venues such as the Peer Review Congress and ScienceOpen. Librarians at institutions such as the University of California system incorporated its ideas when advising on researcher profiles and promotion dossiers, drawing comparisons to proprietary services from vendors discussed at International Federation of Library Associations and Institutions events. Critics from traditional bibliometrics circles associated with Leiden University, along with analysts at organizations such as the Institute for Scientific Information, questioned the robustness of social indicators and their susceptibility to gaming, echoing debates around metrics reform exemplified by the San Francisco Declaration on Research Assessment. The project informed policy discussions at funders including the European Commission and national agencies such as the National Science Foundation.
Funding for the organization came from a mix of philanthropic foundations and grantmakers active in scholarly communication reform, including funders such as the Alfred P. Sloan Foundation and the MacArthur Foundation, along with grant programs coordinated by entities such as Ithaka S+R. Governance structures incorporated advisory input from scholars, librarians, and technologists affiliated with institutions such as MIT, Stanford University, and University College London. The organization adhered to nonprofit governance practices common among entities registered in the United States and engaged with donor-driven initiatives alongside community steering processes advocated by groups such as the Mozilla Foundation.
Technically, the project built on open-source components and web APIs to harvest and normalize metadata, employing DOIs registered through Crossref and persistent identifiers from ORCID for author disambiguation. It integrated with repository software such as DSpace and EPrints and exposed data for reuse via APIs following practices promoted by DataCite and OpenAIRE. The architecture emphasized reproducibility and transparency, leveraging software-development workflows influenced by projects hosted on GitHub and continuous-integration paradigms discussed at conferences such as PyCon.
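The identifier handling described above can be sketched briefly. This sketch assumes only publicly documented facts: ORCID iDs end in a check character computed with the ISO 7064 MOD 11-2 algorithm, and Crossref's REST API serves DOI metadata at `https://api.crossref.org/works/{doi}`. The helper names are hypothetical and do not reflect Impactstory's internal code.

```python
def orcid_checksum_ok(orcid: str) -> bool:
    """Validate the ISO 7064 MOD 11-2 check character of an ORCID iD."""
    chars = orcid.replace("-", "")
    if len(chars) != 16 or not chars[:-1].isdigit():
        return False
    total = 0
    for ch in chars[:-1]:
        total = (total + int(ch)) * 2
    result = (12 - total % 11) % 11
    expected = "X" if result == 10 else str(result)
    return chars[-1] == expected

def crossref_works_url(doi: str) -> str:
    """Build the Crossref REST API URL that returns a DOI's metadata."""
    return "https://api.crossref.org/works/" + doi

# ORCID's own documentation uses this sample iD:
print(orcid_checksum_ok("0000-0002-1825-0097"))  # → True
print(crossref_works_url("10.1371/journal.pone.0000308"))
```

Validating the check digit locally before issuing an API call is a cheap way to reject malformed identifiers early, which matters when harvesting metadata at scale.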