LLMpedia: The first transparent, open encyclopedia generated by LLMs

Transparency and Openness Promotion Guidelines

Generated by GPT-5-mini
Note: This article was automatically generated by a large language model (LLM) from purely parametric knowledge (no retrieval). It may contain inaccuracies or hallucinations. This encyclopedia is part of a research project currently under review.
Article Genealogy
Parent: Open Science Framework (Hop 4)
Expansion Funnel: Raw 93 → Dedup 0 → NER 0 → Enqueued 0
Name: Transparency and Openness Promotion Guidelines
Abbreviation: TOP Guidelines
Established: 2015
Authors: Center for Open Science
Discipline: Research methodology
Type: Reporting standards
Country: United States

The Transparency and Openness Promotion (TOP) Guidelines are a set of standards for improving transparency, reproducibility, and reporting in scholarly research. Originating from collaborative initiatives convened by the Center for Open Science, the guidelines intersect with practices promoted by the National Institutes of Health, the National Science Foundation, the Wellcome Trust, the European Commission, and publishers such as the American Psychological Association and Elsevier. They address data sharing, code availability, preregistration, and materials transparency across disciplines, influenced by bodies such as the Committee on Publication Ethics, the Scholarly Publishing and Academic Resources Coalition, the Association of American Universities, and the Open Research Funders Group.

Overview

The guidelines define modular standards spanning citation, data, analytic code, materials, design transparency, and preregistration, drawing on frameworks used by the National Academies of Sciences, Engineering, and Medicine, the Cochrane Collaboration, CONSORT, PRISMA, and the EQUATOR Network. They specify graded compliance levels to enable adoption by journals like Nature, Science, The Lancet, PLOS, and BMJ while coordinating with funders such as the Gates Foundation, the Howard Hughes Medical Institute, and the Wellcome Trust. The initiative has been promoted in forums including the Society for Neuroscience, the American Association for the Advancement of Science, and the International Congress on Peer Review and Scientific Publication, and via partnerships with repositories like Dryad, Zenodo, Figshare, and the Open Science Framework (OSF). The guidelines also interact with legal and policy contexts exemplified by the General Data Protection Regulation, the Health Insurance Portability and Accountability Act, the Freedom of Information Act, and institutional review frameworks at universities such as Harvard University, the University of Oxford, Stanford University, and the Massachusetts Institute of Technology.

Development and History

Development began with pilot conversations among researchers, editors, and funders convened by the Center for Open Science, involving signatories and stakeholders including the Royal Society, the American Psychological Association, the Society for Research in Child Development, the Association for Psychological Science, and publishers like John Wiley & Sons and Springer Nature. Early workshops referenced methodological critiques by scholars at institutions such as the University of Cambridge, Princeton University, the University of California, Berkeley, and Yale University, and responded to the replication crisis highlighted by debates around findings published in Science and PNAS (Proceedings of the National Academy of Sciences) and by discipline-specific controversies involving labs at Columbia University and the University of Amsterdam. The formal release in 2015 was accompanied by endorsements from organizations including the American Statistical Association, the Society for Industrial and Applied Mathematics, and the International Neuroinformatics Coordinating Facility, and from funders like UK Research and Innovation and the Australian Research Council.

Core Standards and Practices

The TOP framework defines modular standards across eight domains: citation, data, analytic methods (code), research materials, design and analysis transparency, preregistration of studies, preregistration of analysis plans, and replication. Implementation guidance references methodological standards used by CONSORT, STROBE, ARRIVE, and PRISMA, as well as statistical practice bodies such as the American Statistical Association, and aligns with repository practices at GitHub, Bitbucket, Dryad, and Zenodo. Each standard can be adopted at one of several levels of stringency, ranging from simple disclosure through mandatory sharing to independent verification; the levels are designed for integration with editorial policies at journals like Psychological Science, the Journal of Neuroscience, the American Journal of Public Health, and Nature Communications, and with grant conditions imposed by the National Institutes of Health, the European Research Council, and philanthropic funders such as the Wellcome Trust and the Gates Foundation.
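To make the rubric's modular shape concrete, the following minimal Python sketch models the eight standards and the commonly cited adoption levels (0 = not implemented, 1 = disclosure, 2 = requirement, 3 = verification) as a simple lookup structure. The summarize_policy helper and the example policy are hypothetical, introduced here only to illustrate how a journal's TOP statement maps onto standards and levels; they do not describe any real journal's policy.

```python
# Illustrative sketch of the TOP rubric as a data structure.
# The eight standards follow the domains named above; the level labels
# follow the commonly cited TOP levels. The helper and the example
# policy are hypothetical, not any journal's actual TOP statement.

TOP_STANDARDS = [
    "citation",
    "data transparency",
    "analytic methods (code) transparency",
    "research materials transparency",
    "design and analysis transparency",
    "study preregistration",
    "analysis plan preregistration",
    "replication",
]

LEVELS = {0: "not implemented", 1: "disclosure", 2: "requirement", 3: "verification"}

def summarize_policy(policy):
    """Return one line per standard for a policy mapping standard -> level.

    Standards the journal has not addressed default to level 0.
    """
    lines = []
    for standard in TOP_STANDARDS:
        level = policy.get(standard, 0)
        lines.append(f"{standard}: level {level} ({LEVELS[level]})")
    return "\n".join(lines)

# Hypothetical policy for illustration only.
print(summarize_policy({"data transparency": 2, "study preregistration": 1}))
```

Because adoption is scored per standard, a journal can, for example, require data sharing (level 2) while only encouraging preregistration disclosure (level 1), which is why the framework is described as modular.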

Implementation and Certification

Adoption pathways include editorial policy changes, funder mandates, and institutional requirements; implementation has been tracked via registries and badges introduced by journals such as Psychological Science and via platform integrations with the Open Science Framework, Crossref, ORCID, and publisher workflows at Springer Nature and Elsevier. Certification and monitoring mechanisms involve badge systems, peer-review checklists used by editorial boards at PLOS, BMJ, and Frontiers, and compliance audits similar to oversight practices at the National Institutes of Health and the Wellcome Trust. Training and infrastructure support are provided through workshops by organizations like the Center for Open Science, Data Carpentry, and Software Carpentry, and through university programs at the University of California, San Diego, the University of Michigan, and ETH Zurich.

Impact on Research Reproducibility

Empirical studies assessing the guidelines’ effects reference replication projects such as the Reproducibility Project: Psychology, a large-scale replication effort coordinated by the Open Science Collaboration, and follow-up analyses by researchers at the Meta-Research Innovation Center at Stanford (METRICS) and the Centre for Evidence-Based Medicine. Reported outcomes include increased data availability in journals like PLOS ONE, improved code sharing via GitHub and Zenodo, and enhanced transparency in clinical trial reporting in The BMJ and Lancet Psychiatry. Meta-analyses conducted by groups at Stanford University, University College London, and the Max Planck Society suggest variable effects across fields: some domains show measurable gains in reproducibility metrics, while others face persistent barriers related to proprietary data, ethical restrictions, and resource constraints at institutions such as Johns Hopkins University and the University of Toronto.

Criticisms and Limitations

Critiques come from stakeholders including editors at Nature, methodologists at the American Statistical Association, and ethicists at Georgetown University and Harvard Medical School, who highlight concerns about feasibility, equity, and unintended consequences. Specific limitations include challenges in sharing sensitive data under the General Data Protection Regulation and the Health Insurance Portability and Accountability Act, resource burdens on researchers at smaller institutions such as State University of New York campuses, and the potential misuse of shared materials documented in debates involving the University of Cambridge and McGill University. Other commentators, including writers at Times Higher Education and advocacy groups like the Scholarly Publishing and Academic Resources Coalition, note the risk of token compliance, disciplinary heterogeneity in norms (exemplified by contrasts between high-energy physics collaborations at CERN and fieldwork-based disciplines at the University of Cape Town), and the need for sustained investment by funders such as the National Science Foundation and philanthropic bodies including the Gates Foundation.

Category:Research transparency