JMIR Publications recently published “Establishing Institutional Scores With the Rigor and Transparency Index: Large-scale Analysis of Scientific Reporting Quality” in the Journal of Medical Internet Research (JMIR). The study reported that improving rigor and transparency measures should lead to improvements in reproducibility across the scientific literature, but that assessing transparency is very difficult when done manually by reviewers.
Credit: Licensed by JMIR
Video interview with the authors of this article: https://youtu.be/iWcNuCOKp7U
The overall aim of this study is to establish a scientific reporting quality metric that can be used across institutions and countries, as well as to highlight the need for high-quality reporting to ensure replicability within biomedicine, making use of manuscripts from the Reproducibility Project: Cancer Biology.
The authors address an enhancement of the previously introduced Rigor and Transparency Index (RTI), which attempts to automatically assess the rigor and transparency of journals, institutions, and countries using manuscripts scored on criteria found in reproducibility guidelines (eg, NIH, MDAR, ARRIVE).
Using work by the Reproducibility Project: Cancer Biology, the authors determined that replication studies scored significantly higher than the original papers, all of which, according to the project, required additional information from the authors before replication efforts could begin.
Unfortunately, RTI measures for journals, institutions, and countries all currently fall below the replication-study average. If the RTI of these replication studies is taken as a target for future manuscripts, more work will be needed to ensure that the average manuscript contains sufficient information for replication attempts.
Dr. Anita Bandrowski from the University of California San Diego said, “Research reproducibility is necessary for scientific progress. However, over the last decade, numerous reports on research irreproducibility have shed light on a lingering problem, one that is proving to be both troublesome and costly.”
In an effort to encourage reproducibility, numerous scientific organizations and journals have adopted the Transparency and Openness Promotion guidelines, which focus on establishing best practices at the level of individual journals.
Along a similar vein, the publisher-driven Materials Design, Analysis, and Reporting framework is a multidisciplinary research framework designed to improve reporting transparency across life science research at the level of individual manuscripts.
This framework provides a consistent, minimum reporting checklist whose criteria were used, in part, to create the first RTI, a journal quality metric focusing on research methodologies and reporting transparency.
Specifically, the authors here introduce the latest version of the RTI, which represents the mean SciScore over a subset of papers, and demonstrate how it can be used to assess reporting transparency within research institutions.
While we cannot simply describe all papers scoring a “2” as not replicable and all papers scoring an “8” as replicable, given the many fields and their respective best practices, we can state that higher scores are associated with more methodological detail and are therefore likely to make replication attempts easier.
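Since the article describes the RTI as the mean SciScore over a subset of papers, the calculation can be sketched in a few lines. This is an illustrative sketch only; the function name, the minimum-sample threshold, and the example scores are assumptions for demonstration, not part of the actual SciScore pipeline.

```python
def rigor_transparency_index(scores, min_papers=100):
    """Return the mean SciScore for a set of papers, or None if the
    sample is too small to give a stable institutional estimate.

    `scores` is a list of per-manuscript SciScore values (roughly 1-10);
    `min_papers` is a hypothetical cutoff, not a documented parameter.
    """
    if len(scores) < min_papers:
        return None  # too few scored manuscripts for a reliable RTI
    return sum(scores) / len(scores)

# Illustrative per-paper SciScores for a hypothetical institution
example_scores = [4, 5, 6, 3, 7, 5, 6, 4, 5, 6]
rti = rigor_transparency_index(example_scores, min_papers=10)
print(rti)  # prints 5.1, the mean of the example scores
```

The averaging itself is trivial; the substance of the metric lies in how each paper's SciScore is derived from the rigor criteria (NIH, MDAR, ARRIVE) described above.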
###
DOI – https://doi.org/10.2196/37324
Full-text – https://www.jmir.org/2022/6/e37324
Free Altmetric Report – https://jmir.altmetric.com/details/130343509
JMIR Publications is a leading, born-digital, open access publisher of 30+ academic journals and other innovative scientific communication products that focus on the intersection of health and technology. Its flagship journal, the Journal of Medical Internet Research, is the leading digital health journal globally in content breadth and visibility, and it is the largest journal in the medical informatics field.
To learn more about JMIR Publications, please visit https://www.JMIRPublications.com or connect with us via:
YouTube – https://www.youtube.com/c/JMIRPublications
Facebook – https://www.facebook.com/JMedInternetRes
Twitter – https://twitter.com/jmirpub
LinkedIn – https://www.linkedin.com/company/jmir-publications
Instagram – https://www.instagram.com/jmirpub/
Head Office – 130 Queens Quay East, Unit 1100 Toronto, ON, M5A 0P6 Canada
If you are interested in learning more about promotional opportunities please contact us at [email protected]
The content of this communication is licensed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, published by JMIR Publications, is properly cited. JMIR Publications is a registered trademark of JMIR Publications.
About SciScore
SciScore is a scientific content checking and validation tool that verifies common rigor criteria (NIH, MDAR, ARRIVE) and research resources (antibodies, cell lines, organisms). These guidelines can be checked by editorial staff, but the process is tedious and demands considerable effort from a skilled professional, so checklists are enforced only at the best-resourced journals. SciScore uses text mining techniques to do the job in minutes, providing a report to editors, reviewers, or authors about which criteria have and have not been addressed. Furthermore, it provides a numerical score, which allows editors to assess at a glance the percentage of criteria met.
Contact Researchers: Anita Bandrowski | [email protected]
Contact Media/Publishers: Martijn Roelandse | [email protected]
Journal
Journal of Medical Internet Research
DOI
10.2196/37324
Method of Research
Data/statistical analysis
Subject of Research
People
Article Title
Establishing Institutional Scores With the Rigor and Transparency Index: Large-scale Analysis of Scientific Reporting Quality
Article Publication Date
27-Jun-2022