Journals to trial tool that automatically flags reproducibility and transparency issues in papers | News | Chemistry World

“A tool using natural language processing and machine learning algorithms is being rolled out on journals to automatically flag reproducibility, transparency and authorship problems in scientific papers.

The tool, Ripeta, has existed since 2017 and has already been run on millions of journal papers since its release, but now the tool’s creators have enabled its latest versions to be run on papers before peer review. In August, Ripeta was integrated with the widely used manuscript submission system Editorial Manager in a bid to identify shortcomings in papers before they are sent out to peer review at journals. At this stage the tool’s creators won’t disclose which journals are using Ripeta, citing commercial confidentiality.

Ripeta sifts through papers to identify ‘trust markers’, such as whether they contain data and code availability statements, open access statements, ethical approvals, author contributions, repository notices and funding declarations.

From October 2022, the technology behind Ripeta was also integrated into the scholarly database Dimensions, giving users access to metadata about trust markers – for a fee – in 33 million academic papers published since 2010….”

Research Square Launches Beta Testing of Ripeta’s Open Science Assessment Tool – ripeta

“The first 200 authors who opt in can use this manuscript improvement tool at no cost.

Research Square has launched a beta trial of its new automated Open Science Assessment tool, which can help authors enhance the quality of their research and the robustness of their scientific reporting.

This opt-in tool, powered by Ripeta and currently in the beta testing phase, is available at no cost for authors who upload their preprints to the Research Square platform….

Ripeta’s natural language processing technology targets several critical elements of a scientific manuscript, including purpose, data and code availability statements, funding statements, and more to gauge the level of responsible reporting in authors’ scientific papers and suggest improvements….”

ripeta – responsible science

“Ripeta is a credit review for scientific publications. Similar to a financial credit report, which reviews the fiscal health of a person, Ripeta assesses the responsible reporting of the scientific paper. The Ripeta suite identifies and extracts the key components of research reporting, thus drastically shortening and improving the publication process; furthermore, Ripeta’s ability to extract data makes these pieces of text easily discoverable for future use….

Researchers: Rapidly check your pre-print manuscripts to improve the transparency of reporting your research.

Publishers: Improve the reproducibility of the articles you publish with an automated tool that helps evidence-based science.

Funders: Evaluate your portfolio by checking your manuscripts for robust scientific reporting.”

Wellcome and Ripeta partner to assess dataset availability in funded research – Digital Science

“Ripeta and Wellcome are pleased to announce a collaborative effort to assess data and code availability in the manuscripts of funded research projects.

The project will analyze papers funded by Wellcome from the year prior to it establishing a dedicated Open Research team (2016) and from the most recent calendar year (2019). It supports Wellcome’s commitment to maximising the availability and re-use of results from its funded research.

Ripeta, a Digital Science portfolio company, aims to make better science easier by identifying and highlighting the important parts of research that should be transparently presented in a manuscript and other materials.

The collaboration will leverage Ripeta’s natural language processing (NLP) technology, which scans articles for reproducibility criteria. For both data availability and code availability, the NLP will produce a binary yes-no response for the presence of availability statements. Those with a “yes” response will then be categorized by the way that data or code are shared….”
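The two-step check described above – a binary yes/no for the presence of an availability statement, followed by a categorization of how the data or code are shared – can be sketched with simple keyword heuristics. This is a minimal illustrative sketch only; the patterns, category names, and function below are assumptions for demonstration, not Ripeta’s actual NLP:

```python
import re

# Illustrative cue for an availability statement (hypothetical, not Ripeta's model).
AVAILABILITY_CUES = re.compile(
    r"\b(data|code)\s+(?:availability|are available|is available)", re.IGNORECASE
)

# Hypothetical sharing categories, checked in order of specificity.
SHARING_CATEGORIES = [
    ("repository", re.compile(r"\b(zenodo|figshare|github|dryad|repository)\b", re.I)),
    ("supplementary", re.compile(r"\bsupplementary\b", re.I)),
    ("on request", re.compile(r"\b(?:up)?on (?:reasonable )?request\b", re.I)),
]

def assess_availability(text: str) -> dict:
    """Step 1: binary presence flag. Step 2: if present, categorize sharing method."""
    has_statement = bool(AVAILABILITY_CUES.search(text))
    category = None
    if has_statement:
        for name, pattern in SHARING_CATEGORIES:
            if pattern.search(text):
                category = name
                break
        else:
            category = "unspecified"
    return {"statement_present": has_statement, "category": category}
```

For example, a manuscript sentence like “All data are available in the Zenodo repository” would yield a “yes” flag categorized as repository sharing, while text with no availability statement yields a “no”.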