Easyreporting simplifies the implementation of Reproducible Research layers in R software

Abstract:  In recent years, “irreproducibility” has become a widespread problem in omics data analysis, owing to the use of sophisticated and poorly documented computational procedures. To avoid misleading results, it must be possible to inspect and reproduce the entire data analysis as a unified product. Reproducible Research (RR) provides general guidelines for public access to the analytic data and the related analysis code, combined with natural-language documentation, allowing third parties to reproduce the findings. We developed easyreporting, a novel R/Bioconductor package, to facilitate the implementation of an RR layer inside reports and tools. We describe its main functionalities and illustrate the organization of an analysis report using a typical case study concerning the analysis of RNA-seq data. We then show how to use easyreporting in other projects to trace R functions automatically. This feature helps developers implement procedures that automatically keep track of the analysis steps. Easyreporting can support the reproducibility of any data analysis project and offers clear advantages for the implementation of R packages and GUIs. It proves especially helpful in bioinformatics, where the complexity of the analyses makes it extremely difficult to trace all the steps and parameters used in a study.
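
Easyreporting’s own interface is R, but the trace-and-report idea it implements is language-agnostic. As a rough sketch of what such an RR layer does (not easyreporting’s actual API), the hypothetical Python snippet below appends each decorated analysis call, with its exact parameters, to a running Markdown report.

    # Toy sketch of an RR "layer" (hypothetical names; easyreporting's real
    # R API differs): each decorated analysis step is appended to a running
    # Markdown report together with the exact call that produced it.
    import functools

    REPORT = "analysis_report.md"

    def traced(func):
        """Log each call of `func`, with its arguments, to the report."""
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            parts = [repr(a) for a in args]
            parts += [f"{k}={v!r}" for k, v in kwargs.items()]
            with open(REPORT, "a") as fh:
                fh.write(f"## Step: {func.__name__}\n\n")
                fh.write(f"    {func.__name__}({', '.join(parts)})\n\n")
            return func(*args, **kwargs)
        return wrapper

    @traced
    def normalize(counts, method="TMM"):
        # stand-in for a real normalization step in an RNA-seq pipeline
        return [c / max(counts) for c in counts]

    normalize([10, 50, 100], method="TMM")  # the report now documents this call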

Utrecht University Recognition and Rewards Vision

“By embracing Open Science as one of its five core principles, Utrecht University aims to accelerate and improve science and scholarship and its societal impact. Open science calls for a full commitment to openness, based on a comprehensive vision regarding the relationship with society. This ongoing transition to Open Science requires us to reconsider the way in which we recognize and reward members of the academic community. It should value teamwork over individualism and calls for an open academic culture that promotes accountability, reproducibility, integrity and transparency, and where sharing (open access, FAIR data and software) and public engagement are normal daily practice. In this transition we closely align ourselves with the national VSNU program as well as developments on the international level….”

Dockstore: enhancing a community platform for sharing reproducible and accessible computational protocols | Nucleic Acids Research | Oxford Academic

Abstract:  Dockstore (https://dockstore.org/) is an open-source platform for publishing, sharing, and finding bioinformatics tools and workflows. The platform has facilitated large-scale biomedical research collaborations by using cloud technologies to increase the Findability, Accessibility, Interoperability and Reusability (FAIR) of computational resources, thereby promoting the reproducibility of complex bioinformatics analyses. Dockstore supports a variety of source repositories, analysis frameworks, and language technologies to provide a seamless publishing platform for authors to create a centralized catalogue of scientific software. The ready-to-use packaging of hundreds of tools and workflows, combined with the implementation of interoperability standards, enables users to launch analyses across multiple environments. Dockstore is widely used: more than twenty-five high-profile organizations share analysis collections through the platform in a variety of workflow languages, including the Broad Institute’s GATK best-practice and COVID-19 workflows (WDL), nf-core workflows (Nextflow), the Intergalactic Workflow Commission tools (Galaxy), and workflows from Seven Bridges (CWL), to highlight just a few. Here we describe the improvements made over the last four years, including the expansion of system integrations supporting authors, the addition of collaboration features and analysis platform integrations supporting users, and other enhancements that improve the overall scientific reproducibility of Dockstore content.
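
Among the interoperability standards the abstract alludes to, Dockstore exposes its catalogue through the GA4GH Tool Registry Service (TRS) API, which is what lets external platforms discover and launch its workflows. The sketch below queries that API; the base URL and response fields are taken from the public TRS v2 specification and should be treated as assumptions to check against the live service.

    # Minimal sketch: list a few catalogue entries via the GA4GH Tool Registry
    # Service (TRS) API that Dockstore implements. The base URL and response
    # fields follow the public TRS v2 specification; treat both as assumptions
    # to verify against the live service.
    import requests

    BASE = "https://dockstore.org/api/ga4gh/trs/v2"

    resp = requests.get(f"{BASE}/tools", params={"limit": 5}, timeout=30)
    resp.raise_for_status()
    for tool in resp.json():
        # TRS v2 Tool objects carry id, organization, and toolclass fields
        print(tool["id"], "|", tool.get("organization", "?"))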

REPEAT (Reproducible Evidence: Practices to Enhance and Achieve Transparency)

“Replication is a cornerstone of the scientific method. Historically, public confidence in the validity of healthcare database research has been low. Drug regulators, patients, clinicians, and payers have been hesitant to trust evidence from databases due to high-profile controversies with overturned and conflicting results. This has resulted in underuse of a potentially valuable source of real-world evidence….

Division of Pharmacoepidemiology & Pharmacoeconomics [DoPE]

Brigham & Women’s Hospital and Harvard Medical School.”

Symposium: A critical analysis of the scientific reform movement

“As the science reform movement has gathered momentum to change research culture and behavior relating to openness, rigor, and reproducibility, so has the critical analysis of the reform efforts. This symposium includes five perspectives examining distinct aspects of the reform movement to illuminate and challenge underlying assumptions about the value and impact of changing practices, to identify potential unintended or counterproductive consequences, and to provide a meta perspective of metascience and open science. It’s meta, all the way up.

Each presenter will provide a 15-minute perspective followed by a concluding discussion among the panelists and a time to address audience questions. Visit cos.io/meta-meta to view session abstracts and speaker info.”

PsychOpen CAMA

“PsychOpen CAMA enables accessing meta-analytic datasets, reproducing meta-analyses and dynamically updating evidence from new primary studies collaboratively….

A CAMA (Community Augmented Meta-Analysis) is an open repository for meta-analytic data that provides meta-analytic analysis tools….

PsychOpen CAMA enables easy access and automated reproducibility of meta-analyses in psychology and related fields. This has several benefits for the research community:

Evidence can be kept updated by adding new studies published after the meta-analysis.
Researchers with special research questions can use subsets of the data or rerun meta-analyses using different moderators.
Flexible analyses with the datasets enable the application of new statistical procedures or different graphical displays.
The cumulated evidence in the CAMA can be used to get a quick overview of existing research gaps. This may indicate which study designs or moderators would be especially interesting for future studies, so that limited research resources are used in a way that enhances the evidence.
Given existing meta-analytic evidence, the necessary sample size of future studies to detect an effect of a reasonable size can be estimated. Moreover, the effect of possible future studies on the results of the existing meta-analytic evidence can be simulated.
PsychOpen CAMA offers tutorials to better understand the reasoning behind meta-analyses and to learn the basic steps of conducting a meta-analysis to empower other researchers to contribute to our project for the benefit of the research community….”
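
The “dynamically updating evidence” described above amounts to re-running a meta-analytic model whenever a new primary study is added. The toy sketch below shows that core computation as a fixed-effect, inverse-variance pooling step; the numbers are invented, and a real CAMA would use full meta-analytic tooling (random-effects models, moderators) rather than this minimal Python version.

    # Toy inverse-variance (fixed-effect) pooling: the core computation a
    # CAMA re-runs when a new primary study arrives. Illustrative data only.
    def pool(effects, variances):
        """Return the inverse-variance weighted mean effect and its variance."""
        weights = [1.0 / v for v in variances]
        pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
        return pooled, 1.0 / sum(weights)

    studies = [(0.30, 0.02), (0.45, 0.05), (0.25, 0.04)]  # (effect, variance)
    est, var = pool(*zip(*studies))
    print(f"pooled effect = {est:.3f}, SE = {var ** 0.5:.3f}")

    # "Dynamically updating evidence": append a new study and re-pool.
    studies.append((0.40, 0.03))
    print(pool(*zip(*studies)))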

Principles of open, transparent and reproducible science in author guidelines of sleep research and chronobiology journals

Abstract:  Background: “Open science” is an umbrella term describing various aspects of transparent and open science practices. The adoption of practices at different levels of the scientific process (e.g., individual researchers, laboratories, institutions) has been rapidly changing the scientific research landscape in the past years, but their uptake differs from discipline to discipline. Here, we asked to what extent journals in the field of sleep research and chronobiology encourage or even require following transparent and open science principles in their author guidelines.

Methods: We scored the author guidelines of a comprehensive set of 27 sleep and chronobiology journals, including the major outlets in the field, using the standardised Transparency and Openness (TOP) Factor. The TOP Factor is a quantitative summary of the extent to which journals encourage or require following various aspects of open science, including data citation, data transparency, analysis code transparency, materials transparency, design and analysis guidelines, study pre-registration, analysis plan pre-registration, replication, registered reports, and the use of open science badges.

Results: Across the 27 journals, we find low values on the TOP Factor (median [25th, 75th percentile] 3 [1, 3], min. 0, max. 9, out of a total possible score of 29) in sleep research and chronobiology journals.

Conclusions: Our findings suggest an opportunity for sleep research and chronobiology journals to further support recent developments in transparent and open science by implementing transparency and openness principles in their author guidelines.
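
As an aside on the Results notation: “median [25th, 75th percentile] 3 [1, 3]” reports the median score together with its interquartile bounds. The snippet below recomputes such a summary from a hypothetical set of 27 journal scores chosen only to be consistent with the reported median, quartiles, minimum, and maximum; it is not the study’s data.

    # Recomputing a "median [25th, 75th percentile]" summary. The 27 scores
    # below are hypothetical, chosen only to match the reported median 3,
    # quartiles [1, 3], minimum 0, and maximum 9; they are not the study data.
    import statistics

    scores = [0, 0, 1, 1, 1, 1, 1, 1, 1, 2, 2, 3, 3, 3,
              3, 3, 3, 3, 3, 3, 3, 4, 5, 6, 7, 8, 9]
    q1, med, q3 = statistics.quantiles(scores, n=4)
    print(f"median [25th, 75th percentile]: {med:.0f} [{q1:.0f}, {q3:.0f}]")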

Why they shared: recovering early arguments for sharing social scientific data | Science in Context | Cambridge Core

Abstract:  Most social scientists today think of data sharing as an ethical imperative essential to making social science more transparent, verifiable, and replicable. But what moved the architects of some of the U.S.’s first university-based social scientific research institutions, the University of Michigan’s Institute for Social Research (ISR), and its spin-off, the Inter-university Consortium for Political and Social Research (ICPSR), to share their data? Relying primarily on archived records, unpublished personal papers, and oral histories, I show that Angus Campbell, Warren Miller, Philip Converse, and others understood sharing data not as an ethical imperative intrinsic to social science but as a useful means to the diverse ends of financial stability, scholarly and institutional autonomy, and epistemological reproduction. I conclude that data sharing must be evaluated not only on the basis of the scientific ideals its supporters affirm, but also on the professional objectives it serves.

An interview with protocols.io: CEO Lenny Teytelman on partnering with PLOS – The Official PLOS Blog

“Our ongoing partnership with protocols.io led to a new and exciting PLOS ONE article type, Lab Protocols, which offers a new avenue to share research in-line with the principles of Open Science. This two-part article type gives authors the best of both platforms: protocols.io hosts the step-by-step methodology details while PLOS ONE publishes a companion article that contextualizes the study and orchestrates peer review of the material. The end result is transparent and reproducible research that can help accelerate scientific discovery.

Read our interview with protocols.io CEO and co-founder Dr. Lenny Teytelman where he discusses what triggered the concept for the company, how researchers can benefit from this collaboration and how his team settled on such an adorable raccoon logo….”

Improving Social Science: Lessons from the Open Science Movement | PS: Political Science & Politics | Cambridge Core

“Recent years have been times of turmoil for psychological science. Depending on whom you ask, the field underwent a “replication crisis” (Shrout and Rodgers 2018) or a “credibility revolution” (Vazire 2018) that might even climax in “psychology’s renaissance” (Nelson, Simmons, and Simonsohn 2018). This article asks what social scientists can learn from this story. Our take-home message is that although differences in research practices make it difficult to prescribe cures across disciplines, much still can be learned from interdisciplinary exchange. We provide nine lessons but first summarize psychology’s experience and what sets it apart from neighboring disciplines….”

Notebook articles: towards a transformative publishing experience in nonlinear science

Abstract:  Open Science, Reproducible Research, and the Findable, Accessible, Interoperable and Reusable (FAIR) data principles are long-term goals for scientific dissemination. However, implementing these principles calls for a re-inspection of our means of dissemination. In this viewpoint, we discuss and advocate, in the context of nonlinear science, how a notebook article represents an essential step toward this objective by fully embracing cloud computing solutions. Notebook articles as scholarly articles offer an alternative, efficient and more ethical way to disseminate research through their versatile environment. This format invites readers to delve deeper into the reported research. Through the interactivity of notebook articles, research results such as equations and figures are reproducible even for non-expert readers. The code and methods are available, in a transparent manner, to interested readers. The methods can be reused and adapted to answer additional questions in related topics. The code runs on cloud computing services, which provide easy access, even for low-income countries and research groups. The versatility of this environment provides stakeholders – from researchers to publishers – with opportunities to disseminate research results in innovative ways.
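
To make the idea of an executable, reader-modifiable result concrete, the cell below is the kind of self-contained snippet a notebook article in nonlinear science could embed: readers can rerun it, change the parameter r or the initial condition, and regenerate the output. The logistic-map example is ours, not one taken from the paper.

    # The kind of executable cell a notebook article embeds: rerun it, change
    # r or x0, and the result regenerates. The logistic map is a generic
    # nonlinear-science example of ours, not one taken from the viewpoint.
    def logistic_orbit(r, x0, n):
        """Iterate x -> r * x * (1 - x) and return the first n values."""
        xs = [x0]
        for _ in range(n - 1):
            xs.append(r * xs[-1] * (1 - xs[-1]))
        return xs

    # At r = 3.9 the map is chaotic: nearby initial conditions diverge fast.
    for x0 in (0.2, 0.2001):
        print(x0, "->", [round(x, 4) for x in logistic_orbit(3.9, x0, 8)])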

Assessment of transparency indicators across the biomedical literature: How open is open?

Abstract:  Recent concerns about the reproducibility of science have led to several calls for more open and transparent research practices and for the monitoring of potential improvements over time. However, with tens of thousands of new biomedical articles published per week, manually mapping and monitoring changes in transparency is unrealistic. We present an open-source, automated approach to identify 5 indicators of transparency (data sharing, code sharing, conflicts of interest disclosures, funding disclosures, and protocol registration) and apply it across the entire open access biomedical literature of 2.75 million articles on PubMed Central (PMC). Our results indicate remarkable improvements in some (e.g., conflict of interest [COI] disclosures and funding disclosures), but not other (e.g., protocol registration and code sharing) areas of transparency over time, and map transparency across fields of science, countries, journals, and publishers. This work has enabled the creation of a large, integrated, and openly available database to expedite further efforts to monitor, understand, and promote transparency and reproducibility in science.
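
At its core, the indicator detection described above is text mining over full-text articles. The deliberately naive sketch below flags two of the five indicators with regular expressions; the patterns are illustrative inventions of ours, and the authors’ production pipeline is substantially more sophisticated.

    # Naive sketch of indicator detection: flag data- and code-sharing
    # statements with regular expressions. The patterns are illustrative
    # inventions; the pipeline behind the paper is far more elaborate.
    import re

    PATTERNS = {
        "data_sharing": re.compile(
            r"(data (are|is) available|deposited in|accession (number|code))", re.I),
        "code_sharing": re.compile(
            r"(code is available|github\.com/|scripts? (are|is) available)", re.I),
    }

    def transparency_indicators(text):
        """Return which indicators appear to be present in `text`."""
        return {name: bool(rx.search(text)) for name, rx in PATTERNS.items()}

    sample = ("All sequencing data are available under accession number X123. "
              "Analysis code is available at github.com/example/repo.")
    print(transparency_indicators(sample))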