Category Archives: oa.reproducibility

“The purpose of the course is to provide an overview of current challenges in reproducibility and to provide tools and skills for students wishing to practice science openly….”
Open Science @ Concordia – CRBLM
Conference: Open Science @ Concordia
May 27, 2022

Open science—the movement to make scientific processes and outputs available and accessible to all—is here to stay and is profoundly changing the way we do research. The transition to the open-by-design and by-default model described in the Government of Canada’s Roadmap for Open Science is a call to action in areas as diverse as open access, open data, open notebooks, open evaluation, open educational resources, open innovation, open-source software, open governments, and citizen science. As a next-generation university, Concordia is positioned to become an open science leader both nationally and internationally.
The Open Science @ Concordia conference will be a celebration of past efforts towards openness and the kickoff for Concordia’s transition to becoming a fully open institution. The day-long event will bring together advocates, enthusiasts, and other stakeholders in open science from across Concordia’s faculties and other regional institutions. Events include keynote talks featuring international speakers, interdisciplinary lightning-talk sessions, and lunchtime roundtables. With plenty of free food and drink, and ample time for exchange and discussion, Open Science @ Concordia will be the perfect opportunity to meet the open community and help open science gain momentum. Attendees are welcome from any institution. Join us either virtually or in person at the Loyola Jesuit Hall and Conference Centre for a day dedicated to the democratization of knowledge without barriers. Registration is free but required for in-person attendance.
Key Factors for Improving Rigor and Reproducibility: Guidelines, Peer Reviews, and Journal Technical Reviews | Frontiers in Cardiovascular Medicine
Abstract: To respond to the NIH’s policy for rigor and reproducibility in preclinical research, many journals have implemented guidelines and checklists to guide authors in improving the rigor and reproducibility of their research. Transparency in developing detailed prospective experimental designs and providing raw data are essential premises of rigor and reproducibility. Standard peer reviews and journal-specific technical and statistical reviews are critical factors for enhancing rigor and reproducibility. This brief review also shares some experience from Arteriosclerosis, Thrombosis, and Vascular Biology, an American Heart Association journal, that has implemented several mechanisms to enhance rigor and reproducibility for preclinical research….
Investigating the Effectiveness of the Open Data Badge Policy at Psychological Science Through Computational Reproducibility
Abstract: In April 2019, Psychological Science published its first issue in which all research articles received the Open Data badge. We used that issue to investigate the effectiveness of this badge, focusing on the adherence to its stated aim at Psychological Science: ensuring reproducibility of results. Twelve researchers of varying experience levels attempted to reproduce the results of the empirical articles in the target issue (at least three researchers per article). We found that all articles provided at least some data, 6/14 articles provided analysis code or scripts, only 1/14 articles was rated as exactly reproducible, and 3/14 as essentially reproducible with minor deviations. We recommend that Psychological Science require a check of reproducibility at the peer review stage before awarding badges, and that the Open Data badge be renamed “Open Data and Code” to avoid confusion and encourage researchers to adhere to this higher standard.
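As a rough illustration of what a reproducibility check at the peer-review stage could look like, here is a minimal Python sketch that reruns a deposited analysis script and compares the recomputed statistics against the values reported in the manuscript. All file names, statistic names, and values are hypothetical placeholders, not drawn from the audited articles.

```python
"""Minimal sketch of a computational reproducibility check.

Hypothetical setup: the authors deposit analysis.py, which writes its key
statistics to results.json; the checker compares those recomputed values
against the values reported in the manuscript.
"""
import json
import subprocess

TOLERANCE = 1e-6  # allow trivial floating-point differences between runs

def check_reproducibility(script, results_file, reported):
    # Rerun the deposited analysis script exactly as shared by the authors.
    subprocess.run(["python", script], check=True)

    # Load the freshly recomputed statistics written by the script.
    with open(results_file) as f:
        computed = json.load(f)

    # Compare each reported statistic against its recomputed value.
    mismatches = {}
    for name, value in reported.items():
        recomputed = computed.get(name)
        if recomputed is None or abs(recomputed - value) > TOLERANCE:
            mismatches[name] = (value, recomputed)
    return mismatches  # an empty dict corresponds to "exactly reproducible"

# Hypothetical usage, with statistics as reported in a manuscript:
# issues = check_reproducibility("analysis.py", "results.json",
#                                {"cohens_d": 0.42, "t_stat": 2.31})
# print("reproducible" if not issues else f"deviations: {issues}")
```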
Guest Post: A Decade of Open Data in Research — Real Change or Slow Moving Compliance? – The Scholarly Kitchen
“There has been much made of the recent Nature news declaration of the NIH Data Policy (from January 2023) as ‘seismic’. In my opinion, it truly is. Many others will argue that the language is not strong enough. But for me, the fact that the largest public funder of biomedical research in the world is telling researchers to share their data demonstrates how fast the push for open academic data is accelerating.
While a lot of the focus is on incentive structures and the burden for researchers, the academic community should not lose focus on the potential ‘seismic’ benefits that open data can have for reproducibility and efficiency in research, as well as the ability to move further and faster when it comes to knowledge advancement….
Reflecting on the past decade of open research data, there are a few key developments that have helped speed up the momentum in the space, as well as a few ideas that haven’t come to fruition…yet.
The NIH is not the first funder to tell the researchers they fund that they should be making their data openly available to all. 52 funders listed on Sherpa Juliet require data archiving as a condition of funding, while a further 34 encourage it. A push from publishers has also acted as a major motivator for researchers to share their data. This goes as far back as PLOS requiring all article authors to make their data publicly available back in 2014. Now, nearly all major science journals have an open data policy of some kind. Some may say there is no better motivator for a researcher to share their data than if a publication is at stake.
In 2016, the ‘FAIR Guiding Principles for scientific data management and stewardship’ were published in Scientific Data, and a flurry of debate on the definition of Findable, Accessible, Interoperable, and Reusable data has continued ever since. This has been a net win for the space. Although every institution, publisher, and funder may not be aiming for the exact same outcome, it is a move to better describe data outputs and ultimately make them usable as standalone outputs. The FAIR principles emphasize that when thinking of research data, future consumers will not just be human researchers — we also need to feed the machines. This means that computers will need to interpret content with little or no human intervention. For this to be possible, the outputs need to be in machine-readable formats and the metadata needs to be sufficient to describe exactly what the data are and how they were generated.
This highlights the area (in my opinion) that can create the most change in the shortest amount of time: quality of metadata….”
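The emphasis on metadata quality and machine readability lends itself to a concrete example. Below is a minimal sketch of dataset metadata in a machine-readable form, using the schema.org Dataset vocabulary serialized as JSON from Python; the vocabulary is real, but every dataset detail shown is an invented placeholder.

```python
"""Minimal sketch of FAIR-oriented, machine-readable dataset metadata.

Uses the schema.org Dataset vocabulary; all concrete values below are
invented placeholders, not a real dataset.
"""
import json

metadata = {
    "@context": "https://schema.org",
    "@type": "Dataset",
    "name": "Example reaction-time dataset",           # what the data are
    "description": "Trial-level reaction times from a hypothetical study.",
    "identifier": "https://doi.org/10.xxxx/example",   # placeholder DOI
    "license": "https://creativecommons.org/licenses/by/4.0/",
    "encodingFormat": "text/csv",                      # machine-readable format
    "measurementTechnique": "computer-based response task",  # how the data were generated
    "variableMeasured": ["reaction_time_ms", "condition", "participant_id"],
}

# Serialized this way, the description of what the data are and how they
# were generated can be parsed with little or no human intervention.
print(json.dumps(metadata, indent=2))
```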
What senior academics can do to support reproducible and open research: a short, three-step guide | BMC Research Notes
Abstract: Increasingly, policies are being introduced to reward and recognise open research practices, while the adoption of such practices into research routines is being facilitated by many grassroots initiatives. However, despite this widespread endorsement and support, as well as various efforts led by early career researchers, open research is yet to be widely adopted. For open research to become the norm, initiatives should engage academics from all career stages, particularly senior academics (namely senior lecturers, readers, professors) given their routine involvement in determining the quality of research. Senior academics, however, face unique challenges in implementing policy changes and supporting grassroots initiatives. Given that—like all researchers—senior academics are motivated by self-interest, this paper lays out three feasible steps that senior academics can take to improve the quality and productivity of their research, that also serve to engender open research. These steps include changing (a) hiring criteria, (b) how scholarly outputs are credited, and (c) how we fund and publish in line with open research principles. The guidance we provide is accompanied by material for further reading.
Open access methods and protocols promote open science in a pandemic – ScienceDirect
“How open-access methods and protocols publishing advanced the project’s goals
In considering a publication strategy, Milón was motivated by a common feeling of frustration: being fascinated by a new scientific publication and excited to try the new approach in his own lab but ultimately being disappointed to realize that the methods reporting wasn’t quite robust enough to faithfully recreate the experiment. Milón sees this as not only an inconvenience for himself but a broader challenge for research reproducibility. To help other groups avoid similar challenges in adopting the method, their results were therefore reviewed, polished, and packaged as three freely available scientific documents (Alcántara et al., 2021a; Alcántara et al., 2021b; Mendoza-Rojas, et al., 2021). The development of the method, including detailed reporting of the various optimizations and analytical comparisons that informed each component of the assay, was described in Cell Reports Methods (Alcántara et al., 2021b). The methods paper provides the empirical justification for each step of the method and serves as both a general blueprint for future open-source diagnostic methods development and as a more specific template from which future modifications to any given step can be explored….”
Show your work: Tools for open developmental science – ScienceDirect
Abstract: Since grade school, students of many subjects have learned to “show their work” in order to receive full credit for assignments. Many of the reasons for students to show their work extend to the conduct of scientific research. And yet multiple barriers make it challenging to share and show the products of scientific work beyond published findings. This chapter discusses some of these barriers and how web-based data repositories help overcome them. The focus is on Databrary.org, a data library specialized for storing and sharing video data with a restricted community of institutionally approved investigators. Databrary was designed by and for developmental researchers, and so its features and policies reflect many of the specific challenges faced by this community, especially those associated with sharing video and related identifiable data. The chapter argues that developmental science poses some of the most interesting, challenging, and important questions in all of science, and that by openly sharing much more of the products and processes of our work, developmental scientists can accelerate discovery while making our scholarship much more robust and reproducible.
Open Science and Multicultural Research: Some Data, Considerations, and Recommendations
Abstract: Objectives: There are two potentially useful but nonintersecting efforts to help ensure that psychological science produces valid and credible information and contributes to the understanding of diverse human experiences. Whereas North American ethnic minority psychology research/cultural diversity science (EM/D) emphasizes cultural competency to yield contextualized psychological understanding of understudied and underserved minority populations, current open science (OS) approaches emphasize material and data sharing, and statistical proficiency to maximize the replicability of mainstream findings. To illuminate the extent of and explore reasons for this bifurcation, and OS’s potential impact on EM/D, we conducted three studies. Methods and Results: In Study 1, we reviewed editorial/publishing policies and empirical articles appearing in four major EM/D journals on the incentives for and use of OS. Journals varied in OS-related policies; 32 of 823 empirical articles incorporated any OS practices. Study 2 was a national mixed-methods survey of EM/D scholars’ (N=141) and journal editors’ (N=16) views about and experiences with OS practices. Emergent themes included beliefs about the impact of OS on scientific quality, possible professional disadvantages for EM/D scholars, and concerns about the welfare of and ethical risks posed for communities of color. In Study 3, we explored community research participants’ beliefs about data sharing and credibility of science/scientists (N=1,104). Participants were receptive to data sharing and viewed psychological science favorably. Conclusions: We provide data-driven recommendations for researchers to assemble the best tools for approaching the knowledge-production process with transparency, humility, and cultural competency.
Waltman et al. (2022) How to improve scientific peer review: Four schools of thought | SocArXiv Papers
Waltman, L., Kaltenbrunner, W., Pinfield, S., & Woods, H. B. (2022, March 9). How to improve scientific peer review: Four schools of thought. https://doi.org/10.31235/osf.io/v8ghj
Abstract: Peer review plays an essential role as one of the cornerstones of the scholarly publishing system. There are many initiatives that aim to improve the way in which peer review is organized, resulting in a highly complex landscape of innovation in peer review. Different initiatives are based on different views on the most urgent challenges faced by the peer review system, leading to a diversity of perspectives on how the system can be improved. To provide a more systematic understanding of the landscape of innovation in peer review, we suggest that the landscape is shaped by four schools of thought: The Quality & Reproducibility school, the Democracy & Transparency school, the Equity & Inclusion school, and the Efficiency & Incentives school. Each school has a different view on the key problems of the peer review system and the innovations necessary to address these problems. The schools partly complement each other, but we argue that there are also important tensions between the schools. We hope that the four schools of thought offer a useful framework to facilitate conversations about the future development of the peer review system.
Guest post: OASPA & Make Data Count Workshop Report: What are publisher experiences of data citation and what can we do to help? – OASPA
Towards a culture of open scholarship: the role of pedagogical communities
The UK House of Commons Science and Technology Committee has called for evidence on the roles that different stakeholders play in reproducibility and research integrity. Of central priority are proposals for improving research integrity and quality, as well as guidance and support for researchers. In response to this, we argue that there is one important component of research integrity that is often absent from discussion: the pedagogical consequences of how we teach, mentor, and supervise students through open scholarship. We justify the need to integrate open scholarship principles into research training within higher education and argue that pedagogical communities play a key role in fostering an inclusive culture of open scholarship. We illustrate these benefits by presenting the Framework for Open and Reproducible Research Training (FORRT), an international grassroots community whose goal is to provide support, resources, visibility, and advocacy for the adoption of principled, open teaching and mentoring practices, whilst generating conversations about the ethics and social impact of higher-education pedagogy. Representing a diverse group of early-career researchers and students across specialisms, we advocate for greater recognition of and support for pedagogical communities, and encourage all research stakeholders to engage with these communities to enable long-term, sustainable change.
Streamlining statistical reproducibility: NHLBI ORCHID clinical trial results reproduction | JAMIA Open | Oxford Academic
Abstract: Reproducibility in medical research has been a long-standing issue. More recently, the COVID-19 pandemic has publicly underlined this fact as the retraction of several studies reached general media audiences. A significant number of these retractions occurred after in-depth scrutiny of the methodology and results by the scientific community. Consequently, these retractions have undermined confidence in the peer-review process, which is not considered sufficiently reliable to generate trust in the published results. This partly stems from opacity in published results, the practical implementation of the statistical analysis often remaining undisclosed. We present a workflow that uses a combination of informatics tools to foster statistical reproducibility: an open-source programming language, Jupyter Notebooks, a cloud-based data repository, and an application programming interface can streamline an analysis and help to kick-start new analyses. We illustrate this principle by (1) reproducing the results of the ORCHID clinical trial, which evaluated the efficacy of hydroxychloroquine in COVID-19 patients, and (2) expanding on the analyses conducted in the original trial by investigating the association of premedication with biological laboratory results. Such workflows will be encouraged for future publications from National Heart, Lung, and Blood Institute-funded studies.
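As a rough sketch of the kind of streamlined workflow the abstract describes (not the authors’ actual pipeline), the Python fragment below pulls a dataset from a repository over an API, computes a simple per-arm summary, and records the software environment so the analysis can be rerun. The URL and column names are hypothetical placeholders.

```python
"""Minimal sketch of a reproducible analysis workflow: fetch data via an
API, analyze, and record provenance. The endpoint and columns are invented.
"""
import json
import sys

import pandas as pd  # assumed available; pandas reads CSVs straight from a URL

DATA_URL = "https://data.example.org/api/trials/example.csv"  # hypothetical endpoint

def run_analysis():
    # 1. Fetch the dataset from a cloud-based repository via its API.
    df = pd.read_csv(DATA_URL)

    # 2. Run the statistical analysis (a simple per-arm summary standing in
    #    for the trial's actual endpoints).
    summary = df.groupby("arm")["outcome"].agg(["mean", "std", "count"])

    # 3. Record the exact software environment so others can rerun and
    #    check the analysis.
    provenance = {"python": sys.version, "pandas": pd.__version__}
    return summary, provenance

if __name__ == "__main__":
    summary, provenance = run_analysis()
    print(summary)
    print(json.dumps(provenance, indent=2))
```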