Ouvrir la Science – Deuxième Plan national pour la science ouverte

From Google’s English:  “The National Open Science Plan announced in 2018 by the Minister of Higher Education, Research and Innovation, Frédérique Vidal, has enabled France to adopt a coherent and dynamic policy in the field of open science, coordinated by the Committee for Open Science, which brings together the ministry, research and higher education institutions, and the scientific community. After three years of implementation, the progress made is notable. The rate of French scientific publications in open access rose from 41% to 56%. The National Open Science Fund was created; it launched two calls for projects in favor of open scientific publication and supported structuring international initiatives. The National Research Agency and other funding agencies now require open access to publications and the drafting of data management plans for the projects they fund. The function of ministerial research data administrator has been created, and a network is being deployed in the institutions. About twenty universities and research organizations have adopted an open science policy. Several guides and recommendations for putting open science into practice have been published.

The steps already taken and the evolution of the international context invite us to extend, renew, and strengthen our commitments by adopting a second National Plan for Open Science, the effects of which will be deployed until 2024. With this new plan, France continues the ambitious trajectory initiated by the Law for a Digital Republic of 2016 and confirmed by the Research Programming Law of 2020, which includes open science among the missions of researchers and teacher-researchers.

This second National Plan extends its scope to the source code produced by research; it structures actions in favor of opening or sharing data through the creation of the Recherche Data Gouv platform; it multiplies the levers of transformation in order to make open science practices the norm; and it presents disciplinary and thematic variations. It is firmly in line with a European ambition and proposes, within the framework of the French Presidency of the European Union, to act so that open science practices are effectively taken into account in individual and collective research evaluations. The aim is to initiate a process of sustainable transformation in order to make open science a common and shared practice…”

Increasing transparency through open science badges

“Authors who adopt transparent practices for an article in Conservation Biology are now able to select from 3 open science badges: open data, open materials, and preregistration. Badges appear on published articles as visible recognition and highlight these efforts to the research community. There is an emerging body of literature regarding the influences of badges, for example, an increased number of articles with open data (Kidwell et al. 2016) and an increased rate of data sharing (Rowhani-Farid et al. 2018). However, in another study, Rowhani-Farid et al. (2020) found that badges did not “noticeably motivate” researchers to share data. Badges, as far as we know, are the only data-sharing incentive that has been tested empirically (Rowhani-Farid et al. 2017).

Rates of data and code sharing are typically low (Herold 2015; Roche et al. 2015; Archmiller et al. 2020; Culina et al. 2020). Since 2016, we have asked authors of contributed papers, reviews, method papers, practice and policy papers, and research notes to tell us whether they “provided complete machine- and human-readable data and computer code in Supporting Information or on a public archive.” Authors of 31% of these articles published in Conservation Biology said they shared their data or code, and all authors provided human-survey instruments in Supporting Information or via a citation or online link (i.e., shared materials)….”

Principles of open, transparent and reproducible science in author guidelines of sleep research and chronobiology journals

Abstract:  Background: “Open science” is an umbrella term describing various aspects of transparent and open science practices. The adoption of practices at different levels of the scientific process (e.g., individual researchers, laboratories, institutions) has been rapidly changing the scientific research landscape in the past years, but their uptake differs from discipline to discipline. Here, we asked to what extent journals in the field of sleep research and chronobiology encourage or even require following transparent and open science principles in their author guidelines.

Methods: We scored the author guidelines of a comprehensive set of 27 sleep and chronobiology journals, including the major outlets in the field, using the standardised Transparency and Openness (TOP) Factor. The TOP Factor is a quantitative summary of the extent to which journals encourage or require following various aspects of open science, including data citation, data transparency, analysis code transparency, materials transparency, design and analysis guidelines, study pre-registration, analysis plan pre-registration, replication, registered reports, and the use of open science badges.

Results: Across the 27 sleep research and chronobiology journals, we find low values on the TOP Factor (median [25th, 75th percentile] 3 [1, 3], min. 0, max. 9, out of a total possible score of 29).

Conclusions: Our findings suggest an opportunity for sleep research and chronobiology journals to further support recent developments in transparent and open science by implementing transparency and openness principles in their author guidelines.
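
The TOP Factor described above is, in essence, a per-journal sum of ordinal scores across the ten criteria listed in the Methods. Below is a minimal sketch of that kind of tally, assuming each criterion is scored on a small ordinal scale (0 = not addressed in the guidelines) and the scores are summed; the journal names and scores are invented for illustration.

```python
# Sketch of a TOP Factor-style tally: each criterion gets an ordinal score,
# and a journal's TOP Factor is the sum across criteria. Criterion names
# follow the list quoted above; the example data are invented.
from statistics import median

TOP_CRITERIA = [
    "data citation", "data transparency", "analysis code transparency",
    "materials transparency", "design and analysis guidelines",
    "study pre-registration", "analysis plan pre-registration",
    "replication", "registered reports", "open science badges",
]

def top_factor(scores: dict[str, int]) -> int:
    """Sum per-criterion scores; criteria absent from the guidelines count as 0."""
    return sum(scores.get(criterion, 0) for criterion in TOP_CRITERIA)

# Hypothetical author-guideline scores for three journals.
journals = {
    "Journal A": {"data transparency": 2, "open science badges": 1},
    "Journal B": {"data citation": 1, "study pre-registration": 2},
    "Journal C": {},
}

factors = sorted(top_factor(scores) for scores in journals.values())
print(factors, "median:", median(factors))
```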

Transparency and Open Science at the Journal of Personality – Wright – Journal of Personality – Wiley Online Library

“Changes are afoot in the way the scientific community is approaching the practice and reporting of research. Spurred by concerns about the fundamental reliability (i.e., replicability), or rather lack thereof, of contemporary psychological science (e.g., Open Science Collaboration, 2015), as well as how we go about our business (e.g., Gelman & Loken, 2014), several recommendations have been put forward for increasing the rigor of published research through openness and transparency. The Journal has long prized and published the type of research with features, like large sample sizes (Fraley & Vazire, 2014), that have fared well by replicability standards (Soto, 2019). The type of work traditionally published here, often relying on longitudinal samples, large public datasets (e.g., the Midlife in the United States Study), or complex data collection designs (e.g., ambulatory assessment and behavioral coding), did not seem to fit neatly into the template of the emerging transparency practices. However, as thinking in the open science movement has progressed and matured, we have decided to full-throatedly endorse these practices and join the growing chorus of voices that are encouraging and rewarding more transparent work in psychological science. We believe this can be achieved while maintaining the “big tent” spirit of personality research at the Journal, with the broad scope in content, methods, and analytical tools that has made it so special and successful all of these years. Moving forward, we will be rigorously implementing a number of procedures for openness and transparency consistent with the Transparency and Openness Promotion (TOP) Guidelines.

The TOP Guidelines are organized into eight standards, each of which can be implemented at three levels of stringency (Nosek et al., 2015). In what follows, we outline the initial TOP Standards Levels adopted by the Journal and the associated rationale. Generally, we have adopted Level 2 standards, as we believe these strike a desirable balance between compelling a high degree of openness and transparency and not being overly onerous or a deterrent for authors interested in the Journal as an outlet for their work….”

NISO’s Recommended Practice on Reproducibility Badging and Definitions Now Published | Industry Announcements and Events SSP-L

“The National Information Standards Organization (NISO) today announces the publication of its Recommended Practice, RP-31-2021, Reproducibility Badging and Definitions. Developed by the NISO Taxonomy, Definitions, and Recognition Badging Scheme Working Group, this new Recommended Practice provides a set of recognition standards that can be deployed across scholarly publishing outputs, to easily recognize and reward the sharing of data and methods….”

Badge Detail – Badgr

“This digital credential (Open Badge) recognises the completion of the online course “Open Badges for Open Science” at Foundations Level and certifies that the owner of this credential has attained these learning outcomes: 1. Report on how you understand the concept of Open Badges. Focus on the aims and practical uses of Open Badges. 2. Present/visualise the context of the Open Badges including history and organisations involved. 3. Create a portfolio of application fields and good practice examples of Open Badges which are relevant to you and your work. Designed by the OBERRED Erasmus+ project, the Foundations Level of the MOOC “Open Badges for Open Science” provides researchers, practitioners, educators, students and other stakeholders in the field of Research Data Management (RDM) with skills and knowledge in Open Badges which are relevant for successful engagement in Open Science.”
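
An Open Badge of this kind is not just an image: under the Open Badges 2.0 specification, an earned badge is published as a machine-readable JSON-LD “Assertion” linking the recipient, the badge definition, and the issuer. Below is a minimal sketch of such an assertion; all identifiers, URLs, and dates are invented placeholders, not the actual OBERRED credential.

```python
# Minimal sketch of an Open Badges 2.0 "Assertion": the JSON-LD record an
# issuer (such as a Badgr-hosted MOOC) publishes when a badge is earned.
# All URLs and identities below are invented placeholders.
import json

assertion = {
    "@context": "https://w3id.org/openbadges/v2",
    "type": "Assertion",
    "id": "https://example.org/assertions/123",           # hypothetical assertion URL
    "recipient": {"type": "email", "hashed": False,
                  "identity": "researcher@example.org"},  # hypothetical earner
    "badge": "https://example.org/badges/open-badges-for-open-science",
    "issuedOn": "2020-08-01T00:00:00Z",
    "verification": {"type": "hosted"},                   # verified by fetching the hosted record
}
print(json.dumps(assertion, indent=2))
```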

Full article: To share or not to share – 10 years of European Journal of Psychotraumatology

Abstract:  The European Journal of Psychotraumatology, owned by the European Society for Traumatic Stress Studies (ESTSS), launched as one of the first full Open Access ‘specialist’ journals in its field. Has this Open Access model worked in terms of how the Journal has performed? With the European Journal of Psychotraumatology celebrating its ten-year anniversary, we look back at the past decade of sharing our research with the world, consider how the journal sits within the broader movement beyond Open Access to Open Research, and present new policies we have adopted to move the field of psychotraumatology to the next level of Open Research. While we as researchers now make our publications more often freely available to all, how often do we share our protocols, our statistical analysis plans, or our data? We all gain from more transparency and reproducibility, and big steps are being made in this direction. The journal’s decennial performance as well as the exciting new Open Research developments are presented in this editorial. The journal is no longer in its infancy and is eager to step into the next decade of Open Research.

Analytic reproducibility in articles receiving open data badges at the journal Psychological Science: an observational study | Royal Society Open Science

Abstract:  For any scientific report, repeating the original analyses upon the original data should yield the original outcomes. We evaluated analytic reproducibility in 25 Psychological Science articles awarded open data badges between 2014 and 2015. Initially, 16 (64%, 95% confidence interval [43,81]) articles contained at least one ‘major numerical discrepancy’ (>10% difference) prompting us to request input from original authors. Ultimately, target values were reproducible without author involvement for 9 (36% [20,59]) articles; reproducible with author involvement for 6 (24% [8,47]) articles; not fully reproducible with no substantive author response for 3 (12% [0,35]) articles; and not fully reproducible despite author involvement for 7 (28% [12,51]) articles. Overall, 37 major numerical discrepancies remained out of 789 checked values (5% [3,6]), but original conclusions did not appear affected. Non-reproducibility was primarily caused by unclear reporting of analytic procedures. These results highlight that open data alone is not sufficient to ensure analytic reproducibility.
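
The study’s reproducibility check hinges on a simple criterion: a “major numerical discrepancy” is a reproduced value that differs from the reported value by more than 10%. A minimal sketch of that check follows, assuming the difference is measured relative to the reported value; the value pairs are invented.

```python
# Sketch of the kind of check the study describes: flag a "major numerical
# discrepancy" when a re-computed value differs from the reported value by
# more than 10%. Measuring the difference relative to the reported value is
# an assumption; the numbers below are invented.
def is_major_discrepancy(reported: float, reproduced: float,
                         tol: float = 0.10) -> bool:
    if reported == 0:
        # Avoid division by zero; treat any change from a reported zero as major.
        return reproduced != 0
    return abs(reproduced - reported) / abs(reported) > tol

checked = [(0.45, 0.45), (12.3, 10.9), (0.05, 0.049)]
flags = [is_major_discrepancy(r, p) for r, p in checked]
print(f"{sum(flags)} major discrepancies out of {len(flags)} checked values")
```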

FAIR metrics and certification, rewards and recognition, skills and training: FAIRsFAIR contribution to the EOSC Strategic Research and Innovation Agenda | FAIRsFAIR

“FAIRsFAIR is a key contributor to the ongoing development of global standards for FAIR data and repository certification and to the policies and practices that will turn the EOSC programme into a functioning infrastructure. The project strongly endorses all of the guiding principles already identified as relevant to implementing the EOSC vision, with a special emphasis on the importance of FAIR-by-design tools. The guiding principles are a multi-stakeholder approach; data as open as possible and as closed as necessary; implementation of a Web of FAIR data and related services for science; federation of existing research infrastructures; and the need for machine-run algorithms transparent to researchers….”

Open science badges are coming. “A ‘badge’ is a symbol or indicator of… | by Bruce Caron | Aug, 2020 | Medium

“The notion of using open digital badges to acknowledge certain practices and learning achievements has been circulating in the open science endeavor for more than a decade. Over these years, this has become a perennial “near future” augmentation/implementation of how open science can recognize and reward practices and skills. Instead of using game-able metrics that rank individuals as though they were in a race, badges can promote active learning, current standards, professional development, and research quality assurance.

The transition from arbitrarily scarce reputation markers (impact metrics, prizes, awards) to universally available recognition markers also helps to level the ground on which careers can be built across the global republic of science. Every scientist who wants to take the time and effort to earn a badge for achieving some level of, say, research-data reusability, or graduate-student mentorship, can then show off this badge to the world. Every student/scientist who acquires a specific skill (R programming, software reusability, statistics, etc.) can add a new badge to their CV….”

Improving transparency and scientific rigor in academic publishing – Prager – 2019 – CANCER REPORTS – Wiley Online Library

“3.5.7 Registered reports and open practices badges

One possible way to incorporate all the information listed above and to combat the stigma against papers that report nonsignificant findings is through the implementation of Registered Reports or the rewarding of transparent research practices. Registered Reports are empirical articles designed to eliminate publication bias and incentivize best scientific practice. Registered Reports are a form of empirical article in which the methods and the proposed analyses are preregistered and reviewed prior to the research being conducted. This format is designed to minimize bias, while also allowing complete flexibility to conduct exploratory (unregistered) analyses and report serendipitous findings. The cornerstone of the Registered Reports format is that the authors submit as a Stage 1 manuscript an introduction, complete and transparent methods, and the results of any pilot experiments (where applicable) that motivate the research proposal, written in the future tense. These proposals will include a description of the key research question and background literature, hypotheses, experimental design and procedures, analysis pipeline, a statistical power analysis, and a full description of the planned comparisons. Submissions that meet the rigorous and transparent requirements for conducting the proposed research (as judged by editors, peer reviewers, and, in some journals, statistical editors) are offered an in-principle acceptance, meaning that the journal guarantees publication if the authors conduct the experiment in accordance with their approved protocol. Many journals publish the Stage 1 report, which could be beneficial not only for citations, but for the authors’ progress reports and tenure packages. Following data collection, the authors prepare and resubmit a Stage 2 manuscript that includes the introduction and methods from the original submission plus their obtained results and discussion. The manuscript will undergo full review; referees will consider whether the data test the authors’ proposed hypotheses by satisfying the approved outcome-neutral conditions, will ensure the authors adhered precisely to the registered experimental procedures, and will review any unregistered post hoc analyses added by the authors to confirm they are justified, methodologically sound, and informative. At this stage, the authors must also share their data (see also Wiley’s Data Sharing and Citation Policy) and analysis scripts on a public and freely accessible archive such as Figshare, Dryad, or the Open Science Framework. Additional details, including template reviewer and author guidelines, can be found by following the link to the Open Science Framework from the Center for Open Science (see also 94).

The authors who practice transparent and rigorous science should be recognized for this work. Funders can encourage and reward open practice in significant ways (see https://wellcome.ac.uk/what-we-do/our-work/open-research). One way journals can support this is to award badges to the authors in recognition of these open scientific practices. Badges certify that a particular practice was followed, but do not define good practice. As defined by the Open Science Framework, three badges can be earned. The Open Data badge is earned for making publicly available the digitally shareable data necessary to reproduce the reported results. These data must be accessible via an open-access repository, and must be permanent (e.g., a registration on the Open Science Framework, or an independent repository at www.re3data.org). The Open Materials badge is earned when the components of the research methodology needed to reproduce the reported procedure and analysis are made publicly available. The Preregistered badge is earned for having a preregistered design, whereas the Preregistered+Analysis Plan badge is earned for having both a preregistered research design and an analysis plan for the research; the authors must report results according to that plan. Additional information about the badges, including the necessary information to be awarded a badge, can be found by clicking this link to the Open Science Framework from the Center for Open Science….”
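
As a rough illustration of how the badge criteria above compose, the sketch below maps a submission’s declared practices to the badges it could be considered for. The practice labels are hypothetical, and real badge awards involve editorial verification rather than a lookup.

```python
# Sketch mapping a submission's declared open practices to the badges
# described above. Practice labels are hypothetical stand-ins; actual awards
# are verified editorially, not computed.
def eligible_badges(practices: set[str]) -> list[str]:
    badges = []
    if "public data in open-access repository" in practices:
        badges.append("Open Data")
    if "public materials" in practices:
        badges.append("Open Materials")
    # The +Analysis Plan variant subsumes the plain Preregistered badge.
    if {"preregistered design", "preregistered analysis plan"} <= practices:
        badges.append("Preregistered+Analysis Plan")
    elif "preregistered design" in practices:
        badges.append("Preregistered")
    return badges

print(eligible_badges({"public data in open-access repository",
                       "preregistered design"}))
# -> ['Open Data', 'Preregistered']
```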