From Bioethics to Data Sharing for Transparency in Nursing Research

“Our journal, Journal of Korean Academy of Nursing (JKAN), adopted a data sharing policy in December 2020 (https://www.jkan.or.kr/index.php?body=dataSharing) [3], which was applied from volume 50 issue 6 after extensive discussion. As editor-in-chief, I would like to inform our readers about the policy to enhance their understanding of it….”

Replicate Others as You Would Like to Be Replicated Yourself | PS: Political Science & Politics | Cambridge Core

“This article presents some principles on constructive ways to conduct replications. Following the style of the Transparency and Openness Promotion (TOP) guidelines for open science (Nosek et al. 2015), we summarize recommendations as a series of tiers, from what is commonly done (Level I), to what would be better (Level II), and to what would be better still (Level III) (table 1). The aspiration of our constructive replication recommendations is to help fields move toward a research culture in which self-correction is welcomed, honest mistakes are normalized, and different interpretations of results are recognized as a routine outcome of the process. Changing culture is always difficult, of course, but conducting projects in line with these ideals, and encouraging them in others, are ways researchers can contribute to a culture in which the ideals are closer to reality….”

APA Joins as New Signatory to TOP Guidelines

“The American Psychological Association (APA), the nation’s largest organization representing psychologists and psychological researchers, has become a signatory to the Transparency and Openness Promotion (TOP) Guidelines, an important step toward making research data and processes more open by default, according to the Center for Open Science (COS).

The TOP Guidelines are a community-driven effort to align research behaviors with scientific ideals. Transparency, open sharing, and reproducibility are core values of science, but not always part of daily practice. Journals, funders, and institutions can increase reproducibility and integrity of research by aligning their author or grantee guidelines with the TOP Guidelines.

The APA said it will officially begin implementing standardized disclosure requirements of data and underlying research materials (TOP Level 1). Furthermore, it encourages editors of core journals to move to Level 2 TOP (required transparency of data and research items when ethically possible). More information on the specific levels of adoption by each of the core journals will be coming in the first half of 2021….”

Open Science: Promises and Performance | Qualtrics Survey Solutions

“Although many scientists and organisations endorse this notion, progress has been slow. Some of my research explores the barriers that have impeded progress and makes recommendations to encourage future success. This survey forms part of that work and addresses a variety of issues, including attitudes towards data storage and access, the role of journals in open science, and associated ethical issues.

Those interested in scientific progress are invited to take part, and participation should take less than 10 minutes. Responses will be anonymous and participants can withdraw at any time.

The findings from the survey will be submitted to an open access journal and made available as an open access preprint. The raw data will be lodged with the Open Science Foundation …”

Transparency and open science principles… | Wellcome Open Research

Abstract:  Background: “Open science” is an umbrella term describing various aspects of transparent and open science practices. The adoption of practices at different levels of the scientific process (e.g., individual researchers, laboratories, institutions) has been rapidly changing the scientific research landscape in recent years, but uptake differs from discipline to discipline. Here, we asked to what extent journals in the field of sleep research and chronobiology encourage or even require following transparent and open science principles in their author guidelines.

Methods: We scored the author guidelines of a comprehensive set of 28 sleep and chronobiology journals, including the major outlets in the field, using the standardised Transparency and Openness (TOP) Factor. This instrument rates the extent to which journals encourage or require following various aspects of open science, including data citation, data transparency, analysis code transparency, materials transparency, design and analysis guidelines, study pre-registration, analysis plan pre-registration, replication, registered reports, and the use of open science badges.

Results: Across the 28 journals, we find low values on the TOP Factor (median [25th, 75th percentile] 2.5 [1, 3], min. 0, max. 9, out of a total possible score of 28) in sleep research and chronobiology journals.

Conclusions: Our findings suggest an opportunity for sleep research and chronobiology journals to further support the recent developments in transparent and open science by implementing transparency and openness principles in their guidelines and making adherence to them mandatory.
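
The bracketed figures in the Results above are a median followed by its 25th and 75th percentiles. A minimal sketch of how such a summary is computed (the scores below are invented for illustration; they are not the study’s data):

```python
# How a "median [25th, 75th percentile]" summary like the one above is
# computed. The scores here are invented for illustration, not the study's data.
import statistics

scores = [0, 1, 1, 2, 2, 3, 3, 4, 9]  # hypothetical per-journal TOP Factor scores

median = statistics.median(scores)
q1, _, q3 = statistics.quantiles(scores, n=4)  # 25th, 50th, 75th percentiles
print(f"median [25th, 75th percentile]: {median} [{q1}, {q3}]")
```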

Transparency and Openness Promotion (TOP) Guidelines

Abstract:  The Transparency and Openness Promotion (TOP) Committee met in November 2014 to address one important element of the incentive systems – journals’ procedures and policies for publication. The outcome of the effort is the TOP Guidelines. There are eight standards in the TOP guidelines; each moves scientific communication toward greater openness. These standards are modular, facilitating adoption in whole or in part. However, they also complement each other, in that commitment to one standard may facilitate adoption of others. Moreover, the guidelines are sensitive to barriers to openness by articulating, for example, a process for exceptions to sharing because of ethical issues, intellectual property concerns, or availability of necessary resources.

Journal data policies: Exploring how the understanding of editors and authors corresponds to the policies themselves

Abstract:  Despite the increase in the number of journals issuing data policies requiring authors to make the data underlying reported findings publicly available, authors do not always do so, and when they do, the data do not always meet standards of quality that allow others to verify or extend published results. This phenomenon suggests the need to consider the effectiveness of journal data policies to present and articulate transparency requirements, and how well they facilitate (or hinder) authors’ ability to produce and provide access to data, code, and associated materials that meet quality standards for computational reproducibility. This article describes the results of a research study that examined the ability of journal-based data policies to: 1) effectively communicate transparency requirements to authors, and 2) enable authors to successfully meet policy requirements. To do this, we conducted a mixed-methods study that examined individual data policies alongside editors’ and authors’ interpretations of policy requirements. Survey responses from authors and editors, along with results from a content analysis of data policies, revealed discrepancies among editors’ assertions of data policy requirements, authors’ understanding of policy requirements, and the requirements stated in the policy language as written. We offer explanations for these discrepancies and recommendations for improving authors’ understanding of policies and increasing the likelihood of policy compliance.

Introducing the STM 2020 Research Data Year

“STM has declared 2020 the ‘STM Research Data Year’ and is working with publishers and other partners to boost effective sharing of research data:

SHARE: Increase the number of journals with data policies and articles with Data Availability Statements (DAS)
LINK: Increase the number of journals that deposit data links to the Scholix framework
CITE: Increase citations to datasets following the Joint Declaration of Data Citation Principles…”
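
For context, Scholix (the framework named in the LINK goal above) is a standard for exchanging information about links between scholarly literature and datasets. Below is a simplified sketch of such an article-to-dataset link record; the field names are abbreviated and both identifiers are invented, so consult the Scholix metadata schema for the authoritative structure.

```python
# Simplified, illustrative sketch of a Scholix-style article-to-dataset link.
# Field names are abbreviated and both DOIs are invented; see the Scholix
# metadata schema for the authoritative structure.
link = {
    "RelationshipType": {"Name": "References"},
    "Source": {  # the journal article
        "Identifier": {"ID": "10.1234/example-article", "IDScheme": "doi"},
        "Type": "literature",
    },
    "Target": {  # the dataset the article cites
        "Identifier": {"ID": "10.5555/example-dataset", "IDScheme": "doi"},
        "Type": "dataset",
    },
    "LinkProvider": [{"Name": "Example Publisher"}],
}
```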

Is it Finally the Year of Research Data? – The STM Association Thinks So – The Scholarly Kitchen

“At the recent Researcher to Reader conference in London, Mark Allin (@allinsnap) had the job of doing the conference round-up, which is the slot immediately before the closing keynote where the themes and take-homes of the conference are brought together. In his four summary themes, Allin inevitably drew out Open Access / Open Science. It’s almost impossible to have a publishing or library conference without it; however, in terms of significance, he put it at the bottom of the list, almost as an afterthought. His reasoning is that open science now feels like an inevitability. With a clear trend towards both open access and open data mandates among funders, institutions, and publishers, the question that each of us must ask ourselves isn’t whether it will or should happen, but how we are going to adapt as change continues….

Practices around open research data are gaining traction. In 2019’s The State of Open Data Report, 64% of respondents claimed that they made their data openly available in 2018. That’s a rise of 4% from the previous year. Comprehensive information on the prevalence of open data policies is hard to come by, but there is a general sense that publishers, funders, and institutions alike are moving towards first adopting data policies and then steadily strengthening them over time.

The JoRD project, based at Nottingham University in the UK, was funded by Jisc and ran from December 2012 until its final blog post in 2014. In this article, Sturges et al. report that JoRD found the state of open data policies among journals to be patchy and inconsistent, with about half of all the journals they looked at having no policy at all, and with 75% of those that did exist being categorized as weak….

Unfortunately, the short timescale of the JoRD project limits its findings to a snapshot. However, there has since been piecemeal evidence of progress towards a more robust open research data landscape. The case studies presented in this article by Jones et al. (a different Jones, not me) describe how both Taylor & Francis and Springer Nature have followed the path of steadily increasing the number of journals with data policies while strengthening those that exist….”

New Measure Rates Quality of Research Journals’ Policies to Promote Transparency and Reproducibility

“Today, the Center for Open Science launches TOP Factor, an alternative to the journal impact factor (JIF) for evaluating the qualities of journals. TOP Factor assesses journal policies for the degree to which they promote core scholarly norms of transparency and reproducibility. TOP Factor provides a first step toward evaluating journals based on their quality of process and implementation of scholarly values. This alternative to JIF may reduce the dysfunctional incentives for journals to publish exciting results whatever their credibility….

TOP Factor is based primarily on the Transparency and Openness Promotion (TOP) Guidelines, a framework of eight standards that summarize behaviors that can improve transparency and reproducibility of research, such as transparency of data, materials, code, and research design, preregistration, and replication. Journals can adopt policies for each of the eight standards at increasing levels of stringency. For example, for the data transparency standard, a score of 0 indicates that the journal policy fails to meet the standard, 1 indicates that the policy requires authors to disclose whether data are publicly accessible, 2 indicates that the policy requires authors to make data publicly accessible unless the data qualify for an exception (e.g., sensitive health data, proprietary data), and 3 indicates that the policy includes both a requirement and a verification process for the data’s correspondence with the findings reported in the paper. TOP Factor also includes indicators of whether journals offer Registered Reports, a publishing model that reduces publication bias against negative and null results, and badges that acknowledge open research practices and make open behaviors visible….”
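
To make the arithmetic of this rubric concrete, here is a minimal sketch of a TOP Factor-style score. The standard names, the simple-sum aggregation, and the two points assumed for each of the Registered Reports and badges indicators are an illustrative reading of the description above, not COS’s published implementation; with these assumptions, though, the maximum works out to 28, the total possible score cited in the sleep-journals abstract earlier.

```python
# Minimal sketch of a TOP Factor-style score, based on the description above.
# Not the Center for Open Science's actual implementation: the simple-sum
# aggregation and the point values for the two indicator items are assumptions.

TOP_STANDARDS = [
    "data_citation",
    "data_transparency",
    "code_transparency",
    "materials_transparency",
    "design_analysis_reporting",
    "study_preregistration",
    "analysis_plan_preregistration",
    "replication",
]

def top_factor(levels: dict, registered_reports: bool = False,
               badges: bool = False) -> int:
    """Sum per-standard policy levels (0-3 each) plus the two indicator items."""
    score = 0
    for standard in TOP_STANDARDS:
        level = levels.get(standard, 0)
        if not 0 <= level <= 3:
            raise ValueError(f"{standard}: level must be 0-3, got {level}")
        score += level
    score += 2 if registered_reports else 0  # assumed point value
    score += 2 if badges else 0              # assumed point value
    return score

# Example: data must be public with listed exceptions (level 2), code
# accessibility must be disclosed (level 1), and the journal offers badges.
print(top_factor({"data_transparency": 2, "code_transparency": 1},
                 badges=True))  # -> 5
```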

TOP (Transparency and Openness Promotion)

“Transparency, open sharing, and reproducibility are core values of science, but not always part of daily practice. Journals, funders, and societies can increase research reproducibility by adopting the TOP Guidelines….”

5 Scholarly Publishing Trends to Watch in 2020

“The vision for a predominantly open access (OA) publishing landscape has shifted from a possibility to a probability in the opinions of many. A 2017 Springer Nature survey of 200 professional staff working in research institutions around the world found that over 70% of respondents agreed scholarly content should be openly accessible and 91% of librarians agreed that “open access is the future of academic and scientific publishing.” …

As noted, there is growing consensus within academia that the majority of scholarly content will be available OA in the future — but how to reach that end is still a matter of debate. The announcement of Plan S in September 2018, an initiative by a consortium of national and international research funders to make research fully and immediately OA, sent shockwaves throughout academia. 2019 saw the release of the revised Plan S guidelines with some significant changes, including an extension of the Plan S deadline to January 2021, a clearer Green OA compliance pathway, and greater flexibility around non-derivative copyright licenses. What remains the same — and has been a matter of significant debate — is that Plan S will not acknowledge hybrid OA as a compliant publishing model.

In response to concerns raised by scholarly societies around the feasibility of transitioning to full and immediate OA publishing without compromising their operational funding, Wellcome and UKRI in partnership with ALPSP launched the “Society Publishers Accelerating Open Access and Plan S” (SPA-OPS) project to identify viable OA publishing models and transition options for societies. The final SPA-OPS report was released in September of 2019, encompassing over 20 potential OA models and strategies as well as a “transformative agreement toolkit.” …”

Open data: growing pains | Research Information

“In its latest State of Open Data survey, Figshare revealed that a hefty 64 per cent of respondents made their data openly available in 2018.

The percentage, up four per cent from last year and seven per cent from 2016, indicates a healthy awareness of open data, and for Daniel Hook, chief executive of Figshare’s parent company, Digital Science, it spells good news….

For example, the majority of respondents – 63 per cent – support national mandates for open data, an eight per cent rise from 2017. And, at the same time, nearly half of the respondents – 46 per cent – reckon data citations motivate them to make data openly available. This figure is up seven per cent from last year….

Yet, amid the data-sharing success stories, myriad worries remain. Top of the pile is the potential for data misuse….

Inappropriate sharing of data is another key concern….

Results indicated that a mighty 58 per cent of respondents felt they do not receive sufficient credit for sharing data, while only nine per cent felt they do….

Coko recently won funding from the Sloan Foundation to build DataSeer, an online service that will use Natural Language Processing to identify datasets that are associated with a particular article. …”
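
As a toy illustration of the task DataSeer automates, the sketch below scans article text for dataset identifiers. This is emphatically not DataSeer’s method (the service uses trained Natural Language Processing models, not hand-written patterns); the regular expressions and example text here are invented for demonstration.

```python
# Toy illustration of spotting dataset references in article text.
# Not DataSeer's approach: these hand-written patterns are for demonstration;
# production systems use trained NLP models.
import re

PATTERNS = {
    "doi": re.compile(r"\b10\.\d{4,9}/\S+\b"),
    "geo_accession": re.compile(r"\bGSE\d+\b"),  # NCBI GEO series accession
    "repository_mention": re.compile(r"\b(Dryad|Zenodo|Figshare|OSF)\b", re.IGNORECASE),
}

text = ("Data are available from Dryad (doi: 10.5061/dryad.abc123) and "
        "expression profiles were deposited in GEO under accession GSE99999.")

for label, pattern in PATTERNS.items():
    for match in pattern.finditer(text):
        print(f"{label}: {match.group(0)}")
```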