From Google’s English: “The indicator is produced and launched annually by the Danish Agency for Education and Research, which is part of the Ministry of Education and Research. The indicator monitors the implementation of the Danish Open Access strategy 2018-2025 by collecting and analyzing publication data from the Danish universities.
OVERVIEW – National strategic goals and their realization at the national and university levels.
OA TYPES – Types of Open Access realization at the national and local levels.
DATA – Data for download, plus documentation at both overview and technical levels.
GUIDANCE – Information supporting the Danish universities’ implementation of Open Access, such as important dates and guidelines.
FAQ – Frequently Asked Questions….”
“For the sake of analysis, we compared what might happen if ALL authors chose one Plan S compliance route over another. In practice there will be a mix, and so the reality is likely to land somewhere between our two extremes. …
Compliance via fully OA journals
Plan S could lead to a slight lift in market value of just under 0.25% in the long term. Plan S articles add incremental revenues by boosting volumes in fully OA journals. Meanwhile, with only a mild drop in volumes at subscription journals, publishers are able to maintain their prices.
The UK’s UKRI is currently considering its position on OA. If the UKRI were to adopt Plan S principles, it would make little difference to the market if the fully OA compliance route were followed.
Compliance via repositories
Plan S could lead to a slight fall in market value of just under 0.6% in the long term. This is driven by lost hybrid OA revenue, as authors opt for subscription journals instead.
If the UKRI were to adopt Plan S principles, the long-term fall in market value would be just under 0.8%, roughly a third more than under Plan S alone. The UK’s current policies have driven significant hybrid uptake; if the value of these APCs is lost, it will have a noticeable effect….”
Compliance via fully OA journals
Plan S could lead to a fall in market value of around 2.8%. Subscription journals generate more revenue per article than their OA counterparts, so the reduction in subscription prices for a given volume of articles will outweigh the gains made from APCs. This adjustment happens once; thereafter, because OA output is growing faster than the market as a whole, it will start to drive a very mild increase in market value.
If the UKRI were to adopt Plan S principles, the long-term fall in market value would be just under 3.4%, or around 20% more than under Plan S alone. The same dynamics apply as for Plan S alone….
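The relative magnitudes quoted above can be verified with quick arithmetic (a sketch using only the percentages given in the excerpt; the variable names are our own):

```python
# Modelled long-term falls in market value, as quoted in the excerpt above.
fall_plan_s_alone = 2.8   # % fall under Plan S alone (fully OA route)
fall_with_ukri = 3.4      # % fall if the UKRI also adopts Plan S principles

# How much larger is the combined fall relative to Plan S alone?
extra = (fall_with_ukri - fall_plan_s_alone) / fall_plan_s_alone * 100
print(f"{extra:.0f}% more than Plan S alone")  # ~21%, i.e. "around 20% more"
```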
“Advancing public access to research data is important to improving transparency and reproducibility of scientific results, increasing scientific rigor and public trust in science, and — most importantly — accelerating the pace of discovery and innovation through the open sharing of research results. Additionally, it is vital that institutions develop and implement policies now to ensure consistency of data management plans across their campuses to guarantee full compliance with federal research agency data sharing requirements. Beyond the establishment of policies, universities must invest in the infrastructure and support necessary to achieve the desired aspirations and aims of the policies. The open sharing of the results of scientific research is a value our two associations have long fought to protect and preserve. It is also a value we must continue to uphold at all levels within our universities. This will mean overcoming the various institutional and cultural impediments which have, at times, hampered the open sharing of research data….”
“Authors who adopt transparent practices for an article in Conservation Biology are now able to select from 3 open science badges: open data, open materials, and preregistration. Badges appear on published articles as visible recognition and highlight these efforts to the research community. There is an emerging body of literature regarding the influence of badges, for example, an increased number of articles with open data (Kidwell et al. 2016) and an increased rate of data sharing (Rowhani-Farid et al. 2018). However, in another study, Rowhani-Farid et al. (2020) found that badges did not “noticeably motivate” researchers to share data. Badges, as far as we know, are the only data-sharing incentive that has been tested empirically (Rowhani-Farid et al. 2017).
Rates of data and code sharing are typically low (Herold 2015; Roche et al. 2015; Archmiller et al. 2020; Culina et al. 2020). Since 2016, we have asked authors of contributed papers, reviews, method papers, practice and policy papers, and research notes to tell us whether they “provided complete machine- and human-readable data and computer code in Supporting Information or on a public archive.” Authors of 31% of these articles published in Conservation Biology said they shared their data or code, and all authors provided human-survey instruments in Supporting Information or via a citation or online link (i.e., shared materials)….”
“Researchers who receive federal help consistently fail to report their results to the public. The government should hold them accountable….
Researchers using federal funds to conduct cancer trials — experiments involving drugs or medical devices that rely on volunteer subjects — were sometimes taking more than a year to report their results to the N.I.H., as required. “If you don’t report, the law says you shouldn’t get any funding,” he said, citing an investigation I had published in Stat with my colleague Talia Bronshtein. “Doc, I’m going to find out if it’s true, and if it’s true, I’m going to cut funding. That’s a promise.”
It was true then. It’s true now. More than 150 trials completed since 2017 by the N.I.H.’s National Cancer Institute, which leads the $1.8 billion Moonshot effort, should have reported results by now. About two-thirds reported after their deadlines or not at all, according to a University of Oxford website that tracks clinical trials regulated by the Food and Drug Administration and National Institutes of Health. Some trial results are nearly two years overdue. Over all, government-sponsored scientists have complied less than half the time for trial results due since 2018. (A spokeswoman for the N.I.H. said, “We are willing to do all measures to ensure compliance with ClinicalTrials.gov results reporting.”)…
In 2016, Dr. Francis Collins, the director of the National Institutes of Health, announced that the agency would begin penalizing researchers for failing to comply with its reporting requirements. “We are serious about this,” he said at the time. Yet in the years since, neither the F.D.A. nor N.I.H. has enforced the law. …”
“From January 2021, there are some changes for ACS authors funded by certain members of cOAlition S. You may be required to publish your work immediately open access under a CC-BY license. ACS offers a wide range of options enabling our authors to comply with these requirements through publication in a fully open access journal or a gold open access option in all our hybrid journals. In addition, your institution may have signed an ACS Read + Publish Agreement that provides funding for open access publishing. See below for more information regarding these changes.”
The United States has mobilized the full force of its clinical research enterprise to address the Covid-19 pandemic, allocating billions of dollars to support timely research. As of January 2021, for example, the National Institutes of Health (NIH) had issued nearly a thousand awards cumulatively worth roughly $2 billion to support Covid-19 projects ranging from the development of medical products (including diagnostics and vaccines) to evaluations of population-specific risk factors and outcomes.1 Such initiatives, which have yielded new technologies and important evidence, illustrate the value of robust scientific infrastructure.
Abstract: PLOS has long supported Open Science. One of the ways in which we do so is via our stringent data availability policy established in 2014. Despite this policy, and more data sharing policies being introduced by other organizations, best practices for data sharing are adopted by a minority of researchers in their publications. Problems with effective research data sharing persist and these problems have been quantified by previous research as a lack of time, resources, incentives, and/or skills to share data.
In this study we built on this research by investigating the importance of tasks associated with data sharing, and researchers’ satisfaction with their ability to complete these tasks. By investigating these factors we aimed to better understand opportunities for new or improved solutions for sharing data. In May–June 2020 we surveyed researchers from Europe and North America, asking them to rate tasks associated with data sharing on (i) their importance and (ii) their satisfaction with their ability to complete them. We received 728 completed and 667 partial responses. We calculated mean importance and satisfaction scores to highlight potential opportunities for new solutions and to compare different cohorts. Tasks relating to research impact, funder compliance, and credit had the highest importance scores. 52% of respondents reuse research data, but the average satisfaction score for obtaining data for reuse was relatively low. Tasks associated with sharing data were rated somewhat important, and respondents were reasonably well satisfied with their ability to accomplish them. Notably, this included tasks associated with best data sharing practice, such as use of data repositories. However, the most common method for sharing data was in fact via supplemental files with articles, which is not considered best practice. We presume that researchers are unlikely to seek new solutions to a problem or task that they are satisfied with their ability to accomplish, even if many do not attempt it. This implies there are few opportunities for new solutions or tools to meet these researcher needs. Publishers can likely meet these needs for data sharing by working to seamlessly integrate existing solutions that reduce the effort or behaviour change involved in some tasks, and by focusing on advocacy and education around the benefits of sharing data.
There may however be opportunities – unmet researcher needs – in relation to better supporting data reuse, which could be met in part by strengthening data sharing policies of journals and publishers, and improving the discoverability of data associated with published articles.
This presentation was given by Johan Rooryck during the Open Access Talk on 29 October 2020. Johan Rooryck, Professor at Leiden University and Executive Director of cOAlition S, briefly outlines the rationale for the principles of Plan S. Beyond that, he discusses its implementation for all grants awarded by cOAlition S funders from 1 January 2021, including the Horizon Europe framework. In his talk, Johan Rooryck covers the following questions: Which conditions do you need to fulfil to publish in a journal of your choice under Plan S? What can the newly developed Journal Checker Tool do for you? How does the recent Rights Retention Strategy help you to keep the rights to your Author Accepted Manuscript? In addition, Johan Rooryck mentions a number of other projects initiated by cOAlition S, such as the Price Transparency Framework, which aims to make prices for publishing services more transparent and fair, and the commissioning of a study to identify concrete funding mechanisms to support and strengthen diamond journals and their platforms. The lecture “Plan S and funding – What is going to change?” was held as part of the Open Access Talk online series of the BMBF-funded project open-access.network.
“Question What are the rates of declared and actual sharing of clinical trial data after the medical journals’ implementation of the International Committee of Medical Journal Editors data sharing statement requirement?
Findings In this cross-sectional study of 487 clinical trials published in JAMA, Lancet, and New England Journal of Medicine, 334 articles (68.6%) declared data sharing. Only 2 (0.6%) individual-participant data sets were actually deidentified and publicly available on a journal website, and among the 89 articles declaring that individual-participant data would be stored in secure repositories, data from only 17 articles were found in the respective repositories as of April 10, 2020.
Meaning These findings suggest that there is a wide gap between declared and actual sharing of clinical trial data.”
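The gap between declared and actual sharing can be reproduced from the counts quoted in the key points above (a sketch; the variable names are ours, the figures are from the excerpt):

```python
total_trials = 487     # trials published in JAMA, Lancet, and NEJM
declared = 334         # articles declaring data sharing
public_ipd = 2         # deidentified data sets actually public on a journal website
repo_declared = 89     # articles declaring storage in secure repositories
repo_found = 17        # data sets actually found in those repositories

print(f"declared sharing: {declared / total_trials:.1%}")    # 68.6%
print(f"actually public:  {public_ipd / declared:.1%}")      # 0.6%
print(f"found in repos:   {repo_found / repo_declared:.1%}") # 19.1%
```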
Abstract: Data management plans (DMPs) have increasingly been encouraged as a key component of institutional and funding body policy. Although DMPs necessarily place administrative burden on researchers, proponents claim that DMPs have myriad benefits, including enhanced research data quality, increased rates of data sharing, and institutional planning and compliance benefits.
In this article, we explore the international history of DMPs and describe institutional and funding body DMP policy. We find that presumed economic and societal benefits from increased rates of data sharing were the original driver of funding bodies mandating DMPs. Today, 86% of UK Research Councils and 63% of US funding bodies require submission of a DMP with funding applications. Given that no major Australian funding bodies require DMP submission, it is notable that 37% of Australian universities have taken the initiative to mandate DMPs internally. Institutions both within Australia and internationally frequently promote the professional benefits of DMP use and endorse DMPs as ‘best practice’. We analyse one such typical DMP implementation at a major Australian institution, finding that DMPs have low levels of apparent translational value. Indeed, an extensive literature review suggests there is very limited published systematic evidence that DMP use has any tangible benefit for researchers, institutions, or funding bodies.
We are therefore led to question why DMPs have become the go-to tool for research data professionals and advocates of good data practice. By delineating multiple use-cases and highlighting the need for DMPs to be fit for intended purpose, we question the view that a good DMP is necessarily that which encompasses the entire data lifecycle of a project. Finally, we summarise recent developments in the DMP landscape, and note a positive shift towards evidence-based research management through more researcher-centric, educative, and integrated DMP services.
“cOAlition S is pleased to announce that 160 journals published by Elsevier are now registered as Plan S-aligned Transformative Journals. These titles, including many from Cell Press, have committed to transitioning to fully open access in accordance with the transformative journal criteria described in the cOAlition S Guidance on the implementation of Plan S.
In their public statement, Elsevier reiterates their commitment “to ensure that any author who wants to publish open access in any one of our journals, across all disciplines of research, can do so while also meeting their funder’s requirements”.
You can consult this list of Elsevier’s Transformative Journals. The complete list of cOAlition S approved Transformative Journals is available here….”
In Australia, the first challenge is to overcome apathy about open access issues. The term “open access” has been too easy to ignore. Many consider it a low priority compared with research achievements, grant funding, or university-rankings glory.
“As Plan S officially gets underway, are you wondering what steps you may still need to take to prepare? Journal publishers that wish to comply with the initiative to make all research funded by cOAlition S members on or after the 1st of January 2021 fully and immediately open access have three routes to choose from: …”
From Google’s English: “The objective of this website is to periodically analyze the degree of compliance with the CSIC’s institutional open access mandate that came into effect on April 1, 2019. [CSIC = Consejo Superior de Investigaciones Científicas.]
This institutional mandate is one of the so-called “green route” mandates, since it designates the DIGITAL.CSIC repository as the channel for opening up the research results of its research community.
The mandate covers a wide range of types of research results. On the one hand, the CSIC provides that the bibliographic references (metadata) of all peer-reviewed publications (articles, book chapters, books, conference communications) be made publicly and permanently available in DIGITAL.CSIC from the moment of editorial acceptance, and that their full texts be made freely available in DIGITAL.CSIC as soon as publishers allow.
On the other hand, it provides that the bibliographic references (metadata) of datasets associated with journal articles be made permanently public in DIGITAL.CSIC from the moment of editorial acceptance of the associated articles, and that such datasets be made open access in DIGITAL.CSIC as long as there are no legitimate confidentiality, intellectual property, and/or security reasons.
We inaugurate this website with the publication of the results of a first round of monitoring carried out by the Technical Office of DIGITAL.CSIC during the summer of 2020.
We hope that this website will be a useful and transparent instrument for monitoring the degree of compliance with the institutional mandate at the level of individual CSIC institutes, and a basis for analytical studies of various kinds….”