SPARC Europe to facilitate high-level European OS policymaker group CoNOSC – SPARC Europe

“SPARC Europe is honoured to support the Council for National Open Science Coordination (CoNOSC) in their efforts to advance national European Open Science policies. The CoNOSC mission is to help countries create, update and coordinate their national open science policies by sharing valuable insights from the network.

SPARC Europe will work with the CoNOSC group at least until the end of 2022. We will investigate the needs of today’s national Open Science (OS) policymakers and organise strategic OS policy meetings with high-level national OS Co-ordinators and ministry representatives to determine priorities for the coming year. We will facilitate discussion around important OS policy topics and showcase policy developments and outcomes to help resolve common challenges and stimulate synergies. We will also help build and expand the CoNOSC network….”

CoNOSC – Council for National Open Science Coordination

“On 21 October 2019, France, the Netherlands and Finland invited representatives of the ERAC countries to Helsinki to discuss the creation of a network of open science coordination. The program of the day is included. Twenty-one countries were present, as well as the European Union. Participants agreed that it was necessary to create such a network to enable the coordination of national efforts in the field of open science.

The objectives and organizational principles of this network, which we have named ‘Council for National Open Science Coordination’ (CoNOSC), are specified in the attached Memorandum of Understanding. Here is the summary:

– CoNOSC helps to fill in the gaps in national open science coordination.

– CoNOSC will provide valuable insights through dialogue with other international partners.

– CoNOSC membership is in principle open to all countries within the European Research Area….”

Guest Post – Transforming the Transformative Agreement – The Scholarly Kitchen

“At Cambridge University Press, we’ve been engaged in a major expansion of our TAs with US institutions. Agreements with 130 institutions came into effect this year with a diverse mix of organizations, including state university systems, liberal arts colleges, and major research universities. These agreements follow the “Read and Publish” model (R&P) we kicked off in the US with the University of California system, repurposing institutions’ existing subscription spend to open up access to important scholarly content and to extend the reach of their researchers’ work. The success this year in the US now gives us real scale — we have over 100 TAs covering 1000 institutes in 30 countries — and a critical mass of customer, author, and stakeholder feedback has given us a much better sense of what we will need to prioritize moving forward.

Yet even as we’ve actively sought to build momentum for change through R&P arrangements, we know that the evolution of TAs is essential to a long-term transition. While there are still many challenges we must solve for collectively, we are focusing our external engagement on four main areas.

Funder mandates should not be the only drivers of change….

Increased scale must come with better use of resources….

Equity and diversity must be supported in new ways….

Open is a means, not an end….”

Open Science and Data Policy Developments: Virtual SciDataCon 2021 Strand – CODATA, The Committee on Data for Science and Technology

“Virtual SciDataCon 2021 is organised around a number of thematic strands. This is the third of a series of announcements presenting these strands to the global data community. Please note that registration is free, but participants must register for each session they wish to attend.

The COVID-19 pandemic has demonstrated some of the benefits of Open Science practices, while highlighting persistent shortcomings in the current science system. The deepening climate crisis underlines the need for targeted data gathering and action-oriented research. In the policy sphere, 2021 started with the adoption of the ‘Recommendation of the OECD Council concerning Access to Research Data from Public Funding’. November should see the adoption of a Recommendation on Open Science by the UNESCO General Conference: a major achievement which it is hoped will have a mobilising effect on Member States worldwide. The UNESCO Recommendation defines shared values and principles for Open Science, and identifies concrete measures on Open Science, with proposals to bring citizens closer to science and commitments to facilitate the production and dissemination of scientific knowledge around the world.

On Tuesday 19 October, SciDataCon will host a strand of sessions exploring these and other important Open Science and data policy developments. Two sessions relate to the implementation of the OECD Recommendation. The third will include an update on the UNESCO Recommendation and other developments….”

Second French Plan for Open Science: Generalising Open Science in France 2021-2024

This Second French Plan extends the scope to include source code from research, structures actions promoting data sharing and openness through the creation of the Recherche Data Gouv platform, increases the number of transformative levers available to generalise the practice of open science, and is divided up into different disciplines and themes. It is firmly attached to a European-wide vision and, in the context of the French presidency of the European Union, proposes to act in favour of open science being effectively taken into account in both individual and collective assessments for research. This involves initiating a process of sustainable transformation in order to ensure that open science becomes a common and shared practice, encouraged by the whole international ecosystem of higher education, research and innovation.

Data sharing policies: share well and you shall be rewarded | Synthetic Biology | Oxford Academic

Abstract:  Sharing research data is an integral part of the scientific publishing process. By sharing data, authors enable their readers to use their results in a way that the textual description of the results does not allow by itself. In order to achieve this objective, data should be shared in a way that makes it as easy as possible for readers to import them in computer software where they can be viewed, manipulated and analyzed. Many authors and reviewers seem to misunderstand the purpose of the data sharing policies developed by journals. Rather than being an administrative burden that authors should comply with to get published, the objective of these policies is to help authors maximize the impact of their work by allowing other members of the scientific community to build upon it. Authors and reviewers need to understand the purpose of data sharing policies to assist editors and publishers in their efforts to ensure that every article published complies with them.

Prospective Clinical Trial Registration: A Prerequisite for Publishing Your Results | Radiology

“The ICMJE requires that clinical trial results be published in the same clinical trial depository where the trial is registered. These results are in the form of a short (≤500 words) abstract or table (6,7). Full disclosure of the existing results publication in a clinical trial registry should be explicitly stated when the manuscript is submitted for publication. The Food and Drug Administration (FDA) has indicated it will enforce trial results reporting related to ClinicalTrials.gov (8). The FDA is authorized to seek civil monetary penalties from responsible parties, including additional penalties for continued noncompliance. In the United States, the sponsor of an applicable clinical trial is considered the responsible party, unless or until the sponsor designates a qualified principal investigator as the responsible party. The FDA issued its first Notice of Noncompliance in April 2021 for failure to report results in ClinicalTrials.gov based on a lack of reporting the safety and effectiveness results for the drug dalantercept in combination with axitinib in patients with advanced renal cell carcinoma (8).

Finally, as of July 1, 2018, manuscripts submitted to ICMJE journals that report the results of clinical trials must contain a data sharing statement. Clinical trials that begin enrolling participants on or after January 1, 2019, must include a data sharing plan in the trial registration. (For further information, see www.icmje.org/recommendations/browse/publishing-and-editorial-issues/clinical-trial-registration.html.) Since most clinical trials take 2 or more years for results to be reported, the Radiology editorial board had expected such mandatory data sharing plans to be reported in the current year. However, because of the COVID-19 pandemic, many clinical trials were halted. Thus, journal publication requirements to include data sharing statements are more likely to impact authors beginning in 2023. Data sharing statements required for Radiological Society of North America (RSNA) journals may be found at https://pubs.rsna.org/page/policies#clinical.

In conclusion, prospective clinical trial registration is a mechanism allowing us to ensure transparency in clinical research conduct, honest and complete reporting of the clinical trial results, and minimization of selective result publications. Since its inception in 2004, this requirement has evolved into a policy that is practiced by major medical journals worldwide, is mandatory for publication of trial results, and, in some circumstances, is enforced by the FDA. Further, ICMJE journals, including RSNA journals, are expecting manuscripts that report trial results to include statements on data sharing. As each clinical trial design is unique, we encourage authors to refer to the full description of the current ICMJE policy at icmje.org for additional information pertaining to their specific circumstances.”

Free for all, or free-for-all? A content analysis of Australian university open access policies | bioRxiv

Abstract: Recent research demonstrates that Australia lags in providing open access to research outputs. In Australia, while the two major research funding bodies require open access of outputs from projects they fund, these bodies only fund a small proportion of research conducted. The major source of research and experimental development funding in Australian higher education is general university, or institutional, funding, and such funds are not subject to national funder open access policies. Thus, institutional policies and other institutional supports for open access are important in understanding Australia’s OA position. The purpose of this paper is, therefore, to understand the characteristics of Australian institutional open access policies and to explore the extent to which they represent a coherent and unified approach to delivering and promoting open access in Australia. Open access policies were located using a systematic web search approach and then their contents were analysed. Only half of Australian universities were found to have an open access policy. There was a wide variation in language used, expressed intent of the policy and expectations of researchers. Few policies mention monitoring or compliance and only three mention consequences for non-compliance. While it is understandable that institutions develop their own policies, when language is used which does not reflect national and international understandings, and when requirements are not clear and lack consequences, policies are unlikely to contribute to understanding of open access, to uptake of the policy, or to ease of transferring understanding and practices between institutions. A more unified approach to open access is recommended.


PsyArXiv Preprints | When open data closes the door: Problematising a one size fits all approach to open data in journal submission guidelines

Abstract:  Opening data promises to improve research rigour and democratise knowledge production. But it also poses practical, theoretical, and ethical risks for qualitative research. Despite discussion about open data in qualitative social psychology predating the replication crisis, the nuances of this discussion have not been translated into current journal policies. Through a content analysis of 261 journals in the domain of social psychology, we establish the state of current journal policies for open data. We critically discuss how these expectations may not be adequate for establishing qualitative rigour, can introduce ethical challenges, and may place those who wish to use qualitative approaches at a disadvantage in peer review and publication processes. We assert that open data requirements should include clearer guidelines that reflect the nuance of data sharing in qualitative research, and move away from a universal ‘one-size-fits-all’ approach to data sharing.


Annual report: a recap of the San Francisco Declaration on Research Assessment (DORA) activities in 2020 | DORA

“Over the past year, it has become increasingly clear that research assessment reform is a systems challenge that requires collective action. Point interventions simply do not solve these types of complex challenges that involve multiple stakeholders. Because of this, we dedicated our efforts in 2020 to building a community of practice and finding new ways to support organizations seeking to improve the decision-making that impacts research careers.

Current events also influenced our approach this year and evolved our thinking about research assessment reform. The Covid-19 pandemic led to the abrupt global disruption of academic research, along with many other industries. For academics with limited access to research laboratories and other on-campus resources, work stalled. Without appropriate action, this disruption will have a profound effect on the advancement and promotion of the academic workforce, and it will likely disproportionately affect women and underrepresented and minoritized researchers. So in April DORA called on institutions to redefine their expectations and clearly communicate how evaluation procedures will be modified. In May, DORA organized a webinar with Rescuing Biomedical Research to better understand specific faculty concerns as a result of the pandemic….

In the Fall of 2020, DORA initiated a new community project with Schmidt to develop a means for institutions to gauge their ability to support academic assessment interventions and set them up for success. Our goal for the project was to support the development of new practices by helping institutions analyze the outcomes of their efforts. More than 70 individuals in 26 countries on 6 continents responded to our informal survey in August, and about 35 people joined us for 3 working sessions in September. From these activities, we heard it was important to look beyond individual interventions to improve assessment, because the success of these interventions depends on institutional conditions and capabilities. We were also reminded that institutional capabilities impact interventions, so it is important not only to gauge success but also to support interventions. These and other insights led us to create SPACE to Evolve Academic Assessment: a rubric for analyzing institutional conditions and progress indicators. The first draft of the rubric was developed in the last quarter of 2020. The final version was released in 2021 after an initial pilot phase with seven members of the academic community, including a college dean, policy advisor, research administrator, faculty member, and graduate student….

Another addition to the website was a repository of case studies documenting key elements of institutional change to improve academic career assessment, such as motivations, processes, timelines, new policies, and the types of people involved. The repository, Reimagining academic assessment: stories of innovation and change, was produced in partnership with the European University Association and SPARC Europe. At the time of launch, the repository included 10 structured case studies coming from 7 universities and 3 national consortia. Nine of the 10 cases are from Europe and one is from China. The case studies have shown us the importance of coalition-building to gain bottom-up support for change. We also learned that limited awareness and capacity for incentivizing and rewarding a broader range of academic activities were challenges that all the cases had to overcome. By sharing information about the creation of new policies and practices, we hope the case studies will serve as a source of inspiration for institutions seeking to review or improve academic career assessment….

Policy progress for research assessment reform continued to gain momentum in 2020. A new national policy on research assessment in China announced in February prohibits cash rewards for research papers and indicates that institutions can no longer exclusively hire or promote researchers based on their number of publications or citations. In June, Wellcome published guidance for research organizations on how to implement responsible and fair approaches for research assessment that are grounded in….”

Open Access Week events will feature the university’s new open access policy | VTx | Virginia Tech

“Beginning Oct. 25, Open Access Week will offer talks, panel discussions, and workshops to help Virginia Tech authors understand the importance of sharing their scholarship and how to do so. A key topic of discussion throughout the week is the university’s new open access policy.

Kevin McGuire, a professor in Forest Resources and Environmental Conservation and an Open Access Policy Working Group member, will discuss the new policy….

Also on that Monday from 6-8 p.m., experts from Virginia Tech and Harvard University will discuss the new open access policy. This feature presentation will dive into how the open access policy works, how it was developed, and why it’s important. Peter Suber from Harvard University’s Office of Scholarly Communication and members of the open access working group will answer questions from attendees regarding the open access policy. The presentation will be followed by introductions to open data and open educational resources….”

OAreport: Put OA policies into practice in minutes, not months.

“We discover papers and data using open scholarly metadata, targeted text and data mining, and an institution’s internal data sources….

We transparently analyse those papers against all the terms of the institution’s current policy, or custom criteria, to provide detailed statistics and key insights….

We help libraries and funders unlock individual papers as they’re published by making outreach a one-click process, and help build evidence for systemic changes….”

Australian funder backflips on controversial preprint ban

“Australia’s major research funding body has backtracked on a rule that banned the mention of preprints in grant applications, under pressure from researchers who decried the ruling as “astonishing” and “outdated”.

The policy adjustment by the Australian Research Council (ARC) comes nearly four weeks after an anonymous researcher behind the ARC Tracker account on Twitter revealed that dozens of applications for early-career funding schemes had been rejected for citing preprints. More than 30 applications, worth Aus$22 million (US$16 million), were ruled ineligible.

Several rejected applicants, who can’t apply again because fellowship-application attempts are limited, told Nature last month that the decision had effectively ended their careers….”