Abstract: There has been much debate around the role of metrics in scholarly communication, with particular focus on the misapplication of journal metrics, such as the impact factor in the assessment of research and researchers. Various initiatives have advocated for a change in this culture, including the Declaration on Research Assessment (DORA), which invites stakeholders throughout the scholarly communication ecosystem to sign up and show their support for practices designed to address the misuse of metrics. This case study provides an overview of the process undertaken by a large academic publisher (Taylor & Francis Group) in signing up to DORA and implementing some of its key practices in the hope that it will provide some guidance to others considering becoming a signatory. Our experience suggests that research, consultation and flexibility are crucial components of the process. Additionally, approaching signing with a project mindset versus a ‘sign and forget’ mentality can help organizations to understand the practical implications of signing, to anticipate and mitigate potential obstacles and to support cultural change.
“PLOS has released a preprint and supporting data on research conducted to understand the needs and habits of researchers in relation to code sharing and reuse as well as to gather feedback on prototype code notebooks and help determine strategies that publishers could use to increase code sharing.
Our previous research led us to implement a mandatory code sharing policy at PLOS Computational Biology in March 2021 to increase the amount of code shared alongside published articles. As well as exploring policy to support code sharing, we have also been collaborating with NeuroLibre, an initiative of the Canadian Open Neuroscience Platform, to learn more about the potential role of technological solutions for enhancing code sharing. NeuroLibre is one of a growing number of interactive or executable technologies for sharing and publishing research, some of which have become integrated with publishers’ workflows….”
The UK House of Commons Science and Technology Committee has called for evidence on the roles that different stakeholders play in reproducibility and research integrity. Of central priority are proposals for improving research integrity and quality, as well as guidance and support for researchers. In response to this, we argue that there is one important component of research integrity that is often absent from discussion: the pedagogical consequences of how we teach, mentor, and supervise students through open scholarship. We justify the need to integrate open scholarship principles into research training within higher education and argue that pedagogical communities play a key role in fostering an inclusive culture of open scholarship. We illustrate these benefits by presenting the Framework for Open and Reproducible Research Training (FORRT), an international grassroots community whose goal is to provide support, resources, visibility, and advocacy for the adoption of principled, open teaching and mentoring practices, whilst generating conversations about the ethics and social impact of higher-education pedagogy. Representing a diverse group of early-career researchers and students across specialisms, we advocate for greater recognition of and support for pedagogical communities, and encourage all research stakeholders to engage with these communities to enable long-term, sustainable change.
“In January 2023, the US National Institutes of Health (NIH) will begin requiring most of the 300,000 researchers and 2,500 institutions it funds annually to include a data-management plan in their grant applications — and to eventually make their data publicly available.
Researchers who spoke to Nature largely applaud the open-science principles underlying the policy — and the global example it sets. But some have concerns about the logistical challenges that researchers and their institutions will face in complying with it. Namely, they worry that the policy might exacerbate existing inequities in the science-funding landscape and could be a burden for early-career scientists, who do the lion’s share of data collection and are already stretched thin….
Such a seismic shift in practice has left some researchers worried about the amount of work that the mandate will require when it becomes effective….
Others worry that data-management activities will further sap funds from under-resourced labs. Although the policy outlines certain fees that researchers can add to their proposed budgets to offset the costs of compliance with the mandate, it doesn’t specify what criteria the NIH will use to grant these requests….
Despite its potential pitfalls, Ross thinks that the policy will have a ripple effect that will persuade smaller funding agencies and industry to adopt similar changes. “This policy establishes what people expect from clinical research,” he says. “It’s essentially saying the culture of research needs to change.” ”
“While preprint feedback is beneficial for the authors, reviewers, readers and other stakeholders, public commenting on preprints has so far remained relatively low. Cultural barriers likely influence participation in public preprint feedback. Authors fear that competitors will leave unfair criticism, or that even fair criticism will bias journal editors and evaluators: while nearly every paper will be thoroughly criticized during journal peer review, the rarity of this feedback being out in the open might lead some to believe that the paper receiving it is especially problematic. Potential reviewers, especially those who rely on more senior colleagues for career advancement, are concerned about retribution for public criticism, or simply harming their reputation by leaving uninformed feedback.
In order to overcome these concerns, we convened a Working Group to discuss how to alleviate the social friction associated with public feedback by developing a set of behavioral norms to guide constructive participation in preprint review. The Working Group brought together relevant stakeholders (researchers, editors, preprint review platform representatives, funders) to discuss the challenges around participation in preprint review and explore what cultural norms could enable and foster further participation in public commentary and feedback. …”
“The open science movement has been gathering force in STEM disciplines for many years, and some of its procedural elements have been adopted also by quantitative social scientists. However, little work has yet been done on exploring how more ambitious open science principles might be deployed across both the qualitative and quantitative social science disciplines. Patrick Dunleavy sets out some initial ideas to foster a cultural shift towards open social science, explored in a current CIVICA project.”
Abstract: There has been strong interest in preprint commenting and review activities in recent years. Public preprint feedback can bring benefits to authors, readers and others in scholarly communication; however, the level of public commenting on preprints is still low. This is likely due to cultural barriers, such as fear by authors that criticism of their paper will bias readers, editors and evaluators, and concerns by commenters that posting a public critique on a preprint by a more senior colleague may lead to retribution. In order to help address these cultural barriers and foster positive and constructive participation in public preprint feedback, we have developed a set of 14 principles for creating, responding to, and interpreting preprint feedback. The principles are clustered around four broad themes: Focused, Appropriate, Specific, Transparent (FAST). We describe each of the FAST principles and designate which actors (authors, reviewers and the community) each of the principles applies to. We discuss the possible implementation of the FAST principles by different stakeholders in science communication, and explore what opportunities and challenges lie ahead in the path towards a thriving preprint feedback ecosystem.
“There are many benefits to public feedback on preprints: comments that can help authors improve their work, broader opportunities for early career researchers to participate in review, and additional context for readers. However, we have not yet seen wide engagement in the public review of preprints. This is likely due to cultural barriers: there is a lack of incentives for researchers to participate in preprint review, and there can be risks associated with posting pointed critiques, however constructive, on a paper by another researcher whose favour you may require for a future job or grant.
If we are to foster a positive and thriving environment for public preprint feedback, we need to collectively agree on the norms and behaviors we expect when creating, responding to, and interpreting preprint feedback. This was the remit of the ASAPbio preprint review cultural norms Working Group, which has been developing a set of principles for preprint feedback over the last six months. Following the initial draft shared last July, the Working Group has iterated on the principles based on feedback and has discussed their potential use by different stakeholders in science communication.
We are pleased to now share the FAST principles for preprint feedback. This is a set of 14 principles clustered around four broad themes: Focused, Appropriate, Specific, and Transparent (FAST). Each principle includes a designation for the actors it applies to: authors, reviewers and the community….”
In this article, we provide a toolbox of recommendations and resources for those aspiring to promote the uptake of open scientific practices. Open Science encompasses a range of behaviours that aim to improve the transparency of scientific research. This paper is divided into seven sections, each devoted to different groups or institutions in the research ecosystem: colleagues, students, departments and faculties, universities, academic libraries, journals, and funders. We describe the behavioural influences and incentives for each of these stakeholders as well as changes they can make to foster Open Science. Our primary goal, however, is to suggest actions that researchers can take to promote these behaviours, inspired by simple principles of behaviour change: make it easy, social, and attractive. In isolation, a small shift in one person’s behaviour may appear to make little difference, but when combined, many shifts can radically alter shared norms and culture. We offer this toolbox to assist individuals and institutions in cultivating a more open research culture.
“EMBL has released a new Open Science Policy as part of its ongoing commitment to drive trust, transparency, and more inclusive research across the life sciences….
The Open Science Policy will expand on existing practice, and contribute to positive culture change across EMBL and more widely. To ensure this, the policy covers research assessment and fair attribution of credit. The policy also puts in place guidelines for EMBL staff regarding open and timely access to research results via publications, data, and software….
The 2021 High Level Workshop on the European Research Area dealt with the topic of research culture and how to keep the research sector attractive for current and future generations of researchers.
The research system faces several challenges when it comes to attractiveness, including precarious career paths, a narrow and ineffective reward and incentive system, and a continued lack of diversity throughout the research environment.
The workshop focused on the influence of culture on how research is conceived, conducted, communicated, and assessed, with the goal to understand the views of different actors in the ERA and to pursue alignment on fundamental aspects of research culture. This, in turn, can help to identify actions that can make the ERA a more attractive place for researchers.
“Are you an experienced and ambitious Research and Innovation Development professional looking to further your career in one of the UK’s leading research-intensive universities? Are you an advocate for a collaborative, inclusive, and supportive research culture? Are you a confident leader, capable of building productive working relationships with academic colleagues, funders and senior stakeholders? …”
“Lizzie Gadd makes the case for open research being required, not rewarded.
There’s no glory associated with running due diligence on your research partners, and following GDPR legislation won’t give you an advantage in a promotion case. These are basic professional expectations placed on every self-respecting researcher. And whilst there are no prizes for those who adhere to them, there are serious consequences for those who don’t. Surely this is what we want for open research? Not that it should be treated as an above-and-beyond option for the savvy few, but that it should be a bread-and-butter expectation on everyone.
Now I appreciate there is probably an interim period where institutions want to raise awareness of open research practices (as I said before, they need to be enabled before they can be incentivised). And during this period, running some ‘Open Research Culture Awards’ or offering ‘Open research hero badges’ to web pages might have their place. But we can’t dwell here for long. We need to move quite rapidly to this being a basic expectation on researchers. We have to define what open research expectations are relevant to each discipline. Add these expectations to our Codes of Good Research Practice. Train researchers in their obligations. Monitor (at discipline/HEI level) engagement with these expectations. And hold research leads accountable for the practices of their research groups.”
“Research systems now generate knowledge faster than ever before, and fundamental research problems and societal challenges are becoming increasingly complex, putting greater demand on the skills and competencies of the research community as well as the infrastructure and organisations that support them. In parallel to these increasing demands, research communities and systems are faced with challenges that threaten both sectoral attractiveness and sustainability, including precarious career paths, a narrow and ineffective rewards and incentives system, delays in achieving openness, and a continued lack of diversity throughout the research environment….
To move forward on Science Europe’s priority to ‘contribute to the evolution of research culture’, the following commitments are made. Science Europe: … Commits to creating an open values framework that underpins this vision for research culture, and will promote alignment on actions required to embed those values. This values framework will be consistently refined through consultation and collaboration with expert groups as well as representatives from all relevant research stakeholder groups, and will be a reference for policy and practice development.”