“Gold Open Access under a Creative Commons licence is arguably a major way in which you can increase the reach of your work because most clinicians or patients cannot access paywalled content. If you submit to a subscription publisher, you will have the option of paying the article processing charge (APC) so that your work becomes Gold Open Access. This is expensive for most individuals; your funder or institution may pay the APC on your behalf – but you must ask. You can also ask the journal if other forms of Open Access are appropriate or possible (Table 1). Next, you can approach your department, institution or university to see what promotion it can offer. Sometimes the journal may have declined to issue a press release, but others might still believe your work to be newsworthy. Authors could even approach newspapers, radio stations, broadcasters and journalists independently. Finally, you may wish to approach blog and podcast producers, conference organisers and social media influencers. The more methods used to communicate key messages from your work, the higher the reach of your paper….”
Peer-reviewed scientific publications and congress abstracts are typically written by scientists for specialist audiences; however, patients and other non-specialists are understandably interested in the potential implications of research and what they may mean for them. Plain language summaries (PLS)—summaries of scientific articles in easy-to-read language—are emerging as a valuable addition to traditional scientific publications. Co-creation of PLS with the intended audience is key to ensuring a successful outcome, but practical guidance on how to achieve this has been lacking.
Building on the Patient Engagement (PE) Quality Guidance previously developed by Patient Focused Medicines Development (PFMD), a multi-stakeholder working group (WG) of individuals with patient engagement experience and/or expertise in PLS was established to develop further activity-specific guidance. PLS guidance was developed through a stepwise approach that included several rounds of co-creation, public consultation (two rounds), internal review and a final external review. The iterative development process incorporated input from a wide variety of stakeholders (patient representatives, industry members, publishers, researchers, medical communications agencies, and public officials involved in research bodies). Feedback from each step was consolidated by the WG and used for refining the draft guidance. The final draft was then validated through external consultation.
The WG comprised 14 stakeholders with relevant experience in PE and/or PLS. The WG developed a set of 15 ethical principles for PLS development. These include the necessity for objective reporting and the absence of any promotional intent, the need for balanced presentation, the importance of audience focus, the need to apply health literacy principles, and the importance of using inclusive and respectful language. The first public consultation yielded 29 responses comprising 478 comments or edits in the shared draft guidance. The second public consultation was an online survey of 14 questions which had 32 respondents. The final ‘How-To’ Guide reflects feedback received and provides a rational, stepwise breakdown of the development of PLS.
The resulting ‘How-To’ Guide is a standalone, practical, ready-to-use tool to support multi-stakeholder co-creation of PLS.
“The Content Strategy Committee (CSC) arrived at the Assessment Guidelines for Open Access Publishers through a consultation with the CRKN membership, the results of which were presented to and supported by members at the 2021 Conference. These are meant to be guiding principles that the CSC may use in assessing whether proposals from vendors and publishers meet the CRKN membership’s stated goals and objectives with respect to supporting open access. We have purposefully not assigned any weighting to the criteria so that the CSC has as much latitude as possible in assessing the offers it receives from providers. Therefore, there is not an expectation that each provider will meet every criterion….”
“Three Paths to Open Access is a handout that can be shared with researchers to provide an overview of three common options for making their work open access. The content can be edited to better reflect your institution’s open access support services. For a more in-depth exploration of this topic, see our YouTube video, Three Routes to Open Access: https://www.youtube.com/watch?v=hkSLywLnS9c …”
Open-source science builds on open and free resources that include data, metadata, software, and workflows. Informed decisions on whether and how to (re)use digital datasets depend on an understanding of the quality of the underpinning data and relevant information. However, quality information, being difficult to curate and often context specific, is currently not readily available for sharing within and across disciplines. To help address this challenge and promote the creation and (re)use of freely and openly shared information about the quality of individual datasets, members of several groups around the world have undertaken an effort to develop international community guidelines with practical recommendations for the Earth science community, collaborating with international domain experts. The guidelines were inspired by the guiding principles of being findable, accessible, interoperable, and reusable (FAIR). Use of the FAIR dataset quality information guidelines is intended to help stakeholders, such as scientific data centers, digital data repositories, and producers, publishers, stewards and managers of data, to: i) capture, describe, and represent quality information of their datasets in a manner that is consistent with the FAIR Guiding Principles; ii) allow for the maximum discovery, trust, sharing, and reuse of their datasets; and iii) enable international access to and integration of dataset quality information. This article describes the process used to develop the guidelines, which are aligned with the FAIR principles, presents a generic quality assessment workflow, describes the guidelines for preparing and disseminating dataset quality information, and outlines a path forward to improve their disciplinary diversity.
“Below are some of the fundamental guidelines of transformative agreements, as defined by the ESAC Initiative community; a listing of the specific requirements that have been adopted by national consortia and other organizations can be found here https://esac-initiative.org/guidelines/. …”
“From optimising supply chains and supporting innovation, to addressing sector challenges and delivering public services, we have seen that sharing data can generate benefits for companies, the economy, society and the environment.
However, a common concern for organisations looking to share data is providing assurance to senior leaders that sharing a particular set of data will not generate negative impacts on reputation; compromise legal compliance or negatively affect their place in the market; or cause harm to society, the economy or the environment.
With this in mind, we’ve created this guide to help organisations identify, assess and manage risks related to sharing data that they hold.
This guide seeks to provide early steps – prior to seeking legal counsel (if that is required) – to consider real and perceived risks in sharing data to identify suitable mitigating actions. We include typical risk categories, key questions to consider and suggestions on how to minimise harm….”
Abstract: Are you interested in the field of scholarly communications or have you recently been hired at your institution as the director of scholarly initiatives? The concepts presented in Sustaining and Enhancing the Scholarly Communications Department: A Comprehensive Guide by Kris S. Helge, Ahmet Meti Tmava, and Amanda R. Zerangue provide guidance for the scholarly communications librarian, especially those new to the profession.
Wissenschaftsrat (2022): Empfehlungen zur Transformation des wissenschaftlichen Publizierens zu Open Access [Recommendations on the Transformation of Scholarly Publishing to Open Access]; Köln. DOI: https://doi.org/10.57674/fyrc-vb61
“In May 2015, the Center for Open Science invited Epidemiology to support the Transparency and Openness Promotion (TOP) Guidelines.1 After consulting our editors and former Editors-in-Chief, I declined this invitation and published an editorial to explain the rationale.2 Nonetheless, the Center for Open Science has assigned a TOP score to the journal and disseminated the score via Clarivate, which also disseminates the Journal Impact Factor. Given that Epidemiology has been scored despite opting not to support the TOP Guidelines, and that our score has been publicized by the Center for Open Science, we here restate and expand our concerns with the TOP Guidelines and emphasize that the guidelines are at odds with Epidemiology’s mission and principles. We declined the invitation to support the TOP Guidelines for three main reasons. First, Epidemiology prefers that authors, reviewers, and editors focus on the quality of the research and the clarity of its presentation over adherence to one-size-fits-all guidelines. For this reason, among others, the editors of Epidemiology have consistently declined opportunities to endorse or implement endeavors such as the TOP Guidelines.3–5 Second, the TOP Guidelines did not include a concrete plan for program evaluation or revision. Well-meaning guidelines with similar goals sometimes have the opposite of their intended effect.6 Our community would never accept a public health or medical intervention that had little evidence to support its effectiveness (more on that below) and no plan for longitudinal evaluation. We hold publication guidelines to the same standard. Third, we declined the invitation to support the TOP Guidelines because they rest on the untenable premise that each research article’s results are right or wrong, as eventually determined by whether its results are reproducible or not.
Too often, and including in the study of reproducibility that was foundational in the promulgation of the TOP Guidelines,7 reproducibility is evaluated by whether results are concordant in terms of statistical significance. This faulty approach has been used frequently, even though the idea that two results—one statistically significant and the other not—are necessarily different from one another is a well-known fallacy.8,9 ”
“Whether starting to develop your own open access strategy or assessing a publisher “read and publish” offer for the first time, adapting to the changes underway in the scholarly publishing landscape can be daunting. Luckily, clear signposts have emerged and, thanks to the excellent resources shared by the community, there is no need to re-invent the wheel.
The ESAC Reference Guide is the narrative manifestation of a mapping exercise conducted in Spring 2021 by members of the international ESAC community who have accumulated deep, first-hand knowledge and expertise in the negotiation and implementation of transformative agreements with scholarly publishers. Threading together and contextualizing the many local guidelines, recommendations, toolkits, templates and data openly available, the reference guide serves as an authoritative and essential orientation for librarians and consortium staff just beginning to approach or looking to update their transformative agreement strategies based on the latest benchmarks.
The ESAC Reference Guide develops through the phases of preparing, negotiating and implementing an agreement, but libraries and library consortia each have their own unique starting points, and the steps they take in adopting transformative agreements will have local flavors….”
Abstract: In this short practice paper, we introduce the public version of the Qualitative Data Repository’s (QDR) Curation Handbook. The Handbook documents and structures curation practices at QDR. We describe the background and genesis of the Handbook and highlight some of its key content.
Abstract: To make evidence-based policy, the research ecosystem must produce trustworthy evidence. In the US, federal evidence clearinghouses evaluate research using published standards designed to identify “evidence-based” interventions. Because their evaluations can affect billions of dollars in funding, we examined 10 federal evidence clearinghouses. We found their standards focus on study design features such as randomization, but they overlook issues related to transparency, openness, and reproducibility that also affect whether research tends to produce true results. We identified intervention reports used by these clearinghouses, and we developed a method to assess journals that published those evaluations. Based on the Transparency and Openness Promotion (TOP) Guidelines, we created new instruments for rating journal instructions to authors (policies), manuscript submission systems (procedures), and published articles (practices). Of the 340 journals that published influential research, we found that some endorse the TOP Guidelines, but standards for transparency and openness are not applied systematically and consistently. Examining the quality of our tools, we also found varying levels of interrater reliability across different TOP standards. Using our results, we delivered a normative feedback intervention. We informed editors how their journals compare with TOP and with their peer journals. We also used the Theoretical Domains Framework to develop a survey concerning obstacles and facilitators to implementing TOP; 88 editors responded and reported that they are capable of implementing TOP but lack motivation to do so. The results of this program of research highlight ways to support and to assess transparency and openness throughout the evidence ecosystem.
“A Researcher’s Guide to Working With Public Access Books…”
Not even an abstract is OA.