The Scholarly Publishing Roundtable was formed in 2009 at the request of a US Congressional Committee to develop recommendations for public access policy.
Published in January 2010, the Roundtable’s recommendations had a significant impact on the guidelines for federal funding agencies issued in 2013.
The Roundtable was unique in bringing together individuals holding divergent views about open access policy.
The success of the Roundtable may provide important lessons for policymakers in addressing open access issues….”
“As a journal-level metric, the IF is unable to assess the value of any given article or author. To make this inference, one would need to read the article and assess its claims, scientific rigor, methodological soundness, and broader implications. What’s more, the IF (which represents the average number of citations across a finite set of eligible articles) is vulnerable to the skewness in citation rates among articles (Nature, 2005) and to the manipulation, negotiation, and gaming of its calculation among stakeholders (Ioannidis & Thombs, 2019). At a more fundamental level, the IF does not capture journal functioning such as improvements to (or worsening of) internal evaluative processes (e.g., effectiveness of peer review, changes to submission instructions and policies, use of and adherence to reporting guidelines, etc.; Dunleavy, 2022). These and other issues are explored in more depth by Seglen (1997)….
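The skewness problem described above is easy to see with a toy computation. The sketch below uses hypothetical citation counts (not real journal data) to show how the IF, being a mean, can be pulled far above what a typical article in the journal actually receives:

```python
# Illustrative sketch with hypothetical data: the Impact Factor is a
# mean over a skewed citation distribution, so a single highly cited
# article can dominate it.
citations = [0, 0, 1, 1, 2, 2, 3, 120]  # hypothetical citations per article

impact_factor = sum(citations) / len(citations)        # mean, as the IF is computed
median_citations = sorted(citations)[len(citations) // 2]

print(f"Mean (IF-style): {impact_factor:.3f}")  # 16.125, inflated by one outlier
print(f"Median article:  {median_citations}")   # 2, the 'typical' article
```

Here the journal's IF-style mean is 16.125, yet the median article is cited only twice, which is why a journal-level average says little about any individual article.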
In light of these limitations, social work should de-emphasize the IF and instead embrace a new set of evaluative tools. The San Francisco Declaration on Research Assessment (American Society for Cell Biology, 2013)—and more recently the Leiden Manifesto (Hicks et al., 2015)—typify such efforts. They encourage stakeholders (i.e., academic institutions, journals, funders, researchers) to consider using a multitude of qualitative and quantitative alternative metrics (i.e., “altmetrics”; Priem et al., 2012; see also https://metrics-toolkit.org/metrics/) when judging scholarly output—whether it be a journal article, a grant proposal, or even a hiring or tenure packet. …”
“Some final thoughts: (1) Overall usage was a stronger influence on the change in value than the small changes in the proportion of hybrid OA article usage. (2) Despite the range of research activity levels across our institutions, there wasn’t much difference in the proportion of open versus controlled usage across the site-licensed institutions for either publisher. (3) COVID likely affected these trends, but precisely how was unclear. Did lockdown increase the usage or limit it? Did it affect our two publishers differently? We have no ‘non-COVID’ control, unfortunately. (4) If the impact of transformative agreements on the rate of hybrid OA article output influenced these trends, the impact was quite small. Still, with more libraries negotiating transformative agreements, growth in the proportion of OA articles should accelerate. As long as usage in publisher packages continues to grow, cost per controlled use will increase more quickly than cost per use. This new cost per controlled use metric should help libraries track the return on investment from their journal package subscription payments as a growing proportion of underlying articles are free to read.”
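The relationship between cost per use and cost per controlled use can be sketched with invented numbers (all figures below are hypothetical, not from the study): dividing the same package cost by only the paywalled (controlled) uses necessarily yields a higher figure, and the gap widens as the open share of usage grows.

```python
# Hypothetical sketch of the "cost per controlled use" metric: as the
# share of free-to-read (open) usage grows, cost per controlled use
# rises faster than plain cost per use.
package_cost = 100_000  # annual subscription spend (hypothetical)
total_uses = 50_000     # all article downloads in the package
open_uses = 15_000      # downloads of articles that are free to read

cost_per_use = package_cost / total_uses                   # 2.00
cost_per_controlled_use = package_cost / (total_uses - open_uses)  # ~2.86

print(f"Cost per use:            ${cost_per_use:.2f}")
print(f"Cost per controlled use: ${cost_per_controlled_use:.2f}")
```

If open usage later grew to 25,000 of the same 50,000 uses, cost per use would stay at $2.00 while cost per controlled use would double to $4.00, which is the return-on-investment signal the metric is meant to surface.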
“Emerald Publishing has partnered with Knowledge Unlatched (KU) to create and promote an Open Access e-book collection in business management and economics.
The exclusive deal starts in 2023 and is the first Open Access partnership of its kind for the publisher within its e-books portfolio.
All book titles in the “Emerald Publishing – Responsible Management and the SDGs” package will also focus on responding to and achieving the United Nations’ Sustainable Development Goals (SDGs), with a particular focus on the goals covering decent work and economic growth; industry, innovation and infrastructure; and responsible consumption and production.
Titles will cover themes such as diversity, inclusion and gender and racial equity in the workplace, sustainable tourism and ending forced labour, and how businesses of all sizes are working towards SDGs. …”
“Ahead of the June Competitiveness Council, where the ministers will be invited to adopt conclusions on research assessment and implementation of Open Science policies, The Guild urges the member states to ensure that Open Access serves science, not publishers.
While research excellence requires free flow of knowledge, some Open Access strategies and models increase the financial burden on research institutions. Article Processing Charges (APCs), used by some of the Open Access journals, exacerbate the unsustainable situation of journal spending in university libraries and create unequal access to knowledge. Greater transparency on the publication costs for Open Access journals, and fair and transparent contractual arrangements with publishers are crucial for monitoring the proper use of public research funding.
It is important to develop alternative and sustainable non-APC Open Access models. The Guild calls for the member states to support the development and uptake of Diamond Open Access journals and platforms, which are often community-driven, academic-led, and academically owned publishing initiatives. Unlike other Open Access models, Diamond Open Access journals and platforms do not charge any fees to authors or readers. Thus, they can further empower researchers to disseminate their research results, ensuring bibliodiversity and vital academic publishing….”
“Each day we will present TOPS’ upcoming plans for the Year of Open Science and highlight emerging trends, success stories, and lessons learned with open science experts. Public participation is encouraged.”
“There have been many calls in recent years to make data sets FAIR (Findable, Accessible, Interoperable and Reusable), and to ensure that open data abide by the 5-star deployment scheme suggested by World Wide Web inventor Tim Berners-Lee, which aims to make them findable, free and structured. Many researchers are now committed to depositing data in free and open repositories with appropriate metadata.
Chaos around units undermines these efforts. Already, many scientists invest more time in wrangling data than doing research. When data are not interoperable or machine readable, researchers’ individual informatics approaches are thwarted. The benefits of data sharing shrink.
Unless we take steps to ensure that measurement units are routinely documented for easy, unambiguous exchange of data, information will be unusable or, worse, be misinterpreted. All global challenges, from pandemics to climate change, require high-quality data across multidisciplinary, international sources. Mistakes and lost opportunities will cost humanity much more than hundreds of millions of dollars for a single crashed spacecraft….”
“Making it easy for researchers to publish their articles open access is not just a question of eliminating—or significantly lowering—the financial obstacles of APCs. Library and publisher processes, workflows, and communication streams are still deeply rooted in the old logic of accessing and producing content behind subscription paywalls.
In order for “open” to become the default in scholarly communication, these subscription-based systems will have to be reengineered so that open access is the norm and not the exception. Transformative agreements provide a framework for both libraries and publishers to initiate this process of transformation and effectively bring open access to researchers wherever they choose to publish.
To illustrate how, the next session of the ESAC Community of Practice will explore how libraries, library consortia and publishers are using transformative agreements to adapt their systems and prepare their organizations for open access in research communication on a large scale.
Focusing on the first point of contact with authors at the start of the scholarly publishing cycle, our guest speakers will share how they are reorienting their communication and engagement strategies with authors, adjusting mechanisms behind article submission process, and more….”
“This system has gone through several revisions. Below are links to the current version, previous versions, and related files.
In brief, journal publishers earn points in this scoring system by engaging in practices that demonstrate partnership with libraries, educators, and researchers. Library Partnership (LP) certification is calculated using a method similar to the U.S. Green Building Council’s LEED (Leadership in Energy and Environmental Design) certification for architectural and building projects. In LEED certification, architectural projects “earn points for various green building strategies across several categories. Based on the number of points achieved, a project earns one of four LEED rating levels: Certified, Silver, Gold or Platinum” (https://www.usgbc.org/). Where LEED certification assesses a building project’s practices in “credit categories” such as water efficiency or indoor air quality, LP certification assesses a publisher’s practices in four categories: Access, Rights, Community, and Discoverability.
A publisher’s partnership score reflects an overall achievement of credits. This score places them in one of four levels or tiers, Tier 1 (highest partnership practices) through Tier 4 (lowest partnership practices)….”
“The MIT Ad Hoc Task Force on Open Access to MIT’s Research, chaired by Class of 1922 Professor of Electrical Engineering and Computer Science Hal Abelson and Director of Libraries Chris Bourg, will lead an Institute-wide discussion of ways in which current MIT open access policies and practices might be updated or revised to further the Institute’s mission of disseminating the fruits of its research and scholarship as widely as possible.”
“In March 2009, MIT faculty passed one of the country’s first open access policies; the policy covers their scholarly articles by default.
As of April 2017, all MIT authors, including students, postdocs, and staff, can “opt-in” to an open access license. See below for information on how to deposit a paper, get download statistics on your papers, or opt out of the policy. Authors covered by the MIT faculty open access policy do not need to sign this license.
MIT faculty OA policy
Text of the 2009 faculty open access policy, as well as definitions of terms that appear in the policy.
MIT authors’ opt-in OA license
Information and FAQs on MIT’s opt-in open access license. Sign the license.
FAQ on MIT’s faculty OA policy
Opt-out of MIT’s OA policies
Automated form to waive the faculty OA policy or authors’ opt-in license for a specific paper. Email email@example.com for more information.
Reader comments on OA articles
This beta site shows what readers around the globe are saying about MIT’s OA policy.
Open access publishing support
Find support for open access publishing, including the OA fund. …”
“To advance open educational practices (OEP), MERLOT and SkillsCommons have designed this OEP portal to enable easy discovery and sharing of free, open, and exemplary collections of ePortfolios along with tools, templates, and guidelines that showcase open practices in higher education….”
The transition to an open science system affects the entire research process. The reward systems also need to be adjusted in order to support and mirror the open research landscape, but what will this work look like, and what will change? We met Gustav Nilsonne, chair of the European working group dealing with the issue and a participant in the SUHF working group on merit reviews.
Advocates for open access argue that people need scientific information, although they have lacked evidence for this claim. Using Google’s recently developed deep learning natural language processing model, which offers unrivalled comprehension of subtle differences in meaning, 1.6 million people downloading National Academies reports were classified not just into broad categories such as researchers and teachers, but also into precisely delineated small groups such as hospital chaplains, veterans, and science fiction authors. The results reveal adults motivated to seek out the most credible sources, engage with challenging material, use it to improve the services they provide, and learn more about the world they live in. The picture contrasts starkly with the dominant narrative of a misinformed and manipulated public targeted by social media.
In seeking to understand how to protect the public information sphere from corruption, researchers understandably focus on dysfunction. However, parts of the public information ecosystem function very well, and understanding this as well will help in protecting and developing existing strengths. Here, we address this gap, focusing on public engagement with high-quality science-based information: consensus reports of the National Academies of Sciences, Engineering, and Medicine (NASEM). Attending to public use is important to justify public investment in producing and making freely available high-quality, scientifically based reports. We deploy Bidirectional Encoder Representations from Transformers (BERT), a high-performing, supervised machine learning model, to classify 1.6 million comments left by US downloaders of National Academies reports responding to a prompt asking how they intended to use the report. The results provide detailed, nationwide evidence of how the public uses open access scientifically based information. We find half of reported use to be academic—research, teaching, or studying. The other half reveals adults across the country seeking the highest-quality information to improve how they do their job, to help family members, to satisfy their curiosity, and to learn. Our results establish the existence of demand for high-quality information by the public and that such knowledge is widely deployed to improve provision of services. Knowing the importance of such information, policy makers can be encouraged to protect it.
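The task described above, mapping free-text "how will you use this report?" responses to use categories, can be caricatured with a trivial rule-based classifier. This is only a stand-in to show the input/output shape of the task: the study itself fine-tuned a supervised BERT model, which learns such distinctions from labeled examples rather than from hand-written keyword rules, and all categories and comments below are invented for illustration.

```python
# Toy stand-in for the study's supervised classifier: map free-text
# downloader comments to coarse use categories via keyword rules.
# (The actual study used a fine-tuned BERT model; this only sketches
# the shape of the classification task.)
CATEGORY_KEYWORDS = {
    "research": ["research", "study", "cite", "paper"],
    "teaching": ["teach", "course", "class", "students"],
    "professional": ["job", "work", "practice", "clients"],
    "personal": ["curious", "interest", "learn", "family"],
}

def classify_use(comment: str) -> str:
    """Return the first category whose keywords appear in the comment."""
    text = comment.lower()
    for category, keywords in CATEGORY_KEYWORDS.items():
        if any(word in text for word in keywords):
            return category
    return "other"

comments = [
    "I will cite this in my dissertation research.",   # academic use
    "Assigned reading for my undergraduate class.",    # teaching use
    "Helps me counsel clients at the hospital.",       # professional use
    "Just curious about climate science.",             # personal use
]
for c in comments:
    print(f"{classify_use(c):>12}: {c}")
```

A real system would replace `classify_use` with a transformer fine-tuned on human-labeled comments, which is what makes fine-grained groups like "hospital chaplains" recoverable from subtle wording.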
Abstract: Questionable research practices (QRPs) among researchers have been a source of concern in many fields of study. QRPs are often used to enhance the probability of achieving statistical significance, which affects the likelihood of a paper being published. Using a sample of researchers from 10 top research-productive management programs, we compared hypotheses tested in dissertations to those tested in journal articles derived from those dissertations to draw inferences concerning the extent of engagement in QRPs. Results indicated that QRPs related to changes in sample size and covariates were associated with unsupported dissertation hypotheses becoming supported in journal articles. Researchers also tended to exclude unsupported dissertation hypotheses from journal articles. Likewise, results suggested that many article hypotheses may have been created after the results were known (i.e., HARKed). Articles from prestigious journals contained a higher percentage of potentially HARKed hypotheses than those from less well-regarded journals. Finally, articles published in prestigious journals were associated with more QRP usage than those in less prestigious journals. QRPs increase the percentage of supported hypotheses and result in effect sizes that likely overestimate population parameters. As such, results reported in articles published in our most prestigious journals may be less credible than previously believed.