Why price transparency in research publishing is a positive step | Hindawi

“In 2019, Hindawi took part in the price transparency framework pilot run by Information Power on behalf of cOAlition S. Three years later, the coalition’s new Journal Comparison Service (JCS) is up and running. Hindawi is proud to be one of the publishers that has contributed data to this service. Taking part has helped us focus on the rigour of our own reporting system and has enabled us to give researchers greater choice of journal by giving more visibility to our services in our new, publicly available journal reports.

Only a few publishers took part in the pilot and the framework remains untested. It’s not yet clear how useful the JCS will be to the institutions that might want to access the service and use the data, or how the JCS will increase transparency about costs as well as pricing across the publishing industry more generally. In part, this is because it’s seen by some to provide an overly simplistic view of publishing. Compartmentalising publishing services into seven or eight different categories (see page 20 of the JCS guidance for publishers) inevitably constrains the many different and often overlapping services that publishers provide. In addition, limiting the price breakdown of these services to the percentage that each contributes to a journal’s APC also means that the real costs aren’t visible. There are also pragmatic reasons that make it very difficult for some publishers to collect data consistently, especially for those with large portfolios that operate on multiple platforms or have journal-specific workflows. Finally, fully open-access publishers who don’t have an APC business model can’t take part, even if they want to be more transparent. However, we believe the upsides are large. Hindawi has more than 200 journals in its portfolio, and the following outlines a few of the ways that we, and we hope those who contribute to and access our journals, are benefiting. Our focus is on the ‘Information Power’ framework for the JCS and on the ‘Journal Quality’ information specifically (columns P-Z in the template spreadsheet). This information relates to data on the journal workflow, especially peer review (such as timings and the number of reviewers involved). We know that there is a long way to go to make all publishing services transparent, but we are learning from our participation in the JCS and will continue to explore ways to improve transparency….”
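To make that compartmentalisation concrete, here is a minimal sketch of the percentage-only price breakdown the framework mandates. The service categories, the APC figure, and the percentages are all hypothetical illustrations, not values from the actual JCS template.

```python
# Illustrative sketch of a JCS-style price breakdown: each publishing
# service is reported only as a percentage of the APC, so the absolute
# costs stay invisible. Category names and numbers are hypothetical.

APC_USD = 2000  # hypothetical list-price APC

service_share = {  # percentage of the APC attributed to each service
    "peer review management": 30,
    "production (copyediting, typesetting)": 20,
    "platform hosting and distribution": 15,
    "marketing and communication": 10,
    "customer service and author support": 10,
    "administration": 10,
    "margin/surplus": 5,
}

assert sum(service_share.values()) == 100  # shares must account for the whole APC

for service, pct in service_share.items():
    # The JCS reports only pct; the implied dollar figure below is exactly
    # the "real cost" information the excerpt says remains invisible.
    print(f"{service}: {pct}% (implied ${APC_USD * pct / 100:.0f})")
```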

Elsevier absent from journal cost comparison | Times Higher Education (THE)

“Of the 2,070 titles whose information will become accessible under the JCS, although not directly to researchers, 1,000 belong to the US academic publishing giant Wiley, while another 219 journals owned by Hindawi, which was bought by Wiley last year, also appear on the list.

Several other fully open access publishers will also participate on the comparison site, including Plos, the Open Library of Humanities, and F1000, while learned society presses and university publishers, including the Royal Society, Rockefeller University Press, and the International Union of Crystallography, are also part of the scheme.

Other notable participants include the prestigious life sciences publisher eLife, EMBO Press, and the rapidly growing open access publisher Frontiers.

However, two of the world’s largest scholarly publishers – Elsevier and Springer Nature, whose most prestigious titles charge APCs of about £8,000 – are not part of the scheme….

Under the Plan S agreement, scholarly journals are obliged to become ‘transformative journals’ and gradually increase the proportion of non-paywalled content over a number of years. Those titles that do not make their papers free at the point of publication will drop out of the Plan S scheme, meaning authors cannot use funds provided by any of the 17 funding agencies and six foundations now signed up to Plan S. There are, however, no immediate consequences for publishers who decide not to share their price and service data through the JCS. …”
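The compliance rule described here is mechanical: a transformative journal stays in the scheme only if its open access share keeps growing year on year. Below is a minimal sketch of such a check; the 5-percentage-point absolute and 15% relative thresholds are illustrative assumptions on our part, with the binding targets set out in cOAlition S guidance.

```python
# Minimal sketch of the transformative-journal rule described above.
# The growth thresholds are illustrative assumptions, not quoted from
# the article or from cOAlition S guidance.

def still_transformative(oa_share_last_year: float,
                         oa_share_this_year: float,
                         min_absolute_growth: float = 0.05,   # +5 percentage points
                         min_relative_growth: float = 0.15) -> bool:
    """True if the journal's OA share grew enough to stay in the scheme."""
    absolute_ok = oa_share_this_year - oa_share_last_year >= min_absolute_growth
    relative_ok = oa_share_this_year >= oa_share_last_year * (1 + min_relative_growth)
    return absolute_ok and relative_ok

print(still_transformative(0.20, 0.27))  # True: +7 points, +35% relative
print(still_transformative(0.20, 0.22))  # False: +2 points falls short
```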

MDPI Journals: 2015–2021 | Dan Brockington

“In this blog I report on the growth of MDPI journals and papers from 2015 to 2021. It updates previous blogs on the same topic (the most recent is here) that looked at growth up to 2020….

By every measure MDPI’s growth continues to be remarkable. The rate of revenue increase has slowed in the last two years, to just over 50%, but even that remains extraordinary. Note that the proportion of submissions that are published has increased, from around 44% two years ago to over 55% currently (Table 1; Figure 1)….
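The published proportion quoted above is simple arithmetic over submission and publication counts. A minimal sketch, with hypothetical counts chosen only to reproduce the quoted proportions:

```python
# Hypothetical counts; only the derived proportions (~44% then, >55% now)
# mirror the figures quoted in the excerpt.

def acceptance_rate(published: int, submitted: int) -> float:
    """Proportion of submitted manuscripts that end up published."""
    return published / submitted

two_years_ago = acceptance_rate(published=88_000, submitted=200_000)
current = acceptance_rate(published=166_500, submitted=300_000)

print(f"two years ago: {two_years_ago:.0%}")  # -> 44%
print(f"currently:     {current:.0%}")        # -> 56%
```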

The growth in publications is partly sustained by lower rejection rates. The journals with the lowest rejection rates used to account for only a minority of publications and fees (Tables 2-4). Now figures for 2021 show that journals with low rejection rates are producing a higher proportion of MDPI publications….

MDPI itself has been aware of the dangers of being too inclusive. In its 2015 annual report it noted that the overall rejection rate had increased since the previous year (from 52% to 54%). This achievement was listed among its key performance indicators as a sign of progress….

Because acceptance and rejection data are no longer available on the MDPI website, we will not know what is happening to rejection rates. We cannot know, at the level of each journal, how inclusive they are or are becoming. This points to a wider need for all publishing houses to be more transparent with their journals’ data, to allow researchers to make informed choices about those journals. MDPI’s transparency had been welcome. Unfortunately, it is now following the standards set by the other publishing houses….”

Transparent peer review for all | Nature Communications

“Since 2016, we have offered authors the option to publish the comments received from the reviewers, together with their responses, alongside the paper. As we believe that transparency strengthens the quality of peer review, we are now moving to publish the exchanges between authors and reviewers for all research articles submitted from November 2022 onward. Referees will still have the option to remain completely anonymous, to sign their reports, and/or to choose to be acknowledged by name as part of our reviewer recognition scheme….”

OA = Funders and Lobbyists | Oct 10, 2022

“Do OA and open science represent a set of aligned interests being pushed by the rich and powerful — politicians, funders, lobbyists, and larger commercial operators — to allow for techno-utopian political posturing while they double-dip on their already-plentiful societal advantages and increase the odds that their current advantages grow?

However you answer this very leading question, it’s increasingly clear that policies are not being implemented transparently and openly, but rather via a hidden web of relationships, deals, and coordination — from Plan S to OSTP.

More and more information is pointing to a gradual, purposeful, and internecine takeover of publishing, not to make it more author-centric, but to make it more funder-centric. The relationships among funders, governments, and oligarchs are often blurry, with lobbyists an indicator that some kind of alignment is in the works.

A recent paper in Science and Public Policy about inadequate transparency in the EU’s approach to creating its influential open science policy discusses the role of lobbyists in the paradigm shift from “science 2.0” to “open science” as policies were formulated in Brussels and elsewhere. This was a meaningful shift. Both phrases are vague, but the first is more commonly understood as connoting a digital future based on existing norms. The latter injects a new set of untested norms, with the authors worrying that: ‘. . . successful projects of openness tend to be exploited on the one hand by powerful commercial actors and, on the other hand, by non-serious or even criminal actors, sometimes working in a grey area.’

Given the trail of influence SPARC and ORFG have left in the US through the NLM and the OSTP — in addition to the National Academies of Sciences, Engineering, and Medicine (NASEM) — and their efforts to obscure relationships, roles, and ties to the registered lobbying firm (New Venture Fund [NVF]) that is their fiscal sponsor, some statements in the paper hit familiar notes when it comes to lobbyists on this side of the Atlantic:…

Given the power dynamics — with subscription-based journals creating strong filters at the headwaters of various scientific communities, often leading to funded projects being unpublished or published in lesser journals than their funders imagined — it’s little wonder funders changed lanes, entering publishing in order to gain further influence, lower barriers, and put their interests at the headwaters. “Publishers being co-opted by funders” now seems to be the unspoken intent of OA and open science.”

https://web.archive.org/web/20221010111724/https://www.the-geyser.com/oa-and-its-lobbyists/

Anti-transparency within the EU shift to open science | Science and Public Policy | Oxford Academic

Abstract: In 2014, the European Commission initiated a process to strengthen science 2.0 as a core research policy concept. However, this turned into a substantial ideational shift. The concept of science 2.0 was dropped. Instead, open science became established as one of the three pillars of the €94 billion research framework programme Horizon Europe. This article scrutinises the official narrative regarding the shift of concepts, identifying transparency issues, specifically misrepresentation of concepts and data, and the redaction of key material. This can be characterised as problems of input legitimacy. A public consultation did take place, but numerous transparency issues can be found. From science 2.0 to open science, the ideational shift was portrayed as simply a matter of exchanging two synonymous concepts. However, science 2.0 is a descriptive concept referring to science being transformed by digitalisation. In contrast, open science involves normative assumptions about how science should work and be governed.

Principles of Transparency and Best Practice in Scholarly Publishing – OASPA

“The Committee on Publication Ethics (COPE), the Directory of Open Access Journals (DOAJ), the Open Access Scholarly Publishers Association (OASPA), and the World Association of Medical Editors (WAME) are scholarly organisations that have seen an increase in the number, and broad range in the quality, of membership applications. Our organisations have collaborated to identify principles of transparency and best practice for scholarly publications and to clarify that these principles form the basis of the criteria by which suitability for membership is assessed by COPE, DOAJ and OASPA, and part of the criteria on which membership applications are evaluated by WAME. Each organisation also has its own additional criteria which are used when evaluating applications. The organisations will not share lists of publishers or journals that failed to demonstrate that they met the criteria for transparency and best practice.

This is the third version of a work in progress (published January 2018); the first version was made available by OASPA in December 2013 and a second version in June 2015. We encourage its wide dissemination and continue to welcome feedback on the general principles and the specific criteria. Background on the organisations is below….”

Revised principles of transparency and best practice released | OASPA

A revised version of the Principles of Transparency and Best Practice in Scholarly Publishing has been released by four key scholarly publishing organizations today. These guiding principles are intended as a foundation for best practice in scholarly publishing to help existing and new journals reach the best possible standards. 

The fourth edition of the Principles represents a collective effort by the four organizations to align the principles with today’s scholarly publishing landscape, which has changed considerably since the last update in 2018. Guidance is provided on the information that should be made available on websites, peer review, access, author fees and publication ethics. The principles also cover ownership and management, copyright and licensing, and editorial policies. They stress the need for inclusivity in scholarly publishing and emphasize that editorial decisions should be based on merit and not affected by factors such as the origins of the manuscript and the nationality, political beliefs or religion of the author.

Reporting and transparent research practices in sports medicine and orthopaedic clinical trials: a meta-research study | BMJ Open

Abstract: Objectives: Transparent reporting of clinical trials is essential to assess the risk of bias and translate research findings into clinical practice. While existing studies have shown that deficiencies are common, detailed empirical and field-specific data are scarce. Therefore, this study aimed to examine current clinical trial reporting and transparent research practices in sports medicine and orthopaedics.

Setting: Exploratory meta-research study on reporting quality and transparent research practices in orthopaedics and sports medicine clinical trials.

Participants: The sample included clinical trials published in the top 25% of sports medicine and orthopaedics journals over 9 months.

Primary and secondary outcome measures: Two independent reviewers assessed pre-registration, open data, and criteria related to scientific rigour, like randomisation, blinding, and sample size calculations, as well as the study sample and data analysis.

Results: The sample included 163 clinical trials from 27 journals. While the majority of trials mentioned rigour criteria, essential details were often missing. Sixty per cent (95% confidence interval (CI) 53% to 68%) of trials reported sample size calculations, but only 32% (95% CI 25% to 39%) justified the expected effect size. Few trials indicated the blinding status of all main stakeholders (4%; 95% CI 1% to 7%). Only 18% (95% CI 12% to 24%) included information on randomisation type, method, and concealed allocation. Most trials reported participants’ sex/gender (95%; 95% CI 92% to 98%) and information on inclusion and exclusion criteria (78%; 95% CI 72% to 84%). Only 20% (95% CI 14% to 26%) of trials were pre-registered. No trials deposited data in open repositories.

Conclusions: These results will aid the sports medicine and orthopaedics community in developing tailored interventions to improve reporting. While authors typically mention blinding, randomisation, and other factors, essential details are often missing. Greater acceptance of open science practices, like pre-registration and open data, is needed. As these practices have been widely encouraged, we discuss systemic interventions that may improve clinical trial reporting.
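The confidence intervals quoted in the Results can be reproduced with a standard normal-approximation (Wald) interval for a proportion; whether the authors used exactly this method is our assumption, though alternatives such as the Wilson interval give near-identical values at n = 163. A minimal sketch using the implied count of 98/163 (about 60%) for sample size calculations:

```python
# Minimal sketch: 95% Wald interval for a reported proportion. The count
# 98/163 is implied by "60% of 163 trials"; the paper's exact interval
# method is an assumption on our part.
import math

def wald_ci(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """95% normal-approximation CI for a binomial proportion."""
    p = successes / n
    half_width = z * math.sqrt(p * (1 - p) / n)
    return p - half_width, p + half_width

low, high = wald_ci(successes=98, n=163)
print(f"sample size calculations: 95% CI {low:.0%} to {high:.0%}")
# -> 95% CI 53% to 68%, matching the abstract
```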

Journal transparency – the new Journal Comparison Service from PlanS | Maverick Publishing Specialists

“At a recent STM Association webinar, Robert Kiley, Head of Open Research at the Wellcome Trust, presented an informative overview of the new Journal Comparison Service from PlanS. He stated that the goal of this new tool is to meet the needs of the research community, who “have called for greater transparency regarding the services publishers provide and the fees they charge. Many publishers are willing to be responsive to this need, but until now there was no standardised or secure way for publishers to share this information with their customers.” Publishers of scholarly journals are invited to upload data on their journals – one data set for each journal. The cOAlition S Publisher’s Guide points out that the data is all information that publishers already have in some form, and it will need to be uploaded each year, covering the previous year.

There are two versions of data that can be supplied, and I took a look at the version developed by Information Power (see https://www.coalition-s.org/journal-comparison-service-resources-publishers/ for the details and an FAQ). There are 34 fields, including basic journal identifiers plus additional information in three broad categories: prices (APC data; subscription prices plus discount policies); editorial data (acceptance rates, peer review times, COUNTER 5 data); and costs (price and service information)….
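For illustration, a per-journal record along these lines might be modelled as follows; the field names are hypothetical stand-ins grouped by the three broad categories the post lists, not the actual 34 column names in the Information Power template.

```python
# Hypothetical shape of a per-journal JCS record; field names are
# illustrative stand-ins, not the template's actual 34 columns.
from dataclasses import dataclass

@dataclass
class JcsJournalRecord:
    # basic identifiers
    journal_title: str
    issn: str
    reporting_year: int
    # prices
    apc_list_price_usd: float
    subscription_price_usd: float | None   # None for fully OA titles
    # editorial data
    acceptance_rate: float                 # accepted / submitted
    median_days_submission_to_decision: int
    # costs: price and service information, as % of APC per category
    service_cost_shares: dict[str, float]

record = JcsJournalRecord(
    journal_title="Example Journal of Openness",  # hypothetical journal
    issn="0000-0000",                             # placeholder ISSN
    reporting_year=2021,
    apc_list_price_usd=1800.0,
    subscription_price_usd=None,
    acceptance_rate=0.42,
    median_days_submission_to_decision=95,
    service_cost_shares={"peer review": 35.0, "production": 25.0, "other": 40.0},
)
```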

As a former publisher of a portfolio of journals, I know that allocating these kinds of costs back to a specific journal is at best a guesstimate and very unlikely to be accurate or comparable.

The webinar included a contribution from Rod Cookson, CEO of International Water Association (IWA) Publishing. Rod has been an advocate for transparency and helped to create the toolkit for publishers who want to negotiate transformative agreements (https://www.alpsp.org/OA-agreements). Rod reported that it had taken six people two to three months to gather the data to complete the 34 fields in the comparison tool. IWA Publishing publishes 14 journals….”

OSF Preprints | Research funders should be more transparent: a plea for open applications

Abstract: Transparency is increasingly becoming the new norm and modus operandi of the global research enterprise. In this mini-review we summarise ongoing initiatives to increase transparency in science, and in funding in particular. Based on this, we plead for a next step in funders’ compliance with the principles of Open Science, suggesting the adoption of open applications. Our proposed model calls for the publication of all submitted grant applications; open sharing of review reports, argumentations for funding decisions, and project evaluation reports; and the disclosure of reviewers’ and decision committee members’ identities.

Rigor and Transparency Index: Large Scale Analysis of Scientific Reporting Quality

“JMIR Publications recently published “Establishing Institutional Scores With the Rigor and Transparency Index: Large-scale Analysis of Scientific Reporting Quality” in the Journal of Medical Internet Research (JMIR), which reported that improving rigor and transparency measures should lead to improvements in reproducibility across the scientific literature, but that assessing measures of transparency tends to be very difficult if performed manually by reviewers.

The overall aim of this study is to establish a scientific reporting quality metric that can be used across institutions and countries, as well as to highlight the need for high-quality reporting to ensure replicability within biomedicine, making use of manuscripts from the Reproducibility Project: Cancer Biology.

The authors describe an enhancement of the previously introduced Rigor and Transparency Index (RTI), which attempts to automatically assess the rigor and transparency of journals, institutions, and countries using manuscripts scored on criteria found in reproducibility guidelines (eg, NIH, MDAR, ARRIVE).

Using work by the Reproducibility Project: Cancer Biology, the authors determined that replication studies scored significantly higher than the original papers, which, according to the project, all required additional information from authors before replication efforts could begin.

Unfortunately, RTI measures for journals, institutions, and countries all currently score lower than the replication-study average. If the RTI of these replication studies is taken as a target for future manuscripts, more work will be needed to ensure the average manuscript contains sufficient information for replication attempts….”
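As a toy illustration of the indexing idea, a manuscript can be scored as the fraction of rigour criteria detected in its text. The criteria list and the keyword matching below are simplifications of ours; the actual RTI is built on trained text-mining models, not substring search.

```python
# Toy sketch of an RTI-style score: the fraction of rigor criteria with
# at least one keyword hit. Criteria and keywords are simplified stand-ins.

CRITERIA_KEYWORDS = {
    "randomisation": ("randomised", "randomized", "randomly assigned"),
    "blinding": ("blinded", "masked"),
    "sample size calculation": ("sample size", "power analysis"),
    "ethics approval": ("ethics committee", "irb"),
}

def rigor_score(methods_text: str) -> float:
    """Fraction of criteria detected in the manuscript's methods text."""
    text = methods_text.lower()
    hits = sum(
        any(keyword in text for keyword in keywords)
        for keywords in CRITERIA_KEYWORDS.values()
    )
    return hits / len(CRITERIA_KEYWORDS)

methods = "Participants were randomly assigned; outcome assessors were blinded."
print(f"RTI-style score: {rigor_score(methods):.2f}")  # -> 0.50 (2 of 4 criteria)
```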

Advances in transparency and reproducibility in the social sciences – ScienceDirect

Abstract: Worries about a “credibility crisis” besieging science have ignited interest in research transparency and reproducibility as ways of restoring trust in published research. For quantitative social science, advances in transparency and reproducibility can be seen as a set of developments whose trajectory predates the recent alarm. We discuss several of these developments, including preregistration, data-sharing, formal infrastructure in the form of resources and policies, open access to research, and specificity regarding research contributions. We also discuss the spillovers of this predominantly quantitative effort towards transparency for qualitative research. We conclude by emphasizing the importance of mutual accountability for effective science, the essential role of openness for this accountability, and the importance of scholarly inclusiveness in figuring out the best ways for openness to be accomplished in practice.

Taking an open science approach to publishing | Hindawi

“We are delighted to launch Hindawi’s journal reports today. These reports, developed with the help of DataSalon, showcase a range of journal metrics about the different publishing services we provide for our journals. By exposing more detailed data on our workflows – from submission through peer review to publication and beyond – we are giving researchers, partners, and funders a clearer view of what’s under the ‘journal hood’. We are also raising greater awareness of less talked-about services, such as how we are helping to make the publication process more equitable and published articles more accessible and discoverable.

This is the first phase of our journal reports; detailed metrics are available by following the “see full report” link from each journal’s main page. The reports give greater insight into acceptance rates and decision times, as well as the median time in peer review and the median number of reviews per article. Alongside traditional metrics, such as citations and article views, the reports also display maps of the geographic distribution of authors, editors, and reviewers.
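The medians these reports surface are straightforward to derive from per-article workflow records. A minimal sketch with hypothetical records (not Hindawi's implementation):

```python
# Hypothetical per-article workflow records and the medians derived
# from them; not Hindawi's implementation.
from statistics import median

articles = [
    {"days_in_review": 38, "reviews": 2},
    {"days_in_review": 55, "reviews": 3},
    {"days_in_review": 21, "reviews": 2},
    {"days_in_review": 47, "reviews": 2},
]

median_days = median(a["days_in_review"] for a in articles)
median_reviews = median(a["reviews"] for a in articles)
print(f"median time in peer review: {median_days} days")  # -> 42.5 days
print(f"median reviews per article: {median_reviews}")    # -> 2.0
```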

The final section demonstrates how we make articles more accessible and discoverable. It takes advantage of data from Crossref’s participation reports, which we extracted from Crossref’s open API. The section includes the percentage of articles in the journal that are open access (i.e. 100%), and the proportion of corresponding authors with an ORCID iD. It also shows the extent to which abstracts and citations are open. Hindawi supports the Initiative for Open Citations (I4OC) and we are also a founding organisation of the Initiative for Open Abstracts (I4OA). Because our metadata is machine readable and openly available, the articles we publish are more discoverable than those of publishers who don’t make this information openly available. The infrastructure for Open Access is also a key building block of Open Science….”
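The Crossref data mentioned here is openly retrievable. Below is a minimal sketch (not Hindawi's actual code) of pulling participation-report-style coverage numbers from the Crossref REST API; the ISSN is a placeholder to replace, while has-orcid and has-abstract are real filters on the /works route.

```python
# Minimal sketch of querying the open Crossref REST API for the coverage
# figures shown in participation reports. Replace the placeholder ISSN
# with a real one before running.
import requests

API = "https://api.crossref.org/journals/{issn}/works"
ISSN = "0000-0000"  # placeholder ISSN

def total_works(issn: str, filters: dict[str, str] | None = None) -> int:
    """Total works matching the given filters (rows=0 returns counts only)."""
    params: dict[str, object] = {"rows": 0}
    if filters:
        params["filter"] = ",".join(f"{k}:{v}" for k, v in filters.items())
    resp = requests.get(API.format(issn=issn), params=params, timeout=30)
    resp.raise_for_status()
    return resp.json()["message"]["total-results"]

all_works = total_works(ISSN)
with_orcid = total_works(ISSN, {"has-orcid": "true"})
with_abstract = total_works(ISSN, {"has-abstract": "true"})

print(f"ORCID coverage:    {with_orcid / all_works:.0%}")
print(f"abstract coverage: {with_abstract / all_works:.0%}")
```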