Reporting and transparent research practices in sports medicine and orthopaedic clinical trials: a meta-research study | BMJ Open

Abstract:  Objectives Transparent reporting of clinical trials is essential to assess the risk of bias and translate research findings into clinical practice. While existing studies have shown that deficiencies are common, detailed empirical and field-specific data are scarce. Therefore, this study aimed to examine current clinical trial reporting and transparent research practices in sports medicine and orthopaedics.

Setting Exploratory meta-research study on reporting quality and transparent research practices in orthopaedics and sports medicine clinical trials.

Participants The sample included clinical trials published in the top 25% of sports medicine and orthopaedics journals over 9 months.

Primary and secondary outcome measures Two independent reviewers assessed pre-registration, open data and criteria related to scientific rigour, like randomisation, blinding, and sample size calculations, as well as the study sample and data analysis.

Results The sample included 163 clinical trials from 27 journals. While the majority of trials mentioned rigour criteria, essential details were often missing. Sixty per cent (95% confidence interval (CI) 53% to 68%) of trials reported sample size calculations, but only 32% (95% CI 25% to 39%) justified the expected effect size. Few trials indicated the blinding status of all main stakeholders (4%; 95% CI 1% to 7%). Only 18% (95% CI 12% to 24%) included information on randomisation type, method and concealed allocation. Most trials reported participants’ sex/gender (95%; 95% CI 92% to 98%) and information on inclusion and exclusion criteria (78%; 95% CI 72% to 84%). Only 20% (95% CI 14% to 26%) of trials were pre-registered. No trials deposited data in open repositories.

Conclusions These results will aid the sports medicine and orthopaedics community in developing tailored interventions to improve reporting. While authors typically mention blinding, randomisation and other factors, essential details are often missing. Greater acceptance of open science practices, like pre-registration and open data, is needed. As these practices have been widely encouraged, we discuss systemic interventions that may improve clinical trial reporting.
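The percentages in the Results paragraph carry 95% confidence intervals. As a minimal sketch, assuming simple normal-approximation (Wald) intervals, which the abstract does not actually specify, the interval for the 60% sample-size-calculation figure can be roughly reproduced from the reported numbers:

```python
from math import sqrt

# Rough check of a reported proportion CI, assuming a normal-approximation
# (Wald) interval; the abstract does not state which method was used.
n = 163                      # trials in the sample
p = 0.60                     # 60% reported a sample size calculation
half_width = 1.96 * sqrt(p * (1 - p) / n)
print(f"95% CI: {p - half_width:.0%} to {p + half_width:.0%}")
# prints roughly "52% to 68%"; the small difference from the reported
# 53% to 68% is rounding (the exact count behind "60%" is not given).
```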

Journal transparency – the new Journal Comparison Service from PlanS | Maverick Publishing Specialists

“At a recent STM Association webinar, Robert Kiley, Head of Open Research at the Wellcome Trust, presented an informative overview of the new Journal Comparison Service from PlanS. He stated that the goal of this new tool is to meet the needs of the research community who “have called for greater transparency regarding the services publishers provide and the fees they charge. Many publishers are willing to be responsive to this need, but until now there was no standardised or secure way for publishers to share this information with their customers.” Publishers of scholarly journals are invited to upload data on their journals – one data set for each journal. The cOAlition S Publisher’s Guide  points out that the data is all information that publishers already have in some form, and it will need to be uploaded every year for the previous year.

There are two versions of data that can be supplied and I took a look at the version developed by Information Power (see https://www.coalition-s.org/journal-comparison-service-resources-publishers/ for the details and an FAQ). There are 34 fields, including basic journal identifiers plus additional information in three broad categories: prices (APC data; subscription prices plus discount policies); editorial data (acceptance rates, peer review times, Counter 5 data); and costs (price and service information)….

As a previous publisher of a portfolio of journals, I know that allocating these kinds of costs back to a specific journal is at best a guesstimate and very unlikely to be accurate and comparable.

The webinar included a contribution from Rod Cookson, CEO of International Water Association (IWA) Publishing. Rod has been an advocate for transparency and helped to create the tool kit for publishers who want to negotiate transformative agreements (https://www.alpsp.org/OA-agreements). Rod reported that it had taken 6 people 2-3 months to gather the data to complete the 34 fields in the comparison tool. IWA Publishing publishes 14 journals….”
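For a sense of what a per-journal submission to a service like this involves, here is a hypothetical record sketched in Python. The field names are illustrative assumptions grouped by the three broad categories described above; they do not reproduce the actual 34-field Information Power template.

```python
# Hypothetical sketch of a per-journal record for a comparison service of this
# kind. Field names are illustrative only and do NOT reproduce the actual
# 34-field Journal Comparison Service template.
journal_record = {
    # basic journal identifiers
    "journal_title": "Example Journal of Water Research",
    "issn": "0000-0000",
    "publisher": "Example Publishing",
    # prices: APC data, subscription prices, discount policies
    "list_apc_eur": 2500,
    "subscription_price_eur": 1800,
    "discount_policy": "APC waivers for authors in Research4Life countries",
    # editorial data: acceptance rates, peer review times, COUNTER 5 usage
    "acceptance_rate": 0.31,
    "median_days_submission_to_first_decision": 42,
    "counter5_total_item_requests": 120_000,
    # costs: price and service information
    "cost_per_article_editorial_eur": 900,
    "cost_per_article_production_eur": 400,
}
```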


OSF Preprints | Research funders should be more transparent: a plea for open applications

Abstract:  Transparency is increasingly becoming the new norm and modus operandi of the global research enterprise. In this mini-review we summarise ongoing initiatives to increase transparency in science and funding in particular. Based on this, we plead for a next step in funders’ compliance with the principles of Open Science, suggesting the adoption of open applications. Our proposed model includes a plea for the publication of all submitted grant applications; open sharing of review reports, argumentations for funding decisions, and project evaluation reports; and the disclosure of reviewers’ and decision committee members’ identities.


Rigor and Transparency Index: Large Scale Analysis of Scientific Reporting Quality

“JMIR Publications recently published “Establishing Institutional Scores With the Rigor and Transparency Index: Large-scale Analysis of Scientific Reporting Quality” in the Journal of Medical Internet Research (JMIR). The article reports that improving rigor and transparency measures should lead to improvements in reproducibility across the scientific literature, but that assessing measures of transparency tends to be very difficult if performed manually by reviewers.


The overall aim of this study is to establish a scientific reporting quality metric that can be used across institutions and countries, as well as to highlight the need for high-quality reporting to ensure replicability within biomedicine, making use of manuscripts from the Reproducibility Project: Cancer Biology.


The authors describe an enhancement of the previously introduced Rigor and Transparency Index (RTI), which attempts to automatically assess the rigor and transparency of journals, institutions, and countries using manuscripts scored on criteria found in reproducibility guidelines (eg, NIH, MDAR, ARRIVE).


Using work by the Reproducibility Project: Cancer Biology, the authors were able to determine that replication studies scored significantly higher than the original papers, which, according to the project, all required additional information from authors to begin replication efforts.


Unfortunately, RTI measures for journals, institutions, and countries all currently score lower than the replication study average. If the RTI of these replication studies is taken as a target for future manuscripts, more work will be needed to ensure that the average manuscript contains sufficient information for replication attempts….”
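The RTI rolls per-manuscript scores on rigor and transparency criteria up to journal-, institution-, or country-level averages. The sketch below is only a minimal illustration of that aggregation step, assuming made-up records and simple criterion counts; the real RTI pipeline uses automated scoring tools and a more detailed rubric.

```python
# Minimal sketch of an RTI-style aggregate: each manuscript is scored on a set
# of rigor/transparency criteria (e.g. blinding, randomisation, data
# availability reported), and scores are averaged per institution.
from statistics import mean

papers = [
    {"institution": "Univ A", "criteria_met": 6, "criteria_total": 10},
    {"institution": "Univ A", "criteria_met": 8, "criteria_total": 10},
    {"institution": "Univ B", "criteria_met": 4, "criteria_total": 10},
]

def rti_like_score(group):
    """Average fraction of criteria met across a group of manuscripts."""
    return mean(p["criteria_met"] / p["criteria_total"] for p in group)

for inst in sorted({p["institution"] for p in papers}):
    group = [p for p in papers if p["institution"] == inst]
    print(inst, round(rti_like_score(group), 2))
```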

Advances in transparency and reproducibility in the social sciences – ScienceDirect

Abstract:  Worries about a “credibility crisis” besieging science have ignited interest in research transparency and reproducibility as ways of restoring trust in published research. For quantitative social science, advances in transparency and reproducibility can be seen as a set of developments whose trajectory predates the recent alarm. We discuss several of these developments, including preregistration, data-sharing, formal infrastructure in the form of resources and policies, open access to research, and specificity regarding research contributions. We also discuss the spillovers of this predominantly quantitative effort towards transparency for qualitative research. We conclude by emphasizing the importance of mutual accountability for effective science, the essential role of openness for this accountability, and the importance of scholarly inclusiveness in figuring out the best ways for openness to be accomplished in practice.


Taking an open science approach to publishing | Hindawi

“We are delighted to launch Hindawi’s journal reports today. These reports, developed with the help of DataSalon, showcase a range of journal metrics about the different publishing services we provide for our journals. By exposing more detailed data on our workflows – from submission through peer review to publication and beyond – we are giving researchers, partners, and funders a clearer view of what’s under the ‘journal hood’. We are also raising greater awareness of less talked-about services, such as how we are helping to make the publication process more equitable and published articles more accessible and discoverable.

This is the first phase of our journal reports and detailed metrics are available by following the “see full report” link from the journal’s main page. In this first phase, our reports give greater insight into acceptance rates and decision times, but also the median time in peer review and the median number of reviews per article. Alongside traditional metrics, such as citations and article views, the reports also display maps of the geographic distribution of authors, editors, and reviewers.

The final section demonstrates how we make articles more accessible and discoverable. It takes advantage of data from Crossref’s participation reports, which we extracted from Crossref’s open API. The section includes the percentage of articles in the journal that are open access (i.e. 100%), and the proportion of corresponding authors with an ORCID ID. It also shows the extent to which abstracts and citations are open. Hindawi supports the initiative for open citations (I4OC) and we are also a founding organisation for the initiative for open abstracts (I4OA). Because our metadata is machine readable and openly available, the articles we publish are more discoverable than those of publishers who don’t make this information openly available. The infrastructure for Open Access is also a key building block of Open Science….”
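The participation-report data mentioned above sits behind Crossref’s public REST API. As a rough sketch of how similar percentages could be computed, the snippet below uses standard Crossref works filters; the member ID is a placeholder (look up the publisher’s real Crossref member ID), and counting deposited ORCIDs is only a proxy for the corresponding-author figure Hindawi reports.

```python
import requests

CROSSREF_WORKS = "https://api.crossref.org/works"
MEMBER_ID = "98"  # placeholder: replace with the publisher's Crossref member ID

def count(filters):
    """Return the number of Crossref works matching the given filters."""
    params = {"filter": ",".join(filters), "rows": 0}
    resp = requests.get(CROSSREF_WORKS, params=params, timeout=30)
    resp.raise_for_status()
    return resp.json()["message"]["total-results"]

total = count([f"member:{MEMBER_ID}"])
with_orcid = count([f"member:{MEMBER_ID}", "has-orcid:true"])
with_abstract = count([f"member:{MEMBER_ID}", "has-abstract:true"])

print(f"works with at least one ORCID deposited: {100 * with_orcid / total:.1f}%")
print(f"works with an openly available abstract: {100 * with_abstract / total:.1f}%")
```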

An open letter on open access: call for greater clarity and transparency of open access terms and conditions – The Publication Plan for everyone interested in medical writing, the development of medical publications, and publication planning

“On 1 March 2022, cOAlition S wrote to publishers as part of the Plan S initiative to drive open access publishing. Acknowledging recent progress made by publishers to increase open access to scientific publications, cOAlition S urged publishers to take further steps by making details of their open access policies and contracts more obvious for authors. The letter signed by Professor Johan Rooryck, Executive Director of cOAlition S, calls on journals to make the following information plainly available for authors at the point of submission:

the copyright licence that authors would need to sign before their manuscript’s publication
all costs associated with publishing the manuscript
whether the journal will re-direct the manuscript to another journal based on reasons other than editorial rejection.

While details on these policies can often be found on a journal’s own web site or via the publisher’s web site, cOAlition S states that it would be helpful for authors if this information were displayed:

prominently on the journal’s web site
as a part of the ‘Information for Authors’ section
at the start of the journal’s submission process….”

Financial transparency at EMBO Press – Features – EMBO

“The bottom line remains the same as two years ago: covering our basic publication costs would require raising APCs to just short of 9,000 euros per research article. Thus, a financially sustainable transition to a Gold OA model at all four EMBO Press journals would represent a challenge for many authors not supported by dedicated publication funds, effectively excluding them based on financial, and not scientific, criteria.   

The scientific community and its funders must decide if – or literally how much – they value high-quality selective journals, open-access, open science, and journalistic content. Through transparency, EMBO and EMBO Press want to contribute to grounding this debate in the financial realities of scientific publishing.”

We need a clean break from commercial publishers | Times Higher Education (THE)

“There are three reasons why academia’s relationships with for-profit publishers must finally be severed.

First, the peer review system is broken. In the old days, the most accomplished experts usually agreed to evaluate papers. Now, editors report sending 15 or more requests to find two warm bodies to offer an opinion.

Second, academics often can’t afford those high open access fees – especially faculty outside the sciences, the wealthier institutions and the developed world. This makes it more likely that journals will fill their pages with papers by authors who have money, as opposed to authors who have good ideas. Pay to play is simply the wrong model for academia.

Third, publishers have resisted repeated attempts to make their contracts with universities more transparent. A 2014 analysis showed that the University of Michigan, Ann Arbor paid Elsevier $2.16 million (£1.77 million) for the exact same package of journals sold to the University of Wisconsin, Madison for $1.22 million. Yale, with about 12,500 students, paid Springer $711,564 for the same package that the University of Texas, Austin, with more than 50,000 students, purchased for $481,932….”
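Taking the student numbers quoted above at face value, a back-of-the-envelope per-student comparison (not a figure from the cited analysis) makes the disparity concrete:

```python
# Per-student cost of the same Springer package, using only the figures
# quoted in the article above; purely illustrative arithmetic.
yale_per_student = 711_564 / 12_500        # ~ $56.9 per student
ut_austin_per_student = 481_932 / 50_000   # ~ $9.6 per student
print(f"Yale pays roughly {yale_per_student / ut_austin_per_student:.1f}x more per student")  # ~5.9x
```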

4 ways to increase peer review transparency to foster greater trust in the process

“Putting research questions and methods before findings…

Employing more open peer review practices…

Developing shared peer-review standards and taxonomies…

Facilitating the sharing of review reports across journals….”

EDP Sciences – Subscribe-to-Open 2022 Transparency Report for maths journals provides new metrics

“We are pleased to share the Subscribe-to-Open (S2O) 2022 Transparency Report. The annual report details costs and prices related to the EDP Sciences-SMAI Subscribe-to-Open program for the applied mathematics journals they co-publish.

As staunch advocates of open science, both EDP Sciences and the Société de Mathématiques Industrielles et Appliquées (SMAI) support the principle of transparency of costs and prices. The 2022 Transparency Report updates the range of metrics published in the 2021 transparency report such as evolution of subscription prices, renewal targets, publication costs, and other key measures. It also includes additional metrics such as publication statistics and subscription price per article. More detailed information is available to interested libraries on request….”

Open research: Enhancing transparency in peer review – Langley-Evans – 2022 – Journal of Human Nutrition and Dietetics – Wiley Online Library

“Unfortunately, the ideas that underpin open science meet most resistance within universities at the level of individual researchers. This is because cultural shifts in non-commercial environments take some time to accomplish and academia is notorious for its inertia and lack of change agility….

Some journals have now adopted a model of open review in which the authors and the reviewers are made known to each other from the start. This is proposed to encourage a civil debate about the work and improve its quality, as well as to enhance reviewer performance. However, there is a risk that a relatively junior reviewer may feel too intimidated to openly criticise the work of a senior researcher in the field (someone they may want to work with in future) and there are concerns that reviewers may not wish to review on those terms, making it difficult for editors to secure the necessary level of scrutiny for papers. Transparent peer review removes some of this concern. With this approach, anonymity can be preserved during the review process but, after the paper is accepted, the reviews and author responses are published along with the paper, for open scrutiny. The identity of the reviewer can remain concealed during the review process but, in a fully transparent review, their identity would be made public after paper acceptance….

The Journal of Human Nutrition and Dietetics has operated with double-blind peer review for many years. Recently, the journal has joined the Wiley Transparent Peer Review pilot scheme. This brings together the publisher with Publons and ScholarOne (part of Clarivate Web of Science) and enables the entire peer review process associated with a paper to be published alongside the accepted paper. Our papers now have an Open Research section, which provides a link to the digital object identifier and allows readers to see the peer review content. The peer review and author responses are in themselves citable materials. Our transparent peer review is a voluntary process for both authors and reviewers. Authors can opt to keep the peer review comments unpublished and reviewers can remain anonymous but still have their comments published….

Despite our push for openness through the transparent peer review scheme, there seems to be a reluctance to participate….

I would like to finish this editorial with an exhortation to take part in the revolution. Let us make research in the area of nutrition and dietetics more open! The advantages are clear. Open science is more interesting science, more collaborative science and kinder science. Transparent peer review is not something to be feared and should instead prompt constructive dialogues between authors, editors and peer reviewers. If there are some dinosaurs out there who still want to use peer review as a platform for bullying their junior colleagues, they will be in for a shock as the growth of more healthy research environments and communities leaves them behind. Transparent peer review is certainly not a panacea, but it is a great step forward to put right some of the historical problems that lie in the peer review system.”

Measuring Research Transparency

“Measuring the transparency and credibility of research is fundamental to our mission. By having measures of transparency and credibility we can learn about the current state of research practice, we can evaluate the impact of our interventions, we can track progress on culture change, and we can investigate whether adopting transparency behaviors is associated with increasing credibility of findings….

Many groups have conducted research projects that manually code a sample of papers from a field to assess current practices. These are useful but highly effortful. If machines can be trained to do the work, we will get much more data, more consistently, and much faster. There are at least three groups that have made meaningful progress creating scalable solutions: Ripeta, SciScore, and DataSeer. These groups are trying to make it possible, accurate, and easy to assess many papers for whether the authors shared data, used reporting standards, identified their conflicts of interest, and took other transparency-relevant actions….”
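As a toy illustration of the kind of automated screening such tools perform at scale, a keyword-based check might look like the sketch below. The real systems (Ripeta, SciScore, DataSeer) use far richer models; the patterns here are assumptions for illustration only.

```python
import re

# Toy rule-based screen for transparency markers in a manuscript's text.
# Real tools go well beyond keyword matching; these patterns are illustrative.
CHECKS = {
    "data_sharing": re.compile(
        r"data availability|available (at|from|in) (zenodo|osf|figshare|dryad)", re.I),
    "reporting_standard": re.compile(r"\b(CONSORT|ARRIVE|PRISMA|MDAR)\b"),
    "conflict_of_interest": re.compile(
        r"conflicts? of interest|competing interests", re.I),
}

def screen(manuscript_text: str) -> dict:
    """Return which transparency markers appear in the manuscript text."""
    return {name: bool(rx.search(manuscript_text)) for name, rx in CHECKS.items()}

print(screen("Data availability: all data are available in Zenodo. "
             "The authors declare no competing interests."))
```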

Challenges of scholarly communication: bibliometric transparency and impact

Abstract:  Citation metrics have value because they aim to make scientific assessment a level playing field, but urgent transparency-based adjustments are necessary to ensure that measurements yield the most accurate picture of impact and excellence. One problematic area is the handling of self-citations, which are either excluded or inappropriately accounted for when using bibliometric indicators for research evaluation. In this talk, I argue in favour of openly tracking self-citations and report on a study of self-referencing behaviour among various academic disciplines as captured by the curated bibliometric database Web of Science. Specifically, I examine the behaviour of thousands of authors grouped into 15 subject areas like Biology, Chemistry, Science and Technology, Engineering, and Physics. In this talk, I focus on the methodological set-up of the study and discuss data-science-related problems like author name disambiguation and bibliometric indicator modelling. This talk is based on the following publication: Kacem, A., Flatt, J. W., & Mayr, P. (2020). Tracking self-citations in academic publishing. Scientometrics, 123(2), 1157–1165. https://doi.org/10.1007/s11192-020-03413-9
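As a toy illustration of the underlying measure, a self-referencing rate can be computed as the share of an author’s outgoing references that point to their own papers. The sketch below uses made-up records and sidesteps the author-name-disambiguation problem the study actually has to solve.

```python
# Minimal sketch of a self-referencing rate; records are fabricated examples,
# not data from the cited study, and author names are assumed already
# disambiguated (the hard part in practice).
papers = {
    "P1": {"authors": {"Author X"}, "references": ["P0", "R1", "R2"]},
    "P2": {"authors": {"Author X"}, "references": ["P1", "R3"]},
    "P0": {"authors": {"Author X"}, "references": []},
    "R1": {"authors": {"Someone Else"}, "references": []},
}

def self_reference_rate(author: str) -> float:
    """Fraction of an author's outgoing references that cite their own papers."""
    own = {pid for pid, p in papers.items() if author in p["authors"]}
    refs = [r for pid in own for r in papers[pid]["references"]]
    return sum(r in own for r in refs) / len(refs) if refs else 0.0

print(self_reference_rate("Author X"))  # 2 self-references out of 5 -> 0.4
```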


From library budget to information budget: fostering transparency in the transformation towards open access

The discussion on the transformation of scholarly journals to open access (OA) increasingly concerns financial aspects. Considering the variety of funding strategies for article processing charges (APCs), the array of cost types for scientific information and the need for data monitoring to promote cost transparency, an integrated view of the financial dimension of the OA transition is needed. This commentary describes the need for implementing an information budget that looks beyond just the library budget and comprehensively targets all financial flows from universities and other research-performing organizations to publishers. An information budget promotes an integrated perspective on the distributed costs at a given institution. This centralized approach of assessing financial flows can be used to strengthen the position of research institutions when negotiating with publishers.
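As a minimal illustration of the proposed shift from a library budget to an information budget, the sketch below pools hypothetical payments from different institutional budget lines and totals them per publisher; the records and field names are assumptions for illustration only.

```python
# Sketch of the "information budget" idea: pool every payment an institution
# makes to publishers (subscriptions, APCs, agreement fees, ...) regardless of
# which internal budget it came from, then view the total per publisher.
from collections import defaultdict

payments = [
    {"publisher": "Publisher A", "type": "subscription", "source": "library budget", "amount_eur": 120_000},
    {"publisher": "Publisher A", "type": "APC",          "source": "grant funds",    "amount_eur": 45_000},
    {"publisher": "Publisher B", "type": "APC",          "source": "department",     "amount_eur": 18_000},
]

information_budget = defaultdict(float)
for payment in payments:
    information_budget[payment["publisher"]] += payment["amount_eur"]

for publisher, total in sorted(information_budget.items()):
    print(f"{publisher}: {total:,.0f} EUR across all budget lines")
```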