Data-sharing policies in randomized clinical trials (RCTs) should have an evaluation component. The main objective of this case–control study was to assess the impact of published re-uses of RCT data in terms of media attention (Altmetric) and citation rates.
Re-uses of RCT data published up to December 2019 (cases) were searched for by two reviewers in three repositories (CSDR, the YODA Project, and Vivli) and matched to control papers published in the same journal. The Altmetric Attention Score (primary outcome), components of this score (e.g. mention in policy sources, media attention) and the total number of citations were compared between these two groups.
89 re-uses were identified: 48 (53.9%) secondary analyses, 34 (38.2%) meta-analyses, 4 (4.5%) methodological analyses and 3 (3.4%) re-analyses. The median (interquartile range) Altmetric Attention Scores were 5.9 (1.3–22.2) for re-uses and 2.8 (0.3–12.3) for controls (p = 0.14). No statistically significant difference was found on any component of the Altmetric Attention Score. The median (interquartile range) numbers of citations were 3 (1–8) for re-uses and 4 (1–11.5) for controls (p = 0.30). Only 6/89 re-uses (6.7%) were cited in a policy source.
Using all available re-uses of RCT data to date from major data repositories, we were not able to demonstrate that re-uses attracted more attention than a matched sample of studies published in the same journals. Small average differences are still possible, as the sample size was limited. However, the matching choices have limitations, so results should be interpreted very cautiously. In addition, citations of re-uses by policy sources were rare.
“Although open-access publication has its upsides, for purposes of this essay, I am going to lump publishing in open-access journals in with posting to preprint servers as potentially problematic. My reason for doing so is that both make it harder for clinicians to separate helpful research from distracting, unhelpful, and in the case of preprint servers, unvetted material. In previous editorials, I’ve highlighted some redeeming qualities of open-access publication [17, 18]; I also note that open access is a publication option here at CORR®. But from where I sit today, it’s becoming clear to me that the distortion of publication incentives that are inherent to fully open-access journals does not serve readers (or their patients) very well….”
Pandemic events often trigger a surge of clinical trial activity aimed at rapidly evaluating therapeutic or preventative interventions. Ensuring rapid public access to the complete and unbiased trial record is particularly critical for pandemic research, given the urgent associated public health needs. The World Health Organization (WHO) established standards requiring posting of results to a registry within 12 months of trial completion and publication in a peer-reviewed journal within 24 months of completion, though compliance with these requirements among pandemic trials is unknown.
This cross-sectional analysis characterizes the availability of results in trial registries and publications among registered trials performed during the 2009 H1N1 influenza, 2014 Ebola, and 2016 Zika pandemics. We searched trial registries to identify clinical trials testing interventions related to these pandemics, and determined the time elapsed between trial completion and availability of results in the registry. We also performed a comprehensive search of MEDLINE via PubMed, Google Scholar, and EMBASE to identify corresponding peer-reviewed publications. The primary outcome was compliance with either of the WHO's established standards for sharing clinical trial results. Secondary outcomes included compliance with both standards and the time elapsed between trial completion and public availability of results.
Three hundred thirty-three trials met eligibility criteria, including 261 H1N1 influenza trials, 60 Ebola trials, and 12 Zika trials. Of these, 139 (42%) either had results available in the trial registry within 12 months of study completion or had results available in a peer-reviewed publication within 24 months. Five trials (2%) met both standards. No results were available in either a registry or a publication for 59 trials (18%). Among trials with registered results, a median of 42 months (IQR 16–76 months) elapsed between trial completion and results posting. For published trials, the median elapsed time between completion and publication was 21 months (IQR 9–34 months). Results were available within 24 months of study completion in either the trial registry or a peer-reviewed publication for 166 trials (50%).
Very few trials performed during prior pandemic events met established standards for the timely public dissemination of trial results.
Abstract: Discussions of open-access publishing tend to center the scientific disciplines, and this trend has continued during the Covid-19 pandemic. But while the pandemic has certainly shed new light on the importance of openly accessible medical research, its effects—from economic impacts to attitudinal shifts—have been felt and speculated about across disciplines. This paper presents an investigation into present and future impacts of the pandemic on open-access publishing in the humanities, which have historically been slower to adopt open-access models than other disciplines. A survey distributed to scholarly publishing professionals, academic librarians, and others working in open-access humanities publishing sought to determine what changes these professionals had observed in their field since the start of the pandemic, as well as what impacts they projected for the long term. While the lasting effects of this still-evolving global health and economic crisis remain uncertain, the survey results indicate that open-access humanities professionals have already observed changes in areas including market demand, institutional interest, and funding, while many of them predict that the pandemic will have a long-term impact on the field. These findings contribute to an ongoing conversation about the place of the humanities in the open-access publishing landscape and the need for sustainable institutional investment.
“Open science is a broad goal that includes making data, data analysis, scientific processes and published results easier to access, understand and reproduce. It’s an appealing concept but, in practice, open science is difficult and, often, the costs seem to exceed the benefits. Recognizing both the shortfalls and the promise of open science, Stanford University’s Center for Open and REproducible Science (CORES) – which is part of Stanford Data Science – hopes to make the practice of open science easier, more accessible and more rewarding.
Since its launch in September 2020, CORES has been hard at work on the center’s first major efforts. These include developing a guide for open science practices at Stanford – called the “Open by Design” handbook – and producing workshops and a lecture series to help people learn about and contribute to open science across the university….”
“Flowcite – a German-based service providing an all-in-one platform for academic research, writing, editing, and publishing – partners with Brooklyn-based scite.ai to offer quick source evaluation for its users to ensure quality, improve the relevance of results, and thus save time on research….”
“[Q] Which brings us to Wikipedia. Many of us consult it, slightly wary of its bias, depth, and accuracy. But, as you’ll be sharing in your speech at Intellisys, the content actually ends up being surprisingly reliable. How does that happen?
[A] The answer to “should you believe Wikipedia?” isn’t simple. In my book I argue that the content of a popular Wikipedia page is actually the most reliable form of information ever created. Think about it—a peer-reviewed journal article is reviewed by three experts (who may or may not actually check every detail), and then is set in stone. The contents of a popular Wikipedia page might be reviewed by thousands of people. If something changes, it is updated. Those people have varying levels of expertise, but if they support their work with reliable citations, the results are solid. On the other hand, a less popular Wikipedia page might not be reliable at all….”
“Archival Resource Keys (ARKs) serve as persistent identifiers, or stable, trusted references for information objects. Among other things, they aim to be web addresses (URLs) that don’t return 404 Page Not Found errors. The ARK Alliance is an open global community supporting the ARK infrastructure on behalf of research and scholarship.
End users, especially researchers, rely on ARKs for long term access to the global scientific and cultural record. Since 2001 some 8.2 billion ARKs have been created by over 780 organizations — libraries, data centers, archives, museums, publishers, government agencies, and vendors.
ARKs are open, mainstream, non-paywalled, decentralized persistent identifiers that you can start creating in under 48 hours. They identify anything digital, physical, or abstract….”
“Another long-term trend that researchers are watching out for is the push for scientists to share their research data more openly. This was mandated by the biomedical funding charity, Wellcome, for research that it funded on COVID-19, although there have been instances of people circumventing the rules by making data available ‘upon request’.
In theory, the push for open data might lessen international collaboration if it is no longer necessary to establish personal relationships to access data. Sugimoto says this could happen, but also wonders whether open data might help to link researchers from across the world by making their work more visible. “It could actually, in some ways, enhance and increase international collaboration rather than diminish it,” she says….”
“The Digital Library of Georgia has made its 2 millionth digitized and full-text-searchable historic newspaper page available freely online. The title page of the May 27, 1976 issue of the Augusta News-Review will become the 2 millionth page digitized by the Digital Library of Georgia. The newspaper, published by Mallory Millender from 1971 to 1985, identified itself as a “community paper with a predominantly Black readership” that presented the issues of the Central Savannah River Area (CSRA) from a “Black perspective.” The digitization of the title was made possible by Georgia Public Library Service. …”