Social media attention and citations of published outputs from re-use of clinical trial data: a matched comparison with articles published in the same journals | BMC Medical Research Methodology

Abstract:  Background

Data-sharing policies in randomized clinical trials (RCTs) should have an evaluation component. The main objective of this case–control study was to assess the impact of published re-uses of RCT data in terms of media attention (Altmetric) and citation rates.

Methods

Re-uses of RCT data published up to December 2019 (cases) were searched for by two reviewers in three repositories (CSDR, the YODA Project, and Vivli) and matched to control papers published in the same journal. The Altmetric Attention Score (primary outcome), components of this score (e.g. mentions in policy sources, media attention) and the total number of citations were compared between the two groups.

Results

89 re-uses were identified: 48 (53.9%) secondary analyses, 34 (38.2%) meta-analyses, 4 (4.5%) methodological analyses and 3 (3.4%) re-analyses. The median (interquartile range) Altmetric Attention Score was 5.9 (1.3–22.2) for re-uses and 2.8 (0.3–12.3) for controls (p = 0.14). No statistically significant difference was found in any of the components of the Altmetric Attention Score. The median (interquartile range) number of citations was 3 (1–8) for re-uses and 4 (1–11.5) for controls (p = 0.30). Only 6/89 re-uses (6.7%) were cited in a policy source.

Conclusions

Using all available re-uses of RCT data to date from major data repositories, we were not able to demonstrate that re-uses attracted more attention than a matched sample of studies published in the same journals. Small average differences are still possible, as the sample size was limited. However, the matching choices have some limitations, so results should be interpreted very cautiously. Also, citations of re-uses by policy sources were rare.
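The abstract reports medians with interquartile ranges and p-values for matched pairs but does not name the statistical test used. Below is a minimal sketch, assuming a Wilcoxon signed-rank test (one natural choice for journal-matched pairs) and wholly hypothetical scores, of how such a comparison can be run:

```python
# Matched comparison of Altmetric Attention Scores: re-use papers vs.
# controls from the same journals. Scores are invented for illustration.
import numpy as np
from scipy.stats import wilcoxon

reuse_scores = np.array([5.9, 22.2, 1.3, 8.0, 3.5])    # hypothetical AAS values
control_scores = np.array([2.8, 12.3, 0.3, 9.1, 1.0])  # hypothetical matched controls

# Pairs are matched by journal, so a paired nonparametric test applies.
stat, p = wilcoxon(reuse_scores, control_scores)
print(f"median re-use  = {np.median(reuse_scores):.1f}")
print(f"median control = {np.median(control_scores):.1f}")
print(f"Wilcoxon signed-rank p = {p:.3f}")
```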

All the Research That’s Fit to Print: Open Access and the News Media

Abstract:  The goal of the open access (OA) movement is to help everyone access scholarly research, not just those who can afford it. However, most studies examining whether OA has met this goal have focused on whether other scholars make use of OA research. Few have considered how the broader public, including the news media, uses OA research. This study asked whether the news media mentions OA articles more or less than paywalled articles, looking at articles published from 2010 through 2018 in journals across all four quartiles of the Journal Impact Factor, using data obtained through Altmetric.com and the Web of Science. Gold, green and hybrid OA articles were all positively associated with the number of news mentions received. News mentions for OA articles dipped in 2018, although they remained higher than those for paywalled articles.
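As a rough illustration of the comparison described above, here is a toy sketch with an assumed data layout and invented counts (not the study's Altmetric.com / Web of Science dataset), grouping news mentions by OA status and year:

```python
# Toy comparison of news mentions for OA vs. paywalled articles.
import pandas as pd

df = pd.DataFrame({
    "year":          [2017, 2017, 2018, 2018, 2018, 2018],
    "oa_status":     ["gold", "paywalled", "green", "paywalled", "hybrid", "paywalled"],
    "news_mentions": [4, 1, 3, 0, 5, 2],
})
df["is_oa"] = df["oa_status"] != "paywalled"  # gold/green/hybrid count as OA

# Mean news mentions per year, split by OA status.
print(df.groupby(["year", "is_oa"])["news_mentions"].mean())

# Simple association check: point-biserial correlation between
# OA status (0/1) and mention counts.
print(df["is_oa"].astype(int).corr(df["news_mentions"]))
```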


Evaluating implementation of the Transparency and Openness Promotion (TOP) guidelines: the TRUST process for rating journal policies, procedures, and practices | Research Integrity and Peer Review

Abstract:  Background

The Transparency and Openness Promotion (TOP) Guidelines describe modular standards that journals can adopt to promote open science. The TOP Factor is a metric to describe the extent to which journals have adopted the TOP Guidelines in their policies. Systematic methods and rating instruments are needed to calculate the TOP Factor. Moreover, implementation of these open science policies depends on journal procedures and practices, for which TOP provides no standards or rating instruments.

Methods

We describe a process for assessing journal policies, procedures, and practices according to the TOP Guidelines. We developed this process as part of the Transparency of Research Underpinning Social Intervention Tiers (TRUST) Initiative to advance open science in the social intervention research ecosystem. We also provide new instruments for rating journal instructions to authors (policies), manuscript submission systems (procedures), and published articles (practices) according to standards in the TOP Guidelines. In addition, we describe how to determine the TOP Factor score for a journal, calculate reliability of journal ratings, and assess coherence among a journal’s policies, procedures, and practices. As a demonstration of this process, we describe a protocol for studying approximately 345 influential journals that have published research used to inform evidence-based policy.

Discussion

The TRUST Process includes systematic methods and rating instruments for assessing and facilitating implementation of the TOP Guidelines by journals across disciplines. Our study of journals publishing influential social intervention research will provide a comprehensive account of whether these journals have policies, procedures, and practices that are consistent with standards for open science and thereby facilitate the publication of trustworthy findings to inform evidence-based policy. Through this demonstration, we expect to identify ways to refine the TOP Guidelines and the TOP Factor. Refinements could include: improving templates for adoption in journal instructions to authors, manuscript submission systems, and published articles; revising explanatory guidance intended to enhance the use, understanding, and dissemination of the TOP Guidelines; and clarifying the distinctions among different levels of implementation.
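For readers unfamiliar with the metric, here is a minimal sketch of TOP Factor style scoring plus a crude inter-rater agreement check. The 0–3 level per standard and the summation follow the common TOP Factor convention; the exact component list, weighting, and reliability statistic used by the TRUST Initiative may differ:

```python
# TOP Factor style scoring: each TOP standard is rated on a 0-3 level
# and the levels are summed. All ratings below are hypothetical.
standards = [
    "citation standards",
    "data transparency",
    "analytic code transparency",
    "research materials transparency",
    "design and analysis transparency",
    "preregistration of studies",
    "preregistration of analysis plans",
    "replication",
]

rater_a = [1, 2, 0, 0, 3, 1, 1, 2]  # one rating per standard
rater_b = [1, 2, 1, 0, 3, 1, 1, 2]

top_factor = sum(rater_a)  # journal's TOP Factor according to rater A
agreement = sum(a == b for a, b in zip(rater_a, rater_b)) / len(standards)

print(f"TOP Factor (rater A): {top_factor}")
print(f"Per-standard agreement: {agreement:.0%}")  # crude; Cohen's kappa is a common alternative
```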

Open Science rankings: yes, no, or not this way? A debate on developing and implementing transparency metrics | Journal of Trial and Error

“The Journal of Trial and Error is proud to present an exciting and timely event: a three-way debate on the topic of Open Science metrics, specifically, transparency metrics. Should we develop these metrics? What purposes do they fulfil? How should Open Science practices be encouraged? Are (transparency) rankings the best solution? These questions and more will be addressed in a dynamic and interactive debate with three researchers of different backgrounds: Etienne LeBel (Independent Meta-Scientist and founder of ERC-funded project ‘Curate Science’), Sarah de Rijcke (Professor of Science and Evaluation Studies and director of the Centre for Science and Technology Studies at Leiden University), and Juliëtte Schaafsma (Professor of Cultural Psychology at Tilburg University and fierce critic of rankings and audits). This is an event organized by the Journal of Trial and Error, and supported by the Open Science Community Tilburg, the Centre for Science and Technology Studies (CWTS, Leiden University), and the Open Science Community Utrecht.”


Comparison of subscription access and open access obstetrics and gynecology journals in the SCImago database | Ginekologia Polska

Abstract:  Objectives: The aim of this study was to compare annual SJR values and to evaluate other parameters reflecting the scientific impact of open access (OA) and subscription access (SA) journals in the field of obstetrics and gynecology according to the SCImago database.

Material and methods: This study was conducted between September and December 2019 at Near East University. The SCImago Journal & Country Rank database was used to collect information about the journals. We evaluated and compared changes in the one-year SJR (SCImago Journal Rank) and journal impact factor (JIF) of OA and SA journals.

Results: Data from 183 scientific journals in the field of obstetrics and gynecology from the period between 1999 and 2018 were evaluated; 140 of these journals were SA and 43 were OA. The average SJR of OA journals in 1999 was 0.17, while it was 0.38 for SA journals. In 2018, these values were 0.31 and 0.78 for OA and SA journals, respectively. In the comparison of JIF, the average for OA journals in 1999 was 0.09, while it was 0.66 for SA journals. In 2018, these values were 0.80 and 1.93 for OA and SA journals, respectively.

Conclusions: Access to information has become easier due to technological developments, and this will continue to affect the access policies of journals. Despite the disadvantages of predatory journals, the rise of OA journals in both number and quality is likely to continue.

Key words: open access journal; impact factor; subscription access journal; SCImago; obstetrics; gynecology.
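The only numbers in the sketch below come from the abstract itself; it simply tabulates the reported average SJR values and computes the 1999-to-2018 growth ratios for each access model:

```python
# Tabulating the abstract's reported average SJR values and computing
# growth ratios for OA vs. subscription-access (SA) journals.
import pandas as pd

sjr = pd.DataFrame({
    "access": ["OA", "OA", "SA", "SA"],
    "year":   [1999, 2018, 1999, 2018],
    "sjr":    [0.17, 0.31, 0.38, 0.78],  # averages reported in the abstract
})

print(sjr.pivot_table(index="year", columns="access", values="sjr"))

for access, g in sjr.groupby("access"):
    series = g.set_index("year")["sjr"]
    print(f"{access} growth 1999-2018: {series[2018] / series[1999]:.2f}x")
```

On these averages, SA journals grew slightly faster in relative terms (about 2.1x vs. 1.8x), although the abstract's conclusion rests on the growth in the number and quality of OA journals rather than on this ratio.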

Open Scholarship Support Guide.pdf – Adobe Document Cloud

“Steps to Support Open Scholarship

Open scholarship entails a culture shift in how research is conducted in universities. It requires action on the part of university administration, working in concert with faculty, sponsors and disciplinary communities.  Universities should consider steps in three areas:

•  Policies:  Language and guidance should be reviewed for alignment with open scholarship, in particular: (1) academic hiring, review, tenure and promotion (valuing diverse types of research products; metrics that incentivize the open dissemination of articles, data, and other research outputs; and valuing collaborative research); (2) intellectual property (ownership, licensing and distribution of data, software, materials and publications); (3) research data protection (for data to be stored and shared through repositories); (4) attribution (recognizing the full range of contributions); and (5) privacy (ensuring that privacy obligations are met).

•  Services and Training:  Researchers need support to assure that data and other research objects are managed according to FAIR Principles: findable, accessible, interoperable and reusable.  While the specific solution must be tailored to the discipline and research, common standards, including Digital Object Identifiers (DOIs), must be followed.

•  Infrastructure:  Archival storage is required for data, materials, specimens and publications to permit reuse.  Searchable portals are needed to register research products where they can be located and accessed. Universities can recognize efficiencies by utilizing external resources (including existing disciplinary repositories) and by developing shared resources that span the institution when external resources do not exist.

Presidents and provosts are encouraged to work with their academic senates to create an open scholarship initiative that promotes institution-wide actions supporting open scholarship practices, while remaining sufficiently flexible to accommodate disciplinary differences and norms….”
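Picking up the DOI standard mentioned in the Services and Training point of the quotation above, here is a minimal sketch of resolving a DOI to machine-readable metadata via doi.org content negotiation. The DOI shown is a placeholder, not a real identifier:

```python
# Resolving a DOI to CSL JSON metadata through doi.org content negotiation.
import requests

doi = "10.1234/example-doi"  # placeholder; substitute a real DOI
resp = requests.get(
    f"https://doi.org/{doi}",
    headers={"Accept": "application/vnd.citationstyles.csl+json"},
    timeout=30,
)
resp.raise_for_status()
meta = resp.json()
print(meta.get("title"), "|", meta.get("container-title"))
```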

Is rapid scientific publication also high quality? Bibliometric analysis of highly disseminated COVID-19 research papers – Learned Publishing

Abstract:  The impact of COVID-19 has underlined the need for reliable information to guide clinical practice and policy. This urgency has to be balanced against disruption to journal handling capacity and the continued need to ensure scientific rigour. We examined the reporting quality of highly disseminated COVID-19 research papers using a bibliometric analysis of reporting quality and risk of bias (RoB) amongst the 250 top-scoring Altmetric Attention Score (AAS) COVID-19 research papers between January and April 2020. Method-specific RoB tools were used to assess quality. After exclusions, 84 studies from 44 journals were included. Forty-three (51%) were case series/studies, and only one was a randomized controlled trial. Most authors were from institutions based in China (n = 44, 52%). The median AAS and impact factor were 2015 (interquartile range [IQR] 1,105–4,051.5) and 12.8 (IQR 5–44.2), respectively. Nine studies (11%) utilized a formal reporting framework, 62 (74%) included a funding statement, and 41 (49%) were at high RoB. This review of the most widely disseminated COVID-19 studies highlights a preponderance of low-quality case series, with few research papers adhering to good standards of reporting. It emphasizes the need for cautious interpretation of research and the increasingly vital responsibility that journals have in ensuring high-quality publications.


Evaluation of Open-Access Journals in Obstetrics and Gynecology – Journal of Obstetrics and Gynaecology Canada

Abstract:  A retrospective observational study was conducted to evaluate open-access journals in obstetrics and gynaecology published between 2011 and 2019. Journals were classified based on their registration in open-access journal directories. Of 176 journals, 47 were not registered. Journals registered in the Directory of Open Access Journals (DOAJ) demonstrated good overall quality, and their journal metrics were significantly higher than those of non-registered journals or journals registered in other directories. The lack of editor names and indexing information on a journal's website is the most distinctive feature of non-registered journals. Non-registration in an open-access journal directory indicates a lack of transparency and may ultimately indicate that a journal is predatory.
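The DOAJ registration check described above can in principle be scripted against DOAJ's public search API; note that the endpoint path and response shape used below are assumptions and may not match the current version of the API:

```python
# Checking whether a journal ISSN appears in DOAJ (assumed endpoint).
import requests

issn = "1234-5678"  # placeholder ISSN; substitute a real one
resp = requests.get(f"https://doaj.org/api/search/journals/issn:{issn}", timeout=30)
resp.raise_for_status()
results = resp.json().get("results", [])
print("registered in DOAJ" if results else "not found in DOAJ")
```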


Open Science: read our statement – CIVIS – A European Civic University

“CIVIS universities promote the development of new research indicators to complement the conventional indicators for research quality and impact, so as to do justice to open science practices and, going beyond pure bibliometric indicators, to promote also non-bibliometric research products. In particular, the metrics should extend the conventional bibliometric indicators in order to cover new forms of research outputs, such as research data and research software….

Incentives and Rewards for researchers to engage in Open Science activities 

Research career evaluation systems should fully acknowledge open science activities. CIVIS members encourage the inclusion of Open Science practices in their assessment mechanisms for rewards, promotion, and/or tenure, along with the Open Science Career Assessment Matrix….”

Recognition and rewards – Open Science – Universiteit Utrecht

“Open science means action. And the way we offer recognition and reward to academics and university staff is key in bringing about the transition that Utrecht University aims for. Over the course of the past year the working group on Recognition and Rewards, part of the Open Science Programme, has reflected and thoroughly debated a novel approach to ensuring that we offer room for everyone’s talent, resulting in a new vision (pdf)….

In the current system, researchers and their research are judged by journal impact factors, publisher brands and H-indices, and not by actual quality, real use, real impact and openness characteristics….

Under those circumstances, at best open science practices are seen as posing an additional burden without rewards. At worst, they are seen as actively damaging chances of future funding and promotion & tenure. Early career researchers are perhaps the most dependent on traditional evaluation culture for career progression, a culture held in place by established researchers, as well as by institutional, national and international policies, including funder mandates….”