International observatory targets predatory publishers | Times Higher Education (THE)

“A coalition of scientists, funders, publishing societies and librarians believes that the formation of an international observatory to study predatory journals will lead to improved advice on how to tackle them.

The initiative aims to fill the void left by the closure three years ago of Jeffrey Beall’s blacklist of predatory publishers. Since then, many others have set up their own blacklists and checklists, but there is “a lack of unity across the community about what predatory journals are”, said Agnes Grudniewicz, assistant professor at the Telfer School of Management at the University of Ottawa.

The coalition’s biggest achievement so far is to create a consensus definition of predatory journals. It defines predatory journals and publishers as “entities that prioritise self-interest at the expense of scholarship” and “are characterised by false or misleading information, deviation from best editorial and publication practices, a lack of transparency, and/or the use of aggressive and indiscriminate solicitation practices”….

Creating an international observatory – potentially funded by research funders, charities, publishers and research institutions – was a less contentious solution than relying on blacklists or “whitelists” of approved providers, said Dr Grudniewicz. Research led by Michaela Strinzel, from the Swiss National Science Foundation, found that 34 journals listed as predatory by Professor Beall appeared on an approved list of titles run by the Directory of Open Access Journals (DOAJ), while 31 DOAJ titles were deemed predatory by subscription service Cabells….”
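The Strinzel comparison above is, at bottom, a set-intersection check between a blacklist and an approved list. A minimal sketch of that kind of cross-listing analysis, using made-up ISSNs rather than the actual Beall, DOAJ, or Cabells data:

```python
# Illustrative only: ISSNs below are invented stand-ins, not real list entries.
beall_issns = {"2090-1232", "1234-5678", "2156-9614"}  # flagged as predatory
doaj_issns = {"2090-1232", "2045-2322", "2156-9614"}   # on the approved list

# Titles appearing on both lists are the contested cases,
# analogous to the 34/31 overlaps reported by Strinzel et al.
contested = beall_issns & doaj_issns
print(sorted(contested))
```

The point of the sketch is that such overlaps are trivially detectable once both lists are keyed on a shared identifier (here, the ISSN); the hard part is deciding which list is wrong.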

Predatory journals: no definition, no defence

“The consensus definition reached was: “Predatory journals and publishers are entities that prioritize self-interest at the expense of scholarship and are characterized by false or misleading information, deviation from best editorial and publication practices, a lack of transparency, and/or the use of aggressive and indiscriminate solicitation practices.” …”

Identifying publications in questionable journals in the context of performance-based research funding

Abstract:  In this article we discuss the five yearly screenings for publications in questionable journals that have been carried out in the context of the performance-based research funding model in Flanders, Belgium. The Flemish funding model was expanded from 2010 onwards with a comprehensive bibliographic database for research output in the social sciences and humanities. Along with an overview of the procedures followed during the screenings for articles in questionable journals submitted for inclusion in this database, we present a bibliographic analysis of the publications identified. First, we show how the yearly number of publications in questionable journals evolved over the period 2003–2016. Second, we present a disciplinary classification of the identified journals. In the third part of the results section, three authorship characteristics are discussed: multi-authorship; the seniority (or experience level) of authors in general and of the first author in particular; and the relation of the disciplinary scope of the journal (cognitive classification) to the departmental affiliation of the authors (organizational classification). Our results regarding yearly rates of publications in questionable journals indicate that awareness of the risks of questionable journals does not lead to a turn away from open access in general: the number of publications in open access journals rises every year, while the number of publications in questionable journals decreases from 2012 onwards. We find further that both early-career and more senior researchers publish in questionable journals. We show that the average proportion of senior authors contributing to publications in questionable journals is somewhat higher than that for publications in open access journals.
In addition, this paper yields insight into the extent to which publications in questionable journals pose a threat to the public and political legitimacy of a performance-based research funding system of a western European region. We include concrete suggestions for those tasked with maintaining bibliographic databases and screening for publications in questionable journals.

 

Professors Receive NSF Grant to Develop Training for Recognizing Predatory Publishing | Texas Tech Today | TTU

“With more open-access journals making research articles free for people to view, some journals are charging authors publication fees to help cover costs. While some journals that do this are still peer-reviewed and credible, others are not and will publish lower quality work strictly for profit. The difference can be hard to tell, even to the most seasoned author….”

Plaudit · Open endorsements from the academic community

“Plaudit links researchers, identified by their ORCID, to research they endorse, identified by its DOI….

Because endorsements are publisher-independent and provided by known and trusted members of the academic community, they provide credibility for valuable research….

Plaudit is built on open infrastructure. We use permanent identifiers from ORCID and DOI, and endorsements are fed into CrossRef Event Data.

We’re open source, community-driven, and not for profit….”

Where Can I Publish? Part 2: Is there a definitive list? – Delta Think

“We set out to examine whether there is a definitive, curated list of journals that researchers can use when deciding on their publication venue. While some offer very good coverage, the short answer appears to be that no one index offers a definitive list.

Across all journals, there is significant overlap among the mainstream indexes. Fully OA journals, however, present a more varied landscape: you need to combine multiple lists to assemble a comprehensive list of curated fully OA journals.

Our analysis has combined over 100,000 ISSNs across over 65,000 titles and, we think, represents one of the most comprehensive round-ups of the coverage of curated lists available….”

How Americans view research and findings | Pew Research Center

“The Pew Research Center survey asked about several factors that could potentially increase – or decrease – trust in research findings and recommendations. The two steps that inspire the most confidence among members of the public are open access to data and an independent review.

A majority of U.S. adults (57%) say they trust scientific research findings more if the researchers make their data publicly available. Another 34% say that makes no difference, and just 8% say they are less apt to trust research findings if the data is released publicly….

People with higher levels of science knowledge are especially likely to say that open access to data and an independent review boost their confidence in research findings. For example, 69% of those with high science knowledge say that having data publicly available makes them trust research findings more, versus 40% of those with low science knowledge….”

 

Peter Suber: The largest obstacles to open access are unfamiliarity and misunderstanding of open access itself

“I’ve already complained about the slowness of progress. So I can’t pretend to be patient. Nevertheless, we need patience to avoid mistaking slow progress for lack of progress, and I’m sorry to see some friends and allies make this mistake. We need impatience to accelerate progress, and patience to put slow progress in perspective. The rate of OA growth is fast relative to the obstacles, and slow relative to the opportunities.”

Are open access journals peer reviewed? – Quora

“As of today, the Directory of Open Access Journals (DOAJ) lists 13,229 peer-reviewed open-access (OA) journals.

DOAJ deliberately limits its coverage to the peer-reviewed variety, and evaluates each listed journal individually.

At the same time, some scam or “predatory” OA journals claim to perform peer review but do not. They give OA a bad name, and get wide publicity, creating the false impression that all or most OA journals are scams.

Analogy: Some police are corrupt, and cases of (actual or suspected) police corruption get wide publicity. But that doesn’t mean that all or most police are corrupt….”

Blacklisting or Whitelisting? Deterring Faculty in Developing Countries from Publishing in Substandard Journals

Abstract:  A thriving black-market economy of scam scholarly publishing, typically referred to as ‘predatory publishing,’ threatens the quality of scientific literature globally. The scammers publish research with minimal or no peer review and are motivated by article processing charges rather than the advancement of scholarship. Authors involved in this scam are either duped or willingly taking advantage of the low rejection rates and quick publication process. Geographic analysis of the origin of predatory journal articles indicates that they predominantly come from developing countries. Consequently, most universities in developing countries operate blacklists of deceptive journals to deter faculty from submitting to predatory publishers. The present article discusses blacklisting and, conversely, whitelisting of legitimate journals as options for deterrence. Specifically, the article provides a critical evaluation of the two approaches by explaining how they work and comparing their pros and cons to inform a decision about which is the better deterrent.

Payouts push professors towards predatory journals

“If South Africa truly wants to encourage good research, it must stop paying academics by the paper…

Why are South Africans relying so much on journals that do little or nothing to ensure quality? In an effort to boost academic productivity, the country’s education department launched a subsidy scheme in 2005. It now awards roughly US$7,000 for each research paper published in an accredited journal. Depending on the institution, up to half of this amount is paid directly to faculty members. At least one South African got roughly $40,000 for research papers published in 2016 — about 60% of a full professor’s annual salary. There is no guarantee (or expectation) that a researcher will use this money for research purposes. Most simply see it as a financial reward over and above their salaries….

In my experience, publication subsidies promote several other counterproductive practices. Some researchers salami-slice their research to spread it across more papers. Others target low-quality journals that are deemed less demanding….”

 

Open access medical journals: Benefits and challenges – ScienceDirect

Abstract:  The world of medical science literature is ever more accessible via the Internet. Open access online medical journals, in particular, offer access to a wide variety of useful information at no cost. In addition, they provide avenues for publishing that are available to health care providers at all levels of training and practice. While costs are lower for publishing online open access journals, fewer resources for funding and technical support also exist. A recent rise in predatory journals, which solicit authors but charge high fees per paper published and provide little oversight, poses another challenge to ensuring the credibility of accessible scientific literature. Recognizing the value and efforts of legitimate open access online medical journals can help the reader navigate the over 11,000 open access journals that are available to date.

Prevalence of publishing in predatory journals

Abstract:  Objectives: In 2017 the journal Nature published challenges to the assumption that research-intensive U.S. institutions are immune to the hazards of predatory publishing. Sample articles from hundreds of potentially predatory journals were analyzed: the NIH was the most frequent funder, and Harvard was among the most frequent institutions. Our study was designed to identify the prevalence of such publications at our institution.

Methods: Predatory publishers were defined using an archived version of Beall’s list, a now-defunct website that was widely recognized as the only comprehensive blacklist of potential predators. The archive was collected January 15, 2017 and reflects updates made 1–2 weeks prior. To identify our NIH publications, records were collected from PubMed Central using an institution search, limited to 2011–2016 to reflect the five-year period covered by Beall’s last update. PMC was selected under the assumption that direct journal inclusion in PubMed/MEDLINE serves as a proxy for quality. Journal and ISSN data were referenced against Ulrich’s Periodical Directory to determine publishers. Data were then compared against Beall’s listing of potentially predatory publishers and standalone journals. The publication costs for the predatory journals were used to determine the total amount of NIH funding used to pay for publications in predatory journals.
 
Results: The review of the university’s publications submitted to PubMed Central from 2011 to 2016 revealed 15,090 publications. Of those 15,090 articles, 218 publications (1.4%) were from publishers on Beall’s list of predatory publishers. A review of publication fees for the publishers in which university faculty published revealed that approximately $300,000 of federal grant money was spent over the five-year period publishing in predatory publications.
 
Conclusions: Previously, it was thought that publishing in predatory journals was primarily a problem in developing countries. However, like the 2017 Nature study, we found that researchers at Emory are publishing in journals that are considered predatory. While the rate of publication in predatory journals is low (1.4%), it cost approximately $300,000 of federal taxpayer money, which amounts to approximately 70% of one year’s funding from an average NIH R01 grant.
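The screening described in the Methods section above amounts to matching publication records against a blacklist by ISSN and tallying the hits. A minimal sketch of that procedure, with made-up records, ISSNs, and fees (not the actual PMC, Beall, or Ulrich’s data):

```python
# Illustrative records: each dict stands in for one PubMed Central result.
publications = [
    {"pmid": "1", "issn": "1111-1111"},
    {"pmid": "2", "issn": "2222-2222"},
    {"pmid": "3", "issn": "1111-1111"},
]
blacklist_issns = {"1111-1111"}        # stand-in for Beall's list entries
fee_per_article = {"1111-1111": 1500}  # hypothetical APC, in dollars

# Flag any record whose journal ISSN appears on the blacklist.
flagged = [p for p in publications if p["issn"] in blacklist_issns]
prevalence = len(flagged) / len(publications)
total_fees = sum(fee_per_article[p["issn"]] for p in flagged)
print(f"{len(flagged)} of {len(publications)} flagged "
      f"({prevalence:.1%}); estimated fees ${total_fees}")
```

In the study itself, the same logic is applied to 15,090 records, yielding the 218 flagged publications (1.4%) and the roughly $300,000 fee estimate; the sketch only illustrates the matching step, not the Ulrich’s publisher lookup.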