“On the face of it, the bill is in line with what a lot of researchers argue for: open access not just for journal papers but for data too. The big idea is that this will make science more transparent and replicable, and decrease the friction for one lab to evaluate the work of another. (Psychology and a number of other fields have been dealing with an ongoing “crisis” in which they’re finding past research doesn’t replicate. Open access is a way to rectify it.)”
“The issuing of press releases about academic research that is not openly available impedes fact-checking and public debate, it has been warned.
MPs on the UK’s science and technology committee said that they took a “dim view” of the issuing of press releases without allowing access to the full peer-reviewed reports, having heard evidence that publishers were using embargoes as “news management” tools in such cases….In its evidence, Imperial College London says that some of the drawbacks of the embargo system “could be addressed if press releases and the journal papers on which they are based were required to be publicly available and linked from online news reports as part of the embargo contract”.
Felicity Mellor, senior lecturer in science communication at Imperial, told Times Higher Education that journals should make research papers available to journalists, “regardless of whether they’re open access”….”
“We envision building an evolving network of modular, interoperable, flexible and reusable open source projects that facilitate rapid, transparent and reproducible research and research communication for the public good. Rather than remaining independent and siloed, these projects will share resources and learn from each other, creating an open science infrastructure.”
“Alternative metrics should be used by the European Commission alongside expert judgement and other measures of research quality, according to a new report.
The report cautions against relying too heavily on new ways of measuring research when developing the open science agenda in Europe….The group, led by James Wilsdon, professor of research policy at the University of Sheffield, came to its conclusions by reviewing the literature and evidence submitted to it about how new metrics could help to advance the work on opening up European science….”
“A new Europe-wide code of research conduct has ordered academics and journals to treat negative experimental results as being equally worthy of publication as positive ones….The new European Code of Conduct for Research Integrity frames the bias against negative results as an issue of research conduct, stipulating that “authors and publishers [must] consider negative results to be as valid as positive findings for publication and dissemination”….It has been drawn up by All European Academies (Allea), a network of academic organisations including the British Academy, Germany’s Leopoldina and the French Académie des Sciences….The new code also puts more emphasis on research organisations themselves to prevent and detect misconduct; for example, universities should reward “open and reproducible practices” when it comes to hiring and promoting researchers, it says….”
In conjunction with PLOS ONE’s 10th anniversary celebration, the journal is launching a Datasets Collection to highlight articles with datasets that are noteworthy because of their impact and usefulness. The collection was assembled by PLOS….
Killer whales. The name alone is enough to strike fear in even the steeliest of hearts. Also known as orcas, these apex predators are sometimes referred to as “wolves of the sea” and are found….
“Widespread acceptance of open access has progressed more slowly than many advocates had hoped. One such advocate, Dr. Peter Suber, explains the barriers and misconceptions, and offers some strategic and practical advice….”
Stuart Lawson successfully used a Freedom of Information request to obtain the University of Liverpool’s contract with Elsevier.
“Since 2010, Cornell’s sustainability planning initiative has aimed to reduce arXiv’s financial burden and dependence on a single institution, instead creating a broad-based, community-supported resource. arXiv’s funding and governance for the current operation (Classic arXiv) is based on a membership program engaging libraries and research laboratories worldwide that represent the repository’s heaviest institutional users. As of February 2017, we have 206 members representing 25 countries. arXiv’s sustainability plan presents a business model for generating revenues and a set of governance, editorial, and financial principles. Cornell University Library (CUL), the Simons Foundation, and a global collective of institutional members support arXiv financially. The financial model for 2013–2017 entails three sources of revenues:
CUL provides a cash subsidy of $75,000 per year in support of arXiv’s operational costs. In addition, CUL makes an in-kind contribution of all indirect costs, which currently represents 37% of total operating expenses.
The Simons Foundation contributes $100,000 per year ($50,000 prior to 2016) in recognition of CUL’s stewardship of arXiv. In addition, the Foundation matches $300,000 per year of the funds generated through arXiv membership fees.
Each member institution pledges a five-year funding commitment to support arXiv. Based on institutional usage ranking, the annual fees are set in four tiers from $1,500 to $3,000.
In 2016, Cornell raised approximately $515,000 through membership fees from 201 institutions, and the total revenue (including CUL and Simons Foundation direct contributions, and online fundraising) was around $1,015,000. We remain grateful for the support from the Simons Foundation that encouraged long-term community support by lowering arXiv membership fees and making participation affordable to a broad range of institutions. This model aims to ensure that the ultimate responsibility for sustaining arXiv remains with the research communities and institutions that benefit from the service most directly.”
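The 2016 figures quoted above can be cross-checked with a quick sum. A minimal sketch follows; note that the ~$25,000 remainder attributed here to online fundraising is inferred from the stated totals, not given explicitly in the source.

```python
# Cross-check of arXiv's reported 2016 revenue streams (figures from the excerpt above).
membership_fees = 515_000   # ~$515k in membership fees from 201 institutions
cul_subsidy = 75_000        # Cornell University Library cash subsidy
simons_direct = 100_000     # Simons Foundation direct contribution
simons_match = 300_000      # Simons Foundation match on membership fees

subtotal = membership_fees + cul_subsidy + simons_direct + simons_match
total_reported = 1_015_000  # "around $1,015,000" per the excerpt

print(f"Named revenue streams: ${subtotal:,}")
print(f"Inferred remainder (online fundraising): ${total_reported - subtotal:,}")
```

The named streams sum to $990,000, leaving roughly $25,000 of the reported total to online fundraising, consistent with the excerpt's "around" hedge.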
“The rise of preprints, now endorsed by the NIH, has created a new pressure valve for rapid publication outside of journals. With this emerging venue for interim publication coming into place, do journals need to be so quick to publish?
Perhaps, instead, the strategic differentiator for journals isn’t unpredictable schedules, rapid publication, and error-prone publishing of scientific reports. With preprint servers supporting rapid, preliminary publication in an environment that is actually more supportive of amendments/corrections, speed, and unpredictability, perhaps journals should rethink shouldering the load of and courting the risks of rapid publication. More importantly, there are indications that coordinating with your audience, taking more time to fact-check and edit, and returning to a higher level of quality may be the smart move.
Journals don’t have to perform every publishing trick anymore. Maybe it’s time to return to doing what they do best — vetting information carefully, validating claims as best they can, and ensuring novelty, quality, relevance, and importance around what they choose to publish.”
“Outlining a core set of best practices that can be applied across the sciences, Chambers demonstrates how all these sins can be corrected by embracing open science, an emerging philosophy that seeks to make research and its outcomes as transparent as possible….”
“People wanting to read a new and wide-ranging collection of critiques about the PACE Trial are no longer banging their heads against an academic journal’s paywall. That’s because – thanks to another publishing coup by the ME Association – all the commentaries in The Journal of Health Psychology are being made open access.
Following an exchange of correspondence with journal editor Dr David Marks, it has been agreed by the publisher that ALL of the commentaries on the PACE trial will be made Open Access at no charge to the authors.
Below is a list of the related articles published so far.
Investigator bias and the PACE trial (Steven Lubet)
The PACE trial missteps on pacing and patient selection (Leonard Jason)
‘PACE-Gate’: When clinical trial evidence meets open data access (Keith J Geraghty)
PACE team response shows a disregard for the principles of science (Jonathan Edwards)
Several more commentaries have yet to be loaded up on the journal’s website, including one from our medical adviser Dr Charles Shepherd. We don’t have a date when they will all be in place.”
“One of Europe’s biggest science spenders could soon branch out into publishing. The European Commission, which spends more than €10 billion annually on research, may follow two other big league funders, the Wellcome Trust and the Bill & Melinda Gates Foundation, and set up a “publishing platform” for the scientists it funds, in an attempt to accelerate the transition to open-access publishing in Europe….”
“Pisanski and three colleagues concocted the fake application—supported by a cover letter, a CV boasting phoney degrees, and a list of non-existent book chapters — and sent it to 360 peer-reviewed social science publications.
In the peer-review process, journals ask outside experts to assess the methodology and importance of submissions before accepting them.
The journals were drawn equally from three directories: one listing reputable titles available through subscriptions, a second devoted to ‘open access’ publications.
The third was a blacklist — compiled by University of Colorado librarian Jeffrey Beall — of known or suspected ‘predatory journals’ that make money by extracting fees from authors.
The number of these highly dubious publications has exploded in recent years, now numbering at least 10,000.
Indeed, 40 of the 48 journals that took the bait and offered a position to the fictitious Anna O. appeared on Beall’s list, which has since been taken offline.
The other eight were from the open-access registry. No one made any attempt to contact the university listed on the fake CV, and few probed her obviously spotty experience.
One journal suggested ‘Ms Fraud’ organise a conference after which presenters would be charged for a special issue.
‘Predatory publishing is becoming an organised industry’, said Pisanski, who decided not to name-and-shame the journals caught out by the sting.
Their rise ‘threatens the quality of scholarship’, she added.
Even after the researchers contacted all the journals to inform them that Anna O. Szust did not really exist, her name continued to appear on the editorial board of 11 — including one to which she had not even applied.
None of the journals from the most selective directory fell into the trap, and a few sent back tartly worded answers.”