Why do libraries sign contracts forbidding mining? I ask under FOI and request them to stop

I intend to submit the following Freedom of Information request to the 26 leading UK universities (the “Russell Group”). The excellent http://whatdotheyknow.com makes this very easy, as it gives the addresses and actually sends the request. The universities have to answer within 20 working days (most manage it in 19.9 days, so don’t hold your breath).

I ask whether any university has any policy on supporting researchers to carry out content-mining (Text and Data Mining, TDM). Most universities seem to accede to any conditions laid down by publishers. This impression is strengthened by the total lack of any reaction to Elsevier’s recent “click-through” licence. It’s easy to get the impression that universities don’t care. Maybe this request will show they have been secretly fighting for us – who knows?

I’d be very grateful for comments ASAP. I will try to summarise answers and would certainly appreciate help here.

========================= Dear University ====================

Background and terminology:

This request relates to content mining (aka Text and Data Mining (TDM), or data analytics) of scholarly articles provided by publishers under a subscription model. Mining is the use of machines (software) to systematically traverse (crawl, spider) subscribed content, index it and extract parts of the content, especially facts. This process (abstracting) has been carried out by scholars (“researchers”) for many decades without controversy; what is new is the use of machines to add speed and quality.

Most subscribers (universities, libraries) sign contracts provided by the publishers. Many of these contain clauses specifically restricting or forbidding mining (“restrictive contracts”). Recently the UK government (through the Intellectual Property Office and Professor Hargreaves) recommended reform of copyright to allow mining; a statutory instrument is expected in 2014-04. Many subscription publishers (e.g. Elsevier) have challenged this (e.g. in the Licences for Europe discussions) and intend to offer bespoke licences to individual researchers (“click-through licences”).

In many universities, contracts are negotiated by the University Library (“library”), who agree the terms and conditions (T&C) of the contract. At the request of the publishers, some or all of the contract is kept secret.

Oversight of library activities in universities usually involves “library committee” with a significant number of academics or other non-library members.

Questions (please give documentary evidence such as library committee minutes or correspondence with publishers):

* How many subscription publishers have requested the university to sign a restrictive contract (if over 20, write “> 20”)?
* When was the first year that the University signed such a contract?
* How often has the university challenged a restrictive contract?
* How many challenges have resulted in removal of ALL restrictions on mining?
* Has the university ever raised restrictions on mining with a library committee or other committee?
* How many researchers have approached the university to request mining? How many were rejected?

* How often has the university negotiated with a publisher for a specific research project? Has the publisher imposed any conditions on the type or extent of the research? Has the publisher imposed conditions on how the research can be published?

* How often has a researcher carried out mining and caused an unfavourable response from a publisher (such as removal of service or a legal letter)?

* How often has the university advised a researcher that they should desist from mining? Have any researchers been disciplined for mining or had subscription access removed?
* Does the university have a policy on researchers signing “click through licences”?
* Does the university have a policy for facilitating researchers to carry out mining after the UK statutory instrument is confirmed?
* Does the university intend to refuse to sign restrictive contracts after the statutory instrument comes into force?


Your immediate comments will be very valuable as I shall start sending these out very soon.

Beall’s criticism of MDPI lacks evidence and is irresponsible

I have just seen Jeffrey Beall’s “analysis” of MDPI http://scholarlyoa.com/2014/02/18/chinese-publishner-mdpi-added-to-list-of-questionable-publishers/#more-3072 and wish to respond immediately.

I will not respond to all Beall’s criticisms.

Beall has set up a site where he lists questionable (aka predatory) Open Access publishers who have poor or non-existent quality controls or have questionable organisations. This is potentially a useful service, though it is inappropriate that it should be done by a single person, especially one lacking discipline knowledge.

I have no personal involvement with MDPI. I remember when they started as a company which actually took physical chemical samples and stored them so that people could check later (the acronym MDPI can also stand for Molecular Diversity Preservation International). The compounds were linked to a journal, “Molecules” with full text. It has been going for 17 years. At one stage I wrote to them and asked them to change the licence from CC-NC to CC-BY and they immediately did.

I have never had any reason to doubt the validity of Molecules. I am now using it as an Open Access source of material to data-mine. We are doing the same with “Materials” and “Metabolites”.

Beall’s criticism that these are “one-word” titles is ridiculous and incompetent. They are accurate titles.

I have read (as a human) hundreds of articles in these publications. If I were to review a paper in any of them I would assume it was a reasonably competent, relatively boring, moderately useful contribution to science. The backbone of knowledge. I would expect to find errors, as I would in any paper. I reported one in my last post. This wasn’t fraud, it was a product of the awful state of ALL scholarly publishing where paper processes breed errors.

It is right that there should be a list of irresponsible journals and publishers. It should be run by an Open organisation, not Beall. Maybe OASPA? Maybe SPARC? I don’t know. It is wrong that a single person can destroy a publisher’s reputation.

It is also right that we should highlight the equally awful (if not worse) practices of closed access publishers. Why is there no organisation campaigning for reader rights? It seems to fall to me, an individual.

All publishers have junk articles and fraudulent articles. We don’t know the scale. (It’s a pity that publishers do so little to enable technical solutions to this.) By default I would say that a paper in Molecules is no more or less likely to be questionable than one in a closed access journal from Elsevier or ACS.

The main problem is that the Open Access community has failed to get its act together. And that the closed access community prevents anyone getting an act together.

Machines are better referees than humans but we’ll be sued if we use them

Andy Howlett and Mark Williamson in our group have been developing fantastic software.

It can read the whole scientific literature and analyse it in minute detail. One of the things we are starting with is chemistry. ChemVisitor (part of AMI2) can read chemical structure diagrams and chemical names and work out what they mean.

It takes less than a second. That’s pretty impressive, and we’ll be reporting this at the ACS meeting next month. Here’s the first picture we chose.

Our software can read the whole chemical literature every day and work out all the compounds. And I can do it on my laptop.
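What such a check involves can be sketched in a few lines. This is not ChemVisitor itself (its internals are not described in this post); it is a toy illustration, assuming we already have one molecular formula extracted from a compound's name and another derived from its drawn structure, and it simply asks whether they agree:

```python
import re
from collections import Counter

def parse_formula(formula):
    """Parse a molecular formula like 'C6H12O6' into element counts."""
    counts = Counter()
    for element, number in re.findall(r"([A-Z][a-z]?)(\d*)", formula):
        counts[element] += int(number) if number else 1
    return counts

def formulas_agree(from_name, from_diagram):
    """True if the name-derived and diagram-derived formulas match."""
    return parse_formula(from_name) == parse_formula(from_diagram)

# Glucose written two ways: same composition, so the check passes.
print(formulas_agree("C6H12O6", "C6H12O6"))   # True
# A mis-drawn structure with one hydrogen too few is caught.
print(formulas_agree("C6H12O6", "C6H11O6"))   # False
```

The hard part of real mining is the front end (interpreting diagrams and IUPAC names), but the payoff is exactly this kind of automatic cross-check, run over every compound in every paper.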

[Image: the chemical structure diagram we analysed (“badcompound”)]

Hey – hang on – you’re violating copyright! And copyright is more important than science, isn’t it? Well, actually I am not violating it here, because this is from a CC-BY paper (I omit the attribution for a reason you’ll see). But yes, if it was from a Tetrahedron (Elsevier) article or J. American Chemical Society I would have to get permission. I’d probably have to pay. I wouldn’t be allowed to do X, Y or Z… It would take days without any likelihood of success.

And all I am doing is science. Note that chemical structure diagrams are NOT creative works. They are data. They are the only effective way of communicating what the compound is. But Elsevier and ACS and Nature and Science and … will all challenge me with lawyers if I take diagrams from non-CC-BY articles (e.g from Nature).

Now Andy has just mailed to say that this diagram is wrong. One of the compounds is incorrectly drawn. He’s contacted the author who has agreed. The error matters. These are compounds that many of you may eat. If the compound has the wrong name or formula then the science is badly flawed. And that can mean people die.

So try it for yourself. Which compound is wrong? (*I* don’t know yet.) How would you find out? Maybe you would go to Chemical Abstracts (ACS). Last time I looked it cost 6 USD to look up a compound. That’s about 50 dollars for the compounds in this diagram, just to check whether the literature is right. And you would be forbidden from publishing what you found there (ACS sent the lawyers to Wikipedia for publishing CAS registry numbers). What about Elsevier’s Reaxys? Almost certainly as bad.

But isn’t there an Open collection of molecules? PubChem at the NIH? Yes, and ACS lobbied on Capitol Hill to have it shut down as it was “socialised science instead of the private sector”. They nearly won. (Henry Rzepa and I ran a campaign to highlight the issue.) So yes, we can use PubChem, and we have, and that’s how Andy’s software discovered the mistake.
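For readers who want to run the same kind of lookup themselves, PubChem exposes a free REST interface (PUG REST). Below is a minimal sketch of querying a compound's molecular formula by name; the URL pattern and response layout follow PubChem's published documentation, but treat the details as assumptions to verify against the current docs:

```python
from urllib.parse import quote

PUG = "https://pubchem.ncbi.nlm.nih.gov/rest/pug"

def formula_url(name):
    """Build a PUG REST query for a compound's molecular formula by name."""
    return f"{PUG}/compound/name/{quote(name)}/property/MolecularFormula/JSON"

def extract_formula(reply):
    """Pull the formula out of PubChem's JSON property-table reply."""
    return reply["PropertyTable"]["Properties"][0]["MolecularFormula"]

# To run the lookup for real (network required):
#   import json
#   from urllib.request import urlopen
#   with urlopen(formula_url("caffeine")) as resp:
#       print(extract_formula(json.load(resp)))
```

Unlike the commercial databases, there is no per-lookup fee and no restriction on publishing what you find.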

This was the first diagram we analysed. Does that mean that every paper in the literature contains mistakes?

Almost certainly yes.

But they have been peer-reviewed.

Yes – and we wrote software (OSCAR) 10 years ago that could do the machine reviewing. And it showed mistakes in virtually every paper.
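OSCAR's rules are not spelled out in this post, but one classic machine-checkable claim is an elemental analysis: papers report "found" mass percentages that should match values calculated from the molecular formula, conventionally to within about 0.4%. A hedged sketch of such a check (atomic weights rounded; the tolerance is the common journal convention, not necessarily OSCAR's exact rule):

```python
# Approximate atomic weights for a few common elements.
WEIGHTS = {"C": 12.011, "H": 1.008, "N": 14.007, "O": 15.999}

def calc_percent(counts):
    """Theoretical mass percent of each element in a compound."""
    total = sum(WEIGHTS[el] * n for el, n in counts.items())
    return {el: 100 * WEIGHTS[el] * n / total for el, n in counts.items()}

def analysis_ok(counts, found, tol=0.4):
    """Do the reported 'found' percentages match theory within tolerance?"""
    calc = calc_percent(counts)
    return all(abs(calc[el] - pct) <= tol for el, pct in found.items())

# Caffeine, C8H10N4O2: calculated C 49.48%, H 5.19%, N 28.85%.
caffeine = {"C": 8, "H": 10, "N": 4, "O": 2}
print(analysis_ok(caffeine, {"C": 49.5, "H": 5.2, "N": 28.9}))   # True
print(analysis_ok(caffeine, {"C": 52.0, "H": 5.2, "N": 28.9}))   # False
```

A check this mechanical never tires and never skims; that is why machines catch the errors human referees miss.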

So we plan to do this for every new paper. It’s technically possible. But if we do it what will happen?

If I sign the Elsevier content-mining click-through (I won’t) then I agree not to disadvantage Elsevier’s products. And pointing out publicly that they are full of errors might just do that. And if I don’t?…

Elsevier will cut off the University of Cambridge and the University will then contact me and tell me I have broken the sacred conditions that they have signed. Because no University ever challenges conditions that publishers set. The only thing that matters is price. So all universities have agreed with the publishers that readers cannot carry out text and data mining. They didn’t ask me – they just signed my rights away. If I continue I’ll probably face disciplinary action.

And the scientific literature will continue to be stuffed full of errors. And people will continue to die because of them.

Does anyone care? I don’t think so as no-one (ZERO) from a University has commented on my analysis of Elsevier’s restrictive TDM licence. They’ll just go ahead and sign it. Because it’s the easiest thing to do.

MicrobiologyOpen Issue 3:1 Published Online

MicrobiologyOpen has published its latest online issue. 13 new articles are fully open access: free to read, download and share.

MicrobiologyOpen is a broad scope, peer reviewed journal delivering rapid decisions and fast publication of microbial science.  The journal gives priority to reports of quality research, pure or applied, that further our understanding of microbial interactions and microbial processes.

Editor-in-Chief Pierre Cornelis has highlighted the papers below as of particular interest:

Unsuspected pyocyanin effect in yeast under anaerobiosis
Rana Barakat, Isabelle Goubet, Stephen Manon, Thierry Berges and Eric Rosenfeld

Summary: Toxicity of pyocyanin (PYO) was investigated under aerobiosis and anaerobiosis in several wild-type and mutant strains of the yeast Saccharomyces cerevisiae and also in Candida albicans. PYO is toxic for actively respiring cells, but its toxicity was found to be substantial, and even higher, under anaerobiosis. This indicates that the PYO effect can be mediated by phenomena other than oxidative stress and respiratory disturbance.


Serum influences the expression of Pseudomonas aeruginosa quorum-sensing genes and QS-controlled virulence genes during early and late stages of growth
Cassandra Kruczek, Uzma Qaisar, Jane A. Colmer-Hamood and Abdul N. Hamood

Summary: In this study, we demonstrated that serum reduces the expression of different QS genes at early stages of growth but increases their expression at late stages of growth of P. aeruginosa. A similar phenomenon was observed regarding the production of autoinducers and the expression of QS-controlled virulence genes. Serum also differentially regulated the expression of several positive and negative regulators of the QS systems. While the mechanism by which serum affects QS at early stages of growth is not yet known, our results suggest that serum accomplishes its effect at late stages of growth through the virulence factor regulator Vfr.

Visualization of VirE2 protein translocation by the Agrobacterium type IV secretion system into host cells
Philippe A. Sakalis, G. Paul H. van Heusden and Paul J. J. Hooykaas

Summary: Here we report the direct visualization of VirE2 protein translocation from Agrobacterium into host cells. To this end we cocultivated Agrobacterium strains expressing VirE2 tagged with one part of a fluorescent protein with host cells expressing the complementary part. Fluorescent filaments became visible in recipient cells 20-25 hours after the start of the cocultivation, indicative of VirE2 protein translocation.

Submit your paper here >   Sign up for eToC Alerts here >


Ask EveryONE: Corrections

My paper was recently published in PLOS ONE, but I’ve noticed an error. Can it be corrected?

PLOS ONE corrects major errors found in published articles via the addition of a Formal Correction to the paper. Formal Corrections are reserved for errors that significantly affect the understanding or utility of the paper.  In addition to being published on the PLOS ONE website, corrections are also indexed in PubMed Central and PubMed.

When a paper has been corrected, a correction notice will appear in a gray box at the top of the article page.  A CrossMark logo now appears on every PLOS article page and in the downloadable PDF; clicking the logo on a corrected article’s page will bring up a status box showing that the paper has been corrected.

To see the full correction, click the “View correction” link in the gray box.  This will direct you to a page with the full correction details, including any updated figures, tables, or supporting information, along with a PDF version of the correction notice available for download.  An example of a correction notice on the original article is shown below.

[Image: Example of a Formal Correction notice]

If you notice an error in your published paper, you should contact our corrections team at corrections@plos.org.  Please include the title and DOI of your paper; a description of the problem; and any corrected figures, tables, or supporting information files. PLOS staff will decide whether a Formal Correction is appropriate and will work with you to publish a correction as quickly as possible.

If there is an error in one of your figures, tables, or supporting information files, the corrected items will be included in the Formal Correction. An example of a Formal Correction is shown below.

[Image: Example of a Formal Correction]

The post Ask EveryONE: Corrections appeared first on EveryONE.

The value of the hacker community: reacting to natural disasters

I used to live on the edge of the Somerset Levels and as a boy cycled throughout them…
I am including in full a post from an OKFN list [after these paragraphs], inviting people to hack today (Sunday) in Shoreditch, London, and virtually, to help mitigate the effects of the worst UK floods in living memory. Read it. The message is simple:
  • YOU can make a difference.

People often think that they can’t hack – that you have to speak Perl and Unix and node.js and…

That’s wrong. Hacking is about communities making a difference. We all understand communities, so we can all be hackers. The mayor of Palo Alto ran a city-wide “hack the city”.

  • EVERYONE is welcome at a hack day.

You don’t even have to have a computer. Just an ability to communicate.

I can’t be there (I am in AU).  AU also gets floods (and bush fires). Last time I was here the Melbourne Age newspaper held a hack day in its offices – one of the topics we hacked was bush fires. There were lots of non-geeks there.

What you will find in Shoreditch today is a random selection of people – could be 5, could be 500. The thing in common is that they want to help. They know that no single person has the answer. That they don’t, at present, even know where to start.

That’s where you could well be able to help. Perhaps you are in local government or the voluntary sector? Or maybe you’ve actually been in a flood, or have detailed experience from someone who has. That’s a great starting point for finding out what people actually want rather than what we think they want. That’s why I was so impressed with the NHS Hackdays – people identified useful tasks that were achievable and then achieved them. Maybe you’re a teacher, or maybe you are still at school. Yes, school children can change the world.

And, of course, we are unlikely to solve everything this Sunday… Much of the success will be taking good starting points and building the communities and protocols that will make them sustainable. When the 2010 earthquake hit Haiti the Openstreetmap community – hundreds of thousands – leapt into action to use satellite photos to recreate pre-earthquake roads and buildings. Read http://hot.openstreetmap.org/projects/haiti-2.

Maybe you’re a keen photographer and went on holiday in Somerset. Perhaps your photos could be useful – I don’t know. Or maybe you know about low-cost boats. Or fly drones as a hobby. Who knows? A feature of hacks is that we pool ideas at the start and see which catch people’s imagination and which are feasible. It doesn’t matter if *your* idea doesn’t work out, simply that good ideas get developed. Glory is communal not personal.

Perhaps you can find information on key resources that might be available but unused.

I may be able to log in from AU. What can I do?

  • give moral support.
  • spread the word
  • cross-fertilise
If even one person reads this post and does something, that’s massively worthwhile. Now the details:
====================================
From: Joshua March <josh@conversocial.com>
Subject: Your country needs you: #FloodHack this Sunday!
Date: 14 February 2014 20:54:24 GMT

Hi guys,

The government called a meeting today with a number of major UK tech companies to discuss what the tech and developer community could do to help with the flood crisis engulfing the UK.
As part of this, the Environment Agency agreed to open up real-time data on flood levels/status, mapped across the UK, so that developers can utilize the data for free (at least for the next three months).
We’re organizing a hackathon THIS SUNDAY in Shoreditch, London, to build apps on top of the data to try and help people keep up to date with the issues in their area (or areas they’re traveling to), and get the data they need on how they can get help, how they can volunteer etc.
There are more details below, and on the hackpad page: https://hackpad.com/UK-Flood-Help-February-2014-QFpKPE5Wy6s
This is obviously super short notice, but an amazing opp to build something that could actually help thousands of people. Google have agreed to host it, and will be sending developers, as will Facebook, Microsoft and many other start-ups in the area (including my own).
Please spread the word, and come down if you can make it!
Calling all developers!
We have been hit by the worst flooding and weather the UK has seen in our lifetimes. Getting the right information to people about the problems affecting particular areas, and the right places to turn to help (or for information on how THEY can help volunteer) is crucial. The government has near real-time data on flooding levels and alerts, mapped out across the entire country, which they want to put to the best possible use. Following a meeting called today at Number 10 with leading technology companies, the Environment Agency, the Government Digital Service, the Open Data Institute and the Cabinet Office are working to open up this data to the public for the next three months, allowing developers to build innovative applications that can help those affected by the flooding.
This Sunday at 10am, join developers from Google, Facebook, Twitter, Conversocial, Datasift, Mother, Taskhub and more for a hackathon, hosted by Tech CityUK at Google Campus in Shoreditch, where the Open Data Institute will share the flood level data with developers and be on hand to help throughout the day. The Cabinet Office will be choosing the most useful applications demoed on the day to be promoted to flood victims across the country.
Please register for the hackathon here.
Your country needs you!

More Skulduggery from SSP’s Scholarly Scullery

In the Society for Scholarly Publishing’s Scholarly Kitchen, Rick Anderson complains of “errors and misinformation” in the ROARMAP registry of OA mandates and calls for publishers to provide this service instead.

ROARMAP is a registry for institutional and funder OA policies:

X-Other (Non-Mandates) (86)
Proposed Institutional Mandates (6)
Proposed Sub-Institutional Mandate (4)
Proposed Multi-Institutional Mandates (5)
Proposed Funder Mandates (12)
Institutional Mandates (202)
Sub-Institutional Mandates (43)
Multi-Institutional Mandates (9)
Funder Mandates (87)
Thesis Mandates (109)

The distinction between a mandate and a non-mandate is fuzzy, because mandates vary in strength.

For a classification of the ROARMAP policies in terms of WHERE and WHEN to deposit, and whether the deposit is REQUIRED or REQUESTED, see El CSIC’s (la Universitat de Barcelona, la Universitat de València & la Universitat Oberta de Catalunya) MELIBEA.

For analyses of mandate strength and effectiveness, see:

Gargouri, Y., Lariviere, V., Gingras, Y., Brody, T., Carr, L., & Harnad, S. (2012). Testing the finch hypothesis on green OA mandate ineffectiveness. arXiv preprint arXiv:1210.8174.

Gargouri, Yassine, Larivière, Vincent & Harnad, Stevan (2013) Ten-year Analysis of University of Minho Green OA Self-Archiving Mandate (in E Rodrigues, A Swan & AA Baptista, Eds. Uma Década de Acesso Aberto e na UMinho no Mundo).

Further analyses are underway. For those interested in analyzing the growth of OA mandate types and how much OA they generate, ROARMAP and MELIBEA, which index OA policies, can be used in conjunction with ROAR and BASE, which index repository contents.

“Leave providing the OA to us…”

If there is any party whose interests it serves to debate the necessary and sufficient conditions for calling an institutional or funder OA policy an OA “mandate,” it’s not institutions, funders or OA advocates, whose only concern is with making sure that their policies (whatever they are called) are successful in that they generate as close to 100% OA as possible, as soon as possible.

The boundary between a mandate and a non-mandate is most definitely fuzzy. A REQUEST is certainly not a mandate, nor is it effective, as the history of the NIH policy has shown. (The 2004 NIH policy was unsuccessful until REQUEST was upgraded to REQUIRE in 2007.)

But (as our analyses show), even requirements come in degrees of strength. There can be a requirement with or without the monitoring of compliance, with or without consequences for non-compliance, and with consequences of varying degrees. Also, all of these can come with or without the possibility of exceptions, waivers or opt-outs, which can be granted under conditions varying in their exactingness and specificity.

All these combinations actually occur, and, as I said, they are being analyzed in relation to their success in generating OA. It is in the interests of institutions, funders and OA itself to ascertain which mandates are optimal for generating as much OA as possible, as soon as possible.

I am not sure whose interests it serves to ponder the semantics of the word “mandate” or to portray as sources of “errors and misinformation” the databases that are indexing in good faith the actual OA policies being adopted by institutions and funders.

(It is charges of “error and misinformation” that sound a bit more like propaganda to me, especially if they come from parties whose interests are decidedly not in generating as much OA as possible, as soon as possible.)

But whatever those other interests may be, I rather doubt that they are the ones to be entrusted with indexing the actual OA policies being adopted by institutions and funders — any more than they are to be entrusted with providing the OA.

§ § § §

I think it is not only appropriate but essential that services like the University of Southampton’s ROAR, ROARMAP, the Universities of Barcelona, Valencia and Catalunya’s MELIBEA and University of Bielefeld’s BASE are hosted and provided by scholarly institutions rather than by publishers. I also think the reasons for this are obvious.

Buzz Me Baby: Unusual Courtship Songs for Valentine’s Day

 We heart the Ostrinia nubilalis

When most people think of Valentine’s Day, images of love, candy, and flowers pop to mind.  However, this Valentine’s Day, we thought we’d share two animals with you that use scales, wings, and other things to create songs that attract that special someone.

Moth Melodies

Male moths use a combination of pheromones and ultrasound—sound with frequencies above the range of human hearing—to woo females. To better understand moth sounds during courtship, researchers in this PLOS ONE study recorded and examined the ultrasounds emitted by three types of grass moths. They found that two of the three moth species had sex-specific wing and thoracic scales that played a role in ultrasound production, and that using these scales increased mating success. This audio clip is the recorded ultrasound of Ostrinia nubilalis (pictured above), aka the European corn borer, slowed down 10 times so that human ears can hear it.
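Slowing a recording "ten times" can be done by simply rewriting the file's declared sample rate: the samples are untouched but played back ten times more slowly, so a 50 kHz ultrasound drops to an audible 5 kHz. The study's actual processing pipeline is not described here; this is a minimal sketch using Python's standard wave module:

```python
import wave

def slow_down(src_path, dst_path, factor=10):
    """Copy a WAV file with its playback rate divided by `factor`.

    The audio data is unchanged; only the declared sample rate drops,
    so both speed and pitch fall by the same factor.
    """
    with wave.open(src_path, "rb") as src:
        params = src.getparams()
        frames = src.readframes(src.getnframes())
    with wave.open(dst_path, "wb") as dst:
        dst.setnchannels(params.nchannels)
        dst.setsampwidth(params.sampwidth)
        dst.setframerate(params.framerate // factor)
        dst.writeframes(frames)
```

Pitch falls along with speed, which is exactly what is wanted here: the moth's ultrasonic frequencies land inside the human hearing range.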

Wasp Chorus

Cotesia Wasp

Rapid wing fanning is the attraction tool of choice for male wasps when courting females. According to this PLOS ONE study, parasitic wasp wing fanning has been studied before, but the mechanism for how the sound is generated has not.  The researchers characterized the wasp songs and found that they contain a two-part signal with sequences of buzzes and boing sounds. While scientists could characterize  the male courtship songs, how they produce the sound remains a mystery. This audio clip starts with wing fanning, which produces a buzz sound, and is followed by a series of boing sounds.

Whether you choose to scale, buzz, or boing to impress your mate with beautiful music, we wish you a Happy Valentine’s Day from PLOS ONE!


Citations: 

Takanashi T, Nakano R, Surlykke A, Tatsuta H, Tabata J, et al. (2010) Variation in Courtship Ultrasounds of Three Ostrinia Moths with Different Sex Pheromones. PLoS ONE 5(10): e13144. doi:10.1371/journal.pone.0013144

Bredlau JP, Mohajer YJ, Cameron TM, Kester KM, Fine ML (2013) Characterization and Generation of Male Courtship Song in Cotesia congregata (Hymenoptera: Braconidae). PLoS ONE 8(4): e62051. doi:10.1371/journal.pone.0062051

Image Credits:

Photo of Ostrinia nubilalis by dhobern. Heart added by us.

Dorsal view of one pair of wings of a male Cotesia congregata. Figure 8. doi:10.1371/journal.pone.0062051.g008

The post Buzz Me Baby: Unusual Courtship Songs for Valentine’s Day appeared first on EveryONE.

Have scientists finally got angry enough to rebel against publishers?

Richard Smith (http://blahah.net/about.html) has posted a very brave piece about how to create a revolution to change the process of scholarly publishing: http://blahah.net/2014/02/11/knowledge-sets-us-free/.

Before I start: I know Richard, and when his ideas are unembargoed I will enthusiastically blog about his marketplace for software and scientists. I am also personally delighted that in the short time the OKF (I should say Keren Limor, of course) has been running Open Science discussions at the Panton Arms in Cambridge we have had massively important meetings. I missed Richard’s – but think it will be scene-changing – and I missed this one.

I’ve copied Richard’s post in full and comment here…

First. YES!

Finally the moral unacceptability of TA-STM publishing has hit the modern world. The good news is that the technology is now so powerful that if we want to change it we can. It won’t be pretty and it won’t be predictable but it’s possible.

I have blogged before on the role of civil disobedience: breaking formal law to promote a higher moral good. It’s been a critical force in many countries over millennia. It’s always risky and people may suffer. The important things are:

* is there a compelling moral case?

* is there a likelihood of making change happen?

The second is optional. A moral case is good enough, but it can be very lonely. But if you can change the hearts and minds of enough people, then change can be rapid.

So, second: I am with you. I want to disrupt the system. I’m currently doing it in a parallel and complementary way. It might be judged illegal and I am prepared to take that risk. It’s undoubtedly moral. Until you gave the lead I didn’t have any authority – it’s not for my generation to tell yours what to do, but to support it when it acts.

The key things are critical mass, simple coherent aims, and irresistible technology.

 


Written by: Richard Smith. Last updated: 2014-02-11 17:15:00 -0800

Last night at Open Research Cambridge, Jelena Aleksic gave a great talk about Open Access. In her closing comments, she floated the idea of an iTunes for scientific papers. Imagine being able to get any scientific paper for 79p. That’s a reasonable price to cover the costs of creating, archiving and distributing knowledge (given the research is already funded). Most people can afford it.

Current prices – $32 for one-off access to a Nature paper – are disgusting. Scientists created that knowledge, probably with public funding, then a team of other scientists peer reviewed it without getting paid, and Nature wants to make $32 from imprisoning it on their website? Fuck you, Nature.

Music shows us how to set knowledge free

If we can possibly bring about a situation where knowledge comes at cost price (~79p), or better – free – at the point of consumption, it’s our moral imperative to do so. To do that we have to destroy traditional publishing. No small task. But we can take lessons from how the music industry was transformed.

Ubiquitous music piracy broke the stranglehold traditional record companies had over listening to music. In the late 1990s CDs were £10-20. If you wanted to hear a particular song you had to buy a load of other songs at the same time and wait for them to be delivered to your house on a plastic disc. Then software made it trivially easy to pirate music, and in the last ten years record companies have been forced to change their business models to match the needs of their consumers. Now we can buy any song for pennies, and if someone can’t afford it, it’s easy to get it free.

When it becomes trivial to pirate scientific papers whilst being very difficult to trace the source of piracy, and at the same time it becomes very easy to search and acquire pirated papers, the tyranny of publishers will be over.

A vision of the future

Let’s imagine what that utopian world would look like by examining a few scenarios:

1. A student/researcher has a library of hundreds or thousands of PDFs and associated metadata in their reference manager. They wish that knowledge was free.

They fire up the Liberator software and hit a button. Their reference manager database is anonymised and liberated into an online, distributed repository.

2. A student/researcher is browsing a journal’s website.

They are running the Liberator browser plugin that grabs every paper linked from every page they visit and anonymously sends it to the open repository network.

3. Anyone wants to read a paper.

They go to one of dozens of websites that let you search the distributed network of papers that have been liberated. They can download the paper and the data, and link out to open peer reviews from a variety of sites to enable them to judge the quality of the research for themselves.

4. A student/researcher wants to liberate their entire subject.

The Liberator connects to the distributed network and gets the list of papers already liberated – the “free list”. When the user connects to the internet at their library, the Liberator compares the free list to what’s available for access via the library. It starts anonymously crawling the publishers’ sites and liberating papers that aren’t free yet.

5. A citizen wants to contribute to freeing all knowledge.

They visit a University library that has free public internet access, and deposit a tiny box in a discreet place. The box contains a Raspberry Pi with a USB wi-fi plug. The Raspberry Pi is running the Liberator, and starts crawling and setting papers free.

Perhaps they don’t live near a University library. They visit anywhere with public internet and deposit their Raspberry Pi. It connects to a decentralised database of hacked or donated student library logins and begins crawling, liberating.

Perhaps there’s no public wi-fi near them. They run the Liberator in Tor mode, and it anonymously crawls from the safety of their home using the login database to gain access.

6. Someone (most likely, some consortium of publishers) attacks the network. They use court orders to take search sites offline and have servers shut down.

Within seconds the network has recovered – mirror sites are pre-arranged to launch when others go down. All the data is held distributed around the world and cannot be destroyed without destroying the world’s computers.

The future is now

Sounds rosy, huh? Nobody gets hurt, and the whole of human knowledge becomes free. Publishers can’t stop it: their customers are the Universities. They can’t cut them off without cutting off their own income stream, and the Universities have already paid for all these papers. Millions of people running the Liberator are righteous leeches. They bleed the publishers to death. This allows us to rebuild knowledge archival and distribution for the modern era using open processes.

The cool thing is, this is all technically possible to achieve using tools that already exist, or that could be rapidly developed. I propose the following set of software to make this happen:

  1. The Liberator: A web crawler that scrapes publishers’ websites and submits papers along with all their supplementary files and metadata to the Liberator network. The scraping uses Zotero’s community-maintained translators, which already cover all the major publishers and many minor ones. It doesn’t duplicate effort – it checks whether a paper is already free before liberating it. It can securely update over-the-air to add workarounds when publishers start trying to block the crawler. It can also find databases from all commonly used reference managers and anonymise and liberate their contents.
  2. A torrent tracker with features that allow effective search and display of scientific papers. It produces RSS feeds that contain random subsets of the new papers in the network, so that new papers are evenly spread out around all seeders without anyone having to host all papers. These are minor modifications to existing open source torrent trackers, like Gazelle.
  3. Browser plugins that run Liberator whenever an academic publisher’s website is visited. This is a trivial extension to the Zotero connector plugins.

If you want to help make this happen, go here and start talking (anonymously, if you like).

 

Reply to Richard van Noorden

[Note I have switched laptops and this has caused delay – also I cannot yet do formatting].

Earlier this week I strongly criticised Nature News and Richard van Noorden (http://blogs.ch.cam.ac.uk/pmr/2014/02/10/natures-recent-news-article-on-text-and-data-mining-was-an-unacceptable-marketing-exercise-i-ask-them-to-renounce-licensing/) for a post about Elsevier’s click-through licences. My concern was that the article was – [I agree not intentionally, and I withdrew the “marketing”] – supportive of Nature’s business interests. Richard has replied; I’ll set the scene first and then respond to specific points.

I look to Nature News as a reliable source of scientific news and comment (unlike, say, the UK’s D**ly M**l). I suspect that many readers, including me, glance at the headlines and the first paragraph and then move on. So I read:

Elsevier opens its papers to text-mining

Researchers welcome easier access for harvesting content, but some spurn tight controls.

and the first paragraph…

Academics: prepare your computers for text-mining. Publishing giant Elsevier says that it has now made it easy for scientists to extract facts and data computationally from its more than 11 million online research papers. Other publishers are likely to follow suit this year, lowering barriers to the computer-based research technique. But some scientists object that even as publishers roll out improved technical infrastructure and allow greater access, they are exerting tight legal controls over the way text-mining is done.

I suspect that most readers would see this as a statement of a fait accompli. It’s going to happen the way the publishers say. Yes, a few people are carping; but the world is moving ahead.

Nature has a vested interest in seeing this happen. For whatever reasons it supports the STM publishers in their intention to offer licences for content mining. Note that this is not the result of a negotiation – it is a unilateral move by the publishers. And it’s totally opposed by all major academic bodies and library organisations as I detailed.

This is not the only case where a publisher’s interests have coincided with a favourable story.

* Science Magazine did a “study” “showing” that Open Access peer-review was flawed.

Who’s Afraid of Peer Review?

A spoof paper concocted by Science reveals little or no scrutiny at many open-access journals. [PMR: Note Science appears to have a significant business interest in keeping the Toll-Access status quo]

* Taylor and Francis “surveyed” 71K readers and reported that they preferred CC-NC licences over CC-BY. [PMR Note: T+F have an apparent business advantage in restricting APC licences to NC]

* and here NPG have an interest in licensing TDM rather than accepting copyright extensions.

My concerns with the piece were that it gave a completely unbalanced view. Richard notes, and I agree, that elsewhere he has reviewed the case for copyright reform, but it was not in the current piece. A casual reader would not go searching for history, but assume that the licence issue was relatively uncontroversial.

Nature News wields great power. It is therefore critical that where it has vested interests they are made clear.

The same story could have been reported very differently (e.g. by Alok Jha or George Monbiot of the Guardian). An organisation critical of TA-STM publishers might have written:

“Elsevier ignore coming copyright reform and create de facto approach to licensing”

“In an attempt to forestall coming legislation which would make content minable by all scientists, Elsevier has rushed through a licence scheme to persuade scientists that they can content-mine their journals. Other publishers seem likely to follow. But our experts showed that the licence was designed to protect the publisher’s business interests rather than assist the researcher – who might unwittingly end up in court.”

Same story – different emphasis. It was critical that NN stayed objective and I don’t think it did.

Detailed comments:

Dear Peter,

I believe my article was fair, giving representation to pro- and anti- sides in this debate.

Agreed there were two sides, but not that one was highly favourable to Nature.

Let’s dig into the detail: you suggest that the article was ‘biased reporting’ which ‘purports to be news’ and was ‘effectively an attempt … to promote publisher licenses as a benefit to science’. My article does not intend to make a case for or against publisher licenses. It is, quite simply, reporting: explaining what has happened, and how scientists reacted to Elsevier’s new policy (which was, of course, news).

I should rephrase. I do not question your motivation. FWIW I also listened for an hour to John Bohannon and believed he was sincere. But the overall impression is a news story which is supportive of Science’s (and here NPG’s) interests.

Far from a bias for publishers’ licenses, the article clearly states the objections that you raise against the license approach. The introduction says that ‘some scientists object that even as publishers roll out improved technical infrastructure and allow greater access, they are exerting tight legal controls over the way text-mining is done’. The final three paragraphs explain precisely the complaints that some researchers have with the way publishers are setting license-controls on text-mining activity, leaving the reader with Ross Mounce’s criticisms.

“Some scientists” is far too weak. It should be replaced with “major national scientific societies, major funders, and international library organisations are all absolutely opposed to licences”.

On the other hand, for all you might disagree with them, it is a fact that other scientists I spoke to – including Max Haeussler, who has been very critical of Elsevier in the past – were pleased about the API and the click-through license. They told me that this would open up TDM opportunities, albeit under restrictive conditions (conditions that the article explains). I had, as you know, contacted you for your reaction too. You pointed me to your first blog (written before your more detailed analysis, which wasn’t available at the time), and I judged that Ross Mounce had already provided a voice for that view in the article.

I have explained that some scientists would welcome this – that does not mean it’s acceptable. Have any of the scientists been asked “are you happy to answer in court if you impinge on Elsevier’s business interests?” or “is your library happy with the licence, or may you be disciplined?”? This is no more reliable than T+F’s 71K readers.

It is particularly bewildering that you accuse Nature of “failing to report any of the Licenses4Europe discussion”, and ask for “a balanced account of the Licenses4Europe story”.

For as far as I am aware, Nature is the *only* mainstream media venue to have reported the Licenses4Europe issues. In March 2013, I covered the clash between scientists and publishers over licenses, and in June 2013, further reported on the divisions rife in the European Commission TDM discussions. What’s more, two years ago I wrote the first media coverage of Max Haeussler’s struggles to get permission from Elsevier to text-mine for biological sequences.

I didn’t consider that the Licenses4Europe discussion needed to be explained again in this article: for I had already explained the argument that ‘the right to read is the right to mine’, and noted that the European Commission was examining the issue. Of course, all the relevant previous coverage is linked to at the end of the story.

The problem is that the casual reader will not know the history and will regard the links as superfluous detail.

Where does this discussion of bias and reporting balance leave us? Your critique helps me think carefully about how I’m reporting my stories for our readers. And your campaigning is bringing the issue to wider attention; I’ll be as interested as you are to see how NPG responds to your call for the company to ‘publicly renounce the use of licenses to control TDM’. Your examination of Elsevier’s detailed legal terms is also very useful. So, broadly, I welcome your letter.

Thank you. And I welcome your critique here. If this gets a different response from NPG over TDM licences (and Nature is well placed to give one), it will have been worthwhile.

Except this: you have conflated your antipathy to NPG’s (and other subscription publishers’) TDM policies, with the incorrect accusations that the reporting in Nature was an attempt to promote publisher licenses, and was somehow ‘marketing … under the guise of news’. I’m pleased that you have already retracted your implication that I was involved in a marketing exercise. I hope that in future you’ll keep separate your critiques of my reporting, from your critiques of NPG policies.

I have retracted the assertion that it was deliberate. There is, however, a danger that any large institution becomes corporatist and institutionalist, and I think publishers have to be particularly careful.

Richard.

I have high regard for almost everyone I have met in NPG – Philip Campbell, Timo Hannay, the New Technology Group and now Digital Science, and the Blogs and ScienceOnline teams. I would not say the same about other TA-STM publishers. But I think Nature – as an organisation – has to be very aware of its roots in the community.

 

It’s Not Easy Being Green: Assessing the Challenges of Urban Community Gardening

[Image: an urban community garden in San Francisco]

From vertical gardens to succulent gardens to community veggie gardens like the San Francisco garden pictured above, city dwellers all around us have started embracing their (hopefully) green thumbs.  For urbanites in particular, community gardening provides us with much needed “outside time” with likeminded individuals, with the added gift of hyper-local produce available throughout the growing season. These benefits have led to increases in residential and community garden participation in major cities across the US.

While many people are jumping on the garden-fresh bandwagon to reap the obvious, verdant benefits, it is important to consider the potential side effects that come alongside urban farming. Urban soil is not only closer to possible sources of pollution, like traffic and industrial areas, but could also contain residual chemicals from past land use. Residential land previously occupied by industrial buildings has been found to contain dangerous levels of toxins like lead, which can poison residents and contaminate food grown on-site. But it doesn’t take a former factory to contaminate your backyard. Soil can absorb and hold toxins left over from something as small as a previous homeowner’s dumping of cleaning water down the drain or off the back porch.

Researchers from Baltimore published an article in PLOS ONE earlier this month assessing Baltimore community gardeners’ knowledge of soil contamination risks and exploring what steps can be taken to mitigate the dangers of urban pollution in urban gardens.

The authors, hailing from Johns Hopkins, University of Maryland, and the Community Greening Resource Network, conducted interviews with members of Baltimore’s community gardens and found that, unfortunately, the gardeners generally had low levels of concern about potential contaminants in their soil. Those working in established community gardens were least concerned, as they often assumed that any issues with soil contamination had been addressed in the early days of the garden’s use.

Participants listed lead as the most concerning pollutant—likely due to city interventions concerning lead poisoning—with 66% of surveyed gardeners mentioning it as something that would concern them if found in their soil. The study results also indicate that gardeners are more worried about the presence of pesticides and other added chemicals than most other residual chemicals in the soil. Soil quality and fertility even took greater precedence for some gardeners than the presence of contaminants.

By interviewing Baltimore officials knowledgeable about community gardening practices and soil contamination issues, the researchers determined key steps in assuring the safety of gardening sites. Above all, officials suggested the creation of a central source of information related to soil contamination concerns. Similar projects relating to regulation and urban agriculture are already underway in places like Los Angeles, though these resources aim to help residents navigate the maze of confusing legislation related to urban agriculture, and focus less on providing information on how to evaluate the safety of specific plots of land.

The authors suggest other important ways to determine the safety of a garden site, including learning about the site’s past uses and testing the soil for lingering chemicals, both of which might not seem necessary to those untrained in urban planning or chemical analysis. They also recommend that officials in urban areas provide services that will encourage use of these tools and help gardeners find and interpret the results of soil testing or historical research.

In the meantime, the authors suggest limiting exposure to potentially contaminated land. For instance, we should minimize contact with dirt from garden sites by washing our hands and taking off shoes before entering any indoor spaces. Many interviewed gardeners have tried to mitigate this problem by using raised beds, which they believe eliminates concern about contaminants in homegrown vegetables. However, researchers find this method ineffective, and it should not be seen as a fix-all. Raised beds do not prevent contamination from soil around the beds, which can still be ingested or tracked into the home, and surrounding pollutants have been known to blow into beds or seep into the soil from treated wood used to build the structures.

Urban community gardening is a trend that is here to stay, and we have it to thank for fresher local produce, greener surroundings, a greater sense of community, and for the physical, and sometimes therapeutic, activity it provides. The potential dangers associated with gardening in urban areas probably do not outweigh the benefits, as long as gardeners remain diligent and become better informed. Though the study focused on a limited group, its findings draw attention to the fact that many gardeners are not yet well informed. So, next time you’re digging into a grassy patch in your backyard with visions of veggies or working in your local community garden, take a minute to think about what you know about your area, discuss past developments with longtime residents, and above all, clean up afterward.

More information on soil testing and good gardening practices can be found on this site from the EPA.

Citation: Kim BF, Poulsen MN, Margulies JD, Dix KL, Palmer AM, et al. (2014) Urban Community Gardeners’ Knowledge and Perceptions of Soil Contaminant Risks. PLoS ONE 9(2): e87913. doi:10.1371/journal.pone.0087913

Image: Tenderloin People’s Garden by SPUR


Two Shark Studies Reveal the Old and Slow

Sharks live in the vast, deep, and dark ocean, and studying these large fish in this environment can be difficult. We may have sharks ‘tweeting’ their location, but we still know relatively little about them. Sharks have been on the planet for over 400 million years and today there are over 400 species of sharks, but how long do they live, and how do they move? Two recent studies published in PLOS ONE have addressed some of these basic questions for two very different species of shark: great whites and megamouths.

The authors of the first study looked at the lifespan of the great white shark. Normally, a shark’s age is estimated by counting growth bands in its vertebrae (image 1), not unlike counting rings inside a tree trunk. Unfortunately, these bands can be difficult to differentiate in great whites, so the researchers instead dated the radiocarbon found in them. You might wonder where this carbon-14 (14C) came from: believe it or not, radiocarbon was deposited in the vertebrae of northwestern Atlantic sharks when thermonuclear bombs were tested during the ‘50s and ’60s, so the bands carry a datable signature. Based on the ages of the sharks in the study, the researchers suggest that great whites may live much longer than previously thought. Some male great whites may even live to be over 70 years old, which may qualify them as one of the longest-living shark species. While these new estimates are impressive, they may also help scientists understand how threats to these long-living sharks may impact the shark population.

A second shark study analyzed the structure of a megamouth shark’s pectoral fin (image 2) to understand and predict its motion through the water. Discovered in 1976, the megamouth is one of the rarest sharks in the world, and little is known about how it moves through the water. We do know that the megamouth lives deep in the ocean and is a filter feeder, moving at very slow speeds to filter out a meal with its large mouth. But swimming slowly is difficult, in much the same way that flying an airplane slowly is difficult: sharks need speed to control lift and movement.

To better understand the megamouth’s slow movement, the researchers measured the cartilage, skin histology, and skeletal structure of the pectoral fins of one female and one male megamouth shark, caught accidentally and preserved for research. The researchers found that the megamouth’s skin was highly elastic, and its cartilage was made of more ‘segments’ than that of any other known shark, which may provide added flexibility compared to other species. The authors also suggest that the joint structure (image 3) of the pectoral fin may allow forward and backward rotation, motions that are largely restricted in most sharks. This flexibility and mobility of the pectoral fin may be specialized for controlling body posture and depth at slow swimming speeds, in contrast to the fins of fast-swimming sharks, which are generally stiff and immobile.

In addition to the difficulties in exploring deep, dark seas, small sample sizes present challenges for many shark studies, including those described here. But whether studying the infamous great white shark or one of the rare megamouths, both contribute to a growing body of knowledge of these elusive fish.

Citations: Hamady LL, Natanson LJ, Skomal GB, Thorrold SR (2014) Vertebral Bomb Radiocarbon Suggests Extreme Longevity in White Sharks. PLoS ONE 9(1): e84006. doi:10.1371/journal.pone.0084006

Tomita T, Tanaka S, Sato K, Nakaya K (2014) Pectoral Fin of the Megamouth Shark: Skeletal and Muscular Systems, Skin Histology, and Functional Morphology. PLoS ONE 9(1): e86205. doi:10.1371/journal.pone.0086205

Image 1: doi:10.1371/journal.pone.0084006.g001

Image 2: doi:10.1371/journal.pone.0086205.g003

Image 3: doi:10.1371/journal.pone.0086205.g004


American Geophysical Union and Wiley Launch New Open Access Journal, Earth and Space Science

The American Geophysical Union (AGU) and John Wiley & Sons, Inc., today announced the creation of a new all open access peer-reviewed journal, Earth and Space Science.

Earth and Space Science, AGU’s second new open access journal in the last 12 months, is the only journal that reflects the expansive range of science represented by AGU’s 62,000 members, including all of the Earth, planetary, and space sciences, and related fields in environmental science, geoengineering, space engineering, and biogeochemistry.
 

Earth and Space Science joins a prestigious portfolio of research publications that are governed by AGU’s rigorous peer review process. This includes the highly ranked Geophysical Research Letters and Journal of Geophysical Research – Atmospheres, and Earth’s Future—an innovative open access publication that features trans-disciplinary research, editorials, and essays emphasizing the Earth as an interactive, evolving system under the influence of the human enterprise—which was successfully launched in late 2013.

The journal will publish articles under the Creative Commons Attribution License enabling authors to be fully compliant with open access requirements of funding organizations where applicable. The publication fee will be competitive with those of other broad open access journals.

A search is now underway for Earth and Space Science’s inaugural editor in chief, who will lead a team of preeminent academic editors who are closely connected to their communities. 

Additional information on Earth and Space Science is available at http://earthspacescience.agu.org.