Registration is now open for the Berlin 10 Open Access Conference!

Registration is now open for the Berlin 10 Open Access Conference, to be held at the Wallenberg Research Centre, Stellenbosch Institute for Advanced Study (STIAS), Stellenbosch, South Africa, from 7 to 8 November 2012. Registration is also open for Pre-conference Workshops to be presented on 6 November 2012. The theme of the Conference is Networked scholarship in a networked world: participation in Open Access. The programme will be made available on the Berlin 10 Open Access Conference website at www.berlin10.org in due course.

To register for the Conference and Pre-conference Workshops, please visit: http://www.lib.sun.ac.za/b10/register.html 

Please note that space is limited. Early Bird registration closes on the 15th of September 2012, and General Registration on the 24th of October 2012.

****************

The Berlin 10 Open Access Conference is being organised by Stellenbosch University, in collaboration with the Max Planck Society. Other partners include the Association of African Universities (AAU), the Academy of Science of South Africa (ASSAf) and UNESCO.

****************

The Berlin Declaration on Open Access to Knowledge in the Sciences and Humanities, issued in 2003 by international research, scientific, and cultural institutions, promotes the Internet as a medium for disseminating global knowledge. It has been signed by the leaders of over 300 research institutions, libraries, archives, museums, funding agencies, and governments from around the world. Signatories include the Max Planck Society (co-initiator and custodian of the declaration), CERN, the Chinese Academy of Sciences, Academia Europaea, Harvard University, and the International Federation of Library Associations.

The Berlin Open Access Conference Series supports the continued adoption and realisation of the principles of the declaration and has been hosted in Germany, Switzerland, England, Italy, France, China, and most recently the USA. Berlin 10 will mark the first such meeting to take place on the African continent. The programme will feature concrete steps taken by a variety of stakeholders to support Open Access and invite participants to consider additional actions that might be taken – including encouraging signatures to the Berlin Declaration on Open Access to Knowledge in the Sciences and Humanities. Please visit http://www.berlin10.org/call-to-action.html for more information.

*****************

For registration enquiries, please contact: 

Ellen Claasen ellen.claasen@mrc.ac.za


All other enquiries can be directed to:

Ina Smith ismith@sun.ac.za


More information on the registration process: http://www.berlin10.org/reg.html

Open access versus subscription journals: a comparison of scientific impact

Abstract

Background

In the past few years there has been an ongoing debate as to whether the proliferation of open access (OA) publishing would damage the peer review system and put the quality of scientific journal publishing at risk. Our aim was to inform this debate by comparing the scientific impact of OA journals with subscription journals, controlling for journal age, the country of the publisher, discipline and (for OA publishers) their business model.

Methods

The 2-year impact factors (the average number of citations to the articles in a journal) were used as a proxy for scientific impact. The Directory of Open Access Journals (DOAJ) was used to identify OA journals as well as their business model. Journal age and discipline were obtained from the Ulrich’s periodicals directory. Comparisons were performed on the journal level as well as on the article level where the results were weighted by the number of articles published in a journal. A total of 610 OA journals were compared with 7,609 subscription journals using Web of Science citation data while an overlapping set of 1,327 OA journals were compared with 11,124 subscription journals using Scopus data.

Results

Overall, average citation rates, both unweighted and weighted for the number of articles per journal, were about 30% higher for subscription journals. However, after controlling for discipline (medicine and health versus other), age of the journal (three time periods) and the location of the publisher (four largest publishing countries versus other countries) the differences largely disappeared in most subcategories except for journals that had been launched prior to 1996. OA journals that fund publishing with article processing charges (APCs) are on average cited more than other OA journals. In medicine and health, OA journals founded in the last 10 years are receiving about as many citations as subscription journals launched during the same period.

Conclusions

Our results indicate that OA journals indexed in Web of Science and/or Scopus are approaching the same scientific impact and quality as subscription journals, particularly in biomedicine and for journals funded by article processing charges.



Authors: Bo-Christer Björk1* and David Solomon2

1 Hanken School of Economics, Helsinki, Finland
2 College of Human Medicine, Michigan State University, East Lansing, MI, USA


BMC Medicine 2012, 10:73 doi:10.1186/1741-7015-10-73

The electronic version of this article is the complete one and can be found online at: http://www.biomedcentral.com/1741-7015/10/73

Background

Emergence and growth of open access

Over the last 20 years the publishing of scientific peer-reviewed journal articles has gone through a revolution triggered by the technical possibilities offered by the internet. Firstly, electronic publishing has become the dominant distribution channel for scholarly journals. Secondly, the low cost of setting up new electronic journals has enabled both scholars and publishers to experiment with new business models, where anybody with internet access can read the articles (‘open access’ or OA) and the resources required to operate journals are collected by means other than charging readers. Alternatively, increased availability can be achieved by scientists uploading the prepublication versions of their articles published in subscription journals to OA web repositories such as PubMed Central. The majority of publishers now allow some form of archiving in their copyright agreements with authors, sometimes requiring an embargo period. Major research funders such as the National Institutes of Health (NIH) and the Wellcome Trust have started requiring OA publishing from their grantees, either in open access journals (gold OA) or repositories (green OA). A recent study showed that 20.4% of articles published in 2008 were freely available on the web, in 8.5% of the cases directly in journals and in 11.9% in the form of archived copies in some type of repository [1].
In the latter half of the 1990s, when journals created by individual scientists dominated OA publishing, these journals were not considered by most academics to be a serious alternative to subscription publishing. There were doubts about both the sustainability of the journals and the quality of the peer review. These journals were usually not indexed in the Web of Science, and initially they lacked the prestige that academics need from publishing. Quite often their topics were related to the internet and its possibilities, as exemplified by the Journal of Medical Internet Research, which in 15 years has managed to become a leading journal in its field.
A second wave of OA journals consisted of established subscription journals, mainly owned by societies, whose publishers decided to make the electronic version of their journal(s) freely accessible. Such journals are particularly important in certain regions of the world, for example Latin America and Japan, where portals such as SciELO and J-STAGE host hundreds of journals at no cost to the publishers. One of the earliest journals to make its electronic version OA was BMJ, which since 1998 has made its research articles freely available.
The third wave of OA journals was started by two new publishers, BioMedCentral and Public Library of Science (PLoS). They pioneered the use of article processing charges (APCs) as the central means of financing professional publishing of OA journals. Since 2000 the importance of the APC business model for funding OA publishing has grown rapidly. BioMedCentral was purchased in 2008 by Springer and over the last couple of years almost all leading subscription publishers have started full open access journals funded by APCs. The leading scientific OA journals using the APC model tend to charge between US$2,000 and US$3,000 for publishing but overall the average APC was US$900 in 2010 across all journals charging APCs listed in the Directory of Open Access Journals [2]. In many fields the payment of such charges is a substantial barrier to submissions. In a broad survey of authors who had published in scholarly journals, 39% of respondents who hadn’t published in OA journals mentioned problems in funding article-processing fees as a reason [3].
Subscription publishers have also tried an OA option called hybrid journals where authors can pay fees (typically in the range of US$3,000) to have the electronic versions of their articles OA as part of what is otherwise a subscription journal. The uptake for hybrid journals in general has been very limited at about 1% to 2% for the major publishers [4].

Does OA threaten to undermine scientific peer review?

The starting point for this study is the claims made, often by publishers and publishers’ organizations, that the proliferation of OA would set in motion changes in the publishing system which would seriously undermine the current peer review system and hence the quality of scientific publishing. Suber has written an excellent overview of this discussion [5]. Lobbying using this argument has in particular been directed against government mandates for OA, such as the one implemented by the NIH for their grantees. It is claimed that the resulting increase in posting of manuscript copies to OA repositories would lead to wide-scale cancellation of subscriptions, putting traditional publishers, both commercial and society, in jeopardy and in the long run resulting in an erosion of scientific quality control. This scenario is based on the assumption that OA publishers would take over an increasing part of the publishing industry and would not provide the same level of rigorous peer review as traditional subscription publishers, which would result in a decline in the quality of scholarly publishing. The NIH has documented that its mandate has not in fact caused any harm to publishers [6].
The critique has in particular been focused on OA publishers that charge authors APCs. Superficially such publishers would seem to be inclined to accept substandard articles since their income is linearly dependent on the number of papers they publish. There have in fact been reports of some APC-funded OA publishers with extremely low quality standards [7]. Reports of such cases in the professional press such as the recent article ‘Open access attracts swindlers and idealists’ [8] in the Finnish Medical Journal, a journal read by the majority of practicing physicians in Finland, can by the choice of title alone contribute to a negative image of OA publishing. The founding of the Open Access Scholarly Publishers Association, which in particular strives to establish quality standards for OA journals, was in part a reaction by reputable OA publishers to the appearance of such publishers on the market.
One of the questions in the above-mentioned survey of scholarly authors [3] dealt with the ‘myths’ about open access, including the quality issue. On a Likert scale, researchers in general tended to disagree with the statements ‘Open access undermines the system of peer review’ and ‘Open access publishing leads to an increase in the publication of poor quality research’ (results reported in Figure 4; [3]). It thus seems that a majority of scholars, or at least of those who completed this very widely disseminated survey, did not share this negative perception of the quality of OA publishing.

Aim of this study

Scientific quality is a difficult concept to quantify. In general terms, very rigorous peer review procedures should raise the quality of journals by screening out low quality articles and improving manuscripts via the reviewers’ comments. In this respect one could assume that the novel peer review procedures used by certain OA journals such as PLoS ONE should lower the quality. However, such journals essentially leave it to the readers to affirm the quality through metrics such as the number of citations per article. In practice the only proxy for quality that is generally accepted and widely available across journals is citation statistics. In the choice of title for this article we have hence consciously avoided the term scientific ‘quality’ and chosen to use ‘impact’ instead, which is closely related to citations, as in the impact factor used in the Journal Citation Reports.
It has now been 20 years since the emergence of the first OA journals and 10 years since the launch of the first major OA journals funded by APCs. The number of peer-reviewed articles published in OA journals was already around 190,000 in 2009 and growing at the rate of 30% per annum [9]. Roughly half of the articles are published in journals charging APCs [2]. Enough time has also passed so that the qualitatively better OA journals, and in particular journals that have been OA from their inception, are now being indexed by major citation indexes such as the Web of Science and Scopus. In the last few years academic search engines such as Google Scholar have also emerged, but the data generated by these automated searches are too unstructured to be used for a study of the citation counts of large numbers of articles or full journals. In contrast, both the Journal Citation Reports (JCR) and Scopus (via the data available on the SCImago portal) provide aggregated data in the form of impact factors, which can be used for comparing OA and subscription journals.
This provides empirical data enabling us to ask meaningful questions such as: ‘How frequently are articles published in OA journals cited compared to articles in non-OA journals?’. Although the citation level cannot directly be equated with scientific quality, it is widely accepted as a proxy for quality in the academic world, and is the only practical way of getting comprehensive quantitative data concerning the impact of journals and the articles they contain. The aim of this study was thus to compare OA and subscription journals in terms of the average number of citations received both at the journal and the article level.

Earlier studies

Over the past 10 years there have been numerous studies reporting that scientific articles that are freely available on the internet are cited more frequently than articles only available to subscribers (for overviews see Swan [10] and Wagner [11]). Most of these studies have been conducted by comparing articles in subscription journals where some authors have made their articles freely available in archives. Gargouri et al. [12] found a clear citation advantage of the same size both for articles where the author’s institution mandated OA and for articles archived voluntarily. They also found that the citation advantage was proportionally larger for highly cited articles. Some authors claim that when eliminating factors such as authors selecting their better work for OA dissemination, the advantage, at least concerning citations in Web of Science journals, is low or even non-existent. Evans and Reimer, using extensive Web of Science data, report an overall global effect of 8% more citations, but with a clearly higher level of around 20% for developing countries [13]. Davis, in a randomized trial involving 36 mainly US-based journals, found no citation effect but a positive effect on downloads [14]. His study was, however, limited to high-impact journals with wide subscription bases.
Assuming that there is some level of citation advantage, this would mean that the articles published in full OA journals would receive an additional citation advantage, beyond their intrinsic quality, from their availability. In practice it would, however, be very difficult to separate out the effects of these two underlying factors. A share of the articles in subscription journals (approximately 15%) also benefits from increased citations due to the existence of freely available archival copies, as noted for instance by Gargouri et al. [12]. If there were a consensus on the size of the citation advantage of being freely available, it would be possible to correct for this effect. Since the estimates of this factor vary so much across studies, we are hesitant to attempt such a correction.
However, we don’t necessarily need to explicitly take this factor into account when assessing the quality level of the global OA journal corpus. If articles in them on average get as many citations as articles in subscription journals, then their overall scientific impact (as measured by getting cited) is also equal. OA is just one of several factors influencing the citation levels of particular journals, others being the prestige of the journals, the interest of the topics of the articles, the quality of the layout for easy reading, timeliness of publication and so on.
Journals that were launched as OA from relatively new publishers such as PLoS or BMC have disadvantages in other respects. They lack the established reputation of publishers that have been in business for decades. The reputation of these journals is also hindered by a large, though shrinking, number of researchers who believe that electronic-only OA journals are somehow inferior to their more established subscription counterparts. In this study we will therefore make no attempt to look separately at the citation effect of OA, due to the complexity of the issue and the lack of a reliable estimate of the effect.
There are a few previous studies that have tried to determine the overall quality of OA journal publishing as compared to traditional subscription publishing. McVeigh studied the characteristics of the 239 OA journals included in the 2003 Journal Citation Reports [15]. Her report contains very illustrative figures showing the positions of these journals in the ranking distribution within their respective scientific disciplines. Overall, OA journals were represented more heavily among the lower-ranking journals, but there were also 14 OA journals in the top 10% in their disciplines. She also mentions that 22,095 articles were published in these OA journals in 2003. In considering the results from this early study it is important to bear in mind the highly skewed regional and age distributions of the journals in question. Only 43% of the OA journals were published in North America or Western Europe, and the vast majority of the journals were old established journals that had recently decided to make their electronic content openly available.
Giglia [16] set out to duplicate the McVeigh study to the extent possible. Giglia was able to rely solely on the DOAJ index for information about which journals were OA and identified 385 titles to study, using the JCR from 2008 as the starting point. Giglia studied the distribution of titles in different percentiles of rank in their discipline using the same breakdown as McVeigh. All in all the results were not much different from the earlier study. Giglia found that 38% of the 355 OA journals in the Science Citation Index and 54% of the 30 OA journals in the Social Science Citation Index were in the top half ranks in the JCR.
Miguel et al. [17] focused on studying how well represented gold and green OA journals were in citation indexes. They were able to combine DOAJ data with data from the Scopus citation database, which covers more journals than the JCR, and could also use the average citation counts from the SCImago database. The results highlighted how OA journals have achieved a share of around 15% of all Scopus indexed journals for Asia and Africa and a remarkable 73% for Latin America. Of particular interest for this study was that some of the figures in the article showed the average number of citations per document in a 2-year window (calculated over journals) for particular journal categories. Thus the overall average number of citations was around 0.8 for OA journals, 1.6 for subscription journals allowing green posting and 0.8 for subscription journals not allowing green posting. They found highly differentiated average citation levels for nine different broad disciplines. They also found very clear differences in the citation levels between regions, with North American and European OA journals performing at a much higher level than journals from other parts of the world. Both in the disciplinary and regional breakdowns the non-OA journals followed the same patterns, so that the relative performance of OA journals to non-OA journals was relatively stable.

Methods

The data for this study were obtained from four databases. These included Ulrichsweb, Journal Citation Reports 2010 (JCR), SCImago Journal & Country Rank (SCImago), and the Directory of Open Access Journals (DOAJ). SCImago and DOAJ are openly available and provide their data in an easily downloaded format. Both our institutions have subscriptions to the electronic versions of Ulrichsweb and JCR, and it was possible to use our institutional access to these databases to obtain the information needed.
Ulrichsweb is a database of detailed information on more than 300,000 periodicals of all types. The JCR is the 2010 version of a database concerning the articles published and the citations received by the peer-reviewed journals indexed in the Web of Science citation index, a database of selected high quality scholarly journals maintained by Thomson Reuters. This study largely focuses on the average number of citations received by a journal over the most recent 2-year period, commonly called an impact factor. SCImago provides open access to similar citation metrics for journals included in the Scopus citation database maintained by Elsevier. Scopus is similar to Web of Science but provides data on a larger number of journals. The DOAJ is a database of open access journals that provides basic information about the journals as well as immediate unrestricted access to full text articles for some of these journals. Of these services, Web of Science, whose citation data are provided through the JCR, has the strictest inclusion criteria, followed by Scopus. The DOAJ accepts all journals that fulfill certain criteria concerning open accessibility and peer review, whereas Ulrichsweb is open for any journal to self-report its data.
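For readers unfamiliar with the metric, the sketch below shows how a JCR-style 2-year impact factor is calculated. It is a minimal illustration with made-up counts, not data or code from this study.

```python
# A minimal sketch of a JCR-style 2-year impact factor, using illustrative
# numbers rather than data from this study.

def two_year_impact_factor(citations_to_prev_two_years: int,
                           articles_prev_two_years: int) -> float:
    """Citations received in the census year to articles published in the
    two preceding years, divided by the number of citable articles
    published in those two years."""
    if articles_prev_two_years == 0:
        return 0.0
    return citations_to_prev_two_years / articles_prev_two_years

# Example: a journal whose 2008-2009 articles were cited 1,200 times in 2010
# and which published 400 citable articles in 2008-2009 has an impact factor
# of 3.0 in the 2010 JCR.
print(two_year_impact_factor(1200, 400))  # 3.0
```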
A limitation of this method is that journals not indexed in Web of Science or Scopus cannot be included, since there is no way to obtain their citation data in a systematic way. Google Scholar could be used to study citations to individual journals in that index, but the process is extremely labor intensive and cannot be performed for large numbers of journals.
Studies have shown a high degree of correlation between the citation metrics of the JCR and Scopus, although their absolute values differ. For instance, Pislyakov [18] studied the citedness of 20 leading economics journals using data from both the JCR and Scopus and found that the correlation between the impact factors of these two indexes was 0.93 (Pearson). Sicilia et al. [19] also found a strong correlation between the two measures for computer science journals. Hence either one provides a good measure of the level of citations.
We used this mix of sources because we needed a number of data items for our analysis that could not be obtained from just one database. Ulrichsweb was used to obtain the start year for each journal as well as the up to five discipline categories in which it was classified. It was also used to identify the country of origin of the publisher. Being listed in the DOAJ was used as an indicator of whether a journal was open access and to determine if a journal charged APCs. The JCR was used to obtain the 2-year impact factor for each journal as well as the number of articles published in it in the most recent year available in the report, 2010. SCImago was used to obtain the 2-year citation count divided by number of articles published for Scopus indexed journals (in essence similar to the JCR impact factor) and the number of articles published in 2011.
To create a merged data set for analysis we started with the Ulrichsweb database, first narrowing the database to only journals that were: abstracted or indexed, currently active, academic/scholarly, refereed, and formatted as online and/or in print.
We selected all journals within those limits that were listed in the following discipline categories (based on the discipline coding used by Ulrichsweb): arts and literature; biological science; business and economics; chemistry; earth, space and environmental sciences; education; mathematics; medicine and health; physics; social sciences; technology and engineering. While there were other disciplines categorized in Ulrichsweb, these in our view captured the major scholarly disciplines. Many journals were listed under multiple disciplines. We recorded each discipline listed for each journal. The maximum for any journal was five. The data were retrieved in January 2012.
We then merged data from the other three databases to the journals identified in Ulrichsweb using either the International Standard Serial Number (ISSN) or the Electronic International Standard Serial Number (EISSN) as the identifier. There were 23,660 journals identified in Ulrichsweb meeting the criteria within the 11 disciplines of which 12,451 (52.6%) were in the SCImago database as of January 2012, 8,256 (35.0%) were in the JCR 2010 and 2,530 (10.7%) were in the DOAJ as retrieved from their web site in August 2011.
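To illustrate this merging step, the following Python sketch joins hypothetical CSV exports of the four sources on the ISSN. The file and column names are assumptions made for the example, not the actual field names of Ulrichsweb, the JCR, SCImago or the DOAJ, and a further matching pass on the EISSN would be needed for journals that only match on the electronic identifier.

```python
# Hypothetical sketch of the database merge; file and column names are
# illustrative, not the actual exports used in the study.
import pandas as pd

ulrich = pd.read_csv("ulrichsweb.csv")    # issn, eissn, start_year, country, disciplines
jcr = pd.read_csv("jcr2010.csv")          # issn, impact_factor_2y, articles_2010
scimago = pd.read_csv("scimago2011.csv")  # issn, cites_per_doc_2y, articles_2011
doaj = pd.read_csv("doaj.csv")            # issn, has_apc

# Left-join the citation and OA data onto the Ulrichsweb journal list,
# using the print ISSN as the shared identifier.
merged = (ulrich
          .merge(jcr, on="issn", how="left")
          .merge(scimago, on="issn", how="left")
          .merge(doaj, on="issn", how="left"))

# A journal is treated as OA if it appears in the DOAJ extract.
merged["is_oa"] = merged["issn"].isin(doaj["issn"])
```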
Citation metrics of OA and subscription journals were analyzed in two different ways. Firstly, they were analyzed with journals as the unit of analysis, which was the level at which the data were retrieved from the four databases. Secondly, we estimated the citation metrics of the articles published. This was done by weighting the journal level citation metrics by the number of articles published in each journal per year, using article counts provided by the JCR and SCImago databases. This lends more or less weight to each journal based on the number of articles that were published within the journal. We feel this adds a new and important dimension to the analysis as compared to earlier studies.
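The distinction between the journal-level and the article-level (weighted) comparison can be sketched as follows, continuing the hypothetical merged table from the previous example; the column names remain assumptions.

```python
# Journal-level versus article-level citation averages (continuing the
# hypothetical 'merged' table from the previous sketch).
import numpy as np

# Journal-level average: every journal counts once, regardless of size.
journal_level = merged.groupby("is_oa")["impact_factor_2y"].mean()

# Article-level average: each journal's impact factor is weighted by the
# number of articles it published, so large journals count for more.
def article_weighted_mean(group):
    return np.average(group["impact_factor_2y"], weights=group["articles_2010"])

article_level = (merged
                 .dropna(subset=["impact_factor_2y", "articles_2010"])
                 .groupby("is_oa")
                 .apply(article_weighted_mean))

print(journal_level)
print(article_level)
```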
In the data collection and analysis process we found some problems with the SCImago data. The site allows downloading the basic article numbers and citation data for all journals as one Microsoft Excel file with the most current year’s data. The data on impact factors and numbers of articles were for 2011, but it seems that the article and citation counts are not complete for the full year, so that both the article numbers and impact factors are too low. This could easily be checked for individual journals, and it turned out that the impact factors for 2010 and preceding years were in most cases almost double the 2011 figures. A comparison with the journal level analysis in Miguel et al. [17] also pointed in the same direction. Unfortunately it was not possible to extract the older data for the over 12,000 journals in the study, so we were limited to using the 2011 data, which were incomplete.
We nevertheless feel that the analysis using Scopus data provides a useful triangulation with the JCR analysis. Provided that the undercounting for 2011 is systematic across all journals, with no differentiation between OA and subscription journals, the citation levels for OA versus subscription journals relative to each other should remain the same, although the absolute levels are lower. When comparing the numbers with the JCR-based ones, the proportions between OA and subscription citation rates were approximately the same in both sets, supporting the conclusions we later illustrate mainly with the JCR results.

Results

The results were calculated using 2-year average citations (impact factors) from the JCR and Scopus (via SCImago), by journal and weighted by the number of articles in each journal as described above. OA and subscription journals were compared by the time period when they were launched (pre-1996, 1996 to 2001, and 2002 to 2011), by the country of the publisher grouped into the four largest publishing countries (USA, UK, The Netherlands, and Germany) versus other countries, by scientific discipline (medicine and health versus other) and by business model (OA funded by APCs, OA not funded by APCs, and subscription).
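A sketch of how such a grouped comparison could be assembled is given below, continuing the hypothetical column names introduced in the Methods sketches rather than the study's actual code.

```python
# Hypothetical grouping of journals for the comparison; continues the
# 'merged' table and illustrative column names from the earlier sketches.
import pandas as pd

BIG_FOUR = {"United States", "United Kingdom", "Netherlands", "Germany"}

merged["launch_period"] = pd.cut(merged["start_year"],
                                 bins=[0, 1995, 2001, 2011],
                                 labels=["pre-1996", "1996-2001", "2002-2011"])
merged["country_group"] = merged["country"].map(
    lambda c: "big four" if c in BIG_FOUR else "other")
merged["discipline_group"] = merged["disciplines"].str.contains(
    "medicine and health", case=False, na=False).map(
    {True: "medicine and health", False: "other"})

def business_model(row):
    # Subscription, OA funded by APCs, or OA without APCs.
    if not row["is_oa"]:
        return "subscription"
    return "OA, APC-funded" if row["has_apc"] else "OA, no APC"

merged["model"] = merged.apply(business_model, axis=1)

# Mean 2-year impact factor for each cell of the comparison; article-weighted
# means would reuse the weighting helper from the previous sketch.
table = merged.groupby(["launch_period", "country_group",
                        "discipline_group", "model"])["impact_factor_2y"].mean()
print(table)
```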
Table 1 provides a comparison of the impact factors for OA and subscription journals based on journals in the JCR and Scopus databases. OA journals had impact factors that were approximately 76% and 67% as high as those of subscription journals in the JCR and Scopus respectively when analyzed by journal, and 73% and 62% when weighted for articles published. Due to our concerns about the Scopus data from the SCImago Journal and Country Rank site, outlined in the Methods section above, only JCR figures are presented and discussed below.
Table 1. The 2-year citation averages for open access versus subscription journals, calculated using Web of Science or Scopus data
Figure 1 shows the average JCR impact factor for OA and subscription journals weighted by the number of articles, as a function of the time period in which the journal was launched and the location of the publisher. The left side of the figure includes the journals from the four countries where most of the major society and commercial publishers are located. The publishers in these four countries account for approximately 70% of the journals in our sample. The right side of the figure includes journals published in the rest of the world.
Figure 1. Citation averages as a function of the journal start year for two regions. The figures are based on Web of Science and weighted by journal article volumes.
There are large differences in the impact factors between the two regions with the ‘big four’ on average having journals with significantly higher impact factors. Somewhat surprisingly in this region more recently launched journals tended to have higher impact scores than the older more established journals. This was true for both subscription and OA journals. In addition the difference in impact between OA and subscription journals narrows with time.
The pattern for journals from the rest of the world is quite different. While the overall number of journals published is much lower, the number of OA journals is actually quite high in the pre-1996 group, where OA journals have a clearly lower impact. This group largely consists of old established print journals, which at some stage have opened up their electronic versions. In the middle time period, OA journals were outperforming subscription journals, and in the youngest group they were on a par with subscription journals.

Effects of the discipline of the journals

Several studies have shown that gold open access journals have had a larger uptake in the biomedical fields [1,15], where authors usually have fewer problems in financing APCs and where many research funders also require some form of OA for the results. Figure 2 shows the average JCR impact factor of OA and subscription journals weighted by the number of articles as a function of the discipline. The journals were split into two groups. The first included journals with the Ulrichsweb discipline category ‘Medicine and Health’. All the other disciplines were combined into the second group.
Figure 2. Citation averages as a function of the journal start year for Medicine and Health versus all other disciplines. The figures are based on Web of Science and weighted by journal article volumes.
In medicine and health, the large difference in impact between OA and subscription journals seen in older journals essentially disappears among the journals launched after 2001. This probably reflects the emergence of high quality professional OA publishers such as PLoS and BioMedCentral that rely on APCs for funding. For the other disciplines, OA articles had considerably lower impact scores in journals launched before 1996 and in journals launched after 2001, but the average impact of OA articles in journals launched between 1996 and 2001 was essentially equal to the average impact of articles in subscription journals launched in the same period. In reviewing the raw data, the high average impact of the OA articles during this period was due to a handful of relatively high impact and high volume OA journals published by BioMedCentral, which had been classified as biological rather than medical journals.

Effects of the revenue model of OA journals

Figure 3 compares subscription journals, OA journals funded by APCs, and OA journals that do not charge APCs as a function of journal age. As noted above, the early OA journals were funded through volunteer effort and small subsidies, largely from universities. Beginning with BioMedCentral and PLoS in about 2001, a growing number of professional publishers have begun publishing OA journals, funding their operations by charging publication fees.
Figure 3. Citation averages for open access journals using article processing charges (APCs) versus those that are free to publish in for authors, compared to impact factors for subscription journals.
The impact of OA journals that are not funded by APCs is more or less the same irrespective of journal age, at about 1.25. The oldest age category consists mainly of print journals that have made their electronic versions freely available. The average impact of APC funded OA journals increased markedly in the period 1996 to 2001 and to a lesser extent in 2002 to 2011, nearly reaching the same level as subscription journals at about 3.2. We expect that the 89 APC funded journals launched before 1996 largely comprise subscription journals that converted to the APC model of OA publishing. A number of these journals are published by Hindawi, which did in fact transition from a subscription publisher to an OA publisher funded by APCs [20]. The other journals are published by a variety of publishers, universities, societies and other organizations from around the world.

Discussion

The distribution of OA journals over time periods and regions differs markedly from the corresponding distribution of subscription journals. OA journals are much more numerous in categories that have low overall impact factors, which may explain some of the difference in average impact between OA and subscription journals. Almost half (302) of all OA journals found in the JCR are journals started before 1996 and published in the ‘other countries’ region. While over 75% of the subscription journals found in the JCR were also launched before 1996, nearly 70% of subscription journals are from publishers in the four major publishing countries. As can be seen in Figure 1, across all age categories and for both OA and subscription journals, those published outside the four major publishing countries have substantially lower impact factors. While correlation is not necessarily causation, the location of the publisher appears to account for much of the difference in average impact between OA and subscription journals.
The vast majority of journals founded before 1996 that are listed in the JCR started as paper-based subscription journals. Those listed as OA must at some stage have made their electronic versions open access. Many of these are journals published by scientific societies and universities but at least in one case (Hindawi) a publisher converted their whole portfolio from subscription to OA.
Both in the leading publishing countries and in the rest of the world, older established journals that have made their electronic versions openly available have lower impact scores than their subscription counterparts. This is understandable since the large commercial publishers and the leading society publishers have usually refrained from opening up the e-content, BMJ being a notable exception. But for the newer journals, particularly in medicine and health, our results show that OA journals are performing at about the same level as subscription journals, in fact getting more citations in some subcategories.
For almost 15 years the quality of OA journals has been debated and questioned. In the early days of electronic journals, when hardly any startup OA journals were operated by reputable professional publishers, it was easy to understand the reluctance of scientists to submit their best manuscripts to OA journals and for research funders and university promotion and tenure committees to accept publishing in OA journals as on par with publishing in traditional subscription based journals. After the launch of professionally run high quality biomedical OA journals beginning in about 2000, the situation has changed. Today the funding mechanism of a journal is irrelevant in considering its quality. There are large numbers of both subscription and OA journals that are high quality and widely cited.
The development and increasing acceptance of the APC funding model for OA scholarly journals has spawned a group of publishers with questionable peer review practices that seem focused on making short-term profits by having low or non-existent quality standards. Unfortunately this has created some bad publicity for OA publishing. As this study demonstrates, this does not change the broad picture. Gold OA publishing has increased at a rate of 30% per year over the past decade [9] and in the last couple of years many major subscription publishers have started adding pure OA journals to their portfolios.
We believe our study of the quality of the OA journals indexed in either Web of Science or Scopus is the most comprehensive to date. The results indicate that the level of citations for older subscription-based OA journals, which have made their electronic versions openly available, is clearly lower than for the corresponding subscription journals. At the same time newly founded full OA journals compete on almost equal terms with subscription journals founded in the same period. OA articles published in medicine and health by publishers in the four largest publishing countries attract numbers of citations equal to those of subscription journals in these fields. Based on the evidence from earlier studies it is likely that part of the citations to the OA articles are due to the increased readership following from the open availability, but there is no way we can isolate the effect of this factor in our calculations, nor would this factor alone account for the increasing respect researchers are showing for these journals through their citations.
Criticism of OA journals has been directed in particular against journals funding their operations with APCs, claiming that this revenue model leads journals to lower their review standards in order to maximize their profits. While there is clearly a substratum of journals reflecting this phenomenon, there is also a growing number of high quality APC funded journals from reputable publishers that are on a par with their subscription counterparts.

Conclusions

In summary, gold OA publishing is rapidly increasing its share of the overall volume of peer-reviewed journal publishing, and there is no reason for authors not to choose to publish in OA journals just because of the ‘OA’ label, as long as they carefully check the quality standards of the journal they consider.

Competing interests

There are no competing financial interests. Both authors founded OA journals in the 1990s and are emeritus editors-in-chief. B-CB is a current and DS a former board member of the Open Access Scholarly Publishers Association.

Authors’ contributions

B-CB initiated the study and wrote most of the background sections of the article. DS collected the data from the different sources and made the calculations. Both authors participated equally in the analysis of the results and the drawing of conclusions.

Authors’ information

B-CB is professor of Information Systems Science at the Hanken School of Economics, Helsinki, Finland. DS is Professor of Medicine at the College of Human Medicine, Michigan State University, USA.

References

  1. Björk B-C, Welling P, Laakso M, Majlender P, Hedlund T, Guðnason G: Open access to the scientific journal literature: situation 2009. PLoS ONE 2010, 5:e11273.
  2. Solomon DJ, Björk B-C: A study of open access journals using article processing charges. J Am Soc Info Sci Technol, in press.
  3. Dallmeier-Tiessen S, Darby R, Goerner B, Hyppoelae J, Igo-Kemenes P, Kahn D, Lambert S, Lengenfelder A, Leonard C, Mele S, Nowicka M, Polydoratou P, Ross D, Ruiz-Perez S, Schimmer R, Swaisland M, van der Stelt W: Highlights from the SOAP project survey. What scientists think about open access publishing. http://arxiv.org/abs/1101.5260v2
  4. Dallmeier-Tiessen S, Goerner B, Darby R, Hyppoelae J, Igo-Kemenes P, Kahn D, Lambert S, Lengenfelder A, Leonard C, Mele S, Polydoratou P, Ross D, Ruiz-Perez S, Schimmer R, Swaisland M, van der Stelt W: Open access publishing – models and attributes. SOAP project report, Max Planck Society digital library; 2010. http://edoc.mpg.de/478647
  5. Suber P: Will open access undermine peer review? SPARC Open Access Newsletter, issue 113; 2007. http://www.earlham.edu/~peters/fos/newsletter/09-02-07.htm
  6. NIH: NIH Public Access Policy. http://publicaccess.nih.gov/public_access_policy_implications_2012.pdf
  7. Gilbert N: Editor will quit over hoax paper: computer-generated manuscript accepted for publication in open-access journal. Nature News 2009. http://www.nature.com/news/2009/090615/full/news.2009.571.html
  8. Järvi U: Open access attracts swindlers and idealists [in Finnish]. Finn Med J 2012, 67:666-667.
  9. Laakso M, Welling P, Bukvova H, Nyman L, Björk B-C, Hedlund T: The development of open access journal publishing from 1993 to 2009. PLoS ONE 2011, 6:e20961.
  10. Swan A: The open access citation advantage: studies and results to date. Technical report, School of Electronics & Computer Science, University of Southampton; 2010. http://openaccess.eprints.org/index.php?/archives/716-Alma-Swan-Review-of-Studies-on-Open-Access-Impact-Advantage.html
  11. Wagner A: Open access citation advantage: an annotated bibliography. Issues Sci Technol Librarianship 2010, 60. http://www.istl.org/10-winter/article2.html
  12. Gargouri Y, Hajjem C, Larivière V, Gingras Y, Carr L, Brody T, Harnad S: Self-selected or mandated, open access increases citation impact for higher quality research. PLoS ONE 2010, 5:e13636.
  13. Evans J, Reimer J: Open access and global participation in science. Science 2009, 323:1025.
  14. Davis P: Open access, readership, citations: a randomized controlled trial of scientific journal publishing. FASEB J 2011, 25:2129-2134.
  15. McVeigh M: Open access journals in the ISI citation databases: analysis of impact factors and citation patterns. Citation study from Thomson Scientific; 2004. http://science.thomsonreuters.com/m/pdfs/openaccesscitations2.pdf
  16. Giglia E: The impact factor of open access journals: data and trends. In Proceedings of the 14th International Conference on Electronic Publishing (ELPUB 2010), 16-18 June 2010, Helsinki, Finland. Edited by Hedlund T, Tonta Y. Hanken School of Economics; 2010:17-39. http://elpub.scix.net/cgi-bin/works/Show?102_elpub2010
  17. Miguel S, Chinchilla-Rodríguez Z, de Moya-Anegón F: Open access and Scopus: a new approach to scientific visibility from the standpoint of access. J Am Soc Info Sci Technol 2011, 62:1130-1145.
  18. Pislyakov V: Comparing two “thermometers”: impact factors of 20 leading economic journals according to Journal Citation Reports and Scopus. Scientometrics 2009, 79:541-550.
  19. Sicilia M-A, Sánchez-Alonso S, García-Barriocanal E: Comparing impact factors from two different citation databases: the case of Computer Science. J Informetrics 2011, 5:698-704.
  20. Peters P: Going all the way: how Hindawi became an open access publisher. Learn Pub 2007, 20:191-195.

Open access versus subscription journals: a comparison of scientific impact

Abstract

Background

In the past few years there has been an ongoing debate as to whether the proliferation of open access (OA) publishing would damage the peer review system and put the quality of scientific journal publishing at risk. Our aim was to inform this debate by comparing the scientific impact of OA journals with subscription journals, controlling for journal age, the country of the publisher, discipline and (for OA publishers) their business model.

Methods

The 2-year impact factors (the average number of citations to the articles in a journal) were used as a proxy for scientific impact. The Directory of Open Access Journals (DOAJ) was used to identify OA journals as well as their business model. Journal age and discipline were obtained from the Ulrich’s periodicals directory. Comparisons were performed on the journal level as well as on the article level where the results were weighted by the number of articles published in a journal. A total of 610 OA journals were compared with 7,609 subscription journals using Web of Science citation data while an overlapping set of 1,327 OA journals were compared with 11,124 subscription journals using Scopus data.

Results

Overall, average citation rates, both unweighted and weighted for the number of articles per journal, were about 30% higher for subscription journals. However, after controlling for discipline (medicine and health versus other), age of the journal (three time periods) and the location of the publisher (four largest publishing countries versus other countries) the differences largely disappeared in most subcategories except for journals that had been launched prior to 1996. OA journals that fund publishing with article processing charges (APCs) are on average cited more than other OA journals. In medicine and health, OA journals founded in the last 10 years are receiving about as many citations as subscription journals launched during the same period.

Conclusions

Our results indicate that OA journals indexed in Web of Science and/or Scopus are approaching the same scientific impact and quality as subscription journals, particularly in biomedicine and for journals funded by article processing charges.



Authors: Bo-Christer Björk1* and David Solomon2

1 Hanken School of Economics, Helsinki, Finland
2 College of Human Medicine, Michigan State University, East Lansing, MI, USA

For all author emails, please log on.

BMC Medicine 2012, 10:73 doi:10.1186/1741-7015-10-73

The electronic version of this article is the complete one and can be found online at: http://www.biomedcentral.com/1741-7015/10/73

Background

Emergence and growth of open access

Over the last 20 years the publishing of scientific peer-reviewed journal articles has gone through a revolution triggered by the technical possibilities offered by the internet. Firstly, electronic publishing has become the dominant distribution channel for scholarly journals. Secondly, the low cost of setting up new electronic journals has enabled both scholars and publishers to experiment with new business models, where anybody with internet access can read the articles (‘open access’ or OA) and the required resources to operate journals are collected by other means than charging readers. Similarly, increased availability can be achieved by scientists uploading the prepublication versions of their articles published in subscription journals to OA web repositories such as PubMed Central. The majority of publishers now allow some form of archiving in their copyright agreements with authors, sometimes requiring an embargo period. Major research funders such as the National Institutes of Health (NIH) and the Wellcome Trust have started requiring OA publishing from their grantees either in open access journals (gold OA) or repositories (green OA). A recent study showed that 20.4% of articles published in 2008 were freely available on the web, in 8.5% of the cases directly in journals and in 11.9% in the form of archived copies in some type of repository [1].
In the latter half of the 1990s when journals created by individual scientists were dominating OA publishing, these journals were not considered by most academics a serious alternative to subscription publishing. There were doubts about both the sustainability of the journals and the quality of the peer review. These journals were usually not indexed in the Web of Science, and initially they lacked the prestige that academics need from publishing. Quite often their topics were related to the internet and its possibilities, as exemplified by the Journal of Medical Internet Research, which in 15 years has managed to become a leading journal in its field.
A second wave of OA journals consisted of established subscription journals, mainly owned by societies. These publishers decided to make the electronic version of their journal(s) freely accessible. Such journals are particularly important in certain regions of the world for example, Latin America and Japan, where portals such as Scielo and J-stage host hundreds of journals at no cost to the publishers. One of the earliest journals to make its electronic version OA was BMJ, which since 1998 has made its research articles freely available.
The third wave of OA journals was started by two new publishers, BioMedCentral and Public Library of Science (PLoS). They pioneered the use of article processing charges (APCs) as the central means of financing professional publishing of OA journals. Since 2000 the importance of the APC business model for funding OA publishing has grown rapidly. BioMedCentral was purchased in 2008 by Springer and over the last couple of years almost all leading subscription publishers have started full open access journals funded by APCs. The leading scientific OA journals using the APC model tend to charge between US$2,000 and US$3,000 for publishing but overall the average APC was US$900 in 2010 across all journals charging APCs listed in the Directory of Open Access Journals [2]. In many fields the payment of such charges is a substantial barrier to submissions. In a broad survey of authors who had published in scholarly journals, 39% of respondents who hadn’t published in OA journals mentioned problems in funding article-processing fees as a reason [3].
Subscription publishers have also tried an OA option called hybrid journals where authors can pay fees (typically in the range of US$3,000) to have the electronic versions of their articles OA as part of what is otherwise a subscription journal. The uptake for hybrid journals in general has been very limited at about 1% to 2% for the major publishers [4].

Does OA threaten to undermine scientific peer review?

The starting point for this study are the claims made, often by publishers and publishers’ organizations, that the proliferation of OA would set in motion changes in the publishing system which would seriously undermine the current peer review system and hence the quality of scientific publishing. Suber has written an excellent overview of this discussion [5]. Lobbying using this argument has in particular been directed against government mandates for OA such as implemented by the NIH for their grantees. It is claimed that the resulting increase in posting of manuscript copies to OA repositories would lead to wide-scale cancellation of subscriptions putting traditional publishers, both commercial and society in jeopardy and in the long run result in an erosion of scientific quality control. This scenario is based on the assumption that the OA publishers would take over an increasing part of the publishing industry and would not provide the same level of rigorous peer review as traditional subscription publishers, which would result in a decline in the quality of scholarly publishing. The NIH have documented that their mandate has not in fact caused any harm to publishers [6].
The critique has in particular been focused on OA publishers that charge authors APCs. Superficially such publishers would seem to be inclined to accept substandard articles since their income is linearly dependent on the number of papers they publish. There have in fact been reports of some APC-funded OA publishers with extremely low quality standards [7]. Reports of such cases in the professional press such as the recent article ‘Open access attracts swindlers and idealists’ [8] in the Finnish Medical Journal, a journal read by the majority of practicing physicians in Finland, can by the choice of title alone contribute to a negative image of OA publishing. The founding of the Open Access Scholarly Publishers Association, which in particular strives to establish quality standards for OA journals, was in part a reaction by reputable OA publishers to the appearance of such publishers on the market.
One of the questions in the above-mentioned survey of scholarly authors [3], dealt with the ‘myths’ about open access, including the quality issue. On a Likert scale researchers in general tended to disagree with the statements ‘Open access undermines the system of peer review’ and ‘Open access publishing leads to an increase in the publication of poor quality research’ (results reported in Figure 4; [3]). It thus seems that a majority of scholars or at least those who completed this very widely disseminated survey did not share this negative perception of the quality of OA publishing.

Aim of this study

Scientific quality is a difficult concept to quantify. In general terms very rigorous peer review procedures should raise the quality of journals by screening out low quality articles and improving manuscripts via the reviewers’ comments. In this respect one could assume that the novel peer review procedures used by certain OA journals such as PLoS ONE should lower the quality. However, such journals essentially leave it to the readers to affirm the quality through metrics such as the number of citations per article. In practice the only proxy for the quality that is generally accepted and widely available across journals are citation statistics. In the choice of title for this article we have hence consciously avoided the term scientific ‘quality’ and chose to use ‘impact’ instead, which is closely related to citations such as in the impact factor used in Journal Citation Reports.
It has now been 20 years since the emergence of the first OA journals and 10 years since the launch of the first major OA journals funded by APCs. The number of peer-reviewed articles published in OA journals was already around 190,000 in 2009 and growing at the rate of 30% per annum [9]. Roughly half of the articles are published in journals charging APCs [2]. Enough time has also passed so that the qualitatively better OA journals and in particular journals that have been OA from their inception are now being indexed by major citation indexes such as the Web of Science and Scopus. In the last few years academic search engines such as Google Scholar have also emerged, but the data generated by these automated searches is too unstructured to be used for a study of the citation counts of large numbers of articles or full journals. In contrast both the Journal Citation Reports (JCR), and SCOPUS via the data available on the SCImago portal provide aggregated data in the form of impact factors, which can be used for comparing OA and subscription journals.
This provides empiric data enabling us to ask meaningful questions such as: ‘How frequently are articles published in OA journals cited compared to articles in non-OA journals?’. Although the citation level cannot directly be equated to scientific quality, it is widely accepted as a proxy for quality in the academic world, and is the only practical way of getting comprehensive quantitative data concerning the impact of journals and the articles they contain. The aim of this study was thus to compare OA and subscription journals in terms of the average number of citations received both at the journal and article level.

Earlier studies

Over the past 10 years there have been numerous studies reporting that scientific articles that are freely available on the internet are cited more frequently than articles only available to subscribers (for overviews see Swan [10] and Wagner [11]). Most of these studies have compared articles in subscription journals where some authors have made their articles freely available in archives. Gargouri et al. [12] found a clear citation advantage of the same size both for articles where the author’s institution mandated OA and for articles archived voluntarily. They also found that the citation advantage was proportionally larger for highly cited articles. Some authors claim that when factors such as authors selecting their better work for OA dissemination are eliminated, the advantage, at least concerning citations in Web of Science journals, is low or even non-existent. Evans and Reimer, using extensive Web of Science data, report an overall global effect of 8% more citations, but with a clearly higher level of around 20% for developing countries [13]. Davis, in a randomized trial involving 36 mainly US-based journals, found no citation effect but a positive effect on downloads [14]. His study was, however, limited to high-impact journals with wide subscription bases.
Assuming that there is some level of citation advantage, articles published in full OA journals would receive an additional citation advantage, beyond their intrinsic quality, from their availability. In practice it would, however, be very difficult to separate the effects of these two underlying factors. A share of the articles in subscription journals (approximately 15%) also benefits from increased citations due to the existence of freely available archival copies, as noted for instance by Gargouri et al. [12]. If there were a consensus on the size of the citation advantage of being freely available, it would be possible to correct for this effect. Since the estimates of this factor vary so much across studies, we are hesitant to attempt such a correction.
However, we don’t necessarily need to explicitly take this factor into account when assessing the quality level of the global OA journal corpus. If articles in them on average get as many citations as articles in subscription journals, then their overall scientific impact (as measured by getting cited) is also equal. OA is just one of several factors influencing the citation levels of particular journals, others being the prestige of the journals, the interest of the topics of the articles, the quality of the layout for easy reading, timeliness of publication and so on.
Journals that were launched as OA from relatively new publishers such as PLoS or BMC have disadvantages in other respects. They lack the established reputation of publishers that have been in business for decades. The reputation of these journals is also hindered by a large, though shrinking, number of researchers who believe that electronic-only OA journals are somehow inferior to their more established subscription counterparts. In this study we will therefore make no attempt to look separately at the citation effect of OA, due to the complexity of the issue and the lack of a reliable estimate of the effect.
There are a few previous studies that have tried to determine the overall quality of OA journal publishing as compared to traditional subscription publishing. McVeigh studied the characteristics of the 239 OA journals included in the 2003 Journal Citation Reports [15]. Her report contains very illustrative figures showing the positions of these journals in the ranking distribution within their respective scientific disciplines. Overall, OA journals were represented more heavily among the lower-ranking journals, but there were also 14 OA journals in the top 10% in their disciplines. She also mentions that 22,095 articles were published in these OA journals in 2003. In considering the results from this early study it is important to bear in mind the highly skewed regional and age distributions of the journals in question. Only 43% of the OA journals were published in North America or Western Europe, and the vast majority of the journals were old established journals that had recently decided to make their electronic content openly available.
Giglia [16] set out to duplicate the McVeigh study to the extent possible. Giglia was by then able to rely solely on the DOAJ index for information about which journals were OA and identified 385 titles to study, using the JCR from 2008 as the starting point. Giglia studied the distribution of titles across percentile ranks within their disciplines using the same breakdown as McVeigh. All in all, the results were not much different from the earlier study. Giglia found that 38% of the 355 OA journals in the Science Citation Index and 54% of the 30 OA journals in the Social Science Citation Index were in the top half of the ranks in the JCR.
Miguel et al. [17] focused on studying how well represented gold and green OA journals were in citation indexes. They were able to combine DOAJ data with data from the Scopus citation database, which covers more journals than the JCR, and could also use the average citation counts from the SCImago database. The results highlighted that OA journals have achieved a share of around 15% of all Scopus-indexed journals for Asia and Africa and a remarkable 73% for Latin America. Of particular interest for this study was that some of the figures in the article showed the average number of citations per document in a 2-year window (calculated over journals) for particular journal categories. The overall average number of citations was around 0.8 for OA journals, 1.6 for subscription journals allowing green posting and 0.8 for subscription journals not allowing green posting. They found highly differentiated average citation levels for nine broad disciplines. They also found very clear differences in citation levels between regions, with North American and European OA journals performing at a much higher level than journals from other parts of the world. In both the disciplinary and the regional breakdowns the non-OA journals followed the same patterns, so that the relative performance of OA journals compared to non-OA journals was relatively stable.

Methods

The data for this study were obtained from four databases. These included Ulrichsweb, Journal Citation Reports 2010 (JCR), SCImago Journal & Country Rank (SCImago), and the Directory of Open Access Journals (DOAJ). SCImago and DOAJ are openly available and provide their data in an easily downloaded format. Both our institutions have subscriptions to the electronic versions of Ulrichsweb and JCR, and it was possible to use our institutional access to these databases to obtain the information needed.
Ulrichsweb is a database of detailed information on more than 300,000 periodicals of all types. The JCR is the 2010 version of a database concerning the articles published and the citations received by the peer-reviewed journals indexed in the Web of Science citation index, a database of selected high quality scholarly journals maintained by Thomson Reuters. This study largely focuses on the average number of citations received by a journal over the most recent 2-year period, commonly called an impact factor. SCImago provides open access to similar citation metrics for journals included in the Scopus citation database maintained by Elsevier. Scopus is similar to Web of Science but provides data on a larger number of journals. The DOAJ is a database of open access journals that provides basic information about the journals as well as immediate unrestricted access to full text articles for some of them. Of these services, Web of Science, whose citation data are provided through the JCR, has the strictest inclusion criteria, followed by Scopus. DOAJ accepts all journals that fulfill certain criteria concerning open accessibility and peer review, whereas Ulrichsweb is open for any journal to self-report its data.
A limitation of this method is that journals not indexed in Web of Science or Scopus cannot be included, since there is no way to obtain their citation data in a systematic way. Google Scholar could be used to study citations to individual journals, but the process is extremely labor intensive and cannot be performed for large numbers of journals.
Studies have shown a high degree of correlation between the citation metrics of the JCR and Scopus, although their absolute values differ. For instance, Pislyakov [18] studied the citedness of 20 leading economics journals using data from both the JCR and Scopus and found that the correlation between the impact factors of these two indexes was 0.93 (Pearson). Sicilia et al. [19] also found a strong correlation between the two measures for computer science journals. Hence either one provides a good measure of the level of citations.
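As an illustration of the kind of cross-check reported in these studies, the short sketch below (our illustration, with hypothetical values and column names) computes the Pearson correlation between the two impact measures for journals indexed in both databases.

    # Sketch with hypothetical values: Pearson correlation between JCR impact
    # factors and SCImago 2-year citation rates for journals in both indexes.
    import pandas as pd

    both_indexed = pd.DataFrame({
        "jcr_if":     [2.1, 0.8, 4.5, 1.3, 3.0],
        "scimago_if": [1.9, 0.7, 4.1, 1.1, 2.8],
    })
    print(both_indexed["jcr_if"].corr(both_indexed["scimago_if"], method="pearson"))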
We used this mix of sources because we needed a number of data items for our analysis that could not be obtained from just one database. Ulrichsweb was used to obtain the start year for each journal as well as the up to five discipline categories in which it was classified. It was also used to identify the country of origin of the publisher. Being listed in the DOAJ was used as an indicator of whether a journal was open access and to determine if a journal charged APCs. The JCR was used to obtain the 2-year impact factor for each journal as well as the number of articles published in it in the most recent year available in the report, 2010. SCImago was used to obtain the 2-year citation count divided by number of articles published for Scopus indexed journals (in essence similar to the JCR impact factor) and the number of articles published in 2011.
To create a merged data set for analysis we started with the Ulrichsweb database, first narrowing the database to only journals that were: abstracted or indexed, currently active, academic/scholarly, refereed, and formatted as online and/or in print.
We selected all journals within those limits that were listed in the following discipline categories (based on the discipline coding used by Ulrichsweb): arts and literature; biological science; business and economics; chemistry; earth, space and environmental sciences; education; mathematics; medicine and health; physics; social sciences; technology and engineering. While there were other disciplines categorized in Ulrichsweb, these in our view captured the major scholarly disciplines. Many journals were listed under multiple disciplines. We recorded each discipline listed for each journal. The maximum for any journal was five. The data were retrieved in January 2012.
We then merged data from the other three databases to the journals identified in Ulrichsweb using either the International Standard Serial Number (ISSN) or the Electronic International Standard Serial Number (EISSN) as the identifier. There were 23,660 journals identified in Ulrichsweb meeting the criteria within the 11 disciplines of which 12,451 (52.6%) were in the SCImago database as of January 2012, 8,256 (35.0%) were in the JCR 2010 and 2,530 (10.7%) were in the DOAJ as retrieved from their web site in August 2011.
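The following sketch outlines the kind of ISSN/EISSN join described above. It is only an illustration of the merging logic; the file names and column names are hypothetical, and the actual matching was done against the databases listed above rather than these files.

    # Sketch of the join logic only; file and column names are hypothetical.
    import pandas as pd

    ulrichs = pd.read_csv("ulrichsweb.csv")   # issn, eissn, start_year, country, disciplines
    jcr     = pd.read_csv("jcr2010.csv")      # issn, impact_factor, articles_2010
    doaj    = pd.read_csv("doaj.csv")         # issn, charges_apc

    # Use the print ISSN where available, otherwise the electronic ISSN,
    # as the key for linking journal records across sources.
    ulrichs["key"] = ulrichs["issn"].fillna(ulrichs["eissn"])
    merged = ulrichs.merge(jcr, left_on="key", right_on="issn",
                           how="left", suffixes=("", "_jcr"))

    # Listing in DOAJ marks a journal as open access; the DOAJ record also
    # indicates whether the journal charges APCs.
    merged["is_oa"] = merged["key"].isin(doaj["issn"])
    merged = merged.merge(doaj, left_on="key", right_on="issn",
                          how="left", suffixes=("", "_doaj"))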
Citation metrics of OA and subscription journals were analyzed in two different ways. Firstly, they were analyzed with journals as the unit of analysis, which is the level at which the data were retrieved from the four databases. We also estimated the citation metrics of the articles published. This was done by weighting the journal-level citation metrics by the number of articles published in each journal per year, using article counts provided by the JCR and SCImago databases. This gives each journal more or less weight depending on the number of articles it published. We feel this adds a new and important dimension to the analysis compared to earlier studies.
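A minimal sketch of the two levels of analysis, using hypothetical journals and numbers: the plain journal-level mean, and the article-weighted mean in which each journal contributes in proportion to the number of articles it published.

    # Sketch with hypothetical numbers: journal-level vs. article-weighted
    # average impact factors for OA and subscription journals.
    import numpy as np
    import pandas as pd

    journals = pd.DataFrame({
        "impact_factor": [3.2, 1.1, 0.6, 2.4],
        "articles":      [900, 120,  60, 300],
        "is_oa":         [True, True, False, False],
    })

    for is_oa, group in journals.groupby("is_oa"):
        journal_level = group["impact_factor"].mean()
        article_level = np.average(group["impact_factor"], weights=group["articles"])
        print(is_oa, round(journal_level, 2), round(article_level, 2))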
In the data collection and analysis process we found some problems with the SCImago data. The site allows downloading the basic article numbers and citation data for all journals as one Microsoft Excel file with the most current year’s data. The data on impact factors and numbers of articles were for 2011, but it seems that the article and citation counts are not complete for the full year, so that both the article numbers and impact factors are too low. This could easily be checked for individual journals, and it turned out that the impact factors for 2010 and preceding years were in most cases almost double the 2011 figures. A comparison with the journal-level analysis in Miguel et al. [17] also pointed in the same direction. Unfortunately it was not possible to extract the older data for the over 12,000 journals in the study, so we were limited to using the 2011 data, which were incomplete.
We nevertheless feel that the analysis using Scopus data provides a useful triangulation with the JCR analysis. Provided that the incomplete counting for 2011 is systematic across all journals, with no differentiation between OA and subscription journals, the citation levels for OA versus subscription journals relative to each other should remain the same, although the absolute levels are lower. In comparing the numbers with the JCR-based results, the proportions between OA and subscription citation rates were approximately the same in both sets, supporting the conclusions we later illustrate mainly with the JCR results.

Results

The results were calculated using 2-year average citations (impact factors) from the JCR and Scopus (via SCImago), by journal and weighted by the number of articles in each journal as described above. OA and subscription journals were compared by the time period in which they were launched (pre-1996, 1996 to 2001, and 2002 to 2011), by country of publication grouped into the four largest publishing countries (USA, UK, The Netherlands, and Germany) versus other countries, by scientific discipline (medicine and health versus other) and by business model (OA funded by APCs, OA not funded by APCs, and subscription).
Table 1 provides a comparison of the impact factors for OA and subscription journals based on journals in the JCR and Scopus databases. OA journals had impact factors that were approximately 76% and 67% as high as subscription journals in the JCR and Scopus respectively when analyzed by journal, and 73% and 62% when weighted for articles published. Due to our concerns about the Scopus data from the SCImago Journal and Country Rank site outlined in the Methods section, only JCR figures are presented and discussed below.
Table 1. The 2-year citation averages for open access versus subscription journals, calculated using Web of Science or Scopus data
Figure 1 shows the average JCR impact factor for OA and subscription journals weighted by the number of articles as a function of the time period the journal was launched and location of the publisher. The left side of the figure includes the journals from the four countries where most of the major society and commercial publishers are located. The publishers in these four countries account for approximately 70% of the journals in our sample. The right side of the figure includes journals publishing in the rest of the world.
Figure 1. Citation averages as a function of the journal start year for two regions. The figures are based on Web of Science and weighted by journal article volumes.
There are large differences in the impact factors between the two regions, with the ‘big four’ on average having journals with significantly higher impact factors. Somewhat surprisingly, in this region more recently launched journals tended to have higher impact scores than the older, more established journals. This was true for both subscription and OA journals. In addition, the difference in impact between OA and subscription journals narrows with time.
The pattern for journals from the rest of the world is quite different. While the overall number of journals published is much lower, the number of OA journals is actually quite high in the pre-1996 group, where OA journals have a clearly lower impact. This group largely consists of old established print journals, which at some stage have opened up their electronic versions. In the middle time period, OA journals were outperforming subscription journals, and in the youngest group they were on a par with subscription journals.

Effects of the discipline of the journals

Several studies have shown that gold open access journals have had a larger uptake in the biomedical fields [1,15], where authors usually have fewer problems financing APCs and where many research funders also require some form of OA for the results. Figure 2 shows the average JCR impact factor of OA and subscription journals weighted by the number of articles as a function of discipline. The journals were split into two groups. The first included journals with the Ulrichsweb discipline category ‘Medicine and Health’. All the other disciplines were combined into the second group.
Figure 2. Citation averages as a function of the journal start year for Medicine and Health versus all other disciplines. The figures are based on Web of Science and weighted by journal article volumes.
In medicine and health, the large difference in impact between OA and subscription journals seen in older journals essentially disappears among the journals launched after 2001. This probably reflects the emergence of high quality professional OA publishers such as PLoS and BioMedCentral that rely on APCs for funding. For the other disciplines, OA articles had considerably lower impact scores in journals launched before 1996 and in journals launched after 2001, but the average impact of OA articles in journals launched between 1996 and 2001 was essentially equal to the average impact of articles in subscription journals launched in the same period. In reviewing the raw data, the high average impact of the OA articles during this period was due to a handful of relatively high impact and high volume OA journals published by BioMedCentral, which had been classified as biological rather than medical journals.

Effects of the revenue model of OA journals

Figure 3 compares subscription journals, OA journals funded by APCs and OA journals that do not charge APCs as a function of journal age. As noted above, the early OA journals were funded through volunteer effort and small subsidies, largely from universities. Beginning with BioMedCentral and PLoS in about 2001, a growing number of professional publishers have begun publishing OA journals, funding their operations by charging publication fees.
Figure 3. Citation averages for open access journals using article processing charges (APCs) versus those that are free for authors to publish in, compared to impact factors for subscription journals.
The impact of OA journals that are not funded by APCs is more or less the same irrespective of journal age, at about 1.25. The oldest age category consists mainly of print journals that have made their electronic versions freely available. The average impact of APC-funded OA journals increased markedly in the period 1996 to 2001 and, to a lesser extent, in 2002 to 2011, nearly reaching the same level as subscription journals at about 3.2. We expect the 89 APC-funded journals launched before 1996 largely to be subscription journals that converted to the APC model of OA publishing. A number of these journals are published by Hindawi, which did in fact transition from a subscription publisher to an OA publisher funded by APCs [20]. The other journals are published by a variety of publishers, universities, societies and other organizations from around the world.

Discussion

The distribution of OA journals over time periods and regions differs markedly from the corresponding distribution of subscription journals. OA journals are much more numerous in categories that have low overall impact factors, which may explain some of the difference in average impact between OA and subscription journals. Almost half (302) of all OA journals found in the JCR were started before 1996 and published in the ‘other countries’ region. While over 75% of the subscription journals found in the JCR were also launched before 1996, nearly 70% of subscription journals are from publishers in the four major publishing countries. As can be seen in Figure 1, across all age categories and for both OA and subscription journals, those published outside the four major publishing countries have substantially lower impact factors. While correlation is not necessarily causation, the location of the publisher appears to account for much of the difference in average impact between OA and subscription journals.
The vast majority of journals founded before 1996 that are listed in the JCR started as paper-based subscription journals. Those listed as OA must at some stage have made their electronic versions open access. Many of these are journals published by scientific societies and universities but at least in one case (Hindawi) a publisher converted their whole portfolio from subscription to OA.
Both in the leading publishing countries and in the rest of the world, older established journals that have made their electronic versions openly available have lower impact scores than their subscription counterparts. This is understandable since the large commercial publishers and the leading society publishers have usually refrained from opening up the e-content, BMJ being a notable exception. But for the newer journals, particularly in medicine and health, our results show that OA journals are performing at about the same level as subscription journals, in fact getting more citations in some subcategories.
For almost 15 years the quality of OA journals has been debated and questioned. In the early days of electronic journals, when hardly any startup OA journals were operated by reputable professional publishers, it was easy to understand the reluctance of scientists to submit their best manuscripts to OA journals and for research funders and university promotion and tenure committees to accept publishing in OA journals as on par with publishing in traditional subscription based journals. After the launch of professionally run high quality biomedical OA journals beginning in about 2000, the situation has changed. Today the funding mechanism of a journal is irrelevant in considering its quality. There are large numbers of both subscription and OA journals that are high quality and widely cited.
The development and increasing acceptance of the APC funding model for OA scholarly journals has spawned a group of publishers with questionable peer review practices that seem focused on making short-term profits by having low or non-existent quality standards. Unfortunately this has created some bad publicity for OA publishing. As this study demonstrates, this does not change the broad picture. Gold OA publishing has increased at a rate of 30% per year over the past decade [9] and in the last couple of years many major subscription publishers have started adding pure OA journals to their portfolios.
We believe our study of the quality of the OA journals indexed in either Web of Science or Scopus is the most comprehensive to date. The results indicate that the level of citations for older OA journals that started on a subscription basis and have since made their electronic versions openly available is clearly lower than for the corresponding subscription journals. At the same time, newly founded full OA journals compete on almost equal terms with subscription journals founded in the same period. OA articles published in medicine and health by publishers in the four largest publishing countries attract equal numbers of citations compared to subscription journals in these fields. Based on the evidence from earlier studies it is likely that a part of the citations to the OA articles are due to the increased readership following from the open availability, but there is no way we can isolate the effect of this factor in our calculations, nor would this factor alone account for the increasing respect researchers are showing for these journals through their citations.
The focus of the criticism of OA journals has been directed against journals funding their operations with APCs, the claim being that this revenue model leads journals to lower their review standards in order to maximize their profits. While there is clearly a subset of journals reflecting this phenomenon, there is also a growing number of high quality APC-funded journals from reputable publishers that are on par with their subscription counterparts.

Conclusions

In summary, gold OA publishing is rapidly increasing its share of the overall volume of peer-reviewed journal publishing, and there is no reason for authors not to choose to publish in OA journals just because of the ‘OA’ label, as long as they carefully check the quality standards of the journal they consider.

Competing interests

There are no competing financial interests. Both authors founded OA journals in the 1990s and are emeritus editors-in-chief. B-CB is a current and DS a former board member of the Open Access Scholarly Publishers Association.

Authors’ contributions

B-CB initiated the study and wrote most of the background sections of the article. DS collected the data from the different sources and made the calculations. Both authors participated equally in the analysis of the results and the drawing of conclusions.

Authors’ information

B-CB is professor of Information Systems Science at the Hanken School of Economics, Helsinki, Finland. DS is Professor of Medicine at the College of Human Medicine, Michigan State University, USA.

References

  1. Björk B-C, Welling P, Laakso M, Majlender P, Hedlund T, Guðnason G: Open access to the scientific journal literature: situation 2009. PLoS ONE 2010, 5:e11273.
  2. Solomon DJ, Björk B-C: A study of Open Access Journals using article processing charges. J Am Soc Info Sci Technol, in press.
  3. Dallmeier-Tiessen S, Darby R, Goerner B, Hyppoelae J, Igo-Kemenes P, Kahn D, Lambert S, Lengenfelder A, Leonard C, Mele S, Nowicka M, Polydoratou P, Ross D, Ruiz-Perez S, Schimmer R, Swaisland M, van der Stelt W: Highlights from the SOAP project survey. What scientists think about open access publishing. [http://arxiv.org/abs/1101.5260v2]
  4. Dallmeier-Tiessen S, Goerner B, Darby R, Hyppoelae J, Igo-Kemenes P, Kahn D, Lambert S, Lengenfelder A, Leonard C, Mele S, Polydoratou P, Ross D, Ruiz-Perez S, Schimmer R, Swaisland M, van der Stelt W: Open access publishing – models and attributes. SOAP project report, Max Planck Society digital library; 2010. [http://edoc.mpg.de/478647]
  5. Suber P: Will open access undermine peer review? The SPARC Open Access Newsletter, issue 113; 2009. [http://www.earlham.edu/~peters/fos/newsletter/09-02-07.htm]
  6. NIH: NIH Public Access Policy. [http://publicaccess.nih.gov/public_access_policy_implications_2012.pdf]
  7. Gilbert N: Editor will quit over hoax paper: computer-generated manuscript accepted for publication in open access journal. Nature News 2009. [http://www.nature.com/news/2009/090615/full/news.2009.571.html]
  8. Järvi U: Open access attracts swindlers and idealists [in Finnish]. Finn Med J 2012, 67:666-667.
  9. Laakso M, Welling P, Bukvova H, Nyman L, Björk B-C, Hedlund T: The development of open access journal publishing from 1993 to 2009. PLoS ONE 2011, 6:e20961.
  10. Swan A: The Open Access citation advantage: studies and results to date. Technical report, School of Electronics & Computer Science, University of Southampton; 2010. [http://openaccess.eprints.org/index.php?/archives/716-Alma-Swan-Review-of-Studies-on-Open-Access-Impact-Advantage.html]
  11. Wagner A: Open access citation advantage: an annotated bibliography. Iss Sci Technol Librarian 2010, 60. [http://www.istl.org/10-winter/article2.html]
  12. Gargouri Y, Hajjem C, Larivière V, Gingras Y, Carr L, Brody T, Harnad S: Self-selected or mandated, open access increases citation impact for higher quality research. PLoS ONE 2010, 5:e13636.
  13. Evans J, Reimer J: Open access and global participation in science. Science 2009, 323:1025.
  14. Davis P: Open access, readership, citations: a randomized controlled trial of scientific journal publishing. FASEB J 2011, 25:2129-2134.
  15. McVeigh M: Open Access Journals in the ISI Citation Databases: Analysis of Impact Factors and Citation Patterns. Citation study from Thomson Scientific; 2004. [http://science.thomsonreuters.com/m/pdfs/openaccesscitations2.pdf]
  16. Giglia E: The impact factor of open access journals: data and trends. In Proceedings of the 14th International Conference on Electronic Publishing (ELPUB 2010), 16-18 June 2010, Helsinki, Finland. Edited by Hedlund T, Tonta Y. Hanken School of Economics; 2010:17-39. [http://elpub.scix.net/cgi-bin/works/Show?102_elpub2010]
  17. Miguel S, Chinchilla-Rodríguez Z, de Moya-Anegón F: Open Access and Scopus: a new approach to scientific visibility from the standpoint of access. J Am Soc Info Sci Technol 2011, 62:1130-1145.
  18. Pislyakov V: Comparing two “thermometers”: impact factors of 20 leading economic journals according to Journal Citation Reports and Scopus. Scientometrics 2009, 79:541-550.
  19. Sicilia M-A, Sánchez-Alonso S, García-Barriocanal E: Comparing impact factors from two different citation databases: the case of Computer Science. J Informetrics 2011, 5:698-704.
  20. Peters P: Going all the way: how Hindawi became an open access publisher. Learn Pub 2007, 20:191-195.

The OA Interviews: Jeffrey Beall, University of Colorado Denver

In 2004 the scholarly publisher Elsevier made a written submission to the UK House of Commons Science & Technology Committee. Elsevier asserted that the traditional model used to publish research papers — where readers, and institutions like libraries, pay the costs of producing scholarly journals through subscriptions — “ensures high quality, independent peer review and prevents commercial interests from influencing decisions to publish.”
Elsevier added that moving to the Open Access (OA) publishing model — where authors, or their sponsoring institutions, paid to publish research papers by means of an article-processing charge (APC) — would remove “this critical control measure” from scholarly publishing.
The problem with adopting the gold OA model, explained Elsevier, is that publishers’ revenues would then be driven entirely by the number of articles published. As such, OA publishers would be “under continual pressure to increase output, potentially at the expense of quality.”
This is no longer a viewpoint that Elsevier promulgates. Speaking to me earlier this year, for instance, Elsevier’s Director of Universal Access Alicia Wise said, “Today open access journals do generally contain high-quality peer reviewed content, but in 2004 this was unfortunately not always the case.”
She added, “Good work in this area by the Open Access Scholarly Publishers Association (OASPA) has helped to establish quality standards for open access publications. For several years now Elsevier has taken a positive test-and-learn approach to open access and believes that open access publishing can be both of a high quality and sustainable.”
Prescient
While many OA publishers today are undeniably as committed to the production of high-quality papers as subscription publishers ever were, Elsevier’s 2004 warning was nevertheless prescient.
No one knows this better than Jeffrey Beall, a metadata librarian at the University of Colorado Denver. Beall maintains a list of what he calls “predatory publishers”. That is, publishers who, as Beall puts it, “unprofessionally exploit the gold open-access model for their own profit.” Amongst other things, this can mean that papers are subjected to little or no peer review before they are published.
Currently, the list of “predatory publishers” on Beall’s blog includes over 100 separate companies and 38 independent journals, and it is growing by 3 to 4 new publishers each week.
Beall’s opening salvo against predatory publishers came in 2009, when he published a review of the OA publisher Bentham Open for The Charleston Advisor. Since then, he has written further articles on the topic, and has been featured twice in The Chronicle of Higher Education.
His work on predatory publishers has caused Beall to become seriously concerned about the risks attached to gold OA. And he is surprised at how little attention these risks get from the research community. As he puts it, “I am dismayed that most discussions of gold open-access fail to include the quality problems I have documented. Too many OA commenters look only at the theory and ignore the practice. We must ‘maintain the integrity of the academic record’, and I am doubtful that gold open-access is the best long-term way to accomplish that.”
When presented with evidence of predatory publishing, OA advocates often respond by saying that most OA journals do not actually charge a processing fee. 

But as commercial subscription publishers increasingly enter the OA market it would be naïve to think that the number of journals that charge APCs will not grow exponentially in the coming years.

Whether this will lead to an overall increase in quality remains to be seen. It must be hoped that as more and more traditional journals embrace OA, so quality levels will rise, and predatory publishers will begin to be squeezed out. 

However, if Beall’s growing list is anything to go by, the omens are not currently very good. Moreover, if it turns out that there is indeed an inherent flaw in the gold OA model — as Elsevier once claimed — then the research community would appear to have a long-term problem.

 

The interview begins …

RP: You are a metadata librarian: what does your job involve?
JB: As a faculty librarian, my work is divided up into three components: librarianship, research, and service. My librarianship work involves creating and maintaining library metadata in my library’s discovery systems, including the online catalogue, the discovery layer, and the institutional repository, and related duties.
My research component is thirty per cent of my job, and I am devoting it to my research in scholarly communication. The service component chiefly involves committee work.
RP: How and when did you become interested in predatory open access publishing?
JB: I became interested in predatory publishers in 2008 when I began to receive spam email solicitations from new, online, third-world publishers.
RP: What is the purpose of the list of predatory OA publishers you keep, and how many publishers does it currently include?
JB: The lists are part of my blog. I write the blog to help myself develop my ideas and to share what I am learning about scholarly open-access publishing. The lists are a means of sharing information about publishers I have judged as questionable or predatory.
There are actually two lists, one of independent journals that do not publish under the aegis of a publisher, and one of publishers. There are 38 independent journals and 111 publishers currently on the list.
RP: When you say independent journals do you mean journals published by researchers themselves?
JB: No, I mean journals that exist independently on the Internet that are not part of a publisher’s fleet of journals. An example is the Global Journal of Medicine and Public Health.

Criteria

RP: Do you have any sense of how fast the phenomenon of predatory publishing is growing?
JB: Yes, the attention my blog has received has inspired academics and others to forward me spam emails they have received and to pass on information they have about new, questionable publishers. In the last couple months, I have been adding 3-4 per week. A new predatory publisher appears almost weekly in India, the location of most of my recent listings.
RP: Is predatory publishing in your view a phenomenon that originates primarily in the developing world?
JB: Yes, and in this I include publishers in the U.S., Canada, Australia, and the U.K. that are run by people from developing countries. They typically set up shop in developed countries and then market their services (vanity scholarly publishing) to the unwary worldwide, especially to those in their home countries.
RP: How do you define a predatory publisher?
JB: Predatory publishers are those that unprofessionally exploit the gold open-access model for their own profit.
RP: Presumably this implies publishers that charge a fee to publish scholarly papers (Not all gold OA journals do charge a fee)?
JB: By definition, gold open-access publishers levy an article processing charge (APC).
RP: How do you select publishers to include in your list? What criteria do you use?
JB: As I mentioned, most of the additions to the list result from tips from scientists and other scholars. I have composed and use a criteria document, currently in draft form, that I am preparing for publication on my blog.
Most importantly, I use established criteria, specifically those published by the Committee on Publication Ethics (COPE), the Open Access Scholarly Publishers Association (OASPA), and the International Association of Scientific, Technical & Medical Publishers (STM). There is one statement in COPE’s code of conduct that nicely encapsulates all the criteria into one: “Maintain the integrity of the academic record”.

OASPA

RP: Can you say what specific things you look for when assessing a potentially predatory publisher: for instance, do you look for evidence of spamming, poor or no peer review, the absence of information on ownership and/or location of the publisher, lack of an editor-in-chief, or editorial board, or what? What are the tell-tale signs of a predatory publisher?
JB: Yes, broadly I look for deception and lack of transparency. These two characteristics can manifest themselves in many ways, including those you list. One thing (among many) that I look for is publishers that refer to themselves as a “center”, “institute”, “network”, etc. For example, the Institute of Advanced Scientific Research is not really an institute; it’s a predatory publisher. This is deception. If you look at their contact address on Google Maps, it’s an apartment.
RP: Is there such a thing as a subscription-based predatory publisher?
JB: No, not according to my definition of predatory publisher.
RP: You mentioned OASPA. OASPA has been accused of doing too little to stem the tide of questionable OA publishers. Would you agree? Could it be doing more? If so, what? On the other hand, might OASPA be the wrong organisation to attempt to control these activities? What is and should be OASPA’s role (if any) vis-à-vis predatory publishing?
JB: Only one or two of the publishers on my list are OASPA members. Therefore, there’s little the organization can do to control the predatory publishers. In fact, most of the publishers on my list lack affiliation with any professional association, and they fail to follow many established publishing standards. It’s not really my role to tell OASPA what it should be doing.
RP: One of OASPA’s founding members, Hindawi, was at one time on your watchlist, but subsequently you removed it. However, your current list of predatory publishers still includes the International Scholarly Research Network (ISRN). ISRN is one of Hindawi’s brands. What do we learn from this?
JB: If you’re a publisher, don’t call yourself a network when you’re not a network.
RP: When and why do you remove a publisher from your list?
JB: I have removed publishers from my list for two reasons. First, if the publisher’s website disappears, I remove it from the list. This has happened only once or twice, and I removed them from the list without saving the names. Second, I remove a publisher from the list when I receive convincing comments from colleagues disagreeing with my having added it to the list.

Legal threats

RP: Have you ever removed a publisher from your list as a result of receiving a legal threat? Have you ever received any legal threats in connection with your list?
JB: My answer to the first question is no. Regarding the second question, yes, I have received two legal threats.
RP: I am struck that at least one of the publishers that you have removed from your list — Dove Press — was formerly a member of OASPA. Dove has been the subject of some controversy, and is no longer a member of OASPA. Why did you remove Dove from your list of questionable publishers?
JB: I removed it based on comments that JQJohnson left on my old blog. He is Director, Scholarly Communications and Instructional Support, at the University of Oregon and someone whose opinion I respect. I took his comments as a form of “peer review” and decided to accept his suggestion to remove Dove Press from the list.
RP: People have said to me that you tend to “shoot from the hip” when listing publishers as predatory, sometimes making your decision on too little information. Would you agree? Have you ever regretted putting a publisher on your list?
JB: In most cases, the decision to place a given publisher on my list is an easy one because the publisher is so clearly corrupt and predatory. Thus, a decisive and resolute action is appropriate, and no, I don’t agree, for I believe I make the decisions with sufficient information.
I now regret having the watchlist on my earlier blog. The feedback I received indicated that the watchlist painted a negative picture of the publishers on that list given the context in which the list appeared (juxtaposed with a list of predatory publishers). I acted on the feedback and now no longer have a public watchlist, though I do maintain one privately.

Conflict of interest?

RP: Others have suggested that you might have a conflict of interest, pointing out, for instance, that you are on the editorial board of a subscription journal. Should such claims be taken seriously? Why? Why not?
JB: Two people have said that. One is Scott Albers, an attorney from Great Falls, Montana and author of the article “The Golden Mean, The Arab Spring And a 10-Step Analysis of American Economic History”, a paper published in the Middle East Studies Online Journal. He asked me for advice as he was submitting the same article to a second publisher. I told him the publisher was essentially a vanity press, and he became offended and then contrived the conflict of interest story. The second is Ken Masters, the editor of Internet Scientific Publications’ The Internet Journal of Medical Education. Masters is an assistant professor at Oman’s Sultan Qaboos University, and he took it personally when I put Internet Scientific Publications, a publisher run out of a spare bedroom in Sugar Land, Texas, on my list.
The truth is there is no conflict of interest. I have no financial stake in Taylor & Francis, the publisher of the journal on whose editorial board I serve. In point of fact, my service on the editorial board has enabled me to learn a lot about the scholarly publishing process and scholarly publishing in general. Masters has been trying to bait people on email lists, including LIBLICENSE, with the conflict of interest story, but he has been ignored.
RP: The implication in the above claim, I assume, is that you are anti-OA. How would you describe your position vis-à-vis OA: advocate, sceptic, opponent?
JB: I am not “anti” anything. I am in favour of the best model for scholarly communication, whatever it turns out to be. If that is gold OA, then so be it.
I review science books for Library Journal. Occasionally, I’ll give a book a negative review. That doesn’t mean I’m anti-science. My list is essentially a collective review of gold open-access publishers. It’s a re-invention of what librarians call “readers’ advisory”.
RP: Whatever your position vis-à-vis OA, do you think the author-pays publishing model is inherently flawed so far as scholarly publishing is concerned?
JB: It’s too early to tell, so I don’t have a final opinion on this yet. On the one hand, the evidence I see every day argues that the model is indeed flawed. On the other hand, we need to ask, Which is the best model for the future of scholarly communication? It’s too early to eliminate a potentially successful and sustainable model.  

Abused the system

RP: I assume most researchers publish in the journals of predatory publishers without realising that they are dealing with a predatory publisher — and clearly a list like yours can play a useful role in helping them avoid doing so. On the other hand, I have had researchers say to me that they have knowingly paid to appear in a predatory publisher’s journal, explaining that they did so because they were having difficulties being published in a more reputable journal, or simply needed to get a paper published quickly for tenure or promotion purposes. I do not know how common the practice is, but does it not suggest that the research community is conspiring in the growth of predatory publishers, and, therefore, that the phenomenon is likely only to grow?
JB: I don’t think there’s a conspiracy, but I do think that some individuals have unprofessionally abused the system for their own benefit. But that’s why we have tenure and promotion committees. It is the committees’ job to vet the research of their tenure candidates. Tenure and promotion committees must now bring greater scrutiny to candidates’ published works than they did in the past, given the presence and abuse of scholarly vanity presses and the disappearance of the validation function that traditional publishers have so effectively provided.
RP: In the UK recently the Finch Report recommended that all publicly funded research should be made freely available on an OA basis, and by means of gold OA. This, it said, would require UK universities to pay an additional £50-60 million a year in order to disseminate the research they produce. If other countries follow suit, and if the author-pays model does indeed turn out to be inherently flawed, we can presumably expect the research community to find itself in trouble at some point can we not?
JB: Yes, and I am dismayed that most discussions of gold open-access fail to include the quality problems I have documented. Too many OA commenters look only at the theory and ignore the practice. We must “maintain the integrity of the academic record”, and I am doubtful that gold open-access is the best long-term way to accomplish that.
RP: What future plans do you have for your work on predatory publishers? Will you be adding new features to your blog, for instance?
JB: One weakness of my list is that it is binary: a publisher is either on the list or it isn’t. I would like to classify the publishers more granularly in terms of their quality, an upgrade that would differentiate between the borderline ones and the really bad ones. I am also in the middle of a research project about library catalogues and the inclusion of predatory journals, and I hope to carry out additional research on open-access publishing.

Suber, Neylon & Harnad on Finch, RCUK & Hybrid Gold OA

Excerpts from ongoing discussion:

Glyn Moody:

OA advocate Stevan Harnad withdraws support for RCUK policy – if true, this looks disastrous for UK (Open and Shut?)

Cameron Neylon:

Disagree strongly with Stevan here. His main objection is that this will annoy researchers but to be honest the Wellcome has been taking this line for some years with no signs of revolt. Yes the question of pricing is core but what the RCUK policy does is push those purchasing decisions exactly where they should be, at the institutional/researcher level.

It’s not a bug, it’s a feature. I disagree a lot with Stevan on strategy, which is fine, but from a tactical perspective I don’t think what he’s doing is at all helpful. He’s basically alienating all the people we need to work with to get the implementation right. And because he has such a loud voice it is assumed that he speaks for a larger group than he does.

Thomas Pfeiffer:

+Cameron Neylon From reading the interview it seems to me that +Stevan Harnad’s main objection is not that it will annoy researchers but that it creates a loophole for publishers to force authors to pay astronomical prices for Hybrid Gold OA instead of using Green OA. This does sound rather serious to me.

However, I agree with you that instead of attacking the RCUK so harshly in public, he should just have talked to them directly, pointed out the problem he discovered and presented his solution to them. He seems to be convinced that the RCUK opened the loophole by mistake and not on purpose, so they should be open to his recommendation.

Cameron Neylon:

It’s not a mistake, it’s quite deliberate. The RCUK position as I understand it is that they want to ensure there is a market – if authors don’t like the price that journals are charging they should go elsewhere. I would prefer a green option in these cases myself but they’re prepared to take the flak. What they can’t do is set prices… as a QUANGO this would be illegal – what they can do is set up a system where there is price sensitivity, and that’s what they’ve done.

Thomas Pfeiffer:

But isn’t that what Finch is aiming for as well?

Cameron Neylon:

Finch doesn’t really aim for anything – it suggests what the priorities are, but its main weakness in my view was precisely in not providing a mechanism that constrains prices. There are several routes to this: one is to ensure a green option is allowed and viable (one sentence to this effect in Finch would have changed the whole tone). The second is to force researchers to be price sensitive – which seems to be the RCUK route. A third is for the funder to take on the price negotiations – this is the Wellcome approach.

An awful lot depends on how publishers respond. On previous evidence they will cave in, set prices as high as they think they can get away with, but not so high no-one will pay…which will get us close to a net neutral position, but with a functioning market that will then bring prices down.

Thomas Pfeiffer:

It seems that this is really the central question: How important will price be for authors? Will they favor a less well-known journal with similar quality but lower price, or will they stick with the prestigious journals, no matter the price?

Moving the decision to the authors is definitely a good start. I agree that with subscription-based journals the fact that authors don’t have to care about the subscription price and can always go for the most prestigious journal is a huge problem.

Now we’ll see how fast authors will adapt their decision process.
I surely will, since for me price will be a lot more important than Impact Factor – I aim for other ways of getting my work known anyway.

Thomas Pfeiffer:

I also wonder what +Peter Suber has to say about this?

Peter Suber:

Hi +Thomas Pfeiffer: In general I’m with Stevan on this. The RCUK policy and the Finch recommendations fail to take good advantage of green OA. Like Stevan, I initially overestimated the role of green in the RCUK policy, but in conversation with the RCUK have come to a better understanding. In various blog posts since the two documents were released, I’ve criticized the under-reliance on green. I’m doing so again, more formally, in a forthcoming editorial in a major journal. I’m also writing up my views at greater length for the September issue of my newsletter (SPARC Open Access Newsletter).

For more background, I’ve argued for years that green and gold are complementary; I have a whole chapter on this in my new book. So we want both. But there are better and worse ways to combine them. Basically the RCUK and Finch Group give green a secondary or minimal role, and fail to take advantage of its ability to assure a fast and inexpensive transition to OA.

Thomas Pfeiffer:
 

Thank you for stating your opinion here, +Peter Suber. I know that you have been promoting Green OA and I’ve read about your opinion on the Finch report and your initial very positive reaction to the RCUK policy. It seems I missed your posts giving your opinion on RCUK after re-examining it, so it was interesting to hear what you think about it now.

Cameron Neylon:
 

It’s probably worth saying that I broadly agree with +Peter Suber’s position (and even to an extent Stevan’s) but I disagree with Stevan’s tactics. I don’t think that the RCUK position is so bad – but it’s a question of degree. It also has to be understood in the context of the philosophical background to the policies. Stevan has generally argued from a public good perspective – more research available for researchers to read is a public good – rather than a technological or industrial policy perspective.

RCUK and Finch are coming from a much more innovation and industry focussed perspective. Their central motivation is to ensure that research is maximally available for exploitation. They don’t want to rapidly get to a public access environment and then have to fight through to a CC-BY OA environment – they want CC-BY as the immediate goal and see this as the fastest way to get there. So the goals are somewhat different – which leads to a difference in tactics – but we can also disagree over whether the tactics are optimal given the goals.

FWIW I agree with Eve, the publishers know exactly how weak their position is and are unlikely to resort to extensive gouging. In turn we can use the differences in policy between the US, Europe, and the UK as a pincer to tackle both sets of issues (access and rights) simultaneously.

Thomas Pfeiffer:

I concur. Until now I hadn’t realized that the differences between preferring Gold or Green OA depended on the philosophical stance, but the way Cameron explains it, it absolutely makes sense. However I can’t really say which position seems more valid to me, they both have good reasons speaking for them.

I can only say that most of my colleagues prefer Green OA – for the obvious reason that they want their research to be widely available but still want to publish in prestigious journals without paying high prices for it. 

Cameron Neylon:

Yeah, the trouble I have with the whole "it's free!" argument is that of course, it isn't. That seems to be getting missed in the discussion somewhere. We are paying for this – and we should be able to do this at worst as zero-sum, with some transitional costs. Frustrating that people still believe the current system is "free".

Stevan Harnad:

Reply to +Cameron Neylon

1. PRIORITIES

Yes, free online access to refereed research (Gratis OA) + various re-use rights (Libre OA) is better than Gratis OA alone.

But free online access is incomparably more urgent and important for research and researchers than Libre OA today.

Universal Gratis OA worldwide is also fully reachable today, free of extra cost, via effective Green OA mandates from funders and institutions.

RCUK’s is an ineffective mandate as currently formulated, insisting on paying extra for Libre Gold OA out of scarce research funds instead of providing cost-free Gratis Green OA. 

In its present form, the RCUK mandate will be resented and resisted by UK researchers and is unscalable to the rest of the world.

I hope its drafters will have the good sense and integrity to fix the RCUK mandate: It just needs two simple patches to make it effective and scalable.

Once Green OA is effectively mandated worldwide, affordable Gold OA and all the re-use rights users need and authors want to provide will follow.

But not if UK — till now the worldwide leader in OA policy and provision — instead cleaves to a needless, costly and unscalable RCUK OA policy.

2. ZERO-SUM REASONING AND ZENO’S PARALYSIS

Green OA self-archiving of articles published in subscription journals is completely free of extra cost while subscriptions are paying (in full, and fulsomely!) the cost of publication.

If and when global Green OA makes subscriptions unsustainable, then, and only then, should the (remaining, much reduced) costs of publication be paid for via Gold OA, out of a fraction of the subscription savings.

Not now, pre-emptively, before Green OA prevails — or instead.

Richard Poynder:

+Cameron Neylon wrote: “Stevan has generally argued from a public good perspective – more research available for researchers to read is a public good – rather than a technological or industrial policy perspective. RCUK and Finch are coming from a much more innovation and industry focussed perspective.” 

I am not sure what industry Cameron is referring to here. Certainly, if Stevan is correct then the publishing industry has a great deal to gain from RCUK and Finch. However, I suspect he means that CC-BY can turn research papers into raw material that new businesses can use (by, for instance, mining their content). That’s fine, but at what price?

Stevan Harnad:

DECLARATION OF INTERESTS

@Thomas Pfeiffer wrote: “Until now I hadn’t realized that the differences between preferring Gold or Green OA depended on the philosophical stance”

Thomas, I don’t think the difference is a matter of philosophical stance. I think it depends on whose and what interests are motivating one’s position on OA, Green OA and Gold OA.

I am happy to declare mine: They are the interests of research, researchers, and the general public whose taxes pay for the research and for whose ultimate benefit the research is funded and conducted. 

To a certain extent, the R&D industry also figures in this equation, but primarily as a user and applier of the research, just like researchers, and hence as a net contributor to the public good — not merely as another proprietary means of creating wealth for itself, in the way the research publishing industry is doing. 

This applies especially to secondary content-based industries (e.g., Thomson-Reuters ISI, or Google or Connotea… or Mendeley) that now have a financial interest in “technologically enhancing” OA research output — in much the same way that publishers stress that they are “technologically enhancing” their proprietary research content, in arguing that it should remain locked in their hands and continue to be paid for. 

Ironically, the interests of the OA-content enhancing industry can generate surprising stances, such as favouring extra payment for Gold OA over cost-free Green OA because it buys a form of Libre OA that is necessary for their product or service. This is a direct conflict of interest with the interests of research, researchers, and the general public who funds the research and for whose ultimate benefit the research is funded and conducted. 

Another ironic similarity between the interests of the content-enhancing industry and those of the publishing industry is that both keep stressing that research — both raw research and peer-reviewed research — is “not free”: publishers stress this in defending the price of subscriptions or the price of Gold OA; content-enhancers stress this, again, in arguing for Gold OA payment for the Libre OA they need for their products and services. This is again in direct conflict with the interests of research, researchers, and the general public who funds the research and in whose interest the research is being done. (It also makes no sense, because the costs in question are not the ones at issue: the publishing industry has many times aired the canard that its access-tolls are somehow justified because the Internet is “not free”!)

I urge commentators, before they reply, to have another look at my short posting on PRIORITIES. Gold OA and Libre OA are secondarily beneficial to research, researchers and the public too, one because it may eventually reduce the proportion of potential research funds spent on publication instead of research, the other because it may eventually increase technologically the use and usefulness of research publication, communication and collaboration. 

But both of these are potential secondary benefits of OA. The most important and urgent benefit of OA is the primary one: making research accessible to all of its potential users, not just to those who can afford subscription access. And that means Gratis, Green OA.

And that is why it is quite disappointing when OA advocates opt, today, for paid Gold OA over cost-free Green OA, or Libre OA over Gratis OA. They are opting for the eventual, potential secondary benefits of OA over the actual, primary and long overdue benefits of OA for research, researchers and the public.

And doubly ironic, because making sure Green Gratis OA is provided today, through global Green Gratis OA mandates by funders and institutions worldwide, is the fastest, surest and by far the most affordable way to get from the status quo today not only to the Gratis OA that will at long last fulfill the primary needs of research, researchers and the public today, but eventually also to the Libre and Gold OA that will fulfill the secondary needs and further potentials as well. Hence delaying or deterring the former in the service of the latter, by favouring paid Gold over cost-free Green, is a real head-shaker (to me, at least), and not a philosophical one…

Thomas Pfeiffer:

Thank you, +Stevan Harnad, for your detailed reply. Especially the reasoning about priorities and using Green OA for the transition from subscription to Gold OA makes sense to me.

Two things are still not clear to me about your interview and I’d be glad if you could clarify them:

1. Don’t Green OA mandates restrict the choice of journal/publisher as well? Not all publishers allow self-archiving…

2. Have you presented your patches to their policy directly to RCUK yet? And if you did, what was their reaction? If they left open that “loophole” for Hybrid OA accidentally, I think they should be welcoming your suggestions.

Cameron Neylon:

+Richard Poynder: You ask about costs. Realistically the transitional costs should be somewhere between nothing and maybe £15M pa for a few years. The £50M is in many ways a rather silly figure. But the real answer is that the worst case scenario is we do 1.5% less research for a few years – and frankly that is in the noise. It’s such a small figure in the overall research budget that it seems silly to worry about that when we know that there are much bigger inefficiencies that can be addressed by OA.

But even if it did cost £50M to deliver OA to all RCUK funded outputs from April next year, wouldn’t that be a bargain? We can start to save several hundred million on subscriptions, start to address the nearly £1B of lost economic activity due to SMEs not having access, we can get efficiencies in the research process of maybe 10%, maybe 50%, maybe 100%. Even if that costs £200M over four years and if it’s restricted to the UK I’d say it’s still a bargain.
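To put those orders of magnitude side by side, here is a minimal back-of-envelope sketch (the roughly £3bn total annual research council spend is an assumption of mine, not a figure from this thread; the other numbers are the ones quoted above):

    # Back-of-envelope comparison of the figures quoted in this thread.
    # ASSUMPTION: a total annual research council spend of roughly 3 billion GBP.
    rcuk_budget = 3_000_000_000      # assumed, GBP per year
    estimates = {
        "low transitional estimate": 0,            # "somewhere between nothing..."
        "high transitional estimate": 15_000_000,  # "...and maybe 15M pa"
        "worst case": 50_000_000,                  # the "rather silly" 50M figure
    }
    for label, cost in estimates.items():
        print(f"{label}: {cost / rcuk_budget:.1%} of the assumed annual budget")
    # With these assumptions the worst case is ~1.7% of the budget per year,
    # the same ballpark as the "1.5% less research" quoted above.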

And that’s what the RCUK policy, even in its current form, delivers. Authors have precisely two choices. Go to a journal that offers a gold option and take it. Or go to a journal that offers a green option with an embargo of no more than 6 months. It reduces author choice but so does any effective mandate. It’s working for Wellcome so I think it can be made to work here as well. But, bottom line, the policy delivers OA to the UK’s RC-funded output from April 2013 with at worst a six-month embargo. The only real risk is that publishers form a cartel to agree to charge high prices. And that cartel is already broken by a range of OA publishers who charge much less than the average.

What I find frustrating is that I actually agree that a more effective policy would be to offer the option to go green if Gold is too expensive – at least in the short term. I’m arguing for this – the PLOS position supports this because I argued for it internally – and I’m talking to folks about the details of implementation and arguing for it with the relevant people. But the firebombing of comment threads and the shouting at people who should be our allies is making my job harder and strengthening the hand of the publishers to ask for more money, on weaker terms, because they can represent the OA movement as being unreasonable, shouty, and fragmented.

What would be helpful is clear rational argument that supports the principal direction of both Finch and RCUK towards OA as fast as possible, but offers advice on the implementation – rather than outright rejection or acceptance. Making the economic case for green based on real numbers and offering it as advice, not as a shouting match, to the people who are on our side. Telling those in government and RCUK who are expending significant political capital to drive the OA agenda that they are idiots is not helpful. Claiming that green is free is not helpful. Showing how it is cost-effective as a strategy, engaging with those people and giving them the detailed modelling of how costs would pan out, is. Offering to help game out the different ways policy might have an impact, is. But doing it constructively, not combatively, and NOT IN ALL CAPS!

And finally there needs to be more listening and understanding of others’ positions and perspectives. Stevan says above he speaks for the interests of researchers but he doesn’t represent mine. Access to the literature isn’t a problem for me, I can get any paper I want if I put my mind to it, albeit (possibly) illegally. Discovery of the right literature is a problem, aggregation of data is a problem. Similarly you dismiss the potential for enhancing innovation in your reply to me, but that is the government perspective. If you don’t engage with that then they will give up and move on, and we will probably get some half-baked licensing or public library scheme.

We need to stop claiming we talk for people and start talking with people. There are many different interests served by OA, some served perfectly well by Green or Gratis and some that are not. For those of us with needs not served, Green could be a dangerous distraction, just as Gold looks this way for those who believe Green is the fastest route to universal access.

But it doesn’t have to be this way – we can use the strengths of both approaches and each in our own way push on both routes as far and as fast as we can. There’s no need for this to be competitive. Paying for Libre in no way diminishes the value of Gratis and nor does having Gratis diminish the value in continuing to push for Libre. And both Green and Gold approaches can be complementary in keeping transitional costs under control. We can have both, arguably we need both, so let’s get on with enabling both and let the market and communities decide which route works for them.

I wrote a long comment originally and lost it in an inadvertent click. Then thought this was good because I should write something shorter…then wrote something longer

Thomas Pfeiffer:

I definitely agree with Cameron that it’s better to talk with people instead of for people. Funders, OA publishers and researchers ultimately have the same goal, they just prefer different routes to it. That should not keep them from working together to reach the goal, though.

Stevan Harnad:
REPAIRING THE RCUK MANDATE

+Thomas Pfeiffer: “Don’t Green OA mandates restrict the choice of journal/publisher as well?”

60% of journals formally recognize the author’s right to provide immediate, unembargoed Green OA. Many of the remaining 40% ask for a Green OA embargo of 6-12 months. Some journals have embargoes of longer than 12 months.

Before the UK government gave all publishers a strong incentive to provide a Hybrid Gold OA option and to lengthen their Green OA embargoes — by promising, as per the recommendations of the Finch Committee, to pay for Gold out of research funds, thereby ensuring that authors pay for Gold rather than provide cost-free Green — an ID/OA mandate with a maximal embargo of 6-12 months on Green OA would have been feasible, with minimal restriction on journal choice and maximal incentive for journals to minimize or eliminate their embargoes.

But now that the word is out that not only are the extra Gold OA funds to be there for the asking, but that RCUK even obliges authors to pick paid Gold over cost-free Green if Gold is offered, it is no longer possible for RCUK to require a maximum 6-12 month embargo length on Green OA.

The only way to fix the broken RCUK mandate with its perverse incentives and disincentive now is to urge rather than to require a maximum OA embargo of 6-12 months.

What a repaired RCUK mandate can require is:

PATCH 1: Repository deposit (with no exceptions) of the final refereed draft, immediately upon acceptance for publication, by all fundees, irrespective of journal, urging that access to the deposit should be set as OA immediately if possible, or, at latest, 6 months after deposit (12 for AHRC and ESRC if necessary). (Meanwhile the repository’s “email-eprint-request” Button can tide over research user needs during the embargo period by providing “Almost OA” with one click from the requester and one click from the author.)

That applies pressure on authors and journals for short or no embargoes, but it does not prevent authors from publishing in their journal of choice.

In addition:

PATCH 2: The condition that if the journal offers both Gold and Green the author must choose Gold should be dropped completely.

PATCH 3: Funds are available to pay to publish in a Gold OA journal, but only pure-Gold journals, not hybrid subscription/Gold.

+Thomas Pfeiffer: “Have you presented your patches to their policy directly to RCUK yet?”

RCUK did not consult me in designing their policy (though I did have some indirect information from some of the people RCUK did consult).

I have posted PATCH 1 and 2 prominently now. They are simple enough so that if there is a will to fix the policy, I trust that they can and will be done.

PATCH 3 is highly advisable, if there is the will for it, though, unlike 1 & 2 it is not absolutely essential.

Stevan Harnad:

PRIORITIES, AGAIN
Reply to @Cameron Neylon
“TRANSITIONAL COSTS”: It is not at all clear to me what Cameron’s speculations about transitional costs of “between nothing and maybe £15M pa for a few years” are based on.

(I’m also not sure how “the worst case scenario is we do 1.5% less research for a few years – and frankly that is in the noise” would wash with researchers, even if it were right on the money.)

Does anyone seriously imagine that if the UK, with its 6% of world research output, mandates Gold OA then all journals will convert to pure-Gold OA to accommodate the RCUK mandate?

Assuming the answer is no (and that Cameron does not imagine that all UK authors will therefore drop their existing journals and flock to the existing Gold OA journals), the only remaining option is hybrid Gold.

It is certainly conceivable (indeed virtually certain) that under the irresistible incentive of the current RCUK mandate virtually all journals will quickly come up with a Hybrid Gold option: What is also conceivable is that some journals will offer a discounted hybrid Gold option (“membership”) to authors at universities that subscribe to that journal: Maybe even free hybrid Gold for those authors, as long as their university subscribes again the next year.

But that isn’t a transition scenario, it’s a subscription deal. It locks in current subscription rates and revenues and provides Gold OA for authors from subscribing institutions. How many papers? And what about authors from non-subscribing institutions? And how does this scale, globally and across time?

Subscriptions are sold and sustained on the demand by an institution for the whole of a journal’s contents. But an institution’s published papers per journal vary from year to year and from institution to institution. What is an institution’s incentive to keep subscribing at a fixed rate? Especially if — mirabile dictu — the global proportion of Gold OA articles were to go up? (Reminder: You don’t need a subscription to access those Gold articles!)

Publishers can do this simple reckoning too. So it is much more likely that the “quick” Hybrid Gold offered by most journals under RCUK pressure will not be based on free Gold OA for subscribers, but on charging extra for Gold OA. How much? It’s up to the journal, since the mandate is just that if Gold is offered, it must be picked and paid for, if the journal is picked.

So the likelihood is that journals will charge a lot. (They already charge a lot for Gold OA.) For a journal that publishes N articles per year, the price per article is likely to be close to 1/Nth of its gross revenue (that is, its gross revenue per article). If they get that much per RCUK article, then that will bring in 6% more than their prior gross revenue annually, thanks to the UK’s largesse.
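To spell out that 1/N arithmetic with invented numbers (a sketch only, using no actual journal’s figures):

    # Illustration of the 1/N pricing argument, with invented numbers.
    gross_revenue = 300_000     # hypothetical journal revenue, GBP per year
    articles_per_year = 100     # N
    uk_share = 0.06             # the UK's ~6% of world research output

    hybrid_fee = gross_revenue / articles_per_year    # ~ gross revenue per article
    uk_articles = articles_per_year * uk_share        # articles paying the new fee
    extra_revenue = uk_articles * hybrid_fee

    print(f"per-article hybrid Gold fee: {hybrid_fee:.0f} GBP")
    print(f"extra revenue on top of unchanged subscriptions: {extra_revenue:.0f} GBP")
    print(f"i.e. {extra_revenue / gross_revenue:.0%} more than before")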

We can speculate on how much publishers might reduce this 1/N, in order to hedge their bets, on the off-chance that it could also catch on in some other countries whose pockets full of spare research funds are not quite as deep as the UK’s — but why are we speculating like this? No one knows what will happen if UK authors are forced to pay for Gold and journals happily offer them hybrid Gold at an asking-price of the journal’s choosing.

What’s sure is that this kind of “transition” doesn’t scale — because other countries don’t have the spare change to pay for OA this way — and especially because it is still evident for those who are still thinking straight that OA can be provided, completely free of any extra cost whilst subscriptions are paying for publication, by mandating Green OA rather than paying pre-emptively for a “transition” to Gold OA.

And certainly not paying in order to enjoy the legendary benefits of Libre OA — for authors who can’t even be bothered to provide Gratis OA unless it is mandated! (At least every researcher today, both as author and user, has a concrete sense of the frustration of gratis-access denial as a non-subscriber: How many researchers have the faintest idea of what they are missing for lack of getting or giving libre OA re-use rights?)

I would also appreciate an explanation from Cameron of the reason behind his suggestion that “even if it did cost £50M to deliver OA to all RCUK funded outputs from April next year, wouldn’t that be a bargain? We can start to save several hundred million on subscriptions”:

Does Cameron imagine that UK institutions only subscribe to journals in order to gain access to their own UK research output? (Or has Cameron forgotten about hybrid Gold OA again?)

+Cameron Neylon: “It’s working for Wellcome so I think it can be made to work here as well.”

Is it? And if Wellcome pays to make all its funded research Gold OA, does that take care of Wellcome authors’ access to research other than Wellcome-funded research?

+Cameron Neylon: “The only real risk is that publishers form a cartel to agree to charge high prices. And that cartel is already broken by a range of OA publishers who charge much less than the average.”

Is that so? Are you not forgetting Hybrid Gold again? And authors’ disinclination to give up their journal of choice in order to have to pay scarce research money for a Gold OA that they had to be mandated to act as if they wanted?

Being mandated to do a few extra keystrokes (to provide Green OA) as a condition of receiving research funding is one thing (and a familiar one), but having to give up your journal of choice and to shell out scarce research money (or possibly even some of your own dosh) is quite another.

+Cameron Neylon: “a more effective [RCUK] policy would be to offer the option to go green if Gold is too expensive… I’m… arguing for it with the relevant people”

Putting an arbitrary price-limit on the Gold fee is no solution for the profound flaw in the current RCUK policy. How much more than cost-free is “too expensive”? And why?

+Cameron Neylon: “the firebombing of comment threads… is making my job harder”

Thinking things through first might make it easier — maybe even consulting those who might have thought them through already. ;>)

+Cameron Neylon: “Claiming that green is free is not helpful”

But while subscriptions are paying the cost of publishing in full, and fulsomely, it is, helpful or not, a fact.

+Cameron Neylon: “Showing how [Green] is cost effective as a strategy, engaging with those people and giving them the detailed modelling of how costs would pan out, is [helpful].”

I believe that’s precisely what Alma Swan and John Houghton did, and their modelling and recommendations were ignored in the Finch and RCUK recommendations. Their recommendation was to mandate Green, not to pay pre-emptively for Gold. And they showed that the benefit/cost ratio was far higher for Green than Gold in the transition phase. (Post-Green Gold is another story, but we have to get there first; and the calculations confirm that mandating Green — not paying pre-emptively for Gold while still paying for subscriptions — is the way to get there from here.)

+Cameron Neylon: “Offering to help game out the different ways policy might have an impact, is [helpful].”

I offer to help.

Till now I have not been consulted in advance, so I have had no choice but to give my assessment after the policy (both Finch and RCUK) was announced as a fait accompli. My assessment was extremely negative, because both policies are just dreadful, and their defects are obvious.

But RCUK, at least, is easily reparable. I’ve described how. I’m happy to explain it to any policy-maker willing to listen to me.

(And if RCUK is fixed, that will indirectly fix Finch.)

+Cameron Neylon: “Stevan says above he speaks for the interests of researchers but he doesn’t represent mine. Access to the literature isn’t a problem for me, I can get any paper I want if I put my mind to it, albeit (possibly) illegally.”

Cameron, that response does not scale, nor is it representative.

+Cameron Neylon: “Discovery of the right literature is a problem”

The only reason discovery of the right literature is a problem is that most of it is not yet OA! You can’t “discover” what is not there, or not accessible. That’s why we need Green (Gratis) OA mandates.

+Cameron Neylon: “you dismiss the potential for enhancing innovation in your reply to me, but that is the government perspective”

Cameron, you know as well as I do that “the government” could not explain what the slogan “potential for enhancing innovation” means to save its life. “The government” gets fed these slogans and buzzwords and “perspectives” by its advisors and lobbyists and spin-doctors.

Yes, it’s near-miraculous that “the government” expresses any interest in OA at all. But it’s up to those who actually know what they are talking about to go on to explain to them what it means, and what to do about it.

And anyone who still has his feet on the ground (rather than levitating on gold dust or rights rapture) knows that what is needed first and foremost, and as a necessary precondition for anything further, is Gratis OA (free online access), globally. We’re nowhere near having it yet; and if RCUK persists in its present fatally flawed form, we’ll have (at the very best) UK Gold OA (raising worldwide OA by 6% from about 22% to about 28%) plus a local, unscalable policy. (More likely, we will simply have a failed mandate, non-compliant authors, a lot of money and time wasted, and the UK no longer leading the worldwide OA movement, as it had been doing for the past 8 years.)

+Cameron Neylon: “There are many different interests served by OA, some served perfectly well by Green or Gratis and some that are not. For those of us with needs not served, Green could be a dangerous distraction, just as Gold looks this way for those who believe Green is the fastest route to universal access.”

You seem to be conflating Libre and Gold here Cameron, but never mind. Gratis is for those who need free online access. Libre is for those who need free online access plus certain re-use rights. Green is for those who don’t want to wait for all journals to go Gold and don’t have the money to pay for Gold pre-emptively at today’s asking prices while subscriptions are still being paid for. Gold is for those who are galled by subscription prices (and have other sources of money).

Gratis and Libre come as either Green or Gold, but Green has no extra cost (while subscriptions are being paid); and Libre is much harder to get subscription publishers to agree to. Moreover, all four include Gratis as a necessary condition.

So without tying oneself up into speculative and ideological knots (or a transport of gold fever or rights rapture), it looks as if Gratis OA via cost-free Green OA mandates are the way to go for now (with ID/OA and the Button mooting embargoes).

The rest (Libre, Gold) will come after we’ve mandated and provided Gratis Green globally. To insist on Libre Gold locally in the UK now, by paying extra for it pre-emptively, is just a way of ensuring that the UK no longer has a scalable global solution for OA at all. And without global Gratis OA at least, the UK’s dearly purchased Gold amounts to Fool’s Gold, insofar as UK access is concerned. (And remember way back, Cameron: Open Access was about access!)

+Cameron Neylon: “There’s no need for this to be competitive. Paying for Libre in no way diminishes the value of Gratis and nor does having Gratis diminish the value in continuing to push for Libre. And both Green and Gold approaches can be complementary in keeping transitional costs under control. We can have both, arguably we need both”

I’m all for going for both — as long as cost-free Green Gratis OA is mandated and Libre Gold is a bonus option one can choose if one wishes and has the money to pay for it. Not, as RCUK currently has it, where the author may not choose Green if a journal offers Gold. That is just fatal foolishness, aka, Fool’s Gold.

#openaccess TOWIG (The Only Way Is Greold) – or are there others? Some suggestions

I’d hoped to blog about something other than Open Access – such as developing intelligent software for reading the scientific literature (and I shall). However things have niggled away and I have to get them and other ideas out before I can go back to writing code .

  • I have been invited as an OKF representative to an (invitation-only, I think) meeting run by the Royal Society, “Workshop on ‘Revaluing Science in the Digital Age’”, 2-4 September 2012 at Chicheley Hall (Royal Soc). No idea how long I speak for, but I am expected to contribute a “provocation”. No doubt more later.
  • A recent post (http://lists.okfn.org/pipermail/openaccess/2012-July/000766.html ) to the OKFN openaccess list [0] by Katie Foxall of ecancer.org. Katie runs a discipline/community-based journal which is toll-free and APC-free (in simple language anyone can read it for free and authors don’t have to pay charges). She writes:

    I haven’t posted before but have been following the discussions [on OKFN openaccess] with much interest and have found the info and links provided by various people really useful. I run an open access cancer journal http://ecancer.org/ecms which has no author fees – we are currently mainly supported by charity funding but the journal has been growing at a great rate this year so I’m looking into accessing any funding that might be out there to support open access publishing. The reality is that we will have to start charging author fees at some point if we can’t get more funding and we really don’t want to do that as providing a free service for the oncology community is very important to us.

    So does anyone know whether there is anything like SCOAP3 [the consortium for High Energy Physics publishing] in the field of medical publishing?

To many of us this sounds entirely natural and desirable – after all that’s how many journals started and – IMO – this often represents the best of science communication – a community-to-community process rather than the anonymous capitalist scholarly presses. So for me, part of the issue is whether #openaccess can solve Katie’s problem.

If we go to the traditional approaches (“Green and Gold”) there are many words and some – but limited – progress. The publication of the Finch report and the RCUK policy on Open Access have engendered a huge amount of “debate” in the open access fora (such as the GOAL open access list http://mailman.ecs.soton.ac.uk/pipermail/goal/2012-July/thread.html ) and more recently, and possibly more digestibly, on Google+ http://t.co/h6p1Lb6F. Unfortunately much of this is politico-religious and revolves round “The Only Way Is Green” [1] and oh-no-it-isn’t “The Only Way Is Gold”. (Hence the humpty-dumptyism of the title: a portmanteau, http://en.wikipedia.org/wiki/Portmanteau, “Greold”.) The “debate” consists of a lot of shouting and, more recently, withdrawal of cooperation: http://poynder.blogspot.co.uk/2012/07/oa-advocate-stevan-harnad-withdraws_26.html. (Summary – Stevan Harnad said one thing, then changed his mind, and now attacks RCUK and Finch – and, with his fellow-thinkers, me if I dare to post on GOAL.)

So my starting point is that Green and Gold (besides being almost meaningless terms operationally – like “democracy” or “healthy”) have serious flaws and are not a solution to Katie’s problem. Why do I say that? (I remain a supporter of BMC and Gulliver Turtle and PLoS and IUCr and EGU).

The problem is that they are C19/20 solutions in the C21. They glorify the “article”. They are based on a very university-centric – and often arrogant – approach. They have very little scope for the C21 (“Web 2/3.0”). There is no feeling of community if you publish a Gold article. There is no feeling of community in self-archiving a Green article. It’s a chore. They are often predicated more on glory-for-the-authors than on communicating to the electronic world. The end result is an “impact factor”, not a community of practice.

Contrast this with Open Street Map (http://www.openstreetmap.org/). This creates top-quality, up-to-date maps for the whole world – in many cases better than the existing commercial products. And for many years (and maybe it is still true) it didn’t even have a bank account. It has 250,000+ supporters who love doing it. The simple message – if you create a world community (not just an ivory-tower one) you can change the world.

Wikipedia has also done that. For that reason many academics hate and denounce it. “It can’t be good because it’s free and created by non-specialist volunteers. It has no peer-review” (wrong). It is against the laws of academia to create a volunteer-based high-quality zero-cash encyclopedia. Of course it’s not zero-cost now, and we have JimmyW’s face asking us to donate, but that’s C21.

So can this translate to community journals without Greold? I think it can and I think we should try.

To start I highly recommend reading http://blogs.law.harvard.edu/pamphlet/2012/03/06/an-efficient-journal/ – which gives a factual, readable, account of how a journal can be run with virtually no cash.

LeCun (editor of the Journal of Machine Learning Research, JMLR): “The best publications in my field are not only open access, but completely free to the readers and to the authors.”

To supporters of the multi-billion dollar faceless publishing industry – like Kent Anderson of the Scholarly Kitchen – this is impossible. Anderson, commenting on that post, wrote:

    I’m not entirely clear how JMLR is supported, but there is financial and infrastructure support going on, most likely from MIT. The servers are not “marginal cost = 0” — as a computer scientist, you surely understand the 20-25% annual maintenance costs for computer systems (upgrades, repairs, expansion, updates). MIT is probably footing the bill for this. The journal has a 27% acceptance rate, so there is definitely a selection process going on. There is an EIC, a managing editor, and a production editor, all likely paid positions. There is a Webmaster. I think your understanding of JMLR‘s financing is only slightly worse than mine — I don’t understand how it’s financed, but I know it’s financed somehow. You seem to believe in fairies.

Anderson’s comments (and it’s worth reading the comment thread) read like a sceptic of heavier-than-air flight: “Publishing requires fees – it is theoretically impossible to run a no-fee journal”. Read how every time his “argument” is countered he plays-the-man-rather-than-the-ball (a soccer term). “You failed to fill in a tax return for the journal so this proves you cannot run a journal without fees”.

So to Katie my suggestions would be:

  • Resist rushing into conventional solutions. You sell your soul if you involve a commercial publisher (look at all the “Society” publishers who are now controlled by for-profit publishers). [2]
  • Look to collaborate with others. There is no reason why JMLR and ecancer shouldn’t share infrastructure development, for example.
  • Be open (as you have been) and ask for help and advice. There is a lot going on.

So reader, if you have a bright idea for how new methods in #scholpub can be developed, bring them forward. And if you can help to develop new ways of running almost-cash-free journals, I think there will be people interested in helping get this off the ground.

NOW I can get back to writing code

 

[0] The OKFN openaccess list has much of its input from outside academia and has a collaborative and innovative approach.

[1] There is a reality program in the UK called http://en.wikipedia.org/wiki/The_Only_Way_Is_Essex often abbreviated to TOWIE. I don’t watch it and it has no relation to Open Access other than the cadence of the title.

[2] In a later post I shall compare scholarly publishing to banks – both of course are highly respectable, ultra-efficient, scandal-free, publicly loved, high-value-providers and respected.

I ask Elsevier for their list of articles published as “hybrid Open Access”

Scientists can pay up to 5000 USD for an “Open Access” article in Elsevier journals (i.e. published normally in a toll-access journal but free to read). Leaving aside whether this is good value, I wish to know what the authors get for their money. In particular, since the authors presumably wish me to be able to read their articles, Elsevier must make it easy for me to find them.

 

I ask the Director of Universal Access how I can find the articles and how they can be identified by readers:

 

To: “Wise, Alicia (ELS-OXF) [Director of Universal Access]” <A.Wise@elsevier.com>,

Dear Director of Universal Access,
I am interested in articles published in toll-access Elsevier journals as “author-pays Open Access”, often called “hybrid Gold”. See, for example, http://download.thelancet.com/flatcontentassets/authors/article-sponsorship_form.pdf

for the Lancet, for which the price is 400 GBP (ca. 600 USD) per page (it is 3000-5000 USD for articles in other journals). I am interested in all subjects, not just biomedical.
I would be grateful if you could answer the following questions:
* What, if any, is Elsevier’s precise name for this scheme and where is it described?
* how many articles in total have been published under this scheme?
* what explicit licence, if any, is used on the articles?
* how are the articles labelled in the Elsevier journal (i.e. how is the licence and the Open Access information made apparent)?
* where is the machine-readable list of all articles published under this scheme?  I wish to download and analyze all of them.

Thanks in advance for your answers. I shall be making the results Openly available as I shall be contributing to public discussion on this matter in a few weeks.
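To illustrate what I mean by “machine-readable”: with such a list in hand, even a trivial script could answer the questions above. A sketch, against a purely hypothetical CSV file (the file name and column names are my invention; no such list has been published, which is rather the point):

    # Sketch only: "hybrid_oa_articles.csv" and its columns (doi, journal, licence)
    # are hypothetical; no such machine-readable list exists that I know of.
    import csv
    from collections import Counter

    licences = Counter()
    journals = Counter()

    with open("hybrid_oa_articles.csv", newline="") as f:
        for row in csv.DictReader(f):
            licences[row["licence"]] += 1
            journals[row["journal"]] += 1

    print("articles per licence:", dict(licences))
    print("journals with most hybrid OA articles:", journals.most_common(10))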


Hybrid Gold OA and the Cheshire Cat’s Grin

Suppose you’re a subscription journal. Hybrid Gold Open Access (OA) means you just keep selling subscriptions and — on top of that — you can charge (whatever you like) as an extra fee for selling single-article hybrid gold.

How much do you charge? Well, if you publish 100 articles per year and your total annual revenue is £XXX, you charge 1% of £XXX for hybrid Gold OA per article.

Once you’ve got that (plus your unaltered subscription revenue of £XXX) you’ve earned £XXX + 1% for that year.

Good business.

And if the UK publishes 6% of the world’s articles yearly, then on average 6% of the articles in any journal will be fee-based hybrid Gold OA, thanks to Finch and RCUK. That means worldwide publisher revenue — let’s say it’s £XXX per year — will increase from £XXX per year to:

£XXX + 6% per year
Not bad.

Publishers are not too dense to do the above arithmetic. They’ve already done it. That is what hybrid Gold is predicated upon. And that is why publishers are so pleased with Finch/RCUK: “The world purports to want OA. Fine. We’re ready to sell it to them — on top of what we’re selling them already.”

In the UK, Finch and RCUK have obligingly eliminated hybrid Gold OA’s only real competition (Green OA) — Finch by ignoring it completely, and RCUK by forcing fundees to pay for Gold rather than provide Green whenever the publisher has the sense to offer Gold.


Of course publishers will say (and sometimes even mean it) that they are not really trying to inflate their income even further. As the uptake of hybrid Gold increases, they will proportionately lower the cost of subscriptions — until subscriptions are gone and all that’s left, like the Cheshire Cat’s grin, is Gold OA revenue (now no longer hybrid but “pure”) — and at the same bloated levels as today’s subscriptions.
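Laid out as a toy calculation (the starting figure is invented; the point is the trajectory, not the numbers), the scenario looks like this:

    # The "Cheshire Cat" scenario: hybrid Gold uptake rises, subscription prices are
    # lowered proportionately, and total publisher revenue never changes.
    # All numbers are invented for illustration.
    total_revenue = 1_000_000   # GBP per year, held constant throughout

    for gold_uptake in (0.0, 0.25, 0.5, 0.75, 1.0):
        gold_revenue = total_revenue * gold_uptake
        subscription_revenue = total_revenue - gold_revenue
        print(f"Gold uptake {gold_uptake:.0%}: "
              f"subscriptions {subscription_revenue:,.0f} GBP, "
              f"Gold {gold_revenue:,.0f} GBP, "
              f"total {subscription_revenue + gold_revenue:,.0f} GBP")

    # The subscription body fades away; the grin (the same total revenue, now
    # relabelled "pure Gold") remains.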

So what? The goal was always OA, not Green OA or Gold OA. Who cares if all that money is being wasted?

I don’t.

I care about all the time (and with it all that OA usage and impact and research progress) that has been wasted, and that will continue to be wasted, as the joint thrall of Gold Fever and Rights Rapture keep the research community from mandating the cost-free Green OA that would bring them 100% OA globally in next to no time, and leave them instead chasing along the CC-BYways after gold dust year upon year, at unaffordable, unnecessary and unscaleable extra cost.

§ § § §

Let’s hope that RCUK will have the sense and integrity to recognize its mistake, once the unintended negative consequences are pointed out, and will promptly correct it. The policy can still be corrected completely with two simple patches.

RCUK should:

(1) Drop the implication that if a journal offers Green and Gold, RCUK fundees must pick Gold

and

(2) Downgrade to a request the requirement that the Green option must be within the allowable embargo interval.

(The deposit of the refereed final draft would still have to be done immediately upon publication, but the repository’s “email-eprint-request” Button could be used to tide over user needs by providing “Almost-OA” during the embargo.)

There is no way to resurrect the current RCUK policy in such a way as to rule out hybrid Gold: to do that, the policy would have to be re-conceived and re-written completely. If that were done, all of the fatal bugs of the present draft would be gone:

“You must provide at least Gratis OA within the allowable embargo. This can be done either by paying for pure Gold OA (not hybrid) – but then the OA must be Libre and unembargoed (and the paper should be deposited in the fundee’s repository anyway). Or you can provide Gratis Green OA to the refereed final draft within the allowable embargo (but the deposit itself must be done immediately upon acceptance for publication).”

That would be a fine policy, especially if beefed up with a link to submission to HEFCE [Higher Education Funding Council for England] for RE.

Stevan Harnad

Come to the Food and Energy Security Launch Event!

We would like to invite you to attend the Food and Energy Security launch event. This will take place at the Plant Biology Congress 2012 in Freiburg. Come to the Wiley stand No.14 from 4pm on 2nd August and meet the Editors. Editor-in-Chief Martin Parry will be there, accompanied by Associate Editors Bill Davies and Ricardo Azevedo to meet authors and reviewers and discuss this new journal. We would love to see you there and answer your questions. Free refreshments and T-shirts!

Food and Energy Security is the new Wiley Open Access journal published in Association with the AAB (Association of Applied Biologists). It is an international high quality and high impact journal publishing original research on agricultural crop and forest productivity to improve food and energy security.

You can read our first issue here >
And submit your paper to the journal here >

What is a species?

How do we determine whether two animals are of the same species?  It is not enough to judge based on similar appearance: Chihuahuas and Dalmatians look vastly different but we consider these to be the same species (in case you’re wondering the crossbreed is a “Chimatian”).  So where does the boundary fall?  In his book, Systematics and the Origin of Species (1942), famed evolutionary biologist Ernst Mayr proposed the definitive criteria that are still used today: “groups of actually or potentially interbreeding natural populations, which are reproductively isolated from other such groups”.

An article published today in PLOS ONE describes the first reported creation of a synthetic species, a fruit fly that has been christened Drosophila synthetica. Author Eduardo Moreno of the University of Bern describes a combination of commonly used lab variants that result in a fly that is capable of producing fertile offspring with others that are genetically the same, but not with its wild-type predecessor, Drosophila melanogaster. D. synthetica has smaller, paler eyes compared to D. melanogaster, and its wings also differ. But is synthetica a different species from melanogaster?

The key may be in the phrase: “a group of … interbreeding natural populations”.  Lions can famously interbreed with tigers, to produce ligers.  There is at least one documented case of a liger that was coaxed into breeding with a lion to produce a rather unhealthy offspring that grew to adulthood.  The reproductive barrier is not complete, but lions and tigers are still considered separate species.  The issue?  Ligers do not exist in the wild because their habitats do not overlap.  The potential for lions and tigers to interbreed has only been demonstrated in captivity.  Moreno acknowledges that D. synthetica may not meet the Mayr definition, and specifically refers to it as a synthetic species, to distinguish it from a natural species.  He cites two reasons for this: “not only because it has been created in the lab but also because it may never be able to survive outside that laboratory environment.”  Regardless, it seems he will challenge our notions of what it means to be your own species.

Citation: Moreno E (2012) Design and Construction of “Synthetic Species”. PLoS ONE 7(7): e39054. doi:10.1371/journal.pone.0039054

Read Food and Energy Security’s First Issue!

Food and Energy Security has published its first issue. From a retrospective of two decades in plant biotechnology to the identification of a serine carboxypeptidase in Arabidopsis, this brings together some of the top papers on important issues of global food security. We are actively seeking submissions from countries with expanding agricultural research communities and encourage you to put your science into practice by submitting your next paper. Below are some highlights from this issue:

Economists are not dismal, the world is not a Petri dish and other reasons for optimism by Richard Tiffin
Abstract: One of the recurrent themes in the debate around how to ensure global food security concerns the capacity of the planet to support its growing population. Neo-Malthusian thinking suggests that we are in a situation in which further expansion of the population cannot be supported and that the population checks, with their dismal consequences envisaged by Malthus, will lead to a new era of stagnant incomes and population. More sophisticated attempts at exploring the link between population and income are less gloomy however. They see population growth as an integral component of the economic growth which is necessary to ensure that the poorest achieve food security. An undue focus on the difficulties of meeting the demands of the increasing population risks damaging this growth. Instead, attention should be focused on ensuring that the conditions to ensure that economic growth accompanies population growth are in place.

Water use indicators at farm scale: methodology and case study by Annette Prochnow, Katrin Drastig, Hilde Klauss and Werner Berg
Abstract: Three indicators to assess water use at the farm scale are introduced: farm water productivity, degree of water utilization and specific inflow of technical water. They can assist farmers in understanding the water flows on their farms and in optimizing water use by adapting agronomic measures and farm management. Factors that mainly affect these indicators and general approaches to optimize water use in farms are discussed as well as the further research required.

Food and Energy Security is published in association with the Association of Applied Biologists (AAB). It combines Wiley’s publishing expertise with the AAB’s long established reputation for world class research.

To find out when new articles and issues are published sign up for e-toc alerts >

A Serious Potential Bug in the RCUK Open Access Mandate

       David A. Arnold wrote: “Stevan – you are wrong about RCUK mandating green OA. It does not. The new RCUK policy only requires green OA if the journal does not offer gold OA. Since the vast majority of journals now offer a gold route, the green option is essentially redundant. Here is the wording:

       The Research Councils will continue to support a mixed approach to Open Access. The Research Councils will recognise a journal as being compliant with their policy on Open Access if:

       1. The journal provides via its own website immediate and unrestricted access to the publisher’s final version of the paper (the Version of Record), and allows immediate deposit of the Version of Record in other repositories without restriction on re-use. This may involve payment of an “Article Processing Charge” (APC) to the publisher. The CC-BY license should be used in this case.

Or

       2. Where a publisher does not offer option 1 above, the journal must allow deposit of Accepted Manuscripts that include all changes resulting from peer review (but not necessarily incorporating the publisher’s formatting) in other repositories, without restrictions on non-commercial re-use and within a defined period.


Here is my response to David. But as you will see, although I am doing my level best to disagree with him, in the end, it turns out he was basically right:

David, I think you are wrong that “the vast majority of journals offer a gold route”.

I also think that you are misconstruing the RCUK “mixed” approach (and the semantics of “inclusive disjunction,” i.e., “either A or B or both”).

I think RCUK fundees can comply with the RCUK mandate by depositing a peer-reviewed draft in their OA institutional repository — either the publisher’s version, by paying for Gold OA, or the author’s final draft (possibly after an allowable embargo interval), i.e., Green OA.

My understanding is that the constraint on journal policy is intended to be on the journal (i.e., that the journal must either offer Gold OA or endorse Green OA within the allowable embargo interval) not on the author.

The idea is that journals should know in advance that an RCUK-funded author is under a prior contractual obligation, as a condition of funding, to publish only in a journal that either offers Gold OA or (allowably embargoed) Green OA.

I don’t think the mandate is that if a journal offers both Gold and Green, then the author is obliged to pay for Gold instead of providing Green cost-free. (If it were, that would be extremely foolish and wasteful.)

However, I do think that there is a bug in the RCUK mandate that should on no account be imitated by other funders (and that should be corrected by RCUK):

(1) It is a big mistake to insist that an RCUK author must pay for Gold if his journal of choice is a hybrid Gold journal that offers Gold but does not endorse Green within the allowable embargo interval:

PATCH: Better to allow embargoed deposit and reliance on the repository’s automated “email-eprint-request” Button to provide “Almost OA” during the embargo via one click from the user to request an individual copy for research purposes, and one click from the author to comply.

(2) Much more important than (1) is the distinct possibility that RCUK’s mixed either/or policy provides an incentive to publishers — even the publishers of the 60% of journals that already endorse immediate, un-embargoed Green OA today — to change their policy so as to offer a high-priced hybrid Gold OA option, coupled with an infinitely long Green OA embargo, in order to ensure that the RCUK author must pay for hybrid Gold OA. This would be a terrible, unintended consequence of the RCUK policy, and a huge blow to OA and Green OA worldwide.

I cannot say whether the RCUK policy will have this terrible unintended consequence. All I can do is urge RCUK to patch it up — and the rest of the world to ignore it.

The best solution would be the PATCH. If the RCUK policy is not patched, then I predict a tremendous (and justified) researcher revolt against the policy, with the result that the policy will not be complied with, and will have to be revised after a few lost fallow years of failure.

Other funders and institutions should learn a lesson from this: There is a trade-off between embargo-tolerance and OA-cost: If you don’t want to induce journals to charge — and oblige authors to pay — needless and bloated hybrid Gold OA fees, don’t try to constrain journal choice too radically: mandate immediate deposit (whether Gold or Green), specify an allowable Green OA embargo length (preferably no more than 6 months), but don’t forbid authors to publish in journals whose embargo exceeds the specified length. Rely on the Button (and human nature) rather than forcing authors into gratuitous expenses, constrained journal choices, or non-compliance with the mandate.
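For concreteness, the two-click “Almost-OA” workflow amounts to something like the following sketch (hypothetical names throughout; an illustration of the workflow described above, not the code of any actual repository software):

    # Minimal sketch of the "email-eprint-request" (Almost-OA) workflow:
    # one click from the requester, one click from the author.
    # All function and field names are hypothetical.

    requests = {}   # request_id -> (article_id, requester_email)

    def send_email(to, subject, attachment=None):
        print(f"email to {to}: {subject}")      # stand-in for a real mailer

    def fetch_deposit(article_id):
        return f"<closed-access deposit {article_id}>"   # stand-in for repository storage

    def click_request(request_id, article_id, requester_email, author_email):
        """Click 1: a reader asks for an individual copy for research purposes."""
        requests[request_id] = (article_id, requester_email)
        send_email(author_email, f"Please approve eprint request {request_id}")

    def click_approve(request_id):
        """Click 2: the author approves; the repository emails the eprint."""
        article_id, requester_email = requests.pop(request_id)
        send_email(requester_email, f"Requested eprint of {article_id}",
                   attachment=fetch_deposit(article_id))

    # Example: reader clicks, then author clicks.
    click_request("r1", "article-42", "reader@example.org", "author@example.ac.uk")
    click_approve("r1")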

Embargoes will die their well-deserved death as a natural matter of course, under the growing pressure of Green OA mandates, but not if a nonviable, unscalable mandate model is adopted.

Stevan Harnad

New Account Deal for Imperial College London

Imperial College London is the latest organization to sign up for a Wiley Open Access Account and pay for its researchers to publish open access articles with Wiley.  Authors affiliated with Imperial College London can publish research articles in Wiley Open Access journals and/or OnlineOpen, without directly paying any publication charges.  When authors submit to a Wiley Open Access journal or opt for OnlineOpen, they need to state their affiliation with Imperial and provide their authorisation code.

Imperial joins a number of funders who have opened a Wiley Open Access Account since this was launched earlier this year. Browse our listing to see the institutions / funders who have an account or partnership with Wiley Open Access.

Imperial authors can go here for more information and to check eligibility and apply for funding>

CUEA Knowledge Party

The Catholic University of East Africa (CUEA) is in party mode. We are in the final countdown for the launch of the much-awaited CUEA Learning Resource Centre (LRC): http://goo.gl/Ob4mr. This is an ultra-modern facility and one of the best in the sub-Saharan region.

As a warm-up to the launch, the CUEA library is organizing a “Knowledge Party”. We would like to take this auspicious occasion to expose faculty, administration, and the student community to open access (OA) resources and institutional repositories. This will be the first OA advocacy campaign at the institution. The objective is to come up with an IR policy by the end of the year.

This activity will be carried out jointly by library staff and students. The library recently formed a collaboration with student academic leaders who serve as the library’s “knowledge ambassadors”, promoting library resources and services: http://goo.gl/u4ril, http://goo.gl/SzC0D.

Find us on facebook http://goo.gl/MPF6R and twitter.com/#!/CUEALIBRARY