Abstract: Implementing open access is a tough job. Legitimate authority, sufficient resources and the right timing are crucial. Pioneers, role models and flagship institutions have all faced considerable challenges in meeting their own aims and achieving recognized success. Professionals charged with implementing policy typically need several years to make significant progress. Many institutions adopting open access policies probably need to do more, much more, if the commitment to open access is to be meaningful.
A first generation of open access policy development and implementation is coming to a close. It is thus possible to begin evaluation. Evaluating implementation establishes evidence, enables reflection, and may foster the emergence of a second generation of open access policies.
This study is based on a small number of cases examining the implementation of open access around the world. Some of the pioneer institutions with open access mandates have been included, as well as some newer cases. The emergence of new stakeholders in publishing, such as digital repositories, research funders and research organisations, is also examined.
Because this is a groundbreaking study, no claim is made that the results are representative. The emphasis is on variety and on defining a methodological standard. Each case is reconstructed individually on the basis of public documents and background information, and supported by interviews with professionals responsible for open access implementation.
Implementation is typically based on targeting researchers as authors. Indeed, the author is pivotal to any open access solution. This is the ‘tertium comparationis’ that facilitates an examination of the similarities and differences across instances in an effort to build a broader policy research agenda.
In a final section, open access is placed in the wider context of the evolution of digital scholarship. This clarifies how published research results are destined to become a key component of digital research infrastructures that provide inputs and outputs for research, teaching and learning in real time.
The ten cases examined in detail are:
– Refining green open access policy: Queensland University of Technology (September 2003)
– Refining policy to foster deposit: University of Zurich (July 2005)
– National platform, open collection, decentralized policy: the HAL platform (June-October 2006)
– Maximising a funder’s impact: The Wellcome Trust (October 2006)
– Implementing open access as a digital infrastructure: UK PMC (January 2007)
– Learning from global research infrastructure: SCOAP3 (April 2007)
– Linking public access to open data: Howard Hughes Medical Institute (January 2008)
– Open access to all publications, internationally: Austrian Science Fund (FWF, March 2008)
– One policy, sixty publication strategies: Fraunhofer-Gesellschaft (July 2008)
– Open Access complements the Research Information System: The University of Pretoria (May 2009)
“1. ‘Selling’ knowledge is a zero sum game….
2. Sharing knowledge transforms our relationships….
3. Sharing knowledge creates power….”
This Halloween season we take a look back at some of the spookiest images and creepiest findings that we published in PLOS ONE in 2016. Creepy, weird, skin-crawling, or just plain gross – we hope you …
With the exponential growth of the Open Access movement over the past decade, it is undeniable that major steps are underway toward broadening both the reach and availability of scientific research. But not every development has been a step in the right direction, and for some, Open Access has acquired negative connotations. The most notable unfortunate byproduct of the boom in the number of, and access to, scientific journals has been the dramatic rise in predatory publishing.
Predatory publishers can be defined as publishers that operate on an exploitative business model, which can involve, among other things: charging fees to authors and other contributors without providing adequate peer review; misrepresenting personnel affiliated with the company; misrepresenting the company’s location, contacts and addresses; failing to provide archiving and plagiarism checking; and failing to provide professional-grade editorial and publishing services on manuscripts.
The combination of the Open Access journal model and increased pressure on academics to prove themselves by publishing more and being cited has created fertile ground for unethical practices. By falsifying journal information, faking editorial board members, and hiding behind a general lack of transparency, predatory publishers have been able to prey upon academics desperate to act fast and get their names out there.
With so many new journals flooding the field, it can be very difficult to tell quality publishers from fakes.
In some instances, it is the authors themselves who exploit the system for their own benefit. With such a glut of new material, plagiarism of others’ easily accessible work has skyrocketed, and self-plagiarism, in which an author reuses one of his or her own previously published articles as a template for “new” work with only minor changes, is also on the rise.
As part of a mounting backlash against such practices, a growing movement on social media platforms works to highlight the actions and methods of predatory publishers and inform the public about them. The Twitter hashtag #predatorypublishing has been effective in spreading the message through current events and academic articles relating to deceitful open access trends. Facebook has also become an instrumental battleground in addressing predatory publishing culture, with a dedicated watchdog group of more than 500 members that works to spread awareness of the problem among academics from countries and backgrounds that might otherwise not know of such dangers.
The open platform of social media and blogs is helping academics to identify and root out the culprits. One of the most prominent leaders of this effort is Jeffrey Beall, a librarian and professor at the University of Colorado Denver. His vigilant commentary on the scientific publishing field helps many stay abreast of the changes and developments at work, but perhaps his most important contribution has been creating and maintaining a list of known predatory publishers for all to reference. Beall has established 52 criteria for determining whether a publisher qualifies as predatory. These conditions range from distributing spam emails to falsifying details of journals’ editorial personnel.
The open access movement in academic publishing is in a state of flux. The boom in growth, awareness, and access is rooted in the legitimate need for all researchers to have a voice and a platform through which to be heard and to learn. In time, a push for standards will establish an ethical and level playing field, and as more predatory publishers are identified every year, the honest, high-quality open access journals and publishers will become even more vital.
“Sanford Thatcher has written a valuable, if anecdotal, analysis of some papers residing on Harvard’s DASH repository (Copyediting’s Role in an Open-Access World, Against the Grain, volume 23, number 2, April 2011, pages 30-34), in an effort to get at the differences between author manuscripts and the corresponding published versions that have benefited from copyediting.
“What may we conclude from this analysis?” he asks. “By and large, the copyediting did not result in any major improvements of the manuscripts as they appear at the DASH site.” He finds that “the vast majority of changes made were for the sake of enforcing a house formatting style and cleaning up a variety of inconsistencies and infelicities, none of which reached into the substance of the writing or affected the meaning other than by adding a bit more clarity here and there” and expects therefore that the DASH versions are “good enough” for many scholarly and educational uses….”
“Designed to take full advantage of the online medium, HMX integrates foundational principles and clinical applications.
Share in the Harvard Medical School experience. Learn with the material that Harvard Medical School offers to our incoming students over the summer as an opportunity to prepare for a demanding curriculum and to our current students as part of the flipped classroom model….”
“ReproZip allows you to pack your experiment along with all necessary data files, libraries, environment variables and options.
Anybody can then reproduce the experiment on a different machine, without tracking down and installing the dependencies, or even having to run the same operating system! …”
“However, all working groups agreed that reproducibility should be more than just hygiene around the data: It should come as part of the “operating system” of science. It should be a given.
Creating such an environment might be difficult to achieve in the short term, because it will require a cultural shift of how we do science. Open Science is perceived as an integral part to this idea, but this will require some of the leading market forces to enforce these best practices. However, several movements are already being pursued: ACM Journals are now awarding badges for papers whose findings are reproducible, and more and more conferences are beginning to hand out prizes for the most reproducible paper. If more journals and conferences would require to always publish data and code, this could go a long way. In addition, there’s great technology being developed to aid with these goals. For instance, ImpactStory is an online platform that tracks your impact in the open science community by awarding badges for reproducibility, openness, and social media impact. ReproZip is a framework that lets you easily store all your data, methods, and environment (including software dependencies and whatnot) in a Docker container, so you can easily store, share, and load entire experiments, no matter where you are….”
“The web is awash with uncredited images. The tech startup Mediachain Labs is hoping to get photographers the credit they’re due.
They’ve done this by ingesting a trove of images with Creative Commons licenses from over 30 image sharing platforms such as Flickr, along with the attribution found on those platforms. It then used neural network-powered content identification technology to de-duplicate over 400 million images, leaving it with a base of 125 million photos.”
“Results of the survey on the use of Bioline International, obtained on October 29, 2016 at 19:06.”
“In celebration of Open Access week, Bioline International can report that, in the single month of October 2016, more than 1,350,000 full text downloads of articles were made from bioscience journals published in 16 developing countries. Usage statistics are reported on the fly from the web site, see http://www.bioline.org.br/, right hand side of home page. This high usage demonstrates the importance of research from these regions to the progress of international science.
A recently launched online survey of users has recorded some 250 responses to date from 59 countries – see http://bioline.org.br/survey for the results so far. We are hoping to establish which particular aspects of Bioline make the site so well-used – is it because it is Open Access, or is it because the information is difficult to find elsewhere, or . . .? …”
“2,207 users from more than 122 countries around the world have implemented repositories with DSpace, the popular out-of-the-box, open-source repository software, in order to openly share and preserve digital scholarly information.”
“OpenAIR was a very early open access repository at a university, putting RGU at the leading edge of what we know now as Green Open Access publishing.”
“The contents of DASH, Harvard’s open-access repository, are now indexed in Yewno and Iris, two new academic search engines connecting related works through common concepts, and using visual interfaces to support semantic exploration rather than keyword searching.”