Scholarly Communication and Its Infrastructures – Society for Social Studies of Science conference, Dec. 2022 | Asura Enkhbayar, Tim Elfenbein

One of the panels accepted for the Society for Social Studies of Science conference, to be held in Mexico, 7-10 Dec. 2022.

“Asura Enkhbayar, Simon Fraser University; Timothy Weil Elfenbein, Community-led Open Publication Infrastructure for Monographs (COPIM)

Posted: February 28, 2022…

We are living in an era that has produced a profusion of new knowledge about the scholarly communication landscape: of what kinds of scholars publish in what kinds of venues; of the consolidation of the publishing and information analytics sector; and of the extent of the literature shifting to open access. And yet, we still have very little knowledge about the daily work of publishers and other intermediaries. Science studies differentiated itself from an earlier sociology of science, in part, by focusing on the everyday embodied work of producing science, embedded in object-filled spaces, with practical repertoires and flexible interpretive logics. Studies of scholarly communication have yet to embrace this turn toward everyday work, instead preferring distanced analysis of the publishing system’s inputs and outputs over labor, work process, and artifactually mediated interaction. What understandings of publishing might emerge from a shift toward a labor and process view of scholarly communication at large?

We particularly want to hear from those engaged in ongoing work on publishing infrastructure, at the articulation point of technology, scholarly practice, and representational form: for instance, work on metadata systems; annotation layers; software and code sharing; dataset and document versioning; datafication and analysis of text; modes of linking disparate digital objects into coherent research projects; modes of decomposing, recomposing, and augmenting existing scholarship. Those working at the edges of the scholarly communication ecology, rethinking the genres, forms, or emplacements of scholarship, are also well positioned to reflect on the mediation of knowledge.

Contact: asura.enkhbayar@gmail.com, timelf@gmail.com

Keywords: Scholarly Communication, Publishing, Labor”


How UiT The Arctic University of Norway protects researchers’ freedom to choose whatever publication venue they want | Plan S

“In 2008 Harvard’s Faculty of Arts & Sciences voted unanimously to adopt a ground-breaking open access policy. Since then, over 70 other institutions, including other Harvard faculties, Stanford, and MIT, have adopted similar policies based on the Harvard model. In Europe, such institutional policies have so far been slow to get off the ground.

We are beginning to see that situation change. In 2021 the University of Tromsø – The Arctic University of Norway (UiT) adopted an Open Access policy that came into force on 1st January 2022.

Here, UiT members Camilla Brekke (Prorector for Research and Development), Johanne Raade (Library Director), Tanja Larssen (Open Science Advisor) and Per Pippin Aspaas (Head of Library Research and Publishing Support), tell us about the process of creating and implementing their policy….”

Public Access Language in the U.S. Innovation & Competition Act (USICA) – SPARC

“The House and Senate are currently considering a key legislative package aimed at bolstering America’s science and technology investments. The Senate bill, called the U.S. Innovation & Competition Act (USICA), includes language that supports providing public access to taxpayer-funded research results. 

Section 2527 of USICA would codify the current policy established by President Obama’s 2013 White House Memorandum on Increasing Public Access to Federally Funded Scientific Research by “directing federal agencies funding more than $100 million annually in research and development expenditures to provide for free online public access to federally-funded research no later than 12 months after publication in peer-reviewed journals, preferably sooner.” 

This language signals Congress’ continued support for making taxpayer-funded research readily available and fully usable by scientists and the public alike. SPARC supports maintaining this provision, even as we continue to advocate for a zero-embargo national open access policy. 

Current Status: On March 28th, the Senate cleared a procedural hurdle to begin the conference process with the House. The House is expected to officially call for a House-Senate Conference Committee to work out differences between the two bills in the coming days….”

Subscribe to Open Developments from Annual Reviews – Crowdcast

“Annual Reviews developed the Subscribe to Open (S2O) business model to make valuable scholarly content open to all – those at your institution, in your local community, and globally.

During this online event, you will learn more about the progress that we have made so far, as well as our exciting future plans for Subscribe to Open at Annual Reviews.

We’ll start with an energetic discussion among four panelists sharing a variety of perspectives on Annual Reviews’ move to open access through Subscribe to Open. Virginia Steel, UCLA librarian, will moderate a conversation among Richard Gallagher, President and Editor-in-Chief of Annual Reviews; Curtis Brundy, a university librarian; and Tracey Meares, a researcher and professor. A live Q&A with the audience will follow the panel discussion….”

Global Community Guidelines for Documenting, Sharing, and Reusing Quality Information of Individual Digital Datasets

Open-source science builds on open and free resources that include data, metadata, software, and workflows. Informed decisions on whether and how to (re)use digital datasets depend on an understanding of the quality of the underpinning data and relevant information. However, quality information, being difficult to curate and often context specific, is currently not readily available for sharing within and across disciplines. To help address this challenge and promote the creation and (re)use of freely and openly shared information about the quality of individual datasets, members of several groups around the world, collaborating with international domain experts, have developed international community guidelines with practical recommendations for the Earth science community. The guidelines were inspired by the guiding principles of being findable, accessible, interoperable, and reusable (FAIR). Use of the FAIR dataset quality information guidelines is intended to help stakeholders, such as scientific data centers, digital data repositories, and producers, publishers, stewards, and managers of data, to: i) capture, describe, and represent quality information for their datasets in a manner consistent with the FAIR Guiding Principles; ii) allow for the maximum discovery, trust, sharing, and reuse of their datasets; and iii) enable international access to and integration of dataset quality information. This article describes the processes that developed the FAIR-aligned guidelines, presents a generic quality assessment workflow, describes the guidelines for preparing and disseminating dataset quality information, and outlines a path forward to improve their disciplinary diversity.
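To make the kind of quality information the guidelines envision concrete, here is a minimal sketch, in Python, of how a repository might record a single quality assessment for a dataset in a structured, machine-readable form. The field names, values, and serialization are illustrative assumptions for this sketch, not the schema defined in the published guidelines.

```python
# A hypothetical, minimal structure for dataset quality information,
# loosely in the spirit of the FAIR-aligned guidelines described above.
# All field names here are illustrative assumptions, not the guidelines' schema.
import json
from dataclasses import dataclass, asdict

@dataclass
class QualityAssessment:
    dataset_id: str   # persistent identifier of the dataset, e.g. a DOI
    aspect: str       # what was assessed, e.g. "completeness"
    method: str       # how it was assessed, e.g. a named procedure or standard
    result: str       # the outcome, ideally from a controlled vocabulary
    assessed_by: str  # who performed the assessment
    date: str         # ISO 8601 date of the assessment

    def to_json(self) -> str:
        """Serialize the record so it can be shared and reused alongside the dataset."""
        return json.dumps(asdict(self), indent=2)

# Example: a data center records one completeness check for one dataset.
record = QualityAssessment(
    dataset_id="doi:10.1234/example-dataset",
    aspect="completeness",
    method="automated null-value scan",
    result="99.2% of required fields populated",
    assessed_by="Example Data Center",
    date="2022-03-01",
)
print(record.to_json())
```

Publishing such a record alongside the dataset, under its own persistent identifier, is one way the quality information itself could become findable and reusable rather than remaining buried in curation notes.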

Basically Everyone Tells Senators Tillis & Leahy That The SMART Copyright Act Is An Incredibly Dumb Copyright Act

We’ve already detailed why the latest bill from Senators Thom Tillis and Pat Leahy, the SMART Copyright Act, is dangerous to the future of the internet. You can read that earlier article, but the short summary is that it would deputize the Copyright Office, every three years, to arbitrarily bless certain “technological measures” that websites hosting third-party content would need to use. The not-so-hidden agenda here, pushed by Hollywood basically since the internet came on their radar, is that the Copyright Office will say that any site hosting user-uploaded content will need to purchase an upload filter to scan each upload to make sure it doesn’t include any of Hollywood’s content.

That upload filters routinely block perfectly legal speech is not the concern of Hollywood — or, apparently, of Tillis or Leahy (they just want to keep Hollywood happy).

Anyway, we already noted how Creative Commons responded angrily to Tillis’ office implying that Creative Commons supported the bill when it absolutely does not. But lots of other organizations are making it known that this bill would be a disaster for the open internet. A wide range of civil society groups, trade organizations, companies, and academics recently sent a letter explaining the many problems of the bill:

First, the proposed amendments to § 512(i) break the careful balance between innovation and copyright protection struck by the DMCA. For example, they significantly lessen service provider and user clarity and certainty in present and future technical measures that are employed to maintain a safe harbor for service and innovation. Rather than build confidence in the use of technical measures or incentivize further collaborative solutions, these changes would inject uncertainty into a law that has proven foundational and has supported creators, rightsholders, consumers, and online service providers of all kinds. The Copyright Office has recognized that in the decades since the passage of the DMCA, no “standard technical measures” (STMs) have emerged. Far from demonstrating an underlying flaw in the DMCA as the legislation appears to assume, this lack of standard technical measures is largely because the constructive uses of the Internet and the technologies and media involved have become so diverse. Identifying anything as “standard” under the new proposal, and avoiding technical conflicts between measures so identified, will become more, not less, difficult.

Second, the new § 514 would result in endless triennial litigation cycles through the creation of an entirely separate—and potentially unconstitutional—category of government-mandated “designated technical measures.” Section 514 gives the Copyright Office authority far beyond its competence and expertise to identify and mandate such measures, transforming it into an Internet regulator with responsibility for overseeing an elaborate, multi-agency bureaucratic process that would recur every three years. To avoid costly litigation and potentially extensive statutory damages, service providers would be effectively compelled to devote significant resources to implementing such measures, only to find themselves continuously exposed to renewed obligations each time new measures are designated. Such direct and heavy-handed governmental involvement in the creation of technical mandates for private industry conflicts with traditional U.S. standards policy.

This proposal would also put an agency with no engineering or other relevant expertise in charge of how digital products are designed, irrespective of whether copyright infringement is actually occurring. Additionally, the Copyright Office does not have the expertise to evaluate complex technical issues such as cybersecurity and competition. The legislation would put the government in the position of picking winners and losers in the market for content recognition technologies, which risks corruption and capture from specific businesses and vendors pitching their own products. The potentially overlapping and burdensome technical requirements designated through this process would ultimately harm users — risking their privacy and security, undermining the stability of services they rely on, and limiting choice and access to information.

Finally, digital services are already constantly fine-tuning their efforts to combat infringement online in response to the evolving tactics of commercial infringers, and they have done so with notable success. The legislation thus is not only unnecessary, but would freeze these efforts and stifle the ability of online services to get ahead of emerging challenges — locking collaboration into a triennial regulatory cycle and discouraging the private sector from making critical investments outside of these cycles. Within months of the designation of a technical measure, sophisticated infringers would find workarounds, while service providers would be on an endless cycle of “designated technical measure” rulemakings. Measures designated in one cycle could be rescinded in the next, creating uncertainty and constant churn.

A separate letter was also sent by a whole bunch of internet companies (not in the “big tech” category), such as Patreon, Etsy, Cloudflare, Pinterest, and Reddit, explaining just how damaging this bill would be:

Changing the DMCA could easily make our work too expensive, difficult, or risky, and the SMART Act would do just that. For example, the bill would authorize the Copyright Office to mandate copyright upload filters. It would create ambiguous legal terms, like “relevant service providers,” that we would have to wade through during drawn-out lawsuits few of us could afford. It would generate a complex maze of “standard” and “designated” technical measures that apply to different companies in different ways—we would have to figure out which ones we had to adopt, and if we got it wrong we would be back in court. This is all setting aside any actual copyright infringement, because the bill would allow large rightsholders to sue us just over whether we were using the right technologies.

Finally, it is not just about our businesses. Many of us know, first-hand, how improper copyright takedowns force our users’ non-infringing posts offline. Over-reliance on technology promises to exacerbate those concerns, stripping your constituents of expressive, creative, and economic opportunities. With stakes this high, we hope you will reconsider the SMART Act and instead focus on pro-innovation proposals that can expand opportunities for us and our users.

The Internet Archive has also made it clear just how dangerous this bill would be:

This bill and its supporters do not represent the public’s interest in fair copyright policy and a robust and accessible public domain. That is a shame, because much good could be done if policymakers would put the public’s interest first. For example, the Copyright Office—which holds records of every copyright ever registered, including all those works which have passed into the public domain—could help catalogue the public domain and prevent it from being swept up by today’s already-overzealous automated filtering technologies (an idea inspired by this white paper from Paul Keller and Felix Reda). Instead, the public domain continues to be treated as acceptable collateral damage in the quest to impose ever-greater restrictions on free expression online.

This bill is extremely harmful. So far, all signs suggest that Tillis and Leahy don’t care about that at all. But the people who use the internet every day should care about it — and should care about the cavalier attitude these senators have towards the internet, all because Hollywood supports them and hates the open internet.

NYU Faculty Cluster Hiring Initiative: Building STEM for the Public Good: Cultivating Openness in the Sciences

“Primary School: Division of Libraries

Participating Departments/Units

Business and Government Information Services
Data Services
Scholarly Communications & Information Policy
Science Services

Apply via Interfolio

STEM Librarian
Librarian for Open Innovation …”

Numbers Speak for Themselves, or Do They? On Performance Measurement and Its Implications – Berend van der Kolk, 2022

van der Kolk, B. (2022). Numbers Speak for Themselves, or Do They? On Performance Measurement and Its Implications. Business & Society. https://doi.org/10.1177/00076503211068433

Abstract: Performance measurement systems have the potential to improve organizational outcomes, but they often come at a cost. This commentary highlights the individual, organizational, and societal costs of performance measurement systems and explores how such costs could be reduced.


What we are working on: 2022-03-25 | Invest in Open Infrastructure

“…What we are working on: We shared our journey and lessons learnt from evolving our governance. In our latest blog post, we shared some of our readings and research into non-profit governance, our reflections and motivations for designing new governance structures and bodies, and what we have learnt on the way.

As next steps in our governance roadmap, we are creating charters with our Steering Committee and Community Oversight Council to bring clarity to our respective responsibilities and ways of working.

We welcomed (back) Saman Goudarzi to IOI! Saman is a recent graduate of the University of Toronto and worked with us previously on the Future of Open Scholarship report. She’ll be working with us as a part-time research analyst on several ongoing projects. We are looking forward to working with her again and onboarding more research analysts to support our research work in the future.

With Saman’s help, we are looking at the literature defining infrastructure, particularly in the context of open science and scholarly communication.

We’re finalizing our initial draft of research on water utility funding models and cost frameworks as we explore the parallels between funding for water supply and open infrastructure. We look forward to releasing this research in a preliminary investigation report in the near future.

We continue to work on our strategic roadmap following the conclusion of our strategy retreat. We look forward to having details to share publicly in the near future.

We have a new Research Organization Registry (ROR) ID for IOI: 0588n4k65….”
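For anyone who wants to use the new identifier programmatically, here is a minimal sketch; it assumes the public ROR REST API and its v1 JSON response fields (documented at ror.org), neither of which is covered in the IOI post itself.

```python
# A minimal sketch of resolving IOI's new ROR ID with the public ROR REST API.
# The endpoint and response fields are assumptions based on ROR's published
# v1 API documentation; they are not described in the IOI post itself.
import requests

ROR_ID = "0588n4k65"  # IOI's identifier, as announced above

response = requests.get(f"https://api.ror.org/organizations/{ROR_ID}")
response.raise_for_status()
record = response.json()

print(record["id"])    # canonical form of the identifier
print(record["name"])  # the organization's primary name
```

Resolving the ID this way returns the registry's canonical metadata for the organization, which is what makes ROR IDs useful for unambiguous affiliation linking across systems.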

Frontiers | Toward More Inclusive Metrics and Open Science to Measure Research Assessment in Earth and Natural Sciences | Research Metrics and Analytics

“Diversity, equity and inclusion are key components of Open Science. In achieving them, we can hope that we can reach a true Open Access of scientific resources, one that encompasses both (i) open access to the files (uploading them to a public repository) and (ii) open access to the contents (including language). Until we decide to move away from profit-driven journal-based criteria to evaluate researchers, it is likely that high author-levied publication costs will continue to maintain inequities to the disadvantage of researchers from non-English speaking and least developed countries. As quoted from Bernard Rentier, “the universal consensus should focus on the research itself, not where it was published.” ”

An open science argument against closed metrics

“In the Open Scientist Handbook, I argue that open science supports anti-rivalrous science collaborations where most metrics are of little, or of negative value. I would like to share some of these arguments here….

Institutional prestige is a profound drag on the potential for networked science. If your administration has a plan to “win” the college ratings game, this plan will only make doing science harder. It makes being a scientist less rewarding. Playing finite games of chasing arbitrary metrics or ‘prestige’ drags scientists away from the infinite play of actually doing science….

As Cameron Neylon said at the metrics breakout of the ‘Beyond the PDF’ conference some years ago, “reuse is THE metric.” Reuse reveals and confirms the advantage that open sharing has over current, market-based, practices. Reuse validates the work of the scientist who contributed to the research ecosystem. Reuse captures more of the inherent value of the original discovery and accelerates knowledge growth….”