Opinion: Why we’re becoming a Digital Public Good — and why we aren’t | Devex

“A few months ago, Medtronic LABS made the decision to open source our digital health platform SPICE and pursue certification as a Digital Public Good. DPGs are defined by the Digital Public Goods Alliance as: ‘Open-source software, open data, open AI models, open standards, and open content that adhere to privacy and other applicable laws and best practices, do no harm by design, and help attain the Sustainable Development Goals.’ The growing momentum around DPGs in global health is relatively new, coinciding with the launch of the U.N. Secretary-General’s Roadmap for Digital Cooperation in 2020. The movement aims to put governments in the driver’s seat, promote better collaboration among development partners, and reduce barriers to the digitization of health systems.”

Giving students everywhere up-close access to a world of art – Harvard Gazette

“Since its inception, the database of cultural heritage images available for free online with IIIF capability has continued to grow. In 2022, the IIIF community estimated that, across all their participating cultural heritage institutions, they’ve made more than 1 billion items available.

“With IIIF, we’re investing in the cultural heritage image community,” Snydman said. “Our goal is global, universal, as open as possible. It’s not just about Harvard’s images; it’s about enabling students and faculty to interact in the very same way with images at Oxford, the Library of Congress, or the Vatican that they do with images held at Harvard. The code word for this is interoperability.”

Of the 1 billion IIIF-compatible items, about 6 million are held in Harvard’s library collections. Everything from 500-year-old maps to modern photographs is viewable in high resolution by anyone with an internet connection. Emily Dickinson’s pencil strokes can be magnified and examined, and Persian manuscripts like the one studied by Kim’s class can be compared with illustrations from the same region and period held at the Library of Congress….

“The fact that IIIF has been able to become a universal standard, and that it’s all open-source — that has exciting implications for democratized learning,” said Snydman. “Students and scholars of all ages have the opportunity to learn with images — not just in a physical classroom or library, not just during certain hours, and not just on Harvard’s campus. This is a great example of how technology can be used to minimize inequalities in education and give open access to knowledge.” …”
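To make the interoperability concrete: every IIIF-compliant image server answers the same URL pattern ({base}/{identifier}/{region}/{size}/{rotation}/{quality}.{format}) and publishes an info.json description of each image, so the same client code works against Harvard, Oxford, or Library of Congress servers alike. Below is a minimal Python sketch of that pattern; the base URL and identifier are placeholders, not real collection items.

from urllib.parse import quote
from urllib.request import urlopen
import json

IIIF_BASE = "https://example.org/iiif"     # placeholder IIIF image server
IDENTIFIER = "manuscript-folio-42"         # placeholder image identifier

def iiif_image_url(region="full", size="max", rotation="0", quality="default", fmt="jpg"):
    # IIIF Image API 3.0 URL: {base}/{id}/{region}/{size}/{rotation}/{quality}.{format}
    return f"{IIIF_BASE}/{quote(IDENTIFIER, safe='')}/{region}/{size}/{rotation}/{quality}.{fmt}"

# info.json advertises the image's full dimensions and supported tile sizes.
with urlopen(f"{IIIF_BASE}/{quote(IDENTIFIER, safe='')}/info.json") as resp:
    info = json.load(resp)
print(info["width"], info["height"])

# Request a 1000x1000-pixel detail scaled to 500 pixels wide -- the kind of call a
# viewer issues when a reader magnifies pencil strokes in a manuscript.
print(iiif_image_url(region="0,0,1000,1000", size="500,"))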

Open Net Zero—can we build a web of net-zero data for everyone? | Icebreaker One, 2022-09-22 | Gavin Starks

“Today, Icebreaker One announces Open Net Zero search at https://opennetzero.org. It is a starting point for net-zero data infrastructure built to address commercial, non-commercial, government and public needs.  It’s designed to help make net-zero data discoverable, accessible and usable. There is a lot of Open Data related to net zero (e.g. company disclosures) and we aim to make this far more discoverable than it is today.  However, much of the data needed to drive net-zero decisions is not openly licensed or free for anyone to use. We aim to make this data more discoverable. To address restricted usage, we are building a Trust Framework for data sharing. This enables Shared Data to be discovered and licensed at scale.  We are not building a ‘database’ of all the data. We are working with partners [see below] to enable all the data to be more discoverable using open standards. Ideally, anyone should be able to make their own search engine or build their own data lake based on these open standards.  …”
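The "build your own search engine" idea rests on publishing dataset metadata in open, machine-readable catalogue formats. As an illustration only, here is a minimal Python sketch that harvests a DCAT-style JSON catalogue into a small index; the endpoint and field names are hypothetical and are not Open Net Zero's actual API.

import json
from urllib.request import urlopen

CATALOGUE_URL = "https://example.org/catalogue.json"    # hypothetical catalogue endpoint

def harvest(url):
    # Download a catalogue and keep only the fields a simple search index needs.
    with urlopen(url) as resp:
        catalogue = json.load(resp)
    for dataset in catalogue.get("dataset", []):        # DCAT-style dataset list (assumed layout)
        yield {
            "title": dataset.get("title"),
            "publisher": (dataset.get("publisher") or {}).get("name"),
            "licence": dataset.get("license"),          # open vs. shared/restricted use
            "landing_page": dataset.get("landingPage"),
        }

# Because the catalogue format is an open standard, anyone can run the same harvest
# and build their own search engine or data lake from the results.
index = list(harvest(CATALOGUE_URL))
print(f"indexed {len(index)} datasets")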

Reducing Barriers to Open Science by Standardizing Practices and Realigning Incentives

“Open science, the practice of sharing findings and resources towards the collaborative pursuit of scientific progress and societal good, can accelerate the pace of research and contribute to a more equitable society. However, the current culture of scientific research is not optimally structured to promote extensive sharing of a range of outputs. In this policy position paper, we outline current open science practices and key bottlenecks in their broader adoption. We propose that national science agencies create a digital infrastructure framework that would standardize open science principles and make them actionable. We also suggest ways of redefining research success to align better with open science, and to incentivize a system where sharing various research outputs is beneficial to researchers.”

Blog post 9Dec2022 | The OA Switchboard

“Following on from our initial post on the ‘intermediary’, the second on the ‘institution’, the third on the ‘publisher’, and the fourth on the ‘funder’, in this last post in the series we cover how we work together with other community-led foundational (infrastructure and standards) solutions, with the innovative services and solutions (both commercial and non-commercial) being built on top of them, and also how our tech partner (ELITEX) and their Dutch branch (Appetence) fit into all this….”

The great convergence – Does increasing standardisation of journal articles limit intellectual creativity? | Impact of Social Sciences

“To be sure, plenty of original research across many disciplines is regularly published in otherwise conventional formats, and even producing a relatively conventional article in STS is not exactly a trivial matter (as we can attest from experience). Yet, we also believe that especially in interpretive fields, the perceived generative potential of research lies in enabling contributions that readers will find original, critical or otherwise inspiring. It is precisely this potential to generate surprise on a conceptual level that is at risk when a typical convention of how to frame arguments becomes too strong. Will STS be open and welcoming to diverse and varied intellectual traditions and concepts with this increasingly dominant typical article format? On the basis of our findings, we are not so sure.”

Fermilab/CERN recommendation for Linux distribution

“CERN and Fermilab jointly plan to provide AlmaLinux as the standard distribution for experiments at our facilities, reflecting recent experience and discussions with experiments and other stakeholders. AlmaLinux has recently been gaining traction among the community due to its long life cycle for each major version, extended architecture support, rapid release cycle, upstream community contributions, and support for security advisory metadata. In testing, it has proven to be perfectly compatible with the other rebuilds and with Red Hat Enterprise Linux.

CERN and, to a lesser extent, Fermilab, will also use Red Hat Enterprise Linux (RHEL) for some services and applications within their respective laboratories. Scientific Linux 7, at Fermilab, and CERN CentOS 7, at CERN, will continue to be supported for their remaining life, until June 2024….”
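As a rough illustration of what "compatible with the other rebuilds and RHEL" means operationally, a site or experiment can detect which enterprise Linux variant a host is running from /etc/os-release, which AlmaLinux, Rocky Linux, RHEL, and CentOS all provide. The "supported" set in this Python sketch is illustrative, not the laboratories' actual policy.

def read_os_release(path="/etc/os-release"):
    # Parse the simple KEY="value" lines that every systemd-era distribution ships.
    info = {}
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if line and not line.startswith("#") and "=" in line:
                key, _, value = line.partition("=")
                info[key] = value.strip('"')
    return info

osr = read_os_release()
distro = (osr.get("ID", ""), osr.get("VERSION_ID", "").split(".")[0])
supported = {("almalinux", "8"), ("almalinux", "9"), ("rhel", "9")}   # illustrative set only
print("supported" if distro in supported else f"unsupported distribution: {distro}")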

CERN and Fermilab Opt for AlmaLinux as Standard for Big Science

“CERN and Fermilab will make AlmaLinux the standard distribution for experiments at their facilities based on feedback from stakeholders.

Following CentOS’s withdrawal from the enterprise server distribution market, AlmaLinux and Rocky Linux have emerged as the two best RHEL-based derivatives in this segment. As a result, it is not surprising that, for anyone looking for a free alternative to Red Hat Enterprise Linux, the choice frequently comes down to one of the two.

Two of the world’s leading scientific laboratories, the Swiss-based CERN and the US-based Fermilab, faced a similar dilemma….

Unfortunately, CERN and Fermilab do not disclose any additional details about the nature of the tests or the alternatives considered before the final choice to adopt AlmaLinux exclusively….”

The need for open technology standards for environmental monitoring | by Journal of Open HW | Nov, 2022 | Medium

“The barriers to the uptake of open hardware in environmental monitoring may seem insurmountable: not only is procurement difficult, but expertise is often hard to find and capacity is hard to build in the context of widespread commercialization of the sciences. We have already made some progress, yet not enough to gain the visibility that other open initiatives have in the broader context of Open Science. With the allocation of resources and capacity, there are straightforward ways to address the standardization issues of open instrumentation for environmental monitoring. In the US, with attention to addressing climate change and environmental inequities through initiatives such as Justice40 and legislation such as the Inflation Reduction Act, carving out a space for the inclusion of open hardware would be in the interest of an environmental monitoring field focused on the advancement of collective agendas towards community and environmental health. To accomplish this, we suggest the following strategies:

Co-design a common space for the generative “un-siloing” of researchers, open hardware developers, and environmental regulatory authorities. The first aim of this common space should be to create a shared agenda with actionable objectives leading toward concrete goals in the near, medium, and long term.
Co-create a certification system for open environmental monitoring hardware that can operate within regulatory systems of environmental governance. Such a system should identify where and how open hardware tools and the resulting data can be used.
Solve the documentation dilemma with standardization efforts for open instrumentation in which updates and new iterations can be easily followed and understood. A collective effort towards providing a repo of open tools, their use and role in environmental monitoring, and where and how data from these tools can constructively be used in environmental governance and management is a must.
Ensure a percentage of research funds are allocated to the maintenance of open scientific technology projects. To help senior scientists support open technologies, point them to the discussion on the return on investment in open hardware.
Common resources and community-building efforts should focus on infrastructure across the open ecosystem, not just a single tool. While open hardware involves the design and implementation of the material part of environmental monitoring, it is part of a much broader ecosystem of open technologies that involve software, data, and analytic tools. Funding agendas often segregate infrastructural components, and domain experts focus on their piece of the infrastructure.
Commercialization of the sciences tends to undermine our ability to achieve cohesive, inclusive, and usable environmental governance structures. Looking to open source communities for better practices for research collaboration may allow for common, centralized efforts and agendas to exist while maintaining the autonomy of decentralized projects and organizations….”

We All Know What We Mean, Can We Just Put It In The Policy? – The Scholarly Kitchen

“There is an elephant in the scholarly infrastructure room and, while some are ready to talk about it generally, few want to describe that elephant in all its glorious detail. That elephant is the guidance organizations provide about the use of persistent identifiers in our community. At present, the guidance is too vague, and it needs to be specific, at least at a high level, in order for the national and international mandates to be most effective.

The August 2022 OSTP “Nelson” Memo laid out in general terms what it would take for content to be ideally publicly available. This included when content should be released, and also its form and structure, suggesting that content should be made accessible in a structured form (i.e., XML or similar) along with associated “Digital Persistent Identifiers” (DPIs)—using the OSTP memo’s language, though these are more commonly referred to as persistent identifiers (PIDs)—and metadata. Because the memo is providing guidance for the numerous agencies impacted by the new policy so that they can craft their own plans, it didn’t provide explicit instruction on what those DPIs should be or the exact structure of basic metadata. It is anticipated that the affected agencies will then put forward their own specific plans, due to be submitted by February, for implementing these principles….”
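One concrete example of a persistent identifier carrying machine-readable metadata is DOI content negotiation: asking doi.org for CSL JSON rather than a landing page. The Python sketch below uses a placeholder DOI; any registered DOI can be substituted.

import json
from urllib.request import Request, urlopen

doi = "10.1234/placeholder"                      # placeholder identifier, not a real DOI
req = Request(
    f"https://doi.org/{doi}",
    headers={"Accept": "application/vnd.citationstyles.csl+json"},  # request CSL JSON metadata
)
with urlopen(req) as resp:
    metadata = json.load(resp)

# Structured metadata travels with the identifier, so agencies and repositories can
# check titles, authors, and funder links without scraping landing pages.
print(metadata.get("title"), metadata.get("publisher"))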

A Failure to Communicate: Indicators of Open Access in the User Interface – The Scholarly Kitchen

“Over the past months, we have conducted a pilot investigation into the signals used on a sample of platforms. Our analysis revealed that this aspect of the user interface is not standardized across the industry, nor are symbols and text phrases always used predictably within a given platform. Industry initiatives such as Seamless Access and GetFTR have attended to questions related to a consistent user experience with authorization and access indicators for subscribed content. In today’s post, we bring attention to the uneven user experience with signals indicating open access content.

In designing our inquiry, we were guided by two research questions: (1) how do publishing platforms indicate which articles are open access, and (2) is there consistency in the indicators used within and across scholarly publisher platforms? We believe this multi-platform analysis is particularly important as most users must navigate multiple sites during their research journey, so the user experience is not in the control of any single publisher. We selected five major publishers for our analysis: Elsevier, Springer, Wiley, Sage, and Taylor & Francis. As some of the largest academic publishers, their platforms are likely to be used, at least at some point, by the typical faculty member or college student. …”