In May 2021, DARIAH-EU launched an annual Open Access Monograph Bursary for the publication of one’s first monograph within the domain of Digital Humanities. This initiative aims to support early-career researchers in openly disseminating their first monographs in book series relevant to their field, and thus to pave pathways to an open research culture in the arts and humanities. The bursary will fund the Open Access publication of one monograph (or other long-form work of scholarship) per year.
The call for the 2021 DARIAH Open Access Monograph Bursary is currently open. The deadline for applications is December 6, 2021.
Q&A session – Bring Your Questions
To support applicants and interested researchers, we will host a Q&A information session on the eligibility criteria for participation in the call on 25 June, 10:00–11:00 CEST.
“Three renowned researchers in digital humanities and computer science are joining forces with the Library of Congress on three inaugural Computing Cultural Heritage in the Cloud projects, exploring how biblical quotations, photographic styles and “fuzzy searches” reveal more about the collections in the world’s largest Library than first meets the eye.
Supported by a $1 million grant from the Andrew W. Mellon Foundation awarded in 2019, the initiative combines cutting-edge technology with the Library’s vast collections to support digital humanities research at scale. These three outside researchers will collaborate with subject matter experts and technology specialists at the Library of Congress to experiment in pursuit of answers that can only be achieved with collections and data at scale. These collaborations will enable research on questions previously difficult to address due to technical and data constraints. Expanding the skills and knowledge necessary for this work will enable the Library to support emerging methods in cloud-based computing research such as machine learning, computer vision, interactive data visualization, and other areas of digital humanities and computer science research. As a result, the Library and other cultural heritage institutions may build upon or adapt these approaches for their own use in improving access to text and image collections….”
“On June 23-26, we welcomed 32 digital humanities (DH) researchers and professionals to the Building Legal Literacies for Text Data Mining (Building LLTDM) Institute. Our goal was to empower DH researchers, librarians, and professional staff to confidently navigate law, policy, ethics, and risk within digital humanities text data mining (TDM) projects—so they can more easily engage in this type of research and contribute to the further advancement of knowledge. We were joined by a stellar group of faculty to teach and mentor participants. Building LLTDM is supported by a grant from the National Endowment for the Humanities….”
“Publisher intransigence, library unpreparedness, and unshakable humanist allegiance to print forms of research communication distort scholarly communication systems in ways that disadvantage digital humanists and prevent migration to opener and likely more sustainable digital modes of publication and dissemination. This, in turn, isolates and disadvantages the humanities both within and outside the academy. Exactly how the humanities in general and the digital humanities specifically will break out of this untenable box remains unclear. Until they do, however, the monograph crisis will intensify, digital humanists will continue fleeing the academy for fairer, greener pastures, and the humanities will impoverish their own future.”
Abstract: As the scholarly landscape evolves into a more “open” plain, so do the shapes of institutions, labs, centres, and other places and spaces of research, including those of the digital humanities (DH). The continuing success of such research largely depends on a commitment to open access and open source philosophies that broaden opportunities for a more efficient, productive, and universal design and use of knowledge. The Electronic Textual Cultures Laboratory (ETCL; etcl.uvic.ca) is a collaborative centre for digital and open scholarly practices at the University of Victoria, Canada, that engages with these transformations in knowledge creation through its umbrella organization, the Canadian Social Knowledge Institute (C-SKI), which coordinates and supports open social scholarship activities across three major initiatives: the ETCL itself, the Digital Humanities Summer Institute (DHSI; dhsi.org), and the Implementing New Knowledge Environments (INKE; inke.ca) Partnership, including sub-projects associated with each. Open social scholarship is the practice of creating and disseminating public-facing scholarship through accessible means. Working through C-SKI, we seek ways to engage communities more widely with publicly funded humanities scholarship, such as through research creation and dissemination, mentorship, and skills training.
Abstract: The digital humanities are accused of contributing to the decay of academia in general and of betraying the humanities’ principles. By looking at the development of the field, as well as at its research principles and practices, this article seeks to refute such allegations and to show that the passionate debates the digital humanities still raise are related to their critical stance towards ‘traditional’ SSH research. In the first part, the collaborative and FAIR principles (Findable, Accessible, Interoperable and Reusable) that characterise the DH approach are examined, in connection with the dissatisfaction they express towards established research practices and organisation. Based on an example of the exploration of the archives of the Hispanic 20th-century vanguard, the second part focuses more specifically on the challenges of working with data and of haptic thinking in the literary and cultural fields.
[This is the abstract for just one of seven presentations.]
Abstract: Over the last decade, the digital humanities community has become increasingly concerned with the ongoing sustainability of digital projects. This anxiety stems in part from the realization that not all digital humanities projects have identical expectations of longevity. Several prominent works in the literature, such as Bethany Nowviskie and Dot Porter’s “Graceful Degradation Survey Findings: How Do We Manage Digital Humanities Projects through Times of Transition and Decline?” (2010) and Geoffrey Rockwell et al.’s “Burying Dead Projects: Depositing the Globalization Compendium” (2014), have been central to this intellectual exchange about the benefits of creating sustainability plans for projects that do not necessarily assume a default permanence, but that instead proactively consider each project’s most suitable longevity strategy.
With this realization has come a concomitant expectation: each digital humanities project must create its own customized sustainability plan, designed with its particular requirements in mind. And yet, few digital humanists have access to direct training on the process of creating and implementing professional-grade digital preservation and sustainability practices for their own work. To support the process of designing and implementing digital sustainability plans for this work, a team of scholars housed in the Visual Media Workshop at the University of Pittsburgh has created the Socio-Technical Sustainability Roadmap (STSR; http://sustainingdh.net). The STSR is a structured, process-oriented workshop, inspired by design thinking and collaborative learning approaches. This workshop, which may be implemented in a variety of institutional contexts, guides project stakeholders through the practice of creating effective, iterative, ongoing digital sustainability strategies that address the needs of both social and technological infrastructures. It is founded on the fundamental assumption that, for sustainability practices to be successful, project leaders must keep the changing, socially-contingent nature of both their project and their working environment(s) consistently in mind as they initiate, maintain, and support their own work. For this panel, we contextualize and describe the STSR, and provide reflections based on our experiences facilitating Sustaining DH: An NEH Institute for Advanced Topics in the Digital Humanities.
“Even when the coronavirus pandemic struck, and access to physical library resources came to a halt, Matt Miller and his research team didn’t have to hit pause on their project. Aided by the digital collections and research support available through the University of Maryland Libraries’ membership in HathiTrust, they could continue moving forward with their work detecting and transcribing Persian and Arabic texts.
Miller — a professor at the Roshan Institute for Persian Studies in the University of Maryland’s School of Languages, Literatures and Cultures — leads a team of global scholars working to develop user-friendly software that can create digital text from scans of Persian and Arabic books. Their enterprise is supported by an $800,000 grant Miller received from the Andrew W. Mellon Foundation in 2019. …”
“The Academia Sinica Digital Humanities Research Platform develops digital tools to meet the demands of humanities research, helping scholars improve the quality of their work. We hope to integrate researchers, research data, and research tools to broaden the scope of research and reduce research time. The Platform provides a comprehensive research environment with cloud computing services, offering all the data and tools scholars require. Researchers can upload texts and authority files, or use others’ open texts and authority files available on the platform. Authority terms support both manual and automatic text tagging, and can be hierarchically categorized. Once text tagging is complete, researchers can calculate authority-term and N-gram statistics, or conduct term co-occurrence analysis, and then present the results through data visualization methods such as statistical charts, word clouds, social network graphs, and maps. Furthermore, the Platform offers Boolean search, word proximity search, and statistical filtering, enabling researchers to easily carry out textual analysis.”
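For readers unfamiliar with the analyses the platform names, the sketch below illustrates what N-gram statistics and term co-occurrence analysis mean in general. This is not the Platform’s code or API; the function names, sample tokens, and document sets are hypothetical, shown only to make the workflow concrete.

```python
from collections import Counter
from itertools import combinations

def ngram_counts(tokens, n=2):
    """Count contiguous n-grams (here, word sequences of length n) in a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def cooccurrence(docs, terms):
    """Count how often each pair of authority terms appears in the same document.

    docs  -- an iterable of documents, each a set of tagged terms
    terms -- the authority terms of interest
    """
    pairs = Counter()
    for doc in docs:
        present = sorted(t for t in terms if t in doc)  # sort so pairs are canonical
        pairs.update(combinations(present, 2))
    return pairs

# Hypothetical tagged data, standing in for texts uploaded to such a platform.
tokens = "the library of congress and the library of alexandria".split()
print(ngram_counts(tokens, 2).most_common(2))
# ('the', 'library') and ('library', 'of') each occur twice

docs = [{"taipei", "temple", "map"}, {"taipei", "temple"}, {"map", "temple"}]
print(cooccurrence(docs, {"taipei", "temple", "map"}))
```

The resulting pair counts are exactly what a co-occurrence visualization (e.g., a network graph where edge weight equals shared-document count) would be drawn from.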