“Sci-Hub founder Alexandra Elbakyan says that following a legal process, the Federal Bureau of Investigation has gained access to data in her Google account. Google itself informed her of the data release this week, noting that due to a court order, the company wasn’t allowed to inform her sooner….
In an email to Elbakyan dated March 2, 2022, Google advises that, following a legal process issued by the FBI, it was required to hand over data associated with Elbakyan’s account. Exactly what data was targeted isn’t made clear, but according to Google, a court order required the company to keep the request a secret….”
When Allemai Dagnatchew (SFS ’22) began her final semester of college, the last thing she wanted to worry about was digital privacy. But within the first few days of spring 2022 classes, she found out that one of her professors mandated use of Perusall, a program that allows instructors to see how many students are doing their class readings.
Dagnatchew’s distrust of Perusall mirrors a broader sentiment among college students toward proctoring software and its use during the pandemic. The use of online test proctoring and similar programs skyrocketed over the last two years, with students reporting discomfort with proctoring programs since the beginning of virtual learning.
“[Professors] are excited about these programs, and they don’t think it’s weird at all,” she said. “And that’s what I feel like is odd—that they think this is normal.”
Perusall, for example, gives professors access to the amount of time a student spends on a reading and how many of the assigned pages they’ve viewed. Despite students’ sense that this access compromises their privacy, and despite the return of most students to in-person learning, schools are still utilizing proctoring and similarly invasive technologies.
The use of virtual learning tools has fluctuated with the pandemic and with schools’ virtual status, the Omicron variant having caused many colleges to move online for final exams and the beginning of the spring semester. As COVID-19 continues, students have been increasingly subject to excessive monitoring technologies, such as Proctorio, ProctorU, and Perusall, whether for proctoring exams or scanning files.
Abstract: Deciding what risks are worth taking amidst a global pandemic poses quite specific challenges for acquisitions librarians. For example, given that virtually all colleges and universities now offer classes electronically, demand for electronic library content has increased sharply. This challenging situation is magnified at smaller campuses, due to their smaller acquisitions budgets and their need to retain substantial print content. In response, vendors are offering free content to those affected and, given the usual limits on library funding, librarians may find such offers almost irresistible. But while there can be advantages to accepting such content, it can be a double-edged sword. In this paper, we examine how crisis situations, such as the current pandemic, affect librarian decision-making, in particular concerning accepting free content from vendors. How do we best navigate these new territories without losing our bearings amidst a pandemic? And how might these decisions and situations affect our patrons? We focus our research on three important issues, with both practical and ethical implications. First, the issue of patron privacy rights. The free content being offered by vendors poses substantial privacy risks for libraries and patrons, because it is not licensed and thus not governed by privacy agreements. Second, we examine the problem of ensuring accessibility for all users and the extent to which accessibility can be guaranteed with non-licensed content. Finally, we look at the likely impact on faculty-librarian relationships when free content has to be relinquished and libraries cannot afford the same content. Such changes will likely cause tension between faculty and librarians and be especially frustrating for students. While vendors’ coming to the aid of libraries during this time is potentially a generous gesture, it also carries pitfalls and negative impacts in its aftermath.
Abstract: On December 19, 2019, The Washington Post reported that the U.S. Justice Department is investigating the founder and operator of Sci-Hub, Alexandra Elbakyan, on suspicion of working with Russian intelligence to steal U.S. military secrets from defense contractors. The article further discusses Sci-Hub’s methods for acquiring the login credentials of university students and faculty “to pilfer vast amounts of academic literature.” This has long been public knowledge. But the confirmation of Sci-Hub potentially working with Russian intelligence was major news. Both fronts of Sci-Hub’s assault on intellectual property are concerning. Since many academic researchers and their employers routinely receive defense contracts to perform sensitive research, the article suggests that offering free access to academic research articles is perhaps a Trojan Horse strategy for Sci-Hub. To add to The Washington Post’s report, we sought out individuals at universities with a vantage point on Sci-Hub’s activities to see if there is independent evidence to support the report. We spoke to Dr. Jason Ensor, who at the time of this interview was Manager, Engagement Strategy and Scholarly Communication, Library Systems at Western Sydney University Library in Australia. Ensor holds four degrees in related critical thinking fields and is an experienced business professional in software development, data scholarship and print publishing. He is also a distinguished speaker on digital humanities and linked fields, presenting regularly in national and international forums.
“Despite the potential societal benefits of granting independent researchers access to digital platform data, such as promotion of transparency and accountability, online platform companies have few legal obligations to do so and potentially stronger business incentives not to. Without legally binding mechanisms that provide greater clarity on what and how data can be shared with independent researchers in privacy-preserving ways, platforms are unlikely to share the breadth of data necessary for robust scientific inquiry and public oversight (1). Here, we discuss two notable legislative efforts aimed at opening up platform data: the Digital Services Act (DSA), recently approved by the European Parliament (2), and the Platform Accountability and Transparency Act (PATA), recently proposed by several US senators (3). Although these laws could support researchers’ access to data, they could also fall short in many ways, highlighting the complex challenges in mandating data access for independent research and oversight….
Platforms’ hesitancy to share data with researchers is not wholly unwarranted. A Facebook-approved research partnership with Cambridge Analytica resulted in scandal, a $5 billion fine by the US Federal Trade Commission (FTC) for privacy violations, and new requirements in an FTC consent order to implement comprehensive data privacy and security safeguards (9)….”
“In the spring of 2018, we [Rapid7] launched the Open Data initiative to provide security teams and researchers with access to research data generated from Project Sonar and Project Heisenberg. Our goal for those projects is to understand how the attack surface is evolving, what exposures are most common or impactful, and how attackers are taking advantage of these opportunities. Ultimately, we want to be able to advocate for necessary remediation actions that will reduce opportunities for attackers and advance security. This is also why we publish extensive research reports highlighting key security learnings and mitigation recommendations.
Our goal for Open Data has been to enable others to participate in these efforts, increasing the positive impact across the community. Open Data was an evolution of our participation in the scans.io project, hosted by the University of Michigan. Our hope was that security professionals would apply the research data to their own environments to reduce their exposure and researchers would use the data to uncover insights to help educate the community on security best practices….
Yet IP addresses make up a significant portion of the data being shared in our security research data. While we believe there is absolutely a legitimate interest in processing this kind of data to advance cybersecurity, we also recognize the need to take appropriate balancing controls to protect privacy and ensure that the processing is “necessary and proportionate” — per the language of Recital 4….”
The French Data Protection Agency, CNIL (Commission nationale de l’informatique et des libertés), has concluded that the use of Google Analytics is illegal under GDPR. The CNIL has begun issuing formal notices to website managers using Google Analytics.
The transition to online platforms for education and research—even open ones—has created new, complex, and unprecedented threats to libraries’ commitment to protecting user privacy. Vendors’ involvement in surveillance, even in areas that extend beyond the scope of their scholarly products, stands in direct contradiction to libraries’ core values and all but ensures that more surveillance will make its way into products used throughout the academic enterprise. The erosion of privacy protections and growth of surveillance has become increasingly evident throughout SPARC’s work.
SPARC, a project of the New Venture Fund, is seeking a Visiting Program Officer (VPO) to advance SPARC’s growing work related to privacy and surveillance. The aim of this position will be to educate SPARC members about privacy threats and provide avenues of support for libraries in taking action to address these threats.
While candidates should have a commitment to addressing privacy and surveillance threats, expert-level knowledge of these issues is not a requirement. The VPO will be supported by expertise from the SPARC team and experts from the wider community.
“Coordination is needed with related policy domains, including open science, which enhances transparency into research processes and outputs….
Efforts to implement open science can make more of the process and outputs of scientific research freely and readily accessible to other scientists, engineers, policymakers, students and educators, and the general public, while maintaining needed protections of national security, personal privacy, and other sensitive information. By making research publications, study data, analytical software and code, and study protocols more readily available for inspection and reuse—as Federal science agencies are currently doing—open science affords new opportunities to detect instances of interference, mischaracterization, and other policy violations. As such, open science is an essential enabler of scientific integrity….
Open science policies and practices provide transparency to help ensure that publications, data, and other outputs of Federally funded research are readily available to other researchers, innovators, students, and the public (taking into consideration legal and ethical limitations on access, such as national security and privacy)….
Facilitate free flow of scientific and technological information, by availability online in open formats and, where appropriate, including data and models underlying regulatory proposals and policy decisions…. ”
“SPARC is seeking two Visiting Program Officers (VPOs) to assist us with critical initiatives over the next year. The first is a VPO for Open Models, who will engage with libraries and pro-open publishers to develop programs and resources designed to increase understanding and accelerate adoption of innovative open business models. The second is the VPO for Privacy, whose role will be to educate SPARC members about emerging privacy threats facing libraries and to cultivate a community of practice to support members in developing strategies to mitigate these threats. Applications for both roles will be open until February 23rd. More details about each role, including links to the role descriptions and applications, are below. Please contact Val Hollister at email@example.com with any questions.”
Helen Beetham, University of Wolverhampton
Amy Collier, Middlebury College
Laura Czerniewicz, University of Cape Town
Brian Lamb, Thompson Rivers University
Yuwei Lin, University of Roehampton
Jen Ross, University of Edinburgh
Anne-Marie Scott, Athabasca University
Anna Wilson, University of Stirling
Abstract: This paper describes and critiques how surveillance is situated and evolving in higher education settings, with a focus on the surveillance of teaching and learning. It argues that intensifying practices of datafication and monitoring in universities echo those in broader society, and that the Covid-19 global pandemic has both exacerbated these practices and made them more visible. Surveillance brings risks to learning relationships, academic and work practices, as well as reinforcing economic models of extraction and inequalities in education and society. Responses to surveillance practices include resistance, advocacy, education, regulation and investment, and a number of these responses are examined here. Drawing on scholarship and practice, the paper provides an in-depth overview of this topic for people in university settings including those in leadership positions, learning technology roles, educators and students. The authors are part of an international network of researchers, educators and university leaders who are working together to develop new approaches to surveillance futures for higher education: https://aftersurveillance.net/. Authors are based in Canada, South Africa, the United Kingdom and the United States, and this paper reflects those specific contexts.
“KU Leuven has launched a new platform that makes access to research data safer for fellow researchers around the world, the university announced on Thursday.
The new platform is RDR, which stands for Research Data Repository. On the new platform, researchers will be able to access an archive where data is securely stored. Openness is guaranteed as much as possible, while respecting the legal framework relating to privacy. Using the metadata, researchers from all over the world can find relevant publications and documents.”
“Key points
• Openness in research is discussed in many guises and brings many benefits, and there is a need to join up, share good practice and talk a common language to make maximum progress.
• Importance of open publication as a key step to increasing trust and reducing waste in research.
• There is a need to be careful about the language used and it is crucial that the right safeguards are in place to protect people’s personal data. Personal data should not be ‘open’, and discussing it in this way risks its availability and associated use.
• Need to start with trust and involvement of patients and the public to ensure maximum benefit can flow from data.”