Open Scholarship Support Guide

“Steps to Support Open Scholarship

Open scholarship entails a culture shift in how research is conducted in universities. It requires action on the part of university administration, working in concert with faculty, sponsors and disciplinary communities.  Universities should consider steps in three areas:

•  Policies:  Language and guidance should be reviewed for alignment with open scholarship, in particular: (1) academic hiring, review, tenure and promotion (valuing diverse types of research products; metrics that incentivize the open dissemination of articles, data, and other research outputs; and valuing collaborative research); (2) intellectual property (ownership, licensing and distribution of data, software, materials and publications); (3) research data protection (for data to be stored and shared through repositories); (4) attribution (recognizing the full range of contributions); and (5) privacy (ensuring that privacy obligations are met).

•  Services and Training:  Researchers need support to assure that data and other research objects are managed according to FAIR Principles: findable, accessible, interoperable and reusable.  While the specific solution must be tailored to the discipline and research, common standards, including Digital Object Identifiers (DOIs), must be followed.

•  Infrastructure:  Archival storage is required for data, materials, specimens and publications to permit reuse.  Searchable portals are needed to register research products where they can be located and accessed. Universities can recognize efficiencies by utilizing external resources (including existing disciplinary repositories) and by developing shared resources that span the institution when external resources do not exist.

Presidents and provosts are encouraged to work with their academic senates to create an open scholarship initiative that promotes institution-wide actions supporting open scholarship practices, while remaining sufficiently flexible to accommodate disciplinary differences and norms….”

Statement on the Scholarly Merit and Evaluation of Open Scholarship in Linguistics | Linguistic Society of America

“Open Scholarship can be a key component of a scholar’s portfolio in a number of situations, including but not limited to hiring, review, promotion, and awards. Because Open Scholarship can take many forms, evaluation of this work may need different tools and approaches than those used for publications like journal articles and books.  In particular, citation counts, a common tool for evaluating publications, are not available for some kinds of Open Scholarship in the same form or from the same providers as they are for publications. Here we share recommendations on how to assess the use of Open Scholarship materials, using measures including and beyond citations, and covering both materials that have formal peer review and those that do not.

For tenure & promotion committees, program managers, department chairs, hiring committees, and others tasked with evaluating Open Scholarship, NASEM has prepared a discipline-agnostic rubric that can be used as part of hiring, review, or promotion processes. Outside letters of evaluation can also provide insight into the significance and impact of Open Scholarship work. Psychologist Brian Nosek (2017) provides some insight into how a letter writer can evaluate Open Scholarship, and includes several ways that evaluation committees can ask for input specifically about contributions to Open Scholarship. Nosek suggests that letter writers and evaluators comment on ways that individuals have contributed to Open Scholarship through “infrastructure, service, metascience, social media leadership, and their own research practices.” We add that using Open Scholarship in the classroom, whether through open educational materials, open pedagogy, or teaching of Open Scholarship principles, should be included in this list. Evaluators can explicitly ask for these insights in requests to letter writers, for example by including the request to “Please describe the impact that [scholar name]’s openly available research outputs have had from the research, public policy, pedagogic, and/or societal perspectives.” These evaluations can be particularly important when research outputs are not formally peer reviewed.

For scholars preparing hiring, review, promotion, or other portfolios that include Open Scholarship, we recommend not only discussing the Open Scholarship itself, but also its documented and potential impacts on both the academic community as well as broader society. Many repositories housing Open Scholarship materials provide additional metrics such as views, downloads, comments, and forks (or reuse cases) alongside citations in published literature. The use and mention of material with a Digital Object Identifier (DOI) can be tracked using tools such as ImpactStory, Altmetric.com, and other alternative metrics. To aid with evaluation of this work, the creator should share these metrics where available, along with any other qualitative indicators (such as personal thank-yous, reuse stories, or online write-ups) that can give evaluators a sense of the impact of their work. The Metrics Toolkit provides examples and use cases for these kinds of metrics. This is of potential value when peer review of these materials may not take the same form as with published journals or books; thoughtful use and interpretation of metrics can help evaluators understand the impact and importance of the work.

The Linguistic Society of America reaffirms its commitment to fair review of Open Scholarship in hiring, tenure, and promotion, endorses all of these approaches to peer review and evaluation of Open Scholarship, and encourages scholars, departments, and personnel committees to take them into careful consideration and implement language about Open Scholarship in their evaluation processes.”
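
To illustrate the kind of DOI-based tracking the LSA statement mentions, the short Python sketch below queries the public Crossref REST API for a work’s metadata and citation count. It is a minimal sketch only: it assumes the third-party requests package is installed, uses a placeholder DOI, and does not cover ImpactStory, Altmetric.com, or other altmetrics services, which expose their own APIs and terms of use.

# Minimal sketch: fetch a DOI's metadata and Crossref citation count.
# Assumes the third-party "requests" package; the DOI below is a placeholder.

import requests

def crossref_citation_count(doi: str) -> dict:
    """Return the title and citation count that Crossref records for a DOI."""
    resp = requests.get(f"https://api.crossref.org/works/{doi}", timeout=30)
    resp.raise_for_status()
    msg = resp.json()["message"]
    return {
        "doi": doi,
        "title": (msg.get("title") or ["(untitled)"])[0],
        "cited_by": msg.get("is-referenced-by-count", 0),
    }

if __name__ == "__main__":
    # Replace with the DOI of the research output being evaluated.
    print(crossref_citation_count("10.1234/example-doi"))

Repository-level indicators such as views, downloads, and forks are reported by the hosting platforms themselves and are not available through Crossref; evaluators would consult those platforms or the Metrics Toolkit for guidance on interpretation.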

Dissemination of applied research to the field: attitudes and practices of faculty authors in social work

Abstract:  In applied research disciplines like social work, there is a clear disconnect between the production and dissemination of research and the access and use of research in practice. This research/practice divide is particularly problematic for practitioners required to work within evidence-based or research-informed frameworks. To explore this issue, we conducted a nationwide survey and qualitative interviews with social work faculty regarding their research dissemination attitudes and practices, especially to non-academic audiences. The survey and interviews provide data on faculty dissemination methods, attitudes toward gold and green open access, and promotion and tenure considerations. Results demonstrate that faculty are primarily engaged with traditional publishing models and much less engaged with dissemination to non-academic audiences. Faculty are skeptical of open access journals, avoid article processing charges and are only minimally engaged with institutional repositories. Faculty are conflicted regarding the dissemination of their research, especially in the context of promotion and tenure. Shifting dissemination outside of academic audiences would require increased confidence in open access, support for the creation of practitioner-focused materials and prioritizing the impact of research on practice.

 

Recognition and rewards – Open Science – Universiteit Utrecht

“Open science means action. And the way we offer recognition and reward to academics and university staff is key in bringing about the transition that Utrecht University aims for. Over the course of the past year the working group on Recognition and Rewards, part of the Open Science Programme, has reflected on and thoroughly debated a novel approach to ensuring that we offer room for everyone’s talent, resulting in a new vision (pdf)….

In the current system, researchers and their research are judged by journal impact factors, publisher brands and H-indices, and not by actual quality, real use, real impact and openness characteristics….

Under those circumstances, at best open science practices are seen as posing an additional burden without rewards. At worst, they are seen as actively damaging chances of future funding and promotion & tenure. Early career researchers are perhaps the most dependent on traditional evaluation culture for career progression, a culture held in place by established researchers, as well as by institutional, national and international policies, including funder mandates….”

 

 

Utrecht University Recognition and Rewards Vision

“By embracing Open Science as one of its five core principles, Utrecht University aims to accelerate and improve science and scholarship and its societal impact. Open science calls for a full commitment to openness, based on a comprehensive vision regarding the relationship with society. This ongoing transition to Open Science requires us to reconsider the way in which we recognize and reward members of the academic community. It should value teamwork over individualism and calls for an open academic culture that promotes accountability, reproducibility, integrity and transparency, and where sharing (open access, FAIR data and software) and public engagement are normal daily practice. In this transition we closely align ourselves with the national VSNU program as well as developments on the international level….”

Practitioner Perspectives: The DOERS3 Collaborative on OER in Tenure and Promotion – BCcampus

“Andrew McKinney, OER coordinator at the City University of New York (CUNY), and Amanda Coolidge, director of Open Education at BCcampus in British Columbia, Canada, share the development of an adaptable matrix to help faculty include OER (Open Educational Resources) in their tenure and promotion portfolios. …”

Recognition and rewards in the Open Era: Turning thoughts into actions | Open Working

“The TU Delft Open Science programme held its very first thematic session on the Recognition and Rewards cross-cutting theme on October 5, 2020. The Open Science Programme currently has 5 projects and 3 cross-cutting themes, from FAIR software to Open Education. This means that the programme core team is composed of members from many different departments (not only within the Library), bringing in their diverse perspectives and skills! But this also poses a challenge on teamwork – we need a way for us to all stay in touch, be able to see and learn from each other’s work, and contribute and provide feedback – hence the idea of the thematic sessions. Ingrid Vos, the leader of the Recognition and Rewards theme, has kindly volunteered to lead this first thematic session. Since this theme relates to everyone’s work within the Open Science Programme, Ingrid wanted to make sure everyone can be effectively engaged in the session and their voices can be heard – more on this below.

Key takeaways:

•  A re-examination of rewards and recognition is needed to further fuel the cultural and behavioural changes towards open science.

•  TU Delft’s work in this aspect builds upon VSNU’s “Room for everyone’s talent” position paper. Every university in the Netherlands has a committee on Recognition & Rewards. The TU Delft committee is led by Ena Voûte.

•  The Open Science Programme team had fruitful discussions around open research and education behaviours and “products”, how to evaluate, appreciate and reward these, as well as emerging career paths.

•  We’d love to hear your ideas and thoughts, both on rewards and recognition and on how you’d like to contribute and participate in these discussions – please use the comment section of this post! …”

Rethinking Research Assessment: Ideas for Action | DORA

“DORA is developing a toolkit of resources to help academic institutions improve their policies and practices. So far, it includes two briefing documents that offer principles to guide institutional change and strategies to address the infrastructural implications of common cognitive biases to increase equity.

Ideas for Action outlines five common myths about research evaluation to help universities better understand barriers to change and provides analogous examples to illustrate how these myths exist inside and outside of academia. It also offers five design principles to help institutions experiment with and develop better research assessment practices….”

Reimagining Academic Career Assessment: Stories of innovation and change

“This report and the accompanying online repository bring together case studies in responsible academic career assessment. Gathered by the San Francisco Declaration on Research Assessment (DORA), European University Association (EUA), and Scholarly Publishing and Academic Resources Coalition (SPARC) Europe, the case studies independently serve as a source of inspiration for institutions looking to improve their academic career assessment practices. Following the publication of guidelines and recommendations on more responsible evaluation approaches, such as DORA, the Leiden Manifesto for Research Metrics, and the Metric Tide, more and more institutions have begun to consider how to implement a range of practical changes and innovations in recent years. However, information about the creation and development of new practices in academic career assessment is not always easy to find. Collectively, the case studies will further facilitate this “practical turn” toward implementation by providing a structured overview and conceptual clarity on key characteristics and contextual factors. In doing so, the report examines emerging pathways of institutional reform of academic career assessment…”