“The Blue-Cloud project is piloting a web-based Open Science cyberspace to serve the needs of scientists and researchers in the marine domain. This survey was launched to build a vision for its long-term evolution into 2030, generating value and benefits for a much larger user base and for wider stakeholder communities – including not only scientists and researchers, but also Blue Economy SMEs, maritime industries, policy makers, NGOs and ultimately citizens. Your response to this survey will contribute to shaping strategic policy recommendations towards that end, considering your needs and expectations and aligning with wider developments….”
“The Global Collaboration on Traumatic Stress, a coalition of 11 scientific societies in the field of traumatic stress, is conducting a survey to better understand traumatic stress researchers’ opinions and experiences regarding data sharing and data re-use.
If you are a traumatic stress researcher at any career stage (including trainees) we invite you to share your opinions and experiences by participating in this survey. …”
Abstract: One of the ways in which the publisher PLOS supports open science is via a stringent data availability policy established in 2014. Despite this policy, and the introduction of data sharing policies by other organizations, best practices for data sharing are adopted by only a minority of researchers in their publications. Problems with effective research data sharing persist, and previous research has attributed these problems to a lack of time, resources, incentives, and/or skills to share data.
In this study we built on this research by investigating the importance of tasks associated with data sharing, and researchers’ satisfaction with their ability to complete these tasks. By investigating these factors we aimed to better understand opportunities for new or improved solutions for sharing data.
In May-June 2020 we surveyed researchers from Europe and North America, asking them to rate tasks associated with data sharing on (i) their importance and (ii) their satisfaction with their ability to complete them. We received 617 completed responses. We calculated mean importance and satisfaction scores to highlight potential opportunities for new solutions and to compare different cohorts.
Tasks relating to research impact, funder compliance, and credit had the highest importance scores. 52% of respondents reuse research data, but the average satisfaction score for obtaining data for reuse was relatively low. Tasks associated with sharing data were rated somewhat important, and respondents were reasonably well satisfied with their ability to accomplish them. Notably, this included tasks associated with best data sharing practice, such as use of data repositories. However, the most common method for sharing data was in fact via supplemental files with articles, which is not considered to be best practice.
We presume that researchers are unlikely to seek new solutions to a problem or task that they are satisfied with their ability to accomplish, even if many do not attempt this task. This implies there are few opportunities for new solutions or tools to meet these researcher needs. Publishers can likely meet these needs for data sharing by working to seamlessly integrate existing solutions that reduce the effort or behaviour change involved in some tasks, and by focusing on advocacy and education around the benefits of sharing data.
There may however be opportunities – unmet researcher needs – in relation to better supporting data reuse, which could be met in part by strengthening data sharing policies of journals and publishers, and improving the discoverability of data associated with published articles.
The purpose of this paper was to draw on evidence from computer-mediated transparency and examine the argument that open government data and national data infrastructures, represented by open data portals, can help enhance transparency by providing relevant features and capabilities for stakeholders’ interactions.
The developed methodology consisted of a two-step strategy to investigate research questions. First, a web content analysis was conducted to identify the most common features and capabilities provided by existing national open data portals. The second step involved performing the Delphi process by surveying domain experts to measure the diversity of their opinions on this topic.
Identified features and capabilities were classified into categories and ranked according to their importance. By formalizing these feature-related transparency mechanisms through which stakeholders work with data sets we provided recommendations on how to incorporate them into designing and developing open data portals.
The creation of appropriate open data portals aims to fulfil the principles of open government and to enable stakeholders to engage effectively in policy- and decision-making processes.
By analyzing existing national open data portals and validating the feature-related transparency mechanisms, this paper fills a gap in the existing literature on designing and developing open data portals for transparency efforts.
“We have put together a short survey to learn about people’s experiences with open access. This survey asks a range of questions, and you only need to answer the ones that are relevant to you! Everyone in the scholarly community is welcome to participate, including students, publishers, and scholars.
Some of the questions you will be asked in the survey are:
Do you believe the scholarly community could do research more effectively if all scientific communication were freely available under an open access license?
Have you ever published an article open access?
What is a reasonable APC for an open access research article?
Would you prefer if peer reviews were made open? For example, so anyone could read what the reviewer recommended and anyone could know who the reviewer was.
Have you ever needed access to a research article and were unable to read it due to paywalls? …”
“The Global Sustainability Coalition for Open Science Services (SCOSS) is now four years old. We are delighted that we have been able to support eight extraordinary organisations that provide Open Science Infrastructure in that time. As we grow into the next phase of our development, we have sought to learn more about how people perceive us and the work we do, and where our priorities should lie as we develop a new SCOSS strategy.
We conducted a consultation to understand awareness and perceptions of Open Science Infrastructure in the sector, and the role SCOSS plays in providing support.
As part of this consultation, we undertook a survey which attracted over 200 responses. We are incredibly grateful to everyone who took the time to respond to the survey, and are delighted to share some of the results….
When we asked what types of organisations SCOSS itself should prioritise, the responses aligned with the general priorities respondents identified. We are pleased to see that the infrastructures supported so far by SCOSS are well aligned with these choices, and that very few respondents (only four) wanted to prioritise infrastructure not covered by the categories offered.
When we asked which criteria should be used to prioritise support, Interoperability was the most popular option, chosen by 59% of respondents, with Community Governance chosen by 53% of respondents and 45% choosing Global significance. No other options were chosen by more than 30% of respondents, although three, Organizational resilience (29%), Urgency of need for funding (26%) and Innovation of solution (26%), were grouped together as the next three most popular choices.”
Abstract: A growing number of research-performing organisations (institutions) and funding agencies have policies that support open research practices — sharing of research data, code and software. However, funders and institutions lack sufficient tools, time or resources to monitor compliance with these policies.
To better understand funder and institution needs related to understanding open research practices of researchers, we targeted funders and institutions with a survey in 2020 and received 122 completed responses. Our survey assessed and scored (from 0 to 100) the importance of, and satisfaction with, 17 factors associated with understanding open research practices. These include factors such as knowing if a research paper includes links to research data in a repository; knowing if a research grant made code available in a public repository; knowing if research data were made available in a reusable form; and knowing the reasons why research data are not publicly available. Half of respondents had tried to evaluate researchers’ open research practices in the past and 78% plan to do this in the future. The most common method used to find out if researchers are practicing open research was personal contact with researchers, and the most common reason for doing so was to increase their knowledge of researchers’ sharing practices (e.g. determine the current state of sharing; track changes in practices over time; compare different departments/disciplines). The results indicate that nearly all of the 17 factors we asked about in the survey were underserved. The mean importance of all factors to respondents was 71.7, approaching the 75 threshold for “very important”. The mean satisfaction score across all factors was 41.3, indicating dissatisfaction with the ability to complete these tasks. The results imply an opportunity for better solutions to meet these needs. The growth of policies and requirements for making research data and code available does not appear to be matched by solutions for determining whether these policies have been complied with.
We conclude that publishers can better support some of the needs of funders and institutions by introducing simple solutions such as:
– Mandatory data availability statements (DAS) in research articles
– Not permitting generic “data available on request” statements
– Enabling and encouraging the use of data repositories and other methods that make data available in a more reusable way
– Providing visible links to research data on publications
– Making information on data and code sharing practices in publications available to institutions and funding agencies
– Extending policies that require transparency in the sharing of research data to the sharing of code
“Publishers investing in simple solutions in their workflows can help to better meet the needs of funders and institutions who wish to support open research practices, research released this week by PLOS concludes.
Policies can be an effective solution for changing research culture and practice. A growing number of research-performing organisations (institutions) and funding agencies have policies that support open research practices — sharing of research data, code and software — as do publishers. Seeking to deepen our understanding of funder and institution needs related to open research, we surveyed more than 100 funders and institutions in 2020. We wanted to know if they are evaluating how researchers share data and code, how they are doing it, why they are doing it, and how satisfied they are with their ability to get these tasks done. Our results are available as a preprint along with an anonymised dataset….
Simple solutions more publishers could provide include:
Mandatory Data Availability Statements (DAS) in all relevant publications.
Across the STM industry around 15% of papers include a DAS. Since we introduced our data availability policy in 2014, 100% of PLOS research articles include a DAS.
Supporting researchers to provide information on why research data (and code) are not publicly available with their publications.
Time and again “data available on request” has been shown to be ineffective at supporting new research — and is not permitted in PLOS journals.
Enabling and encouraging the use of data repositories.
Recommending the use of data repositories is a useful step, but making them easily and freely accessible — integrated into the publishing process — can be even more effective. Rates of repository use are higher in journals that partner closely with repositories and remove cost barriers to their use.
Providing visible links to research data on publications. Many researchers also struggle to find data they can reuse, hence PLOS will soon be experimenting with improving this functionality in our articles, and integrating the Dryad repository with submission….”
The European Parliament’s directive on open data sets the direction for all public institutions in Europe. The Polish Platform of Medical Research (PPM) portal required more information about researcher attitudes and training requirements for strategic planning.
The aim was to assess (1) the status of knowledge about research data management among medical researchers in Poland, and (2) their attitudes towards data sharing. This knowledge may help to inform a training program and adapt PPM to the requirements of researchers.
The authors circulated an online survey and received responses from 603 researchers representing medical sciences and related disciplines. The survey was conducted in 2019 at seven Polish medical universities and at the Nofer Institute of Occupational Medicine. Analysis used descriptive statistics.
Data sharing was not widespread (55.7% only shared data with their research team; 9.8% had shared data on an open access basis). Many respondents cited possible benefits of research data sharing but were concerned about drawbacks (e.g. fraud, plagiarism).
Polish medical scientists, like many researchers, are not aware of the processes required for safe data preparation for sharing. Academic libraries should develop roles for data librarians to help train researchers.
Fears about the dangers of data sharing need to be overcome before researchers are willing to share their own research data.
Abstract: Responding to calls to take a more active role in communicating their research findings, scientists are increasingly using open online platforms, such as Twitter, to engage in science communication or to publicize their work. Given the ease with which misinformation spreads on these platforms, it is important for scientists to present their findings in a manner that appears credible. To examine the extent to which the online presentation of science information relates to its perceived credibility, we designed and conducted two surveys on Amazon’s Mechanical Turk. In the first survey, participants rated the credibility of science information on Twitter compared with the same information in other media, and in the second, participants rated the credibility of tweets with modified characteristics: presence of an image, text sentiment, and the number of likes/retweets. We find that similar information about scientific findings is perceived as less credible when presented on Twitter compared to other platforms, and that perceived credibility increases when presented with recognizable features of a scientific article. On a platform as widely distrusted as Twitter, use of these features may allow researchers who regularly use Twitter for research-related networking and communication to present their findings in the most credible formats.
Abstract: The LYRASIS open source software (OSS) survey was conducted in spring 2021 as a mechanism to better understand how institutions interact with and support OSS programs. For the purposes of the survey, OSS programs were defined as community-based programs specifically designed for GLAM institutions, such as FOLIO, ArchivesSpace (a LYRASIS supported community), and Omeka. This report provides institutions with an opportunity to see where their efforts fall amongst the activities of their peers in three categories: funding/supporting OSS, justifying OSS, and evaluating OSS. The first section covers how, and how much, institutions contribute to OSS programs, either through financial contributions or staff time devoted to program contributions and governance. The second section focuses on how institutions justify investment in OSS programs. The final section covers the ways that GLAM institutions determine the qualifications for OSS, their evaluation tactics, and their decision-making about long-term OSS maintenance.
Abstract: Public research policies have been promoting open-access publication in recent years as an adequate model for the dissemination of scientific knowledge. However, its use varies widely between disciplines. This study explores the determinants of open-access publication among academic researchers of economics and business, as well as their assessment of different economic measures focused on stimulating publication. To do so, a survey of Spanish business and economics researchers was conducted. They reported publishing an average of 19% of their work in open-access journals, whether hybrid or fully Gold Route open access. Almost 80% of the researchers foresee a future increase in the volume of open-access publications. When determining where to publish their research results, their main criterion for selecting a scientific journal is the impact factor. Regarding open access, the most valued aspect is the visibility and dissemination it provides. Although the cost of publication is not the most relevant criterion in the choice of a journal, three out of four researchers consider that a reduction in fees and an increase in funding are measures that would boost the open-access model.
From Google’s English: “Aware of the importance of promoting access to and dissemination of scientific research at all levels of society, and considering the UNESCO declaration on the importance of Open Science for expanding the social impact of science and responding to the changes, challenges, opportunities and risks of the digital age, the Knowledge, Free Software and Hardware Research Network (RICHSL), the Innovation and Transfer HUB of Quito, and the Openlab Ecuador Citizen Laboratory want to join forces to create a community of researchers, teachers, students and administrative staff interested in Open Science.
The following form seeks to gather the opinions of those interested in being part of this community, with the possibility of forming Working Groups on the subject at your institution, to work in a coordinated way to promote and strengthen Open Science….”
“The aim of this survey is to assess the levels of preprint sharing taking place using generalist repositories.
A preprint is defined as a scientific manuscript that has not undergone peer review, typically submitted to a public server/repository by the author. [Definition adapted from the ASAPbio description.]
A generalist repository is a repository that collects content from a variety of domains and content types, including institutional, national and international repositories (e.g. Zenodo, HAL, Harvard’s DASH repository).
With the COVID-19 pandemic, we have seen a rise in researchers sharing their preprints. Traditionally, institutional and generalist repositories have not played a significant role in hosting preprints. However, as the sharing of preprints becomes more widely embraced, these types of repositories are obvious mechanisms for expanding the preprint ecosystem internationally, without having to launch new preprint services.
This survey is targeted at institutional and other generalist repositories to gauge their current activities and future plans related to the collection of preprints. The survey will take only about 5 minutes and will be open from August 4 – September 10, 2021….”