Merit Review Policy – [U of Maryland, Psychology Department]

“Examples of specific evaluative criteria to be used in merit review, based on professional standards for evaluating faculty performance….Openness and transparency: Degree to which research, data, procedures, code, and research products are made openly available where appropriate; the use of registered reports or pre-registration. Committee should recognize that researchers may not be able to share some types of data, such as when data are proprietary or subject to ethical concerns over confidentiality [7, 1, 6, 2, 5]. These limitations should be documented by faculty.”

 

Extending the open monitoring of open science – Archive ouverte HAL

Abstract:  We present a new Open Science Monitor framework at the country level for the case of France. We propose a fine-grained monitoring of the dynamics of open access to publications, based on historical data from Unpaywall, and thus limited to Crossref-DOI documents. The economic models of journals publishing French publications are analysed, as well as the open access dynamics by discipline and open access route (publishers and repositories). The French Open Science Monitor (BSO) website, https://frenchopensciencemonitor.esr.gouv.fr, presents the results to date (last observation date December 2021). 62% of the 170,000 French 2020 publications were available in December 2021, a rate that has increased by 10 points in one year. The level of open access varies significantly from one discipline to another. Some disciplines, such as the physical sciences and mathematics, have long been committed to opening up their publications, while others, such as chemistry, are rapidly catching up. In the context of the COVID-19 pandemic and the urgent need to open up scholarly outputs in the health field, a specific version of the French Open Science Monitor has been built: https://frenchopensciencemonitor.esr.gouv.fr/health. It monitors the open access dynamics of French publications in the biomedical field and analyses the transparency of the results of clinical trials and observational studies conducted in France: only 57% of clinical trials completed in the last 10 years have shared their results publicly. In contrast to other Open Science Monitoring initiatives, the source code and the data of the French Open Science Monitor are shared under an open licence. The source code is available on GitHub and split into modules, in particular for indicator computations (https://github.com/dataesr/bso-publications and https://github.com/dataesr/bso-clinical-trials) and the web user interface (https://github.com/dataesr/bso-ui). The data resulting from this work are shared on the French Ministry of Higher Education, Research and Innovation open data portal: https://data.enseignementsup-recherche.gouv.fr/explore/dataset/open-access-monitor-france/information/ and https://data.enseignementsup-recherche.gouv.fr/explore/dataset/barometre-sante-de-la-science-ouverte/information/. The originality of the French Open Science Monitor also lies in the fact that it can easily be adapted to the level of a higher education and research institution. To date, some twenty higher education and research institutions have already used it to obtain reliable and open indicators on the progress of open science in their scientific production.
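
The monitor's core open-access indicator rests on Unpaywall metadata for Crossref-DOI documents. As a minimal sketch only, the snippet below estimates an open-access rate for a small set of DOIs using the public Unpaywall REST API; the contact email and sample DOI are placeholders, and the BSO pipeline itself (see the GitHub modules above) works from bulk Unpaywall snapshots rather than per-DOI API calls.

```python
import requests

# Minimal sketch: estimate an open-access share for a set of Crossref DOIs
# using the public Unpaywall REST API (https://api.unpaywall.org/v2/{doi}).
# The email and DOI below are placeholders for illustration.

UNPAYWALL_URL = "https://api.unpaywall.org/v2/{doi}"
CONTACT_EMAIL = "you@example.org"  # Unpaywall asks for a contact email with each request

def oa_record(doi: str) -> dict:
    """Return the Unpaywall record for a single DOI."""
    resp = requests.get(UNPAYWALL_URL.format(doi=doi), params={"email": CONTACT_EMAIL})
    resp.raise_for_status()
    return resp.json()

def open_access_rate(dois: list[str]) -> float:
    """Fraction of DOIs that Unpaywall flags as open access (is_oa == True)."""
    records = [oa_record(doi) for doi in dois]
    return sum(1 for r in records if r.get("is_oa")) / len(records)

if __name__ == "__main__":
    sample = ["10.7717/peerj.4375"]  # placeholder DOI
    print(f"Open-access share: {open_access_rate(sample):.0%}")
```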

 

Biosecurity in an age of open science

Abstract:  The risk of accidental or deliberate misuse of biological research is increasing as biotechnology advances. As open science becomes widespread, we must consider its impact on those risks and develop solutions that ensure security while facilitating scientific progress. Here, we examine the interaction between open science practices and biosecurity and biosafety to identify risks and opportunities for risk mitigation. Increasing the availability of computational tools, datasets, and protocols could increase risks from research with misuse potential. For instance, in the context of viral engineering, open code, data, and materials may increase the risk of release of enhanced pathogens. For this dangerous subset of research, both open science and biosecurity goals may be achieved by using access-controlled repositories or application programming interfaces. While preprints accelerate dissemination of findings, their increased use could challenge strategies for risk mitigation at the publication stage. This highlights the importance of oversight earlier in the research lifecycle. Preregistration of research, a practice promoted by the open science community, provides an opportunity for achieving biosecurity risk assessment at the conception of research. Open science and biosecurity experts have an important role to play in enabling responsible research with maximal societal benefit.

 

Do German university medical centres promote robust and transparent research? A cross-sectional study of institutional policies | Health Research Policy and Systems | Full Text

Abstract:  Background

In light of replication and translational failures, biomedical research practices have recently come under scrutiny. Experts have pointed out that the current incentive structures at research institutions do not sufficiently incentivise researchers to invest in robustness and transparency and instead incentivise them to optimize their fitness in the struggle for publications and grants. This cross-sectional study aimed to describe whether and how relevant policies of university medical centres in Germany support the robust and transparent conduct of research and how prevalent traditional metrics are.

Methods

For 38 German university medical centres, we searched for institutional policies for academic degrees and academic appointments as well as websites for their core facilities and research in general between December 2020 and February 2021. We screened the documents for mentions of indicators of robust and transparent research (study registration; reporting of results; sharing of research data, code and protocols; open access; and measures to increase robustness) and for mentions of more traditional metrics of career progression (number of publications; number and value of awarded grants; impact factors; and authorship order).

Results

While open access was mentioned in 16% of PhD regulations, other indicators of robust and transparent research were mentioned in less than 10% of institutional policies for academic degrees and academic appointments. These indicators were more frequently mentioned on the core facility and general research websites. Institutional policies for academic degrees and academic appointments had frequent mentions of traditional metrics.

Conclusions

References to robust and transparent research practices are, with a few exceptions, generally uncommon in institutional policies at German university medical centres, while traditional criteria for academic promotion and tenure still prevail.
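
The screening step described in the Methods (checking policy documents for mentions of robustness/transparency indicators versus traditional metrics) can be illustrated with a crude keyword scan. This is a sketch under stated assumptions only: the keyword lists, categories, and directory layout below are invented for the example and are not the coding scheme used in the study.

```python
from pathlib import Path

# Illustrative keyword scan of policy documents (as plain text) for mentions of
# robustness/transparency indicators versus traditional metrics. Keyword lists
# and folder layout are assumptions for this example, not the study's scheme.

INDICATORS = {
    "study registration": ["study registration", "preregistration", "pre-registration"],
    "data/code sharing": ["data sharing", "code sharing", "research data", "open code"],
    "open access": ["open access"],
}
TRADITIONAL = {
    "publication counts": ["number of publications", "publication count"],
    "impact factor": ["impact factor"],
    "grants": ["grant", "third-party funding"],
}

def screen(text: str, scheme: dict[str, list[str]]) -> dict[str, bool]:
    """Flag which categories are mentioned at least once in a document."""
    lower = text.lower()
    return {category: any(term in lower for term in terms) for category, terms in scheme.items()}

if __name__ == "__main__":
    for doc in Path("policies").glob("*.txt"):  # hypothetical folder of extracted policy texts
        text = doc.read_text(encoding="utf-8")
        print(doc.name, screen(text, INDICATORS), screen(text, TRADITIONAL))
```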

Investigating the Effectiveness of the Open Data Badge Policy at Psychological Science Through Computational Reproducibility

Abstract:  In April 2019, Psychological Science published its first issue in which all research articles received the Open Data badge. We used that issue to investigate the effectiveness of this badge, focusing on the adherence to its stated aim at Psychological Science: ensuring reproducibility of results. Twelve researchers of varying experience levels attempted to reproduce the results of the empirical articles in the target issue (at least three researchers per article). We found that all articles provided at least some data, 6/14 articles provided analysis code or scripts, only 1/14 articles was rated to be exactly reproducible, and 3/14 essentially reproducible with minor deviations. We recommend that Psychological Science require a check of reproducibility at the peer review stage before awarding badges, and that the Open Data badge be renamed “Open Data and Code” to avoid confusion and encourage researchers to adhere to this higher standard.

 

Data and Software for Authors | AGU

“AGU requires that the underlying data needed to understand, evaluate, and build upon the reported research be available at the time of peer review and publication. Additionally, authors should make available software that has a significant impact on the research. This entails:

Depositing the data and software in a community accepted, trusted repository, as appropriate, and preferably with a DOI
Including an Availability Statement as a separate paragraph in the Open Research section explaining to the reader where and how to access the data and software
And including citation(s) to the deposited data and software, in the Reference Section….”

The doors of precision: Reenergizing psychiatric drug development with psychedelics and open access computational tools

“In a truly remarkable way, the study was performed at essentially no additional cost. Ballentine et al. (3) made use of existent, openly available resources: the Erowid psychedelic “experience vault,” the pharmacokinetic profiles of each psychedelic, the Allen Human Brain gene transcription profiles, and the Schafer-Yeo brain atlas that mapped gene transcript to brain structure. The computational tools—primarily python toolboxes—that Ballentine et al. deployed were also available at no cost. So in the same way that the psychedelics industry is repurposing old drugs, Ballentine et al. repurposed old data and tools to define a new framework….”


Which solutions best support sharing and reuse of code? – The Official PLOS Blog

“PLOS has released a preprint and supporting data on research conducted to understand the needs and habits of researchers in relation to code sharing and reuse as well as to gather feedback on prototype code notebooks and help determine strategies that publishers could use to increase code sharing.

Our previous research led us to implement a mandatory code sharing policy at PLOS Computational Biology in March 2021 to increase the amount of code shared alongside published articles. As well as exploring policy to support code sharing, we have also been collaborating with NeuroLibre, an initiative of the Canadian Open Neuroscience Platform, to learn more about the potential role of technological solutions for enhancing code sharing. Neurolibre is one of a growing number of interactive or executable technologies for sharing and publishing research, some of which have become integrated with publishers’ workflows….”

A survey of researchers’ code sharing and code reuse practices, and assessment of interactive notebook prototypes | OSF Preprints

Cadwallader, L., & Hrynaszkiewicz, I. (2022, March 2). A survey of researchers’ code sharing and code reuse practices, and assessment of interactive notebook prototypes. https://doi.org/10.31219/osf.io/tys8p

Abstract: This research aimed to understand the needs and habits of researchers in relation to code sharing and reuse; gather feedback on prototype code notebooks created by Neurolibre; and help determine strategies that publishers could use to increase code sharing. We surveyed 188 researchers in computational biology. Respondents were asked how often and why they look at code, which methods of accessing code they find useful and why, what aspects of code sharing are important to them, and how satisfied they are with their ability to complete these tasks. Respondents were asked to look at a prototype code notebook and give feedback on its features. Respondents were also asked how much time they spent preparing code and whether they would be willing to increase this to use a code sharing tool, such as a notebook. For readers of research articles, the most common reason (70%) for looking at code was to gain a better understanding of the article. The most commonly encountered method for code sharing – linking articles to a code repository – was also the most useful method of accessing code from the reader’s perspective. As authors, the respondents were largely satisfied with their ability to carry out tasks related to code sharing. The most important of these tasks were ensuring that the code ran in the correct environment and sharing code with good documentation. The average researcher, according to our results, is unwilling to incur the additional costs (in time, effort or expenditure) that are currently needed to use code sharing tools alongside a publication. We infer that this means we need different models for funding and producing interactive or executable research outputs if they are to reach a large number of researchers. For the purpose of increasing the amount of code shared by authors, PLOS Computational Biology is, as a result, focusing on policy rather than tools.

An Open Access Resource for Functional Brain Connectivity from Fully Awake Marmosets: Open Access Marmoset Functional Connectivity Resource – ScienceDirect

Abstract:  The common marmoset (Callithrix jacchus) is quickly gaining traction as a premier neuroscientific model. However, considerable progress is still needed in understanding the functional and structural organization of the marmoset brain to rival that documented in long-standing preclinical model species, like mice, rats, and Old World primates. To accelerate such progress, we present the Marmoset Functional Connectivity Resource (marmosetbrainconnectome.org), consisting of over 70 hours of resting-state fMRI (RS-fMRI) data acquired at 500 µm isotropic resolution from 31 fully awake marmosets in a common stereotactic space. Three-dimensional functional connectivity (FC) maps for every cortical and subcortical gray matter voxel are stored online. Users can instantaneously view, manipulate, and download any whole-brain functional connectivity (FC) topology (at the subject- or group-level) along with the raw datasets and preprocessing code. Importantly, researchers can use this resource to test hypotheses about FC directly – with no additional analyses required – yielding whole-brain correlations for any gray matter voxel on demand. We demonstrate the resource’s utility for presurgical planning and comparison with tracer-based structural connectivity as proof of concept. Complementing existing structural connectivity resources for the marmoset brain, the Marmoset Functional Connectivity Resource affords users the distinct advantage of exploring the connectivity of any voxel in the marmoset brain, not limited to injection sites nor constrained by regional atlases. With the entire raw database (RS-fMRI and structural images) and preprocessing code openly available for download and use, we expect this resource to be broadly valuable to test novel hypotheses about the functional organization of the marmoset brain.
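
The on-demand maps the resource describes amount to seed-based functional connectivity: correlating one voxel's resting-state time series with every other voxel. The sketch below shows that core operation with NumPy on toy data; the array shapes, variable names, and random data are assumptions for illustration, and the actual resource serves precomputed maps together with the full preprocessing code for the 500 µm datasets.

```python
import numpy as np

# Minimal sketch of seed-based functional connectivity: Pearson correlation of
# one voxel's resting-state time series with all voxels. Shapes and data are
# illustrative toy values, not the resource's real 500 µm marmoset data.

def seed_fc_map(timeseries: np.ndarray, seed_index: int) -> np.ndarray:
    """Correlate the seed voxel with every voxel.

    timeseries: array of shape (n_voxels, n_timepoints), already preprocessed.
    Returns an array of shape (n_voxels,) with values in [-1, 1].
    """
    ts = timeseries - timeseries.mean(axis=1, keepdims=True)   # demean each voxel
    ts /= np.linalg.norm(ts, axis=1, keepdims=True)            # unit-normalise each voxel
    return ts @ ts[seed_index]                                  # dot products = correlations

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    toy = rng.standard_normal((1000, 300))   # 1,000 voxels x 300 timepoints (toy data)
    fc = seed_fc_map(toy, seed_index=42)
    print(fc.shape, fc[42])                  # the seed correlates perfectly with itself (1.0)
```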

 

Open Research Principles – CHORUS

“Open research is concerned with making scientific research more transparent, more collaborative and more efficient. Other aspects are more open forms of collaboration and engagement with a wider audience. The following principles were set forth by CHORUS in support of open research.

We believe in sustainable Open Access practices and workflows.
We believe that it should be easy for researchers to understand how publishing in our publications will support them in complying with funder OA mandates.
We believe users should be directed to the best version of an article available to them, ideally the Version of Record on the publisher site, where they may find essential context, tools, and information.
We believe all parties should be able to track funded research literature with minimal administrative overheads.
We believe data associated with research, as well as methods and code, should comply with relevant FAIR principles, taking into account differences between fields and categories of research objects.”

An open repository of real-time COVID-19 indicators | PNAS

Abstract:  The COVID-19 pandemic presented enormous data challenges in the United States. Policy makers, epidemiological modelers, and health researchers all require up-to-date data on the pandemic and relevant public behavior, ideally at fine spatial and temporal resolution. The COVIDcast API is our attempt to fill this need: Operational since April 2020, it provides open access to both traditional public health surveillance signals (cases, deaths, and hospitalizations) and many auxiliary indicators of COVID-19 activity, such as signals extracted from deidentified medical claims data, massive online surveys, cell phone mobility data, and internet search trends. These are available at a fine geographic resolution (mostly at the county level) and are updated daily. The COVIDcast API also tracks all revisions to historical data, allowing modelers to account for the frequent revisions and backfill that are common for many public health data sources. All of the data are available in a common format through the API and accompanying R and Python software packages. This paper describes the data sources and signals, and provides examples demonstrating that the auxiliary signals in the COVIDcast API present information relevant to tracking COVID activity, augmenting traditional public health reporting and empowering research and decision-making.
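
The paper notes that all signals are available through the API and accompanying R and Python packages. As a minimal sketch assuming the covidcast Python client, the snippet below fetches one county-level signal for a week of data; the data source and signal names ("doctor-visits", "smoothed_adj_cli") and the date range are examples, and the API documentation lists the full set of available signals and geographies.

```python
from datetime import date

import covidcast  # Delphi's Python client for the COVIDcast Epidata API

# Minimal sketch: fetch one week of a county-level COVIDcast signal.
# Source/signal names and dates are illustrative examples only.

df = covidcast.signal(
    "doctor-visits",       # data source: de-identified outpatient claims
    "smoothed_adj_cli",    # signal: smoothed, adjusted COVID-like illness rate
    start_day=date(2021, 1, 1),
    end_day=date(2021, 1, 7),
    geo_type="county",
)
print(df[["geo_value", "time_value", "value"]].head())
```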