A small step in the right direction for open data in Chemistry.
Category Archives: News blog
Perceptions and Practices of #Replication by Social and Behavioral Scientists: Results from a Survey
Researchers from the German Institute for Economic Research in Berlin present the results of a recent survey of social and behavioral researchers on data sharing and replication. The working paper is out now.
Research data explored: an extended analysis of citations and altmetrics
In this study, we explore the citedness of research data, its distribution over time, and its relation to the availability of a digital object identifier (DOI) in the Thomson Reuters database Data Citation Index (DCI). We investigate whether cited research data “impacts” the (social) web, as reflected by altmetrics scores, and whether there is any relationship between the number of citations and the sum of altmetrics scores from various social media platforms. Three tools are used to collect altmetrics scores, namely PlumX, ImpactStory, and Altmetric.com, and the corresponding results are compared. We found that of the three altmetrics tools, PlumX has the best coverage. Our experiments revealed that research data remain mostly uncited (about 85%), although there has been an increase in citations of data sets published since 2008. The percentage of cited research data with a DOI in the DCI has decreased in recent years. Only nine repositories account for research data with DOIs and two or more citations. The number of cited research data with altmetrics “footprints” is even lower (4–9%) but shows a higher coverage of research data from the last decade. In our study, we also found no correlation between the number of citations and the total altmetrics score. Yet certain data types (i.e. survey, aggregate data, and sequence data) are more often cited and also receive higher altmetrics scores. Additionally, we performed citation and altmetric analyses of all research data published between 2011 and 2013 in four different disciplines covered by the DCI. In general, these results correspond well with those obtained for research data cited at least twice, and likewise show low citation and altmetrics counts. Finally, we observed disciplinary differences in the availability and extent of altmetrics scores.
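The analysis the abstract describes — the share of uncited records and a (non-)correlation between citation counts and summed altmetrics scores — can be sketched in a few lines. This is a minimal illustration with invented data, not the study's actual pipeline; the record values and the choice of Spearman rank correlation are assumptions.

```python
# Sketch of the analysis described above, on hypothetical records:
# fraction of uncited data sets, plus a rank correlation between
# citation counts and summed altmetrics scores.

def rank(values):
    """Average 1-based ranks, handling ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average rank of tied positions i..j
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation on the ranks."""
    rx, ry = rank(x), rank(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical records: (citation count, total altmetrics score)
records = [(0, 0), (0, 1), (2, 0), (5, 3), (0, 0), (1, 0), (0, 2), (3, 1)]
citations = [c for c, _ in records]
altmetrics = [a for _, a in records]

uncited_share = sum(c == 0 for c in citations) / len(citations)
print(f"uncited: {uncited_share:.0%}")
print(f"spearman rho: {spearman(citations, altmetrics):.2f}")
```

With real DCI data one would of course work from the full record set and test the correlation for significance; the sketch only shows the shape of the computation.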
Read @petersuber’s writings on #OpenAccess
Peter Suber’s excellent readings on Open Access; of course free to download.
A Research Symbiont: Data Sharing
Benedikt Fecher and Gert Wagner in a recent Science letter on credit for academic data sharing.
Misconceptions about academic data sharing #datasharing #openscience
Gert Wagner and Benedikt Fecher reply to an editorial about data sharing in medicine.
Longo and Drazen miss the very point of scientific research when they write that researchers may “even use the data to try to disprove what the original investigators had posited”. It is at the core of the scientific paradigm that researchers take nothing as final truth. This is what Popper proposed in his critical rationalism and Merton in his conceptualization of skepticism.
NEJM Editorial and the journals reply #datasharing
Last week, Longo and Drazen published a frantic editorial in the New England Journal of Medicine on academic data sharing, implying that researchers who use data from other researchers are “research parasites”. The journal replied:
We want to clarify, given recent concern about our policy, that the Journal is committed to data sharing in the setting of clinical trials. As stated in the Institute of Medicine report from the committee on which I served and the recent editorial by the International Committee of Medical Journal Editors (ICMJE), we believe there is a moral obligation to the people who volunteer to participate in these trials to ensure that their data are widely and responsibly used.
Launch of Digital Archive for Historical Research #digitalhumanities
Today Cendari (Collaborative European Digital Archive Infrastructure) has been launched. It is described as a “powerful toolkit for digital historical research”.
Wikidata for research
Wiki4R will create an innovative virtual research environment (VRE) for Open Science at scale, engaging both professional researchers and citizen data scientists in new and potentially transformative forms of collaboration.
The ResearchGate Score: a good example of a bad metric
According to ResearchGate, the academic social networking site, its RG Score is “a new way to measure your scientific reputation”. Given such lofty aims, Peter Kraker, Katy Jordan and Elisabeth Lex take a closer look at the opaque metric. By reverse engineering the score, they find that a significant weight is attached to ‘impact points’ – a metric similar to the widely discredited journal impact factor. Transparency in metrics is the only way scholarly measures can be put into context and the only way biases – which are inherent in all socially created metrics – can be uncovered.
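The kind of “reverse engineering” described above amounts to regressing observed composite scores on candidate components and checking how much variance a single component explains. The sketch below does this with ordinary least squares on invented numbers; the data, the single-predictor setup, and the component choice (‘impact points’) are illustrative assumptions, not the authors’ actual method or real RG Scores.

```python
# Hypothetical sketch: fit observed composite scores against one
# candidate component (impact points) and report how much variance
# that component alone explains. All numbers are invented.

def linfit(x, y):
    """Ordinary least squares y = a*x + b; returns (a, b, r_squared)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((b - (slope * a + intercept)) ** 2 for a, b in zip(x, y))
    ss_tot = sum((b - my) ** 2 for b in y)
    r2 = 1 - ss_res / ss_tot
    return slope, intercept, r2

impact_points = [1.2, 3.5, 7.0, 12.4, 20.1]   # invented profile data
observed_scores = [2.1, 4.9, 9.2, 15.8, 24.7]  # invented composite scores
a, b, r2 = linfit(impact_points, observed_scores)
print(f"score ~= {a:.2f} * impact_points + {b:.2f} (R^2 = {r2:.2f})")
```

A high R² from one opaque component is exactly the finding the post criticizes: if ‘impact points’ dominate the fit, the composite score inherits the impact factor’s known flaws.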