“Institutional and subject repositories are excellent locations for making research outputs publicly accessible. Researchers can also share their work with the public through a variety of alternative dissemination mechanisms, including ResearchGate, Academia.edu and others. One of the most effective techniques for boosting a research paper’s visibility and citation count is open-access (OA) publication, because it makes the study publicly accessible from the very beginning. By making all of their outputs publicly accessible, researchers can boost their visibility, preserve their work and make it available for future use. Ogunleye (2019) conducted a study titled “Some determinants of visibility boost for research publications among early career educational researchers in southwest, Nigeria”. The study examined several determinants of publication visibility (shared reference databases, research profiles, publishing in OA, self-archiving, publication metadata and social media platforms) among early-career educational researchers in Southwest Nigeria. A structured questionnaire on factors determining publication boost (r = 0.81) was used to collect data, and multiple regression analysis and Pearson’s correlation were employed to analyse the data. The results revealed a significant positive correlation between the accessibility of published work and each of the following: shared reference databases (r = 0.17), publication metadata (r = 0.23), research profiles (r = 0.44), open-access publishing (r = 0.27), self-archiving (r = 0.52) and social media networks (r = 0.43). Together, the six variables had a positive correlation with publication visibility (R = 0.60) and accounted for 32.9% of the variance in the visibility of early-career researchers’ publications. 
Norman (2012) conducted a study titled “Maximizing Journal Article Citation Online: Readers, Robots, and Research Visibility”. He explained that peer-reviewed online academic publications offer numerous advantages for researchers: they can enhance an article’s popularity and publicity, rapidly connect a researcher’s work to the relevant web of existing literature, and attract the attention of other scholars, increasing the likelihood of the work being used. Norman also identified five basic strategies for making the literature more visible: choosing a search-engine-friendly title, writing effective abstracts and introductions, making the article easy to find, using media and links, and disseminating articles after publication, all with an emphasis on increasing a piece of content’s prospects of future downloads, citations and visibility.”
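The Pearson correlations and the share of variance explained reported in the Ogunleye (2019) study can be illustrated with a minimal pure-Python sketch. The data below are hypothetical, invented only for illustration; they are not from the study.

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical scores: self-archiving activity vs. publication visibility
self_archiving = [1, 2, 2, 3, 4, 5, 5, 6]
visibility     = [2, 1, 3, 3, 5, 4, 6, 6]

r = pearson_r(self_archiving, visibility)
# Squaring a correlation gives the proportion of variance explained,
# which is how a figure like "32.9%" arises from an R of about 0.60.
print(f"r = {r:.2f}, variance explained = {r * r:.1%}")
```

In a multiple regression, as used in the study, the multiple R plays the same role across several predictors at once, with R² giving the jointly explained variance.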
“In September, Ithaka S+R and the Association of University Presses published the report “Print Revenue and Open Access Monographs: A University Press Study,” as well as its affiliated data set. Join authors of the study during OA Week 2023 to discuss the findings in this report, and share ideas about how university presses can use the information to develop sustainable OA monograph publishing solutions. This research was funded by a Level I Digital Humanities Advancement Grant from the National Endowment for the Humanities (NEH).”
“Twenty years ago this month, PLOS Biology was launched, helping to catalyze a movement that has transformed publishing in the life sciences. In this issue, we explore how the community can continue innovating for positive change in the next decades….”
Abstract: This guide focuses specifically on data from the data provider and company, Altmetric, but other types of altmetrics are mentioned and occasionally used as a comparison in this guide, such as the Open Syllabus database, used to gauge educational engagement with scholarly outputs. This guide opens with an introduction followed by an overview of Altmetric and the Altmetric Attention Score, Altmetrics and Responsible Research Assessment, Output Types Tracked by Altmetric, and the Altmetric Sources of Attention, which include: News and Mainstream Media, Social Media (X (formerly Twitter), Facebook, Reddit, and historical data from Google+, Pinterest, LinkedIn, and Sina Weibo); Patents, Peer Review, Syllabi (historical data only), Multimedia, Public Policy Documents, Wikipedia, Research Highlights, Reference Managers, and Blogs; finally, there is a conclusion, a list of related resources and readings, two appendices, and references. This guide is intended for use by librarians, practitioners, funders, and other users of Altmetric data, or those who are interested in incorporating altmetrics into their bibliometric practice and/or research analytics. It can also help researchers who are going up for annual evaluations and promotion and tenure reviews, enabling them to use the data in informed and practical applications. It can also be a useful reference guide for research managers and university administrators who want to understand the broader online engagement with research publications beyond traditional scholarly citations (also known as bibliometrics), but who also want to avoid misusing, misinterpreting, or abusing Altmetric data when making decisions, creating policies, and evaluating faculty members and researchers at their institutions.
“cOAlition S is pleased to announce that the tender process for a study to assess the impact of Plan S on the global scholarly communication ecosystem has been successfully completed. The tender has been awarded to scidecode science consulting, an international team of experts with extensive consulting experience and project work within the scholarly communication domain.
To assess the impact Plan S has had on the scholarly communication ecosystem and on facilitating research to be published Open Access, scidecode will follow a multifaceted approach, encompassing both quantitative econometrics and a qualitative methodology based on desk research, a comprehensive literature analysis, and in-depth interviews with key stakeholders. These stakeholders will include research funders, advocates for institutional Open Access initiatives, publishers and researchers.
The study is anticipated to deliver valuable insights into the effectiveness of cOAlition S in achieving its objectives and to provide actionable recommendations for improving and expanding upon these. The findings and recommendations are expected to be published in mid-2024….”
Abstract: Academic journals have been publishing the results of biomedical research for more than 350 years. Reviewing their history reveals that the ways in which journals vet submissions have changed over time, culminating in the relatively recent appearance of the current peer-review process. Journal brand and Impact Factor have meanwhile become quality proxies that are widely used to filter articles and evaluate scientists in a hypercompetitive prestige economy. The Web created the potential for a more decoupled publishing system in which articles are initially disseminated by preprint servers and then undergo evaluation elsewhere. To build this future, we must first understand the roles journals currently play and consider what types of content screening and review are necessary and for which papers. A new, open ecosystem involving preprint servers, journals, independent content-vetting initiatives, and curation services could provide more multidimensional signals for papers and avoid the current conflation of trust, quality, and impact. Academia should strive to avoid the alternative scenario, however, in which stratified publisher silos lock in submissions and simply perpetuate this conflation.
Abstract: Open access has presented a fresh challenge to the publishing and scholarly communication sectors since the start of the twenty-first century. For the previous two decades, libraries have struggled to keep their journal subscriptions at a level that will support their research and development efforts, owing to rising publication fees and stagnant budgets. In the interim, the Web’s ability to publish academic articles in the public domain has opened up new communication channels for the scientific community. Along with various OA techniques, supporting business models have been proposed. The authors review a number of recent studies to demonstrate the effect of open access on the use and citation of scholarly research. According to these studies, open access (OA) has a significant impact on scientific communication, since it helps boost the citation impact of journals and makes scientific research more visible and accessible. The authors are optimistic about the future of OA.
“We examine the impact of the U.S. Department of Energy’s open-access mandate
Scientific articles subject to the mandate were utilized on average 42% more in patents
Articles subject to the mandate were not cited more frequently by other academic papers
Small firms were the primary beneficiaries of the increased knowledge diffusion…”
Abstract: The value of articles published in journals devoted to the scholarship of teaching and learning (SOTL) constitutes a relatively unexplored topic of inquiry within the broader field of SOTL research. This article addresses this topic using citations and four types of altmetrics as indicators of value. We used a sample of 100 articles published in four SOTL-focused journals: two high-consensus journals (BioScience: Journal of College Biology Teaching and The Journal of Chemical Education) and two low-consensus journals (Teaching History and Teaching Sociology). In addition to the level of consensus of the discipline of these journals, we also measured the institutional type of the first authors of these articles and the type of study of the article. We advanced three conclusions from our data analysis, the first of which is of particular significance to SOTL work. This conclusion is that the pattern of findings of this study cries out fairly loudly that articles published in SOTL-focused journals hold value for users of the articles, as expressed through citations of them as well as mentions of them through various altmetrics. Moreover, this value is of similar magnitude regardless of the institutional type of the article’s first author and whether the article recommended a practice or recommended content. However, the value ascribed to articles differs according to the level of consensus of the field of the SOTL journal, as shown by differences in article views, Twitter mentions and Mendeley uses.
Abstract: Multiple studies across a variety of scientific disciplines have shown that the number of times that a paper is shared on Twitter (now called X) is correlated with the number of citations that paper receives. However, these studies were not designed to answer whether tweeting about scientific papers causes an increase in citations, or whether they were simply highlighting that some papers have higher relevance, importance or quality and are therefore both tweeted about more and cited more. The authors of this study are leading science communicators on Twitter from several life science disciplines, with substantially higher follower counts than the average scientist, making us uniquely placed to address this question. We conducted a three-year-long controlled experiment, randomly selecting five articles published in the same month and journal, and randomly tweeting one while retaining the others as controls. This process was repeated for 10 articles from each of 11 journals, recording Altmetric scores, number of tweets, and citation counts before and after tweeting. Randomization tests revealed that tweeted articles were downloaded 2.6–3.9 times more often than controls immediately after tweeting, and retained significantly higher Altmetric scores (+81%) and number of tweets (+105%) three years after tweeting. However, while some tweeted papers were cited more than their respective control papers published in the same journal and month, the overall increase in citation counts after three years (+7% for Web of Science and +12% for Google Scholar) was not statistically significant (p > 0.15). Therefore while discussing science on social media has many professional and societal benefits (and has been a lot of fun), increasing the citation rate of a scientist’s papers is likely not among them.
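The randomization tests the abstract above relies on can be sketched as a simple permutation test on the difference in group means. This is a minimal illustration with made-up download counts, not the study’s data or its exact procedure.

```python
import random

def permutation_test(treated, controls, n_perm=10_000, seed=42):
    """One-sided two-sample permutation test on the difference in means.

    Repeatedly shuffles group labels and returns the proportion of shuffles
    whose mean difference is at least as large as the observed one.
    """
    rng = random.Random(seed)
    pooled = list(treated) + list(controls)
    n_t = len(treated)
    observed = sum(treated) / n_t - sum(controls) / len(controls)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        t, c = pooled[:n_t], pooled[n_t:]
        if sum(t) / len(t) - sum(c) / len(c) >= observed:
            hits += 1
    return hits / n_perm

# Hypothetical download counts: tweeted articles vs. same-journal controls
tweeted  = [310, 280, 420, 350, 390]
controls = [120, 95, 140, 110, 130, 105, 150, 125]

p = permutation_test(tweeted, controls)
print(f"one-sided p = {p:.4f}")
```

The appeal of this design is that it makes no distributional assumptions: under the null hypothesis that tweeting has no effect, the group labels are exchangeable, so the observed difference can be compared directly against the shuffled-label distribution.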
Abstract: Institutional Repository (IR) development in Tanzania has made publications readily available, accessible, and retrievable. IRs have increased the visibility of researchers and institutions and have contributed to university rankings. Several Higher Learning Institutions (HLIs) in Tanzania have developed IRs hosting institutional publications. This study assessed the citation impact of the IR contents of selected Tanzanian HLIs, using publications indexed in the Scopus database. Four HLIs were purposively selected, and Scopus’s “search within references” advanced feature was used to identify and extract the publications indexed in Scopus that cited the selected IR contents from 2018 to 2022. Data analysis was carried out using Microsoft Excel and SPSS. The findings indicated that the Tanzanian IR contents had a low citation impact. The study recommends that Tanzanian HLIs devise strategies for increasing IR content visibility, such as registering the IRs on online platforms and implementing the Handle System to improve the accessibility of IR content. Furthermore, the HLIs should raise awareness of research visibility, enabling researchers to publish and increase their visibility.
“At OpenStax, we are driven by a clear and powerful mission: to improve educational access and learning for everyone. Rooted in the belief that education is a public good, we strive to offer products, innovative research, and services that benefit educators and learners worldwide. Our approach is simple but impactful—we listen to the needs of the educational community, secure philanthropic support and community donations for funding, and embark on a rigorous development process.
Since our inception, OpenStax has grown to offer an impressive range of 65 textbooks, a testament to our commitment to providing comprehensive learning materials. Since our first textbook launch in 2012, we’ve already saved more than 36 million students an astounding $2.9 billion. This past school year, more than 7 million students utilized OpenStax materials….”
Abstract: During career advancement and funding allocation decisions in biomedicine, reviewers have traditionally depended on journal-level measures of scientific influence like the impact factor. Prestigious journals are thought to pursue a reputation of exclusivity by rejecting large quantities of papers, many of which may be meritorious. It is possible that this process could create a system whereby some influential articles are prospectively identified and recognized by journal brands but most influential articles are overlooked. Here, we measure the degree to which journal prestige hierarchies capture or overlook influential science. We quantify the fraction of scientists’ articles that would receive recognition because (a) they are published in journals above a chosen impact factor threshold, or (b) are at least as well-cited as articles appearing in such journals. We find that the number of papers cited at least as well as those appearing in high-impact factor journals vastly exceeds the number of papers published in such venues. At the investigator level, this phenomenon extends across gender, racial, and career stage groupings of scientists. We also find that approximately half of researchers never publish in a venue with an impact factor above 15, which under journal-level evaluation regimes may exclude them from consideration for opportunities. Many of these researchers publish equally influential work, however, raising the possibility that the traditionally chosen journal-level measures that are routinely considered under decision-making norms, policy, or law, may recognize as little as 10-20% of the work that warrants recognition.
“The enormous difference in sheer volume means that an OA megajournal is likely to have quite a few papers with more cites than the Nature median — high impact work that we would miss entirely if we focused only on the JIF. The flip side is where we find the halo effect: there are, in any given year, hundreds of Nature papers that underperform quite a bit relative to the IF (indeed half of them underperform relative to the median). This —the skewed distributions for both the megajournal and the glamour journal— shows why it is a bad idea to ascribe properties to individual papers based on how other papers published under the same flag have been cited….”
Abstract: Access to scientific data can enable independent reuse and verification; however, most data are not available and become increasingly irrecoverable over time. This study aimed to retrieve and preserve important datasets from 160 of the most highly-cited social science articles published between 2008-2013 and 2015-2018. We asked authors if they would share data in a public repository — the Data Ark — or provide reasons if data could not be shared. Of the 160 articles, data for 117 (73%, 95% CI [67% – 80%]) were not available and data for 7 (4%, 95% CI [0% – 12%]) were available with restrictions. Data for 36 (22%, 95% CI [16% – 30%]) articles were available in unrestricted form: 29 of these datasets were already available and 7 datasets were made available in the Data Ark. Most authors did not respond to our data requests and a minority shared reasons for not sharing, such as legal or ethical constraints. These findings highlight an unresolved need to preserve important scientific datasets and increase their accessibility to the scientific community.