This study considered all articles published in six Public Library of Science (PLOS) journals in 2012 and Web of Science citations for these articles as of May 2015. A total of 2,406 articles were analyzed to examine the relationships between Altmetric Attention Scores (AAS) and Web of Science citations. The AAS for an article, provided by Altmetric, aggregates activity surrounding research outputs in social media (news outlet mentions, tweets, blogs, Wikipedia, etc.). Spearman correlation tests were run on all articles and on the subset of articles with AAS. Further analyses compared datasets stratified by AAS percentile rank: top 50%, top 25%, top 10%, and top 1%. Comparisons across the six journals provided additional insights. The results show significant positive correlations of varying strength between AAS and citations for all articles and for articles with AAS (or social media mentions), as well as for normalized AAS in the top 50%, top 25%, top 10%, and top 1% datasets. Four of the six PLOS journals, Genetics, Pathogens, Computational Biology, and Neglected Tropical Diseases, show significant positive correlations across all datasets. However, for the two journals with high impact factors, PLOS Biology and Medicine, the results are unexpected: the Medicine articles showed no significant correlations, while the Biology articles showed significant correlations only for the whole dataset and the subset with AAS. Both journals published substantially fewer articles than the other four journals. Further research to validate the AAS algorithm, adjust the weighting scheme, and include appropriate social media sources is needed to understand the potential uses and meaning of AAS in different contexts and its relationship to other metrics.
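The core test in this study is Spearman's rank correlation between AAS and citation counts. As an illustration of the method (not the study's actual data or tooling), the sketch below implements Spearman's rho in pure Python, with average ranks for ties; the sample score/citation lists are invented.

```python
def ranks(xs):
    """1-based ranks of xs, averaging ranks over tied values (as Spearman's rho requires)."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(xs):
        j = i
        # Find the run of tied values starting at position i.
        while j + 1 < len(xs) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of the 1-based positions i+1 .. j+1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman's rho: Pearson correlation computed on the ranks of x and y."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Invented example: AAS values and citation counts for five hypothetical articles.
aas = [2, 15, 0, 40, 7]
citations = [3, 12, 1, 30, 9]
print(spearman(aas, citations))  # 1.0 — the two orderings agree perfectly here
```

A significance test (as in the study) would additionally require a p-value for rho, which libraries such as SciPy's `scipy.stats.spearmanr` provide.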
Based on the principles of the h‐index, I propose a new measure, the w‐index, as a particularly simple and more useful way to assess the substantial impact of a researcher's work, especially regarding excellent papers. The w‐index can be defined as follows: if w of a researcher's papers have at least 10w citations each and the other papers have fewer than 10(w+1) citations, that researcher's w‐index is w. The results demonstrate that there are noticeable differences between the w‐index and the h‐index, because the w‐index pays closer attention to the more widely cited papers. These discrepancies can be measured by comparing the ranks of 20 astrophysicists, a few famous physical scientists, and 16 Price medalists. Furthermore, I put forward the w(q)‐index to improve the discriminatory power of the w‐index and to rank scientists with the same w. The factor q is the least number of additional citations a researcher with w‐index w needs to reach w+1. In terms of both simplicity and accuracy, the w‐index or w(q)‐index can be widely used for evaluation of scientists, journals, conferences, scientific topics, research institutions, and so on.
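The definition above is directly computable. Below is a minimal sketch of the w‐index and one plausible reading of the w(q) refinement: q is computed as the fewest extra citations needed for w+1 papers to each reach 10(w+1), taking the cheapest route via the most-cited papers. The exact computation of q in the original proposal may differ; the citation counts in the example are invented.

```python
def w_index(citations):
    """Largest w such that w papers have at least 10*w citations each."""
    cites = sorted(citations, reverse=True)
    w = 0
    # The (w+1)-th most-cited paper must have >= 10*(w+1) citations to allow w+1.
    while w < len(cites) and cites[w] >= 10 * (w + 1):
        w += 1
    return w

def wq_index(citations):
    """Return (w, q): q is the least number of additional citations needed to
    raise the w-index to w+1, under the cheapest-route interpretation."""
    cites = sorted(citations, reverse=True)
    w = w_index(citations)
    target = 10 * (w + 1)
    # Cheapest route: the w+1 most-cited papers each need >= 10*(w+1) citations.
    top = cites[: w + 1]
    top += [0] * (w + 1 - len(top))  # pad if the researcher has only w papers
    q = sum(max(0, target - c) for c in top)
    return w, q

# Invented example: four papers with 120, 80, 30, and 5 citations.
print(w_index([120, 80, 30, 5]))   # 3 (three papers with at least 30 citations each)
print(wq_index([120, 80, 30, 5]))  # (3, 45): the third and fourth papers need 10 + 35 more
```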
scite is a Brooklyn-based organization that helps researchers better discover and understand research articles through Smart Citations: citations that display the context in which an article is cited and indicate whether the citing article provides supporting or contrasting evidence. scite is used by students and researchers from around the world and is funded in part by the National Science Foundation and the National Institute on Drug Abuse of the National Institutes of Health.