Authors, editors, and reviewers need an accurate perception of a manuscript’s quality in order to improve their skills, save effort, and prevent errors that can affect the submission process. In this paper, we compared authors’ perceptions of a manuscript’s quality with the manuscript’s actual impact, and we analyzed the uncertainty in those perceptions. From there, we defined ‘partition’ as the author’s ability to perceive the actual quality. To do this, we launched a website for the scientific community: a tool designed to help investigators understand and recognize the quality of a manuscript, so that researchers can improve their work and maximize its potential impact. We carried out the experiment with 106 experienced users who tested our webpage. We found that the Abstract, the Title, and the Keywords were sufficient for a reasonably accurate evaluation of a manuscript; most researchers were able to judge the quality of a paper from this small amount of information in less than a minute.
The scientific community has reacted to the COVID-19 outbreak by producing a large number of scholarly works that are helping us understand a variety of topics related to the pandemic from different perspectives. Dealing with this volume of information can be challenging, especially when researchers need answers to complex questions about specific topics. We present an Information Retrieval System that uses latent information to select relevant works related to specific concepts. By applying Latent Dirichlet Allocation (LDA) models to the documents, we identify the key concepts that relate a specific query to a corpus. Our method is iterative: starting from an initial query defined by the user, the query is expanded at each subsequent iteration. In addition, our method can work with a limited amount of information per article. We tested the performance of our proposal using human validation and two evaluation strategies, achieving good results with both. For the first strategy, we performed two surveys to determine the performance of our model; for all the categories studied, precision was always greater than 0.6 and accuracy was always greater than 0.8. The second strategy also showed good results, achieving a precision of 1.0 for one category and scoring over 0.7 overall.
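The iterative expansion loop described above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: plain term co-occurrence in the top-ranked documents stands in for the LDA topic inference, and the corpus, round counts, and cutoffs are all invented toy values.

```python
from collections import Counter

def expand_query(query, corpus, rounds=2, top_docs=2, new_terms=2):
    """Iteratively expand a query with terms that co-occur in the
    best-matching documents. In the paper the latent concepts come
    from an LDA model; simple co-occurrence stands in for it here."""
    query = set(query)
    for _ in range(rounds):
        # Rank documents by overlap with the current query.
        ranked = sorted(corpus, key=lambda d: len(query & set(d)), reverse=True)
        # Collect candidate terms from the top-ranked documents.
        counts = Counter(t for doc in ranked[:top_docs]
                         for t in doc if t not in query)
        # Add the most frequent candidates to the query.
        query |= {t for t, _ in counts.most_common(new_terms)}
    return query

# Toy corpus: each document is its list of tokens.
corpus = [
    ["covid", "vaccine", "immunity", "antibody"],
    ["covid", "transmission", "aerosol"],
    ["vaccine", "antibody", "trial"],
    ["economy", "inflation"],
]
print(sorted(expand_query(["covid"], corpus)))
```

Starting from the single term "covid", each round pulls in related terms such as "vaccine" from the documents the current query retrieves, mirroring how the original query grows across iterations.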
In this paper, we propose a method that helps authors choose alternative Keywords that make their papers more visible. These alternative keywords must have a certain level of popularity in the scientific community and, at the same time, face fewer competitors, i.e., fewer other papers containing the same keywords. Having fewer competitors allows the author’s paper to be consulted more frequently. To recommend keywords, we first compute an Attention-Survival score: the attention score is derived from a keyword’s popularity, and the survival score from the number of manuscripts using the same keyword. With these two scores, we created a new algorithm that finds alternative keywords with a high Attention-Survival score. We used ontologies to ensure that the alternative keywords proposed by our method are semantically related to the original keywords that authors wish to refine; the hierarchical structure of an ontology supports the relationship between the alternative keywords and the input keywords. To test the method’s sensitivity to the choice of ontology, we used two sources: WordNet and the Computer Science Ontology. Finally, we launched a survey to obtain human validation of our algorithm, using keywords from Web of Science papers and three ontologies: WordNet, CSO, and DBpedia. We obtained good results in all our tests.
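The ranking idea behind the Attention-Survival score can be sketched as follows. The abstract does not publish the exact formula, so this is a hypothetical combination: attention is popularity normalized to [0, 1], survival is the inverse of the competitor count, and their product ranks the candidates. The candidate set, which in the paper would come from an ontology such as WordNet or CSO, is simply passed in here, and all data values are invented.

```python
def attention_survival(keyword, popularity, competitors):
    """Hypothetical combination of the two scores described in the
    abstract: attention grows with a keyword's popularity, survival
    shrinks with the number of competing manuscripts. The product is
    an illustrative choice, not the paper's published formula."""
    max_pop = max(popularity.values())
    attention = popularity.get(keyword, 0) / max_pop    # in [0, 1]
    survival = 1.0 / (1 + competitors.get(keyword, 0))  # in (0, 1]
    return attention * survival

def recommend(candidates, popularity, competitors, k=2):
    """Rank ontology-derived alternative keywords by the score."""
    return sorted(candidates,
                  key=lambda kw: attention_survival(kw, popularity, competitors),
                  reverse=True)[:k]

# Toy data: search popularity per keyword, and papers already using it.
popularity = {"deep learning": 100, "neural networks": 80, "connectionism": 20}
competitors = {"deep learning": 500, "neural networks": 120, "connectionism": 5}
print(recommend(["deep learning", "neural networks", "connectionism"],
                popularity, competitors))
```

With these toy numbers, the less crowded keyword "connectionism" outranks the highly popular but heavily contested "deep learning", which is the trade-off the Attention-Survival score is meant to capture.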
scite is a Brooklyn-based organization that helps researchers better discover and understand research articles through Smart Citations: citations that display the context of the citation and describe whether the article provides supporting or contrasting evidence. scite is used by students and researchers from around the world and is funded in part by the National Science Foundation and the National Institute on Drug Abuse of the National Institutes of Health.