This article summarizes findings from studies that employed electronic mail (e-mail) for conducting in-depth interviewing. It discusses the benefits of, and the challenges associated with, using e-mail interviewing in qualitative research. The article concludes that while a mixed-mode interviewing strategy should be considered when possible, e-mail interviewing can in many cases be a viable alternative to face-to-face and telephone interviewing. A list of recommendations for carrying out effective e-mail interviews is presented.
This paper revises David Ellis's information-seeking behavior model of social scientists, which includes six generic features: starting, chaining, browsing, differentiating, monitoring, and extracting. The paper uses social science faculty researching stateless nations as the study population. The description and analysis of the information-seeking behavior of this group of scholars is based on data collected through structured and semistructured electronic mail interviews. Sixty faculty members from 14 different countries were interviewed by e-mail. As a reality check, face-to-face interviews with five faculty members were also conducted. Although the study confirmed Ellis's model, it found that a fuller description of the information-seeking process of social scientists studying stateless nations should include four additional features besides those identified by Ellis. These new features are: accessing, networking, verifying, and information managing. Accordingly, the study develops a new model, which, unlike Ellis's, groups all the features into four interrelated stages: searching, accessing, processing, and ending. This new model is fully described and its implications for research and practice are discussed. How and why the scholars studied here differ from other academic social scientists is also discussed.
The authors apply a new bibliometric measure, the h-index (Hirsch, 2005), to the literature of information science. Faculty rankings based on raw citation counts are compared with those based on h-counts. There is a strong positive correlation between the two sets of rankings. It is shown how the h-index can be used to express the broad impact of a scholar's research output over time in a more nuanced fashion than straight citation counts.
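The h-index mentioned above has a simple operational definition: a scholar has index h if h of their papers have at least h citations each. A minimal sketch of that calculation (the function name and example citation counts are illustrative, not drawn from the study):

```python
def h_index(citations):
    """Return the largest h such that at least h papers
    have h or more citations each (Hirsch, 2005)."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # paper at this rank still has enough citations
        else:
            break
    return h

# Five papers cited 10, 8, 5, 4, and 3 times:
# four papers have at least 4 citations, but not five with at least 5.
print(h_index([10, 8, 5, 4, 3]))  # prints 4
```

This illustrates why the h-index is less sensitive to a single highly cited paper than a raw citation count: the 10-citation paper contributes no more to h than a 4-citation paper at the same rank.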
It is a sobering fact that some 90% of papers that have been published in academic journals are never cited. Indeed, as many as 50% of papers are never read by anyone other than their authors, referees and journal editors. We know this thanks to citation analysis, a branch of information science in which researchers study the way articles in a scholarly field are accessed and referenced by others (see box 1). Citation analysis is, however, about much more than producing shock statistics. Along with peer review, it has over the past three decades been increasingly used to judge and quantify the importance of scientists and scientific research. Citation analysis is also the machinery behind journal "impact factors", figures of merit that researchers take note of when deciding which journal to submit their work to so that it is read as widely as possible. Indeed, the output from citation studies is often the only way that non-specialists in governments and funding bodies, or even those in different scientific disciplines, can judge the importance of a piece of scientific research.
This study examines the differences between Scopus and Web of Science in the citation counting, citation ranking, and h-index of 22 top human-computer interaction (HCI) researchers from EQUATOR, a large British Interdisciplinary Research Collaboration project. Results indicate that Scopus provides significantly more coverage of HCI literature than Web of Science, primarily due to coverage of relevant ACM and IEEE peer-reviewed conference proceedings. No significant differences exist between the two databases if citations in journals only are compared. Although broader coverage of the literature does not significantly alter the relative citation ranking of individual researchers, Scopus helps distinguish between the researchers in a more nuanced fashion than Web of Science in both citation counting and h-index. Scopus also generates significantly different maps of citation networks of individual scholars than those generated by Web of Science. The study also presents a comparison of h-index scores based on Google Scholar with those based on the union of Scopus and Web of Science. The study concludes that Scopus can be used as a sole data source for citation-based research and evaluation in HCI, especially when citations in conference proceedings are sought, and that h scores should be manually calculated instead of relying on system calculations.
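The recommendation above to calculate h scores manually, rather than trusting one database's built-in figure, can be sketched as follows. This is a simplified illustration (the dictionaries keyed by DOI and the max-per-paper merge are assumptions, not the study's actual procedure; a true union would deduplicate the individual citing documents rather than take the higher count):

```python
def merged_h_index(scopus_counts, wos_counts):
    """Combine per-paper citation counts from two databases
    (keyed here by a hypothetical DOI), keeping the higher count
    when a paper appears in both, then compute h by hand."""
    merged = dict(scopus_counts)
    for doi, cites in wos_counts.items():
        merged[doi] = max(merged.get(doi, 0), cites)

    # Manual h-index over the merged counts.
    counts = sorted(merged.values(), reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Illustrative counts only: one paper indexed in both databases,
# one found only in Scopus, one only in Web of Science.
scopus = {"doi:10.1000/a": 12, "doi:10.1000/b": 3}
wos = {"doi:10.1000/a": 9, "doi:10.1000/c": 5}
print(merged_h_index(scopus, wos))  # prints 3
```

The point the sketch makes is the study's: a researcher's h score computed over either database alone can differ from the score over their combined coverage, so cross-database comparisons require recomputing h on the merged record set.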
The authors describe a large-scale, longitudinal citation analysis of intellectual trading between information studies and cognate disciplines. The results of their investigation reveal the extent to which information studies draws on and, in turn, contributes to the ideational substrates of other academic domains. Their data show that the field has become a more successful exporter of ideas as well as less introverted than was previously the case. In the last decade, information studies has begun to contribute significantly to the literatures of such disciplines as computer science and engineering on the one hand and business and management on the other, while also drawing more heavily on those same literatures.
scite is a Brooklyn-based organization that helps researchers better discover and understand research articles through Smart Citations: citations that display the context of the citation and describe whether the article provides supporting or contrasting evidence. scite is used by students and researchers from around the world and is funded in part by the National Science Foundation and the National Institute on Drug Abuse of the National Institutes of Health.