The tremendous amount of information available online has resulted in considerable research on information and source credibility. The vast majority of scholars, however, assume that individuals work in isolation to form credibility opinions and that people must assess information credibility in an effortful and time-consuming manner. Focus group data from 109 participants were used to examine these assumptions. Results show that most users rely on others to make credibility assessments, often through the use of group-based tools. Results also indicate that rather than systematically processing information, participants routinely invoked cognitive heuristics to evaluate the credibility of information and sources online. These findings are leveraged to suggest a number of avenues for further credibility theorizing, research, and practice.

Although research in the last decade has provided significant insight into how people assess the credibility of online information and sources, much of it has proceeded from dual assumptions that are currently untenable. Specifically, the majority of research almost exclusively considers individuals as making credibility judgments in isolation from one another, thereby ignoring more social means and tools of credibility evaluation. In addition, scholars have largely presumed that individuals must evaluate information in a cognitively effortful fashion to arrive at credibility judgments and have thus neglected the ways in which more heuristic evaluative strategies may be invoked to form credibility assessments.

This study extends conceptualizations of credibility assessment to incorporate new realities of the Web environment, as well as new theorizing in the areas of information processing and evaluation. Specifically, we suggest that recent sociotechnical developments offer new means for social- and group-based information evaluation and credibility assessment that have been entirely ignored by existing credibility research.
In addition, mounting evidence in cognitive science and psychology suggests that, particularly within information-abundant environments such as the Web, people rely on cognitive heuristics rather than effortful, systematic evaluation when judging credibility.
This article summarizes much of what is known from the communication and information literacy fields about the skills that Internet users need to assess the credibility of online information. The article reviews current recommendations for credibility assessment and empirical research on how users determine the credibility of Internet information, and describes several cognitive models of online information evaluation. Based on this literature review and a critique of existing models of credibility assessment, recommendations for future online credibility education and practice are provided to assist users in locating reliable information online. The article concludes by offering ideas for research and theory development on this topic in an effort to advance knowledge in the area of credibility assessment of Internet-based information.
People increasingly rely on Internet and web-based information despite evidence that it is potentially inaccurate and biased. Therefore, this study sought to assess people's perceptions of the credibility of various categories of Internet information compared to similar information provided by other media. The 1,041 respondents were also asked whether they verified the Internet information they obtained. Overall, respondents reported that they considered Internet information to be as credible as that obtained from television, radio, and magazines, but not as credible as newspaper information. Credibility of the types of information sought, such as news and entertainment, varied across media channels. Respondents said they rarely verified web-based information, although this too varied by the type of information sought. Level of experience and how respondents perceived the credibility of information were related to whether they verified information. This study explores the social relevance of these findings and discusses them in terms of theoretical knowledge of advanced communication technologies.
Data from 574 participants were used to assess perceptions of message, site, and sponsor credibility across four genres of websites; to explore the extent and effects of verifying web-based information; and to measure the relative influence of sponsor familiarity and site attributes on perceived credibility. The results show that perceptions of credibility differed, such that news organization websites were rated highest and personal websites lowest, in terms of message, sponsor, and overall site credibility, with e-commerce and special interest sites rated between these, for the most part. The results also indicated that credibility assessments appear to be primarily due to website attributes (e.g. design features, depth of content, site complexity) rather than to familiarity with website sponsors. Finally, there was a negative relationship between self-reported and observed information verification behavior and a positive relationship between self-reported verification and internet/web experience. The findings are used to inform the theoretical development of perceived web credibility.
The proliferation of information sources as a result of networked computers and other interconnected devices has prompted significant changes in the amount, availability, and nature of geographic information. Among the more significant changes is the increasing amount of readily available volunteered geographic information. Although volunteered information has fundamentally enhanced geographic data, it has also prompted concerns with regard to its quality, reliability, and overall value. This essay situates these concerns as issues of information and source credibility by (a) examining the information environment fostering collective information contribution, (b) exploring the environment of information abundance, examining credibility and related notions within this environment, and leveraging extant research findings to understand user-generated geographic information, (c) articulating strategies to discern the credibility of volunteered geographic information (VGI), including relevant tools useful in this endeavor, and (d) outlining specific research questions germane to VGI and credibility.