2016
DOI: 10.1177/2053168016654326

Exploring the difference in participants’ factual knowledge between online and in-person survey modes

Abstract: Over the past decade, an increasing number of scholars and professionals have turned to the Internet to gather samples of subjects for research ranging from public opinion surveys to experiments in the social sciences. While there has been a focus on whether online samples are representative and accurate, fewer studies examine the behavioral differences between individuals who participate in surveys and experiments on a computer versus in person. Here, I use an experiment to gauge whether respondents who self-…

Cited by 13 publications (5 citation statements) · References 21 publications
“…As surveys are increasingly administered online, however, respondents are able to search the web for answers, potentially altering what scholars are measuring with factual knowledge questions. A recent body of research demonstrates that outside search occurs and can affect the estimated levels of political knowledge among the public (Burnett 2016; Clifford and Jerit 2014, 2016; Motta, Callaghan, and Smith 2016; Shulman and Boster 2014), but that literature has yet to establish how search behavior affects the validity of knowledge measures. Across experimental and observational studies, we find a consistent pattern of results—namely, that search engine use reduces the validity of political knowledge measures and undermines the ability to replicate canonical findings in the public opinion literature.…”
Section: Results
confidence: 99%
“…Outside search has been detected in several participant populations with search rates reaching as high as 41 percent. Evidence of search engine use comes from a variety of sources, including studies showing higher factual knowledge scores in the online condition of randomized mode experiments (Burnett 2016; Clifford and Jerit 2014), lower scores when respondents are randomly assigned instructions not to search (Clifford and Jerit 2016; Motta, Callaghan, and Smith 2016; Vezzoni and Ladini 2017), correct answers to extremely difficult and obscure open-ended questions (“catch” questions; Motta, Callaghan, and Smith 2016), and the outright admission of searching (Clifford and Jerit 2016; Jensen and Thomsen 2014). Outside search is common even in high-quality surveys that have explicitly instructed respondents not to look up answers.…”
confidence: 99%
“…The potential impact of cheating in self-administered surveys on the validity of the derived measures has garnered substantial attention. A stream of this research has relied on comparisons between survey modes and sample types (Ansolabehere and Schaffner, 2014; Burnett, 2016; Shulman and Boster, 2014; Strabac and Aalberg, 2011). These performance comparisons do not allow one to untangle the provenance of these observed score differences, i.e.…”
Section: A. Previous Evidence on Cheating Online
confidence: 99%
“…This has become increasingly difficult due to the widespread use of technological devices and the omnipresence of internet search engines. Recent scholarship has demonstrated that, due to respondent dishonesty, responses to factual knowledge items in self-administered surveys cannot fully be trusted [3][4][5][6]. However, conducting face-to-face survey interviews is much more costly and time-consuming than, for example, web-based surveys.…”
Section: Introduction
confidence: 99%