2017
DOI: 10.1177/0894439317710450
Investigating the Adequacy of Response Time Outlier Definitions in Computer-Based Web Surveys Using Paradata SurveyFocus

Abstract: Web surveys are commonly used in social research because they are usually cheaper, faster, and simpler to conduct than other modes. They also enable researchers to capture paradata such as response times. In particular, determining proper cutoff values to define outliers in response time analyses has proven to be an intricate challenge; to a certain degree, researchers set these values arbitrarily. In this study, we use "SurveyFocus (SF)"-a paradata tool that records the activity of the web-survey page…
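The abstract notes that outlier cutoffs in response time analyses are often set arbitrarily. A minimal sketch of one commonly used (but still arbitrary) rule illustrates the point: flag log-transformed response times that fall more than k scaled median absolute deviations from the median. The function name, the data, and the cutoff k=3 are illustrative assumptions, not the article's method.

```python
import math
import statistics

def flag_rt_outliers(times_ms, k=3.0):
    """Flag log response times more than k scaled median absolute
    deviations (MAD) from the median -- one common, but ultimately
    arbitrary, outlier rule."""
    logs = [math.log(t) for t in times_ms]
    med = statistics.median(logs)
    mad = statistics.median([abs(x - med) for x in logs])
    threshold = k * 1.4826 * mad  # 1.4826 rescales MAD to SD under normality
    return [abs(x - med) > threshold for x in logs]

# Hypothetical per-question response times in milliseconds
times = [1200, 1500, 1100, 900, 45000, 1300]
print(flag_rt_outliers(times))  # only the 45-second response is flagged
```

A MAD-based rule is shown here rather than a mean/SD rule because with small samples a single extreme value inflates the standard deviation enough that it can mask itself; the choice of k, however, remains exactly the kind of arbitrary decision the article investigates.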

Cited by 35 publications (41 citation statements) | References 8 publications
“…Respondents in the two distraction conditions felt significantly more distracted and were less able to pay attention to the survey. The study, however, found no significant effect of distraction for any of the data quality and attentiveness measures, replicating findings from previous observational studies (Aizpurua et al., 2018; Antoun et al., 2017; Höhne & Schlosser, 2018; Kennedy & Everett, 2011; Lavrakas et al., 2010; Lynn & Kaminska, 2012; Schober et al., 2015; Sendelbah et al., 2016). A significant interaction effect was found for response times: The music treatment had a positive effect on survey duration for PC respondents but a negative effect for tablet respondents.…”
Section: Discussion (supporting)
confidence: 84%
“…They recorded focus-out events that capture whether respondents opened another browser window during survey completion as well as question-level response times. The studies found a significant effect of focus-out events on item nonresponse but no effect on straight-lining (Höhne & Schlosser, 2018; Sendelbah et al., 2016).…”
mentioning
confidence: 94%
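Focus-out paradata like those described above are typically stored as time-stamped event logs that are later aggregated per respondent. A minimal sketch of that aggregation step, assuming a hypothetical log format (the tuple layout and respondent IDs are illustrative, not any study's actual schema):

```python
from collections import Counter

# Hypothetical paradata log: (respondent_id, page, event, timestamp_ms)
events = [
    ("r1", 1, "focusout", 12_400),
    ("r1", 1, "focusin", 18_900),
    ("r1", 3, "focusout", 55_000),
    ("r2", 2, "focusout", 30_100),
]

# Count focus-out events per respondent, e.g. to relate them
# to item nonresponse in a later analysis
focus_outs = Counter(rid for rid, _, ev, _ in events if ev == "focusout")
print(focus_outs)  # Counter({'r1': 2, 'r2': 1})
```

Respondent-level counts like these are what the cited studies relate to data quality indicators such as item nonresponse.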
“…Paradata of the subcategory 'answer' can, for instance, be data from computer-assisted personal interviewing (e.g., Couper and Kreuter 2013), and strategies such as the "four-screens-per-question technique" (e.g., Mayerl 2013) allow researchers to measure response latencies in interviews. For item batteries, response times are either not analyzed at all (e.g., Yan and Tourangeau 2008), the total time per page is used (Malhotra 2008; Mavletova and Couper 2016; Höhne and Schlosser 2018), or the completion time for the whole instrument is analyzed (e.g., Liu and Cernat 2016). Only some exceptions investigate time differences between questions of item batteries (e.g., Zhang and Conrad 2013).…”
Section: Answer (mentioning)
confidence: 99%
“…In their most granular form, response phase paradata are collected at the action level (for example, keystrokes, mouse movements, clicks, or touch events) and are aggregated to the page, session, and respondent levels. Examples of frequently analyzed response phase paradata include response times (Callegaro, Yang, Bhola, Dillman, & Chin, 2009; Gummer & Roßmann, 2015; Heerwegh, 2003; Höhne & Schlosser, 2018; Malhotra, 2008; Yan & Tourangeau, 2008; Zhang & Conrad, 2014); device, browser, operating system, and screen dimensions with which the survey is taken (e.g., Callegaro, 2013a; Couper, Antoun, & Mavletova, 2017; de Bruijne & Wijnant, 2013; Keusch & Yan, 2017; Mavletova, 2013; Mavletova & Couper, 2016; Wells, Bailey, & Link, 2014), also relevant to the access phase; and touch events, mouse movements, or mouse clicks, where captured in a production survey and not just a pilot or laboratory study (e.g., Heerwegh, 2002, 2003; Smyth, Dillman, Christian, & Stern, 2006; Stern, 2008; Stieger & Reips, 2010; Horwitz et al., 2017). Many other examples exist; for example, scrolling movements (Couper & Peterson, 2016) and pinch and zoom events have emerged as useful given the growing use of mobile devices for survey taking (Couper et al., 2017).…”
Section: A Typology of Web Survey Paradata for Assessing TSE (mentioning)
confidence: 99%
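The aggregation from action-level events up to page-level response times described above can be sketched as follows; the tuple layout and field names are assumptions for illustration, not a standard paradata schema. Here a page-level response time is taken as the span between the first and last recorded action on that page.

```python
# Hypothetical action-level paradata: (respondent, page, action, timestamp_ms)
actions = [
    ("r1", 1, "click", 1_000),
    ("r1", 1, "keystroke", 4_200),
    ("r1", 2, "click", 9_000),
    ("r1", 2, "click", 15_500),
]

# Track the earliest and latest action timestamp per (respondent, page)
page_spans = {}
for rid, page, _, ts in actions:
    lo, hi = page_spans.get((rid, page), (ts, ts))
    page_spans[(rid, page)] = (min(lo, ts), max(hi, ts))

# Page-level response time = last action minus first action on the page
durations = {key: hi - lo for key, (lo, hi) in page_spans.items()}
print(durations)  # {('r1', 1): 3200, ('r1', 2): 6500}
```

The same pass generalizes to session- or respondent-level aggregates by changing the grouping key.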