2014
DOI: 10.1002/ir.20069

Survey Page Length and Progress Indicators: What Are Their Relationships to Item Nonresponse?

Abstract: The popularity of online student surveys has been associated with greater item nonresponse. This chapter presents research aimed at exploring what factors might help minimize item nonresponse, such as altering online survey page length and using progress indicators.

Cited by 6 publications (4 citation statements) · References 15 publications
“…Future testing should aim to collect evaluation data online, that is, an online survey would enable respondents to complete the survey at their convenience, also allowing for the collection of detailed demographic and outcome data. Furthermore, while missing data were not an issue in this feasibility study, longer surveys pose a greater risk for missing data (Galesic & Bosnjak, 2009; Sarraf & Tukibayeva, 2014). Online data collection tools have capabilities that can minimize nonresponse bias (Albaum, Roster, Wiley, Rossiter, & Smith, 2010) and, therefore, will be incorporated into future testing phases.…”
Section: Discussion
confidence: 92%
“…These qualities of web-survey tools can make them an ideal option for the study of sensitive topics, and sociological research thrives on the investigation of such topics. However, a crucial limitation of web surveys is that they generally produce relatively low response rates, potentially increasing the threat of nonresponse bias (Dugan et al 2015; Sarraf and Cole 2014). Improving response rates does not necessarily eliminate the possibility of nonresponse bias, but the systematic application of a different protocol in the second phase of a responsive survey design draws in respondents who themselves are systematically different than those in the first phase (Axinn et al 2011; Peytchev et al 2009).…”
Section: Discussion
confidence: 99%
“…Surveys of individual college students have suffered from declining response rates (Jans and Roman 2007), with recent large national multi-institutional studies obtaining response rates around 30 percent (Dugan, Turman, and Torrez 2015; Krebs et al 2016; Sarraf and Cole 2014). Krebs et al (2016) conducted a similar survey at several schools and produced average response rates of 54 percent for females and 40 percent for males.…”
Section: Conceptual Framework
confidence: 99%
“…One area of concentrated work investigated the use of progress indicators in online surveys (Villar, Callegaro, & Yang, 2013; see also Conrad, Couper, Tourangeau, & Peytchev, 2010; Matzat, Snijders, & van der Horst, 2009; and Sarraf & Tukibayeva, 2014). The emphasis was on the effect of alternative progress-indicator designs in reducing "drop-off rates," that is, minimizing the probability that a respondent fails to complete a survey after starting it.…”
Section: Background and Hypothesis Development
confidence: 99%