Several authors and software vendors advocate the benefits of auto forwarding in web surveys, but there is little empirical research on this approach. We experimentally tested automatic forwarding (AF) versus manual forwarding (MF) under different levels of cognitive effort. We manipulated information accessibility (IA; low vs. high) and consistency requirements (CRs; yes vs. no), along with AF versus MF, in two studies conducted among students in Finland. We find that an AF survey takes less time to complete, but only for respondents completing the survey on personal computers or tablets; no time advantage is found for smartphone users. We also find that respondents in both the AF and MF conditions return more often to items with higher cognitive burden (low IA or a CR). MF respondents change answers more often than AF respondents, and AF appears to reduce straightlining slightly. We find no difference between AF and MF in response consistency for two behavioral items, but a slight advantage for AF for two attitude items. Finally, respondents reported more positive experiences with the AF version. Auto forwarding appears to be somewhat more efficient and easier to use but may decrease the quality of responses to cognitively demanding questions.
We examine satisficing respondent behavior and participants' cognitive load in web survey interfaces that apply automatic forwarding (AF) or manual forwarding (MF) to move respondents to the next item. We create a theoretical framework based on Cognitive Load Theory (CLT), the Cognitive Theory of Multimedia Learning (CTML), and Survey Satisficing Theory, also taking into account the latest findings of cognitive neuroscience. We develop a new method to measure satisficing responding in web surveys. We argue that the cognitive response process in web surveys should be interpreted starting at the level of sensory memory rather than at the level of working memory. This approach allows researchers to analyze the accumulation of cognitive load across the questionnaire based on observed or hypothesized eye movements, taking into account the interface design of the web survey. We find that MF reduces both the average item-level response time and the standard deviation of item-level response times. This supports our hypothesis that the MF interface, as a more complex design including previous and next buttons, increases satisficing responding while also generating a higher total cognitive load for respondents. The findings reinforce the view in HCI that reducing the complexity of interfaces and the presence of extraneous elements reduces cognitive load and helps concentrate cognitive resources on the task at hand. It should be noted that the evidence is based on a relatively short survey among university students; replication in other settings is recommended.