We carried out an experiment that compared telephone and Web versions of a questionnaire that assessed attitudes toward science and knowledge of basic scientific facts. Members of a random digit dial (RDD) sample were initially contacted by telephone and answered a few screening questions, including one that asked whether they had Internet access. Those with Internet access were randomly assigned to complete either a Web version of the questionnaire or a computer-assisted telephone interview. There were four main findings. First, although we offered cases assigned to the Web survey a larger incentive, fewer of them completed the online questionnaire; almost all those who were assigned to the telephone condition completed the interview. The two samples of Web users nonetheless had similar demographic characteristics. Second, the Web survey produced less item nonresponse than the telephone survey. The Web questionnaire prompted respondents when they left an item blank, whereas the telephone interviewers accepted "no opinion" answers without probing them. Third, Web respondents gave less differentiated answers to batteries of attitude items than their telephone counterparts. The Web questionnaire presented these items in a grid that may have made their similarity more salient.
Leaving the interpretation of words up to participants in standardized survey interviews, aptitude tests, and experiment instructions can lead to unintended interpretations; more collaborative interviewing methods can promote uniform understanding. In two laboratory studies (a factorial experiment and a more naturalistic investigation), respondents in strictly standardized interviews, where interpretation was left entirely up to them, interpreted ordinary survey concepts like "household furniture" and "living in a house" quite differently than intended. Comprehension was more accurate when interviewers responded to requests for clarification with non-standardized paraphrased definitions, and most accurate when interviewers also provided clarification whenever they suspected respondents needed it.