2015
DOI: 10.3233/sji-150908
Interviewer effects in real and falsified interviews: Results from a large scale experiment

Cited by 11 publications (12 citation statements)
References 13 publications
“…Kreuter et al (2011) report that payment per hour did not alter interviewer effects associated with responses to filter questions, compared to payment per completed interview. By contrast, Winker et al (2015) found that experimentally and randomly varied payment schemes (payment per hour vs. per completed interviews) had an effect on the interview data (e.g., interview durations).…”
Section: Introduction (mentioning)
confidence: 79%
“…[18]). The database in this paper consists of three datasets, obtained in the summer of 2011, and is described in detail by Kemper and Menold [19] as well as by Menold and Kemper [7]:…”
Section: Database and Methods (mentioning)
confidence: 99%
“…As a result, some of the sociodemographic variables, like age or education, show only small variances (cp. [18]). …”
Section: Database and Methods (mentioning)
confidence: 99%
“…As long as data quality issues do not become public, it appears rational for the field agency not to discuss them with the researcher or to invest effort in repeating (parts of) the survey at its own cost. It is also rational to pay interviewers per completed interview, although it is known that payment per hour reduces the risk of deviant behavior and improves the quality of collected data, as do careful training and supervision of interviewers [16,25,33].…”
Section: Incentives (mentioning)
confidence: 99%
“…Thus, for a full picture of deviant behavior, which might be used to improve and evaluate detection methods, access to many more real datasets with identified deviant behavior is required. When it comes to evaluating the performance of specific detection methods for specific types of deviant behavior, experimental and simulated datasets might be used to complement the findings from real data [3,31,33].…”
Section: Documentation and Reputation (mentioning)
confidence: 99%