2020
DOI: 10.3758/s13428-020-01401-8
A little garbage in, lots of garbage out: Assessing the impact of careless responding in personality survey data

Abstract: In self-report surveys, it is common that some individuals do not pay enough attention and effort to give valid responses. Our aim was to investigate the extent to which careless and insufficient effort responding contributes to the biasing of data. We performed analyses of dimensionality, internal structure, and data reliability of four personality scales (extroversion, conscientiousness, stability, and dispositional optimism) in two independent samples. In order to identify careless/insufficient effort (C/IE…

Cited by 91 publications (127 citation statements)
References 59 publications
“…Concretely, if the same participants who engage in C/IE responding on surveys (and who therefore inaccurately report high levels of psychiatric symptoms) also respond with insufficient effort on behavioral tasks, this can cause experimenters to observe an entirely spurious correlation between greater symptom severity and worse task performance (see Figure 1). A similar effect has been well documented in personality psychology, where the presence of C/IE responding can induce correlations between questionnaires and bias estimated factors in factor analysis [8, 10, 14-16].…”
Section: Introduction (supporting)
confidence: 58%
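The mechanism described above — careless respondents who simultaneously inflate symptom reports and depress task performance, inducing a correlation that does not exist among attentive respondents — can be demonstrated with a short simulation. This sketch is illustrative only (it is not from the cited paper); all sample sizes and distribution parameters are assumptions chosen to make the effect visible.

```python
import numpy as np

rng = np.random.default_rng(0)

# Attentive respondents: symptom scores and task accuracy are independent,
# so their true correlation is zero.
n_attentive, n_careless = 450, 50
symptoms_a = rng.normal(10, 2, n_attentive)
accuracy_a = rng.normal(0.85, 0.05, n_attentive)

# Careless respondents: inflated symptom reports AND poor task performance,
# both driven by low effort rather than any true association.
symptoms_c = rng.normal(16, 2, n_careless)
accuracy_c = rng.normal(0.55, 0.05, n_careless)

symptoms = np.concatenate([symptoms_a, symptoms_c])
accuracy = np.concatenate([accuracy_a, accuracy_c])

r_clean = np.corrcoef(symptoms_a, accuracy_a)[0, 1]  # near zero
r_mixed = np.corrcoef(symptoms, accuracy)[0, 1]      # strongly negative
print(f"attentive only: r = {r_clean:.2f}")
print(f"with careless:  r = {r_mixed:.2f}")
```

Mixing in just 10% careless respondents turns a null association into a sizable negative correlation — precisely the "spurious correlation" the quoted passage warns about.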
“…Modeling strategies based on confirmatory factor analysis have traditionally been used to account for this response bias, but they have recently come under scrutiny due to their incorrect assumption of population homogeneity, their inability to recover uncontaminated person scores or preserve structural validities, and their inherent ambiguity. Recently, two constrained factor mixture analysis (FMA) models have been proposed by Arias et al. (2020) and Steinmann et al. (2021) that can be used to identify and screen inconsistent response profiles. While these methods have shown promise, tests of their performance have been limited and they have not been directly compared.…”
mentioning
confidence: 99%
“…The results indicated that removing the inconsistent respondents identified by both FMAs (≈8%) reduced the amount of wording effects in the database. However, whereas the Steinmann et al. (2021) method cleaned the data only partially, the Arias et al. (2020) method was able to remove the great majority of the wording effects variance. Based on the data screened with the Arias et al. method, we evaluated the psychometric properties of the RSES for the Dominican population, and the results indicated that the scores had good validity and reliability properties.…”
mentioning
confidence: 99%
“…Rather, a substantial proportion of survey respondents adopt cognitive shortcuts, for example, by using specific response patterns such as selecting the same response category for multiple items instead of evaluating the actual item content (Johnson, 2005). Particularly in web-based studies, careless responses have become a major hindrance to data quality (Bowling & Huang, 2018; Weiner & Dalessio, 2006) that can have a substantial impact on the study results (Arias et al., 2020; Huang et al., 2015). Accordingly, various data screening methods have been proposed to identify careless respondents (Meade & Craig, 2012; Niessen et al., 2016), such as probing items that directly assess test-taking behavior (e.g., bogus items), auxiliary or paradata (e.g., response times), or data-driven techniques (e.g., Mahalanobis distance).…”
Section: Using Stochastic Gradient Boosting (mentioning)
confidence: 99%
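Two of the data-driven screening techniques named above — flagging long runs of identical responses (a "longstring" index, which catches the same-response-category pattern described by Johnson, 2005) and flagging multivariate outliers via Mahalanobis distance — can be sketched in a few lines. This is a minimal illustration on simulated Likert data, not an implementation from any of the cited papers; the sample sizes, cutoffs, and function names are assumptions.

```python
import numpy as np

def longstring(row):
    """Length of the longest run of identical responses in one response vector."""
    best = run = 1
    for prev, cur in zip(row, row[1:]):
        run = run + 1 if cur == prev else 1
        best = max(best, run)
    return best

def mahalanobis_sq(data):
    """Squared Mahalanobis distance of each row from the sample centroid."""
    centered = data - data.mean(axis=0)
    inv_cov = np.linalg.inv(np.cov(data, rowvar=False))
    return np.einsum("ij,jk,ik->i", centered, inv_cov, centered)

rng = np.random.default_rng(1)
# 200 simulated respondents on a 20-item 5-point Likert scale...
data = np.clip(np.round(rng.normal(3, 1, (200, 20))), 1, 5)
data[:5] = 5  # ...the first five of whom straight-line "5" on every item

ls = np.array([longstring(row) for row in data])
md = mahalanobis_sq(data)
# Flag straight-liners and the most extreme multivariate outliers
# (the 97.5% quantile cutoff here is an arbitrary illustrative choice).
flagged = (ls == data.shape[1]) | (md > np.quantile(md, 0.975))
```

In practice, such indices are typically used together rather than in isolation, since each is sensitive to a different style of careless responding (straight-lining versus random responding).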
“…The phenomenon of careless responding has been described under different labels and with different nuances: "content nonresponsivity" (Nichols et al., 1989, p. 239), "careless inattentiveness" (Johnson, 2005, p. 104), "insufficient effort responding" (Huang et al., 2012, p. 100), or "careless/insufficient effort" (Arias et al., 2020). Most commonly, it is considered a bias in survey responses given without regard to the actual item content.…”
Section: Types and Sources of Careless Responding (mentioning)
confidence: 99%