2017
DOI: 10.1080/08959285.2017.1403441
An Information-Processing-Based Conceptual Framework of the Effects of Unproctored Internet-Based Testing Devices on Scores on Employment-Related Assessments and Tests

Cited by 25 publications (35 citation statements). References 69 publications.
“…Morelli, Potosky, Arthur, and Tippins (2017) note that "reactive equivalence" studies comparing assessment modes are not theoretically (or even practically) informative as they do not address why mode differences occur or the reasons for the construct non-equivalence. Frameworks such as Potosky's (2008) and Arthur, Keiser, and Doverspike's (2018) indicate ways in which different assessment delivery devices might differ; these need to be expanded and tested to more systematically understand whether and why more efficient delivery of assessments might be a more rather than less accurate way to measure (see Apers & Derous, 2017, for an example in the context of resume screening). Morelli et al (2017) make a strong case for better theory-based predictions that go beyond considering variance associated with technology use as "construct irrelevant."…”
Section: Effects of Efficiency (citation type: mentioning)
Confidence: 99%
“…Whereas Potosky (2008) framed the technological attributes that affect the communication quality between two parties, the structural characteristics and information processing (SCIP) framework (Arthur, Keiser, & Doverspike, 2017) is a conceptualization of technology that assumes structural attributes affect the assessee's cognition. Specifically, this model explains how an unproctored Internet test's (UIT) device type introduces construct-irrelevant variance by affecting the respondents' cognitive load.…”
Section: Arthur, Keiser, and Doverspike's (2017) SCIP Framework (citation type: mentioning)
Confidence: 99%
“…Specifically, this model explains how an unproctored Internet test's (UIT) device type introduces construct-irrelevant variance by affecting the respondents' cognitive load. Arthur, Keiser, et al (2017) preface the development of their model with a literature review of the 23 published and unpublished studies on UIT device types. The review's summary of the literature's findings indicated that although mobile and nonmobile cognitive and noncognitive assessments do not differ in terms of psychometric properties, such as factor structure, the reliability of scores, and differential item functioning, among others, cognitive assessments on mobile devices typically result in lower scores than those on nonmobile devices.…”
Section: Arthur, Keiser, and Doverspike's (2017) SCIP Framework (citation type: mentioning)
Confidence: 99%
“…Industrial-organizational psychology has an opportunity to provide some important theoretical research and empirical data to identify the best practices for companies considering utilizing these technologies, particularly VR technology. The structural characteristics and information processing (SCIP) framework may be particularly useful in this endeavor (Arthur, Keiser, & Doverspike, 2017). There appears to be a wealth of empirical data being generated in other fields (e.g., education, instructional design) that could aid in development of additional theoretical models that could be applied to understanding the role of technology in developing effective training programs.…”
Citation type: mentioning
Confidence: 99%