PsycEXTRA Dataset 2011
DOI: 10.1037/e642002011-001

Accessible Reading Assessments for Students with Disabilities: The Role of Cognitive, Grammatical, Lexical, and Textual/Visual Features

Abstract: The purpose of this study is to examine the characteristics of reading test items that may differentially impede the performance of students with disabilities. By examining the relationship between select item features and performance, the study seeks to inform strategies for increasing the accessibility of reading assessments for individuals from this group. Including students with disabilities in large-scale, statewide assessment and accountability systems, as mandated by the Individuals with Disabilities Education Act…

Cited by 7 publications (23 citation statements)
References 43 publications (56 reference statements)
“…The Patient Reported Outcomes Measurement Information System (PROMIS) Upper Extremity computer adaptive test and short form had the most complex grammar, as most items used the structure, ‘I could [functional task]’; ‘could + verb’ is a complex verb structure that may be difficult for young people with developmental disabilities to understand. 16 This complex structure may be related to the PROMIS Upper Extremity’s incorporation of a recall period.…”
Section: Results
confidence: 99%
“…If a single word was considered unfamiliar or not simple, the whole item received a ‘no’ rating for this feature. For the feature ‘simple grammar,’ verb complexity was based on the National Center for Research on Evaluation, Standards, and Student Testing’s definition: ‘complex verbs are multi-part with a base or main verb and several auxiliaries.’ 16 Sentence length and number of clauses were also considered for ‘complex grammar.’ Raters met to review discrepancies and reach consensus for each feature. When necessary, the second author provided input to resolve discrepancies.…”
Section: Methods
confidence: 99%
“…Item adaptations that are developed to measure the same target and cognitive complexity are derived from Messick's (1989) theory of measurement validity and from Mislevy's (1994) Evidence Centered Design. Abedi and colleagues' (e.g., Abedi et al., 2011; Abedi & Lord, 2001) and Kopriva's (2008a, 2008c) explanations of principled linguistic simplification of text and use of compensatory visuals articulate some of the tenets associated with this type of adaptation, which is developed so as not to alter the content and depth of knowledge. Using cognitive load theory, with an emphasis on removing extraneous cognitive load from the test-taking process, scholars at Vanderbilt University articulated a sequence of steps in the item modification process (e.g., …).…”
Section: Conceptualization of Access
confidence: 95%
“…However, although this may reflect a more quantitative approach, it may not fully explain why student performance changes or does not change with adaptations to different features of an item. To gain more explanatory results on the effects of item changes, researchers may also utilize a panel of experts to evaluate test items according to several characteristics such as item grammar and vocabulary (Abedi et al., 2011) or inclusion of visual representations. These experts may evaluate the items in a "round table" format (Johnstone et al., 2008) or by individually rating items using protocols (Liu & Anderson, 2008).…”
Section: Measuring Access
confidence: 99%