2006
DOI: 10.1177/073724770603100206
Meaningful Assessment of Content-Area Literacy for Youth with and without Disabilities

Abstract: This article discusses methods for evaluating students' content-area literacy skills. Four specific factors that affect content-area literacy are described: vocabulary knowledge, topic knowledge, text structure knowledge, and textbook readability, along with methods of evaluating each of the factors. Most of these methods have not yet been empirically tested to determine how reliable and valid they are for assessment purposes. Teachers of students with and without disabilities, therefore, are encouraged to int…

Cited by 1 publication (1 citation statement)
References 47 publications (44 reference statements)
“…At this point in time, studies in this area have yet to analytically compare static scores for reliability and validity for use as predictors of performance on state or commercial content area assessments. Furthermore, Troia (2006) commented that the majority of these assessments have been "designed with instructional feedback rather than psychometric rigor in mind" (p. 78). More empirical evidence is necessary to determine if content area CBMs demonstrate adequate evidence of alternate form reliability, validity for content area knowledge, and slope estimates regarding use as screening and progress monitoring tools.…”
Section: Content Area CBM Formats and Development
confidence: 99%