2017 | DOI: 10.1177/0265532217720956
What does corpus linguistics have to offer to language assessment?

Abstract: In recent years, continuing advances in technology have increased the capacity to automate the extraction of a range of linguistic features of texts and thus have provided the impetus for the substantial growth of corpus linguistics. While corpus linguistic tools and methods have been used extensively in second language learning research, they have also been used increasingly in the design and validation of language assessments (Callies & Götz, 2015; Deshors, Götz, & Laporte, 2016; Park, 2014). The collection …

Cited by 18 publications (9 citation statements) | References 20 publications
“…Gilquin et al. (2016) investigated the impact of rating factors, such as the frequency and type of discourse markers, and found similar results to those in Hyland and Anan (2006). Xi (2017) explains that raters are often not good at analyzing fine-grained linguistic features, and when using holistic scoring rubrics (e.g., CEFR), it is easy to prioritize overall communicative effectiveness. In other words, even though objective measures of linguistic features have an essential role in scoring, raters' evaluations tend to be influenced by their judgment of the overall effect of communication.…”
Section: Assessing Spoken Learner English Using CEFR
Confidence: 93%
“…Moreover, one important aspect that has not been explicitly discussed in model development and comparison is the different language backgrounds and demographics of students. Although the nature of the current data set hindered us from investigating whether such essay assessments and scoring environments interacted with students’ language backgrounds in terms of automated scoring, this is still an important and practical dimension of consideration when choosing appropriate AES systems in classrooms and evaluating the validity of AES systems (Xi, 2017). For instance, the type of scoring rubric was one of the comparison criteria which indicated that the deep-neural model had better accuracy regardless of types of scoring frameworks.…”
Section: Discussion
Confidence: 99%
“…Besides these methods, an analysis of the linguistic features of writing samples, although rarely used as a follow-up, could be a viable way to investigate the DIF phenomena. Since writing tasks elicit ample linguistic data, the resulting corpus could provide new evidence for validation efforts and studies of fairness (Park, 2014; Xi, 2017). Indeed, it is desirable to take advantage of the advances in corpus linguistics and use corpus-based analysis to evaluate writing DIF.…”
Section: Lack of Methods to Interpret and Explain Gender DIF
Confidence: 99%