The Vocabulary Levels Test has been widely used in language assessment and vocabulary research despite never having been properly validated. This article reports on a study which uses a range of analysis techniques to present validity evidence, and to explore the equivalence of two revised and expanded versions of the Vocabulary Levels Test.
This paper reports on an ESRC-funded study of the levels of knowledge about language of first-year undergraduate student learners of French and the relationship between this metalinguistic knowledge and language proficiency and aptitude. Tests of metalinguistic knowledge, language aptitude and French linguistic proficiency were administered to 509 students. The results show that levels of metalinguistic knowledge vary considerably. However, the relationship between metalinguistic knowledge and language proficiency is weak. Metalinguistic knowledge and language proficiency appear to constitute two separate factors of linguistic ability. Moreover, there was no support for the belief that students with higher levels of metalinguistic knowledge perform better at French, or that they improve their French proficiency at higher rates than other students during university study. Whilst knowledge about language may be worthwhile in its own right, there is no evidence from this study to justify the teaching of metalinguistic knowledge as a means of improving students' linguistic proficiency.
The nature and validation of placement tests is rarely discussed in the language testing literature, yet placement tests are probably among the commonest forms of institutional tests: tests that are not designed by individual teachers and that are used to make decisions across the institution rather than within individual classes. Questions to be asked in the validation and evaluation of any placement test include the following: Does the placement test correctly identify those students who most need English and study skills classes? Do the students who take the test feel that their language has been accurately measured? Is the content of the test appropriate to the uses made of the test? Is the test reliable? This paper reports on an attempt to validate an institutional placement test at Lancaster University. After presenting the results of the study, the paper comments on both the validity and reliability of the test, and on the wider issues that influence how validation studies of placement tests can be carried out.
In this brief article, I discuss the relationship between language testing and the other sub-disciplines of applied linguistics and also the relationship, as I see it, between testing and assessment. The article starts with a brief exploration of the term ‘applied linguistics’ and then goes on to discuss the role of language testing within this discipline, the relationship between testing and teaching, and the relationship between testing and assessment. The second part of the article mentions some areas of current concern to testers and discusses in more detail recent advances in the areas of performance testing, alternative assessment, and computer assessment. One of my aims in this article is to argue that the skills involved in language testing are necessary not only for those constructing all kinds of language proficiency assessments, but also for those other applied linguists who use tests or other elicitation techniques to help them gather language data for research.