Language assessment literacy refers to language instructors' familiarity with fundamental testing concepts and the application of this knowledge to classroom practices in general and to issues of language assessment in particular. While it is widely agreed that classroom teachers need to assess student progress, many teachers and other test users have a limited understanding of assessment fundamentals. To help meet this need, a tutorial for foreign language instructors was developed (CAL, 2009) to describe the basics of language assessment and assist with test selection. In this project, group interviews and surveys were used to elicit feedback on the content of the tutorial from two groups of experts: US language instructors (N = 44) and language testers (N = 30). The results revealed the challenge of including the technical information considered essential by testers while meeting the real and practical needs of teachers. This paper examines efforts to elicit language testers' beliefs about measurement basics, compares them with those of language educators, and suggests that expert beliefs about what is essential to include in such materials differ depending on the expert's perspective.
Although the study abroad homestay context is commonly considered the ideal environment for language learning, host‐student interactions may be limited. The present study explored how language development of students of Spanish, Mandarin, and Russian related to student and host family perspectives on the homestay experience. The study used pretest and posttest Simulated Oral Proficiency Interviews to investigate student oral proficiency gains and surveys to examine beliefs of these students (n = 152) and their hosts (n = 87). Students and families were generally positive about the homestay, with significant variation based on language. A significant relationship was found between students' oral proficiency gains and their being glad to have lived with a host family. Significant correlations were also found between students' language learning satisfaction and their satisfaction with the homestay.
Changes to oral proficiency instruction and assessment in post‐secondary foreign language programs grew out of the proficiency movement of the 1970s and 1980s. The Oral Proficiency Interview (OPI) became the major approach to oral proficiency assessment in the United States. Initially developed for government use, the OPI was originally rated according to the Interagency Language Roundtable Guidelines. Over time, the ACTFL Proficiency Guidelines‐Speaking were developed for use with the OPI in academic settings, particularly at the post‐secondary level. In this paper, we discuss the strengths and limitations of the OPI and identify current controversies related to its use at the post‐secondary level. In addition, we explore new approaches to oral proficiency assessment, including computer‐mediated oral proficiency testing. We also examine the expected proficiency outcomes for foreign language students at different levels, an area that has been little researched. Finally, we recommend ways to increase the formal use of oral proficiency assessment and to establish and publicize realistic expectations of outcomes for programs, instructors, and students.
Since their initial publication in 1982, the ACTFL Guidelines and Oral Proficiency Interview (OPI) have enjoyed widespread use by foreign language educators. They have also been the target of much criticism by researchers of second language acquisition and testing. Much of this criticism has focused on validity claims for the OPI. Other research (e.g., Thompson, 1995) has investigated the interrater reliability of the test. This article provides an overview of research (both critical analyses and empirical studies) conducted on the ACTFL OPI and Guidelines from 1990 to the present. The author identifies trends in this research, discusses lessons learned from research on other proficiency tests, and provides recommendations for areas of future study.