2008
DOI: 10.1121/1.2933816

Effects of hand gesture and lip movements on auditory learning of second language speech sounds

Abstract: Previous research found that auditory training helps native English speakers to perceive phonemic vowel length distinction in Japanese, but that their performance has never reached native levels (Hirata et al., 2007). Given that multimodal information, such as hand gesture and lip movements, influences semantic aspects of language processing and development (Kelly et al., 2002), we examined whether this multimodal information helps to improve native English speakers' ability to perceive Japanese vowel length d…


Publication Types

Select...
2

Relationship

0
2

Authors

Journals

Cited by 2 publications (5 citation statements)
References 10 publications
“…In earlier work on the contribution of visible speech and hand gestures to learning nonnative speech sounds, Kelly, Hirata, et al (2008) argued that lip and mouth movements help in auditory encoding of speech, whereas hand gestures can only help to understand the meaning of words in the speech stream when the auditory signal is correctly encoded. On the basis of their results, Kelly, Hirata, et al (2008) argued that the benefits of multimodal input target different stages of linguistic processing.…”
Section: Discussion
confidence: 99%
“…On the basis of their results, Kelly, Hirata, et al (2008) argued that the benefits of multimodal input target different stages of linguistic processing. Here, mouth movements seem to aid during phonological stages, whereas hand gestures aid during semantic stages, which, according to the authors, fits with McNeill's (1992) interpretation of speech and gesture forming an integrated system during language comprehension.…”
Section: Discussion
confidence: 99%
“…A number of studies in this field demonstrate the significant role of nonverbal communication in general, and of gestures and facial cues in particular, in listening comprehension. Researchers such as Hirata and Kelly (2010), Kruger (2009), Kelly, Hirata, Simester, Burch, Cullings, and Demakakos (2008), Kusanagi (2005), Sueyoshi and Hardison (2005), and Tae (1993) have done valuable work in this area. Kruger (2009) illustrates the noticeable role that nonverbal behavior plays in communication.…”
Section: Introduction
confidence: 99%