This paper presents an experimental study to understand the key differences in neural representations when a subject is presented with speech signals of a known versus an unknown language, and to capture the evolution of neural responses in the brain during a language learning task. In this study, electroencephalography (EEG) signals were recorded while human subjects listened to a given set of words from English (familiar language) and Japanese (unfamiliar language). The subjects also provided behavioural signals in the form of spoken audio for each audio stimulus. To quantify representation-level differences between the auditory stimuli of the two languages, we use a classification approach that discriminates the two languages from the EEG signal recorded during the listening phase, using an offline classifier. These experiments reveal that time-frequency features, along with phase, carry significant language-discriminative information. The language discrimination is further confirmed in a second experiment involving Hindi (the native language of the subjects) and Japanese (unknown language). A detailed analysis is performed on the recorded EEG signals and the audio signals to further understand the language learning process. A pronunciation rating technique applied to the spoken audio data confirms that pronunciation of the Japanese words improves over the course of the trials. Using single-trial analysis, we find that the EEG representations also attain a level of consistency indicating pattern formation. The brain regions responsible for language discrimination and learning are identified from the EEG channel locations and are found to be predominantly in the frontal region.
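The abstract does not specify the exact classification pipeline, but the described approach (time-frequency features from EEG segments fed to an offline two-class language classifier) can be sketched as follows. Everything here is illustrative: the feature parameters, the nearest-centroid classifier, and the synthetic "two-language" signals are assumptions, not the paper's actual method.

```python
import numpy as np
from scipy.signal import spectrogram

def time_frequency_features(eeg, fs=128):
    """Flattened log-power spectrogram of one EEG segment.
    Illustrative stand-in for the paper's time-frequency features."""
    _, _, Sxx = spectrogram(eeg, fs=fs, nperseg=64, noverlap=32)
    return np.log(Sxx + 1e-12).ravel()

class NearestCentroid:
    """Toy two-class classifier standing in for the offline classifier."""
    def fit(self, X, y):
        X, y = np.asarray(X), np.asarray(y)
        self.centroids_ = {c: X[y == c].mean(axis=0) for c in np.unique(y)}
        return self

    def predict(self, X):
        X = np.asarray(X)
        labels = list(self.centroids_)
        # Distance of each sample to each class centroid; pick the nearest.
        d = np.stack([np.linalg.norm(X - self.centroids_[c], axis=1)
                      for c in labels])
        return np.array(labels)[d.argmin(axis=0)]

# Synthetic demo: two "language" conditions simulated as noise with
# different spectral tilt (broadband vs. heavily low-pass filtered).
rng = np.random.default_rng(0)
fs, n = 128, 512

def synth(smooth_len):
    x = rng.standard_normal(n)
    return np.convolve(x, np.ones(smooth_len) / smooth_len, mode="same")

X = ([time_frequency_features(synth(2), fs) for _ in range(20)]
     + [time_frequency_features(synth(8), fs) for _ in range(20)])
y = [0] * 20 + [1] * 20

clf = NearestCentroid().fit(X, y)
acc = (clf.predict(X) == np.array(y)).mean()
```

On this deliberately well-separated synthetic data the classifier separates the two conditions easily; real EEG classification would of course use cross-validated evaluation and stronger models.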
The event-related potential (ERP) of electroencephalography (EEG) signals has been well studied for native-language speech comprehension using semantically matched and mismatched end-words. The presence of a semantic incongruity in the audio stimulus elicits an N400 component in the ERP waveform. However, it is unclear whether these semantic dissimilarity effects in the ERP also appear for foreign-language words learned in a rapid language learning task. In this study, we introduced the semantics of Japanese words to subjects who had no prior exposure to the Japanese language. Following this language learning task, we performed ERP analysis using English sentences of semantically matched and mismatched nature in which the end-words were replaced with their Japanese counterparts. The ERP analysis revealed that, even after a short learning cycle, the semantically matched and mismatched end-words elicited different EEG patterns (as in the native-language case). However, the patterns seen for the newly learnt word stimuli showed a P600 component (delayed, and opposite in polarity, relative to the response seen for the known language). A topographical analysis revealed that the P600 responses were predominantly observed in the parietal region and in the left hemisphere. The absence of an N400 component in this rapid learning task can be considered evidence for its association with long-term memory processing. Further, the ERP waveform for the Japanese end-words, prior to semantic learning, showed a P3a component owing to the subjects' reaction to a novel stimulus. These differences were more pronounced in the centro-parietal scalp electrodes.
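The core ERP workflow implied by this abstract (average time-locked epochs per condition, then compare mean amplitudes in a component window such as 500–700 ms for the P600) can be sketched in a few lines. The sampling rate, window boundaries, trial counts, and the synthetic deflection below are all illustrative assumptions, not the paper's parameters.

```python
import numpy as np

def erp(epochs):
    """Average stimulus-locked EEG epochs (trials x samples) into an ERP."""
    return np.asarray(epochs).mean(axis=0)

def mean_amplitude(erp_wave, fs, t0, t1):
    """Mean ERP amplitude in the window [t0, t1) seconds post-stimulus."""
    return erp_wave[int(t0 * fs):int(t1 * fs)].mean()

# Synthetic demo: mismatched-condition trials carry a positive deflection
# peaking near 600 ms (a P600-like effect); matched trials are noise only.
rng = np.random.default_rng(1)
fs = 256
t = np.arange(fs) / fs                       # 1 s epoch
bump = np.exp(-((t - 0.6) ** 2) / (2 * 0.05 ** 2))  # Gaussian peak at 600 ms

matched = rng.standard_normal((40, t.size)) * 0.5
mismatched = rng.standard_normal((40, t.size)) * 0.5 + 2.0 * bump

# Difference in mean amplitude over a 500-700 ms window: the P600 effect size.
p600_effect = (mean_amplitude(erp(mismatched), fs, 0.5, 0.7)
               - mean_amplitude(erp(matched), fs, 0.5, 0.7))
```

In practice one would baseline-correct each epoch, reject artifact trials, and test the condition difference statistically across subjects; this sketch only shows the epoch-averaging and window-measurement steps.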