Proceedings of the First Workshop on Computational Approaches to Code Switching 2014
DOI: 10.3115/v1/w14-3904

Exploration of the Impact of Maximum Entropy in Recurrent Neural Network Language Models for Code-Switching Speech

Abstract: This paper presents our latest investigations of the jointly trained maximum entropy and recurrent neural network language models for Code-Switching speech. First, we explore extensively the integration of part-of-speech tags and language identifier information in recurrent neural network language models for Code-Switching. Second, the importance of the maximum entropy model is demonstrated along with various experimental results. Finally, we propose to adapt the recurrent neural network language model to …
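For orientation, here is a minimal sketch of the kind of architecture the abstract describes: a recurrent language model whose word inputs are augmented with part-of-speech-tag and language-identifier embeddings, jointly trained with a direct maximum-entropy-style connection to the output layer. All layer sizes are illustrative assumptions, and the maxent features are simplified here to the current word only (Mikolov's RNNME formulation uses hashed n-gram features); none of these details are taken from the paper.

```python
import torch
import torch.nn as nn

class RNNMELanguageModel(nn.Module):
    """Hedged sketch of an RNN LM jointly trained with a direct
    (maximum-entropy-style) connection, with POS-tag and language-ID
    features appended to the word embedding. Hyperparameters are
    illustrative, not taken from the paper."""

    def __init__(self, vocab_size, pos_size, lid_size,
                 emb_dim=100, feat_dim=10, hidden_dim=200):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, emb_dim)
        self.pos_emb = nn.Embedding(pos_size, feat_dim)   # part-of-speech tags
        self.lid_emb = nn.Embedding(lid_size, feat_dim)   # language identifiers
        self.rnn = nn.RNN(emb_dim + 2 * feat_dim, hidden_dim, batch_first=True)
        self.hidden_out = nn.Linear(hidden_dim, vocab_size)
        # Maxent part, simplified: a direct log-linear connection from the
        # current word to the output layer, bypassing the recurrence.
        # (The original RNNME uses hashed n-gram features instead.)
        self.direct_out = nn.Embedding(vocab_size, vocab_size)

    def forward(self, words, pos_tags, lang_ids, hidden=None):
        # words, pos_tags, lang_ids: (batch, time) index tensors
        x = torch.cat([self.word_emb(words),
                       self.pos_emb(pos_tags),
                       self.lid_emb(lang_ids)], dim=-1)
        h, hidden = self.rnn(x, hidden)
        # Joint training: RNN scores and direct maxent scores are summed
        # before the softmax over the next word.
        logits = self.hidden_out(h) + self.direct_out(words)
        return logits, hidden
```

The design point the sketch illustrates is that the maxent component contributes fast, sparse, log-linear evidence directly to the output, while the recurrent state carries longer-range context; the two are trained together rather than interpolated after the fact.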

Cited by 8 publications (7 citation statements) · References 20 publications
“…They showed that their system outperforms the best SVM-based systems reported in the EMNLP'14 Code-Switching Workshop. Vu and Schultz (2014) proposed to adapt the recurrent neural network language model to different code-switching behaviors and even to use it to generate artificial code-switching text data. Adel et al. (2013) investigated the application of RNN language models and factored language models to the task of identifying code-switching in speech, and reported a significant improvement compared to the traditional n-gram language model.…”
Section: Related Work (mentioning, confidence: 99%)
“…To alleviate the data sparsity problem, some approaches work by generating artificial CS text based on a CS-aware recurrent neural network decoder (Vu and Schultz, 2014) or a machine translation system to create CS data from monolingual data. Such techniques would benefit from a better understanding of the characteristics of code-switching data.…”
Section: Results (mentioning, confidence: 99%)
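To make the cited generation idea concrete, the following is a hedged sketch of drawing artificial CS text from a trained CS-aware language model by sampling the next-word distribution step by step. The `step_fn` interface, the special-token IDs, and the temperature parameter are assumptions for illustration, not the cited systems' API.

```python
import torch

@torch.no_grad()
def generate_cs_text(step_fn, bos_id, eos_id, max_len=30, temperature=1.0):
    """Sample one artificial code-switched sentence from a trained LM.

    step_fn(word_id, state) is assumed to return (logits over the
    vocabulary, new recurrent state). Returns a list of sampled word IDs.
    """
    word, state, out = bos_id, None, []
    for _ in range(max_len):
        logits, state = step_fn(word, state)
        # Temperature < 1 sharpens the distribution, > 1 flattens it.
        probs = torch.softmax(logits / temperature, dim=-1)
        word = torch.multinomial(probs, num_samples=1).item()
        if word == eos_id:
            break
        out.append(word)
    return out
```

Because the model was trained on code-switched transcriptions, language alternation points emerge from the sampled word sequence itself; no explicit switching rule is imposed.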
“…The second approach [10,11] leverages recurrent neural network language models trained with CS transcriptions. Because of the small amount of CS transcriptions, the generated texts are often meaningless [10]. The third approach [12] formulates the CS text generation task as a sequence-to-sequence problem, mapping monolingual text to CS text, and solves it in an end-to-end fashion using a sequence-to-sequence model [16,17].…”
Section: Introduction (mentioning, confidence: 99%)
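A minimal sketch of the sequence-to-sequence formulation mentioned above: a monolingual sentence is encoded and a code-switched sentence is decoded token by token, conditioned on the encoding. The layer sizes are illustrative assumptions, and attention (which the cited systems typically add) is omitted for brevity; this is not the architecture of [12].

```python
import torch
import torch.nn as nn

class MonoToCSSeq2Seq(nn.Module):
    """Hedged encoder-decoder sketch for mapping monolingual text to
    code-switched text. Sizes and the GRU choice are illustrative."""

    def __init__(self, src_vocab, tgt_vocab, emb=128, hid=256):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, emb)
        self.tgt_emb = nn.Embedding(tgt_vocab, emb)
        self.encoder = nn.GRU(emb, hid, batch_first=True)
        self.decoder = nn.GRU(emb, hid, batch_first=True)
        self.out = nn.Linear(hid, tgt_vocab)

    def forward(self, src, tgt_in):
        # Summarize the monolingual input into a final hidden state.
        _, h = self.encoder(self.src_emb(src))
        # Decode the code-switched target, initialized from that state
        # (teacher forcing: tgt_in is the gold target shifted right).
        dec, _ = self.decoder(self.tgt_emb(tgt_in), h)
        return self.out(dec)  # logits over the CS target vocabulary
```

Training would minimize cross-entropy between these logits and the gold code-switched tokens, exactly as in standard neural machine translation, which is why the snippet describes the task as end-to-end.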