Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), 2014
DOI: 10.3115/v1/p14-1053
Generating Code-switched Text for Lexical Learning

Abstract: The vast majority of L1 vocabulary acquisition occurs through incidental learning during reading (Nation, 2001; Schmitt et al., 2001). We propose a probabilistic approach to generating code-mixed text as an L2 technique for increasing retention in adult lexical learning through reading. Our model takes as input a bilingual dictionary and an English text, and generates a code-switched text that optimizes a defined "learnability" metric by constructing a factor graph over lexical mentions. Using an artificial …
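The abstract describes the pipeline's contract: a bilingual dictionary and an English text go in, and a code-switched text scoring well on a learnability metric comes out, with the choice of which lexical mentions to switch made by inference over a factor graph. The sketch below illustrates only that input/output contract; the greedy frequency-based selection, the `code_switch` and `budget` names, and the repetition-based score are assumptions standing in for the paper's factor-graph inference and learnability metric.

```python
import re

def code_switch(text, dictionary, budget=0.15):
    """Replace up to `budget` of the word tokens in `text` with their L2
    translations from `dictionary`, preferring words that recur (a toy
    proxy for a learnability score, since repeated exposure aids
    incidental learning)."""
    tokens = re.findall(r"\w+|\W+", text)  # words and the gaps between them
    counts = {}
    for t in tokens:
        if t.strip():
            counts[t.lower()] = counts.get(t.lower(), 0) + 1
    # Rank switchable mentions by how often their word recurs.
    candidates = sorted(
        (i for i, t in enumerate(tokens) if t.lower() in dictionary),
        key=lambda i: counts[tokens[i].lower()],
        reverse=True,
    )
    n_words = sum(1 for t in tokens if t.strip())
    switched = set(candidates[: int(budget * n_words)])
    return "".join(dictionary[t.lower()] if i in switched else t
                   for i, t in enumerate(tokens))

# A tiny English-to-German dictionary for illustration:
lexicon = {"dog": "Hund", "house": "Haus"}
print(code_switch("The dog ran to the house. The dog barked.", lexicon, budget=0.3))
```

Replacing the two mentions of "dog" rather than one mention each of "dog" and "house" mirrors the intuition that repeated exposures to the same L2 word aid retention; the actual paper optimizes this trade-off jointly rather than greedily.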

Cited by 17 publications (17 citation statements) · References 19 publications
“…Bespalov, Bai, Qi, and Shokoufandeh () showed that an n-gram model combined with latent representation would produce a more suitable embedding for sentiment classification. Labutov and Lipson () re-embed existing word embeddings with logistic regression by regarding sentiment supervision of sentences as a regularization term.…”
Section: Sentiment Analysis With Word Embedding (mentioning, confidence: 99%)
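As a rough illustration of the re-embedding idea in the excerpt above, the sketch below adapts initial embeddings with a logistic loss on sentence labels plus a proximity penalty tying the new vectors to the originals. This is a hedged reconstruction, not the cited method: the jointly trained classifier, the mean-of-vectors sentence representation, and the `lam`/`lr` hyperparameters are all assumptions.

```python
import numpy as np

def reembed(U0, sentences, labels, lam=0.1, lr=0.05, epochs=200):
    """Adapt embeddings U0 (vocab x dim) with SGD on a logistic loss over
    labeled sentences, plus a proximity penalty lam*||U - U0||^2 that
    keeps the re-embedded vectors near the originals. A sentence is the
    mean of its word vectors; the classifier (w, b) is learned jointly.
    `sentences` holds lists of word indices, `labels` are 0/1."""
    U = U0.copy()
    w = np.zeros(U.shape[1])
    b = 0.0
    for _ in range(epochs):
        for idx, y in zip(sentences, labels):
            x = U[idx].mean(axis=0)
            p = 1.0 / (1.0 + np.exp(-(w @ x + b)))
            g = p - y                       # d(loss)/d(logit)
            w -= lr * g * x
            b -= lr * g
            # Gradient wrt each word vector in the sentence, plus the
            # proximity term pulling it back toward its original value.
            U[idx] -= lr * (g * w / len(idx) + lam * (U[idx] - U0[idx]))
    return U

rng = np.random.default_rng(0)
U0 = rng.normal(size=(5, 4))      # 5 words, 4-dim embeddings
sents = [[0, 1], [2, 3], [0, 4]]  # word-index lists
print(reembed(U0, sents, labels=[1, 0, 1]).round(2))
```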
“…We use insights from recent work on L2 acquisition from code-switched text, as it has focused on learning from context. Labutov and Lipson (2014) carry out experiments to determine the guessability of a word in code-switched text. Similar work by Knowles et al. (2016) discusses the factors that can potentially affect the guessability of a German word in English context.…”
Section: Related Work (mentioning, confidence: 99%)
“…Word embedding learning has received renewed interest lately due to the impressive performance obtained by prediction-based word embedding learning methods in a wide range of NLP applications such as sentiment classification [6, 11, 12], named entity recognition [13, 14], word sense disambiguation [15, 16], relation extraction [17, 18], semantic role labeling [8], and machine translation [9].…”
Section: Related Work (mentioning, confidence: 99%)