Proceedings of the 55th Annual Meeting of the Association For Computational Linguistics (Volume 1: Long Papers) 2017
DOI: 10.18653/v1/p17-1033
Topically Driven Neural Language Model

Abstract: Language models are typically applied at the sentence level, without access to the broader document context. We present a neural language model that incorporates document context in the form of a topic model-like architecture, thus providing a succinct representation of the broader document context outside of the current sentence. Experiments over a range of datasets demonstrate that our model outperforms a pure sentence-based model in terms of language model perplexity, and leads to topics that are potentially …
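Since the abstract describes the architecture only at a high level, the following is a minimal sketch of the general idea in PyTorch: a document encoder produces a soft topic mixture, and that mixture conditions a sentence-level LSTM language model. All module names, dimensions, and the mean-embedding document encoder are illustrative assumptions, not the authors' exact TDLM implementation.

```python
# Minimal sketch of a topic-conditioned sentence LM (not the authors' exact
# TDLM architecture): a document encoder yields a soft topic mixture that is
# combined with the word-level LSTM hidden state at every timestep.
# All names and dimensions here are illustrative assumptions.
import torch
import torch.nn as nn


class TopicConditionedLM(nn.Module):
    def __init__(self, vocab_size, emb_dim=128, hidden_dim=256, n_topics=50):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
        # Document encoder: mean of word embeddings -> topic distribution.
        self.doc_to_topic = nn.Linear(emb_dim, n_topics)
        # Each topic owns a dense vector; the mixture conditions the LM.
        self.topic_vectors = nn.Parameter(torch.randn(n_topics, hidden_dim))
        self.out = nn.Linear(hidden_dim * 2, vocab_size)

    def forward(self, sentence_ids, doc_ids):
        # Topic mixture computed from the whole document (broader context).
        doc_emb = self.embed(doc_ids).mean(dim=1)             # (B, emb_dim)
        topic_mix = torch.softmax(self.doc_to_topic(doc_emb), dim=-1)
        topic_vec = topic_mix @ self.topic_vectors            # (B, hidden)

        # Sentence-level LSTM over the current sentence only.
        h, _ = self.lstm(self.embed(sentence_ids))            # (B, T, hidden)
        topic_rep = topic_vec.unsqueeze(1).expand_as(h)
        # Combine each timestep's hidden state with the document topic vector.
        return self.out(torch.cat([h, topic_rep], dim=-1))    # (B, T, vocab)


# Usage: next-word logits for a 7-token sentence given a 30-token document.
model = TopicConditionedLM(vocab_size=1000)
logits = model(torch.randint(0, 1000, (2, 7)), torch.randint(0, 1000, (2, 30)))
print(logits.shape)  # torch.Size([2, 7, 1000])
```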

Cited by 38 publications (21 citation statements); references 19 publications.
“…Moreover, the combination of deep learning and LDA has become another topic-extraction method. For example, one study [67] proposed a novel neural topic model that acquires n-gram topics with a deep neural network and then uses LDA to obtain the topic representation of the document; another study [68] integrated LDA into an LSTM language model for joint training. In addition to LDA and its extensions, there are also methods that generate topic-related vectors with a neural attention model, such as Attention-based Aspect Extraction (ABAE) [69].…”
Section: Topic Extraction
Mentioning confidence: 99%
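The LDA-plus-neural combinations this statement describes share a common first step: obtain a document-topic vector θ from a (pre)trained LDA model and hand it to the neural model. A small sketch of that step using gensim follows; the corpus and dimensions are toy assumptions, and this is not the cited papers' code.

```python
# Hedged sketch of the "pretrained LDA feeds a neural model" pattern: train
# LDA with gensim, then read off the dense document-topic vector theta that
# a downstream LSTM could consume. Corpus and sizes are toy assumptions.
import numpy as np
from gensim.corpora import Dictionary
from gensim.models import LdaModel

docs = [["neural", "language", "model", "topic"],
        ["topic", "model", "lda", "document"],
        ["lstm", "language", "model", "context"]]
dictionary = Dictionary(docs)
corpus = [dictionary.doc2bow(d) for d in docs]
lda = LdaModel(corpus, num_topics=2, id2word=dictionary, passes=10)

def theta(bow, num_topics=2):
    # Dense document-topic distribution, zero-filled for absent topics.
    vec = np.zeros(num_topics)
    for topic_id, prob in lda.get_document_topics(bow, minimum_probability=0.0):
        vec[topic_id] = prob
    return vec

# This vector would be concatenated to (or jointly trained with) the LSTM
# hidden state, as in the LDA+LSTM combinations cited above.
print(theta(corpus[0]))  # e.g. [0.83 0.17]
```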
“…Our work provides an additional way to address the well-known drawback of RNNs: they use only limited context. This has been noted as a serious problem in conversational modeling (Sordoni et al., 2015) and in text generation with multiple sentences (Lau et al., 2017). Recent work on context-aware text generation (or the related task of language modeling) has studied the possibilities of using different granularities of context.…”
Section: Related Work
Mentioning confidence: 99%
“…
• LSTM + LDA: the topic vector θ obtained from a pretrained LDA is concatenated to the output of the LSTM.
• TD-LSTM (Lau, Baldwin, and Cohn 2017), a recent topic-dependent LSTM applied to our task: θ is added to the output of the LSTM via a dense layer.
TD-LSTM and our TE-LSTM are jointly trained with our …”
Section: Equation Model Evaluation
Mentioning confidence: 99%
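The two baselines quoted above differ only in how θ enters the LSTM output: concatenation before the softmax layer versus addition through a dense projection. A sketch of both variants follows, with assumed shapes and illustrative names (not the cited implementations):

```python
# Sketch of the two conditioning variants from the quoted comparison, under
# assumed shapes: (a) LSTM+LDA concatenates theta to the LSTM output;
# (b) TD-LSTM-style conditioning adds theta through a dense layer instead.
import torch
import torch.nn as nn

hidden_dim, n_topics, vocab = 256, 50, 1000
h = torch.randn(4, 12, hidden_dim)         # LSTM outputs: (batch, time, hidden)
theta = torch.rand(4, n_topics)            # pretrained LDA topic vector
theta = theta / theta.sum(-1, keepdim=True)

# (a) Concatenation: widen the feature vector before the output layer.
concat_out = nn.Linear(hidden_dim + n_topics, vocab)
logits_a = concat_out(torch.cat([h, theta.unsqueeze(1).expand(-1, 12, -1)], -1))

# (b) Dense-layer addition: project theta into hidden space, then add it.
topic_proj = nn.Linear(n_topics, hidden_dim)
add_out = nn.Linear(hidden_dim, vocab)
logits_b = add_out(h + topic_proj(theta).unsqueeze(1))

print(logits_a.shape, logits_b.shape)  # both torch.Size([4, 12, 1000])
```

Variant (b) keeps the output layer's width independent of the number of topics, which is one practical reason to prefer a dense projection over plain concatenation.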