2000
DOI: 10.1109/5.880084
Exploiting latent semantic information in statistical language modeling

Cited by 268 publications (170 citation statements)
References 48 publications
“…Note that the benefit of the maximum entropy combination method is that the cluster nodes behave like latent variables in a mixture model for "soft clustering", instead of the "hard clusters" created by methods like K-means used in Bellegarda (2000). Figure 2 shows an extended semantic smoothed version of the model that incorporates additional word cluster variables within each topic, and additional topic cluster variables with each document.…”
Section: Semantic Smoothing
Mentioning confidence: 99%
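A minimal sketch (my own illustration, not code from the cited works) of the distinction this statement draws: K-means gives each point a single "hard" cluster label, while a mixture model treats the cluster as a latent variable and yields "soft" posterior responsibilities. The toy data and parameters are assumptions for illustration only.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Toy 2-D points drawn around two centers (hypothetical data, standing in for word vectors).
X = np.vstack([rng.normal(0.0, 1.0, (50, 2)),
               rng.normal(4.0, 1.0, (50, 2))])

# Hard clustering: each point is assigned to exactly one cluster.
hard_labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

# Soft clustering: the cluster index acts as a latent variable, and each point
# receives a posterior distribution over clusters.
gmm = GaussianMixture(n_components=2, random_state=0).fit(X)
soft_posteriors = gmm.predict_proba(X)   # shape (100, 2); each row sums to 1

print(hard_labels[:3])       # e.g. [1 1 1]
print(soft_posteriors[:3])   # e.g. rows like [0.99 0.01]
```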
“…This approach has several advantages over other methods for statistical modeling, such as introducing less data fragmentation (as in decision tree learning), requiring fewer independence assumptions (as in naive Bayes models), and exploiting a principled technique for automatic feature weighting. The major weakness with maximum entropy methods, however, are that they can only model distributions over explicitly observed features, whereas in natural language we encounter hidden semantic (Bellegarda, 2000;Hofmann, 2001) and syntactic information (Chelba & Jelinek, 2000) which we do not observe directly.…”
Section: Introduction
Mentioning confidence: 99%
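As a hedged sketch of the point about observed features (again my own illustration, not the cited authors' code): a conditional maximum-entropy model is equivalent to multinomial logistic regression over explicitly defined feature indicators, so every feature must be observed in the data; hidden semantic or syntactic structure has no direct representation. The feature names and toy data below are hypothetical.

```python
from sklearn.feature_extraction import DictVectorizer
from sklearn.linear_model import LogisticRegression

# Toy next-word prediction contexts described by explicitly observed features.
contexts = [
    {"prev_word=the": 1, "prev_pos=DT": 1},
    {"prev_word=a": 1, "prev_pos=DT": 1},
    {"prev_word=he": 1, "prev_pos=PRP": 1},
    {"prev_word=she": 1, "prev_pos=PRP": 1},
]
next_words = ["cat", "dog", "runs", "runs"]

X = DictVectorizer().fit_transform(contexts)
# Multinomial logistic regression as a stand-in for a conditional maxent model.
maxent = LogisticRegression(max_iter=1000).fit(X, next_words)

# Distribution over next words given the observed features of the first context.
print(maxent.predict_proba(X)[0])
```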
“…[1]. It can be shown that NMF provides a clear interpretation of concepts such as abstraction in terms of linear algebraic operations [23].…”
Section: Brief Description of the Model
Mentioning confidence: 99%
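A minimal sketch of the linear-algebraic reading mentioned in this statement (an assumption-laden illustration, not the model of [23]): non-negative matrix factorization approximates a term-document count matrix as a product of a term-by-concept basis and concept-by-document weights, so each document is a non-negative combination of "concept" vectors. The toy matrix and rank are chosen only for demonstration.

```python
import numpy as np
from sklearn.decomposition import NMF

# Toy term-document count matrix: rows = terms, columns = documents.
V = np.array([[2, 1, 0, 0],
              [3, 2, 0, 1],
              [0, 0, 4, 2],
              [0, 1, 3, 3]], dtype=float)

model = NMF(n_components=2, init="nndsvda", random_state=0, max_iter=500)
W = model.fit_transform(V)   # term-by-concept basis vectors
H = model.components_        # concept-by-document weights

# V is approximately reconstructed from the two non-negative "concept" factors.
print(np.round(W @ H, 1))
```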