Proceedings of the Conference on Computer Support for Collaborative Learning Foundations for a CSCL Community - CSCL '02 2002
DOI: 10.3115/1658616.1658815
Latent semantic analysis

Cited by 17 publications (22 citation statements)
References 6 publications
“…However, the co-occurrence neighborhoods of words in nondecomposed models often include morphological variants of the same word (e.g., the first two neighbors of the word work are working and works), reflecting the fact that many morphological variants of the same root word share similar contexts. However, this is of course not always true; recall that (as cited above) Landauer and Dumais (2008) noted that “stemming often confabulates meanings” (p. 4356) in word vectors.…”

Section: Performance Impact of Morphological Decomposition
confidence: 97%
“…As suggested by Landauer and Dumais (2008), this compression of information may introduce excessive noise into the context, thereby reducing semantic content. Implicit in such a view, however, is the assumption that compressing, or “confabulating,” co-occurrence statistics into a combined representation would necessarily result in a loss of information.…”

Section: Current Study
confidence: 99%
“…In this section, we discuss our approaches of using latent semantic analysis (LSA) [10] and its related packages for terms and documents comparison to recover the most related discussion themes and potential target audiences to benefit affect detection.…”

Section: Semantic Interpretation of Social Contexts
confidence: 99%
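The excerpt above cites LSA for comparing terms and documents. A minimal sketch of the underlying technique, a truncated SVD of a term-document count matrix followed by cosine comparison in the reduced space; the toy matrix and the choice of two latent dimensions are illustrative assumptions, not the cited authors' data:

```python
import numpy as np

# Hypothetical toy term-document count matrix (rows = terms, columns = documents);
# in practice this would be built from a real corpus, typically with tf-idf weighting.
X = np.array([
    [2, 0, 1, 0],   # "learning"
    [1, 1, 0, 0],   # "semantic"
    [0, 2, 0, 1],   # "analysis"
    [0, 0, 3, 1],   # "affect"
    [1, 0, 1, 2],   # "context"
], dtype=float)

# LSA: factor the matrix with SVD and keep only the top-k singular dimensions.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2                                     # illustrative number of latent dimensions
doc_vecs = (np.diag(s[:k]) @ Vt[:k]).T    # each row: one document in latent space

def cosine(a, b):
    """Cosine similarity between two vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Compare documents in the reduced space rather than on raw counts.
sim = cosine(doc_vecs[0], doc_vecs[2])
```

Documents with no terms in common can still score as similar here, because the truncated factorization maps co-occurring terms onto shared latent dimensions.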
“…To handle this gap, LSA models [7], such as probabilistic latent semantic indexing (pLSI) [8] and latent Dirichlet allocation (LDA) [9], have been proposed to extract low-dimensional semantic features. Although integrating extra semantic information is promising, this kind of approach only builds linear and shallow models that can only capture pairwise semantic relationships between words [10].…”

Section: Introduction
confidence: 99%