Proceedings of the 11th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval - SIGIR 1988
DOI: 10.1145/62437.62487
Information retrieval using a singular value decomposition model of latent semantic structure

Cited by 206 publications (92 citation statements)
References 0 publications
“…Similarly, a user is represented by a vector in the same space. As a consequence, the correlation between user p and item i (i.e., how well the item matches the user's interests) can be computed as the similarity between the corresponding vectors, for instance by means of their inner product: r̂_pi = Σ_{e=1}^{l} a_pe · b_ie (10), where a_pe and b_ie are the e-th (unknown) features for user p and item i, respectively. The point is to compute the l features which minimize the prediction error between the estimated r̂_pi and the actual value r_pi.…”
Section: Dimensionality-reduction-based Collaborative Algorithm
confidence: 99%
“…Once R has been factorized, which can prove particularly challenging, the system operates with vectors having only l dimensions, far fewer than the original space of n users and m items; • SVD reduces the noise in the data. In fact, by neglecting the singular values with low magnitude we discard the least-informative data, which is typically noisy [10,8]; • SVD strengthens the relationships among the data. Thus, if two vectors (either users or items) are similar (because somehow related), they are represented closer together in the l-dimensional feature space than in the original space.…”
Section: Dimensionality-reduction-based Collaborative Algorithm
confidence: 99%
“…In such a model the bottom-up part would resemble cognitive component analysis [18]. Coined as a term to describe aspects of unsupervised clustering of data, the underlying algorithms approximate how our brain discovers self-organizing patterns when assembling images from the lines and edges of visual objects [19], reconstructs words from the statistical regularities of phonemes in speech [20], or learns the meaning of words based on their co-occurrence within multiple contexts [21][22][23]. But equally important: cognitive processes involve a large amount of top-down feedback, which sculpts the receptive responses of neurons at every level and vastly outnumbers the sensory inputs [24][25][26].…”
Section: Introduction
confidence: 99%
“…And combine the bottom-up extracted representation with top-down aspects of attention reflecting preferred emotional structures, similar to the combinations of user-generated affective terms found in tag clouds in social networks like last.fm. Selecting a number of frequently used emotional last.fm tags as buoys to define a semantic plane of psychological valence and arousal dimensions, we project a number of song lyrics into this space and apply latent semantic analysis (LSA) [21][22][23] to model the correlation of texts and affective terms as vectors reflecting the emotional context of the songs. We outline in the following sections: the affective plane used for modeling emotional structure, the extraction of latent semantics from texts associated with media, an analysis of the emotional patterns, followed by a discussion of the potential of combining latent semantics and emotional components to enable personalized search of media.…”
Section: Introduction
confidence: 99%
“…LSA traces its origins to a technique in information retrieval known as Latent Semantic Indexing (LSI) (Furnas et al 1988, Deerwester et al 1990). The objective of LSI is to improve the retrieval of documents by reducing a large term-by-document matrix into a much smaller space using Singular Value Decomposition (SVD).…”
Section: Approaches
confidence: 99%
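The LSI pipeline described in this excerpt (SVD of a term-by-document matrix, then retrieval in the reduced space) can be sketched end to end. The tiny vocabulary, documents, and choice of k below are illustrative assumptions:

```python
import numpy as np

# Tiny term-by-document matrix (rows: terms, columns: documents).
terms = ["latent", "semantic", "index", "retrieval", "music", "emotion"]
X = np.array([[1, 1, 0, 0],    # latent
              [1, 1, 0, 0],    # semantic
              [1, 0, 0, 0],    # index
              [0, 1, 1, 0],    # retrieval
              [0, 0, 1, 1],    # music
              [0, 0, 0, 1]],   # emotion
             dtype=float)

k = 2                                       # reduced dimensionality
U, s, Vt = np.linalg.svd(X, full_matrices=False)
docs_k = (np.diag(s[:k]) @ Vt[:k]).T        # document coordinates in k-space

# Fold a one-term query ("semantic") into the same space: q_k = q U_k S_k^{-1}.
q = np.zeros(len(terms))
q[terms.index("semantic")] = 1.0
q_k = q @ U[:, :k] @ np.diag(1.0 / s[:k])

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

scores = [cosine(q_k, d) for d in docs_k]   # one score per document
ranking = np.argsort(scores)[::-1]          # best-matching documents first
```

Because retrieval happens in the k-dimensional space rather than over raw term overlap, documents can match a query even when they share no literal terms with it, which is the point of LSI.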