Proceedings of the 22nd ACM International Conference on Information & Knowledge Management (CIKM '13), 2013
DOI: 10.1145/2505515.2505665
Learning deep structured semantic models for web search using clickthrough data

Abstract: Latent semantic models, such as LSA, intend to map a query to its relevant documents at the semantic level where keyword-based matching often fails. In this study we strive to develop a series of new latent semantic models with a deep structure that project queries and documents into a common low-dimensional space where the relevance of a document given a query is readily computed as the distance between them. The proposed deep structured semantic models are discriminatively trained by maximizing the condition…
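The relevance function the abstract alludes to is a similarity between the query and document vectors in the learned common space (the published model uses the cosine of the angle between them). A minimal sketch, with made-up three-dimensional embeddings standing in for the deep model's outputs:

```python
import math

def cosine_similarity(q_vec, d_vec):
    """Cosine similarity between a query embedding and a document embedding."""
    dot = sum(q * d for q, d in zip(q_vec, d_vec))
    norm_q = math.sqrt(sum(q * q for q in q_vec))
    norm_d = math.sqrt(sum(d * d for d in d_vec))
    return dot / (norm_q * norm_d)

# Toy low-dimensional embeddings; in the real model these are the
# outputs of the deep network's final layer.
query_vec = [0.2, 0.7, 0.1]
doc_vec = [0.25, 0.65, 0.05]
print(cosine_similarity(query_vec, doc_vec))
```

The hard-coded vectors here exist purely to show the relevance computation; ranking a document set amounts to sorting documents by this score for a given query.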


Cited by 1,562 publications (1,205 citation statements)
References 20 publications
“…In a separate line of research, deep learning based techniques have been proposed for semantic understanding (Mesnil et al., 2013; Huang et al., 2013; Shen et al., 2014b; Salakhutdinov and Hinton, 2009; Tur et al., 2012). We adapt the work of (Huang et al., 2013; Shen et al., 2014b) for measuring the semantic distance between a question and relational triples in the KB as the core component of our semantic parsing approach.…”
Section: Related Work
confidence: 99%
“…In our model, we leverage the word hashing technique proposed in (Huang et al., 2013), where we first represent a word by a letter-trigram count vector. For example, given a word (e.g., cat), after adding word boundary symbols (e.g., #cat#), the word is segmented into a sequence of letter n-grams (e.g., letter-trigrams: #-c-a, c-a-t, a-t-#).…”
Section: Convolutional Neural Network Based Semantic Model
confidence: 99%
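The word-hashing step quoted above can be sketched directly. The function names below are illustrative, but the boundary symbols and segmentation follow the quote's own example (#cat# → #-c-a, c-a-t, a-t-#):

```python
from collections import Counter

def letter_ngrams(word, n=3):
    """Add '#' word-boundary symbols, then slide an n-character window."""
    bounded = f"#{word}#"
    return [bounded[i:i + n] for i in range(len(bounded) - n + 1)]

def trigram_count_vector(word):
    """Letter-trigram count vector: trigram -> occurrence count."""
    return Counter(letter_ngrams(word, n=3))

print(letter_ngrams("cat"))  # → ['#ca', 'cat', 'at#']
```

Representing words by these trigram counts keeps the input vocabulary small and fixed even for unseen words, which is the motivation given for word hashing in the cited work.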
“…In vision, abstractions can include object detection (Girshick et al., 2014; Girshick, 2015; Ren et al., 2015), classification (Krizhevsky et al., 2012; Simonyan and Zisserman, 2014; Szegedy et al., 2015), and semantic understanding (Huang et al., 2013) using convolutional neural networks (LeCun and Bengio, 1995). Inspired by the hierarchical architecture of the human visual cortex (Hubel and Wiesel, 1962), architectures with multiple convolution-pooling layers have been proposed and are being used in different machine learning tasks.…”
Section: Trends In The Development Of AI Technology Applications For …
confidence: 99%
“…We train a regression forest model (Meinshausen, 2006) and use the following features: monolingual word aligner (Sultan et al., 2014), DSSM model (Huang et al., 2013), word embedding composition (max, sum, idf-sum) with GloVe, n-gram overlap, subsequence matching, PairingWords (Han et al., 2013), word mover's distance (Kusner et al., 2015).…”
Section: FAQ Search For Customer QA Pairs
confidence: 99%
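Of the features listed above, n-gram overlap is simple enough to sketch. The cited system does not specify its exact formulation, so the Jaccard-style variant below is an assumption made for illustration:

```python
def ngram_overlap(a_tokens, b_tokens, n=2):
    """Jaccard overlap of token n-grams between two token sequences.

    This is one plausible formulation of the 'n-gram overlap' feature;
    the cited system's exact definition is not given in the quote.
    """
    a_ngrams = {tuple(a_tokens[i:i + n]) for i in range(len(a_tokens) - n + 1)}
    b_ngrams = {tuple(b_tokens[i:i + n]) for i in range(len(b_tokens) - n + 1)}
    if not a_ngrams and not b_ngrams:
        return 0.0
    return len(a_ngrams & b_ngrams) / len(a_ngrams | b_ngrams)

print(ngram_overlap("how do i reset my password".split(),
                    "how to reset my password".split()))
```

In an FAQ-search setting like the one quoted, such a score would be computed between a user question and each stored question, then fed to the regression forest alongside the other features.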