Interspeech 2013
DOI: 10.21437/interspeech.2013-623
Deep belief network based semantic taggers for spoken language understanding

Cited by 44 publications (15 citation statements)
References 18 publications
“…Reference                 | Addressed issue             | Approach
[Yu et al 2011]             | Long-range state dependency | Deep learning, CRF
[Deng et al 2012]           | To extend DCN               | Kernel learning, deep learning, DCN, log-linear model
[Deoras and Sarikaya 2013]  | Data sparsity problem       | (CRF)…”
Section: Paper (mentioning)
confidence: 99%
“…[Deoras and Sarikaya 2013] applied deep belief networks (DBNs) to semantic tagging, integrating lexical, named-entity, and dependency-parser-based syntactic features with part-of-speech (POS) tags. A DBN is a stack of Restricted Boltzmann Machines (RBMs), where the input of one layer is the output of the previous one and each layer applies a sigmoid activation function to its inputs.…”
Section: Low Resource Data Sets (mentioning)
confidence: 99%
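
As a concrete illustration of that description, here is a minimal sketch of a DBN-style forward pass: a stack of layers, each applying a sigmoid to an affine transform of the previous layer's output. The layer widths and random weights are illustrative assumptions, not the cited system; in an actual DBN the weights come from greedy layer-wise RBM pre-training followed by supervised fine-tuning.

    # Minimal sketch (assumptions: layer widths, random placeholder weights)
    # of the stacked-sigmoid forward pass described in the quote above.
    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def dbn_forward(x, weights, biases):
        # Each layer's output feeds the next, as in a stack of RBMs.
        h = x
        for W, b in zip(weights, biases):
            h = sigmoid(h @ W + b)
        return h

    rng = np.random.default_rng(0)
    sizes = [100, 200, 200, 50]   # hypothetical layer widths
    weights = [rng.normal(0, 0.01, (m, n)) for m, n in zip(sizes, sizes[1:])]
    biases = [np.zeros(n) for n in sizes[1:]]

    x = rng.random((1, sizes[0]))  # one feature vector (e.g., lexical + POS features)
    print(dbn_forward(x, weights, biases).shape)  # (1, 50)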
“…Another line of popular approaches is to train machine learning models, such as support vector machines (SVMs) and AdaBoost, on labeled training data [9,29]. Approaches based on deep neural networks, such as deep belief networks (DBNs) and RNNs, have shown excellent performance [25,5]. Slot filling can be treated as a sequence labeling task.…”
Section: Spoken Language Understanding (mentioning)
confidence: 99%
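
To make the sequence-labeling framing concrete, here is a hypothetical slot-filling input/output pair in the common BIO (begin/inside/outside) scheme; the sentence and slot names are illustrative, not taken from the cited papers.

    # Hypothetical ATIS-style example: one slot label per input word.
    words = ["show", "flights", "from", "boston", "to", "new", "york", "today"]
    slots = ["O", "O", "O", "B-fromloc", "O", "B-toloc", "I-toloc", "B-date"]

    for w, s in zip(words, slots):
        print(f"{w:8s} -> {s}")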
“…The input is the sentence, consisting of a sequence of words, and the output is a sequence of slot/concept IDs, one for each word. [17] and [15] used deep belief networks (DBNs) and achieved superior results compared to CRF baselines. [51; 115; 66; 113] applied RNNs to slot filling.…”
Section: Language Understanding (mentioning)
confidence: 99%
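
Below is a minimal sketch of the per-word tagging such RNN approaches perform, under the assumption of a simple Elman-style recurrence with random, untrained weights: the hidden state carries left-context across the sentence, and each word receives one slot ID.

    # Sketch (assumptions: Elman RNN, one-hot word inputs, random weights).
    import numpy as np

    rng = np.random.default_rng(1)
    V, H, S = 30, 16, 5                  # hypothetical vocab, hidden, slot-set sizes
    Wx = rng.normal(0, 0.1, (V, H))      # word embedding / input-to-hidden
    Wh = rng.normal(0, 0.1, (H, H))      # hidden-to-hidden recurrence
    Wy = rng.normal(0, 0.1, (H, S))      # hidden-to-slot scores

    def rnn_tag(word_ids):
        h = np.zeros(H)
        tags = []
        for w in word_ids:
            h = np.tanh(Wx[w] + h @ Wh)          # update context with current word
            tags.append(int(np.argmax(h @ Wy)))  # one slot ID per word
        return tags

    print(rnn_tag([3, 7, 1, 12, 4]))     # five slot IDs, one per word (untrained)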