Proceedings of the 32nd Annual Meeting of the Association for Computational Linguistics, 1994
DOI: 10.3115/981732.981736

Hidden understanding models of natural language

Abstract: We describe and evaluate hidden understanding models, a statistical learning approach to natural language understanding. Given a string of words, hidden understanding models determine the most likely meaning for the string. We discuss 1) the problem of representing meaning in this framework, 2) the structure of the statistical model, 3) the process of training the model, and 4) the process of understanding using the model. Finally, we give experimental results, including results on an ARPA evaluation.
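As a rough illustration of the statistical formulation sketched in the abstract, the snippet below decodes a word string into its most likely sequence of semantic concepts using a first-order HMM and Viterbi search. This is a simplified, flat-concept stand-in for the paper's richer meaning representations, not the authors' implementation; all concept names, probabilities, and the example utterance are invented for illustration.

```python
# Minimal sketch of the hidden-understanding idea: treat the meaning as a
# hidden sequence of semantic concepts C and pick argmax_C P(C) * P(W | C)
# for the observed words W. Toy parameters only; not from the paper.
import math

CONCEPTS = ["ORIGIN", "DEST", "OTHER"]  # hypothetical concept inventory

transition = {  # P(c_t | c_{t-1}); "START" is the initial state
    "START":  {"ORIGIN": 0.4, "DEST": 0.3, "OTHER": 0.3},
    "ORIGIN": {"ORIGIN": 0.2, "DEST": 0.5, "OTHER": 0.3},
    "DEST":   {"ORIGIN": 0.1, "DEST": 0.2, "OTHER": 0.7},
    "OTHER":  {"ORIGIN": 0.3, "DEST": 0.3, "OTHER": 0.4},
}

emission = {  # P(w | c); unseen words fall back to a small floor probability
    "ORIGIN": {"boston": 0.5, "from": 0.3},
    "DEST":   {"denver": 0.5, "to": 0.3},
    "OTHER":  {"flights": 0.3, "show": 0.2, "me": 0.2},
}
FLOOR = 1e-4


def viterbi(words):
    """Return the most likely concept sequence for the word string."""
    # best[c] = (log-probability, best concept path ending in c)
    best = {c: (math.log(transition["START"][c])
                + math.log(emission[c].get(words[0], FLOOR)), [c])
            for c in CONCEPTS}
    for w in words[1:]:
        new_best = {}
        for c in CONCEPTS:
            score, prev = max(
                (best[p][0] + math.log(transition[p][c]), p) for p in CONCEPTS)
            new_best[c] = (score + math.log(emission[c].get(w, FLOOR)),
                           best[prev][1] + [c])
        best = new_best
    return max(best.values())[1]


print(viterbi("show me flights from boston to denver".split()))
```

In a trained system the transition and emission tables would be estimated from annotated sentence/meaning pairs rather than written by hand.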

Cited by 59 publications (34 citation statements)
References 12 publications
“…Google Now and Apple's Siri) and academia (e.g. [1,2,3,4,5,6,7,8,9,10,11,12]) have focused on developing semantic understanding techniques for building better spoken dialogue systems (SDS). The role of spoken language understanding (SLU) is of great significance to SDS: in order to capture the variation in language use from dialogue participants, the SLU component must create a mapping between the natural language inputs and a semantic representation that captures users' intentions.…”
Section: Introduction (mentioning)
confidence: 99%
“…AT&T's CHRONUS [5]; models based on Probabilistic Context-Free Grammars (PCFG), i.e. BBN's hidden understanding model (HUM) [6]; and the model proposed by He and Young [7] based on the Hidden Vector State (HVS) and a statistical machine translation model [8]. An approach combining these two categories takes full advantage of both [9].…”
Section: Introduction (mentioning)
confidence: 99%
“…Statistical methods are attractive because they can be easily adapted to new conditions using only annotated training data. Statistical methods for SLU have been studied in the Hidden Vector State (HVS) model (He et al., 2005) and data-driven statistical models (Miller et al., 1994; Pieraccini et al., 1992; Wang et al., 2006). In addition, Jeong and Lee (2008) proposed a unified probabilistic model (triangular-chain CRF) combining the named entity and dialog-act of SLU.…”
Section: Related Work (mentioning)
confidence: 99%