2019
DOI: 10.3233/ao-190216
Context for language understanding by intelligent agents

Cited by 3 publications (3 citation statements)
References 10 publications
“…McShane et al. used domain knowledge to filter domain-related bilingual parallel corpora out of large-scale general data when training translation models [12]. Ni reduces the impact of out-of-vocabulary words on overall sentence translation quality through data generalization, improves the translation of the out-of-vocabulary words themselves, and uses a multi-coverage fusion model to improve the attention scoring mechanism, further alleviating the over-translation and missing-translation problems in neural MT [13].…”
Section: Related Work
confidence: 99%
“…The algorithm uses the Chinese ALBERT pretrained model to obtain dynamic, context-sensitive word vectors and then uses a conditional random field model, which handles sequence-labeling problems effectively, to complete the idiom-meaning slot-filling task [23]. At the same time, a multi-kernel convolutional neural network is trained so that each datum receives a corresponding intention label, completing the intention recognition task [24].…”
Section: Design Intent Recognition Algorithm
confidence: 99%
“…In "Context For Language Understanding by Intelligent Agents" McShane and Nirenburg (2019) illustrate a complete architecture aimed at providing language-endowed intelligent agents (LEIAs) with many kinds of context. This work mainly proposes a conceptual framework rather than an implemented system.…”
confidence: 99%