2014 IEEE International Conference on Robotics and Automation (ICRA)
DOI: 10.1109/icra.2014.6907843
Controlled Natural Languages for language generation in artificial cognition

Cited by 9 publications (5 citation statements); references 13 publications.
“…The accuracy of the learnt ontology entries can be refined via reinforcement learning during task execution, or verified through an active learning process. More precisely, the SVO entries can be encapsulated in questions, in a process called language generation, in order to ask for their correctness [15]. d) Limitations: Although we assume the text is confined to a technical domain, the authors of the source might use partly figurative wording.…”
Section: Discussion
confidence: 99%
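The active-learning loop described in the excerpt above can be sketched in a few lines. This is a toy illustration only: the SVO triple, the question template, and the function name are invented here, not taken from the cited system.

```python
# Toy sketch of "language generation" for active learning:
# a learnt subject-verb-object (SVO) ontology entry is encapsulated
# in a yes/no question so a human can verify its correctness.
# All names and templates below are illustrative assumptions.

def svo_to_question(subject: str, verb: str, obj: str) -> str:
    """Wrap an SVO triple in a natural-language verification question."""
    return f"Is it correct that the {subject} can {verb} the {obj}?"

entry = ("robot", "grasp", "cup")  # a hypothetical learnt SVO entry
print(svo_to_question(*entry))
# Is it correct that the robot can grasp the cup?
```

The human's answer to such a question can then confirm or reject the ontology entry, closing the verification loop.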
“…Natural language is an ideal candidate, given that interfaces such as mouse-and-keyboard, touchscreens and programming languages are powerful but require extensive training for proper usage [22]. Multiple facets of language-based human-robot interaction have been studied in the literature, such as instruction understanding [23,24], motion plan generation [9,12,16,25], human-robot cooperation [26], semantic belief propagation [18,19], and visual language navigation [11,27]. Most recent work in the field has shifted from representing language in terms of classical grammatical structure towards data-driven techniques, due to their higher flexibility in knowledge representation [22].…”
Section: Related Work
confidence: 99%
“…These instructions are grounded using WordNet (45) and CYC (69) and are captured as a set of instructions in a knowledge base. Later work (70) discussed controlled natural language as a way to repair missing information through explicit clarification. Nyga et al (71) used a similar probabilistic model to utilize relational knowledge to fill in gaps for aspects of the language missing from the workspace.…”
Section: Formal Reasoning
confidence: 99%
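The grounding-and-gap-repair idea in the excerpt above can be illustrated with a minimal sketch. The lexicon and concept labels here are invented stand-ins; real systems ground words against resources such as WordNet or CYC.

```python
# Toy sketch of instruction grounding: map each instruction word to a
# known concept, and flag ungrounded words as gaps that would require
# explicit clarification (or relational inference) to repair.
# The lexicon below is a hypothetical stand-in for WordNet/CYC lookups.

LEXICON = {
    "cup": "container.artifact",
    "grasp": "motion.prehension",
    "table": "furniture.surface",
}

def ground(words):
    """Return (grounded concepts, ungrounded gap words) for an instruction."""
    grounded = {w: LEXICON[w] for w in words if w in LEXICON}
    gaps = [w for w in words if w not in LEXICON]
    return grounded, gaps

grounded, gaps = ground(["grasp", "cup", "mug"])
# "mug" ends up in gaps: a clarification question or relational
# knowledge would be needed to resolve it.
```

In the cited approaches, such gaps are what a controlled-natural-language clarification dialogue, or a probabilistic relational model, is used to fill in.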