2019 International Conference on Robotics and Automation (ICRA)
DOI: 10.1109/icra.2019.8793587
Unsupervised Learning of Assistive Camera Views by an Aerial Co-robot in Augmented Reality Multitasking Environments

Cited by 9 publications (12 citation statements)
References 21 publications
“…The authors in Bentz et al (2019) implemented a system in which an aerial collaborative robot feeds head-motion data from a human performing a multitasking job to an Expectation-Maximization algorithm that learns which environment views hold the highest visual interest for the user. The co-robot is then directed to capture these relevant views through its camera, and an AR HMD supplements the human’s field of view with them when needed.…”
Section: Discussion (mentioning; confidence: 99%)
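The Expectation-Maximization step this statement describes can be illustrated with a short, hypothetical sketch: a Gaussian mixture fit by EM to logged head orientations, with high-weight components standing in for the views of highest visual interest. All names and data below are illustrative, not from Bentz et al (2019).

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Simulated head orientations (yaw, pitch in radians): most samples cluster
# around a primary task region, a smaller share around a secondary monitor.
gazes = np.vstack([
    rng.normal([0.0, -0.1], 0.05, size=(200, 2)),
    rng.normal([1.2, 0.2], 0.08, size=(80, 2)),
])

# EM fits the mixture; high-weight components mark the directions the user
# attends to most, which the co-robot could then be sent to observe.
gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0).fit(gazes)
for weight, mean in sorted(zip(gmm.weights_, gmm.means_), key=lambda t: -t[0]):
    print(f"interest weight {weight:.2f} at yaw={mean[0]:+.2f}, pitch={mean[1]:+.2f}")
```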
“…Multitasking is improved in Bentz et al (2019), where data from an HMD are fit to a model that identifies views of interest to the human, directs an aerial co-robot to capture these views, and augments them onto his/her display. The input data are head poses collected through a VICON motion capture system.…”
Section: Discussion (mentioning; confidence: 99%)
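As a companion sketch, head poses like those a VICON system reports (a position plus an orientation quaternion) can be reduced to a forward gaze ray before any interest model is fit. The helper below is a hypothetical illustration, assuming the head frame’s +x axis points forward; it is not the authors’ code.

```python
import numpy as np

def gaze_ray(position: np.ndarray, quat_wxyz: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """Return (origin, direction) of the head's forward axis in the world frame."""
    w, x, y, z = quat_wxyz / np.linalg.norm(quat_wxyz)
    # First column of the quaternion's rotation matrix = rotated +x (forward) axis.
    forward = np.array([
        1 - 2 * (y * y + z * z),
        2 * (x * y + z * w),
        2 * (x * z - y * w),
    ])
    return position, forward

origin, direction = gaze_ray(np.array([0.0, 0.0, 1.7]), np.array([1.0, 0.0, 0.0, 0.0]))
print(origin, direction)  # identity quaternion -> looking along +x
```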
“…Internal Status [479], Robot’s Capability [66], Object Status [21], Sensor/Camera Data [377], Plan and Target [136], Simulation [76], Interactive Content [386], Virtual Background [8]…” [the accompanying lists of reference numbers are truncated in the source]
Section: Internal Information / External Information / Plan and Activity … (mentioning; confidence: 99%)
“…The most common approaches for technical evaluation are measuring latency [48,51,59,69,318,495], accuracy of tracking [15,58,59,64,486], and success rate [262,314]. We also found works that evaluate their system performance by comparison with other systems, for example comparing tracking algorithms with other approaches [38,59,63,132,406].…”
Section: Evaluation Strategies (mentioning; confidence: 99%)
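A minimal sketch of the three technical metrics this statement names: latency, tracking accuracy, and success rate. All data, thresholds, and names below are invented for illustration.

```python
import numpy as np

timestamps_sent = np.array([0.00, 0.10, 0.20])   # when frames were captured (s)
timestamps_shown = np.array([0.05, 0.16, 0.27])  # when they appeared on the HMD (s)
latency_ms = 1000 * np.mean(timestamps_shown - timestamps_sent)

tracked = np.array([[0.0, 0.1], [1.0, 0.9]])     # estimated 2-D positions (m)
truth = np.array([[0.0, 0.0], [1.0, 1.0]])       # ground-truth positions (m)
rmse = np.sqrt(np.mean(np.sum((tracked - truth) ** 2, axis=1)))

attempts, successes = 20, 17                     # e.g. completed view captures
success_rate = successes / attempts

print(f"latency {latency_ms:.0f} ms, tracking RMSE {rmse:.3f} m, success {success_rate:.0%}")
```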
“…Zhou et al designed a flying drone that can track human motion using a normal color camera [121]. Bentz et al presented an assistive aerial robot that observes the regions most interesting to the human and broadcasts these views to the human’s augmented reality display [7]. This reduced the human’s head motions and improved reaction time.…”
Section: Human Following UAVs (mentioning; confidence: 99%)