Cited by 6 publications (4 citation statements)
References 26 publications
“…As such, current trends include using a camera, either grayscale/color or combined with a depth sensor, to detect objects on the scene presented in front of the user and a HCI to make decisions on what and how to manipulate these objects [13], [14], [15], [16], [17], [18]. Recent work has promoted the use of eye tracking as a HCI as demonstrated by [12], [14], [19], [20], [21], [22], [23], [24], [25], [26], and [27], albeit with significant limitations that will be further discussed in the next section.…”
Section: Introduction (mentioning)
Confidence: 99%
“…Combinations of two or more paradigms have also been proposed to control a robotic arm such as P300/SSVEP [52,53], SSVEP/mVEP [54], SSVEP/MI/Electromyography (EMG) [55], SSVEP/Facial gestures [56], control of a robotic arm and a wheelchair by SSVEP/cervical movements [57], SSVEP/EOG [58], SSVEP/Eye [59], SSVEP/Computer Vision [60], SSVEP/MI [55], and SSVEP/P300/MI [61].…”
Section: Introduction (mentioning)
Confidence: 99%
“…Reference [ 26 ] presents a hybrid wearable interface using eye movement and mental focus to control a quadcopter in three-dimensional space. Reference [ 27 ] developed a hybrid BCI to manipulate a Jaco robotic arm using natural gestures and biopotentials. Reference [ 28 ] presented a semi-autonomous hybrid brain–machine interface using human intracranial EEG, eye tracing and computer vision to control an upper limb prosthetic robot.…”
Section: Introduction (mentioning)
Confidence: 99%