2013
DOI: 10.1109/tbme.2013.2262455
Predicting Targets of Human Reaching Motions Using Different Sensing Technologies

Abstract: Rapid recognition of voluntary motions is crucial in human-computer interaction, but few studies compare the predictive abilities of different sensing technologies. This paper thus compares performances of different technologies when predicting targets of human reaching motions: electroencephalography (EEG), electrooculography, camera-based eye tracking, electromyography (EMG), hand position, and the user's preferences. Supervised machine learning is used to make predictions at different points in time (before…
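The abstract does not spell out the pipeline, but a common way to frame such a comparison is to train a supervised classifier on features extracted up to a given time offset before the movement ends, and to report accuracy as a function of that offset and of the sensing modality. The sketch below illustrates only this framing; the feature extractor, the LDA classifier, and all sizes are assumptions, not the paper's actual method.

```python
# Minimal sketch (not the authors' pipeline): train a classifier on multimodal
# features extracted up to a given offset before movement end, and report
# cross-validated accuracy per offset. Names, shapes and the LDA choice are
# illustrative assumptions.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_targets = 200, 4

def features_at(offset_s):
    """Hypothetical stand-in for EEG/EOG/gaze/EMG/hand-position features
    extracted from each trial up to `offset_s` seconds before movement end."""
    return rng.normal(size=(n_trials, 32))

y = rng.integers(0, n_targets, size=n_trials)  # target label of each reach

# Evaluate how early the target can be predicted, one classifier per offset.
for offset_s in (1.0, 0.5, 0.25, 0.0):
    X = features_at(offset_s)
    acc = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=5).mean()
    print(f"{offset_s:.2f} s before movement end: accuracy {acc:.2f}")
```

With real recordings the interesting output is the accuracy-versus-offset curve for each modality, which is the comparison the abstract describes.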

Cited by 46 publications (46 citation statements)
References 23 publications
“…It was shown in consumer and cognitive science research that a person's gaze can be highly predictive of future actions in selection tasks. Various researchers found that if people are asked to make a decision between a number of options, there is a significant gaze bias towards the chosen option for a short time period before the decision is announced [3,17,25,26]. It is also known that the gaze direction leads the walking direction by a few seconds [6] and the same holds true for driving [8,12].…”
Section: Related Work (mentioning)
confidence: 99%
“…This can be achieved by adapting the control of the device with respect to the patient’s intention. Movement intention of the patient can be detected from her/his brain activity, e.g., the electroencephalogram (EEG), as shown in healthy subjects [8], [10]–[13] as well as in stroke patients [14], and by the analysis of gaze direction and fixation [15], or by the analysis of the electromyogram (EMG) [16]. EMG activity is quite often solely used to trigger an orthosis or a prosthesis [17]–[19].…”
Section: Introduction (mentioning)
confidence: 99%
“…Assistive technology devices that are supported by integrated analysis of physiological or technical data to enable the detection of movement intention can support a patient for self-initiated movements. By analyzing the context of behavior, even complex interaction, like grasping a certain object [15], can be triggered and executed by the device (Figure 1). What sources of physiological data should be combined depends on the requirements, e.g., the kind of disability and neuromuscular disorder [25] as well as the state and progress of the patient in rehabilitation.…”
Section: Introduction (mentioning)
confidence: 99%
“…Brain-machine interfaces (BMIs) have the potential to provide more natural control (Collinger et al, 2012; Ethier et al, 2012; Hochberg et al, 2012), although most BMIs that have successfully controlled reach involve invasive cortical recordings, a technology that is currently inaccessible in most clinical situations. Combining information from disparate sources has been proposed as a solution when there are few signals accessible (Batista et al, 2008; Pfurtscheller et al, 2010; Leeb et al, 2011; Corbett et al, 2013a; Novak et al, 2013; Kirchner et al, 2014). As the set of usable signals from each individual may be different, it is important to be able to take advantage of all the useful channels available.…”
Section: Introduction (mentioning)
confidence: 99%
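The statement above argues for combining information from disparate sources and exploiting whatever channels a given individual can supply. One simple way to realize that idea, not necessarily what the cited works do, is late fusion: train a classifier per channel and average the predicted probabilities over the channels that are actually available. A hypothetical sketch:

```python
# Minimal sketch of the "use whatever channels are available" idea: per-channel
# classifiers are trained independently and their predicted probabilities are
# averaged over the subset of channels a given user provides. Channel names,
# feature sizes and the fusion rule are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n_trials, n_targets = 300, 4
channels = {"eeg": 16, "emg": 8, "gaze": 2}  # hypothetical feature dimensions

X = {name: rng.normal(size=(n_trials, d)) for name, d in channels.items()}
y = rng.integers(0, n_targets, size=n_trials)

# One classifier per channel, trained only on that channel's features.
models = {name: LogisticRegression(max_iter=1000).fit(X[name], y)
          for name in channels}

def fused_prediction(sample, available):
    """Average class probabilities over the channels this user actually has."""
    probs = [models[name].predict_proba(sample[name].reshape(1, -1))
             for name in available]
    return int(np.mean(probs, axis=0).argmax())

# Example: a user who provides only EMG and gaze signals.
sample = {name: X[name][0] for name in channels}
print(fused_prediction(sample, available=["emg", "gaze"]))
```

The averaging rule is the simplest choice; weighted fusion or a meta-classifier over the per-channel outputs are common alternatives when some channels are known to be more informative.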