2022
DOI: 10.3390/app12094374

Computer Vision-Based Adaptive Semi-Autonomous Control of an Upper Limb Exoskeleton for Individuals with Tetraplegia

Abstract: We propose the use of computer vision for adaptive semi-autonomous control of an upper limb exoskeleton for assisting users with severe tetraplegia to increase independence and quality of life. A tongue-based interface was used together with the semi-autonomous control such that individuals with complete tetraplegia were able to use it despite being paralyzed from the neck down. The semi-autonomous control uses computer vision to detect nearby objects and estimate how to grasp them to assist the user in contro…

Cited by 8 publications (12 citation statements)
References: 44 publications
“…The interface used is multimodal, allowing the user to control the end effector via a virtual joy-stick-like environment and switch modes to semi-autonomously grasp objects. Grasping intent prediction is accomplished by identifying the closest object to the palm-side of the gripper [5]. A grasping point is then determined by approximating the shape of the object with a virtual cylinder.…”
Section: Setup
Mentioning confidence: 99%
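
The grasp-intent and grasp-point estimation quoted above (closest object to the palm side of the gripper, object approximated by a virtual cylinder) can be sketched in a few lines. The sketch below is an illustration under assumptions, not the authors' implementation: the object list format, the palm-frame vectors, and the vertical-cylinder fit are all placeholders.

```python
# Hedged sketch (not the authors' code): pick the detected object closest to the
# palm side of the gripper, then approximate it with a vertical cylinder whose
# centre, radius and height serve as a simple grasp descriptor.
import numpy as np

def select_target(object_centroids, palm_position, palm_normal):
    """Return the index of the object centroid closest to the palm side.

    object_centroids: (N, 3) centroids of detected objects (camera frame).
    palm_position:    (3,) position of the gripper palm.
    palm_normal:      (3,) unit vector pointing out of the palm.
    Only objects with a positive projection onto palm_normal (i.e. in front
    of the palm) are considered.
    """
    offsets = object_centroids - palm_position
    in_front = offsets @ palm_normal > 0.0
    if not np.any(in_front):
        return None
    distances = np.linalg.norm(offsets, axis=1)
    distances[~in_front] = np.inf
    return int(np.argmin(distances))

def cylinder_grasp(points):
    """Approximate an object point cloud with an upright cylinder (assumption)
    and return (centre, radius, height) as a candidate grasp descriptor."""
    centre = points.mean(axis=0)
    radial = np.linalg.norm(points[:, :2] - centre[:2], axis=1)
    radius = float(np.percentile(radial, 90))   # robust radius estimate
    height = float(points[:, 2].max() - points[:, 2].min())
    return centre, radius, height
```

In practice the cylinder radius would be compared against the gripper aperture and the cylinder axis would fix the approach orientation; both steps are omitted here for brevity.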
“…Developing such an interface for these robot manipulators is a complex task [4], and therefore, supporting the control with autonomy of the robot to reduce the complexity is of great importance. While grasping points for grasping tasks of robotic arms can be synthesized well with modern computer vision methods [5], the grasp might still be unstable, causing objects to slip out of the gripper during the pickup or midair during an executed task. Due to the nature of the setup and the user group, there is little to no chance for them to react in time to a slip event.…”
Section: Introduction
Mentioning confidence: 99%
“…The software packages used to extract the target object from the image and depth feed were OpenCV for thresholding and reading the incoming camera data and the Point Cloud Library [61] was used to perform RANSAC [62]. The computer vision algorithm and control method are described in greater detail in Bengtson et al [63] (referred to as the "Fixed Semi-autonomous Control" scheme). Once the position and orientation of the object were known, the trajectory could be calculated and the motion to go from the current pose of the exoskeleton to a grasp pose around the object of interest could be performed.…”
Section: Intelligent Control
Mentioning confidence: 99%
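
The pipeline quoted above can be illustrated with a short sketch: threshold the colour image to isolate the target, back-project the matching depth pixels into a point cloud, remove the support surface with a RANSAC plane fit, and reduce the remaining points to a grasp target. The citing work uses OpenCV and the Point Cloud Library; the sketch below substitutes Open3D for PCL and uses placeholder camera intrinsics and an assumed HSV colour range, so it is an approximation of the idea rather than the described implementation.

```python
# Hedged sketch of the quoted pipeline (placeholder parameters throughout):
# colour thresholding -> depth back-projection -> RANSAC plane removal -> target.
import cv2
import numpy as np
import open3d as o3d  # stand-in for the Point Cloud Library used in the paper

FX, FY, CX, CY = 615.0, 615.0, 320.0, 240.0   # placeholder camera intrinsics

def locate_object(color_bgr, depth_m,
                  hsv_lo=(0, 120, 70), hsv_hi=(10, 255, 255)):
    # 1) Colour thresholding (OpenCV) to mask the target object.
    hsv = cv2.cvtColor(color_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(hsv_lo), np.array(hsv_hi))

    # 2) Back-project the masked depth pixels into 3-D points.
    v, u = np.nonzero(mask)
    z = depth_m[v, u]
    u, v, z = u[z > 0], v[z > 0], z[z > 0]
    points = np.stack([(u - CX) * z / FX, (v - CY) * z / FY, z], axis=1)
    if len(points) < 10:
        return None

    # 3) RANSAC plane fit to drop the support surface, keep the object points.
    cloud = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(points))
    _, plane_idx = cloud.segment_plane(distance_threshold=0.01,
                                       ransac_n=3, num_iterations=200)
    obj = cloud.select_by_index(plane_idx, invert=True)

    # 4) Centroid of the remaining points as a simple stand-in for the grasp
    #    pose; the cited work derives position and orientation for a full grasp.
    return np.asarray(obj.points).mean(axis=0) if len(obj.points) else None
```

From the object position and orientation, the trajectory from the current exoskeleton pose to a grasp pose around the object can then be planned, as described in the quotation.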