2021
DOI: 10.3390/s21144767

Inertial Measurement Unit Sensors in Assistive Technologies for Visually Impaired People, a Review

Abstract: A diverse array of assistive technologies has been developed to help Visually Impaired People (VIP) face many basic daily autonomy challenges. Inertial measurement unit sensors, in turn, have been used for navigation, guidance, and localization, and especially for full-body motion tracking, owing to their low cost and miniaturization, which have enabled the estimation of kinematic parameters and biomechanical analysis in different fields of application. The aim of this work was to present a comprehensi…
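The full-body motion tracking the abstract refers to typically rests on fusing gyroscope and accelerometer readings into an orientation estimate. A minimal sketch of one common approach, a complementary filter for pitch (the function, the 0.98 blend factor, and the sample data are illustrative assumptions, not taken from the review):

```python
import math

# Complementary filter: blend the gyro-integrated angle (accurate short
# term, but drifts) with the accelerometer tilt angle (noisy, but
# drift-free). ALPHA = 0.98 is an assumed, typical blend factor.
ALPHA = 0.98

def update_pitch(pitch_prev, gyro_rate, accel_x, accel_z, dt):
    """One filter step: gyro_rate in rad/s, accel components in g, dt in s."""
    pitch_gyro = pitch_prev + gyro_rate * dt   # short-term integration path
    pitch_acc = math.atan2(accel_x, accel_z)   # gravity-referenced tilt
    return ALPHA * pitch_gyro + (1 - ALPHA) * pitch_acc

# Static sensor tilted 10 degrees, zero gyro rate: the accelerometer term
# pulls the estimate toward the true angle over successive samples.
pitch = 0.0
for _ in range(500):
    pitch = update_pitch(pitch, 0.0,
                         math.sin(math.radians(10)),
                         math.cos(math.radians(10)), 0.01)
print(round(math.degrees(pitch), 1))  # → 10.0
```

In a real wearable system the same blend runs per axis at the IMU sample rate, and the drift-free term may come from a magnetometer for yaw.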

Cited by 14 publications (7 citation statements).
References 132 publications (245 reference statements).
“…As the fall is detected the scope increments including other needs to be treated using human activity recognition (HAR). Those oriented to sports ( Zhuang & Xue, 2019 ) and rehabilitation ( Panwar et al, 2019 ; Xing et al, 2020 ); as well as degenerative diseases that involve loss of mobility such as Parkinson’s disease and knee osteoarthritis ( Slade et al, 2021 ; Tan et al, 2021 ), and in assisted living which presents solutions for elderly people and also for people with visual impairments ( Reyes Leiva et al, 2021 ).…”
Section: Introduction
confidence: 99%
“…Therefore, assistive technology development and validation should involve insights from visually impaired individuals. The insufficient involvement of end-users is a limitation of the current literature; development processes must consider targeted users to ensure that usability and accuracy are not compromised [ 6 ]. The dataset created in this research could also be beneficial as an input for deep learning models that train quantitative parameters of walking, as evidenced by a recent review [ 18 ]; there is a need for research into gait in free-living conditions, i.e., at end-user residences.…”
Section: Discussion
confidence: 99%
“…Smartphone-based inertial sensing, utilizing deep learning methods, requires extensive data for training and may not be tailored specifically to the unique gait patterns of VIP [ 5 ]. Furthermore, a recent review highlighted the lack of inertial sensor systems designed for VIP and the scarcity of literature on IMU-based biomechanical analysis in VIP-oriented applications [ 6 ]. While wearable inertial sensors have gained attention in clinical research for the gait parameters of people with conditions such as stroke, Parkinson’s and multiple sclerosis, there is currently a dearth of studies focusing on VIP biomechanics [ 7 , 8 ].…”
Section: Introduction
confidence: 99%
“…They are capable of imparting huge amounts of data using minimal energy. Tracking the motion of a visually impaired person or the movement of humans within a small area can easily be implemented by pedestrian dead reckoning, which is an example of dead reckoning technology ( 11 ). Newborn systems based on indoor positioning have been seen to work when using aerial robots, mobile robots ( 12 ), and humanoid robots.…”
Section: Selection Of Relevant Techniques and Technology According To...
confidence: 99%
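The pedestrian dead reckoning mentioned in the statement above can be sketched as a step-and-heading position update: each detected step advances the estimate by a stride length along the current heading. The stride constant, function name, and sample headings below are illustrative assumptions, not from the cited work:

```python
import math

STRIDE_M = 0.7  # assumed average stride length in metres

def dead_reckon(step_headings):
    """Accumulate position from a sequence of per-step headings (radians).

    In practice, steps are detected from accelerometer-magnitude peaks and
    headings come from gyro/magnetometer fusion; here they are given inputs.
    """
    x = y = 0.0
    for heading in step_headings:
        x += STRIDE_M * math.cos(heading)
        y += STRIDE_M * math.sin(heading)
    return x, y

# Example: four steps heading east (0 rad), then four heading north (pi/2).
pos = dead_reckon([0.0] * 4 + [math.pi / 2] * 4)
print(pos)  # → (2.8, 2.8) up to floating-point rounding
```

The main design caveat, which motivates the indoor-positioning systems the quoted passage goes on to mention, is that heading and stride errors accumulate without an absolute position fix.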