2012
DOI: 10.1016/j.irbm.2012.01.009
Navigation and space perception assistance for the visually impaired: The NAVIG project

Cited by 61 publications (38 citation statements)
References 23 publications
“…The use of multi-sensory approaches to extract information about the surrounding environment and support blind mobility [3], [4] is limited to addressing user location, navigation and partial environment recognition [5]-[8]. Moreover, most of these approaches rely on computer vision systems to extract surrounding information [1], [2].…”
Section: Discussion (mentioning; confidence: 99%)
“…Recently, computer vision systems have been developed to extract information about the surrounding environment and provide guidance for blind users [1], [2]. Multisensory approaches are also used to provide orientation and mobility to the blind [3], [4], addressing user location, navigation and environment recognition problems in an integrated solution [5]-[8]. Multi-sensory technology and data fusion are used for the same purpose in other contexts, for instance in autonomous driving [9], [10].…”
Section: Introduction (mentioning; confidence: 99%)
“…Kammoun et al. [27] proposed a project called NAVIG. Their goal was to design a system to assist blind people in navigating indoor and outdoor environments through micro-navigation (sensing immediate environments) and macro-navigation (reaching remote destinations) functions, by using the traditional navigation tools.…”
Section: Related Work (mentioning; confidence: 99%)
“…The 2D position of the person was detected in the second layer using corner features extracted from the laser-scan data, linear velocity measurements from the pedometer, and a filtered version of the cane's yaw. Kammoun et al. [30] utilized micro-navigation (sensing immediate environments) and macro-navigation (reaching remote destinations) functions to design a system for blind people in indoor and outdoor environments. This system started searching for any object requested by the user in the captured images.…”
Section: Introduction (mentioning; confidence: 99%)
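The statement above describes fusing pedometer velocity and a filtered cane yaw (dead reckoning) with position fixes derived from laser-scan corner features. As a rough illustration only, not the cited system's actual implementation, a minimal predict-then-correct sketch in Python; the function names and the fixed blending gain are assumptions for the example:

```python
import math

def dead_reckon(pos, speed, yaw, dt):
    """Predict the 2D position from pedometer speed (m/s) and cane yaw (rad)."""
    x, y = pos
    return (x + speed * math.cos(yaw) * dt,
            y + speed * math.sin(yaw) * dt)

def fuse(pred, meas, gain=0.5):
    """Blend the dead-reckoned position with a laser-derived position fix.
    A gain near 1 trusts the laser fix; a gain near 0 trusts odometry."""
    return tuple(p + gain * (m - p) for p, m in zip(pred, meas))

# Walk at 1 m/s due east for one second, then correct with a laser fix.
pred = dead_reckon((0.0, 0.0), speed=1.0, yaw=0.0, dt=1.0)  # (1.0, 0.0)
est = fuse(pred, meas=(1.2, 0.1), gain=0.5)                 # approximately (1.1, 0.05)
```

A full system of this kind would typically replace the fixed gain with a Kalman filter whose gain reflects the relative noise of the pedometer and the laser-scan features.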