IECON 2018 - 44th Annual Conference of the IEEE Industrial Electronics Society
DOI: 10.1109/iecon.2018.8591362
Determination of Landmarks by Mobile Robot's Vision System Based on Detecting Abrupt Changes of Echo Signals Parameters

Cited by 4 publications (1 citation statement)
References 0 publications
“…In such conditions, it is difficult to distinguish the parameters of signals reflected from the background from those reflected from the landmark during active location with different wave types (electromagnetic, ultrasonic, etc.). In [11], a method is proposed for extending the operating conditions of radars, based on a developed system for detecting jumps in the amplitude of signals reflected from a landmark. In passive location using video cameras or night vision devices that receive waves of different ranges (optical, visible, and infrared), landmarks can be identified with high probability using the methods described in [12].…”
Section: Research Results and Discussion
confidence: 99%
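The cited approach rests on detecting jumps in the amplitude of signals reflected from a landmark. The sketch below illustrates the general idea of such jump detection with a simple sliding-window rule: each new amplitude sample is compared against the mean and spread of the preceding window, and a sample deviating by more than a chosen multiple of the local standard deviation is flagged as an abrupt change. The function name, window size, and threshold are illustrative assumptions, not the detection system described in the paper.

```python
import numpy as np

def detect_amplitude_jumps(amplitudes, window=8, threshold=3.0):
    """Return indices where the echo amplitude changes abruptly.

    Each sample is compared with the mean of the preceding `window`
    samples; a jump is reported when the deviation exceeds `threshold`
    times the local standard deviation. The rule and its parameters
    are illustrative choices, not the paper's method.
    """
    amplitudes = np.asarray(amplitudes, dtype=float)
    jumps = []
    for i in range(window, len(amplitudes)):
        ref = amplitudes[i - window:i]
        mu, sigma = ref.mean(), ref.std()
        if sigma == 0.0:
            sigma = 1e-9  # avoid division by zero on flat segments
        if abs(amplitudes[i] - mu) / sigma > threshold:
            jumps.append(i)
    return jumps

if __name__ == "__main__":
    # Synthetic echo profile: flat background followed by a step where a
    # hypothetical landmark produces a stronger reflection.
    rng = np.random.default_rng(0)
    echo = np.concatenate([rng.normal(1.0, 0.05, 100),
                           rng.normal(1.8, 0.05, 100)])
    print(detect_amplitude_jumps(echo))  # indices near the step at 100
```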