2021
DOI: 10.48550/arxiv.2111.14843
Preprint

Catch Me If You Hear Me: Audio-Visual Navigation in Complex Unmapped Environments with Moving Sounds

Abstract: Audio-visual navigation combines sight and hearing to navigate to a sound-emitting source in an unmapped environment. While recent approaches have demonstrated the benefits of audio input to detect and find the goal, they focus on clean and static sound sources and struggle to generalize to unheard sounds. In this work, we propose a novel dynamic audio-visual navigation benchmark which requires the agent to catch a moving sound source in an environment with noisy and distracting sounds. We introduce a reinforcement le…

Cited by 2 publications (2 citation statements)
References 37 publications (86 reference statements)
“…In PointGoal navigation [22,4], the agent at each step receives the displacement vector to the goal that it has to reach. Whereas, in AudioGoal navigation [6,27], the agent at each step receives an audio signal emitted by a target object. Conversely, in ObjectGoal navigation [29,5,20,10], the agent receives an object category that it has to navigate to.…”
Section: Related Work (mentioning, confidence: 99%)
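For readers unfamiliar with these task families, the following is a minimal illustrative sketch (not code from the cited papers) of how the per-step goal observation differs across PointGoal, AudioGoal, and ObjectGoal navigation; the class and field names are assumptions chosen for clarity, not the API of any particular simulator.

from dataclasses import dataclass
import numpy as np

@dataclass
class PointGoal:
    # Per-step displacement vector from the agent to the goal location.
    displacement: np.ndarray  # e.g. shape (2,), in metres

@dataclass
class AudioGoal:
    # Per-step audio signal emitted by the target object,
    # commonly encoded as a binaural spectrogram.
    spectrogram: np.ndarray  # e.g. shape (channels, freq_bins, time_frames)

@dataclass
class ObjectGoal:
    # Object category the agent must navigate to; fixed for the episode.
    category: str  # e.g. "chair"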
“…While recent progress in control and perception has propelled the capabilities of robotic platforms to autonomously operate in unknown and unstructured environments [1]-[4], this has largely focused on pure navigation tasks [5], [6]. In this work, we focus on autonomous mobile manipulation which combines the difficulties of navigating unstructured, human-centered environments with the complexity of jointly controlling the base and arm.…”
Section: Introduction (mentioning, confidence: 99%)