2021
DOI: 10.48550/arxiv.2106.14405
Preprint

Habitat 2.0: Training Home Assistants to Rearrange their Habitat

Abstract: We introduce Habitat 2.0 (H2.0), a simulation platform for training virtual robots in interactive 3D environments and complex physics-enabled scenarios. We make comprehensive contributions to all levels of the embodied AI stack: data, simulation, and benchmark tasks. Specifically, we present: (i) ReplicaCAD: an artist-authored, annotated, reconfigurable 3D dataset of apartments (matching real spaces) with articulated objects (e.g. cabinets and drawers that can open/close); (ii) H2.0: a high-performance physics…
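To make the abstract concrete, below is a minimal sketch of instantiating the physics-enabled simulator through the habitat-sim Python API. It follows the public habitat-sim examples; the exact attribute names (e.g. scene_id, enable_physics) can differ between releases, and the ReplicaCAD scene path is a placeholder, not a real file shipped with this page.

```python
import habitat_sim

# Backend configuration: load a (placeholder) ReplicaCAD scene with physics enabled.
backend_cfg = habitat_sim.SimulatorConfiguration()
backend_cfg.scene_id = "path/to/replica_cad/scene.glb"  # placeholder path
backend_cfg.enable_physics = True

# One default agent with a single RGB camera sensor.
rgb_sensor = habitat_sim.CameraSensorSpec()
rgb_sensor.uuid = "rgb"
rgb_sensor.sensor_type = habitat_sim.SensorType.COLOR
agent_cfg = habitat_sim.agent.AgentConfiguration()
agent_cfg.sensor_specifications = [rgb_sensor]

# Create the simulator and step the default agent once.
cfg = habitat_sim.Configuration(backend_cfg, [agent_cfg])
sim = habitat_sim.Simulator(cfg)
observations = sim.step("move_forward")  # dict keyed by sensor uuid, e.g. "rgb"
sim.close()
```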

Cited by 19 publications (34 citation statements)
References 60 publications
“…Hypersim [16] and OpenRooms [17] develop simulators for indoor object detection. Robotic simulators such as AI-2THOR [18], Habitat [19,20], NVIDIA Isaac Sim [21], and iGibson [22] focus largely on embodied AI tasks. More generic tools for object detection dataset generation include BlenderProc [23], BlendTorch [24], NVISII [25], and the Unity Perception package [7].…”
Section: Related Work (mentioning)
confidence: 99%
“…These simulators support various conditions such as lighting and weather changes, moving objects such as pedestrians, and incident scenes. For indoor navigation, iGibson [15], [16], Sapien [17], AI2Thor [18], Virtual Home [19], ThreeDWorld [20], MINOS [21], House3D [22] and CHALET [23] use synthetic scenes, while reconstructed scenes are also available in iGibson, AI Habitat [24], [25] and MINOS [21]. Compared to datasets, simulation environments have the advantage of providing access to ground truth data, e.g.…”
Section: Related Work (mentioning)
confidence: 99%
“…Our environment models are both indoor and outdoor, containing complex topology to stress exploration and navigation algorithms. In addition to our environment models, we support the photorealistic house models from Matterport3D [3] and provide an interface to AI Habitat [24], [25]. Both are widely used by the robotics and computer vision communities.…”
Section: Related Work (mentioning)
confidence: 99%
“…Embodied artificial intelligence (EAI) has attracted significant attention, both in advanced deep learning models and algorithms [1,2,3,4] and the rapid development of simulated platforms [5,6,7,8,9]. Many open challenges [10,11,12,13] have been proposed to facilitate EAI research. A critical bottleneck in existing simulated platforms [10,12,8,5,14] is the limited number of indoor scenes that support vision-and-language navigation, object interaction, and complex household tasks.…”
Section: Introduction (mentioning)
confidence: 99%