ExoNet Database: Wearable Camera Images of Human Locomotion Environments
2020 | DOI: 10.3389/frobt.2020.562061
Cited by 30 publications (27 citation statements)
References 34 publications (92 reference statements)
“…However, small-scale and private training datasets have hindered the development of image classification algorithms for environment recognition [20]. To address these limitations, we developed ExoNet, the largest and most diverse open-source dataset of wearable camera images of walking environments [11]. Unparalleled in both scale and diversity, ExoNet contains over 5.6 million images of indoor and outdoor real-world environments, of which ~923,000 images were annotated using a 12-class hierarchical labelling architecture; these design features are important since deep learning requires significant and diverse training data.…”
Section: Discussion (mentioning)
confidence: 99%
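The 12-class hierarchical labelling mentioned above maps naturally onto a directory-per-class layout for training image classifiers. The sketch below is a minimal illustration only, assuming hypothetical parent/child class codes and a local folder structure (ExoNet/annotated/<class>/), none of which are the official ExoNet label set or file layout; it simply shows how the annotated frames could be indexed by parent and child label.

```python
from collections import Counter
from pathlib import Path

# Hypothetical 12-class hierarchy (3 parent environments x 4 child classes).
# Illustrative codes only; NOT the official ExoNet labelling architecture.
HIERARCHY = {
    "level-ground": ["LG-solid", "LG-soft", "LG-obstacle", "LG-other"],
    "stairs":       ["ST-ascent", "ST-descent", "ST-transition", "ST-other"],
    "ramps":        ["RA-ascent", "RA-descent", "RA-transition", "RA-other"],
}

def index_dataset(root: str) -> list[tuple[Path, str, str]]:
    """Return (image path, parent class, child class) triples, assuming a
    layout of <root>/<child class>/<frame>.jpg for the annotated images."""
    root_path = Path(root)
    samples = []
    for parent, children in HIERARCHY.items():
        for child in children:
            for img in sorted((root_path / child).glob("*.jpg")):
                samples.append((img, parent, child))
    return samples

if __name__ == "__main__":
    samples = index_dataset("ExoNet/annotated")  # hypothetical local path
    per_class = Counter(child for _, _, child in samples)
    print(f"{len(samples)} annotated frames across {len(per_class)} child classes")
```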
“…Data were collected throughout the summer, fall, and winter seasons to incorporate different weathered surfaces like snow, grass, and multicolored leaves. The image database, which we named ExoNet, was deposited in the IEEE DataPort repository and is now publicly available for download [11]. The file size of the uncompressed videos is ~140 GB.…”
Section: A. Experimental Dataset (mentioning)
confidence: 99%
“…Many researchers have likewise used a wearable RGB camera for passive environment sensing (Da Silva et al., 2020; Diaz et al., 2018; Khademi and Simon, 2019; Krausz and Hargrove, 2015; Laschowski et al., 2019b; 2020b; Novo-Torres et al., 2019; Zhong et al., 2020). Although multi-camera systems could be used to capture 3D information (i.e., comparable to how the human visual system uses triangulation for depth perception) (Patla, 1997), each pixel in an RGB image contains only light intensity information.…”
Section: Discussion (mentioning)
confidence: 99%
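On the triangulation point: with a rectified two-camera (stereo) pair, depth follows from the horizontal disparity of a matched pixel via the standard pinhole relation Z = f * B / d. The snippet below is a generic illustration of that relationship, not code from any of the cited systems, and the focal length, baseline, and disparity values are made-up numbers.

```python
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Pinhole stereo model: depth Z = f * B / d for a rectified camera pair.

    focal_px     -- focal length in pixels
    baseline_m   -- distance between the two camera centres, in metres
    disparity_px -- horizontal shift of a matched point between images, in pixels
    """
    if disparity_px <= 0:
        raise ValueError("Disparity must be positive for a point in front of both cameras.")
    return focal_px * baseline_m / disparity_px

# Illustrative numbers: a 700 px focal length, 10 cm baseline, and 35 px
# disparity place the matched point about 2 m from the cameras.
print(depth_from_disparity(700.0, 0.10, 35.0))  # -> 2.0
```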
“…Compared to radar and laser rangefinders, cameras can provide more detailed information about the field-of-view and detect physical obstacles and terrain changes in peripheral locations (example shown in Figure 3). Most environment recognition systems have used RGB cameras (Da Silva et al., 2020; Diaz et al., 2018; Khademi and Simon, 2019; Krausz and Hargrove, 2015; Laschowski et al., 2019b; 2020b; 2021b; Novo-Torres et al., 2019; Zhong et al., 2020) and/or 3D depth cameras (Hu et al., 2018; Krausz et al., 2015; Krausz and Hargrove, 2021; Massalin et al., 2018; Varol and Massalin, 2016; Zhang et al., 2019b; 2019c; 2019d; 2020) mounted on the chest (Krausz et al., 2015; Laschowski et al., 2019b; 2020b; 2021b), waist (Khademi and Simon, 2019; Krausz et al., 2019; Krausz and Hargrove, 2021; Zhang et al., 2019d), or lower-limbs (Da Silva et al., 2020; Diaz et al., 2018; Massalin et al., 2018; Varol and Massalin, 2016; Zhang et al., 2019b; 2019c; 2020; Zhong et al., 2020) (see Table 1). Few studies have adopted head-mounted cameras for biomimicry (Novo-Torres et al., 2019; Zhong et al., 2020).…”
Section: Literature Review (mentioning)
confidence: 99%