2017
DOI: 10.20944/preprints201706.0033.v2
Preprint

UniMiB SHAR: A Dataset for Human Activity Recognition Using Acceleration Data from Smartphones

Abstract: Smartphones, smartwatches, fitness trackers, and ad-hoc wearable devices are being increasingly used to monitor human activities. Data acquired by the hosted sensors are usually processed by machine-learning-based algorithms to classify human activities. The success of those algorithms mostly depends on the availability of training (labeled) data that, if made publicly available, would allow researchers to make objective comparisons between techniques. Nowadays, publicly available data sets are few, often cont…
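The classification step the abstract refers to typically operates on short, fixed-length windows of triaxial acceleration, each labeled with an activity. The sketch below illustrates that kind of pipeline with an off-the-shelf scikit-learn classifier; the array shapes, number of classes, and random data are placeholder assumptions, not the actual UniMiB SHAR layout.

```python
# Minimal sketch of the pipeline the abstract describes: labeled windows of
# triaxial acceleration fed to a machine-learning classifier. All sizes and
# data here are hypothetical placeholders, not the UniMiB SHAR format.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n_windows, window_len = 1000, 151            # placeholder sizes
X = rng.normal(size=(n_windows, window_len, 3))  # (x, y, z) acceleration windows
y = rng.integers(0, 9, size=n_windows)           # e.g. 9 activity classes

# Flatten each window into a feature vector (hand-crafted or learned features
# are common alternatives to raw-signal features).
X_flat = X.reshape(n_windows, -1)

X_train, X_test, y_train, y_test = train_test_split(
    X_flat, y, test_size=0.3, stratify=y, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```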

Cited by 58 publications (29 citation statements)
References 34 publications
“…The sampling rate of signals varied significantly across phones, with values between 50 and 200 Hz. [41] contains triaxial accelerometer signals collected from a Samsung Galaxy Nexus smartphone at 50 Hz. Thirty subjects participated in the data collection process, forming a diverse sample of the population with different height, weight, age, and gender.…”
Section: Datasets (mentioning)
Confidence: 99%
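As a point of reference for the excerpt above, the sketch below shows how a continuous 50 Hz triaxial recording could be segmented into fixed-length windows before classification; the dummy signal, window length, and overlap are illustrative assumptions rather than values taken from the cited dataset.

```python
# Sketch of segmenting a continuous 50 Hz triaxial accelerometer recording
# into fixed-length windows. Signal, window length, and overlap are
# illustrative assumptions, not values from the cited dataset.
import numpy as np

def sliding_windows(signal: np.ndarray, window_len: int, step: int) -> np.ndarray:
    """Split an (n_samples, 3) triaxial signal into (n_windows, window_len, 3)."""
    starts = range(0, len(signal) - window_len + 1, step)
    return np.stack([signal[s:s + window_len] for s in starts])

fs = 50                                                        # 50 Hz, as in the excerpt
signal = np.random.default_rng(1).normal(size=(60 * fs, 3))    # 60 s dummy recording
windows = sliding_windows(signal, window_len=2 * fs, step=fs)  # 2 s windows, 50% overlap
print(windows.shape)                                           # (59, 100, 3)
```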
“…To choose the optimal datasets for this study, we considered the complexity and richness of the datasets. Based on the background of our research, we selected the OPPORTUNITY [51], PAMAP2 [52] and UniMiB-SHAR [53] benchmark datasets for our experiments.…”
Section: Methods (mentioning)
Confidence: 99%
“…The first group analyses the vital signs provided by wearable sensors (Banaee et al., 2013), such as electrocardiogram, oxygen saturation, heart rate, photoplethysmography, blood glucose, blood pressure and respiratory rate. The second group is focused on recognising and monitoring individual human activities (Liao et al., 2005; Luque et al., 2014; Vilarinho et al., 2015; Micucci et al., 2017; Kulev et al., 2016), which also overlaps with the fields of computer vision, machine learning and data mining. Our study in this paper is closer to the second group.…”
Section: Related Work (mentioning)
Confidence: 99%