2022 IEEE-EMBS International Conference on Wearable and Implantable Body Sensor Networks (BSN) 2022
DOI: 10.1109/bsn56160.2022.9928466
Video2IMU: Realistic IMU features and signals from videos

Abstract: Human Activity Recognition (HAR) from wearable sensor data identifies movements or activities in unconstrained environments. HAR is a challenging problem as it presents great variability across subjects. Obtaining large amounts of labelled data is not straightforward, since wearable sensor signals are not easy to label upon simple human inspection. In our work, we propose the use of neural networks for the generation of realistic signals and features using human activity monocular videos. We show how these gen…

Cited by 4 publications (3 citation statements)
References 14 publications
“…A similar approach was chosen by Lämsä et al. (2022), who used neural networks with VIDEO2IMU to generate IMU signals and features from monocular videos of human activities. Their results suggested that HAR systems trained using virtual sensor data could perform considerably better than baseline models trained using only physical IMU data (Kwon et al., 2021; Lämsä et al., 2022). Esteban et al. (2017) deployed GANs to synthesise respiratory data, where a generator model was used to augment data and a discriminator attempted to distinguish between real and artificial data.…”
Section: Data Simulation and Synthesis
Citation type: mentioning (confidence: 99%)
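The generator/discriminator roles described in the statement above can be sketched minimally. Everything here (the linear generator, logistic discriminator, and random stand-in "signal" windows) is a hypothetical toy illustration of the adversarial setup, not the architecture used by Esteban et al. (2017):

```python
import numpy as np

rng = np.random.default_rng(0)

def generator(z, W):
    """Toy generator: linear map from latent noise to a synthetic signal window."""
    return z @ W

def discriminator(x, w, b):
    """Toy discriminator: logistic score, the probability that x is real."""
    return 1.0 / (1.0 + np.exp(-(x @ w + b)))

latent_dim, window = 4, 16
W = rng.normal(size=(latent_dim, window)) * 0.1   # generator weights
w = rng.normal(size=window) * 0.1                 # discriminator weights
b = 0.0

real = rng.normal(loc=1.0, size=(8, window))             # stand-in "real" windows
fake = generator(rng.normal(size=(8, latent_dim)), W)    # synthetic windows

# Adversarial objective: the discriminator is trained to maximise the
# log-likelihood of this classification, the generator to defeat it.
d_loss = -np.mean(np.log(discriminator(real, w, b) + 1e-9)
                  + np.log(1.0 - discriminator(fake, w, b) + 1e-9))
print(d_loss > 0)  # → True (both log terms are strictly negative)
```

In a full GAN, both sets of weights would be updated in alternation by gradient steps on this objective; only the objective itself is shown here.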
“…Kwon et al. (2020, 2021) presented IMUTube, an automated processing pipeline for human activity recognition (HAR) that integrates existing computer vision and signal processing techniques to convert video of human activity into virtual streams of IMU data. A similar approach was chosen by Lämsä et al. (2022), who used neural networks with VIDEO2IMU to generate IMU signals and features from monocular videos of human activities. Their results suggested that HAR systems trained using virtual sensor data could perform considerably better than baseline models trained using only physical IMU data (Kwon et al., 2021; Lämsä et al., 2022).…”
Section: Related Work
Citation type: mentioning (confidence: 99%)
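A common ingredient of such video-to-virtual-IMU pipelines is twice differentiating a tracked joint trajectory to approximate accelerometer readings. The sketch below shows that idea with finite differences; the function name, parameters, and straight-line test trajectory are illustrative assumptions, not code from Video2IMU or IMUTube:

```python
import numpy as np

def virtual_accel(keypoints_xy, fps=30.0):
    """Approximate a virtual accelerometer signal from a 2D keypoint
    trajectory by twice differentiating position with respect to time.

    keypoints_xy: array of shape (T, 2), per-frame coordinates of one joint.
    Returns an array of shape (T - 2, 2): per-axis acceleration estimates.
    """
    dt = 1.0 / fps
    vel = np.diff(keypoints_xy, axis=0) / dt   # (T-1, 2) velocity estimates
    acc = np.diff(vel, axis=0) / dt            # (T-2, 2) acceleration estimates
    return acc

# Sanity check: constant-velocity motion yields zero acceleration.
t = np.arange(10)
traj = np.stack([t * 2.0, t * 3.0], axis=1)    # straight-line trajectory
print(np.allclose(virtual_accel(traj), 0.0))   # → True
```

Real pipelines additionally handle camera motion, scale ambiguity, gravity, and sensor noise, which is where the learned models discussed above come in.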
“…However, none of these studies investigated the fusion of HPE and IMU data for identification and tracking of the 3D human skeleton associated with the individual wearing the IMUs in real and multi-person scenarios. Other works combine video and IMUs to generate inertial data from human poses, with the aim of mitigating the lack of labeled training data [9], [10].…”
Section: Introduction
Citation type: mentioning (confidence: 99%)