2019 20th Asia-Pacific Network Operations and Management Symposium (APNOMS)
DOI: 10.23919/apnoms.2019.8892906

A Comprehensive Multisensor Dataset Employing RGBD Camera, Inertial Sensor and Web Camera

Cited by 4 publications (1 citation statement); references 4 publications.
“…In [24], the start and the end of an action were synchronized by using the timestamps of the depth images to serve as references due to variations of the frame rate of the Kinect camera and the sampling rate of the wearable inertial sensor. Similarly, in [127], the start and the end of an action were synchronized by using the date/time of the Kinect camera (either RGB images, depth images or skeleton joints) as references. In [128], the data from a Kinect camera and a wearable inertial sensor were collected using C++ codes.…”
Section: A. Challenges, 1) Data Synchronization (citation type: mentioning)
Confidence: 99%
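The quoted passage describes synchronizing a wearable inertial sensor with a Kinect camera by using depth-image timestamps as the reference, since the two devices run at different rates. A minimal sketch of that idea is shown below; the function name and the nearest-timestamp matching strategy are illustrative assumptions, not code from the cited works:

```python
import bisect

def align_inertial_to_depth(depth_ts, inertial_ts, inertial_samples):
    """For each depth-frame timestamp, pick the inertial sample whose
    timestamp is closest (nearest-neighbour matching).

    depth_ts and inertial_ts are sorted lists of timestamps in seconds;
    inertial_samples is parallel to inertial_ts.
    """
    aligned = []
    for t in depth_ts:
        i = bisect.bisect_left(inertial_ts, t)
        # Candidates: the inertial sample just before and just after t.
        candidates = []
        if i > 0:
            candidates.append(i - 1)
        if i < len(inertial_ts):
            candidates.append(i)
        best = min(candidates, key=lambda j: abs(inertial_ts[j] - t))
        aligned.append(inertial_samples[best])
    return aligned

# Toy example: depth frames near 30 Hz, inertial samples near 50 Hz.
depth_ts = [0.000, 0.033, 0.066]
inertial_ts = [0.000, 0.020, 0.040, 0.060, 0.080]
samples = ["s0", "s1", "s2", "s3", "s4"]
print(align_inertial_to_depth(depth_ts, inertial_ts, samples))
```

Because both streams are sorted by time, the binary search keeps the matching cost low even for long recordings; in practice one would also trim both streams to the common start/end timestamps of the action, as the quoted papers describe.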