2020
DOI: 10.1007/s11042-020-08789-7
Multi-modal egocentric activity recognition using multi-kernel learning

Cited by 14 publications (9 citation statements)
References 60 publications
“…Lu et al. [31] used LSTM to categorize activities utilizing four IMU sensors and egocentric video from the CMU Multimodal Activity (CMU-MMAC) database [32]. For activity recognition, [33] used visual and audio sensors. [8] described a methodology for detecting proprioceptive activities using egocentric data from IMUs.…”
Section: Related Work and Contributions
confidence: 99%
“…Feature-level fusion is applied in [14] by combining audio-visual features with multi-kernel learning and multi-kernel boosting. After extracting specific features for each modality, the proposed framework performs adaptive fusion by selecting and weighting the features and kernels jointly.…”
Section: Multimodal Learning
confidence: 99%
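The feature-level fusion described in the excerpt above combines per-modality kernels through learned weights. A minimal sketch of that idea, assuming RBF kernels per modality and a fixed convex weighting (the function names, modality data, and weights here are illustrative, not the paper's actual implementation):

```python
# Hypothetical sketch of feature-level fusion via multi-kernel learning:
# per-modality kernel matrices are combined as K = sum_m w_m * K_m with
# non-negative weights summing to 1. Data and weights are illustrative.
import numpy as np

def rbf_kernel(X, gamma=0.5):
    # Pairwise squared distances -> Gaussian (RBF) kernel matrix.
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def fuse_kernels(kernels, weights):
    # Convex combination of per-modality kernel matrices.
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()  # normalize so the weights sum to 1
    return sum(wi * Ki for wi, Ki in zip(w, kernels))

rng = np.random.default_rng(0)
X_visual = rng.normal(size=(6, 16))  # e.g. per-clip visual descriptors
X_audio = rng.normal(size=(6, 8))    # e.g. per-clip audio descriptors

K = fuse_kernels([rbf_kernel(X_visual), rbf_kernel(X_audio)], [0.7, 0.3])
print(K.shape)                        # (6, 6)
print(np.allclose(np.diag(K), 1.0))  # True: unit diagonal is preserved
```

In an actual MKL setup the weights would be optimized jointly with the classifier rather than fixed; the fused matrix K can then be passed to any kernel method (e.g. a precomputed-kernel SVM).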
“…Lu and Velipasalar [25] used LSTM to classify actions using four IMU sensors, corresponding to 36 components, together with egocentric video from the CMU Multimodal Activity (CMU-MMAC) database [41]. Visual and audio sensors were used by [5] for activity recognition. [1] presented a framework for recognizing proprioceptive activities using egocentric IMU data.…”
Section: Related Work
confidence: 99%