Proceedings of the 5th ACM on International Conference on Multimedia Retrieval 2015
DOI: 10.1145/2671188.2749361
Using Viewer's Facial Expression and Heart Rate for Sports Video Highlights Detection

Abstract: Viewer interest, evoked by video content, can potentially identify the highlights of a video. This paper explores the use of viewers' facial expressions (FE) and heart rate (HR), captured with a camera and a non-strapped sensor, to identify interesting video segments. Data from ten subjects watching three videos showed that these signals are viewer-dependent and not synchronized with the video content. To address this issue, new algorithms are proposed to effectively combine FE and HR signals for identifyi…

Cited by 9 publications (4 citation statements)
References 18 publications
“…We found that there is no universal threshold value for classifying heart rate; researchers design their own standards based on their study. Considering recommendations from previous studies, we selected 45 to 130 beats per minute (bpm) as an acceptable range for our experimental simulation (Chakraborty et al., 2015; Chowdhury et al., 2017). We tested the performances of linear, Kalman, spline and Stineman interpolation.…”
Section: Initial Methodology
Confidence: 99%
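The range-based cleaning described above can be sketched as follows. This is a minimal illustration, not the cited authors' code: the function name and the choice of linear interpolation for out-of-range samples are assumptions for the example.

```python
import numpy as np

def clean_heart_rate(hr_bpm, lo=45.0, hi=130.0):
    """Mark heart-rate samples outside [lo, hi] bpm as invalid and fill
    them by linear interpolation over the remaining valid samples.

    Illustrative sketch only; the 45-130 bpm range follows the quoted study.
    """
    hr = np.asarray(hr_bpm, dtype=float)
    valid = (hr >= lo) & (hr <= hi)
    if not valid.any():
        raise ValueError("no heart-rate samples within the accepted range")
    idx = np.arange(len(hr))
    # np.interp linearly interpolates invalid positions between valid
    # neighbours (and holds the edge values at the boundaries).
    return np.interp(idx, idx[valid], hr[valid])
```

The same skeleton extends to the other interpolators the authors compared (spline, Stineman, Kalman smoothing) by swapping the fill step.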
“…color, texture, and motion), or emotional responses from viewers while watching data stimuli (e.g. facial expression [8], EEG signal [9], heart rate [10], and multiple physiological signals [11]). In this paper, we focus on approaches belonging to the first way, which represents affective dimensions using features extracted from the data.…”
Section: B. AVCA Approaches Using Affective Dimensions
Confidence: 99%
“…We computed some overall time-domain features, such as mean and standard deviation, for each activity using all heart-rate samples for that activity. Adopting a similar approach [14], for each activity our work also calculated several time-varying features of heart rate, such as derivatives, gradient, energy, and variance. The obtained heart-rate data was timestamped in beats per minute, and a sliding window (with a span of 9 seconds) was used to compute the above temporal features.…”
Section: B. Feature Extraction
Confidence: 99%
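The sliding-window feature extraction quoted above can be sketched as below. This is a hedged illustration, not the cited implementation: the function name, a 1 Hz sampling rate, and the exact feature definitions (windowed mean, standard deviation, variance, mean first derivative, mean squared amplitude as "energy") are assumptions for the example.

```python
import numpy as np

def windowed_hr_features(hr_bpm, fs_hz=1.0, window_s=9.0):
    """Compute simple time-varying heart-rate features over a sliding
    window: mean, std, variance, mean first derivative, and energy.

    Illustrative sketch; window_s=9.0 matches the 9-second span in the
    quoted study, fs_hz is an assumed sampling rate.
    """
    hr = np.asarray(hr_bpm, dtype=float)
    w = max(2, int(round(window_s * fs_hz)))  # samples per window
    feats = []
    for start in range(len(hr) - w + 1):
        seg = hr[start:start + w]
        d = np.diff(seg)  # first derivative, bpm per sample
        feats.append({
            "mean": seg.mean(),
            "std": seg.std(),
            "variance": seg.var(),
            "mean_derivative": d.mean(),
            "energy": float(np.sum(seg ** 2)) / w,  # mean squared amplitude
        })
    return feats
```

Each window advances by one sample here; a stride parameter would give the coarser hop sizes often used in practice.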