2012 12th International Conference on Control, Automation, Robotics & Vision (ICARCV)
DOI: 10.1109/icarcv.2012.6485186
Activity recognition from a wearable camera

Abstract: This paper proposes a novel activity recognition approach from video data obtained with a wearable camera. The objective is to recognise the user's activities from a tiny front-facing camera embedded in his/her glasses. Our system allows carers to remotely access the current status of a specified person, which can be broadly applied to those living with disabilities, including the elderly who require cognitive assistance or guidance for daily activities. We collected, trained and tested our system on videos coll…


Cited by 26 publications (63 citation statements)
References 15 publications (14 reference statements)
“…Therefore, motion in FPV of an ambulatory activity is generally dominated by a global motion on which discriminant features are extracted. Existing motion features use either raw grid optical flow [8,11] or limited directional and/or magnitude information [12][13][14]. Motion patterns of activities can vary in their magnitude, direction and frequency characteristics [14].…”
Section: Related Work
confidence: 99%
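The directional and magnitude motion features mentioned in this excerpt can be sketched roughly as follows. This is an illustrative pure-NumPy example, not code from any of the cited papers; the helper name `flow_histogram` and the 8-bin layout are assumptions for illustration.

```python
import numpy as np

def flow_histogram(flow, n_bins=8):
    """Bin dense optical-flow vectors by direction, weighted by magnitude.

    flow: array of shape (H, W, 2) holding per-pixel (dx, dy) displacements.
    Returns an L1-normalised histogram of length n_bins, a compact
    descriptor of the dominant global motion in the frame pair.
    """
    dx, dy = flow[..., 0].ravel(), flow[..., 1].ravel()
    mag = np.hypot(dx, dy)                      # per-pixel motion magnitude
    ang = np.arctan2(dy, dx)                    # direction in [-pi, pi)
    bins = ((ang + np.pi) / (2 * np.pi) * n_bins).astype(int) % n_bins
    hist = np.bincount(bins, weights=mag, minlength=n_bins)
    total = hist.sum()
    return hist / total if total > 0 else hist

# Example: a synthetic flow field moving uniformly to the right,
# so all motion energy falls into a single direction bin.
flow = np.zeros((4, 4, 2))
flow[..., 0] = 1.0                              # dx = 1, dy = 0 everywhere
h = flow_histogram(flow)
```

In practice the dense flow field would come from an optical-flow estimator run on consecutive frames; magnitude weighting makes the histogram reflect how strongly, not just how often, each direction occurs.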
“…Application domains that employ wearable cameras (Fig. 1) include life-logging and video summarization [3][4][5][6][7], activity recognition [8][9][10][11][12][13][14][15][16][17][18][19][20][21], and eye-tracking and gaze detection [22][23][24][25]. Human activities can be categorized as ambulatory (e.g., walk) [8][9][10][11][12][13][14][15]; person-to-object interactions (e.g., cook) [16][17][18][19]; and person-to-person interactions (e.g., handshake) [20,21].…”
Section: Introduction
confidence: 99%
“…Video content-based camera motion analysis methods make use of template matching [1] and optical flow [6]. Methods derived from optical flow are widely used nowadays for human activity and action recognition from third person perspective [8,20] (where a fixed and static camera captures third person activities such that the optical flow is strongly associated with their activity) and first person perspective [30] (where camera wearer activities affect the global camera motion).…”
Section: Focused Interaction Dataset
confidence: 99%
“…Although audio signals provide information about social interactions, the fusion of visual and audio cues for detection of social interactions in egocentric video was rarely explored. Furthermore, the effect of integrating global camera motion analysis methods, nowadays used for human activity recognition in egocentric videos [30], with other visual and audio features for social interaction analysis still needs to be researched.…”
Section: Introduction
confidence: 99%
“…These approaches include multistage recognition processes, and hence recognition errors tend to be stacked. To avoid explicit object recognition, many studies use motion features such as optical flow with a classifier such as LogitBoost or SVM [2,3,[26][27][28][29].…”
Section: First-person Activity Recognition
confidence: 99%
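The classification stage described in this excerpt pairs motion features with a trained classifier such as LogitBoost or an SVM. As a self-contained illustration, the sketch below substitutes a minimal nearest-centroid classifier for those models; the class name, the two-dimensional toy features, and the "walk"/"stand" labels are all assumptions for illustration, not the cited papers' setup.

```python
import numpy as np

class NearestCentroid:
    """Minimal stand-in for the SVM/LogitBoost classifiers used in the
    cited work: each activity class is summarised by the mean of its
    training feature vectors, and a query clip is assigned to the class
    whose mean is closest in Euclidean distance."""

    def fit(self, X, y):
        self.labels = np.unique(y)
        self.centroids = np.stack([X[y == c].mean(axis=0) for c in self.labels])
        return self

    def predict(self, X):
        # Pairwise distances from each query row to each class centroid.
        d = np.linalg.norm(X[:, None, :] - self.centroids[None, :, :], axis=2)
        return self.labels[d.argmin(axis=1)]

# Toy per-clip motion features (e.g. energy in two flow-direction bins):
# label 1 = "walk" clips, label 0 = "stand" clips.
X_train = np.array([[0.1, 0.9], [0.2, 0.8],     # walk
                    [0.9, 0.1], [0.8, 0.2]])    # stand
y_train = np.array([1, 1, 0, 0])

clf = NearestCentroid().fit(X_train, y_train)
pred = clf.predict(np.array([[0.15, 0.85]]))    # a walk-like query clip
```

A real system would train an SVM or boosted classifier on much higher-dimensional flow descriptors, but the pipeline shape, features in and activity label out, is the same.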