Proceedings 15th International Conference on Pattern Recognition. ICPR-2000
DOI: 10.1109/icpr.2000.902899
Structuring personal activity records based on attention-analyzing videos from head mounted camera

Cited by 10 publications (7 citation statements)
References 3 publications
“…IV-C). By departing from traditional frame-based decisions [17], [26], [27], we capture long-term temporal dependencies. As we will see below, doing so is beneficial for detecting subtle periods of engagement and accounting for their variable length.…”
Section: Approach
confidence: 99%
“…When a person is paying attention to visual objects, head movements are roughly classified into two types of attention events: Active Attention and Passive Attention [10]. "Active Attention" means that we gaze at something and track it when it attracts our interest.…”
Section: Attention Status Observation
confidence: 99%
“…These techniques have been used to understand human cognition and behavior. One example is the use of these techniques in 'life-log' analyses; i.e., capturing personal behavior and experiences using a wearable video/camera system equipped with other sensors and computers on the human body [19,20]. For example, camera and object motions in footage taken from a head-mounted video camera have been analyzed to investigate how humans pay attention to objects [19].…”
Section: Introduction
confidence: 99%