2005
DOI: 10.1109/tpami.2005.90
A video database of moving faces and people

Abstract: We describe a database of static images and video clips of human faces and people that is useful for testing algorithms for face and person recognition, head/eye tracking, and computer graphics modeling of natural human motions. For each person there are nine static "facial mug shots" and a series of video streams. The videos include a "moving facial mug shot," a facial speech clip, one or more dynamic facial expression clips, two gait videos, and a conversation video taken at a moderate distance from the came…

Cited by 185 publications (91 citation statements); references 22 publications.
“…These studies are essential before we can tackle complex databases and spontaneous expressions, such as those of ref. 39. Without an understanding of which AUs represent each category of emotion, it is impossible to understand naturalistic expressions and address fundamental problems in neuroscience (40), study psychiatric disorders (41), or design complex perceptual interfaces (42).…”
Section: Discussion (mentioning, confidence: 99%)
“…Further, although the MMI database is probably the only publicly available dataset containing recordings of spontaneous facial behavior at present, it still lacks metadata about the context in which these recordings were made, such as the utilized stimuli, the environment in which the recordings were made, the presence of other people, etc. Another database of spontaneous facial expressions was collected at UT Dallas (O'Toole et al, 2005). Similarly to the second part of the MMI facial expression database, facial displays were elicited using film clips.…”
Section: Facial Expression Databases and Ground Truth (mentioning, confidence: 99%)
“…We make use of the "Database of Moving Faces and People" [11], kindly provided by the Perception Lab at the University of Texas at Dallas.…”
Section: Results (mentioning, confidence: 99%)