2021 IEEE International Conference on Image Processing (ICIP) 2021
DOI: 10.1109/icip42928.2021.9506789
Spatiotemporal Features and Local Relationship Learning for Facial Action Unit Intensity Regression

Abstract: The action units (AU) encoded by the Facial Action Coding System (FACS) have been widely used in the representation of facial expressions. Although work on automatic facial AU detection has achieved quite good results in recent years, there remains much research potential for more accurate AU detection and intensity regression. Moreover, most work only considers the spatial information and ignores the temporal information. In practice, changes in facial AUs involve both spatial and temporal variation. In this …
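The abstract's core point — that AU intensity depends on both spatial appearance and temporal variation — can be illustrated with a minimal sketch. The code below is not the paper's method; it is a hypothetical toy in which per-frame spatial feature vectors are augmented with frame-to-frame deltas (a crude temporal feature), and a ridge regressor maps the combined spatiotemporal features to a continuous AU intensity:

```python
import numpy as np

def spatiotemporal_features(frames):
    """Concatenate per-frame spatial features with their temporal deltas.

    frames: array of shape (T, D), one spatial feature vector per video frame.
    Returns an array of shape (T, 2*D).
    """
    # Frame-to-frame differences capture temporal variation; the first
    # frame's delta is zero by construction (prepend the frame itself).
    deltas = np.diff(frames, axis=0, prepend=frames[:1])
    return np.concatenate([frames, deltas], axis=1)

def fit_ridge(X, y, lam=1e-3):
    """Closed-form ridge regression: w = (X^T X + lam*I)^-1 X^T y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Synthetic demo data (stand-in for real facial features and AU labels).
rng = np.random.default_rng(0)
frames = rng.normal(size=(50, 8))          # 50 frames, 8-dim spatial features
w_true = rng.normal(size=16)               # ground-truth weights over (spatial, delta)
X = spatiotemporal_features(frames)
y = X @ w_true + 0.01 * rng.normal(size=50)  # noisy AU intensity targets

w = fit_ridge(X, y)
pred = X @ w
```

Because the synthetic targets are generated from both the spatial features and their deltas, a purely spatial regressor would miss half the signal; the concatenated representation recovers it.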

Cited by 2 publications (2 citation statements); references 17 publications.
“…[11] both databases were shaped based on deliberate expressions, whereas D.I.S.F.A. [12–31] was created by using unprompted expressions. Accordingly, well-built datasets are prominent due to their ability to include as many aspects as possible.…”
Section: Related Work
Confidence: 99%
“…Eye-Tracking is said to be the method of identification of the gaze point or the spot where users are watching for a specific visual stimulus. Different scholars use different devices for Eye-tracking, like Virtual Reality [16, 17, 28], Mobile Eye-tracking [14, 15], and Desktop Eye-tracking [13–32]…”
Section: Data Acquisition and Preparation
Confidence: 99%