2012
DOI: 10.1117/1.oe.51.10.107201
Learning embedded lines of attraction by self organization for pose and expression invariant face recognition

Cited by 4 publications (3 citation statements). References 17 publications.
“…Table V.1: The comparison with the results in [55,56,57] on the CMU AMP Facial Expression database.

Method                                                        Recognition Rate (%)
Sparse Representation-based Classification [55]               99.49
Compressive Sensing based [56]                                100
Self-organization learning embedded lines of attraction [57]  100
Our Proposed Method                                           99.93

One needs to choose the proper parameter in order to balance accuracy and computation cost. We lose some accuracy, but the efficiency is higher.…”
Section: Methods
confidence: 99%
“…Given equation 16, we can embed the normalized coefficients to multiply the weights using the following equation. …”
Section: Nonlinear Dimensionality Reduction
confidence: 97%
“…The nonlinear line attractor (NLA) network, formulated by Seow et al., 9 is the type we will focus on; it has applications in skin color association, pattern association, 10 and pose- and expression-invariant face recognition. 11 Most recurrent neural networks are fully interconnected, meaning that every node is connected to every other node. If we glean information from the structures in the brain, we see that neurons are not fully interconnected, but connected only to surrounding neurons.…”
Section: Introduction
confidence: 99%
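The local-connectivity idea quoted above can be sketched numerically. The snippet below is a minimal, hypothetical illustration (not the authors' NLA formulation): a recurrent network whose weight matrix is masked so each node connects only to its k nearest neighbours, then relaxed for a few iterations. The function name `local_mask`, the `tanh` update, and all sizes are illustrative assumptions.

```python
import numpy as np

def local_mask(n, k):
    """Binary mask allowing connections only within +/- k neighbouring nodes."""
    idx = np.arange(n)
    return (np.abs(idx[:, None] - idx[None, :]) <= k).astype(float)

n, k = 8, 2
rng = np.random.default_rng(0)

# Locally connected recurrent weights: most entries are zeroed by the mask.
W = rng.standard_normal((n, n)) * local_mask(n, k)

# Relax the state toward an attractor with a simple tanh update (illustrative).
x = rng.standard_normal(n)
for _ in range(5):
    x = np.tanh(W @ x)

fully_connected_links = n * n                    # 64 links if fully interconnected
local_links = int(local_mask(n, k).sum())        # far fewer with local connectivity
print(local_links, fully_connected_links)
```

With n = 8 and k = 2 the mask keeps 34 of the 64 possible links, showing how local connectivity shrinks the weight count relative to a fully interconnected recurrent network.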