DOI: 10.17077/etd.d47y-9s7b

Random neural networks for dimensionality reduction and regularized supervised learning

Abstract: , for serving as my committee members. Thank you for making my defense an enjoyable moment. Your brilliant comments and constructive suggestions are much appreciated. Lastly, I would like to give my family, especially my parents, a special thanks. I am appreciative beyond words for the love and encouragement you give me, which gives me the courage to pursue what I truly love. Thank you for all the long-lasting companionship, unconditional support, and constant guidance.

Cited by 1 publication
(1 citation statement)
References 66 publications
“…At present, the algorithms for recognizing the open and closed state of eyes are divided into two types: manual feature extraction and automatic feature extraction. Among them, manual feature extraction mainly includes the template matching detection method, texture feature detection method, and shape feature detection method [26,27]. These methods rely on the extraction of texture features, and the selection of texture features requires a great deal of experimentation and sufficient experience.…”
Section: Methods
confidence: 99%