2020
DOI: 10.1177/1064804620924133

Human–Computer Interactive Gesture Feature Capture and Recognition in Virtual Reality

Abstract: With the development of computer technology, the simulation fidelity of virtual reality is steadily improving, and accurate recognition of human–computer interaction gestures is a key technology for enhancing that realism. This article briefly introduces three gesture feature extraction methods, scale-invariant feature transform, local binary pattern, and histogram of oriented gradients (HOG), and a back-propagation (BP) neural network for cl…
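As a concrete illustration of the pipeline the abstract outlines, the sketch below pairs HOG feature extraction with a small network trained by back-propagation. This is a minimal sketch only: the dataset, image size, network size, and hyperparameters are placeholders rather than the paper's settings, and it assumes scikit-image and scikit-learn are available.

```python
# Minimal sketch of the HOG-feature + back-propagation (BP) network pipeline
# described in the abstract. Dataset, image size, and hyperparameters are
# illustrative placeholders, not the authors' settings.
import numpy as np
from skimage.feature import hog
from sklearn.neural_network import MLPClassifier  # trained by back-propagation
from sklearn.model_selection import train_test_split

def extract_hog(gray_image):
    """Return a HOG descriptor for a single grayscale gesture image."""
    return hog(gray_image, orientations=9, pixels_per_cell=(8, 8),
               cells_per_block=(2, 2), feature_vector=True)

# Hypothetical data: 200 grayscale 64x64 gesture images with 5 gesture labels.
rng = np.random.default_rng(0)
images = rng.random((200, 64, 64))
labels = rng.integers(0, 5, size=200)

features = np.array([extract_hog(img) for img in images])
X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.25, random_state=0)

clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```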

Cited by 5 publications (5 citation statements)
References 15 publications (15 reference statements)
“…Work on hand gesture recognition was originally seen as a method of human-computer interaction, with [3] exploring three different gesture feature extraction applications, scale invariant feature transform (SIFT), local binary pattern (LBP), and histogram of oriented gradients (HOG). Most applications have been developed on static gesture recognition, i.e., geometric feature extraction, the number of fingers lifted in a gesture, the distance from the center of the palm of the hand to the fingertips and the valley between the fingers [4].…”
Section: Related Work (mentioning)
confidence: 99%
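The geometric features mentioned in [4] (a raised-finger count and palm-to-fingertip distances) can be computed directly from hand landmarks. The sketch below is illustrative only; it assumes hypothetical 2D landmark coordinates from some hand tracker and is not taken from the cited papers.

```python
# Illustrative sketch (not from the cited papers) of the geometric features
# described above: palm-center-to-fingertip distances and a raised-finger count.
# Landmark coordinates are hypothetical 2D points, e.g. from a hand tracker.
import numpy as np

def geometric_features(palm_center, fingertips, raised_threshold=1.5):
    """Distances from the palm center to each fingertip, and how many exceed a threshold."""
    palm = np.asarray(palm_center, dtype=float)
    tips = np.asarray(fingertips, dtype=float)        # shape (5, 2)
    distances = np.linalg.norm(tips - palm, axis=1)   # one distance per finger
    fingers_raised = int(np.sum(distances > raised_threshold))
    return distances, fingers_raised

# Example with made-up coordinates: palm at the origin, five fingertip points.
dists, count = geometric_features((0.0, 0.0),
                                  [(0.5, 2.0), (0.2, 2.3), (0.0, 2.4),
                                   (-0.2, 2.2), (-0.6, 1.0)])
print(dists.round(2), "fingers raised:", count)
```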
“…It is performed by remembering long-term information and deciding which value to pass on to the next time step block, with W_i and b being the weight and bias parameters associated with each Z_i gate. These parameters are used to minimize the error of our training data, as shown in Equations (2) and (3).…”
Section: Equations (mentioning)
confidence: 99%
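Equations (2) and (3) of the citing paper are not reproduced in the snippet above. As an illustration only, an LSTM-style gate with per-gate weights W_i and bias b_i is typically written as

\[
Z_i = \sigma\bigl(W_i\,[h_{t-1},\, x_t] + b_i\bigr), \qquad \sigma(z) = \frac{1}{1 + e^{-z}},
\]

where h_{t-1} is the previous hidden state and x_t is the current input; training adjusts W_i and b_i to minimize a loss over the training data by back-propagation through time. The exact formulation used by the citing authors may differ.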
“…With the continuous progress of novel sensing and computing techniques, Human Activity Recognition (HAR) has attracted extensive attention and become a promising approach for applications in medical treatment, body kinematics, Human-Computer Interaction (HCI), Virtual Reality (VR), motion analysis, Daily Living Assistant (DLA) and elderly care [1]- [4]. Compared with Computer Vision (CV)-based techniques, the wearable sensing device-based solution breaks the limits of space by getting rid of the fixed and bulky equipment.…”
Section: Introduction (mentioning)
confidence: 99%
“…Human-machine interaction (HMI) is a critical area in which human gestures, postures, or motions are recognized for efficient interaction between humans and machines, covering a wide field of applications including human-object interaction, virtual reality, immersive entertainment, etc. [4][5][6]. Human activity recognition and motion analysis have also been effective for sports analysis and evaluation, with investigations reported for golf swing analysis [7], swimming velocity estimation [8], and sports training [9].…”
Section: Introduction (mentioning)
confidence: 99%