Optical devices for measuring eye movements are generally expensive, and users' head movements must often be restricted when eye-gaze input interfaces are used. Previously, we proposed a novel eye-gesture input interface that utilized electrooculography amplified via AC coupling and did not require a head-mounted display; instead, combinations of eye-gaze displacement directions were used as the selection criteria. This interface achieved a success rate of approximately 97.2%, but it required the user to declare his or her intention to perform an eye gesture by blinking or pressing an enter key. In this paper, we propose a novel eye-glance input interface that can consistently recognize glance behavior without such a prior declaration, and we provide a decision algorithm that we believe is suitable for eye-glance input on devices with small screens, such as smartphones. In experiments using our improved eye-glance input interface, we achieved a detection rate of approximately 91% and a direction determination success rate of approximately 85%. A smartphone screen design for use with the eye-glance input interface is also proposed.
Many previous studies have addressed eye-gaze input; in this study, however, we developed an eye-glance input interface that tracks combinations of short eye movements. Unlike eye-gaze input, which requires high-accuracy measurements, eye-glance input can be detected from only a rough indication of the direction of eye movement, making it possible to operate even terminals with small screens, such as smartphones. We used an inexpensive camera to measure eye movements and analyzed its output using OpenCV, an open-source computer vision and machine learning software library, to construct an inexpensive, non-contact interface. In a previous study, we developed an algorithm that detected eye-glance input through image analysis with OpenCV and fed the algorithm's results back to our subjects. In that study, the average detection rate for eye-glance input was 76%. However, we also observed several problems with the algorithm, particularly false detections caused by eye blinks, and implemented solutions to address them. In the present study, we have improved on the unsatisfactory detection rate recorded in our previous study and addressed problems related to user convenience.
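To illustrate the idea that eye-glance input needs only a rough indication of movement direction, the following sketch classifies a pupil-displacement vector into one of four coarse directions. This is not the authors' algorithm; the function name, threshold value, and four-direction scheme are illustrative assumptions, and it presumes pupil-center coordinates have already been extracted from camera frames (e.g., with OpenCV).

```python
def classify_glance(dx, dy, threshold=10.0):
    """Map a pupil displacement (in pixels) to a rough glance direction.

    dx, dy  -- displacement between two pupil-center estimates,
               in image coordinates (y grows downward).
    Returns "left", "right", "up", "down", or None when the movement
    is too small to count as a glance (fixation or measurement noise).
    """
    # Hypothetical threshold: small displacements are ignored rather
    # than forced into a direction, which helps suppress false detections.
    if abs(dx) < threshold and abs(dy) < threshold:
        return None
    # Pick the dominant axis; exact ties go to the horizontal axis.
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"
```

A recognizer along these lines would then match short sequences of such directions (e.g., "right" followed by "left") against the glance patterns assigned to screen regions.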