Electrically tunable lenses (ETLs), also known as liquid lenses, can be focused at various distances by changing the electrical signal applied to the lens. ETLs require no mechanical translation structures and therefore provide more compact and inexpensive focus control than conventional computerized translation stages. Over the last several years they have been exploited in a wide range of imaging and display systems and have enabled novel applications. However, the optical fluid in an ETL ripples after actuation, which physically limits the response time and significantly narrows the range of applications. To alleviate this problem, we apply a sparse optimization framework that optimizes the temporal pattern of the electrical signal driving the ETL. In verification experiments, the proposed method accelerated the convergence of the focal length to the target patterns: it converged the optical power to the target twice as fast as a naively determined input signal, and it improved the quality of images captured during multi-focal imaging.
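The paper's sparse optimization framework is not reproduced here, but the underlying idea — shaping the temporal pattern of the drive signal so the fluid's ringing cancels, rather than simply stepping the signal — can be sketched with classical zero-vibration (ZV) input shaping on an assumed second-order ETL model. The natural frequency, damping ratio, and time step below are illustrative assumptions, not measurements of a real lens:

```python
import numpy as np

def simulate_etl(u, wn=2 * np.pi * 50, zeta=0.15, dt=1e-4):
    """Assumed second-order model of the ETL's optical power response:
    after a drive change, the optical fluid rings at ~50 Hz with light
    damping. u is the drive signal sampled at dt; returns optical power."""
    p = np.zeros(len(u))
    v = 0.0
    for k in range(1, len(u)):
        a = wn ** 2 * (u[k - 1] - p[k - 1]) - 2 * zeta * wn * v
        v += a * dt                          # semi-implicit Euler (stable)
        p[k] = p[k - 1] + v * dt
    return p

def zv_shaped_step(n, wn=2 * np.pi * 50, zeta=0.15, dt=1e-4):
    """Two-impulse zero-vibration shaper: split the unit step into two
    partial steps half a damped oscillation period apart, so the ringing
    excited by the first step is cancelled by the second."""
    K = np.exp(-zeta * np.pi / np.sqrt(1 - zeta ** 2))
    a1 = 1.0 / (1.0 + K)                     # height of the first partial step
    t2 = np.pi / (wn * np.sqrt(1 - zeta ** 2))  # half damped period
    u = np.full(n, a1)
    u[int(round(t2 / dt)):] = 1.0            # second step completes the command
    return u

n = 1000                                     # 0.1 s at dt = 1e-4
raw = simulate_etl(np.ones(n))               # plain step: large overshoot and ringing
shaped = simulate_etl(zv_shaped_step(n))     # shaped input: settles with little overshoot
```

Under this model the shaped drive reaches the target optical power with almost no residual oscillation, which mirrors the reported effect of a faster-converging optimized input; the paper's actual framework additionally enforces sparsity on the temporal signal pattern.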
Fig. 1. The IlluminatedFocus technique optically defocuses real-world appearances in a spatially varying manner, regardless of the distance from the user's eyes to the observed real objects, enabling various vision augmentation applications. (a) The proposed system consists of focal sweep eyeglasses (two Electrically Focus-Tunable Lenses (ETL)) and a high-speed projector. (b) Experimental proof of the proposed technique. (b-1) Experimental setup: four objects (A, B, C, and D) are placed in front of the projector and the ETL, with a camera standing in for the user's eye. (b-2, b-3, b-4) The objects are illuminated by the projector at different timings while the camera's focal length is periodically modulated by the ETL. As indicated by the yellow arrows, the objects that should appear focused (A, C, and D) are illuminated when they are in focus, and the object that should appear blurred (B) is illuminated when it is out of focus. (b-5) When the frequency of the focal sweep exceeds the critical fusion frequency (CFF), these appearances are perceived as integrated. The resulting appearance (only B is blurred) cannot be achieved by a normal lens system. Note that the brightness of (b-2) to (b-5) has been adjusted for clarity.

Abstract—Aiming at realizing novel vision augmentation experiences, this paper proposes the IlluminatedFocus technique, which spatially defocuses real-world appearances regardless of the distance from the user's eyes to the observed real objects. With the proposed technique, part of a real object appears blurred while the fine details of another part at the same distance remain visible. We apply Electrically Focus-Tunable Lenses (ETL) as eyeglasses and a synchronized high-speed projector as illumination for the real scene. We periodically modulate the focal lengths of the glasses (focal sweep) at more than 60 Hz so that the wearer cannot perceive the modulation.
The part of the scene that should appear focused is illuminated by the projector when it is in focus for the user's eyes, while the part that should appear blurred is illuminated when it is out of focus. As the basis of our spatial focus control, we build mathematical models that predict the range of distances from the ETL within which real objects become blurred on the user's retina. Based on this blur range, we derive a design guideline for effective illumination timing and focal sweep range. We also model the change in the apparent size of the real scene caused by the focal length modulation, which produces an undesirable visible seam between focused and blurred areas; we solve this unique problem by gradually blending the two areas. Finally, we demonstrate the feasibility of our proposal by implementing various vision augmentation applications.

Index Terms—Vision augmentation, spatial defocusing, depth-of-field, focal sweep, high-speed projection, spatial augmented reality

• Daisuke Iwai is with Osaka University and JST, PRESTO.
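The illumination-timing idea behind the focal sweep can be sketched as follows. The sinusoidal sweep profile, 60 Hz sweep rate, 1,000 fps projector rate, swept diopter range, and focus tolerance below are all assumptions for illustration, not the paper's calibrated values:

```python
import numpy as np

FPS, SWEEP_HZ = 1000, 60          # assumed projector frame rate and sweep rate
P_MIN, P_MAX = 1.0, 5.0           # assumed swept optical power range (diopters)
TOL = 0.25                        # assumed acceptable focus error (diopters)

def swept_power(frame):
    """Eye-side optical power of the ETL at a given projector frame,
    assuming a sinusoidal focal sweep between P_MIN and P_MAX."""
    t = frame / FPS
    return P_MIN + (P_MAX - P_MIN) * 0.5 * (1 + np.sin(2 * np.pi * SWEEP_HZ * t))

def in_focus_frames(distance_m, n_frames=FPS):
    """Projector frames at which an object at distance_m metres is in
    focus: illuminate during these frames to make it appear sharp."""
    frames = np.arange(n_frames)
    return frames[np.abs(swept_power(frames) - 1.0 / distance_m) < TOL]

def out_of_focus_frames(distance_m, n_frames=FPS):
    """Complementary frames: illuminate during these to make the same
    object appear blurred."""
    return np.setdiff1d(np.arange(n_frames), in_focus_frames(distance_m, n_frames))
```

For example, an object at 0.5 m (2 D) would be lit only while the swept power passes 2 D to appear focused, or only during the remaining frames to appear blurred; because the sweep exceeds the CFF, the viewer perceives just the integrated result.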
Spatial zooming and magnification, which control the size of only a portion of a scene while maintaining its context, are essential interaction techniques in augmented reality (AR) systems. They have been applied in various AR applications, including surgical navigation, visual search support, and human behavior control. However, spatial zooming has so far been implemented only on video see-through displays and has not been supported by optical see-through displays, because achieving spatial zooming of an observed real scene with near-eye optics is not trivial. This paper presents the first optical see-through spatial zooming glasses, which enable interactive control of the perceived size of real-world appearances in a spatially varying manner. The key to our technique is the combination of periodically fast-zooming eyeglasses and a synchronized high-speed projector. We stack two electrically focus-tunable lenses (ETLs) for each eyeglass and sweep their focal lengths to modulate the magnification periodically from one (unmagnified) to a higher value (magnified) at 60 Hz, in a manner that prevents the user from perceiving the modulation. We use a 1,000 fps high-speed projector to provide high-resolution spatial illumination of the real scene around the user. A portion of the scene that should appear magnified is illuminated by the projector when the magnification is greater than one, while the rest is illuminated when the magnification equals one. Through experiments with a prototype system, we demonstrate spatial zooming with up to 30% magnification. Our technique has the potential to expand the application field of spatial zooming interaction in optical see-through AR.
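The stacked-ETL zoom and its synchronization with the projector can be sketched as below. The thin-lens afocal-pair model, 10 mm lens separation, linear magnification ramp, and frame rates are illustrative assumptions, not the prototype's optical design or calibration:

```python
def etl_powers_for_magnification(M, d=0.01):
    """Thin-lens sketch: two ETLs separated by d metres form an afocal
    pair (f1 + f2 = d) with angular magnification M = -f1/f2. Solving
    these two equations gives the optical power (diopters) each ETL must
    be driven to. At M = 1 both powers are zero, so light passes through
    unchanged (the unmagnified state)."""
    p1 = (M - 1.0) / (M * d)
    p2 = -(M - 1.0) / d
    return p1, p2

def frame_for_magnification(M_target, fps=1000, sweep_hz=60, M_max=1.3):
    """Frame index within one sweep period at which an assumed linear
    magnification ramp from 1 to M_max comes closest to M_target; the
    projector lights a scene region during that frame so the region is
    seen at the desired size."""
    n = fps // sweep_hz                              # ~16 frames per 60 Hz period
    ramp = [1.0 + (M_max - 1.0) * k / (n - 1) for k in range(n)]
    return min(range(n), key=lambda k: abs(ramp[k] - M_target))
```

Under these assumptions, a region that should stay unmagnified is lit at the frame where M = 1 (both lens powers zero), while a region to be enlarged by 30% is lit at the frame where the ramp reaches 1.3.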