Optical coherence elastography (OCE) has been proposed for a range of clinical applications. However, the majority of these studies have been performed using bulky, lab-based imaging systems. A compact, handheld imaging probe would accelerate clinical translation; to date, however, this has been inhibited by the slow scan rates of compact devices and the motion artifact induced by the user's hand. In this paper, we present a proof-of-concept, handheld quantitative micro-elastography (QME) probe capable of scanning a 6 × 6 × 1 mm volume of tissue in 3.4 seconds. This handheld probe is enabled by a novel QME acquisition protocol that incorporates a custom bidirectional scan pattern driving a microelectromechanical system (MEMS) scanner, synchronized with the sample deformation induced by an annular piezoelectric (PZT) actuator. The custom scan pattern reduces both the total acquisition time and the time difference between the B-scans used to generate displacement maps, minimizing the impact of motion artifact. We test the feasibility of the handheld QME probe on a tissue-mimicking silicone phantom, demonstrating image quality comparable to that of a bench-mounted setup. In addition, we present the first handheld QME scans performed on human breast tissue specimens. For each specimen, quantitative micro-elastograms are co-registered with, and validated by, histology, demonstrating the ability to distinguish stiff cancerous tissue from surrounding soft benign tissue.
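To make the scan-pattern idea concrete, the following is a minimal sketch of a bidirectional raster in which the two B-scans of each displacement pair are acquired back-to-back, one forward and one in reverse, so no time is lost to flyback. It is an assumption-based reconstruction of the concept described in the abstract, not the authors' actual protocol; the field of view is taken from the abstract, while the sample counts, function name, and actuator-state labels are illustrative.

```python
import numpy as np

# Hedged sketch of a bidirectional QME scan pattern. Parameter values other
# than the 6 x 6 mm field of view are assumptions for illustration.

FOV_MM = 6.0   # lateral field of view (6 x 6 mm, per the abstract)
N_X = 500      # A-scans per B-scan (assumed)
N_Y = 500      # B-scan locations along the slow axis (assumed)

def bidirectional_qme_scan(fov_mm=FOV_MM, n_x=N_X, n_y=N_Y):
    """Yield (x, y, actuator_state) samples. At each slow-axis position y,
    the fast axis sweeps forward with the sample unloaded, then immediately
    sweeps back with the PZT actuator loading the sample; the reversed
    B-scan is re-ordered in processing so displacement is estimated between
    co-located A-scans."""
    x_fwd = np.linspace(0.0, fov_mm, n_x)
    for y in np.linspace(0.0, fov_mm, n_y):
        for x in x_fwd:            # forward sweep, actuator off
            yield x, y, "unloaded"
        for x in x_fwd[::-1]:      # reverse sweep, actuator on (no flyback)
            yield x, y, "loaded"
```

In this simplified model, the back-to-back pairing keeps the time between the two B-scans of a displacement pair to a single B-scan period, which is what suppresses hand-motion artifact, while eliminating flyback shortens the total acquisition.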
Virtual and augmented reality (VR/AR) displays crucially rely on stereoscopic rendering to enable perceptually realistic user experiences. Yet, existing near-eye display systems ignore the gaze-dependent shift of the no-parallax point in the human eye. Here, we introduce a gaze-contingent stereo rendering technique that models this effect and conduct several user studies to validate its effectiveness. Our findings include experimental validation of the location of the no-parallax point, which we then use to demonstrate significant reductions in disparity and shape distortion in a VR setting, as well as consistent alignment of physical and digitally rendered objects across depths in optical see-through AR. Our work shows that gaze-contingent stereo rendering improves the perceptual realism and depth perception of emerging wearable computing systems.
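A minimal sketch of the underlying geometry follows, assuming a simple model in which the no-parallax point sits a fixed distance along the gaze direction from the eye's center of rotation. The offset distance, the 63 mm interpupillary distance, and the function names are illustrative assumptions, not values or code from the paper.

```python
import numpy as np

# Hedged sketch of gaze-contingent stereo camera placement. All parameter
# values below are assumptions for illustration; the paper measures the
# actual location of the no-parallax point experimentally.

IPD_MM = 63.0    # interpupillary distance (assumed average)
R_NP_MM = 8.0    # assumed offset from the eye's center of rotation to the
                 # no-parallax point, measured along the gaze direction

def eye_camera_position(eye_center, gaze_dir, r_np=R_NP_MM):
    """Place the rendering camera at the gaze-dependent no-parallax point:
    a point offset from the eye's center of rotation by r_np along the
    gaze direction. With a fixed forward gaze this reduces to conventional
    stereo rendering; as the eye rotates, the camera translates with the
    no-parallax point, which is the effect conventional renderers ignore."""
    g = np.asarray(gaze_dir, dtype=float)
    g = g / np.linalg.norm(g)
    return np.asarray(eye_center, dtype=float) + r_np * g

# Example: both eyes converge on a near target 300 mm ahead of the face
# midpoint (all coordinates in millimeters, +z pointing away from the face).
left_center = np.array([-IPD_MM / 2, 0.0, 0.0])
right_center = np.array([+IPD_MM / 2, 0.0, 0.0])
target = np.array([0.0, 0.0, 300.0])

left_cam = eye_camera_position(left_center, target - left_center)
right_cam = eye_camera_position(right_center, target - right_center)
print(left_cam, right_cam)  # per-eye camera origins shift with gaze
```

Because both camera origins move inward as the eyes converge on a near target, rendered disparity changes with gaze, which is how this model can correct the disparity and shape distortions described above.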