We present a method for measuring the orbital angular momentum (OAM) of optical vortices by extracting the phase values sampled by a multipinhole plate. We demonstrate that the phase of an optical vortex passing through a multipinhole plate can be extracted directly from the Fourier transform of a single diffraction intensity pattern using a simple algorithm, so that the l state, and hence the OAM of the photons, can be measured quantitatively.
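The core idea, sampling the azimuthal phase exp(ilθ) of a vortex at discrete pinhole positions on a ring and counting the total phase winding, can be sketched as follows. This is a toy illustration of the principle only, not the paper's diffraction-based extraction; the sampling geometry and function names are assumptions.

```python
import numpy as np

def oam_from_samples(l_true=3, n_pinholes=12):
    """Sample the azimuthal phase of a vortex beam exp(i*l*theta) at
    pinhole positions on a ring and recover l from the phase winding.
    Requires n_pinholes > 2*|l| so phase steps stay below pi."""
    theta = 2 * np.pi * np.arange(n_pinholes) / n_pinholes
    field = np.exp(1j * l_true * theta)              # sampled vortex phase
    phase = np.angle(field)
    # wrapped phase differences between neighbouring pinholes (cyclic)
    dphi = np.angle(np.exp(1j * np.diff(phase, append=phase[0])))
    # total winding divided by 2*pi gives the topological charge l
    return int(round(dphi.sum() / (2 * np.pi)))

l_measured = oam_from_samples(l_true=3, n_pinholes=12)
```

The pinhole count must exceed twice the charge magnitude, otherwise neighbouring phase steps alias past π and the winding count is ambiguous.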
In this paper we propose the use of tangible user interfaces (TUIs) for human-robot interaction (HRI) applications. We discuss the potential benefits of this approach, focusing on tasks with a low level of autonomy. We present an experimental robotic interaction test bed to support our investigation, and use it to explore two HRI-related task-sets: robotic navigation control and robotic posture control. We discuss the implementation of these two task-sets using an AIBO™ robot dog. Both tasks were mapped to two different robotic control interfaces: a keypad interface resembling the interaction approach currently common in HRI, and a gesture input mechanism based on Nintendo Wii™ game controllers. We describe the implementation of the interfaces and conclude with a detailed user study evaluating these HRI techniques on the two task-sets.
For iterative phase retrieval, only multiple measured intensity images in the output plane are considered, in order to accelerate convergence. The amplitude and phase at the object plane of the measurement system are both unknown in this work. The observation system is built around the gyrator transform: several images are recorded at different transform angles for the same input image. An amplitude-phase retrieval scheme is designed and tested. Numerical simulations demonstrate that the amplitude and phase patterns can be recovered within a very small error (less than 0.04 and 0.0005, respectively, for 8-bit two-dimensional data) after 1000 iterations.
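The general structure of such a multi-measurement retrieval loop, enforcing each measured intensity in turn and averaging the resulting object estimates, can be sketched as below. As a stand-in for the gyrator-transform angle diversity (an assumption for illustration), random phase masks followed by an FFT provide the measurement diversity; the error-reduction update is generic, not the paper's exact scheme.

```python
import numpy as np

rng = np.random.default_rng(0)
n, n_masks, n_iter = 32, 6, 500

# ground-truth complex object: amplitude AND phase both unknown to the solver
x_true = rng.standard_normal(n) + 1j * rng.standard_normal(n)

# measurement diversity: random phase masks + FFT stand in for the
# several gyrator-transform angles used in the paper (an assumption)
masks = np.exp(1j * 2 * np.pi * rng.random((n_masks, n)))
intensities = np.abs(np.fft.fft(masks * x_true, axis=1)) ** 2

# error-reduction loop: enforce each measured modulus, average the updates
x = rng.standard_normal(n) + 1j * rng.standard_normal(n)   # random start
for _ in range(n_iter):
    estimates = []
    for d, I in zip(masks, intensities):
        y = np.fft.fft(d * x)
        y = np.sqrt(I) * np.exp(1j * np.angle(y))   # replace modulus, keep phase
        estimates.append(np.conj(d) * np.fft.ifft(y))
    x = np.mean(estimates, axis=0)

# reconstruction quality, measured up to a global phase factor
overlap = np.abs(np.vdot(x, x_true)) / (np.linalg.norm(x) * np.linalg.norm(x_true))
```

The overlap metric ignores the global phase ambiguity inherent to intensity-only measurements; with several diverse measurements the iteration typically converges close to the true complex field.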
Multiple-distance phase retrieval methods hold great promise for imaging and measurement because of their inexpensive and compact setups. As one implementation, the amplitude-phase retrieval (APR) algorithm achieves stable, high-accuracy reconstruction, but it suffers from slow convergence and stagnation. Here we propose an iterative modality, termed weighted feedback, to solve this problem. With single- and double-feedback plug-ins, two augmented approaches, the APRSF and APRDF algorithms, are demonstrated to increase the convergence speed by factors of two and three, respectively, in experiments. Furthermore, the APRDF algorithm extends multiple-distance phase retrieval to partially coherent illumination and enhances the imaging contrast of both amplitude and phase, which relaxes the light-source requirements. Weighted feedback thus enables a fast-converging, high-contrast imaging scheme for iterative phase retrieval.
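The abstract does not give the APRSF/APRDF update formulas, but the general mechanism, blending the new estimate with previous iterates through a weighted feedback term to accelerate a fixed-point iteration, can be illustrated on a toy problem. The scalar example and the feedback weight below are assumptions for illustration, not the paper's actual update.

```python
import math

def iterate(update, x0, tol=1e-10, max_iter=10_000):
    """Run x <- update(x) until successive iterates differ by < tol.
    Returns the final value and the iteration count."""
    x = x0
    for k in range(1, max_iter + 1):
        x_new = update(x)
        if abs(x_new - x) < tol:
            return x_new, k
        x = x_new
    return x, max_iter

g = math.cos                    # toy fixed-point problem: x = cos(x)
alpha = -0.4                    # feedback weight (hand-tuned for this g)
fb = lambda x: (1 + alpha) * g(x) - alpha * x   # weighted-feedback step

x_plain, n_plain = iterate(g, 0.5)   # plain iteration
x_fb, n_fb = iterate(fb, 0.5)        # with weighted feedback
```

Both iterations reach the same fixed point, since the feedback term vanishes there, but the feedback-weighted map has a much smaller contraction factor and therefore converges in far fewer iterations, mirroring the speed-ups reported for APRSF and APRDF.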
A tone-mapping operator (TMO) converts high dynamic range (HDR) content into a lower dynamic range so that it can be displayed on a standard dynamic range (SDR) device; the tone-mapped result of HDR content is usually stored as an SDR image. Across different HDR scenes, traditional TMOs obtain a satisfying SDR image only with manually fine-tuned parameters. In this paper, we address this problem by proposing a learning-based TMO using a deep convolutional neural network (CNN). We explore different CNN structures and adopt a multi-scale, multi-branch fully convolutional design. When training the deep CNN, we introduce image quality assessment (IQA), specifically tone-mapped image quality assessment, implemented as semi-supervised loss terms. We demonstrate the effectiveness of the semi-supervised loss terms, the CNN structure, the data pre-processing, and other design choices through several experiments. Finally, we show that our approach produces appealing results across diverse HDR scenes.
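For context on what a TMO computes, a classical global operator of the kind the learned approach is meant to replace can be sketched in a few lines. This is the well-known Reinhard global operator, not the paper's CNN; the key value and the synthetic luminance data are assumptions.

```python
import numpy as np

def reinhard_tmo(hdr_luminance, key=0.18, eps=1e-6):
    """Global Reinhard tone mapping: scale by the log-average luminance,
    then compress with L / (1 + L) so the output lies in [0, 1)."""
    log_avg = np.exp(np.mean(np.log(hdr_luminance + eps)))
    scaled = key * hdr_luminance / log_avg
    return scaled / (1.0 + scaled)

rng = np.random.default_rng(1)
hdr = rng.uniform(0.0, 1.0, (64, 64)) ** 4 * 1e4   # synthetic HDR luminance map
sdr = reinhard_tmo(hdr)
```

The single `key` parameter here is exactly the kind of per-scene manual tuning that a learned, scene-adaptive TMO aims to eliminate.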
As a coherent diffractive imaging technique, axial multi-image phase retrieval uses a series of diffraction patterns acquired through axial movement diversity to reconstruct the full object wave field. Theoretically, it offers fast convergence and high accuracy. In experiments, however, the retrieval suffers from tilted illumination, under which the diffraction patterns shift laterally as the receiver moves along the axis; the reconstructed result then becomes blurred or even erroneous. To solve this problem, we introduce cross-correlation calibration to derive the oblique angle and incorporate tilted diffraction into axial phase retrieval to recover the target, which we successfully demonstrate in both simulation and experiment. Our method can also serve as a useful guide for measuring how obliquely the incident light illuminates an optical system.
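The cross-correlation step that estimates the lateral shift between patterns recorded at different axial positions can be sketched with an FFT-based cross-correlation peak search. The synthetic pattern and names below are illustrative assumptions; a real calibration would convert the per-distance shifts into an oblique angle.

```python
import numpy as np

def estimate_shift(img_a, img_b):
    """Estimate the integer (row, col) shift of img_a relative to img_b
    via FFT-based cross-correlation (circular shifts assumed)."""
    cross = np.fft.ifft2(np.fft.fft2(img_a) * np.conj(np.fft.fft2(img_b)))
    peak = np.unravel_index(np.argmax(np.abs(cross)), cross.shape)
    # map peak positions past the midpoint to negative shifts
    return tuple(p - s if p > s // 2 else p for p, s in zip(peak, cross.shape))

rng = np.random.default_rng(2)
pattern = rng.random((64, 64))                 # stand-in diffraction pattern
shifted = np.roll(pattern, shift=(5, -3), axis=(0, 1))
dy, dx = estimate_shift(shifted, pattern)      # recovers (5, -3)
```

Dividing such a recovered lateral shift by the corresponding axial displacement between recording planes yields the tangent of the illumination tilt angle.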