This paper implements and analyzes a CMOS angular-velocity- and direction-selective rotation sensor with a retinal processing circuit. The proposed rotation sensor has a polar structure and is selective for both the angular velocity and the direction (clockwise or counterclockwise) of image rotation. A correlation-based algorithm is adopted, in which each pixel in the rotation sensor is correlated with the pixel located 45° away. The angular velocity selectivity is enhanced by placing more than one pixel between two correlated pixels, and it depends on both the number and the positions of the edges in an image; a detailed analysis characterizes this selectivity for different edges. An experimental chip consisting of 104 pixels, which form five concentric circles, was fabricated. A single pixel has an area of 91 × 84 μm² and a fill factor of 20%, whereas the chip area is 1812 × 1825 μm². The experimental results from the fabricated chip successfully verified the analyzed characteristics of angular velocity and direction selectivity. They showed that the detectable angular velocity and the illumination range of this rotation sensor are from 2.5 × 10⁻³ °/s to 40 °/s and from 0.91 lux to 366 lux, respectively.
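As a rough software illustration of the correlation principle described above (not the paper's circuit), the sketch below correlates each pixel on a ring with the pixel 45° away, Reichardt-detector style: the delayed signal of one pixel is multiplied by the current signal of its neighbor, and the mirror term is subtracted. The sign of the summed correlation then indicates the rotation direction. The one-edge raised-cosine test pattern and the sign convention are assumptions for the example.

```python
import math

def ring_samples(n_pixels, phase):
    """Sample a simple one-edge intensity pattern on a ring of pixels.

    The pattern is a raised cosine whose peak sits at angle `phase`;
    it stands in for a rotating image edge (illustrative only).
    """
    step = 2 * math.pi / n_pixels
    return [0.5 * (1 + math.cos(i * step - phase)) for i in range(n_pixels)]

def rotation_correlation(n_pixels=8, sep=1, omega=0.3, dt=0.1):
    """Reichardt-style correlation of each pixel with the pixel `sep`
    positions away (45 degrees apart when n_pixels=8 and sep=1).

    Returns a signed correlation sum: positive for counterclockwise
    rotation, negative for clockwise (sign convention is ours).
    """
    prev = ring_samples(n_pixels, 0.0)          # frame at time t
    curr = ring_samples(n_pixels, omega * dt)   # frame rotated by omega*dt
    total = 0.0
    for i in range(n_pixels):
        j = (i + sep) % n_pixels
        # delayed pixel i correlated with current pixel j, minus the
        # mirror term, as in a classic correlation (Reichardt) detector
        total += prev[i] * curr[j] - curr[i] * prev[j]
    return total

ccw = rotation_correlation(omega=+0.3)  # counterclockwise: positive output
cw = rotation_correlation(omega=-0.3)   # clockwise: negative output
```

Increasing `sep` (more pixels between the two correlated pixels) shifts the detector's peak response to slower angular velocities, which is the selectivity-tuning idea the abstract mentions.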
This work presents and implements a CMOS real-time focal-plane motion sensor intended to detect global motion, using a bipolar junction transistor (BJT)-based retinal smoothing network and a modified correlation-based algorithm. In the proposed design, the BJT-based retinal photoreceptor and smoothing network acquire images and enhance image contrast, while the modified correlation-based algorithm determines the velocity and direction of the incident image. The deviations of the calculated velocity and direction for different image patterns are greatly reduced by averaging the correlated output over 16 frame-sampling periods. The proposed motion sensor comprises a 32 × 32 pixel array with a pixel size of 100 × 100 μm². The fill factor is 11.6% and the total chip area is 4200 × 4000 μm². The DC power consumption is 120 mW at 5 V in the dark. Experimental results have successfully confirmed that the proposed motion sensor can work with different incident images and detect velocities between 1 pixel/s and 140,000 pixels/s by controlling the frame-sampling period. The minimum detectable displacement in a frame-sampling period is 5 μm. Consequently, the proposed high-performance motion sensor can be applied to many real-time motion-detection systems.
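The velocity readout described above reduces to displacement per frame-sampling period, with the deviation suppressed by averaging over 16 periods. A minimal sketch of that arithmetic, assuming the paper's 100 μm pixel pitch and 16-period averaging (the noise values and frame period are made up for illustration):

```python
PIXEL_PITCH_UM = 100.0   # pixel size from the paper (100 x 100 um^2)
N_AVG = 16               # frame-sampling periods averaged, as in the paper

def velocity_pixels_per_s(displacements_um, frame_period_s):
    """Estimate velocity by averaging per-frame displacement estimates
    over N_AVG frame-sampling periods, then dividing by the period.

    `displacements_um` models noisy single-frame measurements; the
    averaging is what reduces the pattern-dependent deviation.
    """
    assert len(displacements_um) == N_AVG
    mean_disp_um = sum(displacements_um) / N_AVG
    return mean_disp_um / PIXEL_PITCH_UM / frame_period_s

# Noisy single-frame measurements around a true 100 um displacement
# (illustrative noise; it averages out over the 16 periods)
meas = [100 + e for e in (3, -2, 1, -4, 2, 0, -1, 3, -3, 2, 1, -2, 0, 1, -1, 0)]
v = velocity_pixels_per_s(meas, frame_period_s=0.001)  # -> 1000 pixels/s
```

Lengthening the frame-sampling period resolves slower motion and shortening it resolves faster motion, which is how a single correlator covers the wide 1 pixel/s to 140,000 pixels/s range quoted in the abstract.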
In this paper, the quantum-dot large-neighborhood cellular neural (nonlinear) network (QLN-CNN) is proposed and analyzed. Research efforts have been devoted to implementing CNNs using quantum-dot cellular automata (QCA) [3]; however, an LN-CNN realization using the QCA has not been proposed so far. In the proposed QLN-CNN, the QCA are used to form the neuron cells [4], and the strengths of the Coulombic forces among neurons are used to realize the synaptic weights. The proposed QLN-CNN can perform image noise removal, which verifies its correct function. It has a small chip area, high cell density, and very low power dissipation, so large-size QLN-CNNs could be realized for nanoelectronic systems and large-array applications.
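To make the noise-removal behavior concrete, here is a small software sketch of a CNN-style relaxation on a binary image: each cell settles toward the sign of a weighted sum over its neighborhood. The 3 × 3 neighborhood and the specific weights (center weighted more strongly) are assumptions standing in for the Coulombic coupling strengths of the QLN-CNN; this is an algorithmic illustration, not a model of the quantum-dot device.

```python
def remove_noise(image, steps=5):
    """CNN-style noise removal on a binary (+1/-1) image.

    Each cell repeatedly updates to the sign of a weighted sum over its
    3x3 neighborhood; isolated noise pixels are overruled by their
    neighbors. Weights are illustrative stand-ins for the Coulombic
    synaptic strengths in the QLN-CNN.
    """
    h, w = len(image), len(image[0])
    state = [row[:] for row in image]
    for _ in range(steps):
        new = [row[:] for row in state]
        for y in range(h):
            for x in range(w):
                s = 0.0
                for dy in (-1, 0, 1):
                    for dx in (-1, 0, 1):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w:
                            # center cell weighted 2, neighbors weighted 1
                            s += (2.0 if dy == dx == 0 else 1.0) * state[ny][nx]
                new[y][x] = 1 if s > 0 else -1
        state = new
    return state

# A uniform +1 image with one flipped (noisy) pixel in the middle
img = [[1] * 5 for _ in range(5)]
img[2][2] = -1
out = remove_noise(img, steps=1)  # the noise pixel is voted back to +1
```

A single relaxation step suffices here because the eight +1 neighbors outweigh the doubly weighted -1 center; in a hardware LN-CNN the same settling happens in continuous time.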