Robots can mimic humans, including recognizing faces and emotions. However, relevant studies have not been implemented in real-time humanoid robot systems, and face and emotion recognition have typically been treated as separate problems. This study proposes combined face and emotion recognition for real-time application in a humanoid robot. Specifically, the face and emotion recognition systems were developed simultaneously using convolutional neural network (CNN) architectures. The proposed model was compared against well-known architectures, AlexNet and VGG16, to determine which is better suited for implementation on a humanoid robot. Face-recognition data were collected from 30 electrical engineering students; after preprocessing, this yielded 18,900 samples. Emotion data for five classes (surprise, anger, neutral, smile, and sad) were collected from the same respondents and combined with secondary data, for a total of 5,000 samples for training and testing. Testing was carried out in real time on a humanoid robot using all three architectures. With AlexNet, face and emotion recognition accuracies were 85% and 64%, respectively; VGG16 yielded 100% and 73%, and the proposed architecture yielded 87% and 67%. Thus, VGG16 performs best at recognizing both faces and emotions and can be implemented on humanoid robots. This study also provides a method for measuring the distance between the recognized object and the robot, with an average error rate of 2.52%.
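The CNN architectures compared above (AlexNet, VGG16, and the proposed model) are all built from the same basic building blocks: convolution, a nonlinearity such as ReLU, and pooling. As a minimal illustration of one such stage, the sketch below applies a single 3×3 convolution, ReLU, and 2×2 max pooling to a grayscale crop. The input size (48×48) and kernel are illustrative assumptions, not values from the paper.

```python
import numpy as np

def conv2d(img, kernel):
    """Valid 2-D convolution (cross-correlation, as used in CNN layers)."""
    kh, kw = kernel.shape
    H, W = img.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    """Rectified linear unit: zero out negative activations."""
    return np.maximum(x, 0.0)

def max_pool(x, size=2):
    """Non-overlapping max pooling with a size x size window."""
    H2, W2 = x.shape[0] // size, x.shape[1] // size
    return x[:H2 * size, :W2 * size].reshape(H2, size, W2, size).max(axis=(1, 3))

rng = np.random.default_rng(0)
img = rng.standard_normal((48, 48))            # one grayscale face crop (synthetic)
fmap = max_pool(relu(conv2d(img, rng.standard_normal((3, 3)))))
```

A full network such as VGG16 stacks many of these conv/ReLU/pool stages (with learned kernels and multiple channels) before fully connected classification layers; the paper's architectures differ mainly in depth and filter counts.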
A robot must employ a suitable control method to maintain stability. The two-wheeled self-balancing robot in this paper is designed using an MPU-6050 IMU sensor module and an ATmega128 microcontroller as its controller board. The IMU module measures changes in the robot's tilt angle based on the gyroscope and accelerometer readings it contains. The tilt-angle readings are then compared against the setpoint by the control method, namely PD (Proportional-Derivative), PI (Proportional-Integral), or PID (Proportional-Integral-Derivative) control. Based on the testing results, the PID controller is the best control strategy compared with PD and PI control. With parameters Kp = 14, Ki = 0.005, and Kd = 0.1, the robot is able to adjust the speed and direction of DC motor rotation to maintain an upright position on flat surfaces.
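The PID law described above can be sketched as a simple discrete-time controller: the output is Kp times the tilt error, plus Ki times the accumulated error, plus Kd times the error's rate of change. The gains below are those reported in the abstract; the setpoint (0° = upright), the sample period, and the measured tilt are illustrative assumptions.

```python
class PID:
    """Discrete PID controller: u = Kp*e + Ki*integral(e) + Kd*de/dt."""

    def __init__(self, kp, ki, kd, setpoint=0.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint = setpoint
        self.integral = 0.0
        self.prev_error = None

    def update(self, measurement, dt):
        error = self.setpoint - measurement
        self.integral += error * dt
        # No derivative term on the very first sample.
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Gains from the abstract; setpoint 0 degrees = upright.
pid = PID(kp=14, ki=0.005, kd=0.1, setpoint=0.0)
u = pid.update(measurement=2.0, dt=0.01)  # robot tilted 2 degrees forward
```

On the real robot, `u` would be mapped to PWM duty cycle and direction for the DC motors each control cycle; the dominant Kp term drives the wheels toward the direction of the tilt to catch the fall.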
Voice activity detection (VAD) is an important preprocessing step for various speech applications, identifying speech and non-speech periods in input signals. In this paper, we propose a deep neural network (DNN)-based VAD method for detecting such periods in noisy signals using speech dynamics, i.e., time-varying characteristics of speech that may be expressed as the first- and second-order derivatives of mel cepstra, also known as the delta and delta-delta features. Unlike these derivatives, in this paper the dynamics are highlighted as speech period candidates, which are calculated based on heuristic rules applied to the patterns of the first and second derivatives of the input signals. These candidates, together with the log power spectra, are input into the DNN to obtain VAD decisions. Experiments are conducted to compare the proposed method with a DNN-based method that exclusively uses log power spectra, on speech signals corrupted with five types of noise (white, babble, factory, car, and pink) at signal-to-noise ratios (SNRs) of 10, 5, 0, and −5 dB. The experimental results show that the proposed method is superior under all the considered noise conditions, indicating that the speech period candidates complement the log power spectra.
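The delta and delta-delta features mentioned above are conventionally computed by regressing each cepstral coefficient against a symmetric window of neighboring frames. A minimal sketch of this standard formulation (window half-width N, edge-padded; the frame count and dimensionality below are arbitrary):

```python
import numpy as np

def delta(feats, N=2):
    """First-order (delta) dynamic features along the time axis.

    feats: (T, D) array of frame-level features, e.g. mel cepstra.
    Uses the standard regression formula:
        d[t] = sum_{n=1..N} n * (c[t+n] - c[t-n]) / (2 * sum_{n=1..N} n^2)
    with edge padding at the boundaries. Returns a (T, D) array.
    """
    T = feats.shape[0]
    denom = 2 * sum(n * n for n in range(1, N + 1))
    padded = np.pad(feats, ((N, N), (0, 0)), mode="edge")
    out = np.zeros_like(feats, dtype=float)
    for t in range(T):
        out[t] = sum(
            n * (padded[t + N + n] - padded[t + N - n]) for n in range(1, N + 1)
        ) / denom
    return out

ceps = np.random.default_rng(0).standard_normal((100, 13))  # 100 frames, 13-dim (synthetic)
d1 = delta(ceps)    # delta features
d2 = delta(d1)      # delta-delta features (delta applied twice)
```

The paper's speech period candidates are derived from heuristic rules on such first- and second-derivative patterns rather than feeding the derivatives to the DNN directly; the sketch only illustrates the underlying dynamic-feature computation.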