To facilitate content-based video analysis, automatic scene change detection (SCD) under large-scale motion activity is a fundamental step for locating transitions from one video scene to another. With the exponential growth of digital media, SCD has become more challenging: content with large motion must be processed with minimal information loss and maximum content preservation. Wipe SCD under object-camera motion is clear evidence of this issue. Wipe transitions, a type of gradual transition, exhibit diverse motion-pattern changes when influenced by object-camera motion (camera pan, large objects, and zoom in/out), creating a velocity imbalance within a single frame; this motion imbalance leads to false detections. To address the loss of motion information and the long processing time of existing frameworks, we propose a novel method of wipe scene change detection (WSCD) based on deep spatial-motion feature analysis. First, long input videos are segmented into shots using dimensionality reduction and an adaptive threshold. Second, linear regression is used to compute slope-angle changes within shots for candidate selection and wipe localization. Finally, only the selected candidates are processed by a two-stream inflated 3D convolutional neural network (I3D-CNN), with an RGB stream and an optical-flow velocity motion stream, to extract features, which are then classified into wipe-in-motion and no-motion clips. The final results are obtained by classifying wipe patterns using a detection reviewing-and-merging strategy on the corresponding wipe frames. Evaluated on the benchmark TRECVID dataset, the proposed method improves wipe scene change detection accuracy by an average of 11.9%, demonstrating its efficacy.
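The candidate-selection step described above fits a line to a frame-level difference signal and inspects its slope angle. A minimal sketch of that idea is shown below; the window length, threshold, and function names are assumptions for illustration, not the paper's actual parameters.

```python
import numpy as np

def slope_angle(signal, win=8):
    """Fit a least-squares line to each sliding window of a 1-D
    frame-difference signal and return the slope angle in degrees.
    (Window length `win` is a hypothetical choice.)"""
    angles = []
    x = np.arange(win)
    for start in range(len(signal) - win + 1):
        y = signal[start:start + win]
        slope, _ = np.polyfit(x, y, 1)  # linear regression: y = slope*x + b
        angles.append(np.degrees(np.arctan(slope)))
    return np.array(angles)

def select_wipe_candidates(signal, win=8, angle_thresh=20.0):
    """Mark windows whose slope angle exceeds a threshold as candidate
    wipe regions (threshold value is a placeholder, not from the paper)."""
    angles = slope_angle(signal, win)
    return np.flatnonzero(np.abs(angles) > angle_thresh)
```

A steadily ramping difference signal (as produced by a wipe sweeping across the frame) yields a large, stable slope angle, while flat regions yield angles near zero and are skipped, so only the candidates pass to the heavier I3D-CNN stage.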
Several studies on electrooculography (EOG) interfaces for human-computer interaction (HCI) have been developed in recent years. For users with disabilities such as locked-in syndrome and motor neuron disease, a simple and effective communication technology is necessary. In existing research, an eye blink is defined as the selection command, but a problem occurs when the user blinks involuntarily. To resolve this problem, in this paper we develop a new EOG-based system for typing words on a virtual keyboard using a voltage-threshold algorithm. EOG signals corresponding to different eye movements in the horizontal and vertical directions are detected. The EOG signal is measured over two channels with six electrodes. The measurement circuit consists of three major stages: an instrumentation amplifier, a filter, and a signal-conditioning amplifier. These circuits filter out noise, pass the frequency range of the EOG signal, and then amplify it. The voltage-threshold algorithm is then used to classify the EOG signal. The selection command is defined as closing the eyes for a short period, to avoid triggering on involuntary blinks. To test the performance of the method, typing rate and accuracy are measured: the typing rate on the virtual keyboard is 25.94 seconds per letter, and the accuracy is 95.2%. These results show the feasibility of the proposed method.
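The voltage-threshold classification and the blink-rejecting selection command described above can be sketched as follows. The threshold voltages, sample counts, and function names here are illustrative assumptions, not the values used in the paper.

```python
def classify_eog(h_volts, v_volts, thresh=0.15):
    """Classify one EOG sample pair (horizontal and vertical channel,
    in volts after amplification) into a direction command using a
    fixed voltage threshold (placeholder value, not calibrated)."""
    if h_volts > thresh:
        return "RIGHT"
    if h_volts < -thresh:
        return "LEFT"
    if v_volts > thresh:
        return "UP"
    if v_volts < -thresh:
        return "DOWN"
    return "NEUTRAL"

def is_selection(v_window, thresh=-0.15, min_samples=50):
    """Treat a sustained negative vertical deflection (deliberate eye
    closure) as the selection command. Requiring the deflection to
    persist for at least `min_samples` consecutive samples rejects
    brief involuntary blinks (durations are assumptions)."""
    return len(v_window) >= min_samples and all(v < thresh for v in v_window)
```

The key design point is duration: an involuntary blink produces a short vertical spike, while a deliberate closure holds the signal below threshold long enough to satisfy `min_samples`, which is how the system avoids false selections.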