Copter-based unmanned aerial vehicle (drone) systems are increasingly used for surveillance, inspection, and security of well sites, gathering centers, pipelines, refineries, and other surface facilities. However, most current practices rely heavily on human involvement, including drone operation, data transfer, and image analysis. In this paper, we present a comprehensive, cloud-enabled, fully autonomous autopilot drone system and its application to field surveillance and anomaly detection, powered by customizable deep neural network and computer vision models.
The proposed system consists of customized quadcopter drones equipped with high-definition cameras, thermal imaging, and gas-sensing devices, autopiloted by cloud-connected onboard computers. A series of advanced algorithms is developed and deployed both onboard and in the cloud to process and diagnose the image, thermal, and gas-sensing data collected by the drones in real time or near real time, including accurate 2D geospatial aerial mapping and anomaly detection and classification for events such as oil leaks, gas leaks, facility failures, and human activities. Object-detection deep learning models are customized and parallelized to run on low-power, multi-core single-board computers.
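As a minimal sketch of how detection workloads might be parallelized across the cores of a single-board computer, the snippet below fans image frames out to a pool of worker processes. The `detect_frame` stub and worker count are illustrative assumptions, not the paper's actual onboard implementation:

```python
from multiprocessing import Pool

def detect_frame(frame):
    # Placeholder for a real object-detection model; here we simply
    # report the frame's mean intensity as a stand-in "score".
    total = sum(sum(row) for row in frame)
    count = len(frame) * len(frame[0])
    return {"frame_mean": total / count, "detections": []}

def detect_batch(frames, workers=4):
    # Fan frames out across CPU cores, preserving input order.
    with Pool(processes=workers) as pool:
        return pool.map(detect_frame, frames)

if __name__ == "__main__":
    # Two tiny synthetic 2x2 "frames".
    frames = [[[0, 0], [0, 0]], [[10, 10], [10, 10]]]
    results = detect_batch(frames, workers=2)
    print([r["frame_mean"] for r in results])  # [0.0, 10.0]
```

On a quad-core board, a per-core process pool like this keeps all cores busy without requiring a GPU, at the cost of inter-process serialization of each frame.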
In our case study, a pre-configured drone flew the same path twice, six months apart. A robust, iterative image registration algorithm is developed to precisely align and overlay images taken on different days at the same or similar GPS locations, even with significant changes to the environment due to seasonal shifts, human activities, or variations in camera angle and height. Local changes are then filtered and selected based on their size and magnitude in the residual images obtained by subtracting pairs of precisely overlaid scenes. A pre-trained residual convolutional neural network (ResNet; He et al. 2015) is rapidly retrained to further classify the types of changes using transfer learning and data augmentation. An area under the ROC curve of 99% was achieved in the multi-task binary classification, wherein the detected changes are divided into positive anomalies (such as oil/gas leaks, facility failures, and unauthorized human activities) and negative (natural or insignificant) signals. Compared with a support vector machine baseline (ROC area = 92%), the ResNet model demonstrates significantly higher detection accuracy with a shorter training time.
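The residual-based change selection described above can be sketched as follows: subtract a pair of already co-registered images, threshold the residual by magnitude, and keep only connected regions above a minimum size. The thresholds and helper names here are illustrative assumptions, not the authors' actual parameters:

```python
import numpy as np
from scipy import ndimage

def filter_changes(img_a, img_b, mag_thresh=30, min_pixels=20):
    """Select local changes from two co-registered grayscale images.

    Assumes img_a and img_b are already aligned (e.g. by an iterative
    feature-based registration step); thresholds are illustrative.
    """
    # Residual image: absolute per-pixel difference.
    residual = np.abs(img_a.astype(np.int16) - img_b.astype(np.int16))
    mask = residual > mag_thresh                    # magnitude filter
    labels, n = ndimage.label(mask)                 # connected regions
    sizes = ndimage.sum(mask, labels, range(1, n + 1))
    keep = [i + 1 for i, s in enumerate(sizes) if s >= min_pixels]
    return np.isin(labels, keep)                    # boolean change map

# Synthetic demo: a 6x6 bright patch appears in the second image,
# plus a single-pixel change that should be filtered out as noise.
a = np.zeros((64, 64), dtype=np.uint8)
b = a.copy()
b[10:16, 10:16] = 200   # large, strong change -> kept
b[40, 40] = 200         # single-pixel change  -> filtered out
change_map = filter_changes(a, b)
print(int(change_map.sum()))  # 36
```

The surviving change regions would then be cropped and passed to the retrained ResNet classifier to separate positive anomalies from natural or insignificant signals.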
We present this innovative integrated platform, which combines physical drones, onboard imaging and sensing devices, cloud connectivity, onboard and back-end control systems, and a deep learning and computer vision architecture for situational awareness in oil & gas fields and the mining industry. It achieves full automation of mass surveillance, data acquisition and storage, diagnostics, and asset situational understanding. The system architecture, especially the onboard and cloud computation engines, can be readily transferred and applied to other common drone platforms.