Visual sensing of the environment is crucial for flying an unmanned aerial vehicle (UAV) and is a centerpiece of many related applications. The ability to run computer vision and machine learning algorithms onboard an unmanned aerial system (UAS) is increasingly a necessity: it alleviates the communication burden of high-resolution video streaming, enables flying aids such as obstacle avoidance and automated landing, and underpins fully autonomous machines. There is thus growing research interest in developing and validating solutions suitable for deployment on a UAV, following the general trend of edge processing and airborne computing, which transforms UAVs from moving sensors into intelligent nodes capable of local processing. In this paper, we rigorously present the design and implementation of a 12.85 kg UAV system equipped with the computational power and sensors needed to serve as a testbed for image processing and machine learning applications, explain the rationale behind our decisions, highlight selected implementation details, and demonstrate the usefulness of our system by showing how a sample computer vision application can be deployed on the platform.
Situational awareness is a critical aspect of the decision-making process in emergency response and civil protection and requires the availability of up-to-date information on the current situation. In this context, the related research should encompass not only innovative individual solutions for (real-time) data collection, but also the transformation of data into information, so that the latter can serve as a basis for action and decision making. Unmanned systems (UxV), as data acquisition platforms and autonomous or semi-autonomous measurement instruments, have become attractive for many applications in emergency operations. This paper proposes a multipurpose situational awareness platform that exploits advanced on-board processing capabilities and efficient computer vision, image processing, and machine learning techniques. The main pillars of the proposed platform are: (1) a modular architecture that exploits unmanned aerial vehicle (UAV) and terrestrial assets; (2) deployment of on-board data capturing and processing; (3) provision of geolocalized object detection and tracking events; and (4) a user-friendly operational interface for standalone deployment and seamless integration with external systems. Experimental results are provided using RGB and thermal video datasets and applying novel object detection and tracking algorithms. The results show the utility and the potential of the proposed platform, and future directions for extension and optimization are presented.
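Geolocalizing a detection event from a UAV typically combines the detection's pixel coordinates with the vehicle's pose. The abstract does not specify the method used; the sketch below illustrates one common simplification: a flat-ground, downward-looking pinhole projection. All function and parameter names are illustrative assumptions, not details from the paper.

```python
import math

def detection_to_ground(px, py, img_w, img_h, hfov_deg, vfov_deg,
                        uav_lat, uav_lon, altitude_m, yaw_deg):
    """Project a detection center (px, py) to an approximate lat/lon,
    assuming a nadir-looking camera over locally flat ground."""
    # Angular offset of the pixel from the optical axis.
    ax = math.radians((px / img_w - 0.5) * hfov_deg)
    ay = math.radians((0.5 - py / img_h) * vfov_deg)
    # Ground-plane displacement in the camera frame (metres).
    dx_cam = altitude_m * math.tan(ax)   # right of track
    dy_cam = altitude_m * math.tan(ay)   # ahead along track
    # Rotate into north/east components using the UAV heading.
    yaw = math.radians(yaw_deg)
    north = dy_cam * math.cos(yaw) - dx_cam * math.sin(yaw)
    east = dy_cam * math.sin(yaw) + dx_cam * math.cos(yaw)
    # Small-offset conversion from metres to degrees (equirectangular).
    lat = uav_lat + north / 111_320.0
    lon = uav_lon + east / (111_320.0 * math.cos(math.radians(uav_lat)))
    return lat, lon
```

A detection at the image center maps back to the UAV's own ground position; off-center detections are displaced by altitude-scaled view angles. A production system would instead use the full camera intrinsics, gimbal attitude, and a terrain model.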
Rescue operations in both small-scale emergencies and major natural or man-made disasters are very challenging. First responders are required to explore unknown and potentially hazardous environments, risking their own well-being in order to save others. Innovative technologies are essential to support first responders in their tasks, ensuring their safety and the effectiveness of their operations. These technologies, which may include wearable devices, automated vehicles, drones, and back-end services, require communications to operate at full capacity. Available infrastructure often fails in emergencies, while some operational environments may not support it to begin with. To alleviate this barrier, we present a resilient, field-deployable system that supports communication between all equipment deployed in the field and multiple backhaul networks. The design of the communications, the hardware solutions that support it, and the selected configurations are discussed in detail.
The evacuation and abandonment of large passenger ships, involving thousands of passengers, is a safety-critical task where techniques and systems that can improve the complex decision-making process and the timely response to emergencies on board are of vital importance. Current evacuation systems and processes are based on predefined, static exit signs and on information provided to passengers in the form of evacuation drills, emergency information leaflets, and public announcement systems. Passengers are required to wear lifejackets during an evacuation, which are made of buoyant or inflatable material to keep them afloat in the water. Time is the most critical attribute in ship evacuation and can significantly affect the overall process if passengers do not reach their embarkation stations promptly. Moreover, extreme conditions and hazards, such as fire or flooding, can hinder timely evacuation. To improve the current evacuation systems onboard large passenger ships, a smart lifejacket has been designed and implemented within the context of the SafePASS project. The proposed smart lifejacket integrates indoor localization and navigation functionality to assist passengers during the evacuation process. Once the passenger's location within the ship has been calculated, the navigation feature guides the passenger along an escape route using vibration motors attached to the lifejacket, delivering haptic cues that help passengers reach their destination, especially in low-visibility conditions or when they are left behind or lost. This can increase passenger safety and reduce the total evacuation time, as well as support dynamic evacuation scenarios where the predefined, static exit routes may be unavailable due to fire or flooding incidents.
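The core of such haptic guidance is mapping the direction toward the next escape-route waypoint onto one of the lifejacket's vibration motors. The abstract does not describe the motor layout or selection logic; the sketch below assumes a hypothetical four-motor arrangement (front, right, back, left) and picks the motor whose direction best matches the required turn. All names and the layout are assumptions for illustration only.

```python
import math

# Hypothetical motor layout: angle (degrees, clockwise from "ahead")
# at which each motor sits on the lifejacket. Not from the paper.
MOTORS = {"front": 0.0, "right": 90.0, "back": 180.0, "left": 270.0}

def bearing_deg(cur, target):
    """Bearing from current (x, y) deck position to target, degrees [0, 360)."""
    dx, dy = target[0] - cur[0], target[1] - cur[1]
    return math.degrees(math.atan2(dx, dy)) % 360.0

def motor_for_cue(cur, target, heading_deg):
    """Select the motor to pulse so its direction points toward the waypoint,
    relative to the passenger's current heading."""
    relative = (bearing_deg(cur, target) - heading_deg) % 360.0
    # Choose the motor with the smallest circular angular distance.
    return min(MOTORS, key=lambda m: min(abs(relative - MOTORS[m]),
                                         360.0 - abs(relative - MOTORS[m])))
```

For example, a waypoint directly ahead pulses the front motor, while one to the passenger's right pulses the right motor; re-evaluating the cue as the localization estimate updates yields continuous guidance along the route.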