The management of remote services, such as remote surgery, remote sensing, and remote driving, has become increasingly important, especially with emerging 5G and Beyond-5G technologies. However, the strict network requirements of these remote services are among the major challenges hindering their fast, large-scale deployment in critical infrastructures. This article addresses issues inherent in the remote and immersive control of virtual reality (VR)-based unmanned aerial vehicles (UAVs), whereby a user remotely controls UAVs equipped with 360° cameras using a head-mounted display (HMD) and its controllers. Remote and immersive control services that use 360° video streams require very low latency and high throughput to achieve true immersion and high service reliability. To assess and analyze these requirements, this article introduces a real-life testbed system that leverages several technologies (e.g., VR, 360° video streaming over 4G/5G, and edge computing). The performance evaluation considers three latency types: 1) the glass-to-glass latency between the 360° camera of a remote UAV and the HMD display; 2) the user/pilot's reaction latency; and 3) the command/execution latency. The obtained results indicate that the responsiveness of a pilot to a sudden event (dubbed Glass-to-Reaction-to-Execution, or GRE, latency) using our system is within an acceptable range, i.e., around 900 ms.
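The GRE latency named above is the sum of the three measured stages. A minimal sketch of that composition follows; the per-stage values here are purely illustrative assumptions (the abstract reports only the end-to-end figure of roughly 900 ms, not this breakdown):

```python
# Illustrative per-stage values (assumed, not reported in the article).
glass_to_glass_ms = 400      # 360° camera capture -> HMD display
reaction_ms = 300            # pilot's reaction to a sudden on-screen event
command_execution_ms = 200   # command transmission -> UAV execution

# Glass-to-Reaction-to-Execution (GRE) latency: the sum of the three stages.
gre_latency_ms = glass_to_glass_ms + reaction_ms + command_execution_ms
print(gre_latency_ms)  # -> 900
```

Any real deployment would measure each stage independently (e.g., with synchronized timestamps at the camera, HMD, and UAV flight controller) rather than assume fixed values.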
The emergence of 5th-generation (5G) networks and the introduction of the ultra-low-latency Internet, namely the Tactile Internet, by the International Telecommunication Union (ITU) have opened up a wide range of applications. Extended Reality (XR), holoportation, and the remote control of machines are among those that would revolutionize the future of factories, smart cities, and digital healthcare. Virtual Reality (VR) technology provides users with highly realistic visual and auditory experiences, enabling a strong sense of immersion and embodiment in virtual environments. In the real world, however, we use more senses than vision and hearing alone to perceive our surroundings. In particular, tactile sensation is the only bidirectional modality that enables us to both perceive and interact with the objects and surfaces around us. This paper introduces a real-life testbed Unmanned Aerial Vehicle (UAV)-based system that leverages different technologies, including VR, 360° video streaming over 4G/5G, and edge computing, to enable a 6-Degrees-of-Freedom (6DoF) view with haptic feedback for maximized immersion. Demonstration videos of the testbed are publicly available at the following links: Link1 and Link2.