The popularity of Virtual Reality (VR) is increasing as it becomes more affordable for end users. Available VR hardware ranges from inexpensive low-end devices such as Google Cardboard to more expensive high-end headsets such as the HTC Vive and Oculus Rift. Using VR as a content delivery platform enables better user engagement than traditional methods, as VR headsets remove external distractions. Multi-user VR applications provide shared experiences in which users can communicate and interact in the same virtual space. This shared environment, however, introduces challenges regarding network performance, quality of service (QoS) and session privacy. This paper presents a multi-user VR application and evaluates network behaviour in a number of scenarios, involving both real VR headsets (i.e. Oculus Rift) and simulated ones. This QoS analysis is important for understanding how many VR users can be connected simultaneously while maintaining high image quality.
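The abstract above does not specify which QoS metrics were computed; a minimal sketch of the kind of per-session analysis it describes, assuming matched send/receive timestamps are available per packet, could look like the following (function name and metric choices are illustrative assumptions, not the paper's actual methodology):

```python
from statistics import mean

def qos_metrics(send_times, recv_times, payload_bytes):
    """Estimate basic QoS figures (latency, jitter, throughput) for one
    VR session from matched packet timestamps, in seconds and bytes."""
    latencies = [r - s for s, r in zip(send_times, recv_times)]
    # Jitter approximated as the mean absolute difference between
    # consecutive one-way delays.
    jitter = (mean(abs(latencies[i] - latencies[i - 1])
                   for i in range(1, len(latencies)))
              if len(latencies) > 1 else 0.0)
    duration = max(recv_times) - min(send_times)
    throughput = sum(payload_bytes) / duration if duration > 0 else 0.0
    return {"latency": mean(latencies),
            "jitter": jitter,
            "throughput": throughput}
```

Computing such figures per connected headset is one way to estimate how many simultaneous users a server can support before latency or jitter degrades image quality.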
Virtual reality (VR) is currently used in many different areas, such as car prototyping, gaming, medical training and teaching. Internet of Things (IoT) devices such as systems-on-a-chip (e.g. Raspberry Pi), smart appliances and sensors support a wide range of services, including machine automation and remote monitoring and control. This paper introduces a novel social VR-IoT environment, which allows users to share and control local or remote IoT devices on a virtual platform. Two approaches using the VR-IoT solution are presented: one local network-based and one cloud-based. The proposed VR-IoT environment contains VRITESS, a novel VR-IoT Environment Synchronization Scheme, which facilitates a consistent and integrated experience for users by enabling control of real IoT objects with VR headsets. Control of IoT objects located in extreme environments, or of devices that are complex to operate, can be simplified in a virtual environment. The VRITESS synchronization scheme keeps the real objects updated following instructions given in the virtual world, and vice versa. Testing involved local network-based and cloud-based testbeds created with a VR headset and IoT devices at the Performance Engineering Laboratory, Dublin City University, Ireland. Test results demonstrated that lower latency is experienced in the local-network testbed than in the cloud testbed. Furthermore, tests on the communication protocols implemented in the cloud testbed indicated that MQTT generates less delay and data traffic than REST.

INDEX TERMS Multimedia IoT, three-dimensional visualization, virtual reality (VR), VR-IoT.

ANDERSON AUGUSTO SIMISCUKA (S'17) received the B.Sc. degree in information systems from Mackenzie Presbyterian University, São Paulo, Brazil. He is currently pursuing the Ph.D. degree with the Performance Engineering Laboratory, School of Electronic Engineering, Dublin City University (DCU). He was involved in several telecom and software development projects with companies such as Wittel,
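The abstract describes VRITESS keeping real IoT objects and their virtual counterparts mutually updated. A minimal sketch of such bidirectional state synchronization, assuming a last-writer-wins policy on update timestamps (the class name, fields and policy are illustrative assumptions; the actual VRITESS scheme is not detailed in the abstract), could look like this:

```python
import time

class SyncedObject:
    """Bidirectional state sync between a real IoT object and its virtual
    replica: whichever side writes an attribute last wins, and the value
    is propagated to both sides. Hypothetical sketch, not the VRITESS
    implementation."""

    def __init__(self):
        self.real = {}      # state as seen by the physical device
        self.virtual = {}   # state as seen by the VR environment
        self._stamps = {}   # attribute -> (timestamp, origin)

    def update(self, origin, attr, value, ts=None):
        ts = ts if ts is not None else time.monotonic()
        prev = self._stamps.get(attr)
        # Ignore updates older than the last accepted one (last-writer-wins).
        if prev is None or ts >= prev[0]:
            self._stamps[attr] = (ts, origin)
            self.real[attr] = value
            self.virtual[attr] = value
```

In the cloud testbed, the propagation step would be carried over MQTT or REST, which is where the reported delay and traffic differences between the two protocols arise.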
The Internet of Things (IoT) can benefit from device-to-device (D2D) communication techniques to increase object data exchange performance. IoT networks aim to offer a massive number of services at high quality levels, and many of the devices providing these services are mobile. Devices such as wearables, sensors, drones and smart vehicles need constant connectivity despite their movement patterns, and therefore an IoT architecture should consider both Quality of Service (QoS) and mobility. D2D allows devices to communicate directly to share content and functionality, such as access to the Internet. This paper proposes REMOS-IoT, a RElay and MObility Scheme for improved IoT communication performance, in support of increased QoS for the data exchange services between mobile IoT devices. Simulation-based testing showed how the performance of devices increased in several scenarios, demonstrating the efficiency of the proposed architecture and algorithms.
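A core step in any D2D relay scheme is choosing which intermediate device should forward traffic. As a hedged illustration of the idea (not the actual REMOS-IoT algorithm, which also accounts for mobility patterns), a simple selection rule picks the relay that maximizes the bottleneck link quality on the two-hop path:

```python
def select_relay(source, dest, candidates, link_quality):
    """Pick the relay maximizing the weakest-link quality on the path
    source -> relay -> dest. link_quality maps (a, b) node pairs to a
    score such as SNR. Hypothetical sketch of relay selection."""
    best, best_score = None, float("-inf")
    for relay in candidates:
        # Two-hop path quality is limited by its weaker hop.
        score = min(link_quality.get((source, relay), float("-inf")),
                    link_quality.get((relay, dest), float("-inf")))
        if score > best_score:
            best, best_score = relay, score
    return best, best_score
```

In a mobility-aware scheme, the quality scores would be refreshed as devices move, and relays re-selected when the bottleneck quality drops below a QoS threshold.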
High-resolution audio-visual virtual reality (VR) technologies currently offer satisfying experiences for both sight and hearing in the world of multimedia. However, the delivery of truly immersive experiences requires the incorporation of other senses, such as touch and smell. Multisensorial effects are usually synchronized manually with videos, and the data is stored in companion files containing timestamps for these effects. This manual task becomes very complex for 360° videos, as the scenes triggering effects can occur in different viewpoints. The solution proposed in this paper aims to automatically add extra sensory information to immersive 360° videos. A novel scent prediction scheme using Convolutional Neural Networks (CNN) is proposed to perform scene predictions on 360° videos represented in the Equi-Angular Cubemap format, in order to add scents relevant to the detected content. Digital signal processing is used to detect loud sounds in the video with a Root Mean Squared (RMS) function, and these sounds are then associated with haptic feedback. A prototype was developed which outputs multisensorial stimuli by using an olfaction dispenser and a haptic mouse. The proposed solution has been tested and achieved excellent results in terms of accuracy of scene detection, olfaction latency and correct execution of the relevant effects. Different CNN architectures, including AlexNet, ResNet18 and ResNet50, were also assessed comparatively, achieving a labeling accuracy of up to 72.67% for olfaction-enhanced media.

INDEX TERMS Multisensory, neural networks, three-dimensional visualization, immersive video.
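The RMS-based loud-sound detection described above can be sketched as follows: split the audio track into fixed-length frames, compute each frame's RMS energy, and flag frames exceeding a threshold as haptic triggers (the frame length and threshold values here are assumptions; the paper does not state its exact parameters):

```python
import numpy as np

def loud_frames(samples, frame_len, threshold):
    """Return the start indices of audio frames whose RMS energy exceeds
    a threshold, used as triggers for haptic feedback."""
    starts = []
    for i in range(0, len(samples) - frame_len + 1, frame_len):
        frame = samples[i:i + frame_len]
        # RMS = sqrt of the mean of squared sample values.
        rms = np.sqrt(np.mean(frame.astype(np.float64) ** 2))
        if rms > threshold:
            starts.append(i)
    return starts
```

Each returned index can be converted to a timestamp (index divided by the sampling rate) and written to the companion effect file alongside the CNN scent predictions.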