Augmented reality (AR) is a well-established technology that can provide mass-market users with effective, customizable support in a wide spectrum of personal applications by overlaying computer-generated hints on the real world. Mobile devices, such as smartphones and tablets, are playing a key role in the rapid growth of these solutions. Nonetheless, some application domains have only just started to take advantage of AR systems. Maintenance, repair, and assembly have been regarded as strategic fields for the application of AR technology since the 1990s, but for a long time only specialists using ad hoc hardware were involved in limited experimental tests. Nowadays, AR-based maintenance and repair procedures are available to end users on consumer electronics devices as well. This paper explores new challenges and opportunities of this technology, and also presents the software framework being developed in the EASE-R 3 project, which exploits reconfigurable AR procedures and tele-assistance to overcome some of the limitations of current solutions.
The evolution of input device technologies has led to the identification of the natural user interface (NUI) as the natural successor in human-machine interaction, following the shift from command-line interfaces (CLI) to graphical user interfaces (GUI). The design of user interfaces requires a careful mapping of complex user "actions" in order to make human-computer interaction (HCI) more intuitive, usable, and receptive to the user's needs: in other words, more user-friendly and, why not, fun. NUIs are a direct expression of mental concepts, and the naturalness and variety of gestures, compared with traditional interaction paradigms, can offer unique opportunities for new and attractive forms of human-machine interaction. In this paper, a Kinect-based NUI is presented; in particular, the proposed NUI is used to control the Ar.Drone quadrotor.
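As an illustration of the kind of gesture-to-command mapping such a NUI requires, the sketch below maps the vertical offsets of the two tracked hands (as a Kinect-style skeleton tracker might report them) to simple flight commands. The thresholds, command names, and the mapping itself are assumptions for illustration, not the paper's actual control scheme.

```python
# Hypothetical sketch: mapping hand positions from a skeletal tracker to
# quadrotor commands. Thresholds and command names are illustrative
# assumptions, not taken from the paper.

def hands_to_command(left_y, right_y, dead_zone=0.1):
    """Map the vertical offsets of the two hands (relative to the
    shoulders, in metres) to a simple flight command."""
    if left_y > dead_zone and right_y > dead_zone:
        return "ascend"          # both hands raised
    if left_y < -dead_zone and right_y < -dead_zone:
        return "descend"         # both hands lowered
    if right_y - left_y > dead_zone:
        return "roll_left"       # right hand higher -> bank left
    if left_y - right_y > dead_zone:
        return "roll_right"      # left hand higher -> bank right
    return "hover"               # inside the dead zone

print(hands_to_command(0.3, 0.3))   # -> ascend
print(hands_to_command(0.0, 0.05))  # -> hover (inside the dead zone)
```

A dead zone around the neutral pose is a common design choice in gesture control, since it keeps small involuntary hand movements from being interpreted as commands.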
This paper studies the opportunities arising from the use of consumer devices, such as smartphones and tablets, to perform maintenance and assembly procedures with Augmented Reality (AR). Pros and cons are evaluated by comparing completion times and errors made while executing a maintenance procedure with an AR-based tool and with paper-based instructions.
Digitalization is transforming the very nature of factories, from automated systems into intelligent ones. In this process, industrial robots play a key role. Even though the repeatability, precision, and speed of industrial manipulators enable considerable production levels, factories must face an increasingly competitive market, which requires the ability to adapt dynamically to different situations and conditions. Hence, facilities are moving toward systems that rely on collaboration between humans and machines. Human workers should understand the behavior of the robots, placing trust in them in order to collaborate properly. If a fault occurs on a manipulator, its movements are stopped immediately for safety reasons, so workers may not be able to understand what has happened to the robot. The operators' stress and anxiety may therefore increase, compromising the human-robot collaborative scenario. This work fits in this context and proposes an adaptive Augmented Reality system to display industrial robot faults by means of the Microsoft HoloLens device. Starting from the methodology employed to identify which virtual metaphors best evoke robot faults, an adaptive modality is presented that dynamically displays the metaphors in positions close to the fault location, always visible to the user and not occluded by the manipulator. A comparison with a non-adaptive modality is proposed to assess the effectiveness of the adaptive solution. Results show that the adaptive modality allows users to recognize faults faster and with fewer movements than the non-adaptive one, thus overcoming the limitation of the narrow field of view of the HoloLens device.
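To illustrate the kind of placement logic the adaptive modality implies, the following sketch picks, among candidate offsets around a fault location, the nearest one that is both inside the field of view and not occluded. The 2D screen-space geometry, the candidate set, and the visibility tests are hypothetical simplifications, not the system's actual algorithm.

```python
# Illustrative sketch (not the paper's algorithm): choosing where to place a
# fault metaphor so it stays inside a narrow field of view and is not hidden
# behind the manipulator. Geometry is simplified to 2D screen space; the
# in_fov and occluded predicates are assumptions supplied by the caller.

def pick_placement(fault_pos, candidates, in_fov, occluded):
    """Return the candidate offset closest to the fault location that is
    both inside the field of view and not occluded."""
    visible = [
        c for c in candidates
        if in_fov((fault_pos[0] + c[0], fault_pos[1] + c[1]))
        and not occluded((fault_pos[0] + c[0], fault_pos[1] + c[1]))
    ]
    if not visible:
        return None  # fall back, e.g. to an off-screen direction indicator
    return min(visible, key=lambda c: c[0] ** 2 + c[1] ** 2)

# Toy usage: a 100x100 view, a vertical band occluded by the robot body.
in_view = lambda p: 0 <= p[0] <= 100 and 0 <= p[1] <= 100
behind_robot = lambda p: 40 <= p[0] <= 60
offsets = [(0, 0), (10, 0), (-30, 0), (30, 0)]
print(pick_placement((50, 50), offsets, in_view, behind_robot))  # -> (-30, 0)
```

Preferring the closest visible offset keeps the metaphor near the fault, which matches the abstract's stated goal of displaying metaphors close to the fault location.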
Augmented Reality (AR) applications are nowadays widely diffused in many fields of use, especially entertainment, and the market for AR applications on mobile devices is growing rapidly. Moreover, new and innovative hardware for human-computer interaction has been deployed, such as the Leap Motion Controller. This paper presents some preliminary results in the design and development of a hybrid interface for hands-free augmented reality applications. The paper introduces a framework to interact with AR applications through an interface based on speech and gesture recognition. A Leap Motion Controller is mounted on top of AR glasses, and a speech recognition module completes the system. Results have shown that, when the speech or gesture recognition modules are used individually, the robustness of the user interface is strongly dependent on environmental conditions. On the other hand, a combined use of both modules can provide more robust input.
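A combined use of two unreliable recognizers can be sketched as a simple confidence fusion: when both modalities agree the confidence is boosted, otherwise the more confident one wins. The weighting scheme and the `agree_bonus` parameter are assumptions for illustration, not the framework's actual implementation.

```python
# Hedged sketch of fusing two recognizers, in the spirit of the paper's
# speech+gesture combination. The scoring rule is an assumption.

def fuse(speech, gesture, agree_bonus=0.25):
    """Each input is a (command, confidence) pair, or None if that
    recognizer produced nothing. Return the fused (command, confidence)."""
    if speech is None:
        return gesture
    if gesture is None:
        return speech
    s_cmd, s_conf = speech
    g_cmd, g_conf = gesture
    if s_cmd == g_cmd:
        # Agreement between modalities: boost confidence, capped at 1.0.
        return (s_cmd, min(1.0, max(s_conf, g_conf) + agree_bonus))
    # Disagreement: keep the more confident modality.
    return speech if s_conf >= g_conf else gesture

print(fuse(("zoom", 0.5), ("zoom", 0.5)))     # -> ('zoom', 0.75)
print(fuse(("zoom", 0.5), ("rotate", 0.75)))  # -> ('rotate', 0.75)
```

This mirrors the abstract's observation: either module alone degrades in noisy or poorly lit environments, but an agreeing pair of weak signals can still yield a confident fused command.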