This study presents a modular implementation of augmented reality to provide an immersive experience in learning or teaching the planning phase, control system, and machining parameters of a fully automated work cell. The system architecture consists of three code modules that can operate independently or in combination to create a complete system capable of guiding engineers from the layout planning phase to the prototyping of the final product. The layout planning module determines the best possible arrangement of machines in a layout, in this case a conveyor belt for transportation, a robot arm for pick-and-place operations, and a computer numerical control (CNC) milling machine to generate the final prototype. The robotic arm module simulates the pick-and-place operation offline from the conveyor belt to the CNC machine, utilising collision detection and inverse kinematics. Finally, the CNC module performs virtual machining based on the Uniform Space Decomposition method and axis-aligned bounding box (AABB) collision detection. The case study revealed that, for the given situation, a semi-circular arrangement is desirable, while the pick-and-place system and the generated G-code produced maximum deviations of 3.83 mm and 5.8 mm, respectively.
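The abstract names axis-aligned bounding box collision detection as the basis for virtual machining. As a minimal illustrative sketch (not the paper's implementation, and with hypothetical names), two AABBs collide exactly when their intervals overlap on every axis:

```python
from dataclasses import dataclass

@dataclass
class AABB:
    # Minimum and maximum corners of the box along the (x, y, z) axes.
    min_corner: tuple
    max_corner: tuple

def aabb_overlap(a: AABB, b: AABB) -> bool:
    """Two AABBs collide iff their projections overlap on all three axes."""
    return all(
        a.min_corner[i] <= b.max_corner[i] and b.min_corner[i] <= a.max_corner[i]
        for i in range(3)
    )

# Hypothetical example: a tool bounding box against a workpiece voxel.
tool = AABB((0.0, 0.0, 0.0), (1.0, 1.0, 1.0))
voxel = AABB((0.5, 0.5, 0.5), (1.5, 1.5, 1.5))
print(aabb_overlap(tool, voxel))  # True: intervals overlap on all three axes
```

Because each check is three interval comparisons, the test is cheap enough to run per voxel of a spatially decomposed workpiece, which is why AABB tests pair naturally with space-decomposition methods such as the one named above.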
In this article, the development of an augmented reality–based robotic work cell is presented, consisting of a virtual robot arm, conveyor belt, pallet, and computer numerical control (CNC) machine that simulates an actual manufacturing plant environment. The kinematics of the robot arm is realized using the Denavit–Hartenberg convention, which enables complete manipulation of the end-effector in three-dimensional space when interacting with other virtual machines. Collision detection is implemented in two areas: modifiable marker-based detection for the robot arm, which detects nearby obstacles, and integration with object manipulation to pick and place a virtual object around the environment. In addition, an augmented heads-up display overlays live information about the current system. The case studies suggest that the proposed system can simulate a collision-free operation while displaying the coordinates of the virtual object, the current tool equipped, and the speed of the conveyor belt, with a percentage error of less than 5%.
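Under the standard Denavit–Hartenberg convention mentioned above, each joint contributes a 4×4 homogeneous transform built from four parameters (θ, d, a, α), and chaining these transforms gives the end-effector pose. A minimal sketch, with a hypothetical two-link planar arm as the example (link lengths and joint angles are illustrative, not taken from the paper):

```python
import math

def dh_transform(theta, d, a, alpha):
    """Homogeneous transform between consecutive links from the four
    standard Denavit-Hartenberg parameters."""
    ct, st = math.cos(theta), math.sin(theta)
    ca, sa = math.cos(alpha), math.sin(alpha)
    return [
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ]

def mat_mul(A, B):
    """4x4 matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def forward_kinematics(dh_rows):
    """Chain the per-joint transforms; the last column of the result is
    the end-effector position in the base frame."""
    T = [[float(i == j) for j in range(4)] for i in range(4)]
    for row in dh_rows:
        T = mat_mul(T, dh_transform(*row))
    return T

# Two revolute joints, both links of length 1.0, motion in the xy-plane.
T = forward_kinematics([( math.pi / 2, 0.0, 1.0, 0.0),
                        (-math.pi / 2, 0.0, 1.0, 0.0)])
x, y, z = T[0][3], T[1][3], T[2][3]
print(round(x, 6), round(y, 6), round(z, 6))  # 1.0 1.0 0.0
```

The first joint points the first link along +y; the second joint rotates back by 90°, so the second link extends along +x, placing the end-effector at (1, 1, 0). Inverse kinematics, as used for the pick-and-place simulation, solves the reverse problem: finding joint angles that achieve a desired end-effector pose.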
Frisson is the feeling and experience of physical reactions such as shivers, tingling skin, and goosebumps. Using entrainment to facilitate the interpersonal transmission of embodied sensations, we present "Frisson Waves" with the aim of enhancing live music performance experiences. "Frisson Waves" is an exploratory real-time system to detect, trigger, and share frisson in a wave-like pattern across audience members during music performances. The system consists of a physiological sensing wristband for detecting frisson and a thermo-haptic neckband for inducing frisson. In a controlled environment, we evaluated the detection (n=19) and triggering (n=15) of frisson. Based on our findings, we conducted an in-the-wild music concert with 48 audience members using our system to share frisson. This paper summarizes a framework for accessing, triggering, and sharing frisson. We report our research insights, lessons learned, and the limitations of "Frisson Waves".