Talented musicians can deliver a powerful emotional experience to the audience by skillfully modifying several musical parameters, such as dynamics, articulation, and tempo. Musical robots are expected to control those musical parameters in the same way to give the audience an experience comparable to listening to a professional human musician. But practical control of those parameters depends on the type of musical instrument being played. In this study, we describe our newly developed music dynamics control system for the Waseda Anthropomorphic Saxophonist robot. We first built a physical model for the saxophone reed motion and verified the dynamics-related parameters of the overall robot-saxophone system. We found that the magnitude of air flow is related to the sound pressure level, as expected, but also that the lower lip is critical to the sound stability. Accordingly, we then implemented a music dynamics control system for the robot and succeeded in enabling the robot to perform a music piece with different sound pressure levels.

Index Terms: Entertainment robotics, human-centered robotics, humanoid robots.

I. INTRODUCTION

Music is a social activity that can powerfully influence large groups of people. A skillful musician can elicit powerful emotions in the audience by careful modulation of several different musical parameters, such as dynamics, tempo, articulation and pitch [1]. In the emerging field of entertainment robotics, musical robots are attracting attention for their multiuser interactive experience potential [2]. With their musical performance abilities, these robots are expected to entertain and interact with a large crowd.
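The reed model itself is not detailed in this excerpt. As a generic illustration only, a common starting point for reed motion is a lumped single-mass model: the reed tip is treated as a damped mass-spring system driven by the pressure difference across the reed. The sketch below integrates that model with semi-implicit Euler; every parameter value is hypothetical and is not taken from the authors' system.

```python
# Illustrative sketch of a generic lumped single-mass reed model:
#   m*y'' + r*y' + k*y = -S * delta_p
# where y is the reed-tip displacement and delta_p the mouth-bore
# pressure difference. All numeric values below are hypothetical.

def simulate_reed(delta_p=2000.0, steps=5000, dt=1e-6):
    """Integrate the reed equation with semi-implicit Euler.

    delta_p : assumed constant pressure difference [Pa]
    Returns the reed-tip displacement trajectory [m].
    """
    m = 3e-6      # effective reed mass [kg] (hypothetical)
    r = 1e-2      # damping coefficient [N*s/m] (hypothetical)
    k = 3000.0    # reed stiffness [N/m] (hypothetical)
    S = 1e-4      # effective reed area [m^2] (hypothetical)

    y, v = 0.0, 0.0
    traj = []
    for _ in range(steps):
        a = (-S * delta_p - r * v - k * y) / m
        v += a * dt        # update velocity first (semi-implicit Euler)
        y += v * dt        # then position, using the new velocity
        traj.append(y)
    return traj

traj = simulate_reed()
# For a constant delta_p the oscillation decays and the reed settles
# toward its static deflection, y = -S * delta_p / k.
```

In a fuller treatment, delta_p would itself be coupled to the bore acoustics, producing self-sustained oscillation rather than the decaying transient shown here; this sketch covers only the mechanical side.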
The benchmark suggests that the implemented system performed acceptably, and evaluation in the training environment demonstrated improved surgical task outcomes for expert surgeons. We will conduct a more comprehensive in vivo study in the future.
Purpose: An endoscopic system is needed that presents informative images irrespective of the surgical situation and the number of degrees of freedom in endoscopic manipulation. This goal may be achieved with a virtual reality view of a region of interest from an arbitrary viewpoint. An endoscopic pseudo-viewpoint alternation system for this purpose was developed and tested.

Method: Surgical experts and trainees from an endoscopic surgery training course at the minimally invasive surgery training center of Kyushu University were enrolled in a trial of a virtual reality system. The initial viewpoint was positioned to approximate the horizontal view often seen in laparoscopic surgery, with a set angle between the optical axis of the endoscope and the task surface. A right-to-left suturing task performed with the right hand, based on a task from the endoscopic surgery training course, was selected for testing. We compared task outcomes with and without use of the new virtual reality-viewing system.

Result: There was a 0.37 mm reduction in total error with use of the proposed system, composed of a 0.1 mm reduction on the y-axis and a 0.27 mm reduction on the x-axis. Experts benefited more than novices from use of the proposed system. Most subjects worked at a pseudo-viewpoint angle of around 34°.

Discussion: Suturing performance improved with the new virtual reality endoscopic display system. Viewpoint alternation resulted in an overview that improved depth perception and allowed subjects to better aim the marker. This suggests the proposed method offers users better visualization and control in endoscopic surgery.
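The abstract does not give the implementation of the pseudo-viewpoint alternation. As a minimal geometric sketch only, the idea of tilting a virtual camera about a region of interest (ROI) while preserving the camera-ROI distance can be expressed as a rotation of the camera position around an axis through the ROI. The function name, axis choice, and distances below are illustrative assumptions, not the authors' method.

```python
import math

def pseudo_viewpoint(roi, cam, tilt_deg):
    """Return a virtual camera position rotated by tilt_deg about the
    x-axis through the ROI, preserving the camera-ROI distance.

    roi, cam : (x, y, z) positions; tilt_deg : tilt angle in degrees.
    (Hypothetical sketch; axis and conventions are assumptions.)
    """
    t = math.radians(tilt_deg)
    # Offset of the camera from the ROI.
    ox, oy, oz = (cam[i] - roi[i] for i in range(3))
    # Rotate the offset about the x-axis.
    ry = oy * math.cos(t) - oz * math.sin(t)
    rz = oy * math.sin(t) + oz * math.cos(t)
    return (roi[0] + ox, roi[1] + ry, roi[2] + rz)

roi = (0.0, 0.0, 0.0)
cam = (0.0, 0.0, 0.1)                     # camera 10 cm from the ROI (assumed)
vcam = pseudo_viewpoint(roi, cam, 34.0)   # ~34 deg tilt, as most subjects chose
```

Rendering the scene from `vcam` while keeping it aimed at the ROI would give the tilted overview described in the Discussion; the real system would obtain the scene geometry from the endoscope image or a reconstructed model, which is outside the scope of this sketch.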