The present study was designed to investigate the incorporation of tools into the human body schema. Previous research on tool use suggests that through physical interaction with a tool, the representation of the body is adjusted to incorporate or "embody" the tool. The present experiment tested the limb-specific nature of tool embodiment. Participants were presented with images of a person holding a rake and executed hand- and foot-press responses to colored targets superimposed on the hand, foot, and rake of the image. This task was completed before and after moving a ball around a course with a hand-held rake. Consistent with previous research, a body-part compatibility effect emerged: response times (RTs) were shorter when the responding limb and target location were compatible (e.g., hand responses to targets on the hand) than when they were incompatible (e.g., hand responses to targets on the foot). Of greater theoretical relevance, hand RTs to targets presented on the hand were shorter than those to targets on the rake prior to experience, but did not differ after completion of the rake task. This post-experience similarity in hand RTs emerged because RTs to targets on the rake decreased significantly following tool use. There was no significant pre-/post-experience change in hand RTs to targets on the hand or, importantly, for any response executed by the foot. These results provide new evidence that a tool is embodied in a limb-specific manner and is represented within the body schema as if it were an extension of the limb.
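The contrasts described above reduce to simple differences between mean RTs. The following sketch is purely illustrative (not the authors' analysis code, and all RT values are hypothetical placeholders): it shows how a compatibility effect and the pre-/post-experience hand-to-rake gap would be computed from condition means.

```python
# Illustrative sketch of the RT contrasts described in the abstract.
# NOTE: all RT values below are hypothetical placeholders, not real data.

def compatibility_effect(rt_incompatible_ms: float, rt_compatible_ms: float) -> float:
    """Compatibility effect = incompatible RT minus compatible RT (ms)."""
    return rt_incompatible_ms - rt_compatible_ms

# Hypothetical mean hand-response RTs (ms): targets on the hand
# (compatible) vs. targets on the foot (incompatible).
effect = compatibility_effect(rt_incompatible_ms=520.0, rt_compatible_ms=480.0)
print(effect)  # 40.0 -> a positive effect indicates a compatibility advantage

# Embodiment prediction: after tool use, hand RTs to targets on the
# rake approach hand RTs to targets on the hand.
pre_gap = 530.0 - 480.0   # hand RT to rake minus hand RT to hand, before use
post_gap = 485.0 - 482.0  # the same contrast after rake use
print(pre_gap, post_gap)  # the gap shrinks if the rake has been embodied
```

The prediction of limb-specific embodiment is that `post_gap` shrinks toward zero for hand responses only, while the corresponding foot-response contrasts remain unchanged.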
Recent work in neuroscience suggests that there is a common coding in the brain for the perception, imagination, and execution of movement. This common coding is thought to allow people to recognize their own movements when those movements are presented as abstract representations, and to coordinate better with them. We are investigating how this "own-movement effect" could be extended to improve the interaction between players and game avatars, and how it might be leveraged to augment players' cognition. To examine this question, we designed and developed a tangible puppet interface and a 3D virtual environment tailored to investigating the mapping between player and avatar movements. In two experiments, we show that when the puppet interface is used to transfer players' movements to the avatar, players are able to recognize their own movements when these are presented alongside others' movements. In both experiments, players did not observe their movements being transferred to the avatar, and recognition occurred a week after the transfer. Since the recognition effect persisted despite these two constraints, we conclude that it is a robust effect and that the puppet interface is effective in personalizing an avatar by transferring a player's own movements to the virtual character.
Aspects of spatial cognition, specifically spatial skills, are strongly correlated with interest and success in STEM courses and STEM-related professions. Because growth in STEM-related industries is expected to continue for the foreseeable future, it is important to develop evidence-based and theoretically grounded methods and interventions that can help train relevant spatial skills. In this article, we discuss research showing that aspects of spatial cognition are embodied and how these findings and theoretical developments can inform the design of tangible and embodied interfaces (TEIs). TEIs seek to bring interaction with digital content off the screen and into the physical environment. By incorporating physical movement and tangible feedback into digital systems, TEIs can leverage the relationship between the body and spatial cognition to engage, support, or improve spatial skills. We use this knowledge to define a design space for TEIs that engage spatial cognition and illustrate how TEIs designed and evaluated from a spatial cognition perspective can expand the design space in ways that contribute to the fields of cognitive science and human-computer interaction.
We have developed an embodied puppet interface that translates a player's body movements to a virtual character, enabling the player to exercise fine-grained, personalized control of the avatar. To test the efficacy and short-term effects of this control interface, we conducted a two-part experiment in which the performance of users controlling an avatar with the puppet interface was compared with that of users controlling the avatar with two other interfaces (an Xbox controller and a keyboard). Part 1 examined aiming-movement accuracy in a virtual contact game. Part 2 examined changes in users' mental rotation abilities after playing the virtual contact game. Results from Part 1 revealed that the puppet-interface group performed significantly better in aiming accuracy and response time than the Xbox and keyboard groups. Data from Part 2 revealed that the puppet group also tended to show greater improvement in mental rotation accuracy. Overall, these results suggest that the embodied mapping between player and avatar provided by the puppet interface leads to important performance advantages.