Human movement has been studied for decades, and dynamic laws of motion common to all humans have been derived. Yet every individual moves differently from everyone else (faster or slower, with harder or smoother motion, etc.). We propose an index of this variability, an individual motor signature (IMS), able to capture the subtle differences in the way each of us moves. We show that a person's IMS is time-invariant and that it differs significantly from those of other individuals. This allows us to quantify dynamic similarity, a measure of the rapport between the movement dynamics of different individuals, and to demonstrate that it facilitates coordination during interaction. We use our measure to confirm a key prediction of the theory of similarity: that coordination between two individuals performing a joint-action task is higher if their motions share similar dynamic features. Furthermore, we use a virtual avatar, driven by an interactive cognitive architecture based on feedback control theory, to explore the effects of different kinematic features of the avatar's motion on coordination with human players.
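As a schematic illustration only (the study's actual IMS construction and similarity measure are not reproduced here), a dynamic-similarity score between two motions can be sketched as a distance between their empirical velocity distributions; the function name and the quantile-matching approach below are illustrative assumptions:

```python
import numpy as np

def dynamic_similarity(vel_a, vel_b):
    """Distance between the empirical velocity distributions of two
    motions (1-D earth mover's distance via matched quantiles).
    Lower values indicate more similar movement dynamics."""
    qs = np.linspace(0.0, 1.0, 101)
    qa = np.quantile(np.asarray(vel_a, dtype=float), qs)
    qb = np.quantile(np.asarray(vel_b, dtype=float), qs)
    return float(np.mean(np.abs(qa - qb)))
```

Comparing distributions rather than raw trajectories makes the score insensitive to when a movement occurs, retaining only how it is performed.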
An important open problem in the study of human behaviour is to understand how coordination emerges in human ensembles. This problem has seldom been studied quantitatively in the existing literature, in contrast to situations involving dyadic interaction. Here we study motor coordination (or synchronisation) in a group of individuals in which participants are asked to visually coordinate an oscillatory hand motion. We separately tested two groups of seven participants. We observed that the coordination level of the ensemble depends on group homogeneity, as well as on the pattern of visual couplings (who looked at whom). Despite the complexity of social interactions, we show that networks of coupled heterogeneous oscillators with different structures capture the group dynamics well. Our findings are relevant to any activity requiring the coordination of several people, as in music, sport or at work, and can be extended to account for other perceptual forms of interaction such as sound or touch.
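The coupled-oscillator picture can be sketched with a minimal Kuramoto-type simulation, in which an adjacency matrix encodes who looks at whom and heterogeneity enters through the natural frequencies. This is an illustrative toy model with assumed parameter values, not the model fitted in the study:

```python
import numpy as np

def simulate_group(adjacency, omega, K=1.5, dt=0.01, steps=5000, seed=0):
    """Euler simulation of a Kuramoto network:
        theta_i' = omega_i + (K / d_i) * sum_j A_ij * sin(theta_j - theta_i)
    Returns the order parameter r in [0, 1] (1 = full synchronisation)."""
    rng = np.random.default_rng(seed)
    A = np.asarray(adjacency, dtype=float)
    omega = np.asarray(omega, dtype=float)
    theta = rng.uniform(0.0, 2.0 * np.pi, len(omega))
    deg = np.maximum(A.sum(axis=1), 1.0)  # avoid division by zero
    for _ in range(steps):
        diff = np.sin(theta[None, :] - theta[:, None])  # entry (i, j) = sin(theta_j - theta_i)
        theta = theta + dt * (omega + K * (A * diff).sum(axis=1) / deg)
    return abs(np.exp(1j * theta).mean())
```

Changing the adjacency matrix (all-to-all, chain, star) lets one compare how the visual-coupling pattern shapes the achievable level of synchronisation.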
Joint improvisation is often observed among humans performing joint-action tasks. Exploring the cognitive and neural mechanisms behind the emergence of joint improvisation is an open research challenge. This paper investigates jointly improvised movements between two participants in the mirror game, a paradigmatic example of a joint task. First, experiments involving movement coordination in different dyads of human players are performed in order to build a human benchmark; no designation of leader and follower is given beforehand. We find that joint improvisation is characterized by the lack of a leader and by high levels of movement synchronization. A theoretical model is then proposed to capture features of this interaction, and a set of experiments is carried out to test and validate the model's ability to reproduce the experimental observations. Furthermore, the model is used to drive a computer avatar able to successfully improvise joint motion with a human participant in real time. Finally, a convergence analysis of the proposed model confirms its ability to reproduce joint movements between the participants.
We present novel, low-cost and non-invasive candidate diagnostic biomarkers of schizophrenia. They are based on the ‘mirror game’, a coordination task in which two partners are asked to mimic each other’s hand movements. In particular, we use the patient’s solo movement, recorded in the absence of a partner, and motion recorded during interaction with an artificial agent, either a computer avatar or a humanoid robot. To discriminate between patients and controls, we employ statistical learning techniques applied to nonverbal-synchrony and neuromotor features derived from the participants’ movement data. The proposed classifier achieves 93% accuracy and 100% specificity. Our results provide evidence that statistical learning techniques, nonverbal movement coordination and neuromotor characteristics could form the foundation of decision-support tools aiding clinicians in cases of diagnostic uncertainty.
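Purely as a schematic of the classification step (on synthetic data; the study's actual features, learner and reported performance figures are not reproduced here), a minimal classifier over movement-derived feature vectors might look like:

```python
import numpy as np

def nearest_centroid_fit(X, y):
    """Toy nearest-centroid classifier: one mean feature vector per class.
    X: (n_samples, n_features) movement-derived features; y: class labels."""
    classes = np.unique(y)
    centroids = np.array([X[y == c].mean(axis=0) for c in classes])
    return classes, centroids

def nearest_centroid_predict(model, X):
    """Assign each sample to the class with the closest centroid."""
    classes, centroids = model
    d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    return classes[np.argmin(d, axis=1)]
```

In practice such a pipeline would be evaluated with held-out data or cross-validation, which is how accuracy and specificity figures like those quoted above are obtained.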