Movement coordination depends on directing our limbs to the right place at the right time. Movement science can study this central requirement with the Fitts task, which asks participants to touch each of two targets in alternation, as accurately and as fast as they can. The Fitts task is an experimental attempt to focus on how the movement system balances its attention to speed and to accuracy. This balance in the Fitts task exhibits a hierarchical organization according to which finer details (e.g., kinematics of single sweeps from one target to the other) change with relatively broader constraints of task parameters (e.g., distance between targets and width of targets). The present work tests the hypothesis that this hierarchical organization of movement coordination reflects a multifractal tensegrity in which non-linear interactions across scales support stability. We collected movement series data during an easy variant of the Fitts task and applied multifractal analysis with surrogate comparison to allow a clearer test of non-linear interactions across scales. Furthermore, we tested the role of visual feedback both in potential and in fact, i.e., by manipulating both whether experimenters instructed participants that they might have to close their eyes during the task and whether participants actually closed their eyes halfway through the task. We predict that (1) non-linear interactions across scales in hand movement series will produce variability that stabilizes aiming in the Fitts task, reducing the standard deviation of target contacts; (2) non-linear interactions across scales in head sway will stabilize aiming following actual eye closure; and (3) non-linear interactions across scales in head sway and in hand movements will interact to support the stabilizing effects of expectations about eye closure.
In sum, this work attempts to make the case that multifractal tensegrity supports more accurate aiming behavior in the Fitts task.
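The "multifractal analysis with surrogate comparison" mentioned above can be illustrated with a minimal sketch: estimate the spread of generalized Hurst exponents with a bare-bones multifractal detrended fluctuation analysis (MFDFA), then compare against an IAAFT surrogate, which preserves a series' amplitude spectrum and value distribution while destroying non-linear interactions across scales. All function names, parameter choices (scales, q values, a binomial cascade as test signal), and the linear-detrending variant are illustrative assumptions, not the authors' actual pipeline.

```python
import numpy as np

def mfdfa_width(x, qs=(-3, -1, 1, 3), scales=(16, 32, 64, 128, 256)):
    """Spread of generalized Hurst exponents h(q): a simple index of
    multifractality (bare-bones MFDFA sketch with linear detrending)."""
    y = np.cumsum(x - np.mean(x))                # integrated profile
    hs = []
    for q in qs:                                 # q = 0 omitted for simplicity
        logF = []
        for s in scales:
            n = len(y) // s
            F2 = []
            for i in range(n):                   # local linear detrend
                seg = y[i * s:(i + 1) * s]
                t = np.arange(s)
                resid = seg - np.polyval(np.polyfit(t, seg, 1), t)
                F2.append(np.mean(resid ** 2))
            Fq = np.mean(np.array(F2) ** (q / 2)) ** (1 / q)
            logF.append(np.log(Fq))
        hs.append(np.polyfit(np.log(scales), logF, 1)[0])  # slope = h(q)
    return max(hs) - min(hs)

def iaaft(x, n_iter=100, seed=0):
    """IAAFT surrogate: keeps the amplitude spectrum and value
    distribution, destroys non-linear interactions across scales."""
    rng = np.random.default_rng(seed)
    amp, sorted_x = np.abs(np.fft.rfft(x)), np.sort(x)
    s = rng.permutation(x)
    for _ in range(n_iter):
        phase = np.angle(np.fft.rfft(s))
        s = np.fft.irfft(amp * np.exp(1j * phase), n=len(x))
        s = sorted_x[np.argsort(np.argsort(s))]  # restore distribution
    return s

# Binomial cascade: a textbook multifractal test series (12 generations)
w = np.array([1.0])
for _ in range(12):
    w = np.concatenate([w * 0.7, w * 0.3])

width_orig = mfdfa_width(w)
width_surr = mfdfa_width(iaaft(w))
```

In this framework, a spectrum width for the original series that exceeds the widths of an ensemble of surrogates is taken as evidence that the multifractality reflects non-linear interactions across scales rather than linear correlations or a heavy-tailed distribution alone.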
Research into haptic perception typically concentrates on mechanoreceptors and their supporting neuronal processes. This focus risks ignoring crucial aspects of active perception. For instance, bodily movements influence the information available to mechanoreceptors, entailing that movement facilitates haptic perception. Effortful manual wielding of an object prompts feedback loops at multiple spatio-temporal scales, rippling outwards from the wielding hand to the feet, maintaining an upright posture and interweaving to produce a nonlinear web of fluctuations throughout the body. Here, we investigated whether and how this bodywide nonlinearity engenders a flow of multifractal fluctuations that could support perception of object properties via dynamic touch. Blindfolded participants manually wielded weighted dowels and reported judgements of heaviness and length. Mechanical fluctuations on the anatomical sleeves (i.e., peripheries of the body), from hand to the upper body, as well as to the postural centre of pressure, showed evidence of multifractality arising from nonlinear temporal correlations across scales. The modelling of impulse–response functions obtained from vector autoregressive analysis revealed that distinct sets of pairwise exchanges of multifractal fluctuations predicted the accuracy of heaviness and length judgements. These results suggest that the accuracy of perception via dynamic touch hinges on specific flowing patterns of multifractal fluctuations that people wear on their anatomical sleeves.
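The vector autoregressive (VAR) modelling with impulse–response functions described above can be sketched in miniature: fit a first-order VAR to multichannel fluctuation series by least squares, then read pairwise exchanges off the impulse responses, which trace how a unit shock in one channel ripples through the others at later lags. This is a minimal sketch under assumed simplifications (VAR(1), non-orthogonalized responses, two simulated channels with invented dynamics); the channel labels and coefficient matrix are illustrative stand-ins, not the study's data or lag order.

```python
import numpy as np

def fit_var1(X):
    """Least-squares VAR(1) on a (T, k) array of channels:
    x[t] = A @ x[t-1] + e[t], after centering each channel."""
    Xc = X - X.mean(axis=0)
    B, *_ = np.linalg.lstsq(Xc[:-1], Xc[1:], rcond=None)
    return B.T                                   # A, shape (k, k)

def impulse_responses(A, horizons=10):
    """IRF at horizon h is A**h: the ripple of a unit shock in one
    channel through every channel h steps later."""
    return np.stack([np.linalg.matrix_power(A, h)
                     for h in range(horizons + 1)])

# Simulate two coupled fluctuation channels with known dynamics
# (think hand and upper body -- purely illustrative labels)
rng = np.random.default_rng(1)
A_true = np.array([[0.5, 0.2],
                   [0.1, 0.4]])
x, rows = np.zeros(2), []
for _ in range(5000):
    x = A_true @ x + 0.1 * rng.standard_normal(2)
    rows.append(x)
data = np.asarray(rows)

A_hat = fit_var1(data)
irfs = impulse_responses(A_hat, horizons=10)     # shape (11, 2, 2)
```

The off-diagonal entries of `irfs` at each horizon quantify the directed exchange between channels; regressing judgement accuracy on such exchange terms is the kind of analysis the abstract describes.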
A long history of research has pointed to the importance of fractal fluctuations in physiology, but so far, the physiological evidence of fractal fluctuations has been piecemeal and without clues to bodywide integration. What remains unknown is how fractal fluctuations might interact across the body and how those interactions might support the coordination of goal-directed behaviors. We demonstrate that a complex interplay of fractality in mechanical fluctuations across the body supports a more accurate perception of heaviness and length of occluded handheld objects via effortful touch in blindfolded individuals. For a given participant, the flow of fractal fluctuations through the body indexes the flow of perceptual information used to derive perceptual judgments. These patterns in the waxing and waning of fluctuations across disparate anatomical locations provide novel insights into how the high-dimensional flux of mechanotransduction is compressed into low-dimensional perceptual information specifying properties of hefted occluded objects.