Determining the handedness of visually presented stimuli is thought to involve two separate stages: a rapid, implicit recognition of laterality followed by a confirmatory mental rotation of the matching hand. In two studies, we explore the role of the dominant and non-dominant hands in this process. In Experiment 1, participants judged stimulus laterality with either their left or right hand held behind their back or with both hands resting in the lap. The variation in reaction times across these conditions reveals that both hands play a role in hand laterality judgments, with the hand that is not involved in the mental rotation stage causing some interference, slowing mental rotations but making them more accurate. While this interference occurs for both lateralities in right-handed people, it occurs for the dominant hand only in left-handers. This is likely due to left-handers' greater reliance on the initial, visual recognition stage than on the later, mental rotation stage, particularly when judging hands of the non-dominant laterality. Participants' own judgments of whether the stimuli were 'self' or 'other' hands in Experiment 2 suggest a difference in strategy for hands seen from egocentric and allocentric perspectives: a combined visuo-sensorimotor strategy for the former and a visual-only strategy for the latter. This result is discussed with reference to recent brain imaging research showing that the extrastriate body area distinguishes between bodies and body parts in egocentric and allocentric perspectives.
In the hand laterality task, participants judge the handedness of visually presented stimuli (images of hands shown in a variety of postures and views) and indicate whether they perceive a right or a left hand. The task engages kinaesthetic and sensorimotor processes and is considered a standard example of motor imagery. However, in this study we find that while motor imagery holds across egocentric views of the stimuli (where the hands are likely to be one's own), it does not appear to hold across allocentric views (where the hands are likely to be another person's). First, we find that psychophysical sensitivity, d', is clearly demarcated between egocentric and allocentric views, being high for the former and low for the latter. Second, using mixed-effects methods to analyse the chronometric data, we find high positive correlations between response times across egocentric views, suggesting a common use of motor imagery across these views. Correlations between egocentric and allocentric views are, however, considerably lower, suggesting a switch away from motor imagery between these perspectives. We relate these findings to research showing that the extrastriate body area discriminates egocentric ('self') and allocentric ('other') views of the human body and of body parts, including hands.
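For reference, d' here can be read as the standard signal detection measure of sensitivity; the following is a minimal gloss in conventional notation, not the authors' own formulation. Treating, say, 'right' responses to right-hand stimuli as hits (with rate H) and 'right' responses to left-hand stimuli as false alarms (with rate F),

d' = \Phi^{-1}(H) - \Phi^{-1}(F)

where \Phi^{-1} is the inverse cumulative distribution function of the standard normal. A larger d' indicates that left and right hands are more reliably discriminated, so the low d' for allocentric views reflects reduced discriminability rather than merely slower responding.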
Previous studies have found that perception in older people benefits from multisensory over unisensory information. As normal speech recognition is affected by both the auditory input and the visual lip movements of the speaker, we investigated the efficiency of audio–visual integration in an older population by manipulating the relative reliability of the auditory and visual information in speech. We also investigated the role of the semantic context of the sentence, to assess whether audio–visual integration is affected by top-down semantic processing. We presented participants with audio–visual sentences in which the visual component was either blurred or not blurred. We found a greater cost in recall performance for semantically meaningless speech in the audio–visual 'blur' condition compared to the audio–visual 'no blur' condition, and this effect was specific to the older group. Our findings have implications for understanding how aging affects efficient multisensory integration in speech perception and suggest that multisensory inputs may benefit speech perception in older adults when the semantic content of the speech is unpredictable.
When interpreting other people's movements or actions, observers may rely not only on the visual cues available in the observed movement, but may also be able to "put themselves in the other person's shoes" by engaging brain systems involved in both "mentalizing" and motor simulation. The ageing process brings changes in both perceptual and motor abilities, yet little is known about how these changes may affect the ability to accurately interpret other people's actions. Here we investigated the effect of ageing on the ability to discriminate the weight of objects based on the movements of actors lifting those objects. Stimuli consisted of videos of an actor lifting a small box weighing 0.05–0.9 kg or a large box weighing 3–18 kg. In a four-alternative forced-choice task, younger and older participants reported the perceived weight of the box in each video. Overall, older participants were less sensitive than younger participants in discriminating the weight of the lifted boxes, an effect that was especially pronounced in the small box condition. Weight discrimination was better for the large box than for the small box in both groups, owing to the greater saliency of the visual cues in that condition. These results suggest that older adults may require more salient visual cues to interpret the actions of others accurately. We discuss the potential contribution of age-related changes in visual and motor function to the observed effects and suggest that older adults' declining sensitivity to subtle visual cues may lead to greater reliance on visual analysis of the observed scene and its semantic context.