The roles of visual and haptic experience in different aspects of haptic processing of objects in peripersonal space are examined. In three trials, early-blind, late-blind, and blindfolded-sighted individuals had to match ten shapes haptically to the cut-outs in a board as fast as possible. Both blind groups were much faster than the sighted in all three trials. All three groups improved considerably from trial to trial. In particular, the sighted group showed a strong improvement from the first to the second trial. While superiority of the blind remained for speeded matching after rotation of the stimulus frame, coordinate positional-memory scores in a non-speeded free-recall trial showed no significant differences between the groups. Moreover, when assessed with a verbal response, categorical spatial-memory appeared strongest in the late-blind group. The role of haptic and visual experience thus appears to depend on the task aspect tested.
Systematic deviations occur when blindfolded subjects set a test bar parallel to a reference bar in the horizontal plane using haptic information (Kappers and Koenderink 1999, Perception 28:781-795; Kappers 1999, Perception 28:1001-1012). These deviations are assumed to reflect the use of a combination of a biasing egocentric reference frame and an allocentric, more cognitive one (Kappers 2002, Acta Psychol 109:25-40). In two experiments, we have examined the effect of delay between the perception of a reference bar and the parallel setting of a test bar. In both experiments a 10-s delay improved performance. The improvement increased with a larger horizontal (left-right) distance between the bars. This improvement was interpreted as a shift from the egocentric towards the allocentric reference frame during the delay period.
It has been argued that representations of peripersonal space based on haptic input are systematically distorted by egocentric reference frames. Interestingly, a recent study has shown that noninformative vision (i.e., freely viewing the region above the haptic workspace) improves performance on the so-called haptic parallel-setting task, in which participants are instructed to rotate a test bar until it is parallel to a reference bar. In the present study, we made a start at identifying the different sensory integration mechanisms involved in haptic space perception by distinguishing the possible effects of orienting mechanisms from those of noninformative vision. We found that both the orienting direction of head and eyes and the availability of noninformative vision affect parallel-setting performance, and that they do so independently: orienting towards a reference bar facilitated the parallel setting of a test bar in both no-vision and noninformative-vision conditions, and noninformative vision improved performance irrespective of orienting direction. These results suggest that the effects of orienting and noninformative vision on haptic space perception depend on distinct neurocognitive mechanisms, likely expressed in different modulations of neural activation in the multimodal parietofrontal network thought to be concerned with multimodal representations of peripersonal space.
Our haptic sense provides us with essential information about the spatial layout of peripersonal space, that is, the size, shape, position, and orientation of things within reach. Remarkably, haptic perception of basic spatial properties such as line length (Lanca & Bryant, 1995; Marks & Armstrong, 1996), path length (Lederman, Klatzky, & Barber, 1985), and orientation (see, e.g., Appelle & Countryman, 1986; Gentaz & Hatwell, 1998, 1999; Kappers, 1999; Zuidhoek, Visser, Bredero, & Postma, 2004) is susceptible to marked distortions, at least in blindfolded sighted individuals. Interestingly, the distortions in haptic perception of orientation appear to be both systematic over multiple locations and consistent over different tasks, for example, the setting of a test bar either parallel or collinear with a reference bar, or the pointing of a bar toward a marker (Kappers, 1999, 2002; Kappers & Koenderink, 1999). In the present study, we set out to investigate the nature of these systematic errors further by comparing blind participants to sighted controls. Kappers (2003) argued that setting two bars parallel to each other in a plane on the basis of touch alone typically recruits a mixture of egocentric (viz., hand-centered) and allocentric reference frames. The effect of the former is indicated by large, systematic errors, that is, clockwise with test bars to the right and counterclockwise with test bars to the left of the reference bar (see, e.g., Kappers, 1999, 2003; Kappers & Koenderink, 1999; Zuidhoek, Kappers, van der Lubbe, & Postma, 2003; Zuidhoek et al., 2004). An egocentric reference frame presupposes coding of the locations and spatial orientations of items in the outside world with respect to a part of one's body, such as the hands, the midsagittal plane (Zuidhoek, Kappers, & Postma, 2005), the body as a whole (Heller, Calcaterra, Green, & Barnette, 1999), or a subjective gravitational frame (Luyat, Gentaz, Corte, & Guerraz, 2001).
In an allocentric reference frame, spatial information is encoded with respect to external landmarks in the outside world, such as when taking into account the available surrounding background information (Millar, 1988, 1994; Thinus-Blanc & Gaunet, 1997). The first phase of haptic processing might almost obligatorily involve the employment of an ego- or body-centered reference frame.

Early-blind, late-blind, and blindfolded sighted participants were presented with two haptic allocentric spatial tasks: a parallel-setting task, in an immediate and a 10-sec delay condition, and a task in which the orientation of a single bar was judged verbally. With respect to deviation size, the data suggest that mental visual processing filled a beneficial role in both tasks. In the parallel-setting task, the early blind performed more variably and showed no improvement with delay, whereas the late blind did improve, but less than the sighted did. In the verbal judgment task, both early- and late-blind participants displayed larger deviations than the sighted.