A necessary condition for visually guided action is that an organism perceive what actions are afforded by a given environmental situation. Warren (1984) proposed that an affordance such as the climbability of a stairway is determined by the fit between properties of the environment and the organism and can be characterized by optimal points, where action is most comfortable or efficient, and critical points, where a phase transition to a new action occurs. Perceiving an affordance, then, implies perceiving the relation between the environment and the observer's own action system. The present study is an extension of this analysis to the visual guidance of walking through apertures. We videotaped large and small subjects walking through apertures of different widths to determine empirically the critical aperture-to-shoulder-width ratio (A/S) marking the transition from frontal walking to body rotation. These results were compared with perceptual judgments of "passability" under static and moving viewing conditions. Finally, we tested the hypothesis that such judgments are based on intrinsic or body-scaled information specifying aperture width as a ratio of the observer's eyeheight. We conclude (a) that the critical point in free walking occurs at A/S = 1.30, (b) that static monocular information is sufficient for judging passability, and (c) that the perception of passability under such conditions is based on body-scaled eyeheight information.
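The body-scaled criterion in conclusion (a) can be sketched as a simple predicate. The 1.30 ratio is the empirical value reported above; treating it as a sharp threshold is an illustrative simplification (the transition is a region, not a step):

```python
CRITICAL_RATIO = 1.30  # empirical A/S at the transition from frontal walking to rotation

def passable_frontally(aperture_width: float, shoulder_width: float) -> bool:
    """Predict whether a walker can pass an aperture without rotating the body."""
    return aperture_width / shoulder_width >= CRITICAL_RATIO

# A 45 cm aperture for 40 cm shoulders gives A/S = 1.125 < 1.30,
# so body rotation is predicted.
print(passable_frontally(0.45, 0.40))  # False
```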
How do animals visually guide their activities in a cluttered environment? Gibson (1979) proposed that they perceive what environmental objects offer or afford for action. An analysis of affordances in terms of the dynamics of an animal-environment system is presented. Critical points, corresponding to phase transitions in behavior, and optimal points, corresponding to stable, preferred regions of minimum energy expenditure, emerge from variation in the animal-environment fit. It is hypothesized that these points are constants across physically similar systems and that they provide a natural basis for perceptual categories and preferences. In three experiments these hypotheses are examined for the activity of human stair climbing, by varying riser height with respect to leg length. The perceptual category boundary between "climbable" and "unclimbable" stairs is predicted by a biomechanical model, and visually preferred riser height is predicted from measurements of minimum energy expenditure during climbing. It is concluded that perception for the control of action reflects the underlying dynamics of the animal-environment system.
How might one account for the organization in behavior without attributing it to an internal control structure? The present article develops a theoretical framework called behavioral dynamics that integrates an information-based approach to perception with a dynamical systems approach to action. For a given task, the agent and its environment are treated as a pair of dynamical systems that are coupled mechanically and informationally. Their interactions give rise to the behavioral dynamics, a vector field with attractors that correspond to stable task solutions, repellers that correspond to avoided states, and bifurcations that correspond to behavioral transitions. The framework is used to develop theories of several tasks in which a human agent interacts with the physical environment, including bouncing a ball on a racquet, balancing an object, braking a vehicle, and guiding locomotion. Stable, adaptive behavior emerges from the dynamics of the interaction between a structured environment and an agent with simple control laws, under physical and informational constraints.

Keywords: perception and action, perceptual-motor control, dynamical systems, self-organization, locomotion

The organization of behavior has been a central concern of psychology for well over a century. How is it that humans and other animals can generate behavioral patterns that are tightly coordinated with the environment, in the service of achieving a specific goal? This ability to produce stable yet adaptive behavior raises two constituent issues. On the one hand, it implicates the coordination of action, such that the many neuromusculoskeletal components of the body become temporarily organized into an ordered pattern of movement. On the other, it implicates perception, such that information about the world and the body enables appropriate actions to be selected and adapted to environmental conditions.
At a basic level, the problem of the organization of behavior is thus synonymous with the problem of perception and action. Moreover, an adequate theory of perceptually controlled action would provide a platform for understanding more "cognitive" behavior such as extended action sequences, anticipatory behavior oriented to remote goals, or predictive behavior that takes account of hidden environmental properties.

It seems natural to presume that observed organization in behavior implies ipso facto the existence of a centralized controller: a pattern generator, action plan, or internal model that is responsible for its organization and regulation. Such an assumption has been commonplace in psychology, cognitive science, neuroscience, and robotics. In each domain, organization in behavior has been attributed to prior organization in the structure of the nervous system (the neuroreductionist view), the structure of internal representations (the cognitivist view), or the contingencies presented by the environment (the behaviorist view). This is unsatisfying because it merely displaces the original problem of behavioral organization to a preexisting structure.
How is human locomotion visually controlled? Fifty years ago, it was proposed that we steer to a goal using optic flow, the pattern of motion at the eye that specifies the direction of locomotion. However, we might also simply walk in the perceived direction of a goal. These two hypotheses normally predict the same behavior, but we tested them in an immersive virtual environment by displacing the optic flow from the direction of walking, violating the laws of optics. We found that people walked in the visual direction of a lone target, but increasingly relied on optic flow as it was added to the display. The visual control law for steering toward a goal is a linear combination of these two variables weighted by the magnitude of flow, thereby allowing humans to have robust locomotor control under varying environmental conditions.
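The "linear combination" described above can be illustrated with a toy heading-update rule. The weighting function and gain below are hypothetical placeholders for illustration, not the fitted model from the study:

```python
def steering_command(goal_direction: float, flow_direction: float,
                     flow_magnitude: float, k: float = 1.0) -> float:
    """Toy blend of the perceived visual direction of the goal and the
    optic-flow-specified heading (angles in degrees), with the flow term
    weighted by flow magnitude: more flow, more reliance on flow.
    The saturating weight w = m / (1 + m) is a hypothetical choice."""
    w = flow_magnitude / (1.0 + flow_magnitude)
    target = (1 - w) * goal_direction + w * flow_direction
    return k * target  # turning command proportional to the blended direction

# With no optic flow, the walker simply heads in the visual direction of the goal:
print(steering_command(goal_direction=10.0, flow_direction=0.0, flow_magnitude=0.0))
```

The saturating weight captures the abstract's key finding qualitatively: as flow is added to the display, behavior shifts smoothly from pure visual-direction steering toward flow-based steering.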
The authors investigated the dynamics of steering and obstacle avoidance, with the aim of predicting routes through complex scenes. Participants walked in a virtual environment toward a goal (Experiment 1) and around an obstacle (Experiment 2) whose initial angle and distance varied. Goals and obstacles behave as attractors and repellers of heading, respectively, whose strengths depend on distance. The observed behavior was modeled as a dynamical system in which angular acceleration is a function of goal and obstacle angle and distance. By linearly combining terms for goals and obstacles, one could predict whether participants adopt a route to the left or right of an obstacle to reach a goal (Experiment 3). Route selection may emerge from on-line steering dynamics, making explicit path planning unnecessary.
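A minimal numerical sketch of the goal-as-attractor, obstacle-as-repeller dynamics described above: the functional forms follow the published model family (damped angular acceleration, a goal term whose strength decays with distance, an obstacle term that decays with both angular offset and distance), but the parameter values here are illustrative defaults, not the fitted ones:

```python
import math

def angular_accel(phi: float, dphi: float,
                  goal_angle: float, goal_dist: float,
                  obst_angle: float, obst_dist: float,
                  b: float = 3.25, k_g: float = 7.5, c1: float = 0.4, c2: float = 0.4,
                  k_o: float = 198.0, c3: float = 6.5, c4: float = 0.8) -> float:
    """Heading acceleration (rad/s^2) for heading phi and turning rate dphi.
    The goal acts as an attractor whose pull weakens with distance; the
    obstacle acts as a repeller that fades with angular offset and distance.
    Parameter values are illustrative, not fitted to the experiments."""
    goal_term = -k_g * (phi - goal_angle) * (math.exp(-c1 * goal_dist) + c2)
    obst_term = (k_o * (phi - obst_angle)
                 * math.exp(-c3 * abs(phi - obst_angle))
                 * math.exp(-c4 * obst_dist))
    return -b * dphi + goal_term + obst_term
```

Integrating this equation forward in time (e.g., with a simple Euler step) produces a heading trajectory, and summing one repeller term per obstacle yields the route-selection behavior the abstract describes without any explicit path plan.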
Radial patterns of optical flow produced by observer translation could be used to perceive the direction of self-movement during locomotion, and a number of formal analyses of such patterns have recently appeared. However, there is comparatively little empirical research on the perception of heading from optical flow, and what data there are indicate surprisingly poor performance, with heading errors on the order of 5-10 degrees. We examined heading judgments during translation parallel, perpendicular, and at oblique angles to a random-dot plane, varying observer speed and dot density. Using a discrimination task, we found that heading accuracy improved by an order of magnitude, with 75%-correct thresholds of 0.66 degrees in the highest speed and density condition and 1.2 degrees generally. Performance remained high with displays of 63 down to 10 dots, but it dropped significantly with only 2 dots; there was no consistent speed effect and no effect of angle of approach to the surface. The results are inconsistent with theories based on the local focus of outflow, local motion parallax, multiple fixations, differential motion parallax, and the local maximum of divergence. But they are consistent with Gibson's (1950) original global radial outflow hypothesis for perception of heading during translation.
Why do humans switch from walking to running at a particular speed? It is proposed that gait transitions behave like nonequilibrium phase transitions between attractors. Experiment 1 examined walking and running on a treadmill while speed was varied. The transition occurred at the equal-energy separatrix between gaits, with predicted shifts in stride length and frequency, a qualitative reorganization in the relative phasing of segments within a leg, a sudden jump in relative phase, enhanced fluctuations in relative phase, and hysteresis. Experiment 2 dissociated speed, frequency, and stride length to show that the transition occurred at a constant speed near the energy separatrix. Results are consistent with a dynamic theory of locomotion in which preferred gaits are characterized by stable phase relationships and minimum energy expenditure, and gait transitions by a loss of stability and the reduction of energetic costs.

Motor behavior in humans and animals exhibits two notable features: the presence of stable patterns of coordination and the sudden reorganization that occurs when switching between them. Much research has been directed at describing individual motor patterns such as walking and reaching, but the study of behavioral transitions may reveal principles of the formation of coordinative patterns. Locomotion offers a model system for the study of both, for it is a fundamental, fluent, and complex behavior that is likely to share basic characteristics with other skilled actions. In this article, we examine the shift between walking and running in humans and offer a qualitative dynamic theory of gait transitions. As speed increases, humans and other animals shift from a walking gait to a running gait at a characteristic speed. Why does this occur?
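The equal-energy separatrix idea can be sketched with two toy metabolic cost curves. The quadratic shapes and all numbers below are hypothetical stand-ins, not measured data; the point is only that the predicted transition speed is where the curves cross:

```python
def walk_cost(v: float) -> float:
    """Toy metabolic cost of walking at speed v (m/s): cheap near its
    preferred speed, rising steeply at high speed. Hypothetical curve."""
    return 2.0 + 1.5 * (v - 1.3) ** 2

def run_cost(v: float) -> float:
    """Toy metabolic cost of running: flatter, cheaper only at high speed."""
    return 3.2 + 0.3 * (v - 3.0) ** 2

def preferred_gait(v: float) -> str:
    """Pick the cheaper gait; the crossover speed plays the role of the
    equal-energy separatrix between the walking and running attractors."""
    return "walk" if walk_cost(v) < run_cost(v) else "run"
```

A dynamical account adds what this energy comparison alone cannot: near the crossover, the current gait loses stability, producing the enhanced fluctuations and hysteresis reported in Experiment 1.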
A common view is that each gait is orchestrated by a central motor plan, such as a motor program or spinal pattern generator, and that gait transitions simply involve switching between plans (e.g., Shapiro, Zernicke, Gregor, & Diestel, 1981). This view does not offer predictions about the details of behavior at gait transitions. By contrast, we propose that gait transitions are a consequence of the intrinsic dynamics of a complex system, with properties characteristic of bifurcations between attractors. We show that the walk-run (W-R) transition exhibits features of a nonequilibrium phase transition and that it occurs at a speed that tends to reduce energetic costs.