In Experiments 1 and 2, the time to locate and identify a visual target (visual search performance in a two-alternative forced-choice paradigm) was measured as a function of the location of the target relative to the subject's initial line of gaze. In Experiment 1, tests were conducted within a 260° region on the horizontal plane at a fixed elevation (eye level). In Experiment 2, the position of the target was varied in both the horizontal (260°) and the vertical (±46° from the initial line of gaze) planes. In both experiments, and for all locations tested, the time required to conduct a visual search was reduced substantially (175-1,200 msec) when a 10-Hz click train was presented from the same location as that occupied by the visual target. Significant differences in latencies were still evident when the visual target was located within 10° of the initial line of gaze (central visual field). In Experiment 3, we examined the head and eye movements that occur as subjects attempt to locate a sound source. Concurrent movements of the head and eyes are commonly encountered during auditorily directed search behavior. In over half of the trials, eyelid closures were apparent as the subjects attempted to orient themselves toward the sound source. The results from these experiments support the hypothesis that the auditory spatial channel has a significant role in regulating visual gaze.

Statement of the Problem

The auditory system in human beings has only limited spatial resolving power; the ability to discriminate the location of a sound source, for example, is seldom better than 1°-2°. Although an extensive literature exists on the topic of auditory spatial processes, little attention has been paid to evaluating the function of this system. In our search for a role for the auditory spatial system, we assumed that the function it serves must require no more than the limited resolution normally observed.
We wish to suggest the following hypothesis: The primary function of the auditory spatial system may be to provide information that allows the individual to redirect the eyes in order to bring the fovea into line with an acoustically active object. Since the fovea, which is the most powerful information-processing segment of the retina, extends over several degrees of visual angle, additional auditory spatial capacity may not have had any adaptive value. In the following section, we will attempt to present the arguments that led us to this conclusion.

Overview

In human beings, the eyes are located relatively close together at the front of the head. One cost of this arrangement is that people have available only a limited sample of the immediate environment. As noted by Gibson
Visual search performance was examined in a two-alternative, forced-choice paradigm. The task involved locating and identifying which of two visual targets was present on a trial. The location of the targets varied relative to the subject's initial fixation point from 0 to 14.8 deg. The visual targets were either presented concurrently with a sound located at the same position as the visual target or were presented in silence. Both the number of distractor visual figures (0-63) present in the field during the search (Experiments 1 and 2) and the distinctness of the visual target relative to the distractors (Experiment 2) were considered. Under all conditions, visual search latencies were reduced when spatially correlated sounds were present. Aurally guided search was particularly enhanced when the visual target was located in the peripheral regions of the central visual field and when a larger number of distractor images (63) were present. Similar results were obtained under conditions in which the target was visually enhanced. These results indicate that spatially correlated sounds may have considerable utility in high-information environments (e.g., piloting an aircraft).
In the present investigation, the effects of spatial separation on the interstimulus onset intervals (ISOIs) that produce auditory and visual apparent motion were compared. In Experiment 1, subjects were tested on auditory apparent motion. They listened to 50-msec broadband noise pulses that were presented through two speakers separated by one of six different values between 0° and 160°. On each trial, the sounds were temporally separated by 1 of 12 ISOIs from 0 to 500 msec. The subjects were instructed to categorize their perception of the sounds as "single," "simultaneous," "continuous motion," "broken motion," or "succession." They also indicated the proper temporal sequence of each sound pair. In Experiments 2 and 3, subjects were tested on visual apparent motion. Experiment 2 included a range of spatial separations from 6° to 80°; Experiment 3 included separations from 0.5° to 10°. The same ISOIs were used as in Experiment 1. When the separations were equal, the ISOIs at which auditory apparent motion was perceived were smaller than the values that produced the same experience in vision. Spatial separation affected only visual apparent motion. For separations less than 2°, the ISOIs that produced visual continuous motion were nearly equal to those which produced auditory continuous motion. For larger separations, the ISOIs that produced visual apparent motion increased.

Apparent motion is an illusion produced by the proper timing and placement of two discrete stimuli; under optimal conditions, movement of the lead stimulus toward the lag stimulus is perceived. Apparent motion was first demonstrated by Exner in 1875, but it was Wertheimer's (1912) article that initiated much interest in the phenomenon. In 1917, Burtt demonstrated that the illusion of apparent motion could also occur in the auditory and tactual modalities (Burtt, 1917a, 1917b). Most subsequent investigations of apparent motion, however, have been focused on the visual illusion.
No direct comparisons of apparent motion across modalities have been reported. Our purpose in the present investigation was to compare the illusion of apparent motion in the visual and auditory modalities. At the same time, we examined the effects of the spatial separation between the stimuli used in each modality.

Visual Apparent Motion

Soon after Wertheimer's seminal paper, Korte published what have been called the "laws" of apparent motion, which describe the relationships between the "primary" variables that affect the visual illusion (Korte, 1915). These variables include the exposure time of each stimulus, as well as the temporal and spatial separ...

This research was supported in part by grants from the National Science Foundation (BNS-8512317) and the National Institutes of Health (3506 RR0801-1452). Carol L. Manligas is now in the Department of Psychology at the University of Georgia, Athens, GA 30602. Correspondence should be addressed to Thomas Z. Strybel, Department of Psychology, California State University, Long Beach, CA 90840.
Situation awareness (SA) is the understanding required to operate a complex system in a highly dynamic environment. We evaluate theories of individual SA and the processes by which individuals maintain their understanding of a situation. We support a situated approach, which holds that individual operators make use of limited internal representations and rely extensively on interactions with external props and tools to achieve and maintain SA. We also propose a synthesis of Durso et al.'s (2007) [Durso, F., Rawson, K., and Girotto, S., 2007. Comprehension and situation awareness. In: F. Durso, et al., eds. Handbook of applied cognition. 2nd ed. Hoboken, NJ: Wiley, 163-194.] Construction-Integration model of scene analysis with Sperber and Wilson's (1995) [Sperber, D. and Wilson, D., 1995. Relevance: communication and cognition. 2nd ed. Oxford: Blackwell.] Relevance Theory of comprehension. We show that the combination of the two theories can provide a more complete account of the mechanisms and processes underlying situation assessment, or sense-making, and is consistent with the situated approach to SA.
Two experiments examined auditory spatial facilitation of visual search performance under conditions varying in auditory cue precision and visual distractor density. The auditory cue was spatially coincident with the target, was displaced from the target by 6°, or was uninformative. Distractors were manipulated globally (throughout the field) and locally (within 6.5° of the target) separately at densities of 0%, 20%, and 80%. In Experiment 1, auditory cue precision was constant and distractor densities varied within a trial block; in Experiment 2, auditory precision varied and distractor densities were constant within a trial block. Coincident auditory cues minimized local and global distractor effects in both experiments, suggesting that auditory spatial cues facilitate both target localization and identification. The effectiveness of displaced auditory cues depended on cue reliability: In some conditions, displaced cues caused higher mean search latencies than did centered cues, indicating that participants were unable to ignore inaccurate auditory stimuli. Actual or potential applications of this research include virtual audio environments and auditory displays in cockpits.
Abstract. This study compared situation awareness across three flight deck decision aiding modes. Pilots resolved air traffic conflicts using a click-and-drag software tool. In the automated aiding condition, pilots executed all resolutions generated by the automation. In the interactive condition, automation suggested a maneuver, but pilots had the choice of accepting or modifying the provided resolution. In the manual condition, pilots generated resolutions independently. A technique that combines the Situation Awareness Global Assessment Technique (SAGAT) and the Situation Present Assessment Method (SPAM) was used to assess situation awareness. Results showed that situation awareness was better in the manual and interactive conditions than in the automated condition. This finding suggests that pilots are able to maintain greater situation awareness when they are actively engaged in the conflict resolution process.
In the future, auditory directional cues may enhance situational awareness in cockpits with head-coupled displays. This benefit would depend, however, on the pilot's ability to detect the direction of moving sounds at different locations in space. The present investigation examined this ability. Auditory motion acuity was measured by the minimum audible movement angle (MAMA): the minimum angle of travel required for detection of the direction of sound movement. Five experienced listeners were instructed to indicate the direction of travel of a sound source (broadband noise at 50 dBA) that moved at a velocity of 20 deg/s. Nine azimuth positions were tested at 0 deg elevation. Five elevations were then tested at 0 deg azimuth. Finally, two azimuth positions were tested at an elevation of 80 deg. The position of the source did not significantly affect the MAMA for azimuth locations between +40 and -40 deg and elevations below 80 deg. Within this area, the MAMA ranged between 1 and 2 deg. Outside this area, the MAMA increased to 3-10 deg.
We measured situation awareness (SA) of pilots in a simulation of an approach to a large metropolitan airport (DFW), using both SAGAT and SPAM probe techniques. Both methods of SA measurement significantly predicted pilot performance on a self-spacing task. In SPAM scenarios, probe latency predicted indicated airspeed (IAS) variability; in SAGAT scenarios, probe accuracy predicted IAS variability.