Young children often show a preference for auditory input, which can overshadow visual input. The current research investigated the developmental trajectory of and factors underlying these effects with 137 infants, 132 four-year-olds, and 89 adults. This auditory preference reverses with age: infants demonstrated an auditory preference, 4-year-olds switched between auditory and visual preferences, and adults demonstrated a visual preference. Furthermore, younger participants were likely to process stimuli only in the preferred modality, thus exhibiting modality dominance, whereas adults processed stimuli in both modalities. Finally, younger participants ably processed stimuli presented to the nonpreferred modality when those stimuli were presented in isolation, indicating that auditory and visual stimuli may compete for attention early in development. Underlying factors and broader implications of these findings are discussed.
Although it is well documented that language plays an important role in cognitive development, there are different views concerning the mechanisms underlying these effects. Some argue that even early in development, the effects of words stem from top-down knowledge, whereas others argue that these effects stem from auditory input affecting the attention allocated to visual input. Previous research (e.g., Robinson & Sloutsky, 2004a) demonstrated that non-speech sounds attenuate processing of corresponding visual input at 8, 12, and 16 months of age, whereas the current study demonstrates that words attenuate visual processing at 10 months but not at 16 months (Experiment 1). Furthermore, prefamiliarization with non-speech sounds (Experiment 2) enabled 16-month-olds to ably process visual input. These findings suggest that some effects of labels found early in development may stem from familiarity with human speech. The possibility that general-auditory factors underlie the effects of words on cognitive development is discussed.
The ability to process simultaneously presented auditory and visual information is a necessary component of many cognitive tasks. Although this ability is often taken for granted, there is evidence that under many conditions auditory input attenuates processing of corresponding visual input. The current study investigated infants' processing of visual input under unimodal and cross-modal conditions. Results of the three reported experiments indicate that the effect of auditory input on visual processing depended on the familiarity of that input: unfamiliar auditory input slowed down visual processing, whereas more familiar auditory input did not. These results elucidate mechanisms underlying auditory overshadowing in the course of cross-modal processing and have implications for a variety of cognitive tasks that depend on cross-modal processing.
Although it is generally accepted that labels facilitate categorization in infancy, recent evidence suggests that infants and young children are more likely to process visual input when it is presented in isolation than when it is paired with nonlinguistic sounds or linguistic labels. These findings suggest that auditory input (when compared to a no-auditory baseline) may hinder rather than facilitate categorization. This study assessed 8-month-olds' (n = 191) and 12-month-olds' (n = 81) abilities to form categories when images were paired with nonlinguistic sounds, paired with linguistic labels, or presented in isolation. Overall, infants accumulated more looking when visual stimuli were accompanied by sounds or labels; however, infants were more likely to categorize when the visual images were presented without an auditory stimulus.

The ability to form categories by treating discriminable stimuli as members of an equivalence class is an important component of human cognition (see Murphy, 2002, for a review). This ability appears early in development, with very young infants ably forming basic-level as well as superordinate or global categories.