Both touch and speech independently have been shown to play an important role in infant development. However, little is known about how they may be combined in the input to the child. We examined the combined use of touch and speech by having mothers read their 5-month-olds books about body parts and animals. Results suggest that speech+touch multimodal events are characterized by more exaggerated touch and speech cues, that maternal touches are temporally aligned with speech, and that mothers tend to touch their infants in locations congruent with the names of body parts. Thus, tactile cues could potentially aid both infant word segmentation and word learning.
Infants' experiences are defined by the presence of concurrent streams of perceptual information in social environments. Touch from caregivers is an especially pervasive feature of early development. Using three lab experiments and a corpus of naturalistic caregiver-infant interactions, we examined the relevance of touch in supporting infants' learning of structure in an altogether different modality: audition. In each experiment, infants listened to sequences of sine-wave tones following the same abstract pattern (e.g., ABA or ABB) while receiving time-locked touch sequences from an experimenter that provided either informative or uninformative cues to the pattern (e.g., knee-elbow-knee or knee-elbow-elbow). Results showed that intersensorily redundant touch supported infants' learning of tone patterns, but learning varied depending on the typicality of touch sequences in infants' lives. These findings suggest that infants track touch sequences from moment to moment and in aggregate from their caregivers, and use the intersensory redundancy provided by touch to discover patterns in their environment.
In the first year of life, the ability to engage in sustained synchronous interactions develops as infants learn to match social partner behaviors and sequentially regulate their behaviors in response to others. Difficulties developing competence in these early social building blocks can impact later language skills, joint attention, and emotion regulation. For children at elevated risk for autism spectrum disorder (ASD), early dyadic synchrony and responsiveness difficulties may be indicative of emerging ASD and/or developmental concerns. As part of a prospective developmental monitoring study, infant siblings of children with ASD (high-risk group, n = 104) or typical development (low-risk group, n = 71) and their mothers completed a standardized play task when infants were 6, 9, and/or 12 months of age. These interactions were coded for the frequency and duration of gaze, positive affect, and vocalizations, separately for infants and mothers. Using these codes, theory-driven composites were created to index dyadic synchrony and infant/maternal responsiveness. Multilevel models revealed significant risk group differences in dyadic synchrony and infant responsiveness by 12 months of age. In addition, high-risk infants with higher dyadic synchrony and infant responsiveness at 12 months received significantly higher receptive and expressive language scores at 36 months. The findings of the present study highlight that promoting dyadic synchrony and responsiveness may aid in advancing optimal development in children at elevated risk for autism.
Purpose: Caregivers may show greater use of nonauditory signals in interactions with children who are deaf or hard of hearing (DHH). This study explored the frequency of maternal touch and the temporal alignment of touch with speech in the input to children who are DHH and age-matched peers with normal hearing. Method: We gathered audio and video recordings of mother–child free-play interactions. Maternal speech units were annotated from audio recordings, and touch events were annotated from video recordings. Analyses explored the frequency and duration of touch events and the temporal alignment of touch with speech. Results: Greater variance was observed in the frequency of touch and its total duration in the input to children who are DHH. Furthermore, touches produced by mothers of children who are DHH were significantly more likely to be aligned with speech than touches produced by mothers of children with normal hearing. Conclusion: Caregivers' modifications in the input to children who are DHH are observed in the combination of speech with touch. The implications of such patterns and how they may impact children's attention and access to the speech signal are discussed.
Much of our basic understanding of cognitive and social processes in infancy relies on measures of looking time, and specifically on infants’ visual preference for a novel or familiar stimulus. However, despite being the foundation of many behavioral tasks in infant research, the determinants of infants’ visual preferences are poorly understood, and differences in the expression of preferences can be difficult to interpret. In this large-scale study, we test predictions from the Hunter and Ames model of infants' visual preferences. We investigate the effects of three factors predicted by this model to determine infants’ preference for novel versus familiar stimuli: age, stimulus familiarity, and stimulus complexity. Drawing from a large and diverse sample of infant participants (N = XX), this study will provide crucial empirical evidence for a robust and generalizable model of infant visual preferences, leading to a more solid theoretical foundation for understanding the mechanisms that underlie infants’ responses in common behavioral paradigms. Moreover, our findings will guide future studies that rely on infants' visual preferences to measure cognitive and social processes.