The ability of newborns to discriminate and respond to different emotional facial expressions remains controversial. We conducted three experiments testing newborns’ preferences among, and ability to discriminate between, neutral, fearful, and happy facial expressions, using visual preference and habituation procedures. In the first two experiments, we found no evidence that newborns discriminate between, or prefer, a fearful versus a neutral face. In the third experiment, newborns looked significantly longer at a happy facial expression than at a fearful one. We raise the possibility that this preference reflects experience acquired over the first few days of life. These results show that at least some expressions are discriminated and preferred by newborns only a few days old.
When we sense a touch, our brains take account of our current limb position to determine the location of that touch in external space [1, 2]. Here we show that changes in the way the brain processes somatosensory information during the first year of life underlie the origins of this ability [3]. In three experiments we recorded somatosensory evoked potentials (SEPs) from 6.5-, 8-, and 10-month-old infants while presenting vibrotactile stimuli to their hands in uncrossed- and crossed-hands postures. At all ages we observed SEPs over central regions contralateral to the stimulated hand. Somatosensory processing was influenced by arm posture from 8 months onward. At 8 months, posture influenced mid-latency SEP components, but by 10 months effects were observed at earlier components associated with feed-forward stages of somatosensory processing. Furthermore, sight of the hands was a necessary prerequisite for somatosensory remapping at 10 months. Thus, the cortical networks [4] underlying the ability to dynamically update the location of a perceived touch across limb movements become functional during the first year of life. Up until at least 6.5 months of age, it seems that human infants' perceptions of the location of tactile stimuli in the external environment do not take account of current limb position.
Event-related potentials were recorded from adults and 4-month-old infants while they watched pictures of faces that varied in emotional expression (happy or fearful) and in gaze direction (direct or averted). The results indicate that the processing of emotional expression is temporally independent of gaze-direction processing at early stages and that the two become integrated only at later latencies. Facial expression affected the face-sensitive ERP components in both adults (N170) and infants (N290 and P400), whereas gaze direction and the interaction between facial expression and gaze affected posterior channels in adults and frontocentral channels in infants. Specifically, in adults this interaction reflected greater responsiveness to fearful expressions with averted gaze (an avoidance-oriented emotion) and to happy faces with direct gaze (an approach-oriented emotion). In infants, a larger response to happy expressions was found at the frontocentral negative component (Nc), and planned comparisons showed that it was driven by the direct-gaze condition. Taken together, these results support the shared signal hypothesis in adults, but only to a lesser extent in infants, suggesting that experience may play an important role.