Efforts to understand how the brain processes emotional information have increased rapidly in recent years. New studies are shedding light on the complex and diverse nature of these processes and on the speed with which emotional information is assimilated to ensure effective social behavior. The emotional significance of incoming events needs to be evaluated within milliseconds, followed by more conceptually based knowledge processing, which often regulates emotionally appropriate behaviors (cf. Phillips, Drevets, Rauch, & Lane, 2003). In everyday situations, emotional stimuli are rarely processed in isolation; rather, we are confronted with a stream of incoming information or events from different sources. To advance knowledge of how our brain successfully processes emotions from different information sources and how these sources influence each other, we investigated how and when emotional tone of voice influences the processing of an emotional face, a situation that occurs routinely in face-to-face social interactions.
Effects of Emotional Context

The importance of context in emotion perception has been emphasized by several researchers (e.g., de Gelder et al., 2006; Feldman Barrett, Lindquist, & Gendron, 2007; Russell & Fehr, 1987). Context may be defined as the situation or circumstances that precede and/or accompany certain events, and both verbal contexts (created by language use) and social contexts (created by situations, scenes, etc.) have been shown to influence emotion perception. For example, depending on how the sentence, "Your parents are here? I can't wait to see them," is intoned, the speaker may be interpreted either as being pleasantly surprised by the unforeseen event or as feeling the very opposite. Here, emotional prosody, that is, the acoustic variations of perceived pitch, intensity, speech rate, and voice quality, serves as a context for interpreting the verbal message. Emotional prosody can influence not only the interpretation of a verbal message but also the early perception of facial expressions (Massaro & Egan, 1996; Pourtois, de Gelder, Vroomen, Rossion, & Crommelinck, 2000). The latter reports have led to the proposal that emotional information in the prosody and face channels is automatically evaluated and integrated early during the processing of these events (de Gelder et al., 2006).

The influence of emotional prosody on the evaluation of emotional facial expressions was investigated in an event-related brain potential (ERP) study using a priming paradigm, the facial affective decision task. Emotional prosodic fragments of short (200-msec) and medium (400-msec) duration were presented as primes, followed by an emotionally related or unrelated facial expression (or a facial grimace, which does not resemble an emotion). Participants judged whether or not the facial expression represented an emotion. ERP results revealed an N400-like differentiation for emotionally related prime-target pairs when compared with unrelated prime-target pairs.
Faces preceded by prosodic primes of mediu...