Perception of facial expressions of emotion is generally assumed to correspond to underlying muscle movement. However, it is often observed that some individuals appear to have sadder or angrier faces than others, even when the face is neutral and motionless. Here, we report on one such effect caused by simple static configural changes. In particular, we show four variations in the relative vertical position of the nose, mouth, eyes, and eyebrows that affect the perception of emotion in neutral faces. The first two configurations make the vertical distance between the eyes and mouth shorter than average, resulting in the perception of an angrier face. The other two configurations make this distance larger than average, resulting in the perception of sadness. These percepts strengthen with the amount of configural change, suggesting a representation based on variations from a norm (prototypical) face.
A device has been designed to simultaneously measure the vertical pressure and the anterior-posterior and medial-lateral distributed shearing forces under the plantar surface of the foot. The device uses strain gauge technology and consists of 16 individual transducers (each with a surface area measuring 2.5 × 2.5 cm) arranged in a 4 × 4 array. The sampling frequency is 37 Hz and data may be collected for 2 s. The device was calibrated under both static and dynamic conditions and revealed excellent linearity (±5%), minimal hysteresis (±7.5%), and very good agreement between applied and measured loads (±5%). Vector addition of the distributed loads gave resultant forces that were qualitatively very similar to those obtained from a standard force plate. Data are presented for measurements from the forefoot of 4 diabetic subjects during the initiation of gait, demonstrating that distributed shear and pressure on the sole of the foot can be measured simultaneously.
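The vector-addition step described above can be sketched in a few lines. This is a hypothetical illustration, not the authors' implementation: the transducer layout (16 elements, each 2.5 × 2.5 cm) is taken from the abstract, but the reading format, function names, and sample values are assumptions.

```python
import math

# Each transducer reports a vertical pressure (Pa) plus anterior-posterior
# and medial-lateral shear forces (N). Summing the per-transducer force
# vectors yields a resultant comparable to a force-plate reading.

AREA_M2 = 0.025 * 0.025  # transducer surface area: 2.5 cm x 2.5 cm

def resultant_force(readings):
    """Sum 16 (vertical_pressure_Pa, ap_shear_N, ml_shear_N) readings
    into a single resultant force vector (Fz, Fap, Fml) in newtons."""
    fz = sum(p * AREA_M2 for p, _, _ in readings)  # pressure -> force
    fap = sum(ap for _, ap, _ in readings)
    fml = sum(ml for _, _, ml in readings)
    return fz, fap, fml

# One illustrative sample frame of 16 identical readings (not real data)
frame = [(120000.0, 0.4, -0.1)] * 16
fz, fap, fml = resultant_force(frame)
magnitude = math.sqrt(fz**2 + fap**2 + fml**2)
```

At the stated 37 Hz sampling rate, a 2 s collection window would produce roughly 74 such frames per transducer channel.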
Research suggests that configural cues (second-order relations) play a major role in the representation and classification of face images, making faces a “special” class of objects, since general object recognition seems to rely on different encoding mechanisms. It is less clear, however, how this representation emerges and whether it is also used in the recognition of facial expressions of emotion. In this paper, we show how configural cues emerge naturally from a classical analysis of shape in the recognition of anger and sadness. In particular, our results suggest that at least two of the dimensions of the computational (cognitive) space of facial expressions of emotion correspond to pure configural changes. The first of these dimensions measures the distance between the eyebrows and the mouth, while the second concerns the height-width ratio of the face. Under this proposed model, becoming a face “expert” would mean moving from the generic shape representation to one based on configural cues. These results suggest that the recognition of facial expressions of emotion shares this expertise property with other face-processing tasks.
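The two configural dimensions proposed above can be sketched as features measured relative to a norm face. This is a minimal illustration under assumed conventions: the norm values, landmark format, and decision rule are hypothetical, and the anger/sadness mapping follows the shorter/longer eye-mouth distance effect reported in these abstracts.

```python
# Hypothetical norm-face values (illustrative, normalized units)
NORM_BROW_MOUTH = 1.0   # norm vertical brow-to-mouth distance
NORM_ASPECT = 1.4       # norm face height-to-width ratio

def configural_features(brow_y, mouth_y, face_height, face_width):
    """Return deviations of the two configural cues from the norm face:
    (1) vertical brow-to-mouth distance, (2) height-width ratio."""
    brow_mouth = abs(mouth_y - brow_y)
    aspect = face_height / face_width
    return brow_mouth - NORM_BROW_MOUTH, aspect - NORM_ASPECT

def classify(brow_y, mouth_y, face_height, face_width):
    """Shorter-than-norm brow-mouth distance -> anger; longer -> sadness."""
    d_dist, _ = configural_features(brow_y, mouth_y, face_height, face_width)
    if d_dist < 0:
        return "anger"
    if d_dist > 0:
        return "sadness"
    return "neutral"
```

Because the features are signed deviations from a prototype, the magnitude of each deviation could serve as a graded intensity signal, consistent with the norm-based account of the perceptual effect.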