We present the methods used to capture a library of human movements for computer-animated displays of human movement. The library is an attempt to systematically tap into and represent the wide range of personal properties, such as identity, gender, and emotion, that are available in a person's movements. Movements were captured from 30 nonprofessional actors (15 of them female) while they performed walking, knocking, lifting, and throwing actions, as well as their combination, in angry, happy, neutral, and sad affective styles. From the raw motion-capture data, a library of 4,080 movements was obtained using techniques based on Character Studio (plug-ins for 3D Studio MAX, AutoDesk, Inc.), MATLAB (The MathWorks, Inc.), or a combination of the two. For the knocking, lifting, and throwing actions, 10 repetitions of the simple action unit were obtained for each affect; for the other actions, two longer movement recordings were obtained for each affect. We discuss the potential use of the library for computational and behavioral analyses of movement variability, for human character animation, and for studying how gender, emotion, and identity are encoded and decoded from human movement.
We examined how the recognition of facial emotion was influenced by manipulation of both spatial and temporal properties of 3-D point-light displays of facial motion. We started with the measurement of 3-D position of multiple locations on the face during posed expressions of anger, happiness, sadness, and surprise, and then manipulated the spatial and temporal properties of the measurements to obtain new versions of the movements. In two experiments, we examined recognition of these original and modified facial expressions: in experiment 1, we manipulated the spatial properties of the facial movement, and in experiment 2 we manipulated the temporal properties. The results of experiment 1 showed that exaggeration of facial expressions relative to a fixed neutral expression resulted in enhanced ratings of the intensity of that emotion. The results of experiment 2 showed that changing the duration of an expression had a small effect on ratings of emotional intensity, with a trend for expressions with shorter durations to have lower ratings of intensity. The results are discussed within the context of theories of encoding as related to caricature and emotion.

Introduction

The ability to perceive human action through point-light motion displays (Johansson 1973) has long been used as a way to explore motion processing and the recognition of human movement. Owing to its unique function in human communication, one question which has received attention is whether information about emotional state can be derived from movement. Investigations of dance movements indicate that the emotions of surprise, fear, anger, disgust, grief/sadness, and joy/happiness can be recognised at above chance levels (Walk and Homan 1984; Dittrich et al 1996). Even arm movements alone performing simple actions have been shown to convey information about affect (Pollick et al 2001b).
Of particular interest for communication of emotion is the nonrigid motion of the face, and research by Bassili (1978, 1979) has indicated that point-light displays of facial motion can be effective stimuli for the recognition of emotion. Here, we investigate whether the recognition of emotion from point-light displays can be enhanced through the manipulation of spatial and temporal properties of facial movement. It is well known that exaggerating the spatial features of static images of faces can enhance recognition of a number of facial characteristics (eg facial identity, facial expression, age, and attractiveness). The initial research in this area looked at facial-identity recognition, and showed that line-drawing or photographic-quality caricatures of faces produced better recognition than the original (undistorted) faces (Rhodes et al 1987; Benson and Perrett 1991; Calder et al 1996). In addition, they showed that images in which the facial features were made less distinctive, so-called anticaricatures, were less readily recognised than the original faces. The images in these different studies were prepared by the same basic computer-based procedu...
Vocal pitch has been found to influence judgments of perceived trustworthiness and dominance from a novel voice. However, the majority of findings arise from using only male voices and in context-specific scenarios. In two experiments, we first explore the influence of average vocal pitch on first-impression judgments of perceived trustworthiness and dominance, before establishing the existence of an overall preference for high or low pitch across genders. In Experiment 1, pairs of high- and low-pitched temporally reversed recordings of male and female vocal utterances were presented in a two-alternative forced-choice task. Results revealed a tendency to select the low-pitched voice over the high-pitched voice as more trustworthy, for both genders, and more dominant, for male voices only. Experiment 2 tested an overall preference for low-pitched voices, and whether judgments were modulated by speech content, using forward and reversed speech to manipulate context. Results revealed an overall preference for low pitch, irrespective of direction of speech, in male voices only. No such overall preference was found for female voices. We propose that an overall preference for low pitch is a default prior in male voices irrespective of context, whereas pitch preferences in female voices are more context- and situation-dependent. The present study confirms the important role of vocal pitch in the formation of first-impression personality judgments and advances understanding of the impact of context on pitch preferences across genders.
Background: During the past decade the prevalence of autistic spectrum disorders (ASD) has increased to 1% of the UK population. Anecdotal reports of children with ASD suggest they are atypical selective eaters who restrict their food intake based on idiosyncratic prerequisites of texture and food presentation. Research has previously focused on the nutritional adequacy of such diets rather than the underlying sensory processing capabilities that are known to be impaired in up to 90% of children with ASD (Leekham et al., 2007). Sensory processing skills, eating behaviour and parent-child relationships are intersecting aspects of ASD, and each plays a role in achieving maximum potential in developing daily living skills. Previous research relating sensory processing ability and eating behaviour is limited. The aim of the study was to explore the relationship between sensory processing ability and eating behaviour in children with ASD. Methods: This analytical, exploratory study used purposive sampling, approaching a parental support group for children with ASD in October 2009. Consent was sought from parents or guardians to complete two questionnaires: the validated Short Sensory Profile (SSP) and an eating behaviour questionnaire (EBQ) developed by the author, consisting of eight domains and four open questions. The domains of the EBQ were taste, texture, smell, behaviour, sound, environment, vision and touch. The domains were selected to reflect the SSP, and as a result of qualitative research from parental and specialist input and the literature. Results of both questionnaires were statistically analysed using SPSS. Domains within each questionnaire were presented as mean (SD), and performance-related data (SSP only) were presented as 'typical performance', 'probable difference' and 'definite difference' to indicate the degree of sensory processing difficulty (Dunn, 1999).
Spearman rank correlation coefficients were used to explore correlations between domains within the two questionnaires. Ethical approval was gained from Leeds Metropolitan University Research Ethics Committee. Results: All parents (n = 20) participated in the study, completing questionnaires for 20 children (18 male, two female), mean age 10.8 (2.6) years. Results of the SSP indicated that 71% of children in the study had sensory processing difficulties, of which auditory filtering (83%) and tactile sensitivity (68%) were 'definite' problem areas. Within the eating behaviour questionnaire, mealtime behaviour, vision, taste and smell were common areas that caused difficulty in daily life. Significant correlations between 20% of factors on the SSP and the eating behaviour questionnaire were noted, the highest being SSP taste/smell and EBQ taste (r = 0.9, P < 0.001) and SSP taste/smell and EBQ vision (r = 0.9, P < 0.001). Discussion: This pilot study confirmed previous reports of sensory processing difficulties in children with autism (Tomcheck and Dunn, 2007) and provides preliminary data indicating a potential relationship between aspects of sensory process...
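The correlation statistic used above can be sketched in a few lines. This is a minimal pure-Python illustration of Spearman's rho (the Pearson correlation computed on average ranks, which is how ties are conventionally handled); the domain scores shown are hypothetical, not the study's actual SSP/EBQ data:

```python
from statistics import mean

def rank(values):
    # Assign 1-based ranks, giving tied values their average rank.
    sorted_vals = sorted(values)
    avg_rank = {}
    for v in set(values):
        positions = [i + 1 for i, s in enumerate(sorted_vals) if s == v]
        avg_rank[v] = mean(positions)
    return [avg_rank[v] for v in values]

def spearman_rho(x, y):
    # Spearman's rho: the Pearson correlation of the two rank vectors.
    rx, ry = rank(x), rank(y)
    mx, my = mean(rx), mean(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx)
           * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

# Hypothetical domain scores for five children (illustration only,
# not the study's data)
ssp_taste_smell = [12, 9, 15, 7, 11]
ebq_taste = [10, 8, 14, 6, 12]
print(round(spearman_rho(ssp_taste_smell, ebq_taste), 3))  # → 0.9
```

Because the statistic operates on ranks rather than raw scores, it captures any monotonic association, which suits ordinal questionnaire domains such as these.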
We aim to understand the difference in stigma and discrimination, in particular sexual rejection, experienced between gay and heterosexual men living with HIV in the UK. The People Living with HIV Stigma Survey UK 2015 recruited a convenience sample of people with HIV through over 120 cross-sector community organisations and 46 HIV clinics to complete an online survey. In total, 1162 men completed the survey: 969 (83%) gay men and 193 (17%) heterosexual men; 92% were on antiretroviral therapy. Compared to heterosexual men, gay men were significantly more likely to report worrying about workplace treatment in relation to their HIV (21% vs. 11%), worrying about HIV-related sexual rejection (42% vs. 21%), avoiding sex because of their HIV status (37% vs. 23%), and experiencing HIV-related sexual rejection (27% vs. 9%) in the past 12 months. In a multivariate logistic regression controlling for other sociodemographic factors, being gay was a predictor of reporting HIV-related sexual rejection in the past 12 months (aOR 2.17, CI 1.16–4.02). Both gay and heterosexual men living with HIV experienced stigma and discrimination in the past 12 months, and this was higher for gay men in terms of HIV-related sexual rejection. Given the high proportion of men reporting sexual rejection, greater awareness and education about the low risk of HIV transmission among people on effective treatment is needed to reduce stigma and sexual prejudice towards people living with HIV.
In addition to benefiting reproducibility and transparency, one of the advantages of using R is that, because of its open-source nature, researchers have a much larger range of fully customizable data visualization options than are typically available in point-and-click software. These visualization options not only look attractive but can also increase transparency about the distribution of the underlying data, rather than relying on commonly used visualizations of aggregations, such as bar charts of means. In this tutorial, we provide a practical introduction to data visualization using R, specifically aimed at researchers who have little to no prior experience of using R. First, we detail the rationale for using R for data visualization and introduce the “grammar of graphics” that underlies data visualization using the ggplot2 package. The tutorial then walks the reader through how to replicate plots that are commonly available in point-and-click software, such as histograms and box plots, and shows how the code for these “basic” plots can be easily extended to less commonly available options, such as violin-box plots. The data set and code used in this tutorial and an interactive version with activity solutions, additional resources, and advanced plotting options are available at https://osf.io/bj83f/ .
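The layered extension described above can be sketched in a few lines of ggplot2. This is a minimal illustration using a small hypothetical data frame `df` (not the tutorial's data set), assuming ggplot2 is installed; the violin-box plot is obtained simply by stacking one extra geom layer on the basic box plot:

```r
library(ggplot2)

# Hypothetical data: 50 scores in each of two groups
df <- data.frame(
  group = rep(c("A", "B"), each = 50),
  score = c(rnorm(50, mean = 10), rnorm(50, mean = 12))
)

# A "basic" box plot, as available in point-and-click software
ggplot(df, aes(x = group, y = score)) +
  geom_boxplot()

# Extended into a violin-box plot by adding a layer: the violin
# shows the full distribution, the narrow box the summary statistics
ggplot(df, aes(x = group, y = score)) +
  geom_violin(trim = FALSE) +
  geom_boxplot(width = 0.2)
```

This additive, layer-by-layer construction is the "grammar of graphics" idea the tutorial builds on: each plot is a data mapping (`aes`) plus a stack of geoms, so moving from a common plot to a less common one is a one-line change.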