Emojis are widely used to express emotional states and have recently attracted research attention as potential assessment tools. However, how they correspond to human emotional states remains unclear. This study therefore aimed to classify emojis along the valence and arousal axes and to examine how that classification relates to human emotional states. In an online survey of 1082 participants, a nine-point scale was used to rate the valence and arousal of 74 facial emojis. A cluster analysis grouped the emojis into six clusters on the two axes, and a one-way analysis of variance showed that the clusters differed across six levels of valence and four levels of arousal. The clusters were interpreted as (1) strongly negative, (2) moderately negative, (3) neutral with a negative bias, (4) neutral with a positive bias, (5) moderately positive, and (6) strongly positive sentiment. Facial emojis were thus found to express human emotional states comprehensively.
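To make the clustering step concrete, the sketch below groups per-emoji mean (valence, arousal) ratings into six clusters. The abstract does not name the clustering algorithm, so Ward's hierarchical clustering is assumed here purely for illustration, and the ratings are invented, not the study's data.

```python
# Minimal sketch: clustering mean (valence, arousal) ratings of emojis.
# The study's exact algorithm and data are not given in the abstract;
# Ward linkage and the toy ratings below are illustrative assumptions.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Hypothetical per-emoji mean ratings on the nine-point scale:
# columns are (valence, arousal); the real study rated 74 emojis.
ratings = np.array([
    [1.8, 6.5], [2.1, 6.9],  # strongly negative
    [3.2, 5.0], [3.5, 4.6],  # moderately negative
    [4.4, 3.1], [4.6, 3.0],  # neutral, negative bias
    [5.5, 3.2], [5.7, 3.4],  # neutral, positive bias
    [6.9, 5.2], [7.1, 5.0],  # moderately positive
    [8.3, 6.8], [8.5, 7.0],  # strongly positive
])

tree = linkage(ratings, method="ward")
labels = fcluster(tree, t=6, criterion="maxclust")
for k in range(1, 7):
    members = ratings[labels == k]
    print(f"cluster {k}: mean valence={members[:, 0].mean():.1f}, "
          f"mean arousal={members[:, 1].mean():.1f}")
```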
This study examined the effects of hand movement on the visual perception of 3-D movement. I used an apparatus in which, via a mirror, the position of a cursor in a simulated 3-D space coincided with the position of a stylus on a haptic device. In three experiments, participants touched the center of a rectangle in the visual display with the stylus of the force-feedback device. The rectangle's surface then stereoscopically either protruded toward or indented away from the participant. Simultaneously, the stylus either pushed the participant's hand back, pulled it away, or remained static; visual and haptic information were manipulated independently. Participants judged whether the rectangle visually protruded or was indented. When the hand was pulled away, participants were biased to perceive the rectangles as indented; when the hand was pushed back, however, haptic information had no effect (Experiment 1). This bias persisted even when the cursor position was spatially separated from the hand position (Experiment 2), but it disappeared when participants touched an object different from the visual stimulus (Experiment 3). These results suggest that the visual system attempts to integrate dynamic visual and haptic information when the two cognitively coincide, and that the effect of haptic information on visually perceived depth is direction-dependent.
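The integration claim in the final sentence can be illustrated with a reliability-weighted (maximum-likelihood) cue-combination sketch, a standard model of visual-haptic integration. The study itself does not fit this model, and all values below are invented for illustration.

```python
# Sketch of reliability-weighted (maximum-likelihood) cue integration,
# one standard account of visual-haptic combination. This is not the
# study's own analysis; the numbers are illustrative assumptions.
import numpy as np

def integrate(mu_v, sigma_v, mu_h, sigma_h):
    """Combine visual and haptic depth estimates, weighting each cue
    by its reliability (inverse variance)."""
    w_v = sigma_h**2 / (sigma_v**2 + sigma_h**2)
    mu = w_v * mu_v + (1 - w_v) * mu_h
    sigma = np.sqrt((sigma_v**2 * sigma_h**2) / (sigma_v**2 + sigma_h**2))
    return mu, sigma

# Visual cue suggests the surface is indented by 5 mm (negative =
# indented); the hand being pulled away suggests 8 mm but is noisier.
depth, sd = integrate(mu_v=-5.0, sigma_v=2.0, mu_h=-8.0, sigma_h=4.0)
print(f"combined estimate: {depth:.1f} mm (sd {sd:.1f} mm)")
```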
The perceived temporal order of successive external events does not always follow their physical order. We examined the contribution of self-motion mechanisms to the perception of temporal order in the auditory modality. We measured perceptual biases in judging the temporal order of two short sounds presented in succession while participants experienced visually induced self-motion (yaw-axis circular vection) elicited by viewing long-lasting large-field visual motion. In experiment 1, a pair of white-noise bursts was presented through headphones at various stimulus-onset asynchronies while participants experienced visually induced self-motion. The perceived temporal order of the sounds was modulated by the direction of the visual motion (or self-motion): the sound presented to the ear opposite the visual motion (i.e., in the heading direction) was perceived earlier than the sound presented to the ear on the same side. Experiments 2A and 2B were designed to reduce the contributions of decisional and/or response processes. In experiment 2A, the directional cueing of the background (left or right) and the response dimension (high or low pitch) were not spatially associated. In experiment 2B, participants were additionally asked to report which of the two sounds was perceived second. Results were nearly identical to those of experiment 1, suggesting that the change in the temporal order of auditory events during large-field visual motion reflects a change in perceptual processing. Experiment 3 showed that the same biases in temporal-order judgments were also produced by concurrent actual self-motion on a rotating chair. In experiment 4, using a small display, we showed that mere prolonged exposure to visual motion without a sensation of self-motion did not produce the phenomenon. These results are consistent with previous studies reporting changes in the perceived temporal order of visual or tactile events depending on the direction of self-motion. Hence, self-motion induced by large-field visual motion (i.e., optic flow) can affect the perceived temporal order of successive external events across modalities.
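A temporal-order bias of this kind is commonly quantified as a shift in the point of subjective simultaneity (PSS), the stimulus-onset asynchrony at which both orders are reported equally often. The abstract does not state the fitting procedure, so the sketch below assumes a cumulative-Gaussian psychometric fit, with invented response proportions.

```python
# Hedged sketch: estimating a temporal-order bias (PSS) from judgments
# at several SOAs. The study's actual analysis is not stated; a
# cumulative-Gaussian fit is one standard choice, and the data below
# are invented for illustration.
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

# SOA in ms: negative = left ear's sound first.
soa = np.array([-120, -80, -40, 0, 40, 80, 120])
# Hypothetical proportion of "right sound first" responses during
# rightward vection; a nonzero fitted PSS indicates a perceptual bias.
p_right_first = np.array([0.05, 0.12, 0.30, 0.58, 0.80, 0.93, 0.97])

def psychometric(x, pss, sigma):
    """Cumulative Gaussian: probability of a 'right first' response."""
    return norm.cdf(x, loc=pss, scale=sigma)

(pss, sigma), _ = curve_fit(psychometric, soa, p_right_first, p0=(0.0, 50.0))
print(f"PSS = {pss:.1f} ms, slope sigma = {sigma:.1f} ms")
```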
In this study, we investigate a model of the human strategy for path selection in an open space. In the experiment, a moving obstacle was displayed in an immersive virtual reality system; subjects were required to walk from a start point to a goal point in the virtual space, and their walking trajectories were measured. To account for the experimental results, we propose a model of the path-selection strategy composed of two stages: global path planning and local direction control. In the global path-planning stage, the model determines a transit position that minimizes a cost consisting of a walking-distance term and a dangerousness term. In the local direction-control stage, a walking direction is computed at every step to avoid the moving obstacle. The simulation results generally accounted well for the experimental results. Moreover, we applied the model to the results of another experiment to test its generality.
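To make the two-stage structure concrete, here is a minimal sketch assuming a simple form for the cost function and the steering rule; the summary does not give the actual terms, weights, or control law, so the forms and values below are illustrative assumptions only.

```python
# Illustrative sketch of the two-stage path-selection model: a global
# stage picks a transit point minimizing distance-plus-danger cost, and
# a local stage steers each step toward it while avoiding the obstacle.
import numpy as np

def transit_cost(p, start, goal, obstacle, w_dist=1.0, w_danger=4.0):
    """Global stage: cost of passing through transit point p, trading
    off total walking distance against dangerousness, modeled here
    (as an assumption) as inverse distance to the obstacle."""
    dist = np.linalg.norm(p - start) + np.linalg.norm(goal - p)
    danger = 1.0 / max(np.linalg.norm(p - obstacle), 1e-6)
    return w_dist * dist + w_danger * danger

def choose_transit(start, goal, obstacle, candidates):
    """Pick the candidate transit position with minimum cost."""
    return min(candidates, key=lambda p: transit_cost(p, start, goal, obstacle))

def step_direction(pos, target, obstacle, avoid_gain=0.5):
    """Local stage: per-step heading toward the transit point, with a
    repulsive component pointing away from the (moving) obstacle."""
    to_target = target - pos
    away = pos - obstacle
    heading = (to_target / np.linalg.norm(to_target)
               + avoid_gain * away / np.linalg.norm(away)**2)
    return heading / np.linalg.norm(heading)

start, goal = np.array([0.0, 0.0]), np.array([10.0, 0.0])
obstacle = np.array([5.0, 0.5])
candidates = [np.array([5.0, y]) for y in np.linspace(-3, 3, 13)]
transit = choose_transit(start, goal, obstacle, candidates)
print("chosen transit point:", transit)
print("first step direction:", step_direction(start, transit, obstacle))
```

With these assumed weights, the model detours to the side of the obstacle opposite its offset, illustrating how the distance and dangerousness terms trade off against each other.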