Background: Listening to music is among the most rewarding experiences for humans. Music has no functional resemblance to other rewarding stimuli and has no demonstrated biological value, yet individuals continue listening to music for pleasure. It has been suggested that the pleasurable aspects of music listening are related to a change in emotional arousal, although this link has not been directly investigated. In this study, using methods of high temporal sensitivity, we investigated whether there is a systematic relationship between dynamic increases in pleasure states and physiological indicators of emotional arousal, including changes in heart rate, respiration, electrodermal activity, body temperature, and blood volume pulse.
Methodology: Twenty-six participants listened to self-selected intensely pleasurable music and “neutral” music that was individually selected for them based on low pleasure ratings they provided on other participants' music. The “chills” phenomenon was used to index intensely pleasurable responses to music. During music listening, continuous real-time recordings of subjective pleasure states and simultaneous recordings of sympathetic nervous system activity, an objective measure of emotional arousal, were obtained.
Principal Findings: Results revealed a strong positive correlation between ratings of pleasure and emotional arousal. Importantly, a dissociation was revealed: individuals who did not experience pleasure also showed no significant increases in emotional arousal.
Conclusions/Significance: These results have broader implications by demonstrating that strongly felt emotions can be rewarding in themselves, in the absence of a physically tangible reward or a specific functional goal.
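The reported pleasure–arousal relationship is a correlation between two continuous time series. As a minimal sketch of that analysis, with made-up numbers that stand in for the study's data (the `pleasure` and `eda` tracks below are illustrative only), Pearson's r between a real-time rating track and an electrodermal-activity track could be computed like this:

```python
import math

def pearson_r(x, y):
    """Pearson correlation between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical 10-sample tracks: continuous pleasure ratings (0-10 scale)
# and electrodermal activity (microsiemens); values are illustrative only.
pleasure = [1, 2, 2, 4, 6, 8, 9, 7, 5, 3]
eda = [2.1, 2.3, 2.2, 2.9, 3.6, 4.4, 4.8, 4.1, 3.3, 2.6]
print(round(pearson_r(pleasure, eda), 3))  # strong positive correlation, r near 1
```

In practice the two signals would first be resampled to a common rate; the correlation itself is the same computation.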
We have developed a prototype device for take-home use that can be used in the treatment of amblyopia. The therapeutic scenario we envision involves patients first visiting a clinic, where their vision parameters are assessed and suitable parameters are determined for therapy. Patients then proceed with the actual therapeutic treatment on their own, using our device, which consists of an Apple iPod Touch running a specially modified game application. Our rationale for choosing to develop the prototype around a game stems from multiple requirements that such an application satisfies. First, system operation must be sufficiently straightforward that ease-of-use is not an obstacle. Second, the application itself should be compelling and motivate use more so than a traditional therapeutic task if it is to be used regularly outside of the clinic. This is particularly relevant for children, as compliance is a major issue for current treatments of childhood amblyopia. However, despite the traditional opinion that treatment of amblyopia is only effective in children, our initial results add to the growing body of evidence that improvements in visual function can be achieved in adults with amblyopia.
Numerous devices have been invented with three or more degrees of freedom (DoF) to compensate for the assumed limitations of the 2 DoF mouse in the execution of 3D tasks. Nevertheless, the mouse remains the dominant input device in desktop 3D applications, which leads us to pose the following question: is the dominance of the mouse due simply to its widespread availability and long-term user habituation, or is the mouse, in fact, more suitable than dedicated 3D input devices to an important subset of 3D tasks? In the two studies reported in this paper, we measured performance efficiency of a group of subjects in accomplishing a 3D placement task and also observed physiological indicators through biosignal measurements. Subjects used both a standard 2D mouse and three other 3 DoF input devices. Much to our surprise, the standard 2D mouse outperformed the 3D input devices in both studies.
Animals with front-facing eyes benefit from a substantial overlap in the visual fields of the two eyes, and devote specialized brain processes to using the horizontal spatial disparities produced by viewing the same object with two laterally separated eyes to derive depth, or 3-D stereo, information. This provides the advantage of breaking the camouflage of objects in front of a similarly textured background, and improves hand-eye coordination for grasping objects close at hand. It is widely thought that about 5% of the population have a lazy eye and lack stereo vision, so it is often supposed that most of the population (95%) have good stereo abilities. We show that this is not the case: 68% have good to excellent stereo (the haves) and 32% have moderate to poor stereo (the have-nots). Why so many people lack good 3-D stereo vision is unclear, but the cause is likely to be neural and reversible.
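The depth-from-disparity computation this passage alludes to follows the standard pinhole stereo geometry, Z = f·B/d, where f is focal length, B the baseline between the two eyes (or cameras), and d the horizontal disparity. A minimal sketch with illustrative numbers (not taken from the study):

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Depth in metres under the pinhole stereo model: Z = f * B / d.

    focal_px: focal length in pixels; baseline_m: separation of the two
    viewpoints in metres; disparity_px: horizontal disparity in pixels.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# Illustrative values: ~0.065 m human interocular distance, a nominal
# 800-pixel focal length, and a 10-pixel disparity -> depth of about 5.2 m.
print(depth_from_disparity(800, 0.065, 10))
```

Because disparity appears in the denominator, depth resolution degrades rapidly with distance, which is why stereo is most useful for objects "close at hand."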
Amblyopia, a developmental disorder of the visual cortex, is one of the leading causes of visual dysfunction in the working-age population. Current estimates put the prevalence of amblyopia at approximately 1-3%.
With increasing reliance on the location and orientation sensors in smartphones, not only for augmented reality applications but also for meeting government-mandated emergency response requirements, the reliability of these sensors is a matter of great importance. Previous studies have measured the accuracy of location sensing, typically GPS, in handheld devices including smartphones, but few have done the same for the compass or gyroscope (gyro) sensors, especially in real-world augmented reality situations. In this study, we measure the reliability of both the location and orientation capabilities of three current-generation smartphones: the Apple iPhone 4 and iPhone 4s (iOS), and the Samsung Galaxy Nexus (Android). Each is tested in three different orientation/body-position combinations and in varying environmental conditions, in order to obtain quantifiable information useful for understanding the practical limits of these sensors when designing applications that rely on them. Results show mean location errors of 10-30 m and mean compass errors of around 10-30°, but with high standard deviations for both, making them unreliable in many settings.
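One subtlety in summarizing compass error is that headings are circular: 359° and 1° are only 2° apart, so a naive arithmetic mean of raw readings is misleading near the wraparound. A sketch of the wrap-aware error computation, using illustrative readings rather than the study's data:

```python
def angular_error(measured_deg, true_deg):
    """Smallest signed heading difference, in degrees, in [-180, 180)."""
    return (measured_deg - true_deg + 180.0) % 360.0 - 180.0

def mean_abs_error(measured, true_deg):
    """Mean absolute heading error over a list of compass readings."""
    return sum(abs(angular_error(m, true_deg)) for m in measured) / len(measured)

# Illustrative readings around a true heading of 350 degrees; the 2-degree
# reading has wrapped past north, which a naive mean would mishandle.
readings = [355.0, 348.0, 2.0, 340.0, 359.0]
print(round(mean_abs_error(readings, 350.0), 1))  # 7.6
```

The same wraparound care applies when computing the standard deviation of compass error that the study reports.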
Background: The haptic perception of ground compliance is used for stable regulation of dynamic posture and the control of locomotion in diverse natural environments. Although rarely investigated in relation to walking, vibrotactile sensory channels are known to be active in the discrimination of material properties of objects and surfaces through touch. This study investigated how the perception of ground surface compliance is altered by plantar vibration feedback.
Methodology/Principal Findings: Subjects walked in shoes over a rigid floor plate that provided plantar vibration feedback, and responded indicating how compliant it felt, either in subjective magnitude or via pairwise comparisons. In one experiment, the compliance of the floor plate was also varied. Results showed that perceived compliance of the plate increased monotonically with vibration feedback intensity, and depended to a lesser extent on the temporal or frequency distribution of the feedback. When both plate stiffness (inverse compliance) and vibration amplitude were manipulated, the effect persisted, with both factors contributing to compliance perception. A significant influence of vibration was observed even for amplitudes close to psychophysical detection thresholds.
Conclusions/Significance: These findings reveal that vibrotactile sensory channels are highly salient to the perception of surface compliance, and suggest that correlations between vibrotactile sensory information and motor activity may be of broader significance for the control of human locomotion than has been previously acknowledged.
Locomotion generates multisensory information about walked-upon objects. How perceptual systems use such information to get to know the environment remains unexplored. The ability to identify solid (e.g., marble) and aggregate (e.g., gravel) walked-upon materials was investigated in auditory, haptic or audio-haptic conditions, and in a kinesthetic condition where tactile information was perturbed with a vibromechanical noise. Overall, identification performance was better than chance in all experimental conditions and for both solids and the better identified aggregates. Despite large mechanical differences between the response of solids and aggregates to locomotion, for both material categories discrimination was at its worst in the auditory and kinesthetic conditions and at its best in the haptic and audio-haptic conditions. An analysis of the dominance of sensory information in the audio-haptic context supported a focus on the most accurate modality, haptics, but only for the identification of solid materials. When identifying aggregates, response biases appeared to produce a focus on the least accurate modality: kinesthesia. When walking on loose materials such as gravel, individuals do not perceive surfaces by focusing on the most accurate modality, but by focusing on the modality that would most promptly signal postural instabilities.