Common navigational aids used by blind travelers during large-scale navigation divert attention away from important cues in the immediate environment (e.g., approaching vehicles). Sensory augmentation devices, relying on principles similar to those at work in sensory substitution, can potentially bypass this attentional bottleneck through sub-cognitive implementation of a set of rules coupling motor actions with sensory stimulation. We provided a late blind subject with a vibrotactile belt that continually signals the direction of magnetic north. The subject completed a set of behavioral tests before and after an extended training period; the tests were complemented by questionnaires and interviews. The newly supplied information improved performance on different time scales. In a pointing task we demonstrated an instant improvement of performance based on the signal provided by the device. Furthermore, the signal was helpful in relevant daily tasks that are often complicated for the blind, such as keeping a direction over longer distances or taking shortcuts in familiar environments. A homing task with an additional attentional load demonstrated a significant improvement after training. The subject found the directional information highly expedient for adjusting his inner maps of familiar environments and described an increased feeling of security when exploring unfamiliar environments with the belt. The results give evidence for a firm integration of the newly supplied signal into the behavior of this late blind subject, with better navigational performance and more courageous behavior in unfamiliar environments. Most importantly, the complementary information provided by the belt led to a positive emotional impact with an enhanced feeling of security. The present experimental approach demonstrates the positive potential of sensory augmentation devices for assisting people with disabilities.
Facial mimicry is a central feature of human social interactions. Although it has been evidenced in other mammals, no study has yet shown that this phenomenon can reach the level of precision seen in humans and gorillas. Here, we studied the facial complexity of group-housed sun bears, a typically solitary species, with special focus on testing for exact facial mimicry. Our results provided evidence that the bears have the ability to mimic the expressions of their conspecifics and that they do so by matching the exact facial variants they interact with. In addition, the data showed that the bears produced open-mouth faces predominantly when they had the recipient's attention, suggesting a degree of social sensitivity. Our finding questions the relationship between communicative complexity and social complexity, and suggests that the capacity for complex facial communication is phylogenetically more widespread than previously thought.
Background: Children who are frequently aggressive or lack empathy show various deficits in their social information processing. Several findings suggest that children with conduct problems (CP) show a tendency to interpret ambiguous situations as hostile (hostile attribution bias) and have difficulty disengaging from negative stimuli (attentional bias). The role that additional callous-unemotional traits (CU-traits) play in these biases remains unclear. Investigating both attentional and attributional aspects of social information processing in children can help us understand where anomalies in the processing pathway occur and whether the biases are associated with CP and CU-traits separately or in an interactive manner. Methods: We compared three groups of children on a pictorial emotional Stroop task and a hostile attribution bias task: (a) 25 children with CP and low levels of CU-traits, (b) 25 children with CP and elevated levels of CU-traits, and (c) 50 typically developing children matched on gender (68% male), age (8-17 years) and intelligence score. Results: In contrast to our predictions, there were no significant group differences regarding attentional biases or hostile attribution biases. Boys with CP and high levels of CU-traits showed a significantly higher hostile attribution bias compared to girls with CP and high levels of CU-traits. The attention bias to angry stimuli significantly correlated with the hostile attribution bias. Compared to the control group, the CP group with low levels of CU-traits showed a significantly stronger association between the attention bias to angry stimuli and the hostile attribution bias. Conclusions: The current study provides evidence that boys with CP and high levels of CU-traits interpret ambiguous situations as more hostile than girls do. Our results further indicate that the interaction of attentional and attributional biases in children with CP might contribute to their increased aggressive behavior.
This study aimed to assess whether callous-unemotional traits (CU) are associated with deficits in emotion recognition independent of externalizing behavior, and whether such deficits can be explained by aberrant attention. As previous studies have produced inconsistent results, the current study included two different emotion recognition paradigms and assessed the potential influence of factors such as processing speed and attention. The study included N = 94 children (8 to 14 years) with an oversampling of children with conduct problems (CP) and varying levels of CU-traits. Independent of externalizing behavior, CU-traits were associated with slower recognition of angry, sad and fearful facial expressions but not with higher error rates. There was no evidence that the association between CU-traits and emotion processing could be explained by misguided attention. Our results indicate that in children with high levels of CU-traits, emotion recognition deficits depend on deficits in processing speed.
Spatial transposition tasks assess individuals' ability to represent nonvisible spatial object displacements. Several nonhuman mammal species have been tested on this task, including primates, cats, and dogs, but to date, great apes seem to be the only taxon that has repeatedly and consistently solved spatial transposition tasks. The authors investigated the ability of captive sloth and sun bears to solve spatial transposition tasks. Both species belong to the same taxonomic group as cats and dogs, but unlike them and similar to apes, they have an omnivorous diet that requires them to keep track of fruit sources in space and time. The bears were first tested on a visible displacement task, and those that succeeded were further tested on a spatial transposition task that involved a 180° transposition, followed by 2 tasks with two 360° transpositions. All 7 sloth bears and 7 out of 9 sun bears solved the visible displacement task. The 180° transposition task was solved by 6 out of 7 sloth bears and 1 out of the 5 tested sun bears. Three sloth bears were tested on all 4 experiments and even solved two chained 360° transpositions. Control conditions were conducted showing that the bears' performance did not rely on olfactory or auditory cues. The results provide the first indication that bears might be able to track invisible objects. Further studies will be necessary to confirm these results and to control for the influence of associative learning. The present study emphasizes the importance of including different animal species in the investigation of what underlies the evolution of different cognitive skills.