In a time of societal acrimony, psychological scientists have turned to a possible antidote — intellectual humility. Interest in intellectual humility comes from diverse research areas, including researchers studying leadership and organizational behaviour, personality science, positive psychology, judgement and decision-making, education, culture, and intergroup and interpersonal relationships. In this Review, we synthesize empirical approaches to the study of intellectual humility. We critically examine diverse approaches to defining and measuring intellectual humility and identify the common element: a meta-cognitive ability to recognize the limitations of one’s beliefs and knowledge. After reviewing the validity of different measurement approaches, we highlight factors that influence intellectual humility, from relationship security to social coordination. Furthermore, we review empirical evidence concerning the benefits and drawbacks of intellectual humility for personal decision-making, interpersonal relationships, scientific enterprise and society writ large. We conclude by outlining initial attempts to boost intellectual humility, foreshadowing possible scalable interventions that can turn intellectual humility into a core interpersonal, institutional and cultural value.
Although they normatively should, people do not revise their beliefs more in response to expert (economist) opinion than to lay opinion. The present research sought to better understand the factors that make an individual more likely to change their mind when faced with the opinions of expert economists versus the general public. Across five studies we examined the role that overestimation of knowledge plays in this behavior. We replicated the finding that people fail to privilege the opinion of experts over that of the public across two different (Study 1) and five different (Study 5) economic issues. We further found that undermining an illusion of both topic-relevant (Studies 2–4) and topic-irrelevant knowledge (Studies 3 and 4) led to greater normative belief revision in response to expert rather than lay opinion. We suggest that one reason people fail to revise their beliefs more in response to experts is that they think they know more than they really do.
Across two experiments (N = 799) we demonstrate that people's use of quantitative information (e.g., base rates) when making a judgment varies as the causal link of qualitative information (e.g., stereotypes) changes. That is, when a clear causal link for stereotypes is provided, people make judgments that are far more in line with them. When the causal link is heavily diminished, people readily incorporate non-causal base rates into their judgments instead. We suggest that people use and integrate all of the information that is provided to them to make judgments, but heavily prioritize information that is causal in nature. Further, people are sensitive to the underlying causal structures in their environment and adapt their decision making accordingly.

Keywords: base-rate neglect; causality; stereotypes; heuristics and biases; probabilistic reasoning

In a study 1000 people were tested. Among the participants there were 5 engineers and 995 lawyers. Jack is a randomly chosen participant of this study.
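As a minimal sketch of the normative (Bayesian) calculation behind the engineer/lawyer vignette above: the base rates come from the vignette itself, while the likelihoods for how diagnostic a stereotype description is are hypothetical values chosen purely for illustration.

```python
# Bayesian posterior for the engineer/lawyer vignette.
# Base rates are from the vignette (5 engineers, 995 lawyers);
# the description likelihoods below are hypothetical.

def posterior_engineer(p_engineer, p_desc_given_eng, p_desc_given_law):
    """P(engineer | description) via Bayes' rule."""
    p_lawyer = 1 - p_engineer
    numerator = p_desc_given_eng * p_engineer
    denominator = numerator + p_desc_given_law * p_lawyer
    return numerator / denominator

base_rate = 5 / 1000  # 5 engineers among 1000 participants

# Even a strongly "engineer-like" description (assumed here to be
# 10x more likely for engineers) leaves the posterior low when the
# base rate is this extreme.
print(posterior_engineer(base_rate, 0.9, 0.09))  # ~0.048
```

The point of the sketch is that with a 0.5% base rate, even highly diagnostic stereotype information cannot normatively push the posterior anywhere near the high probability people typically report, which is what makes the vignette a test of base-rate neglect.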