Previous research has shown that algorithmic decisions can reflect gender bias. The increasingly widespread use of algorithms in critical decision-making domains (e.g., healthcare or hiring) can thus lead to broad and structural disadvantages for women. However, women often experience bias and discrimination through human decisions and may turn to algorithms in the hope of receiving neutral and objective evaluations. Across three studies (N = 1107), we examine whether women’s receptivity to algorithms is affected by situations in which they believe that their gender identity might disadvantage them in an evaluation process. In Study 1, we establish, in an incentive-compatible online setting, that unemployed women are more likely to choose to have their employment chances evaluated by an algorithm if the alternative is an evaluation by a man rather than by a woman. Study 2 generalizes this effect by placing it in a hypothetical hiring context, and Study 3 proposes that relative algorithmic objectivity, i.e., the perceived objectivity of an algorithmic evaluator compared with a human evaluator, drives women’s preferences for evaluations by algorithms rather than by men. Our work sheds light on how women make sense of algorithms in stereotype-relevant domains and underscores the need to educate those at risk of being adversely affected by algorithmic decisions. Our results have implications for the ethical management of algorithms in evaluation settings. We advocate for improving algorithmic literacy so that evaluators and evaluatees (e.g., hiring managers and job applicants) can acquire the abilities required to reflect critically on algorithmic decisions.
The sharing economy has disrupted the economies of several countries. Global champions such as Airbnb and Uber use similar models and platforms across many countries. However, each country and its consumers have different characteristics, including the language used. The texts in the profiles of those offering their properties in England (in English) and in Germany (in German) are compared to explore whether trust is built, and privacy concerns are reduced, in the same way. The landlords use six methods of building trust: (1) the level of formality, (2) distance and proximity, (3) emotiveness and humor, (4) being assertive and passive-aggressive, (5) conformity to the platform's language style and terminology, and (6) setting boundaries. Privacy concerns are not usually reduced directly, as this is left to the platform. The findings indicate that language has a limited influence and the