Artificial Intelligence (AI) technologies have the potential to dramatically affect the lives and life chances of people with disabilities, both when seeking employment and throughout their career progression. While these systems are marketed as highly capable and objective decision-making tools, a growing body of research documents inaccurate results as well as inherent disadvantages for women and people of colour (Broussard, 2018; Noble, 2018; O'Neil, 2017). Fairness assessments of Recruitment AI for people with disabilities have so far received little attention or have been overlooked entirely (Guo et al., 2019; Petrick, 2015; Trewin, 2018; Trewin et al., 2019; Whittaker et al., 2019). This white paper details the impacts of AI recruitment systems on disabled employment seekers and the concerns those systems raise, and provides recommendations on the steps employers can take to ensure that innovation in recruitment is also fair to all users. In doing so, we argue that making systems fairer for disabled employment seekers makes them fairer for all.