2019
DOI: 10.1111/hypa.12483

Gender Bias in Medical Implant Design and Use: A Type of Moral Aggregation Problem?

Abstract: In this article, I describe how gender bias can affect the design, testing, clinical trials, regulatory approval, and clinical use of implantable devices. I argue that bad outcomes experienced by women patients are a cumulative consequence of small biases and inattention at various points of the design, testing, and regulatory process. However, specific instances of inattention and bias can be difficult to identify, and risks are difficult to predict. This means that even if systematic gender bias in implant design…

Cited by 18 publications (18 citation statements)
References 60 publications

“…30 Elsewhere I have identified three features of aggregative harms: their systemic nature, the (relative) invisibility of the forces that give rise to them, and the expedience of practices that support them. 29 All three factors are present in the case of gender biases affecting women surgeons. The instances of subtle gender bias are often invisible to perpetrators, such as patients who do not realise that they have misidentified the senior surgeon in the room.…”
Section: Discussion (mentioning)
confidence: 99%
“…Addressing aggregative harm requires understanding the functioning of the system from which it emerges, making the harms and the way they aggregate visible, and challenging expedient practices that support them. 29 In this case, the process of making visible requires research aimed at identifying small or unexpected sources of bias (like this study, or the study by Liang et al.). 27 Understanding the role that expedient processes play in aggregative harm can help support the case for replacing cheap or fast processes with less efficient but fairer ones.…”
Section: Feature Article (mentioning)
confidence: 99%
“…With an awareness of the disproportionate harm devices have caused in women [13] and the more frequent dismissal of female post-operative symptomatology, Health Canada established the Scientific Advisory Committee on Health Products for Women. This was an important first step that may eventually eliminate gender bias in medical device development and use.…”
Section: A Safer and More Inclusive Approach: Making Women Visible (mentioning)
confidence: 99%
“…Press releases did not, however, mention women’s greater risk. Either manufacturers and Health Canada did not think about women’s anatomy, biology, hormones and, in particular, gait, all of which predisposed to disproportionate failure and complications, or did not deem research evidence of disproportionate harm worth considering [13].…”
Section: Introduction (mentioning)
confidence: 99%
“…And theorists like Michel Foucault (1975/1995) and Henri Lefebvre (1974/1991) have developed theories about the nature and status of classist things. There are also sexist things - found in domains that range from transportation to medicine - that play a critical role in the ecology of gender oppression (Criado-Perez, 2019; D’Ignazio & Klein, 2020; Hutchison, 2019). And there are also heterosexist things (Ahmed, 2006).…”
Section: Oppressive Things and Psychological and Algorithmic Biases (mentioning)
confidence: 99%