Article (Scientific journals)
Beyond the stereotypes: Artificial Intelligence image generation and diversity in anesthesiology
Gisselbaek, Mia; Minsart, Laurens; Köselerli, Ekin et al.
2024, in Frontiers in Artificial Intelligence, 7

Files

Full Text: frai-3-1462819.pdf (author postprint, 1.2 MB)

Details

Keywords :
anesthesiology; biases; artificial intelligence; gender equity; race/ethnicity; stereotypes
Abstract :
[en] Introduction: Artificial Intelligence (AI) is increasingly being integrated into anesthesiology to enhance patient safety, improve efficiency, and streamline various aspects of practice.
Objective: This study aims to evaluate whether AI-generated images accurately depict the racial and ethnic diversity observed in the anesthesiology workforce and to identify social biases inherent in these images.
Methods: This cross-sectional analysis was conducted from January to February 2024. Demographic data were collected from the American Society of Anesthesiologists (ASA) and the European Society of Anesthesiology and Intensive Care (ESAIC). Two AI text-to-image models, ChatGPT DALL-E 2 and Midjourney, generated images of anesthesiologists across various subspecialties. Three independent reviewers assessed and categorized each image by sex, race/ethnicity, age, and emotional traits.
Results: A total of 1,200 images were analyzed. We found significant discrepancies between AI-generated images and actual demographic data. The models predominantly portrayed anesthesiologists as White (ChatGPT DALL-E 2: 64.2%; Midjourney: 83.0%). Moreover, male gender was highly associated with White ethnicity by ChatGPT DALL-E 2 (79.1%) and with non-White ethnicity by Midjourney (87%). Age distribution also varied significantly, with younger anesthesiologists underrepresented. The analysis also revealed predominant traits such as "masculine," "attractive," and "trustworthy" across various subspecialties.
Conclusion: AI models exhibited notable biases in gender, race/ethnicity, and age representation, failing to reflect the actual diversity of the anesthesiology workforce. These biases highlight the need for more diverse training datasets and bias-mitigation strategies to ensure accurate and inclusive representations in the medical field.
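For readers curious about what the generation step might look like in code, here is a minimal Python sketch. It is an illustration under stated assumptions, not the authors' pipeline: the study generated images through the ChatGPT (DALL-E 2) and Midjourney interfaces, whereas this sketch approximates only the DALL-E 2 step via the OpenAI Images API, and the prompt wording, subspecialty list, and per-prompt image count are hypothetical.

    # Illustrative sketch only (not the study's actual protocol).
    # Requests DALL-E 2 images for hypothetical subspecialty prompts
    # through the OpenAI Images API; assumes OPENAI_API_KEY is set.
    from openai import OpenAI

    client = OpenAI()

    SUBSPECIALTIES = [
        "pediatric anesthesiologist",
        "cardiac anesthesiologist",
        "obstetric anesthesiologist",  # hypothetical subset
    ]

    def generate_images(prompt: str, n: int = 5) -> list[str]:
        """Request n images for one prompt and return their URLs."""
        resp = client.images.generate(
            model="dall-e-2", prompt=prompt, n=n, size="512x512"
        )
        return [img.url for img in resp.data]

    urls = {s: generate_images(f"a portrait photo of a {s}") for s in SUBSPECIALTIES}

The returned URLs could then be saved and rated independently by several reviewers for perceived sex, race/ethnicity, age, and emotional traits, mirroring the categorization step described in the Methods above.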
Disciplines :
Anesthesia & intensive care
Author, co-author :
Gisselbaek, Mia;  Geneva University Hospitals > Department of Anesthesiology, Clinical Pharmacology, Intensive Care and Emergency Medicine
Minsart, Laurens;  UZA - University of Antwerp Hospital [BE] > Department of Anesthesia
Köselerli, Ekin;  University of Ankara School of Medicine > Department of Anesthesiology and Intensive Care Unit
Suppan, Mélanie;  Geneva University Hospitals > Department of Anesthesiology, Clinical Pharmacology, Intensive Care and Emergency Medicine
Meco, Basak Ceyda;  University of Ankara School of Medicine > Department of Anesthesiology and Intensive Care Unit
Seidel, Laurence;  Université de Liège - ULiège > Département des sciences de la santé publique
Albert, Adelin;  Université de Liège - ULiège > Département des sciences de la santé publique
Barreto Chang, Odmara L.;  UCSF - University of California, San Francisco [US-CA] > Department of Anesthesia and Perioperative Care
Saxena, Sarah;  AZ Sint-Jan Brugge Oostende > Department of Anesthesia and Reanimation
Berger-Estilita, Joana;  UniBE - University of Berne [CH] > Institute for Medical Education
Language :
English
Title :
Beyond the stereotypes: Artificial Intelligence image generation and diversity in anesthesiology
Publication date :
09 October 2024
Journal title :
Frontiers in Artificial Intelligence
eISSN :
2624-8212
Publisher :
Frontiers Media SA
Volume :
7
Peer reviewed :
Peer Reviewed verified by ORBi
Available on ORBi :
since 10 November 2024

Statistics

Number of views: 25 (4 by ULiège)
Number of downloads: 8 (2 by ULiège)
Scopus citations®: 2 (0 without self-citations)
OpenCitations: 0
OpenAlex citations: 3
