Affinity Workshop: Queer in AI

Que(e)rying the Use of Artificial Intelligence for Infectious Disease Surveillance: How to Ensure New Tools Do Not Perpetuate a Long History of Health Disparities Affecting LGBTQI+ Populations

Elise Racine

Keywords: [ algorithmic bias ] [ artificial intelligence ] [ queer studies ] [ infectious disease ethics ]


From HIV/AIDS to COVID-19 to monkeypox, outbreaks have disproportionately affected LGBTQI+ communities. Evidence suggests that the likelihood of pandemics will only continue to increase, a possibility that highlights our need for better tools. By enabling the robust, efficient, and timely analysis of vast amounts of data, artificial intelligence (AI) has the potential to help decision-makers better respond to, manage, and even prevent infectious disease outbreaks (Malik et al., 2021; Wong, Zhou, and Zhang, 2019). This could, ultimately, reduce harm, disruption, and the loss of human life. However, AI also has a history of intensifying and perpetuating major inequities, including anti-queer bias. Given these disproportionate effects and epidemiology's oppressive roots, we must be particularly thoughtful as we apply algorithmic systems to infectious disease surveillance. Drawing from queer theory, critical race theory, and critical feminist studies, this project adopts an intersectional and reparative approach to studying the use of AI for such purposes. In doing so, it engages the audience in a dynamic give-and-take to explore how we can achieve algorithmic justice for LGBTQI+ people in the face of these emerging AI-enabled tools.