Brown Don't Frown Podcast

Season 3: Ep 26 - The power of Artificial Intelligence, pandemic data sharing, and gender disaggregated data


Episode notes

*Apologies in advance for the slightly off sound quality; we hope you still find the content meaningful.*

What is Artificial Intelligence? What role can it play in ensuring that information is factually correct (especially during a pandemic) and that data is gender-disaggregated? Today I’m joined by our first honorary male guest, Freddie Kalaitzis, Senior Research Fellow in Machine Learning at the University of Oxford. We know that AI is modelled on human behaviour, and given that humans are innately biased, can we trust AI to be unbiased and neutral?

Freddie also talks about his work with Amnesty International on ‘Troll Patrol’, which investigated the extent of abuse against women online. One statistic that stood out: women of colour were 34% more likely than white women to be mentioned in abusive tweets, and Black women were particularly affected, being 84% more likely than white women to be mentioned in abusive tweets.

Globally, public health policies have not addressed the gendered impacts of pandemics. Campaigners have argued that this research gap stems from a lack of gender-disaggregated data. Emerging evidence suggests that more men are dying of COVID-19, potentially due to gendered immunological or environmental differences, such as exposure to pollutants from working outdoors or from smoking. Meanwhile, 89% of nurses in the UK are women, underlining the gendered nature of the healthcare workforce and the risks that women face.

I ask Freddie whether AI can drive information-gathering techniques that simplify and enable the collection of sex-disaggregated data, and what solutions AI might offer in response to COVID-19. At a time when we are saturated with news and information at our fingertips, AI could play a crucial role in curbing the spread of misinformation. Finally, Freddie reminds us why we should be mindful that correlation doesn’t always mean causation.