InflexionPoint Podcast: Cultivating Change from the Inside Out: Community Conversation: Blind Faith in AI or Inclusive Coding?
06/19/2024 03:00 pm PST
Blind Faith in AI or Inclusive Coding?
In this episode we delve into the groundbreaking work of two powerful and brilliant women in the field of Artificial Intelligence.
Dr. Joy Buolamwini's research explores the intersection of social impact, technology, and inclusion. She is the founder of the Algorithmic Justice League, a groundbreaking MIT researcher, a model and a poet of code. She is also the author of national bestseller Unmasking AI: My Mission to Protect What Is Human in a World of Machines and advises world leaders on preventing the harms of AI. Her MIT thesis methodology uncovered large racial and gender bias in AI services from companies like Microsoft, IBM and Amazon.
Dr. Cathy O’Neil is an American mathematician, data scientist, and author. She is the author of the New York Times best-seller Weapons of Math Destruction, and opinion columns in Bloomberg View. O'Neil was active in the Occupy movement. As a data skeptic she uncovers the dark secrets of big data, showing how our "objective" algorithms reinforce human bias. She believes a lot can go wrong when we put blind faith in big data.
AI in Society and Law Enforcement
"Unregulated and untested AI technologies have put innocent people at risk of being wrongly convicted." —Innocence Project: When Artificial Intelligence Gets It Wrong
The presumption of innocence is a legal principle that every person accused of any crime is considered innocent until proven guilty. AI-empowered law enforcement sometimes results in the presumption of guilt until proven innocent.
Episode giveaways:
- Ted Talk Dr. Joy Buolamwini: How I'm Fighting Bias in Algorithms https://www.ted.com/talks/joy_buolamwini_how_i_m_fighting_bias_in_algorithms
- Dr. Joy Buolamwini Netflix Documentary - Coded Bias https://www.netflix.com/title/81328723
- Dr. Joy Buolamwini Medium Article – InCoding: In The Beginning Was The Coded Gaze Whoever codes the system, embeds her views. A call for inclusive code https://medium.com/mit-media-lab/incoding-in-the-beginning-4e2a5c51a45d
- Dr. Cathy O’Neil Ted Talk: The Era of Blind Faith in Big Data Must End https://www.ted.com/talks/cathy_o_neil_the_era_of_blind_faith_in_big_data_must_end
- Dr. Cathy O’Neil Talks at Google: Weapons of Math Destruction https://youtu.be/TQHs8SA1qpk?si=kOq1DwzfKKwi_RUR
- Innocence Project: When Artificial Intelligence Gets It Wrong https://innocenceproject.org/when-artificial-intelligence-gets-it-wrong/
- Predictive policing algorithms are racist. They need to be dismantled. https://www.technologyreview.com/2020/07/17/1005396/predictive-policing-algorithms-racist-dismantled-machine-learning-bias-criminal-justice/
- Rise of the racist robots – how AI is learning all our worst impulses https://www.theguardian.com/inequality/2017/aug/08/rise-of-the-racist-robots-how-ai-is-learning-all-our-worst-impulses
HOST
Anita Russell M.Ed
InflexionPoint Podcast: Cultivating Change from the Inside Out Antiracism Activation through Courage, Conversation, Relationship, and Accountability 1st & 3...
CO-HOSTS
Mavis Bauman
InflexionPoint Podcast: Cultivating Change from the Inside Out Creating a Brave Space for Conversations about Personal Transformation, Racism, and Accountability...
Gail Hunter LCSW
InflexionPoint Podcast: Cultivating Change from the Inside Out Creating a Brave Space for Conversations about Personal Transformation, Racism, and Accountability...