Coded Bias

Dr. Timnit Gebru, a former co-leader of Google’s Ethical Artificial Intelligence team, recently made headlines after being fired. The cause: a dispute over a paper she co-authored on the biases built into today’s artificial intelligence systems, along with her outspoken criticism of Google’s approach to minority hiring. Hers is just one of many accounts of the tech industry’s storied diversity problem, which seeps into machine learning and perpetuates class, race, and gender bias. Filmmaker Shalini Kantayya addresses this in Coded Bias, her latest documentary, which premiered at the 2020 Sundance Film Festival.

“My fascination with science fiction inspires me to understand what is real, what may be possible someday, and what is sheer magical thinking about the future,” Kantayya told Prism. Adding to the list of films that unearth the pitfalls of Big Tech (The Social Dilemma, The Great Hack), Coded Bias addresses its subject by focusing on the human toll.

The story begins with the woman who sparked the entire film: Joy Buolamwini, a self-described poet of code who uses art and research to illuminate the social implications of artificial intelligence. She is also the founder of the Algorithmic Justice League, an organization with a mission to create a world with more equitable and accountable technology. She currently serves on the Global Tech Panel convened by the vice president of the European Commission to advise world leaders and technology executives on ways to reduce the harms of AI.

As an MIT Media Lab researcher, Buolamwini discovered that facial recognition software couldn’t detect her face—until she put on a white mask. Her finding has led to local, congressional, and international hearings on the abuse of facial recognition software deployed at scale.

“Most people creating AI and the data being used do not reflect the vast majority of the world. As a result, there are major blind spots in the technology being created,” Buolamwini told Wired. “Full-spectrum inclusion is about asking who’s missing during the design, development, and deployment of AI systems.” 

Kantayya first learned about Buolamwini’s work as a TED fellow, when she tuned into the 2016 TED Talk Buolamwini gave on her alarming facial recognition discovery. There, the two met and began to dig into all the ways this software was being used.

Kantayya strings together local and international stories with sobering details. A teacher in Houston, Texas, recounts receiving an arbitrarily poor algorithmic evaluation despite years of experience and awards. A watchdog group in London challenges police use of AI-based closed-circuit TV cameras that often misidentify and racially profile pedestrians. Residents of a Brooklyn housing complex try to prevent their landlord from installing a biometric security system akin to those used in maximum-security prisons.

Following the resurgence of the Black Lives Matter movement this summer after the killing of George Floyd by Minneapolis police, Amazon issued a one-year moratorium on police use of its facial recognition technology. IBM halted its facial recognition development altogether, and Microsoft paused police sales of its system until federal regulations are in place. While these are hefty steps toward accountability, they’re not nearly enough. Buolamwini used her platform to call on tech companies to donate $1 million each to organizations like Data for Black Lives and Black in AI that advance racial justice in the tech sector.

“The last thing we need is for this presumption of guilt of people of color, of Black people, to be confirmed erroneously through an algorithm,” Buolamwini told Fast Company in her August 2020 cover story. 

In late 2018, in partnership with the Georgetown Law Center on Privacy and Technology, Buolamwini launched the Safe Face Pledge, the first agreement of its kind prohibiting pernicious applications of facial analysis and recognition technology.

“It’s really my hope that the film will kind of level the playing field, because knowledge is power, and I think so many of us have never felt able to question these systems that we interact with,” Kantayya said. “It’s been sort of this big magic. I think we gotta pull the curtain back on The Wizard of Oz and see this small group of white men who are making these technologies,” she continued.

Ultimately, algorithmic justice will require raising public awareness and bringing more diverse voices to the table while these technologies are created and deployed at scale. Kantayya and Buolamwini are just beginning this work.

Watch Coded Bias in virtual cinemas

Priscilla Ward is a Washington, D.C.-based writer, running enthusiast, and music lover, currently dreaming of her next international travel destination post-quarantine. She’s also the founder of BLCKNLIT.