As companies race to understand and maximize the financial potential of Artificial Intelligence, facial recognition technology has quickly become a key component of that research. It is already a prevalent part of our daily lives: we use it to unlock our iPhones and to apply funny filters on Snapchat. In her thought-provoking documentary Coded Bias, director Shalini Kantayya makes a compelling case for why we should not be so eager to embrace this technology.
Inspired by the work of Joy Buolamwini, an MIT PhD candidate, Kantayya’s film exposes the bias built into the technologies that consume our daily lives. While working on her idea for an Aspire Mirror—a mirror that could map another face, say Serena Williams’s, onto one’s reflection—Buolamwini discovered that the facial recognition program she used could not detect her own face. A Black woman, Buolamwini tested the tool again while wearing a generic white mask—similar to the ones the Jabbawockeez dance crew wear—and was stunned to discover that the software recognized the mask as a legitimate face.
Buolamwini went on to conduct a study of facial recognition programs powered by A.I., and what she uncovered was a “coded bias”. Because A.I. systems learn from the data they are trained on, and most of the major tech companies have white men in key programming positions, the programmers used people who resembled themselves as the main data reference points. This meant that lighter-skinned faces were far more likely to be detected than darker-skinned ones, and that men’s faces were recognized more reliably than women’s.
You will always be viewed as the gold standard if you constantly cast yourself as the hero of every story.
What makes these race- and gender-based biases so disturbing is that these faulty algorithms are in wide use today. This “coded bias” impacts everything from the facial recognition programs currently being tested by police on the streets of the U.K. to the tools that decide who can get a line of credit at a bank. Worst of all, the nine leading companies in A.I. advancement—including Facebook, Amazon, Google and IBM—do not fully understand the conclusions their algorithms reach or the consequences of technological bias.
Imagine walking down the street and being suddenly arrested because a camera mistakenly identifies you as a wanted criminal; being a woman denied an interview because the résumé-screening tool only approves male applicants; or getting fired because a predictive program deems you unfit without explanation. Kantayya’s film shows that these events are occurring right now. What was once considered science fiction is now an unjust reality.
As Coded Bias notes, we often look towards China’s surveillance state when thinking of rights being violated by facial recognition technology. However, A.I. is already collecting data about every aspect of our online lives. Every website we visit, every social media post we like and every image we upload is compiled by corporations without our approval. Fortunately, Coded Bias shows that there are women like Buolamwini, Research Fellow Deborah Raji, data journalist Meredith Broussard, mathematician Cathy O’Neil and Big Brother Watch director Silkie Carlo, to name a few, raising red flags about the biases and dangers of this technology.
While Coded Bias does a great job of putting these women at the forefront, it occasionally bites off more than it can properly chew. The film touches on racism, capitalism, sexism, police brutality, the financial crisis and much more, and a narrower focus would have enhanced its overall impact. Whether Coded Bias can force the massive tech industry to change its practices remains to be seen. However, it will make you think twice about who is controlling the data we unknowingly give away.
Coded Bias is available to stream until June 24.