★★★★ | Bigger brother


While studying at MIT, Joy Buolamwini noticed one day that her facial recognition software did not recognize her face. Odd, considering it has only one job to do, but perhaps not insurmountable. So Buolamwini did what every engineer would do: she began to troubleshoot the machine.

It wasn’t long before she figured out that the software would not recognize her black features unless she was wearing a white mask. This raised more questions than it answered. How widespread is the issue? If it’s already encoded in facial recognition technology, where else could it be found?

CODED BIAS, directed by Shalini Kantayya and still playing at HIFF this week, seeks to answer some of these questions, even as it is limited by time and the surprising scope of the problem. By the time CODED BIAS ends, it has opened the floodgates to countless hours of discussion on not just the legality but the morality of the interconnected world around us.

The major issue, Kantayya’s documentary finds, is that most of today’s technology is created by white men. That has been true since the founding of AI research in America, meaning that the foundational alphabet of AI and technology research was written by a group that only knew how to code for themselves. With nothing to change the course of development through the decades, technology itself has been molded in their image.

That realization leads to the larger topic at hand: an unequal society can only breed an even more unequal one, a premise that underpins some of the most important documentaries of today. Branching out from facial recognition into the algorithms that define our daily lives, the targeting software used in conjunction with AI, and even the connection between corporations and media that shapes our relationship with technology, CODED BIAS takes a big swing at a topic much bigger than a single documentary.

Luckily, Kantayya keeps the focus straight and true at every turn, making the feature easy to follow even when it’s hard to digest. Allowing experts in their fields to do the talking, Kantayya steers clear of sensationalism and lets the facts speak for themselves. Some of the B-roll is a little heavy-handed, but it’s also necessary to reach even the cheapest of seats.

An interesting sidetrack sees Kantayya showcase the differences between China and the United States when it comes to social engineering through technology. While American politicians seem happy to pretend that China is the Orwellian nightmare to be avoided, others feel the country is no different from the West in this regard. Only more transparent.

As we follow Silkie Carlo, the head of the Big Brother Watch organization, that declaration feels all too real. The Watch, created in response to the growing number of surveillance cameras in London, tracks the actions of the Metropolitan Police and their use of AI-guided facial recognition software that scans people without warning or consent.

In one of numerous infuriating scenes, a young black man is pulled aside by plainclothes officers because the algorithm has flagged him as a potential terror threat. Naturally, he isn’t one, but the AI doesn’t care, and neither do the cops. Even as Carlo and her associates point out that the system skews against minorities in every regard, it continues to operate as usual.

The most fascinating aspect, for me as a white man in the age group this technology caters to, is seeing just how horribly narrow the target audience is. Anyone outside the immediate realm of pale masculinity is already on unstable ground. Add any sort of minority or transgender status to that and you might as well not exist. Everything we take for granted is kept well away from over half the planet.

In science fiction, we portray technology as an either-or scenario. It’s always either the benevolent all-good magic trick that makes life perfect for everyone, or the antagonist of our nightmares that ruins everything by superseding mankind. In reality, Kantayya and her colleagues argue, it’s just a mirror reflecting our very worst tendencies back at us. 

Take, for example, Tay, the artificial intelligence chatbot that Microsoft proudly released onto the web. It took less than a day on social media for it to learn our worst ways, becoming a vile, racist Nazi before our eyes. Tech companies like to pretend that technology is dispassionate, but it is coded and created by humans who most certainly are not. Our hardcoded biases, our mistrust, and the systemic racism that everyone lives with are therefore always part of the equation.

Which is what makes CODED BIAS so important and necessary. It doesn’t paint a doomsday scenario out of something that genuinely sounds like one. Instead, it focuses on fixing the problem in the way that smart people, when presented with an issue, are prone to do. Buolamwini creates the Algorithmic Justice League, Carlo takes her fight to the streets, and STEM experts, led by Buolamwini, take their case to the Capitol, where a new generation of lawmakers (AOC among them) finally hears them out.

The documentary doesn’t really have a conclusion, because no definitive answer exists. There is only a moment where the rivers split and we reach a turning point. IBM answers the call and begins work on a more inclusive face-detection algorithm. A discriminatory addition to the security system of a working-class neighborhood is revoked. Things look, even if only for a moment, better.

But Buolamwini and friends remain cautious. The relationship between tech and money is too strong a bond to break. “It will be rebranded,” one of them sighs. That’s how these things always go.

But for the first time, we have people out there protecting us. A real-life Justice League built from smart and eloquent people who can make a difference. 

That’s a better superhero origin story than anything else out there.

By Joonatan Itkonen

Joonatan is an AuDHD writer from Helsinki, Finland. He specializes in writing for and about games, films, and comics. You can find his work online, in print, on the radio, and in books and games around the world. Toisto is his home base, where he feels comfortable writing about himself in the third person.
