I was given a screening of the Coded Bias documentary as a perk for attending the Target Lab event. Read my hot take on recruiting swag here!
Whether it’s a novel or a symphony, the first sentence or musical phrase has to “grab” my attention and pique my interest. Coded Bias did this flawlessly. While an amorphous, ever-evolving image of red light fills the screen, a computer-generated voice says: “Hello world! Can I just say that I am stoked to meet you? Humans are super cool. The more humans share with me, the more I learn.” The moment was oddly chilling and set the tone for the rest of the documentary.
The documentary opens with MIT researcher Joy Buolamwini explaining how the development of her “Aspire Mirror” led her to discover racial bias in artificial intelligence (AI) algorithms. Her mirror project was a really fun idea: overlay images of inspiring people’s faces on her own as a way to boost confidence. (After countless job rejections affecting my confidence, I could really use this mirror!) But the camera would not detect her face (Joy is a woman of color) unless she put on a white plastic mask. The dataset used to train the software consisted predominantly of white, male faces, so the system was unable to process darker-skinned, female faces.
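To make the mechanism concrete, here is a tiny, hypothetical sketch (the group labels, counts, and 20% threshold are my own invention, not from the film) of the kind of composition check that would have caught the imbalance Joy uncovered: count how often each demographic group appears in a training set and flag the under-represented ones.

```python
from collections import Counter

def audit_composition(samples, min_share=0.2):
    """Count each demographic label in the training data and
    flag any group whose share falls below min_share."""
    counts = Counter(samples)
    total = sum(counts.values())
    flagged = {group: n / total
               for group, n in counts.items()
               if n / total < min_share}
    return counts, flagged

# Invented training-set labels, skewed the way the film describes:
labels = (["lighter-skinned male"] * 80
          + ["lighter-skinned female"] * 10
          + ["darker-skinned male"] * 7
          + ["darker-skinned female"] * 3)

counts, flagged = audit_composition(labels)
for group, share in sorted(flagged.items()):
    print(f"under-represented: {group} ({share:.0%} of training data)")
```

A check this simple obviously can’t fix a model, but it makes the skew visible before the model is ever trained.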
Meredith Broussard, author of “Artificial Unintelligence,” explains this well: “everyone has unconscious bias and people embed their biases into technology.” Joy’s Aspire Mirror revealed that unconscious bias had crept into the algorithms driving facial-recognition technology. I’d like to believe that no one intentionally embeds biases into technology, because most people take great pride in their work, and a biased technology is flawed and therefore inherently sub-par. But I agree completely that our unconscious biases sneak in and greatly affect our work - and our lives. The consequences can be severe, even if our own privilege keeps us from experiencing them firsthand or from noticing them at all. That doesn’t make them any less real!
In addition to uncovering unconscious biases, algorithms reveal differences in cultural values. While China uses algorithms in the name of public safety and the public good, the United States uses them to drive consumerism, reflecting vast differences in what each society prioritizes and values. The United States, a country founded on democratic ideals, interestingly does not extend those ideals into the digital world. Algorithms are in everything from credit offers to college admissions and photo filters to social media ads! Yet these algorithms most often replicate the world as it is instead of actively trying to create a more equitable one. Our algorithms are a HUGE missed opportunity to drive social change and make social progress.
There is also great inequality in how new technologies are implemented. For example, surveillance usually shows up first in poorer neighborhoods and then gradually extends into more affluent areas. And those most affected by a technology are typically the least aware of it, of how it works, and of what it is doing to them. Cathy O’Neil, author of “Weapons of Math Destruction,” has an awesome proposal: an FDA for algorithms. The thought of additional governmental oversight can be off-putting, but in the absence of any regulation so far, our algorithms have become embedded with biases that actively hurt individuals. New technologies have also been deployed without much consideration of their implications. Someone or something needs to be in charge to create accountability and consequences.
After watching Coded Bias, I am feeling an overwhelming desire to go off the grid! As someone who loves technology - especially new technology that provides value to my life - this documentary was a cautionary tale not to be so immediately trusting. Many of us, myself at the top of the list, are too quick to jump onto new technologies in the name of convenience without considering their true costs. The documentary did an excellent job of making viewers think about the role technology plays in their lives, and it may even inspire them to re-evaluate their relationship with technology.
While at times extremely distressing, Coded Bias left me with a lot of hope. The documentary explained the issue in an entertaining way that anyone could understand. More importantly, it put the spotlight on the individuals and organizations working to raise awareness and combat inequality in our algorithms. These algorithmic warriors do not get enough spotlight, credit, or help in their mission! I was happy to see a movie that highlighted so many awesome data professionals, most of them women. Fewer than 14% of AI researchers are women, so it was interesting - but not surprising - that it is mostly women leading this charge for data equality. Will you join them? I will!
Coded Bias left me motivated to do what I can to reduce bias in my own data work. It seems like an overwhelming task, especially since so many algorithms already have biases deeply built in and I am just starting out on my career journey. Still, the first steps in the right direction are simple:
Be aware of the issue
Spread awareness of the issue
Make sure training sets are diverse
Build diverse teams
Ask someone different from you to review your code, your project, your work, etc.
Audit your code for accuracy
Have diverse individuals test your product
Create a system of checks and balances in your workplace
Ask if the pros outweigh the cons
Keep the human element in data!
Push for fair regulation and laws
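A few of the steps above - auditing your code for accuracy and testing with diverse individuals - can even be sketched in code. This is a minimal, hypothetical example (the group names, counts, and the 5% gap threshold are all invented for illustration; real fairness audits use richer metrics): compare a model’s accuracy across demographic groups and flag a large disparity for review.

```python
def accuracy_by_group(records):
    """records: list of (group, predicted, actual) tuples.
    Returns accuracy per demographic group."""
    hits, totals = {}, {}
    for group, predicted, actual in records:
        totals[group] = totals.get(group, 0) + 1
        hits[group] = hits.get(group, 0) + (predicted == actual)
    return {g: hits[g] / totals[g] for g in totals}

def flag_disparity(accuracies, max_gap=0.05):
    """Return (needs_review, gap): True if the best- and
    worst-served groups differ by more than max_gap."""
    gap = max(accuracies.values()) - min(accuracies.values())
    return gap > max_gap, gap

# Invented evaluation results for illustration:
records = ([("lighter-skinned", 1, 1)] * 95 + [("lighter-skinned", 0, 1)] * 5
           + [("darker-skinned", 1, 1)] * 65 + [("darker-skinned", 0, 1)] * 35)

acc = accuracy_by_group(records)
biased, gap = flag_disparity(acc)
print(acc, "gap:", round(gap, 2), "needs review:", biased)
```

The point isn’t the specific numbers - it’s that measuring per-group performance at all, instead of one aggregate accuracy score, is what makes this kind of bias visible.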
Have you watched "Coded Bias" yet? Leave your review in the comments!