“Fixing” Algorithm Biases

Rachel Thomas’ article, “When Data Science Destabilizes Democracy and Facilitates Genocide,” highlights a data scientist’s responsibility for the algorithms they create and the biases those algorithms inherit. She focuses on Facebook, explaining how its function as a news source often feeds users inaccurate or fabricated stories and leads those with no other sources of news further and further down the “rabbit hole” of conspiracy-minded groups. Apart from recommending more diversity on the teams building these products, and discouraging readers from seeing themselves as “gender-blind” and “color-blind,” Thomas doesn’t answer the question of how to go about creating algorithms that lack bias.

Allie Nicodemo argues in her article, “How to Create Unbiased Algorithms in a Biased Society,” that putting “computer scientists into conversation with ethicists, philosophers, and others from fields” might contribute to the “fairness” of the algorithms we encounter on a daily basis. She goes on to quote Tina Eliassi-Rad, a professor of computer science at Northeastern University, who stated, “we need to work with other disciplines that have spent decades studying what is fair and what is just to come up with different definitions of fairness.” Despite these suggestions, I still wonder whether algorithms will always carry some bias simply because they are designed by humans, who will always possess at least a sliver of bias regardless of how determined they are to be “blind” to things they themselves don’t experience. Along the same lines, does the solution lie in creating new, less biased algorithms rather than fixing existing ones? In her New York Times article, “When Algorithms Discriminate,” Claire Cain Miller quotes Deirdre Mulligan, who stated that “there’s a huge rush to innovate… a desire to release early and often — and then do cleanup.” Ultimately, I’m left with the impression that, as long as humans are involved, computer algorithms will contain at least some amount of bias, whether intentional or not.

Works Cited:

Miller, Claire Cain. “When Algorithms Discriminate.” The New York Times, 9 July 2015, www.nytimes.com/2015/07/10/upshot/when-algorithms-discriminate.html.

Nicodemo, Allie. “How to Create Unbiased Algorithms in a Biased Society.” News @ Northeastern, 4 Dec. 2017, news.northeastern.edu/2017/12/04/how-to-create-unbiased-algorithms-in-a-biased-society/.

Thomas, Rachel. “When Data Science Destabilizes Democracy and Facilitates Genocide.” Fast.ai: Making Neural Nets Uncool Again, 2 Nov. 2017, www.fast.ai/2017/11/02/ethics/.
