Facebook, Polarization, and the Technological Imaginary
Of all the shocking information packed into Rachel Thomas’s “When Data Science Destabilizes Democracy and Facilitates Genocide,” I was most intrigued by Facebook’s claim that its platform is value-neutral. Of course, this claim brought me back to the first week of class, in which we discussed the technological imaginary, particularly the contrast between the assumption that technology is neutral and the reality that it promotes certain values above others.
Thomas linked to an enormous number of other sources in her article, so I read the one she cited when discussing Facebook’s claim of neutrality. The article, “The False Dream of a Neutral Facebook,” was a very interesting read. The author, Alexis C. Madrigal, discusses the way Facebook’s algorithm is designed solely to increase engagement, explaining that “The goals of News Feed have nothing to do with democracy.” The goal of Facebook, of course, is to make money. It is, after all, a business, and it makes money through advertising. The more time users spend engaging with the site, the more advertisements they see, so Facebook is designed to promote engagement. That structure is widely known, but its implications for the way we share and consume news are significant.
In thinking about the spread of news on Facebook, I was reminded of a YouTube video entitled “Some Good News: 16 Ways 2016 Is Not a Total Dumpster Fire.” I recommend the video, as it contains some very encouraging information about the state of the world, but its first few lines are what really interest me. The vlogger, John Green, explains that we tend to hear more about bad news because it usually happens suddenly and dramatically, while we hear less about good news, which tends to happen gradually and therefore isn’t really front-page material.
This idea led me to reflect on how we share news on Facebook. Even more than conventional journalism, Facebook tends toward the most shocking, divisive stories, things that, as Green identifies, happen dramatically and suddenly. A piece on, for example, ways to strengthen community health centers in developing nations and reduce infant mortality is not enraging, despite being important, and is therefore far less likely to be shared. Anger and outrage make us more likely to share, which increases our engagement, which is exactly what Facebook wants. Therefore, while I don’t believe Facebook has a political agenda that can neatly be identified as liberal or conservative, it does have an agenda to promote the most enraging, divisive stories. Anger means sharing. Sharing means engagement. Engagement means money for Facebook. I don’t work behind the scenes at Facebook, so I can’t say definitively that its algorithm is designed to spread this sort of news, but the incentive is certainly there.
I don’t mean to say that there aren’t many things in our current political landscape about which to be angry. Anger proves again and again to be a powerful catalyst for change. However, when we start thinking about how Russian trolls can contribute to a more divisive political climate or spread completely inaccurate information, we have to question Facebook’s alleged neutrality. Facebook is far more concerned with its bottom line than with improving political discourse, and it’s important not to lose sight of that, especially if we rely on it as a news source.