The Digital Reckoning: Filtering for Neoliberal Conformity

Even though this article was published only six years ago, how far technology has advanced is evident in the outdated supporting evidence Rettberg provides (like Facebook not allowing you to post gifs) (23). Something that has remained constant, and that I believe has actually grown into a larger part of our virtual-social realities, is Marwick’s position that social media platforms filter content based on how well the poster practices and presents their digital footprint as that of the “effective neoliberal subject” (23). The players in capitalism’s ever-evolving digital playground hold stakes in technological filters conforming to and promoting cultural filters—the ones that tell you to be thinner, have “clearer” skin, cover up with this makeup brand, introduce your body with these garments. Capitalism thrives off of social media filters’ implicit messages about who and what count as acceptable, “good” content and how they arrived at that point of “good” and therefore desirable—via the very products that encourage our cultural/physical/mental filtering of self, a form of self-policing to adhere to the neoliberal state’s expectations.

Even though I recognize this each time I open Instagram, I still find myself following influencers who may have knowledge (or products) for “self-improvement”; I still find myself interacting with the ads Instagram gives me, even though I’m aware of the filters that have designed this content promotion to target me and my perceived interests specifically. What does resistance to the neoliberal order look like when we are increasingly dependent (especially at a time when, for those of us privileged enough, our lives have literally become “remote” and “digital”) upon the internet, social feeds, and apps to maintain relationships, personal and professional, and to conduct the work necessary for our survival or circumstances (i.e., to get your paycheck, to continue school because you’ve already taken out the loans, etc.)? At a time of public crisis and collapse, with capitalist structures’ shortcomings evidenced all around us, what does our dependence upon “essential workers” (grocery store workers, garbage collectors, water service folk . . .) who cannot operate in a remote/digital capacity say about the limitations or necessary exclusions that the neoliberal order facilitates? And what does it say when we consider the negative narratives and rhetorics we’d built around such essential occupations before, when the rest of us could “work normally”?

The Irony of Filtering the News

Nowadays, filters are everywhere; from photos of our latest trip to reading the news on a website, filtering is a big part of our lives. Looking at specific categories of the news is a kind of filtering that people do every day, sometimes without even realizing it. Was that always the case? Local newspapers did not carry enough news for you to need to filter what you wanted to know. If filtering was not important then, why is it now? How did it become such an important factor in our lives? How did it become a necessity rather than a tool?

The world is changing toward a more “globalized” version in which people are citizens of their country as well as of the world; they now feel the need to read the global news as well as the local news. Newspapers and websites must now choose what is important enough to care about and what should be disregarded. Take, for example, a sports newspaper (if anybody is still reading those): with so many top-level teams and competitions around the world, the editor has to choose only some teams from some leagues from some sports to write articles about. What about the others? One has to put in extra effort to find information about them; unfortunately, the more insignificant something seems to the rest of the world, the more difficult it will be to learn about it.

As explained above, it is understandable and justified that news providers would filter out information they think you would not care to read. However understandable and justified it may seem, though, it is a problematic process. It is problematic that somebody else decides what you will learn about and what you won’t. Take as an example the current situation of the world during the pandemic. Most TV channels around the world have extended news reports about the pandemic, hosting doctors and health ministers, showing numbers and graphs, predictions and statistics. There is no doubt that the pandemic is important news that everybody should know about. However, it is not the only news. The world is now turning its back on a refugee crisis as if it were not important anymore. Thousands of people have no country to go to or no house to live in. Children are separated from their families with no food or access to healthcare. The problem is real, and those people are at Europe’s borders, but still many people will not even think about it because the news does not present it as important at this moment.

This is the irony of filtering. On a personal level, filtering news allows you to quickly access the information you want: go to your preferred website, type “coronavirus,” and you will get all the information available. At the same time, however, you are only seeing what other people thought you should be seeing. In our effort to learn more and more, we might actually miss out on what we want to know.

Filters in Tech: An Extension of Our Already Filtered Reality

Filtering content has become the norm. When we google for websites or images, we are filtering the entire, ginormous network of websites to find a single page. We filter our selfies and images to show our anxiety about living through a pandemic, or to show how relaxing it is to be able to spend a month or two entirely for ourselves.

In Seeing Ourselves Through Technology, Rettberg devotes a chapter to how we filter reality to give it a new shape through technology. A good example Rettberg puts forward in the book is Shklovsky’s article “Art as Technique,” in which Shklovsky argues that art acts as a filter for seeing the reality that most of us cannot perceive. She quotes him: “The purpose of art is to impart the sensation of things as they are perceived and not as they are known.” In this respect, technology is undoubtedly a form of art that creates new sensations, desires, and experiences that could not be had otherwise. I see this as a benefit of technology, one that improves our abilities as humans and brings us closer to the extreme of what we call cyborgs.

While reading Rettberg’s chapter on filtering reality, what lingered in my head is how our neural functioning is also a filter that processes the information around us. It is certain to me that we already live in a filtered world. We experience the world around us through our emotions and senses, and ultimately we register the items or clues that we think we might use in the future. In that sense, every human function is also a filtered activity, and with technology, we are expanding these functions and experiences. It is quite logical, then, to argue that technology makes us live in a world where there is little “unfiltered” information, since we experience the world through our devices. In my view, such a world is one where we pay attention to the things that are relevant to us, and the rest is unnecessary reality that holds no significance for a 21st-century human.

Technological Filters Are Really Just Social Filters

In Jill Walker Rettberg’s two chapters “Filtered Reality” and “Serial Selfies,” which we read from her book Seeing Ourselves through Technology: Selfies, Blogs, and Wearable Devices to See and Shape Ourselves, a couple of intriguing themes come up. She discusses filters as a broad term describing social and cultural prejudices and as a way to alter reality. She then covers selfies as tools for developing identities and facilitating self-exploration. One of my favorite things Rettberg says about selfies is that they are “visual identity performance” (41). She goes on to talk about the social pressures that come with taking selfies and the need to mimic a certain style to fit in with a perceived social group. I think this idea of social pressure (from the selfies chapter) fits very well into Rettberg’s filters chapter.

To me, filters are anything from a barrier to entry to an accepted social norm (and I use this phrase as though it were in scare quotes) that creates an aesthetic. The pressure to reproduce an image that would give the creator a new social identity is a filter. In her discussion of filters, Rettberg says that using filters produces a new, defamiliarized version of ourselves. I think, however, that the reproduction of filters, whether social or visual, does the exact opposite of defamiliarizing. Prolific application of filters allows users to create an entirely different persona, yes, but when filters are repeated so often they become familiar and cliché. A widespread homogenization occurs, and suddenly other people and their “identities” become familiar.

The idea of an “Instagram aesthetic” captures this. The pink tones and brightly colored filters of one Instagrammer can easily be replicated. A simple Google search reveals how to mimic other users’ aesthetics (an “Instagram theme ideas” article lists users and the styles to copy). Filters are no longer personal expressions; rather, they represent social pressures to conform and cultural beliefs.

Another example that I find interesting to apply to the idea of filters is Tucker’s example from class about Amazon’s AI for hiring and recruiting. The AI was trained on data from the resumes Amazon had received over the past 10 years, and unsurprisingly most were from men. Thus, the AI was biased against women and therefore filtered out a major portion of potential applicants. This example demonstrates Rettberg’s idea of a technological filter influenced by a social filter. While the mainstream idea of technological filters is that of cute dog ears and a long tongue on Snapchat, it is clear that there are many more social filters that we are not aware of until we start to prod deeper.
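To make that dynamic concrete, here is a minimal, hypothetical sketch (not Amazon’s actual system; the features, data, and numbers are all invented) of how a resume screener trained on historically skewed hiring decisions ends up learning the same social filter:

```python
# Hypothetical sketch: a resume screener trained on skewed historical data
# learns a gendered filter. Uses invented features; requires numpy and scikit-learn.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000

# Invented features: years of experience, and a flag for a gendered keyword
# (e.g., "women's chess club captain") that says nothing about skill.
experience = rng.normal(5, 2, n)
gendered_keyword = rng.integers(0, 2, n)

# Historical labels: past recruiters hired mostly on experience, but resumes
# with the gendered keyword were hired less often -- the social filter.
hired = (experience + rng.normal(0, 1, n) - 1.5 * gendered_keyword) > 4.5

X = np.column_stack([experience, gendered_keyword])
model = LogisticRegression().fit(X, hired)

# The model reproduces the bias: a negative weight on the gendered keyword,
# so an otherwise identical candidate is scored lower if the keyword appears.
print(dict(zip(["experience", "gendered_keyword"], model.coef_[0].round(2))))
print(model.predict_proba([[6.0, 0], [6.0, 1]])[:, 1].round(2))
```

The point of the toy model is that nothing “technological” has to go wrong: the model faithfully learns the pattern in the historical data, and the pattern is the bias.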

The World Might Be Ending, But At Least We Have Netflix

Author Sara Wachter-Boettcher begins her book Technically Wrong with a long list of ways in which technology has become a normal and essential part of our modern lives. Regarding the technological boom of 2007, when Twitter rebranded itself, Facebook expanded its market and consumer base monumentally, Google bought YouTube, and the first iPhone was just about to launch, Wachter-Boettcher writes: “So here we are, a decade later, and technology is so pervasive that a version of psychologist Abraham Maslow’s hierarchy of needs with ‘WiFi’ added to the base of the pyramid has become one of the most enduring internet memes around” (2-3).

In America’s and much of the developed world’s “normal” (meaning pre-COVID-19) social and economic state in the 21st century, it is well known that we already rely on technology for directions, purchases, job and college applications, the informing of our decisions about current politics and trends, and for maintaining contact with our peers. As a species, we constantly mock ourselves for our tech-dependency and for how disappointed our ancestors would be in our zombie-like lifestyles as we meander through our daily lives with our eyes glued to one screen or another. In the same way that the rustling of leaves or the sound of massive footsteps registered as “danger” in the subconscious minds of our cavemen ancestors, a sharp “ding” or an alarm ringing on our phones is followed by a spike in cortisol levels in the bodies of modern-day humans. We can even experience “phantom notifications,” where our body tells us that our phone just made a noise even though nothing actually happened.

Stories that fall into the “apocalypse” genre often represent a return to the “old world,” where we are once again forced to revert to our survivalist instincts of “hunt or be hunted.” Our modern world as previously described is a sharp contrast to the state of the world as depicted in these books and movies, so much so that they often contain a “token” character used to symbolize the pre-apocalypse world’s ultimate weakness: its excessive dependence on technology. Examples include Zoey Deutch’s character in Zombieland 2, the straggler stuck using a copying machine in Zone One, and Nathan in Ex Machina. Ultimately, those who can successfully master those original survivalist instincts and truly remove themselves from technology are the ones who make it to the end of the narrative.

I find this especially interesting in the wake of COVID-19 and what many people are jokingly (or not) calling the apocalypse of 2020. The irony of it all is that our species’ plan of attack for this pandemic is not to band together in machete-wielding mini-armies and embark on a never-ending journey from location to location, as we see in many zombie apocalypse movies, but to stay as isolated as possible and rely now more than ever on technology as the lifeblood of our society. One glance at social media and you’ll find people across the world joking about how they’ve already reached the end of Netflix and Hulu, checked all of their emails and snapchats, and surfed Instagram for hours on end. Even when it seems like the world is ending, we continue to go to class and educate ourselves, just at our laptop screens in our beds instead of in person. COVID-19 has successfully proved every apocalyptic screenwriter and storyteller absolutely wrong by showing us that at the end of times, it’s all of us who cling tighter to technology, and maybe that’s exactly what we need to do to survive.

Deserved Nuance

Today I finished reading Technically Wrong by Sara Wachter-Boettcher. While I enjoyed the book and agree with much of what Wachter-Boettcher has to say, I don’t think I came away from it feeling how the author intended the reader to feel: angry and empowered to stand up to Big Tech. Instead, I feel wary. I feel that Technically Wrong was a bit one-sided and didn’t have enough nuance to convince me of Wachter-Boettcher’s arguments beyond doubt.

I didn’t feel this way at the beginning of the book. I enjoyed the first half, for the most part; Wachter-Boettcher’s insistence that the diversity issue in STEM is not a pipeline issue (23) rings true with me, though I think that statement deserves some nuance. Yes, the pipeline isn’t the issue; it’s the leaky pipeline, which in the second half of the book Wachter-Boettcher seems to refer to as the “leaky bucket” (183). Broadly speaking, the “leaky pipeline” (or “bucket”) issue is that there are women in STEM fields, but they “leak” out along the way, perhaps due to career changes or leaving to start a family. By waiting more than 150 pages to qualify her statement that there is no pipeline issue, the author seemed a bit too choosy about when she displays her evidence, in a way that feels like trying too hard.

In addition, in the discussion of the “leaky bucket,” Wachter-Boettcher claims that because only 20% of women who leave STEM leave the workforce entirely, the rest must be stalled in their careers or leaving the field because they’re fed up with biased cultures. I would argue this is too simplistic a categorization; for example, a woman may leave the STEM field to take a job that lets her work from home more easily, allowing her to balance work and life more effectively. This isn’t necessarily a woman leaving because she’s fed up with discrimination, so is this, as Wachter-Boettcher hates to say, an “edge case”? By being so strictly categorical, the author has introduced her own sort of edge-case bias.

Really, though, the piece of the book that struck me as most misleading was the portion about Dylann Roof (141-2). Wachter-Boettcher takes us through Roof’s journey from somewhat innocently discovering hate groups on the internet to opening fire at a church. She seems to mostly blame Roof’s actions on what he found on the internet, these hate groups, and how easily internet searching allowed him to find them. This is a vast oversimplification that is honestly dangerous to make. Roof’s actions were not only because of his ability to find online hate groups; while that may have contributed, it is irresponsible to ignore other potential contributing factors like his mental health or upbringing. Once again, I feel here that Wachter-Boettcher sometimes twists evidence to fit her message.

This whole post has been pretty negative about Technically Wrong, and I want to qualify it by saying that I did enjoy the book and agree with much (perhaps nearly all) of what Wachter-Boettcher has to say. However, I feel the book deserves more nuance than it has, and it should be thought of critically and not blindly followed.

Pick-Up Artists

Before I discuss anything, I just wanted to say that Wachter-Boettcher’s section title on page 93, “Blinded by Delight,” was incredible. What a perfect reference!

Every Sunday, I get a notification from my iPhone letting me know that I’ve spent far too many hours on it, doing absolutely nothing important. One of the major culprits for me is Instagram, where, thanks to the recommendation algorithms, I find myself watching ten-second clips of highlights from sports I don’t even normally watch. I feel awful when I go into the screen time function in my settings, only to realize that I’m constantly picking up my phone, and not just because I have a message to which I should respond.

It sometimes seems that a war for attention is being waged between me and an entire design team. There are incredibly smart people working for these technology companies whose main interest is to get us to spend more time on these apps and to pick up our phones even when we don’t need to use them. Not only that, but these apps and websites are constantly expanding their range of functions. It seems to be a race for which company can have the most DAU (daily active users), in which each takes the functions other profitable companies have and incorporates them into its own system, no matter what those functions are. There is absolutely no need for Snapchat to have a news function, yet there I sit, watching various news organizations give me ten-second clips on various topics.

It is therefore no surprise that the focus on gathering as many users as possible comes with stress cases. The companies seem to focus more on creating a shallow, dopamine-inducing sense of delight than on meaningful experiences that help their users achieve their own goals rather than some investor-appeasing evaluation. But then again, is it too much to ask tech companies for meaningful experiences? Or are apps and websites destined to be merely distractions?

Normal vs Normate

As I read chapter three, “Normal People,” I immediately thought of a discussion I had in a disability studies class on the idea of “normal.” Wachter-Boettcher discusses Shonda Rhimes’ television shows as examples of “normalizing TV”: by casting people who fit the personalities, and not setting a race or last name for a character until the actor is chosen, the shows can accurately represent the actual diversity of American society and normalize that depiction. Disability studies rejects the idea of “normal” as a whole and refers to nondisabled bodies as fitting the “normate,” a word meant to defamiliarize what we consider standard. The normate is not something to be desired but instead rejected; no body should be considered standard because there is no standard body. But the world is still designed for this standard body: white, middle class or higher, and abled. Although Rhimes’ ideology and that of disability scholars are in some ways similar–both work to include bodies that are usually not included in media or societal design–I wonder how the difference between the two positions, one accepting that a norm exists and one rejecting the existence of a norm, affects the actual outcome. Rhimes shows a “normal” world, but from a disability studies perspective, she is still attempting to match an idea that doesn’t reflect the actual world. She is still attempting to fit into a mold, and although I can only speak to Grey’s Anatomy, that world usually does not include disabled people unless they are dying in a hospital bed. Does attempting to address any sort of normate, even when including people of different demographics, continue to restrict us when we attempt to design inclusive technology?

The Issue with Instagram Ads

In the wake of COVID-19, our connection to social media has been made especially apparent. Now that we aren’t able to see each other physically, we are relying on social media to connect with almost everyone. Just today my family had a rather heartwarming Zoom meeting to celebrate my grandmother’s birthday. The point is that, as Sara Wachter-Boettcher states, social media is a part of almost everyone’s life now. And just like non-digital spaces, online spaces are part of and perpetuate systems that harm marginalized individuals. But one thing these online spaces do better than non-digital spaces is collect information and data about people. Reading about how these sites collect data for advertisers and to sell to other companies reminded me of the trend about a year ago when people figured out that they could see their ad interests on Instagram, and everyone had a fun time seeing what Instagram thinks they are interested in.

I looked at my Instagram ad interests, and unsurprisingly they reflect a good understanding of what I like; honestly, I generally enjoy my Instagram ads. Facebook (which owns Instagram) doesn’t let you opt out of any of this. Within Instagram, you don’t even have the option to change or delete your ad preferences. These companies make their profits by categorizing humans into as many boxes as they can. This idea aligns with how Wachter-Boettcher describes personas. Companies want to know, yes or no, whether a person fits their boxes so they can sell them what they want to sell. But the problem with boxes is that they are not real, all dichotomies are false dichotomies, and so on. So people will be left out and misconstrued, just as Wachter-Boettcher presents.
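As a purely hypothetical illustration of that “boxes” problem (the schema, categories, and field names here are invented, not anything Instagram or Facebook actually uses), here is what a rigid ad persona might look like in code, and how it misfiles anyone who doesn’t fit its categories:

```python
# Hypothetical sketch of the "boxes" problem: a rigid persona schema forces
# every user into predefined categories, so anyone outside them is misconstrued.
from dataclasses import dataclass

AD_CATEGORIES = {"fitness", "gaming", "parenting"}  # the only boxes on offer

@dataclass
class AdPersona:
    gender: str    # forced binary: "male" or "female"
    interest: str  # must be one of AD_CATEGORIES

def to_persona(profile: dict) -> AdPersona:
    """Squeeze a real person into the schema, discarding whatever doesn't fit."""
    gender = profile.get("gender", "")
    interests = profile.get("interests", [])
    # Anything outside the boxes is silently replaced with a default.
    boxed_gender = gender if gender in {"male", "female"} else "female"
    boxed_interest = next((i for i in interests if i in AD_CATEGORIES), "fitness")
    return AdPersona(boxed_gender, boxed_interest)

# A nonbinary user interested in disability activism gets misfiled on both counts.
print(to_persona({"gender": "nonbinary", "interests": ["disability activism"]}))
```

The failure isn’t a bug in any single line; the schema itself guarantees that people outside its categories get silently rewritten into something they are not, which is exactly the edge-case erasure Wachter-Boettcher describes.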

It interests me that, at least in my experience, people know that social media and the internet generally collect data about you, put you into boxes, and then try to sell you things. I think most people range from hatred to ambivalence about how these sites and apps work.

Facebook does not hide what it does with the data it collects with respect to advertising. Yet most people continue to engage, because even if there is technically an opt-out option, social media and the internet have become so integral to our lives that the option doesn’t even seem real.

The Cost of Desensitized Violence

I’m not much of a consumer of video games myself, but sadly not much of the imagery shown in Anita Sarkeesian’s video “Women as Background Decoration” surprised me. Both the blatant sexualization of the women characters, reducing their humanity to that of sex objects, and the acts of normalized physical violence done to them disgusted me, but they did not overly surprise me. The biggest question that kept occurring to me was how this constant desensitization to violence and sexuality, oftentimes depicted as one and the same, affects the boys and young men who consume this media. Oftentimes the audience for this media is boys going through puberty, and between media like this and their presumed consumption of pornography around that age, the messaging they consume about how they are meant to treat the women in their lives must be hugely influential. While we can’t pin phenomena such as rape culture, domestic violence, etc. solely on the creators and consumers of video games such as these, surely they can be considered a piece of this complicated puzzle? As someone with a teenage younger brother at home, it definitely concerns me to see such scenes, knowing that teenage boys in particular are searching for narratives they deem counternormative to the lessons they are learning at school and at home, which makes these seemingly edgy, violent video games widely appealing to their developing moral senses. It brings them closer to their peers, who are also playing these games, and to an anonymous online community that may exist on the fringes of society with beliefs that range from “satirical” to blatantly violent. All of it makes me deeply concerned.