Simulacra in “Be Right Back”

Black Mirror, Season 2, Episode 1: “Be Right Back” (2013). 24 Jan. 2017 via Science of the Time.

In thinking about the various questions raised by “Be Right Back,” I find my thoughts returning to a single issue: the reality of representation, or artifice.

The problem first arises when Ash fixates on a childhood picture of himself and ignores Martha in order to post an ironic photo of the photo on social media. After much prodding, he puts down his phone and explains the picture’s history. With this particular representation of a representation, the term “simulacrum” comes to mind: a word that denotes an image or representation of someone or something, but often implies that the representation is a copy of a copy (or, at least, a copy of something not original, in the work of French postmodernist Jean Baudrillard). Similar simulacra appear throughout the episode, ranging from the fake smile in the aforementioned photo to the imitation Ash himself (itself?).

If we return to the conversation about the photo, we can see that Martha initially reacts positively to it, calling it “sweet” rather than darkly “funny.” Ash thinks otherwise, and even goes on to call it “fake,” while Martha implies that the photo’s authenticity didn’t matter, since his mother (who enjoyed keeping it around) never knew it was “fake.”

With this kind of subject matter, I often conflate “real” with “true” and “fake” with “untrue.” But when a simulacrum bears no relationship to the original (being a representation of a representation) and makes little effort to do so, it can take on its own form of truth, as it does in “Be Right Back.” At that point, the terms “real” and “fake” no longer seem to apply. Baudrillard proposes a different term for such simulacra: “hyperreal.”

The simulacra in “Be Right Back” all take on this aspect of hyperreality, becoming temporarily more real than the real thing. The “fake” photo discussed above brought “real” comfort and happiness to Ash’s mom for a time. Likewise, the “fake” Ash did the same for Martha, even helping her move on from her all-consuming grief.

Technological Temptations: Postmortem Connection

Black Mirror’s “Be Right Back” prompted me with a number of questions regarding how I myself might approach a situation in which I could interact with a reconstructed version of a dearly departed loved one. The technology presented within the episode doesn’t seem so horribly far-fetched, given the development of text-based AI (see Cleverbot) in recent years. So perhaps we will be presented with similar choices in the years ahead. How should we proceed?

I’m attempting to imagine what considerations Martha must have attended to in deciding to initially open the email from the beta program, to reach out to the reconstructed version of Ash, and then to invest in a physical manifestation of this reconstruction — knowing full well that it wasn’t really him. Martha is depicted as an otherwise perfectly rational human who is grieving the loss of a loved one. What push and pull factors affected her decisions? What are the pros and cons of pursuing a remodeled version of someone so dear?

Next, I think about how I personally might react if I found myself in a similar situation. It is horribly painful to lose someone dear to you and to process the gravity of no longer being able to communicate with them. Faced with such a choice, I would consider the fact that my interaction with this new “being” would not be the same as that with my departed loved one, but perhaps hope that the reconstructed version would retain enough of the essence of that person to bring me some comfort. Maybe this version could serve as a temporary, transitional support, helping me accept the loss and eventually move on. As the episode clearly illustrates, however, it is easy to become attached to such simulations, which may make it even more difficult to move on.

Considerations for the future: Is there a less dramatic version of this technology that would, in fact, ease the transition and provide comfort during the grieving period?

Digital minds, or: How I learned to stop worrying and love the illusion of consciousness.

When distinguishing between robots and humans, one trait is discussed without fail: consciousness. The digital age has allowed for unprecedented expansion into the novel field of artificial intelligence (AI), but with its advancement come some startling implications. If an artificially created being is truly intelligent, in that it can think for itself and learn, could it be considered ‘conscious’? Here, consciousness refers to subjective experience: the phenomenon that only we ourselves can be sure we have. When a human touches a fur rug, a peculiar sensation accompanies the act. The inherent difficulty in detecting true consciousness is that a sufficiently powerful AI would mimic it flawlessly.

However, if this experience could be quantified, studied, and replicated, it would have serious ramifications for our current definitions of sentience. Could consciousness be artificially replicated? For those who believe that consciousness is an entirely materialistic phenomenon—that is, that our perception of reality arises from the complex molecular, and even quantum, interactions that occur according to natural laws inside our bodies—the answer seems to be yes: given a powerful enough computer, consciousness could be recreated. There are compelling arguments, however, that consciousness stems from something unquantifiable by physical methods (see: Mary the neuroscientist and the knowledge argument).

Suppose, then, that the materialists are correct and consciousness is indeed due to physical causes. Beyond the obvious ethical consequences of creating immortal, sentient beings, many more interesting situations become possible. If the mind is a sum of information, then that information may be stored, copied, and replicated. In Black Mirror season 2, episode 1, “Be Right Back,” the protagonist Martha finds herself in just such a predicament. Black Mirror’s version of what might be called a neural upload is clearly dystopian: her partner Ash Starmer’s duplicate displays several unsettling behaviors through an evidently rudimentary product. While the episode superbly demonstrates the potential downfalls of an imperfect neural upload, Martha’s chief source of dissatisfaction is that the replica is incomplete. The viewer is left unsure how Martha would have reacted had the replica been a carbon copy of Ash.

Neural uploads would include all of the subject’s flaws as well. But since data is divisible, why not pick and choose the best qualities? Source: xkcd.com/1666/

Examples of this hypothetical pervade popular culture as well. In Marvel’s Captain America films, Arnim Zola is a Nazi and Hydra scientist who uploads his consciousness to a supercomputer upon discovering his terminal illness; the character seems to retain all of his thoughts, desires, and emotions in digital form. William Gibson’s Neuromancer, the foundational cyberpunk novel, not only deals heavily with sentient artificial intelligence but also features Dixie Flatline, a character long ‘dead’ but preserved in a computer. The list goes on, and these examples demonstrate the ramifications of varying degrees of success in neural uploads.

Arnim Zola as depicted in Captain America: The Winter Soldier. Source: vignette3.wikia.nocookie.net/marvelcinematicuniverse/images/1/18/Catws_03787.jpg

Consider the extreme end of the spectrum: what if, when Ash died and Martha ordered a replacement, a molecule-by-molecule reconstruction of Ash had been shipped to the small country cottage? The digitalization of consciousness seems to trivialize death, the very thing that paradoxically defines life. Something fundamental to the human experience is lost. Even in Black Mirror, Martha never truly overcomes her crippling grief; rather, she stows yet another dead Starmer in the attic.

To Minimize Emotional Damage, Keep Your AI as an Acquaintance

In “Be Right Back” from Black Mirror (season 2, episode 1), I was blown away by the technology, like the ultra-slim, durable phones and the design display the protagonist Martha uses for work. Even though we have similar technology today, or at least the “grandparents” of such devices, I found it fascinating to imagine using it. However, I wasn’t as amazed by the text-based virtual “Ash” that Martha interacts with when her partner first dies. With the recent expansion of artificial intelligence into our everyday lives, such as through Apple’s Siri or Amazon’s Alexa, interacting with various artificial intelligence machines (AIs) is becoming commonplace. Creating a text-based AI that represents a person, built from their publicly available social media data, seems like a plausible technology within the next five years.

Note that this prediction does not claim the AI would come close to passing as the specified human. The widespread AIs in today’s world (such as Siri or Alexa) are weak AI, meaning they excel at narrowly defined tasks, such as “Siri, who sings this song?” or “Alexa, how warm is it today?” To mimic a human, a strong AI is needed: one with a broad, robust, human-like intelligence that can learn new things. Strong AIs are much more difficult to develop, and getting humans to believe the machine is thinking is more challenging still. A common test (known as the Turing test) has humans spend five minutes individually interacting with the AI through a computer terminal; at the end of each trial, the judge decides whether she was interacting with a person or a machine. In Turing’s original formulation, a machine is considered to be thinking if the average interrogator has no more than a 70% chance of correctly identifying it as a machine. In the episode, Martha goes through periods of believing in “Ash” (far beyond any such threshold), but each mistake “Ash” makes jars her into immediately, emotionally reclassifying “Ash” as an imitation of her dead partner, not his reincarnation. Because there is such a strong emotional connection between the user and the AI, the mistakes are more costly, and Martha becomes less inclined to want to believe in the bot, limiting its effectiveness. Taking this episode as a “case study,” I think social media data would enhance a generic AI’s personality, but given the emotional toll seen in Martha, using that data to represent or recreate a specific known person would detract from its believability and effectiveness.
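The pass/fail arithmetic of such a test can be sketched as a tiny scoring function. This is a hypothetical sketch (the function name and trial counts are my own, not from the episode), using Turing’s original criterion that a machine does well if the average interrogator has no more than a 70% chance of making the right identification:

```python
def turing_test_verdict(correct_identifications, total_trials, max_correct_rate=0.70):
    """Return True if the machine 'passes': judges correctly identified it
    as a machine in no more than max_correct_rate of the trials."""
    if total_trials <= 0:
        raise ValueError("need at least one trial")
    return correct_identifications / total_trials <= max_correct_rate

# Hypothetical numbers: judges spotted the machine in 12 of 20
# five-minute sessions (60%), which is under the 70% cutoff.
print(turing_test_verdict(12, 20))  # True  (machine passes)
print(turing_test_verdict(19, 20))  # False (machine is easily spotted)
```

The interesting wrinkle the episode adds is that Martha is not a neutral judge: her verdict swings with her grief, something no fixed threshold captures.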

Martha becomes so attached to Ash’s AI that she orders a full-sized replica of Ash to complete the illusion of his reincarnation. katalinawatt2012. Digital Image. seenthefuture.blogspot.com/2013/02/studies-in-dystopia-black-mirror-be.html

P.S. Messaging with Cleverbot is a fun way to experiment with human-AI text discussions.


Are you cheating? Designing relationships in “Black Mirror”

Episode 1, Season 2 of Black Mirror undertakes a close inspection of loss and intimacy in a technologically advanced future world. While the sudden death of Ash is shocking and heartbreaking for Martha, I found myself wondering whether this loss had in fact occurred on a smaller scale long before his physical disappearance. The episode opens on a scene characterized by disconnect: Ash is consumed by his phone, and Martha recognizes his absence. She later refers to his phone as a “thief,” a place to which he “vanishes.” Her language paints the phone as a felt presence, a person or place rather than an object devoid of value or feeling. A subsequent bedroom scene highlights Ash and Martha’s disconnect on a more physical level. Their underwhelming sexual encounter ends with the two rolling over and falling asleep. Romance, love, and intimacy appear to be at a slight loss, though not entirely absent. We are left with a fractured romantic landscape: a normalized disconnect peppered with moments of connection (including singing and laughing).

We are first introduced to Ash, Martha, and their technology as three acting forces that interact and interplay in a three-way relationship. This brings to light an important question: At what point does the interaction between administrator and application (such as Ash and his cell phone, Martha and her synthesized partner) threaten and consume human relationships? Can you be jealous of a phone the way one would another lover? Is it cheating if you are emotionally invested in your technological device? And finally, how do we understand monogamy in a world in which a person can design their own pleasure?

The concept of a technological romance is further explored in the relationship between Martha and the “synthesized” version of Ash. I was particularly struck by the use of the term “administrator” in reference to the relationship between Martha and this man-made being. The word adds an unprecedented power dynamic to their relationship, one in which Martha is master and commander. This control, while at first pleasing, ends in frustrating failure. Martha seeks human imperfection, not robotic precision. She wants messy love with all its bumps and burdens, sweetness and care. However, Martha is simply a customer who has purchased a product intended to fulfill her needs. Ash is a mere construction, reflecting back only what she puts into him. In this sense, his limitation is a cruel reminder of her own; she is, for all intents and purposes, dating herself.

Because technological devices and tools are modeled primarily around the relationship between administrator/application, creator/product, user/device, I believe it’s impossible for technology to ever fully replace human relationships. More often than not, technological relationships inspire both obsession and self-loathing: Martha’s initial fascination with her virtual partner ends with her screaming in pure grief from the cliff’s edge, loathing the very thing that she has made. We see obsession when Ash distances himself from his painful childhood by snapping an ironic picture of an old photo with his phone, turning to the phone before he turns to Martha. For both, technology serves as a buffer from the harshness of the world around them, a virtual safe space that offers complete control and power, designed and programmed to please in all meanings of the term.

Black Mirror: Season 2 Episode 1. N.d. Tayla Humphris A2 Media Exam Blog. Web. 23 Jan. 2017.

The interplay between human and technological romantic relationships in this episode reminded me of the movie Her (2013) in which the main character, the lonely Theodore Twombly, engages in a romantic relationship with the voice of his operating system, Samantha. As the film progresses, the plot explores the possibilities and limitations of virtual spaces and the types of relationships they offer humans. Both Black Mirror and Her expose the challenges of “synthesized” or technological relationships and introduce the question: How do we clarify the borders between relationships in a technologically advanced universe intent on sharing everything?


Ethics, digital suspension of death, and guilt in Black Mirror

Martha and her phone, image from A.V. Club.

As is the case with many Black Mirror episodes, “Be Right Back” addresses reality, death, and the reality (or potential nonreality) of death in a spare, effective manner. After viewing the episode, undeniable ethical questions regarding the future of death and technology linger. It’s hard—at least for me—to place a finger on why Ash’s state of quasi-existence makes us uncomfortable.

My friend’s family refers to this episode as “Attic dad,” which is funny, but also plays into the discomfort brought about by Ash’s strange existence in this story. At the end of the day, “Ash” is just another object stashed away in the attic, but he is an object that gestures towards the undeniable reality of the child “he” and his former girlfriend have together.

The cycle of life and death is grotesquely suspended in “Be Right Back,” and the moment at which this is most emphasized—even more so than when Martha realizes she is pregnant—is when their daughter asks to see “him” on her birthday. This is special, as she normally only interacts with “him” on the weekend.

Perhaps Martha’s limiting of her daughter’s interactions with the eerie organic robot indicates her desire to undo what she’s done by keeping “Ash” around, and suggests a sort of guilt, as does her anguished pause before climbing the ladder after her daughter. We are left to wonder what sort of human contact Martha and her daughter have outside of one another: where is Martha’s sister’s family at this sparse birthday celebration? Or her daughter’s friends? I can’t help but wonder if “Ash” drove them away. I think this isolation would be even more effective if the episode cast only Martha and Ash (and their daughter at the end), with no other humans involved at all. It certainly comes close.


When you can’t move on, recreate them.

As a sentient human being, I understand the need Martha felt when she lost Ash. Losing a loved one, especially a spouse, is a difficult situation to navigate. For this reason I understand her choice to recreate Ash in physical form. Martha was lonely and needed the comfort she had recently lost. Were we to take the program that recreated Ash and release it to the world, I do not believe the result would be chaos. For a short time after Ash’s “rebirth,” she genuinely felt better, and it showed in her physical appearance. However, I think this program would have to be regulated and temporary.

The main issue I have with the recreation of Ash is that it allowed Martha to become obsessive over an animate object that had no soul and technically was not real. She continuously denied calls from her sister and consistently updated the faux Ash on the progress of the baby. These actions are all very strange and indicate an obsession, as she was becoming absorbed into a world that was not real. We can also see that the obsession does not cease: years later, the faux Ash is still living at the house. A temporary rebirth, by contrast, would allow the grieving person to improve their emotional state. In Martha’s case, she was pregnant. I am not well versed in pregnancy, but I do know that the emotions a mother feels can affect the baby’s development. In cases such as Martha’s, this program would be beneficial.

One of the ethical questions that came up as I watched the episode deals with the dead person’s ability to consent to being recreated. The first consent issue arises when Martha is signed up for the program even though she does not wish to become involved. The second issue is: would the dead person want to be recreated in such a manner? What if Ash, while living, had heard of the program and asked Martha not to do it? How can we determine what the dead would want done? The ethical dilemma arises from the fact that someone may not wish to be “rebirthed,” yet a grieving family member may go against those wishes. Does a person still have rights to their identity and digital footprint after they have passed? These are questions we must answer before this becomes a reality.




Welcome to DIG 215

What’s the 21st century equivalent of a haunted house? What kind of ghosts inhabit our machines? How does technology change grieving? Where do our digital identities go when we die? What happens when technology itself dies?

In DIG 215 we will consider these questions and many more as we wrestle with the meaning of death in the digital age. You are now on the course site for the Spring 2017 version of DIG 215. The three most important documents here (so far) are the course guidelines, grading specifications, and weekly calendar.

More—including assignments and posts from students—will appear here soon…