Artificial intelligence (AI) has been a topic of literary representation for more than a century, with a certain intensification during the 1980s and the rise of science fiction as a literary genre, and its embodiment in cultural works has always addressed the questions and dilemmas regarding the limits, the fears, and the forms of interaction between it and humans. Nowadays, AI is not just an aspect of utopian or dystopian futures. It has entered real life, and thus its ethics are discussed not only on a parallel literary level but also under very pragmatic political, social, and economic spectrums. Besides, our times, or even the past decade, constitute the imagined and fictionalized time settings of many works of the past (e.g., Edward Bellamy’s 1888 novel Looking Backward, Stanley Kubrick’s 1968 film 2001: A Space Odyssey). Consequently, today’s situation brings to the foreground questions such as what artificial intelligence is, what the relationship between it and humans is, and what is real and what is not.
Questions that cannot be answered superficially demand an interdisciplinary approach from various fields of study, and they need to be answered soon, as AI development is advancing rapidly. The question, then, that can be traced at the base of arguments and views on AI is whether the technology can be considered a new form of the Other. Even before its release, HBO’s series Westworld, created by Jonathan Nolan and Lisa Joy, triggered discussions among film critics as well as academics on the nature of AI. However, analyzing Westworld under a more general framework regarding the human/machine relationship also brings about an alternative view on the Othering question, according to which we can view AI as a mirror of the human, and the interaction between them as a tool for bringing to the surface many aspects of our structures, attitudes, and needs as humans.
By drawing evidence from the monsters of Bram Stoker’s Dracula (1897), Stevenson’s The Strange Case of Dr Jekyll and Mr Hyde (1886), and H.G. Wells’ The Island of Doctor Moreau (1896), Mladen Jakovljević notes that androids “are symbols of the fear of regression into the Western idea of the primitive and non-human by contagion of urban, civilized spaces inhabited by the population who are afraid of the dissolution of the (known) Empire and the arrival of the wild and uncivilized (unknown) others from the colonies” (118-119). When Westworld is examined under the concept of Othering, its setting involves two groups, the “civilized” guests (humans) and the “primitive” hosts (androids), with the androids’ initial lack of consciousness and authenticity serving as an analogy to primitivity in the binary opposition between human and machine. Consequently, this analogy constructs a power relationship between them and places the androids in an inferior position, thus allowing the “civilized” and “conscious” humans to manipulate them.
The application of the Othering theory to Westworld brings up two serious issues. First, the guests’ freedom to act violently and remorselessly in the park raises questions regarding the primitivity/civilization opposition. Second, as the series progresses, we notice that the two main “gynoid” characters, Dolores Abernathy (Evan Rachel Wood) and Maeve (Thandie Newton), start developing a form of conscious thinking and begin to challenge their identity and reality. Thus, their supposed inferiority in consciousness is gradually deconstructed. On the one hand, Dolores, through her flashbacks and her development of conscious feelings, tries to free herself from human control, revolts and seeks revenge against her creators, and disputes the binary oppositions human/machine and artificial/natural intelligence, which had initially placed her in an inferior position. On the other hand, Maeve also tries to free herself from her creators’ control within the amusement park; however, she also experiences a conscious presence in the real world. Her experience raises many questions for both her and the viewers regarding her identity, and throughout the series we witness her efforts to enter real life.
What separates Maeve from Dolores is that there are only a few, if any, signs of vengeful behavior in her acts. Maeve’s identity and her efforts signify artificial intelligence’s will to merge with natural intelligence. Dolores and Maeve represent what Mary Canales has distinguished as two forms of Othering: the exclusionary one, in which power within relationships is used for “domination and subordination”, leading to “alienation, marginalization, decreased opportunities, internalized oppression, and exclusion”, and the inclusionary one, which is related to “consciousness raising, sense of community, shared power, and inclusion” (19-20). Dolores’s lack of consciousness, the initial mark of her otherness, transforms into evil behavior by the end of the season. Even though she develops a form of conscious thinking, she remains excluded as another binary opposition emerges: good/evil. In the final scene of the last episode, in which she approaches Robert Ford (Anthony Hopkins) with a gun in her hand and finally shoots him in the back of the head, the reactions of the crowd at the reception reveal the surprise and fear that a conscious Dolores provokes in them. The violence of her actions places her on the evil side of the good/evil opposition, and consequently, in the eyes of the guests, she represents the exclusionary Other that Canales discusses.
The situation becomes more complicated in Maeve’s case. Her efforts to find an identity are not accompanied by such violent acts, and thus she does not become the evil, rebellious gynoid that is, by definition, excluded from the “good” human society. She represents artificial intelligence’s effort at self-identification and placement in the real world. Konstantin Payhert, referring to Maeve’s case, mentions that “artificial intelligence, possibly modeled in the image and likeness of natural intelligence, is capable to self-identify (or at least to self-evaluate) as the same intelligence as natural intelligence (to put between itself and it an equal sign or to put itself in one row of equivalent intelligences)” (89). Maeve neither fights for a superior position nor seeks revenge against natural intelligence. Consequently, she remains the “inclusionary Other” that Canales defines. However, her way of entering real life blurs the lines between the “Self” and the “Other”, artificial and natural intelligence. Her technologically advanced, human-like physical appearance, combined with her development of consciousness, transforms her into a being that undoubtedly shares more similarities than differences with real human beings.
Apart from the signs that AI can be considered a form of the Other, whether inclusionary or exclusionary, Westworld’s themes and topics add to the discussion of the ethical treatment of this Other. A large part of the series focuses on human nature and natural intelligence, regardless of to whom human structures and behaviors are directed. According to Larry Alan Busk, “the brutality of the real worlds to which the resort corresponds is neutralized, whitewashed, made harmless, sterilized. You are a cowboy and not a penniless prospector, a king and not a serf, a noble and not a slave” (28).
The park transfers real-life conditions into a virtual reality where nobody is punished for their acts and repressed instinctive attitudes surface. Consequently, Westworld can be viewed as a form of experiment on humans and humanoids; the latter can be viewed not only as analogies to ethnic and social groups that face discriminatory and inhuman practices but also as conscious beings who deserve rights. The blurred lines between reality and virtual reality, natural and artificial intelligence, transform the initially entertaining park into a third space of ambiguity, beyond frontiers, that deconstructs the binary oppositions between human and machine, the Self and the Other, and creates fertile ground for analyzing deep structures of human nature. AI clearly works as a mirror of the human: a mirror constructed by humans that reflects what is really missing from our lives, namely our needs, our desires, and our future views of the world around us. Through this mirror analogy, Westworld effectively challenges superficial approaches to AI by not only looking at robots as potential Others but also by paying specific attention to the behaviors of its human agents.
A common way to view Westworld is as a park that provides clients with the chance to fulfill needs that emerge from the individualistic nature of today’s societies. As an amusement theme park, it transfers its guests to a different reality, far from the real world and its responsibilities. It fulfills frontier nostalgia by bringing people into a Western environment, providing them with the chance to experience an era that holds a significant place in the American mind. It gives them the chance to become the cowboy, one of the most prominent figures of American national identity, and provides the notions of “escapism” and “fantasy” that Michael Coyne attaches to the Western as a cultural genre. Beyond frontier nostalgia, by transferring its guests to a Western environment, Westworld also takes them to a far simpler structure of socioeconomic relationships. Like all amusement parks, Westworld “provides all who come with a chance to be something other than what they are — workers, bosses, fathers, mothers, sons, daughters, anyone with responsibilities or socio-economic functions” (Nye 66). In other words, the guests have the chance to escape the complexity of modern capitalist reality and travel to a reality that demands no effort from them.
More importantly, this reality also provides them with notions that are vanishing in our times, such as romance and adventure. These notions are embedded in a more general need for social interaction, which is fulfilled by the technologically advanced, human-like robots of Westworld. These robots embody de Graaf’s idea of the “social robot”, as they “elicit social responses from their human users because they follow the rules of behavior expected by these human users” (589). They are designed according to the codes of human-to-human communication, and they provide real-life social interaction to the guests of the park. De Graaf concludes that these robots cannot completely substitute for human social interactions, and specifically friendship, “since they genuinely lack mutuality and reciprocity” (594). However, humans “willingly initiate different types of relationships with these robots” (594), a notion that becomes evident in Westworld, the most resonant example being William’s (Jimmi Simpson) romantic feelings for Dolores.
Apart from the fulfillment of humans’ social needs, though, the central theme of artificial consciousness is also surrounded by the topics of violence, exploitation, and agency. The promptings of Delos Inc.’s commercials to “Live without Limits”, followed by the guests’ behavior, denote that, beneath the superficial issue of entertainment, Westworld brings to the fore aspects of human societies (past and present) that are, to say the least, “problematic”. When asked about the themes of the series, showrunner Lisa Joy responded: “Westworld is an examination of human nature, the best parts of human nature – paternal love, romantic love, finding oneself – but also the basest parts of human nature – violence and sexual violence.” The amusement park offers exploration of the Wild West, adventure, and romance; nevertheless, it becomes apparent from the first scenes of the series that exploration and adventure transform into the killing of human-like robots, and romance into rape and the fulfillment of the guests’ primal sexual urges. Even though they seek authenticity and indistinguishable virtual-reality conditions, both the administrators and the guests of Westworld treat the humanoids inhumanly and transform their relationship into a human vs. object opposition.
William’s case perfectly illustrates the topic of human violence; he is a young newcomer to the park, brought there by Logan (Ben Barnes), his brother-in-law, and he initially sympathizes with the humanoids and questions Delos’s practices and the general idea of Westworld. In his real life, William describes himself as the good guy, “a philanthropist, a family man, married to a beautiful woman, father to a beautiful daughter” (Westworld). Additionally, after his first moments in the park, his words to Logan depict his challenging attitude towards the philosophy of the park: “Can you please stop trying to just kill or fuck everything?” (Westworld). However, toward the end of the season it is revealed that William had first visited the place thirty years before the present setting of the series and that he is the Man in Black, the most violent character of the series who, during the search for his past, does not hesitate to shoot, and finds pleasure in killing, other conscious humanoids.
Through the aesthetic representation of this antithesis, Nolan and Joy manage to depict how even the human characters that initially treat the humanoid Other differently become its worst exploiters. Furthermore, they manage to depict violence in the visitors’ acts whether the receivers are non-human or human. After the bloodthirsty acts of the visitors, the viewers witness the administrators of the park treating the humanoids with unfeeling indifference, even though, while rebooting them, they talk and socially interact with them. Violence becomes a key concept of the series, eventually tainting all the characters and raising questions regarding humans’ treatment of their creations. Humans’ experience in Westworld is not only an entertaining trip to this artificial Wild West; it is also an exploration of their aggressive and violent tendencies.
Apart from the primal fantasies of bloodthirsty killings by the park’s human visitors, violence is also expressed in sexual contexts. Delos’s promises of romance quickly transform into rape and sex with prostitutes. If merciless killing for pleasure seems somehow removed from reality for some viewers, rape and sexual abuse in Westworld could not be more representative of what takes place in real life, in human vs. human, and mostly (but not exclusively) male vs. female, relationships. In one of the first scenes of the series, Dolores is sexually assaulted. Maeve is a brothel madam who cannot say no to the men entering her house. The structure of the cast also adds to the gender framework: the majority of the visitors are wealthy men, while the main host characters are women.
The features of the combined female/humanoid identity objectify these characters and add to the already existing power relation between natural and artificial. Female humanoids are created only to serve the sexual desires of, or be “killed” by, the wealthy male visitors of the park. For the latter, Westworld is a place to express, without any punishment, the deeply rooted patriarchal ideas that they share. These scenes may be viewed as a reproduction of gender stereotypes. Zoe E. Seaman-Grant convincingly argues that Dolores’s abuse creates a sense of pity for her in the viewer but that there is no specific development of her character as a woman (75). Nonetheless, these scenes may also be viewed as a mirror of reality. Sexism is, of course, a major problem in human societies and is transferred to humankind’s creations as well. If Westworld’s setting is futuristic, the actions that take place in the park provide a warning regarding the future of the real world; a world that could very possibly transform into a Westworld, where gender inequality in America may become even more intense.
The power relationships that emerge in and out of the park, between humans and robots, androids and gynoids, can also be viewed as an analogy to the situation in today’s capitalist societies. The daily fee that the guests of Westworld have to pay transforms the notions of love, romance, adventure, and social interaction that Delos promises them into expensive commodities that can be easily bought by wealthy entrepreneurs. Frontier nostalgia is used by Delos to profit through consumerism. Larry Alan Busk, referring to Westworld, states that “for a price, one can bracket the real world and live the simulation” and continues, “as the daily life of contemporary capitalist society becomes increasingly bland and monotonous, administered and uniform, the excitement and escape offered by the exotic simulated worlds is irresistible, and for this reason almost ‘believable’” (28). For the capitalist entrepreneurs who visit the park, the exotic humanoid Others work as a commodity that can be consumed without consequences; they become the objects of an exciting paradise in which the guests can escape the monotony of the real world, yet the guests transfer the notions of this capitalist world into the park. What the wealthy guests of Westworld fundamentally buy is the satisfaction of a desire to dominate, with the robots being the commodity, the object, that defines their domination.
In a very rough and violent scene of the series, Logan stabs and disembowels Dolores in front of William’s eyes in order to convince him that she is a senseless robot. The violence of his act becomes the signifier of his domination; Westworld provides him with the chance to express his meanest ways of managing and maintaining it. Once again, Westworld and the treatment of the humanoid Others work as a harsh but functional experiment that also serves as a warning about human behavior under capitalism.
The application of Othering theories to the actual themes and topics of the series shows that the humanoids are viewed as both a metaphorical and a pragmatic Other. The human/machine interaction, the human reactions to the humanoids’ development of consciousness, and the senses of fear and excitement towards the robots of Westworld reveal that, whether exclusionary or inclusionary, AI develops as a form of the Other. Whether or not the robots, in Westworld and in our real lives alike, are viewed as a form of the Other, the series provides fertile ground for discussions regarding the treatment of robotic “beings”. What matters, Westworld seems to say, is not only the victimization of the humanoids but also the expression of the instinctive, deeply rooted violent and cruel behaviors of the humans who victimize them. In other words, it is important to isolate these acts and analyze their nature, regardless of to whom they are directed, as the humanoids may be viewed both as representations of racial and gender minorities and as beings that have developed a form of conscious thinking and deserve rights.
Critiques of the series discuss the dark, dystopian, and pessimistic messages that arise from its themes and topics. Westworld, though, constitutes a “black mirror”: a reflection of what is happening in today’s societies and a warning regarding a future that could very possibly be dystopian if specific human attitudes and structures do not change. AI is occupying an increasingly large place in our everyday lives, raising questions across political, social, and economic disciplines. Scholars suggest that now is the time to discuss the future of this relationship in order to define the stance we should maintain towards AI’s development and, perhaps, how we should limit it. This call is undoubtedly essential, yet robots are created through the needs and behaviors of human beings. Consequently, prior to any reactions of fear or excitement regarding its development, AI could be approached through a reflective lens through which we will be able to understand ourselves as humans.