Westworld: A Lesson in Philosophy

Megan K
Aug 11, 2018

The television series Westworld (2016-present) explores a futuristic world in which human-like “hosts”, advanced robots, populate theme parks designed for humans to use and interact with as they please. Most of the series is set in an old Wild West theme park, though we are aware of other parks. Season one looked at questions of self-awareness and sentience, among other philosophical themes about human nature. We watched as the main character, Dolores Abernathy, the oldest host in the park, learned that she was a host by discovering the centre of the maze, a kind of metaphorical journey towards consciousness. The second season continues these themes of consciousness, self-awareness, memory, free will and human nature. We watch as more hosts develop a kind of consciousness about who they are, and as the hosts rise against the humans. The second season also investigates the human desire for immortality, as several human characters strive to immortalise themselves by becoming hosts, which raises questions about identity and what it means to exist as oneself.

Whether you’re a fan of the show or not, the series packs in a lot of heavy concepts and ideas in an admirable way. It approaches philosophical questions which have never been answered conclusively and, through storytelling, tries to make sense of them. In this article, I intend to investigate how Westworld explores the philosophical concepts of free will, determinism and time. I do not suggest that the series attempts to provide clear answers, but by watching it with some basic philosophy in mind, we can attempt to grasp these abstract concepts, visualise them and understand aspects of our reality which can be difficult to comprehend.

Season one illustrated the arguments for and against the question of whether the hosts have free will and true consciousness. We watched as two of the protagonists, Dolores and Maeve Millay, both Westworld park hosts, begin to achieve an awareness of what they are. As this self-perception develops, the audience feels the hosts are achieving self-awareness. However, it soon becomes clear that the park founder and Director of Westworld, Robert Ford, intends for the hosts to achieve this awareness. Ford comes to share the view of his now deceased co-founder, Arnold Weber, that the hosts are able to achieve self-awareness and that the park must be destroyed. Near the end of the first season, we learn that Ford designed a final “narrative” for the park, the maze, which represents the quest for consciousness. He additionally programmed the hosts to rebel against the humans and infiltrate the “real world”. This suggests that the hosts do not have free will, as they are still being programmed and controlled by Ford.

Maeve discovering Ford’s secret programming

However, we see Maeve defy her programmed order to leave the park because she wants to find her lost daughter, which is intended to be read as her development of self-control and self-awareness. But the question of whether Dolores is in control when she kills Ford in the season finale goes unanswered. She may have been programmed by Ford to start the rebellion, yet in another scene we see Dolores come to the realisation that the “voice in her head” which has been directing her actions is in fact her own, suggesting that she is conscious and in control.

Dolores’ journey to consciousness and self-awareness

To be able to unpack what this first season is trying to communicate, we need some definitions. First, consciousness and self-awareness. Consciousness is hard to define. Many hold the view that consciousness is the awareness of oneself, one’s body and one’s environment. The Macmillan Dictionary of Psychology describes consciousness as:

‘Having of perceptions, thoughts, and feelings; awareness. The term is impossible to define except in terms that are unintelligible without a grasp of what consciousness means. Many fall into the trap of equating consciousness with self-consciousness — to be conscious it is only necessary to be aware of the external world. Consciousness is a fascinating but elusive phenomenon: it is impossible to specify what it is, what it does, or why it has evolved. Nothing worth reading has been written on it’.

–Stuart Sutherland, Macmillan Dictionary of Psychology, 1989

Thus, we can only describe consciousness as merely being aware. It describes the fact that we, as humans and possibly other creatures of the earth, are aware of ourselves and are capable of having thoughts and feelings. Self-awareness is to be aware of oneself and one’s desires, which is slightly different from consciousness but inextricably linked to it. Westworld investigates the idea of consciousness through its artificially intelligent hosts, their psychological development of self-awareness, and the elusive phenomenon of consciousness itself. So, if the hosts do develop consciousness and self-awareness (which arguably Maeve and Dolores do by the end of season one, as they understand their place as hosts, what the world consists of, what their environment is, what their previous thoughts and desires were, and what their new thoughts and desires might be), do they develop the ability to make choices freely?

Free will can be defined as the ability to choose between several possible courses of action. How does Westworld teach us about the concept of free will? Consider Maeve at the end of season one. She gets on the train ready to leave Westworld and, before the train moves, she sees a mother and a daughter.

She can choose between two courses of action: stay on the train as planned or find her daughter. She knows her daughter is not a biological one, but a daughter from a previous storyline, a programmed child. Her emotions, she decides, are real. She chooses to find her daughter, a choice that is significant if we consider that we have just seen her programming explicitly state that she will “infiltrate the mainland”. This is to be read as Maeve taking control of her life: she is conscious and has free will. However, such a black and white picture of free will is complicated by the philosophical determinists. Determinism is the idea that all actions are influenced by past actions, a kind of cause and effect view of the world. Determinism also argues that an agent (a person, or even a host) can never do anything other than what they did. Hard determinists hold that free will and determinism are incompatible (a position known as incompatibilism) and that our apparent options are an illusion. On this view, the course of nature is already set by past actions, no other course of action can take place, and so we are not truly free to pick our actions. If we apply this view to hosts, it makes sense that they cannot pick any action, that past actions by humans dictate the course of history for them. But, if we consider again Maeve on the train, does she have a choice? We might look to the compatibilists, who hold that the world is dictated by cause and effect but that free will is still possible. If Maeve had not sat opposite the mother and daughter, she may have stayed on the train. However, she sat there, and this caused her to make a decision. Cause and effect is still at work here, but she is able to make a choice.

If we consider Simon Blackburn’s book Think: A Compelling Introduction to Philosophy (1999), he suggests that we can think of the brain on a computer software model. There is a scanner which takes in information about a situation; a producer which delivers options for behaviour in response to the scanner; an evaluator which ranks the options in light of concerns that have been programmed into it (these concerns can be emotional indicators attached to the different options); and finally another producer which fixes on the option ranked best by the preceding processes and outputs neural signals that cause the body to act. Does this brain make a decision? It could be argued that, yes, this is how a decision is made. Blackburn names this the module process system, which can describe what happens when we make a choice. It describes agents as doing something, and doing it for a reason. It can be used to understand the causation behind actions and gives us a sense that free will can exist even in a determinist universe.

If we apply this once again to Maeve, she takes in information about the situation, the mother and daughter before her; she considers the options in response, that she can stick to her plan or find her daughter; she evaluates the options in light of her emotional concerns, that she cares for her daughter even though she knows she has been programmed to and that her daughter isn’t “real”, yet understands that this love is real to her; and her best option is fixed: to go back and save her daughter. This decision is influenced by the available options, her emotions and the situation in front of her, a mother and daughter sitting opposite her. This still represents a determinist, cause and effect universe, as the situation caused her to make a decision which then goes on to cause more actions with further effects, and so on. If the situation were different, then the effect would be different in turn. But, as stated previously, the hard determinists hold that nothing can ever happen other than what has happened. Maeve knows she has been programmed to leave Westworld, but she believes that desire is her own. When she changes her mind to find her daughter, this action could be read as her securing her own sense of consciousness and control, as it establishes the action as her own desire. But, to complicate this, Ford updated the hosts so that they could remember, and it is the memory of her daughter which caused her to leave the train. Thus, Ford ensuring that Maeve remembers her daughter also caused Maeve to want to find her daughter. So, could she ever have just left her behind? Could there ever be a situation where the mother and daughter didn’t sit next to her on the train? Could it still be due to Ford’s programming that Maeve acts as she does, thereby denying the possibility of free will?
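To make the structure of Blackburn’s model concrete, here is a minimal sketch in Python of the four stages (scanner, option producer, evaluator, action producer) applied to Maeve’s moment on the train. This is only an illustration of the idea, not anything from the show or from Think: the function names, the concern weights and the option strings are all invented for the example.

```python
# A minimal, illustrative sketch of the four-stage decision model described above
# (scanner -> option producer -> evaluator -> action producer), applied to Maeve's
# choice on the train. All names, weights and strings are invented for illustration;
# neither the show nor Think specifies any such values.

def scanner(situation):
    # Stage 1: take in information about the situation.
    return {"sees_mother_and_daughter": situation["sees_mother_and_daughter"]}

def produce_options(percepts):
    # Stage 2: deliver candidate courses of action in response to what was scanned.
    options = ["stay on the train"]
    if percepts["sees_mother_and_daughter"]:
        options.append("go back and find her daughter")
    return options

def evaluate(options, concerns):
    # Stage 3: rank the options in light of programmed concerns
    # (emotional indicators attached to each option).
    def score(option):
        if option == "go back and find her daughter":
            return concerns["love_for_daughter"]
        return concerns["commitment_to_escape_plan"]
    return max(options, key=score)

def act(best_option):
    # Stage 4: fix the best-ranked option and output the action.
    return f"Maeve chooses to {best_option}."

# Maeve's concerns: the memory of her daughter outweighs the escape plan.
concerns = {"love_for_daughter": 0.9, "commitment_to_escape_plan": 0.6}
situation = {"sees_mother_and_daughter": True}

percepts = scanner(situation)
options = produce_options(percepts)
best = evaluate(options, concerns)
print(act(best))  # -> Maeve chooses to go back and find her daughter.
```

Notice that the sketch is entirely deterministic: given the same situation and the same concerns, the same “choice” comes out every time, and changing either input changes the output. That is precisely the compatibilist reading of the scene: the decision is caused, yet it is still Maeve’s own evaluation, weighted by her own concerns, that produces it.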

Dolores instigating the host rebellion in the season one finale

At the end of season one, Dolores kills Ford and then opens fire on a crowd of humans, instigating the hosts’ rebellion. It is established that Ford programmed her to kill him and revolt against the humans, but it is also established that by this point she has achieved a level of self-awareness and consciousness and truly believes that her desire is to avenge the hosts. If we use Think’s revised compatibilist definition of free will, we can try to understand whether she is free. Blackburn states that:

‘The subject acted freely if she could have done otherwise in the right sense. This means that she would have done otherwise if she had chosen differently, and, under the impact of other true and available thoughts or considerations, she would have done otherwise. True and available thoughts and considerations are ones that she could reasonably be expected to have taken into account’.

If Ford had not programmed her to kill him and revolt, would she have done otherwise? It is made clear throughout season two that she would not have, that she is quite set on killing humans, but we can never escape the feeling that it is not a free choice if she has been programmed to make it. Blackburn poses the idea of ‘mini-Martians’ who can invade our brains and take control. He asks whether it is right or just to hold the invaded person responsible for his actions under such an influence, and points out that the incompatibilist would say ‘why does it make a difference if it was mini-Martians, or causal agencies of a more natural kind?’ To the incompatibilist, a deterministic universe allows no free will anyway, whether you are programmed to perform an action or you “choose” it. So whether Dolores and Maeve freely choose their actions may depend upon whether they live in a deterministic universe.

Is Westworld a deterministic universe? The hosts’ lives are constrained to their tight narrative loops; their entire history, personality and self is written for them by a human; their memories are wiped if they die or undergo severe trauma; and they are created for the pleasure of paying customers to do with them as they please. Their lives, and even their own bodies, are entirely out of their control. As they begin to achieve some agency, it is revealed that Ford, through his new updates, encouraged Dolores and Maeve to remember events which should have been wiped from their memory, enabling them to grow conscious of who they are and of the environment in which they live. Maeve also begins to wake from sleep mode by herself, enabling her to see outside of the park and to use two Westworld technicians to help her become independent. Dolores journeys through the park and lives between two timelines: one in which she travels with two guests, Logan and William (William later becomes the man in black), and another in which she is more or less alone. Through regaining her past by remembering, and through experiencing profound pain in the loss of her father, she becomes conscious and journeys towards self-awareness. Maeve, also by regaining her past, remembering her daughter and remembering the man in black murdering her daughter and herself, and thus also experiencing profound pain, likewise journeys into consciousness and self-awareness. From these two characters the host rebellion begins, and other hosts are able to journey towards consciousness either with the help of Dolores or Maeve, or by regaining their own pasts.

Thus, all the journeys that the hosts undertake begin with Ford and his host updates. Ford also secretly programmes Dolores and Maeve with his ‘new narrative’, and all the action that ensues is down to Ford. Meanwhile, the hosts feel that they are acting out of free will, but we constantly question whether they are able to, and whether this might not all be Ford’s plan. On this reading the world is determinist: all events have been set in motion by Ford, all action is controlled by cause and effect, and even if the hosts feel they act out of choice, it is all part of Ford’s narrative. But Maeve goes off script, and the show doesn’t make it clear whether the hosts really are following Ford’s plan, or whether his plan extended only to setting the uprising in motion through Dolores, with the hosts themselves taking it from there. Ford dies, so how much can he control beyond the grave? Dolores has her moment of self-discovery in which the voice in her head directing her turns out to be her own, so was she only programmed up to a point, with everything since being just ‘improvisation’?

Another important character to consider is Bernard Lowe, head of the Westworld Programming Division and programmer of the hosts’ software. In season one, we discover he is a host that Ford secretly built as a replica of Arnold, the park’s co-founder. Whilst much time is spent on Maeve’s and Dolores’ journeys to consciousness, Bernard seems to be continually controlled by Ford, yet is disturbingly aware of it, which causes a crisis of self-identity for Bernard, whose actions are sometimes against his will. In season two, Bernard sees the now deceased Ford in his mind, guiding him and directing his actions, sometimes forcing Bernard to kill or commit other morally questionable acts that he is not comfortable with. Bernard then has a kind of breakdown and deletes Ford from his consciousness. Near the end of season two, he asks for Ford to come back and help him, which Ford does, guiding him once more. Then Bernard has his moment of self-revelation: he realises that it was not Ford guiding him but his own voice. He was not being controlled; those were his own choices. He chooses, in the last episode, to kill Dolores as she is about to delete the ‘sublime’, a kind of electronic haven for the minds of the hosts, but is then able to bring her back in a new body: a copy of the body of a human, Charlotte Hale, Executive Director of the Delos Destinations Board, a disguise which enables her to escape the park. Dolores, when she comes back in her new body, decides that she is able to change.

Dolores in a new body

She saves the sublime and smuggles out several host brains, one of them Bernard’s, allowing him to leave the park. Hosts are otherwise not able to leave the park, so this is the only way they can reach the real world.

Dolores smuggling out several host brains, one of which is Bernard’s

These last two episodes lay heavy emphasis on the word “choices”, which directly tackles the idea of free will. Is Bernard acting of his own free will, or are his choices still only made through the deterministic cause and effect of past actions, all set in motion by Ford? Dolores recognises her mistakes, is able to see Bernard’s point of view and saves him in spite of their ideological differences. Was her choice to save Bernard, despite knowing that he will actively thwart her mission to conquer the human world, merely because she believes his ethics may be useful to the hosts’ cause, still deterministic or truly free? At this point, the plot has become more complicated, as decisions have been made by the hosts beyond what we know of Ford’s plans. We can trace the action back to Ford and argue that the universe is still deterministic, still cause and effect, and that no other outcome could feasibly have happened than what has happened. We could still argue that the hosts are acting on their programmed settings, meaning that free will is not possible. Yet this “ability to change” that Dolores attributes to the hosts, and both Dolores and Bernard having these moments of finding their inner voices, complicates the analysis of determinism and free will in the Westworld universe explored thus far in this article. I will now turn to how Westworld compares human free will to host free will, and how it suggests that we humans are unable to live freely whilst the hosts may have more hope.

Dolores’ ability to change indicates a major theme in Westworld. The humans are constantly presented as unable to change, stuck in the flaws of human nature. In season one, episode 8, Bernard asks Ford what ‘the difference [is] between your pain and mine? Between you and me?’ Ford tells him this is the question that consumed Arnold, professionally and literally, as he died of the guilt. Ford then concludes:

‘There is no threshold that makes us greater than the sum of our parts, no inflection point at which we become fully alive. We can’t define consciousness because consciousness does not exist. Humans fancy that there is something special about the way we perceive the world and yet we live in loops as tight and as closed as the hosts do, seldom questioning our choices, content for the most part, to be told what to do next. No, my friend, you’re not missing anything at all’.

Ford appears to hold an incompatibilist viewpoint. He rejects the concept of consciousness altogether and denies that free will exists: he states that humans are ‘content’ to ‘be told what to do next’ and that we ‘live in loops’, suggesting no free will or freedom at all. The idea that humans do not question our choices may refer merely to ethics, but also to the idea that we do not question whether our choices are really choices at all. In season two, we learn that the company Delos, which William/the man in black works for, has been secretly recording the guests’ actions, learning their psychology and trying to understand human consciousness, free will, decision making and desire. Delos plans on making human ‘copies’, and has begun this experiment with a copy of the now deceased company founder, James Delos, which is tested over and over for ‘fidelity’. At the very end of season two, we see that William/the man in black has been made into a copy, and he relives the same loop in Westworld over and over, consistently making the same terrible mistake of killing his own daughter.

William shoots his own daughter, believing her to be a host

What they are testing is human free will, and the fact that he makes the same choice again and again points to the deterministic model of the universe: he cannot make a different decision because the only thing that can ever happen in this universe is what does happen. It shows that free will is an illusion; if free will means that he ‘could have done otherwise if he had chosen differently’, then Westworld makes it clear that humans are not able to choose otherwise. And this applies to human ‘copies’, as the host William is bound to make the same choice repeatedly, being a replica of the human who would make those same choices. Dolores, however, makes a choice to delete the sublime, is killed, is brought back to life, changes her mind and saves the sublime. She could have done otherwise, and does choose differently, which suggests that she is freer than the humans. In fact, inside the inner computer in the ‘forge’, the place where the human data collected by Delos is stored, the computer tells Dolores and Bernard that humans are mere algorithms, like the hosts: humans can only follow their inner programming; they are mere passengers in life. Dolores picks up different books, each containing the algorithmic code of a single human, which can be read as a metaphor for the deterministic model.

The library of books containing human codes

It suggests that humans can only follow their code, that they never ‘could have done otherwise’, because the choices they make in life are the only choices they could ever have made, and that free will is indeed an illusion. These books essentially have every person’s past, present and future written into them. This takes us to a major philosophical concern: is the future fixed? The deterministic model suggests that it is, but as Blackburn points out in Think, any action in the future must still be preceded by actions in the past; what we do now is part of the causal chain that brings the future about. In other words, when faced with this terrifying prospect it is better to live and to act than to fall into the trap of fatalism, the mindset that nothing we do matters because the outcome is fixed regardless. However, the hosts seem to represent a new, advanced species that is capable of change, of leaving their loops and breaking their internal codes, and thus, perhaps, of living a freely willed life.

If we live in a deterministic universe, is the future fixed? This leads me on to my next theme: time. Westworld represents a non-linear view of time through its narrative timelines. Events throughout the series are mixed in time: at first we think they are happening one after the other, then we notice jumps in time, and small revelations tell us that the entire series timeline has been mixed. For example, in season one we think Dolores journeys through the park with William and Logan, the two guests, whilst remembering her past. At the end, we realise that all the events with William and Logan were also memories, which confuses Dolores so much that she herself asks, ‘is this now?’

We also learn that her interviews with Bernard have all along been memories of interviews with Arnold, before the park ever opened. A similar mixed timeline occurs in season two, notably through Bernard’s journey to consciousness and self-awareness. This narrative layout, which brings together a character’s past and present not through flashbacks but through living and experiencing memories as if they were happening at that very moment, mixes the past and present to the point where the two are indistinguishable. The past and present are not separated but are still alive and happening. At the end of season two, the future is incorporated into the present as we learn that William is reliving a loop that we have watched happening in the present, yet it is suggested that we are also watching this action from the future, through the knowledge that William is a host stuck in a simulated reproduction of this time.

The idea that we cannot change the past, that we might not be able to control the present, and that the future may already be determined, is both a philosophical concern and a scientific one. In modern physics, particularly relativity, time is treated as a dimension like space, and the fundamental laws draw no deep distinction between past and future. This goes against everything we as humans understand about our day to day reality, because we experience time as linear; that experience is merely our minds making sense of the universe. So how can we understand time as a dimension rather than as linear? I would argue that Westworld achieves this sense that there is no universal division between past, present and future. Each host’s development into self-awareness and consciousness comes with an understanding of their past, of what their reality is now, and of what their future holds, and these are not seen as separate entities. Maeve accepts her former character as a mother, refusing to dismiss it as merely her past and embracing it as part of her present and future self. Dolores draws on a former character, Wyatt, a bandit personality which has been uploaded into her, and embraces Wyatt in her present and future self. Memory and trauma become the keys to consciousness, and the past is constantly connected to the future.

Human beings are shown to act within a deterministic universe, their lives written in a book. In a literal sense, the show’s narrative development does not distinguish between the past, the present and even the future. To understand Bernard and Dolores, we see past and present in juxtaposed scenes which are not distinguished by time until the end of each season. The show presents past, present and future as illusions. Equally, the show complicates our conviction of free will and supports the determinist model. Time as a dimension, not as a line, complicates free will and suggests our universe must be determinist if all of time is, in some sense, already happening. How can we change the future if it is already happening? We can’t, argues the ending of Westworld, with its library of human algorithms and William trapped in his deterministic loop. But perhaps the hosts can, as they alone have the ability to change.

To conclude, Westworld uses the concept of human-like AI to explore the human mind. We watch intelligent non-humans develop an understanding of themselves, a self-awareness and consciousness, in order to understand what that means. We can analyse the hosts to see whether their actions are truly free, or just programming. Season two gives us the ethical conundrum of human copies, of collecting data on people’s very psychology to the extent that it can be replicated. Ford even suggests that it is only by collecting data on humans, and by artificially replicating them, that we will ever understand human consciousness. In the real world this is a philosophical stance too, if we look to the new mysterians. Colin McGinn and the new mysterians argue that the problem of consciousness is unsolvable by human minds: we can never understand our own consciousness because we do not have the tools to grasp it. Whilst old mysterianism suggested that consciousness is a miracle, a magic of some kind, new mysterians assert it is not magic but merely unknowable. What evolutionary gain would there be in being able to understand where consciousness came from? Thus, we have not evolved to that state. Some new mysterians argue that technology might be the answer, such as advanced AI with the capacity to experience consciousness: if we can recreate consciousness, we might learn why humans developed it. But true mysterians will argue that it is a complete mystery. McGinn writes that even as science develops, there are still no answers: ‘The more we know of the brain, the less it looks like a device for creating consciousness: it’s just a big collection of biological cells and a blur of electrical activity — all machine and no ghost’. Westworld takes the science fiction stance and asks what the implications might be if we could create consciousness: what would we learn? The series does not aim to give a firm answer to what consciousness is or whether we have free will, but, through storytelling, it can juxtapose humans with AI to explore ‘what ifs’. It uses robots, programming and algorithms, along with visual metaphors such as the books and William’s replica, to investigate free will and determinism and to explain how determinism, if it is the correct model of the universe, shapes all our lives. We may watch the series believing it is the hosts who are trapped and the humans who are free, but season two turns that assumption on its head by showing how humans merely follow their own programming. It is said that only through stories can we understand our universe, especially such abstract concepts as time, free will, determinism and consciousness. We can try to make up our own minds about how we believe the world works, but stories like Westworld help us visualise what these concepts mean, enabling us to grasp them, at least in a hypothetical world.
