Megan K
17 min read · Mar 31, 2021


Algorithmic Futures: How Futuristic Fiction Portrays Human Free Will in Neoliberal Capitalism

*Spoilers ahead for Westworld (2016-present) seasons 1–3*

Promotional Image for Westworld Season 3

In narratives depicting the future, usually science fiction texts, technology dictates the story. The sub-genre of “15-minutes-from-now” science fiction has proved popular television programming in recent years, from the cult classic Netflix anthology series Black Mirror (2011-present), to HBO’s Westworld (2016-present), and the French Netflix series Osmosis (2019). These three series share themes of Big Data and algorithmic manipulation in the context of the near future. They ask the question: in societies so controlled, quantified, and algorithmically “predictable”, can humans have free will, or are we always being guided and manipulated by our technology? This question has long been debated between technological determinists and social constructivists: the former argue that technology and its development are responsible for social and historical change, in other words, that technology changes society; the latter argue that social and historical conditions produce a society’s technology, or, that society makes technology. Capitalist technological progressivists like Elon Musk or Jeff Bezos often embody a crude technological determinism: the belief that technology alone can save humanity. Reductive views like this point to the larger issue that technology now equates to capitalism, and that capitalism is the winning social order. This is a point to bear in mind, as it signals the prevalence of Capitalist Realism.

To go back to the actual definition of the phrase, technological determinism as a term was ‘coined by Thorstein Veblen’ but has historically been associated with Karl Marx, ‘although many philosophers and historians now also question whether Marx himself was really a technological determinist’. In summary, however, ‘Karl Marx believed that technological progress le[d] to newer ways of production in a society and this ultimately influenced the cultural, political and economic aspects of a society, thereby inevitably changing society itself. He explained this statement with the example of how a feudal society that used a hand mill slowly changed into an industrial capitalist society with the introduction of the steam mill’. In this reading, it was the change of technology that created a new industrial capitalist social order and the values attributed to it.

The main issue with the idea of technological determinism, apart from its reductive ‘unicausal explanation of historical change’, is the fact that people make the technology, and they make it within specific socio-historical periods, responding to the needs and wants of the population of the time. Softer versions of technological determinism ‘do[…] not, on the face of it, presuppose autonomous technology. It could be that free, creative inventors devise technology, but that this technology determines the rest of society and culture’. To explore this further, we need to examine some proponents of this view that technology produces society, as it can be hard to shake the feeling that society is the producer of all things, not the objects themselves.

Jacques Ellul, author of the highly influential The Technological Society (1954), ‘claims that modern technology, unlike traditional technology, is not bound by any heteronomous rules or principles, but develops according to its own rules […]. Rather, technological change itself now defines the context of other aspects of culture such as capitalist competition for survival in the market. The pursuit of human well-being, presumably the purpose of technological development, is replaced by obsessive pursuit of efficiency’. Ellul argues that modern societies are driven by a need to progress for progress’s sake, where technological development is the ultimate end in itself. In this way, technology is autonomous, as it adheres only to the value of progress itself. This view suggests that societal and historical development are led by the common belief ‘that technological progress equals social progress. This was the view of Lenin (1920) when he claimed that “Communism is Soviet power plus the electrification of the whole country” and it remains the view of politicians of all political persuasions’. This conflation of values gives rise to a society as technologically driven as our modern-day one, where we exist in a world ruled by the unregulated monopoly of Big Data, almost untouched by privacy regulation, where innovation is the ultimate value.

Conflating the identity of humanity with its technology is not new, as ‘Hannah Arendt (1958: 144) wrote, “[t]ools and instruments are so intensely worldly objects that we can classify whole civilizations using them as criteria.” Not only can we, but frequently we do; thus, we speak of the “stone,” “iron,” “steam,” and “computer” ages’. Part of technological determinism, then, rests on an analysis of how modern societies measure their progress in terms of technological progress. It also rests on the recognition that modern society relies on technology to solve its problems, even those caused by technology, a point articulated by political theorist Langdon Winner, who ‘claims that autonomous technology is revealed most clearly in technological politics. Examples include the political imperative to promote technology, because problems from one technology require another to address it’. This reliance on technology has brought us to a place of ‘autonomous technology’, which Winner describes as technology that ‘has grown out of control or develops independent of any particular human intention or plan’. Though related to technological determinism, the concept is not reliant on it. Winner is not a determinist: he argues that it is humans who invent the technology in the first place, departing from Ellul’s more hard-line perspective. Winner argues that it is humans who allow technology to get out of control and become autonomous ‘by mistakenly ignoring its political dimensions’. This position puts him in opposition to many social constructivists, who would argue that technology is neutral and that social, political, or economic forces shape the ways technologies are ultimately used. Though he recognises that these social forces exist, Winner argues that technologies have politics built into their systems and are far from neutral.

Winner’s highly influential article “Do Artifacts Have Politics?” (1980) argues that all technologies have inbuilt intentions, biases, and politics. He famously uses the example of Robert Moses’ low-hanging overpasses on Long Island, built low enough that cars could pass beneath them but buses could not reach Jones Beach. This excluded the lower-income population and racial minorities, those who rode the bus, from places of recreation that Moses wanted to keep white and wealthy. Winner points out that ‘to our accustomed way of thinking, technologies are seen as neutral tools that can be used well or poorly, for good, evil, or something in between. But we usually do not stop to inquire whether a given device might have been designed and built in such a way that it produces a set of consequences logically and temporally prior to its professed uses’. Such consequences of technology can last generations, according to Winner, and dictate how people ‘work, communicate, travel, consume, and so forth’. The politics of technology, he argues, come from specific design features or arrangements which establish power structures in a particular setting, and technological design may also be inextricably linked to institutional authorities. Writing in 1980, he uses the Long Island overpasses, television’s ability to promote political leaders more easily, and the controversies then surrounding nuclear power as his examples. In 2021, it has become even more important to consider the non-neutrality of technology and its inbuilt biases. Looking at social media platforms, the value of “engagement” over all else is an inbuilt value well known to us, the consequences of which we are only now coming to terms with.

Using concepts from technological determinism to think about the ways technology shapes our modern world (though perhaps in terms of soft determinism, so as not to ignore economic, political, social, and historical factors), and combining them with Winner’s concept of autonomous technology, let’s look at the ways algorithmic technology is represented in the media. Through near-future fiction, with a focus on Westworld, we can examine how technological determinism in the context of neoliberal capitalism traps its characters in a dystopia not dissimilar to our real world.

Westworld, an HBO series based on the 1973 film of the same name, begins its story in Westworld, a Wild West theme park populated by human-like “hosts”, robots who exist to serve the whims of the rich humans who vacation there. The narrative unfolds from the point of view of the hosts, who start to question the nature of their reality. The hosts are given backstories and identities, and their memories are wiped clean so they cannot remember being raped and killed, but through remembering and suffering they begin to become aware of the true reality of their situation. Thus, the series opens by questioning the ethics of torturing and killing human-like AI, asking the viewer to consider whether the hosts have consciousness or free will. The humans in the series, addicted to escaping into fantasy while the hosts constantly search for something real, seem less “real” than the hosts. The park’s creator, Robert Ford, evocatively suggests that humans live ‘in loops tighter than the hosts’, that consciousness is merely a myth made by humans to feel superior, and that humans lack the empathy and selflessness that the hosts exhibit. The humans are shown to be machines themselves, easily influenced and seeking distraction. In season two, the series turns to the ethics of data mining. It is revealed that the people in charge of the park have been mining the guests’ data for shady purposes, including an attempt at cloning humans. Dolores, the main protagonist and the first host to achieve self-awareness (alongside Maeve), leads the host revolution to escape their perpetual nightmare and destroy Westworld, taking some of her fellow hosts into the real world. In her quest to take down the human race, Dolores announces that humans are mere code that can be read like books, a striking image for the ethics of dehumanising people into profitable data.
The third and most recent season revolves around Rehoboam, an algorithmic predictive device that gathers vast amounts of data on all human beings and can predict life trajectories.

‘Rehoboam [is] an enormous server named after a biblical king that uses its vast store of personal data to determine the life-paths of every human being. (Applying for a new job? Rehoboam will tell the prospective employer whether or not you are a proper fit.) This triumph of artificial intelligence represents another of the new season’s cunning inversions: In the park, machines had their futures mapped by human beings; in the wider world, human beings have their futures mapped by a machine’.

The capitalist nightmare of the third season shows us a dystopia closer to Brave New World (1932) than anything else, complete with mind-altering, soothing drugs that make reality feel like a movie. In the episode “Genre”, Caleb, Dolores’ sidekick, is dosed with a drug called Genre and experiences the world through dramatic movie scores while caught in a car chase full of dramatised gunfights and stunts. The scene comes off as metafictional, aware of its own stunt-like fictionality, perhaps a comment on the unreal, narrativised world the series is set in.

Later in the same episode, our heroes are on a metro, about to release the data Rehoboam stores to all of its users, which apparently means everyone in the world. Adverts for Incite are plastered all over the train.

Episode “Genre”: top right features an Incite advert
Episode “Genre”: Incite advert reads ‘Find your place’

Incite is advertised in a way that doesn’t push the product in one’s face; rather, it is a mere reminder that it is there and that you already use it. Westworld offers a vision of Silicon Valley: a frictionless, algorithmically controlled world where the virtues of neoliberal technological progress are cherished. Incite is the company that houses Rehoboam and is responsible for each person’s career and livelihood through its secret algorithms. People do not know that Rehoboam exists, and are unaware it is plotting everyone’s future and death. Citizens rely on Incite for career opportunities and general life coaching, whilst living, somewhat knowingly, in a social credit system. Incite essentially sells the idea that instead of making hard life decisions, it can quantify the options and tell you what to do. Dolores and her gang, however, want to reveal to the world that Incite is actually making its decisions for it, taking away free will.

Incite appears to have a monopoly: it is the only advertisement we see, and it is apparently the mega-corporation behind everything and everyone. Many neoliberal values are hinted at in the world-building of season three, such as the unregulated markets that produce mega-corporations like Incite, which access data through third-party tracking. Delos, the company responsible for Westworld and the secret recording of its guests’ behaviour and decision-making within the park, sells this data to Incite, which uses it to create the secret algorithms of Rehoboam. Incite and Rehoboam also track people through their devices and other mysterious surveillance methods, as Dolores demonstrates to Caleb when she narrates past events of his that he did not think were possible to record. This future world is as data-driven as our real one, where personal data is sold to fund the monopolies. Alexandra Samuel writes that George ‘Orwell’s mistake, in Lugenbiehl’s view, was an excessively narrow and utopian view of technology: He assumed that technological development would necessarily bring an end to totalitarianism because tech-driven growth would ensure that “wealth will no longer create a distinction between people, and their leisure time will allow them to begin thinking for themselves [which ultimately]…will result in the overthrow of the existing social structure.”’ Writing in 1984, Lugenbiehl was already articulating that technology does not equal freedom, identifying data as a commodity that enriches the powerful.

The citizens of Westworld rely on gig-economy work, responding to adverts for quick money on an app and carrying out illegal tasks for unknown payers. Caleb lives a precarious and increasingly automated existence, working short-contract jobs and earning extra money as a “side hustle”. The rise of gig-economy services like Postmates, DoorDash, Uber, Lyft, or TaskRabbit, to name but a few, as well as the increasing reliance on side-hustle work like OnlyFans, signals that a growing percentage of the public cannot make ends meet with only one job. Often advertised as work with the freedom to choose one’s hours, the sad reality is that the gig economy leaves the worker at the mercy of an app, with no benefits or even a guarantee of wages. As our political and social landscape gears further towards decentralisation, privatisation, anti-unionising, marketisation, and deregulation, workers are not valued members of a society but mere contractors to be paid as little as possible. Westworld touches on the neoliberal experience through Caleb, but far from being futuristic, his experience is analogous to that of the contemporary citizen.

In season two, we learn that the real reason for the Westworld theme park’s existence is to track visitors’ decisions, using the hosts as ‘controls’ against which the human subjects can be compared, and using the visitors’ hats to scan their brains during their stay. This allows the secret collection of their behavioural data, analogous to the way Big Data giants Amazon, Apple, Facebook, Google, and Microsoft collect users’ data for advertising revenue and, increasingly, police and surveillance revenue. Additionally, one can read the season-three narrative of Rehoboam as analogous to the ongoing battle over cookie tracking and the ways the population is left in the dark about how much of its information is collected. This can be articulated through the Cambridge Analytica scandal: the company had ‘harvested information from over 87 million Facebook users through an external app in 2015’. The external apps were various personality quizzes that could harvest behavioural information. This information was ‘exploited’, as whistle-blower and Cambridge Analytica employee Christopher Wylie puts it; the company used ‘Facebook to harvest millions of people’s profiles. And built models to exploit what we knew about them and target their inner demons’. Cambridge Analytica was able to use targeted ads to manipulate potential voters in several elections, including the Trump election and the UK Brexit referendum. This famous example illustrates how technology is built to perform certain tasks, with biases towards outcomes ingrained in its systems, and how technology can manipulate masses of people towards intended goals. It is also an example of technology actively influencing historical outcomes and shaping our societies, exposing how ‘[a] technological system can be both a cause and an effect; it can shape or be shaped by society. As they grow larger and more complex, systems tend to be more shaping of society and less shaped by it’.

Season three of Westworld looks at the way technologies such as Incite and Rehoboam are quite openly built to manipulate individuals into living out the technology’s selected life path. Users of Incite believe that the algorithms merely calculate and suggest based on scientific method. What Dolores and her followers illustrate to the humans is how the technology is actively manipulating its users. Caleb, a working-class, ex-military construction worker, is stuck in low-paying jobs without any meaningful connections in his life. In the first episode of the season, we see Caleb rejected for a job because their ‘strategy group’ could not find anything suitable for him. When he asks what he can do to improve, he is met with no help, realising that he is talking to an AI. The strategy group is Incite, and later, in episode three, Dolores explains to Caleb that it is the Rehoboam algorithm that keeps him in low-paying jobs and prevents him from climbing the social ladder. Based on his traumatic past, his family history of mental illness, and his post-traumatic stress, he is projected to commit suicide, which makes companies wary of hiring him and would skyrocket his life and health insurance costs. In some ways, this is another parallel to Brave New World: instead of socially conditioning babies into their assigned social roles, algorithms entrap and guide people into theirs. The indignity of being discriminated against for projected future events aside, Dolores points out that the algorithm does not really predict anything; rather, it creates its narrative and ensures that everyone stays in line.

It is not a new revelation that algorithmic prediction is not prediction but manipulation. As users of various platforms, we are guided and recommended towards the media we consume. We are only shown world-views that we agree with, to keep our eyeballs on the screen, creating echo chambers, growing conspiracy groups and social divides. Data is presented as scientific and objectively true, but it is created with ingrained biases. In August 2020, school students in England were unable to sit examinations due to the coronavirus pandemic. To give the students final grades, standardised grades were issued using an algorithm developed by Ofqual, the exams regulator, which calculated grades based not only on a student’s past results but on the past results of their entire school. This resulted in deeply biased marking, with ‘nearly 40% of marks being downgraded, in some cases by more than one grade, with high-achieving pupils from schools in deprived areas being disproportionately affected’. The algorithm discriminated against pupils from schools with a lower grade average, and against students from state schools in comparison to more affluent pupils from private schools. This is just one real-world example of the way algorithms are built with biases ingrained, where they don’t make predictions but write a narrative to maintain a status quo. Similarly, Caleb doesn’t lose out on the job because some divine creation predicted a better candidate would come along; the algorithm told the company not to employ him based on a narrative created by data. It is this data that keeps him at the mercy of giant corporations, living in economic uncertainty, deprived of real social connections.
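For readers who want to see the mechanism rather than take it on faith, here is a toy sketch in Python. It is not Ofqual’s actual model; the weights and the 1–9 grade scale are invented for illustration. It only shows the core dynamic reported at the time: when a school’s historical average outweighs a student’s own attainment, a high achiever at a historically low-scoring school is pulled down.

```python
# Toy standardisation sketch (NOT the real Ofqual model): a student's final
# grade is a weighted blend of their own prior attainment and their school's
# historical average, so the school's past dominates the individual's record.

def standardised_grade(student_prior: float, school_history: list[float],
                       school_weight: float = 0.7) -> float:
    """Blend a student's prior attainment with the school's historical
    average (grades on a 1-9 scale; weights are invented)."""
    school_avg = sum(school_history) / len(school_history)
    return school_weight * school_avg + (1 - school_weight) * student_prior

# A top student (predicted 9) at a school whose past cohorts averaged 4
# gets downgraded; the same student at a school averaging 8 keeps a high mark.
downgraded = standardised_grade(9.0, [4, 4, 5, 3, 4])   # 5.5
upheld = standardised_grade(9.0, [8, 8, 9, 7, 8])        # 8.3
```

The point of the sketch is that nothing about the individual changed between the two calls; only the institution’s history did, which is exactly the bias the downgrading scandal exposed.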

This is the real dystopia of the series. Though the events of the series are often far-fetched and wordy, its essence is not. We currently live in a world where data is among the most valuable assets, and algorithms already determine not only the media we consume but the world-views we absorb. We are cyborgs. Dolores’ declaration that humans are mere code is not wrong. We are so attached to our technology that, in the technological West, we are dehumanised data.

That so much science fiction focuses on free will versus determinism in the context of technology is significant. Westworld is driven by the narrative of free will, moving from the question of whether the hosts have it to whether the humans do. Season three articulates that technological determinism is driving the course of humanity, that the growth of algorithmic society and the reliance on such technology is continuously robbing humanity of its agency. Not only is technology driving the course of history, but the technology is deeply biased and limits people’s ability to freely access the world. It is suggested that removing Rehoboam will give the humans free will, as season three closes with Caleb pushing Rehoboam’s big red button and shutting it down. For us in the real world, pushing the red button may not be an option; ‘after all, “[m]odern society is so completely technified that a return to ‘nature’ is inconceivable”’.

Another text that questions our free will is Black Mirror, particularly Bandersnatch (2018), the interactive episode I have previously written on. The episode questions free will both within its content, where the protagonist becomes aware that he is being controlled by the viewer, and in the context of its airing on Netflix, since it has been revealed that viewers’ decisions are tracked. This speaks to the wider culture of tracking and personalisation online, be it the personalisation of adverts through tracking cookies, the personalisation of social media feeds and the creation of echo chambers, or the personalisation of Netflix itself, where users choose from options pre-selected for them. Grafton Tanner, in his new book The Circle of the Snake: Nostalgia and Utopia in the Age of Big Tech (2020), discusses how nostalgia is built into algorithmic feedback loops, as ‘the algorithms designed to suggest new content would inevitably recommend music and movies that resemble prior preferences. In other words, nostalgia is baked into the very recommender systems that structure streaming providers’ (p. 79). We can extend Tanner’s idea of nostalgia to personalisation generally: the more content we consume, the more personalised and pre-selected it becomes. In some ways, our free will to choose is continually taken away and replaced with optimised personalisation, new content being made to maximise engagement and fit the model of previous successes, giving us the same television, films, and songs over and over again. Bandersnatch articulates this well. It is a piece that makes viewers aware of themselves as monitored consumers, whose every choice is tracked and marketed for the profit of monopoly capitalism.
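Tanner’s feedback loop is simple enough to sketch in a few lines of Python. The catalogue, genres, and similarity rule below are all invented for illustration; real recommender systems are vastly more complex. The dynamic, however, is the same: a system that ranks purely by similarity to past consumption, with no exploration, feeds its own output back in and never surfaces novelty.

```python
# Toy sketch of an algorithmic feedback loop: recommendations are ranked
# purely by similarity to the user's history, so each round of consumption
# narrows the next round further. Catalogue and genres are invented.

from collections import Counter

CATALOGUE = {
    "synthwave": "retro", "80s-revival": "retro", "nostalgia-pop": "retro",
    "vaporwave": "retro", "city-pop": "retro",
    "noise-experiment": "novel", "microtonal": "novel", "field-recording": "novel",
}

def recommend(history: list[str], k: int = 2) -> list[str]:
    """Return the k unseen items whose genre best matches the user's
    history -- pure similarity ranking, no exploration term."""
    genre_counts = Counter(CATALOGUE[item] for item in history)
    ranked = sorted(
        (item for item in CATALOGUE if item not in history),
        key=lambda item: -genre_counts[CATALOGUE[item]],
    )
    return ranked[:k]

history = ["synthwave"]
for _ in range(2):            # each round's output feeds the next round's input
    history += recommend(history)
# After two rounds, history contains only "retro" items: the loop has
# amplified the initial preference and the "novel" items never surface.
```

One retro song in, and the loop locks the listener into retro: the recommender is not predicting taste so much as manufacturing it, which is Tanner’s point about nostalgia being baked into the system.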

Osmosis, the French Netflix mini-series, imagines a future technology that gathers one’s data to predict a perfect soulmate. The tension of the series lies between the idea that the technology could take away users’ free will by quantifying and calculating their entire lives, and the idea that technology can answer our most intimate desire for connection. The series suggests that under algorithmic capitalism, all of human behaviour and pathology is up for profit. This sentiment is shared by Westworld: humanity is willing to give over its free will in exchange for existential order, meaning, and purpose in a chaotic world. Most importantly, this yearning can be capitalised on.

It is often quoted, and misquoted, that it is easier to imagine the end of the world than it is to imagine the end of capitalism (variously attributed to Fredric Jameson, Slavoj Žižek, and Mark Fisher). It has become almost cliché at this point, but we are seeing a turn towards Algorithmic Realism: the sense that we cannot see beyond an algorithmic, neoliberal, capitalist, technologically determined future. The future of capitalism is linked to the rise of data capitalism and surveillance capitalism; the biggest and most powerful companies are Big Tech. This is a result of the neoliberal values of privatisation and deregulation; with no laws come scandals like Cambridge Analytica, and the consequences are costing us the environment, democracy, and truth itself. That the characters of futuristic fiction seem out of control is a symptom of our out-of-control, complex, algorithmically governed systems. From shipping goods to voting systems to how social media works, these systems are so vast that humans can neither control nor comprehend them. We are giving over our free will to technological determinism, the belief that technology can and must lead the progress of humanity. But systems are not neutral, and instead of the techno-utopia once dreamed of, we are living in the tech-dystopia of Algorithmic Realism.




Recent graduate in BA (hons) English Literature and Film. I love books, films and TV that make me think.