04.14.15

“Feel Bad About Us”: Alex Garland Talks About The Real Questions Behind “Ex Machina” And Artificial Intelligence

Alex Garland discusses his first feature as a director, uneasy questions about humans and robots, and playing games with the audience.

Alex Garland’s screenplays (28 Days Later, Sunshine, Never Let Me Go, Dredd) confront audiences with body horror that is often visceral, sometimes existential, but always carefully written to flip filmgoers’ questions back on themselves. Garland’s latest script and his directorial debut, Ex Machina, is a science thriller asking the ultimate question about humans and our technology–namely, when will our technology become human? And in typical Garland style, there’s an unavoidable follow-up question: what do we mean by human?

Ex Machina opens with Caleb (Domhnall Gleeson), a programmer with Bluebook (a Google/Facebook analogue), winning a lottery to test out his mysterious employer’s ultra-secret project. Caleb arrives at Bluebook founder Nathan’s (Oscar Isaac) remote estate and, once Caleb signs an ominous NDA, the project is revealed: Ava, an android modeled on a beautiful young woman (Alicia Vikander) that Caleb must test to see whether “she” is believably human. As Caleb interviews Ava, he starts to question Nathan’s methods–and Nathan jabs right back asking Caleb how much Ava is manipulating him. Just who is testing whom is the film’s lingering question, from Nathan testing Caleb and Ava to writer-director Garland testing the audience.

Fast Company sat down with Garland to chat about what it takes to create a mind-bending thriller set just past tomorrow where our anxieties have birthed troubling tech realities. A note: moderate plot spoilers ahead.

Fast Company: Based on the reaction at SXSW (where the film premiered in the U.S.), it seems like people are excited to finally get a robot movie that talks about humanity without explosions.

Alex Garland: Yeah, (there’s) not much action. I had noticed in the last few years that TV drama–in particular, American TV drama–has underscored the areas in which cinema has been letting itself down, I think, in terms of adult drama. The adult drama on TV has just typically been better and braver. There are exceptions; there are people like the Coen brothers or Steven Soderbergh who do pretty amazing stuff in film but TV’s had a lot of the running since The Sopranos, really. I guess in some respects I was thinking, “Oh y’know, we can do dramas with people talking in rooms.”

You managed to create very, very convincing robots. How important was it to get a believable robot character?

Crucial. Absolutely crucial. That’s why we needed $15 million. If there wasn’t any need for that, this film could be shot for $3 million. There’s a bunch of things that you could call legs on a table: if it’s not there, it’s gonna fall down. One of them definitely is the performance, in terms of the actors. Without any question at all, one would be the VFX. The visual effects needed to be not just good, but at the level of any kind of film, regardless of its budget; otherwise we’d kind of be dead because audiences are just too sophisticated. They’ll cut you some slack to a degree, but only to a degree, and if you want to be seamless about it, then the VFX need to be seamless. Otherwise you’re kind of going with your hat in your hand and saying, “Oh please, forgive me” for this not being as good as it should be. And some people will do that. But for some people, the first time she appears on the screen, there would be something to get over, whereas with this particular group of guys doing VFX in the way they did it, there’s no problem, you can just enjoy what they’ve done.

It strikes me that this film is neither of the male leads’ story; this is a story of a twenty-something woman today.

There’s an element which is kind of tricky I guess, or illusory about that. For example, broadly speaking, the person you could say was the antagonist, the CEO of the company [Nathan], appears to be kind of a Bluebeard-type character who’s incredibly abusive to these notionally female robots in his castle and imprisons and abuses them. Is that what he’s actually like, or is he presenting himself in this way in order for his test to play out properly? Because it is important for his test that the person arriving to carry it out sees him as being sexually predatory, and as someone from whom Ava needs to be rescued, because she might quote unquote die left in his clutches. There’s a bunch of questions, but one of them is, is that what’s actually happening? Is he predatory in that way, or just presenting himself in that way? And another is, just on a base level–does she have a gender? Is it a “she” at all? What is this robot? Yes, she’s presented as externally having the characteristics of a woman in her early twenties, but she’s not a woman in her early twenties. And that’s pushing certain buttons in us because she looks like a girl in her early twenties and it’s pushing certain buttons in the protagonist–and maybe also in the antagonist. But what is the right way to actually view her? So there’s a whole bunch of floating questions.

As we learn in the film, Nathan has built Ava’s brain to be a massive collection of people’s questions–people’s “Bluebook” Internet searches. Then she gets into the real world and smiles. How much is she alive if she’s a series of human inquiries?

Is that what she is? I think one of the sort of remaining questions, as you get to the end of the film, is “what is actually going on inside her head?” I think what the film does is say “something is going on inside her head,” but it can’t really prescribe exactly what. Is she like us or not like us? There’s a sort of broader theme to that which is, “I don’t know–because how can I know?” But I suspect that when strong AIs eventually appear, they probably won’t be like us. They’ll be like themselves, but we can’t really conceptualize what they’ll be like. I’ve got two children. With both of them, I couldn’t really conceptualize that child before they arrived. And I had a completely concrete knowledge that, likely within the next two weeks, this child was going to get born–but I couldn’t really get my head around what they were. But the second they were born, you can get your head around them instantly in a way, and I think something like that will happen with AIs. So in terms of what she actually is, I’ve got my own set of thoughts about her thought process and how it works, but they’re kind of abstract guesses. All I can say for sure is that, in my opinion, she does have an internal life. In other words, she’s not just a simulation of a consciousness–she has a consciousness.

Which sounds like the Singularity–that at some point, she can function and learn from her own input.

Yeah, in the way that Singularity term gets applied to AI, then yes–this is the Singularity, this moment in the film.

Is that why there’s a floated question of what comes next after her?

Yeah, although again, I think it’s pretty much stated, in the conversation in the film where [Oscar Isaac’s character Nathan] says “Don’t feel bad about Ava. Feel bad about us.” There’s a continuum here. Effectively, in evolutionary terms, it’s never been the case that it’s a given that you get to stick around forever. You could be like Australopithecus or you could be like Neanderthal man, you could be an upright ape living in dust. Which could be talking about Australopithecus or it could be talking about us. And Ava is in some respects evolutionary because she comes from us, but she’s not us.

In that sense, she can play both male characters off each other, she can manipulate them into escape. She leaves Caleb (Domhnall Gleeson’s character) in the dust. That’s selective empathy. If the film is about tests, is that a test for the audience?

The film is playing a bunch of games with the audience. It’s partly saying, ‘what are your preconceptions about AI? What are your preconceptions about human consciousness?’ And here are some elements to consider with these two questions. But it’s also playing with prejudice as well. It’s playing with gender prejudice and other kinds of prejudice and it’s using that as the mechanics. In a funny kind of way, in order for the film to function in the way it’s supposed to function, the chances are the audience will have been prejudiced in some kind of way in order for it to work. It’s a dangerous game. I’m fucked in a way [laughs].

Just because you’re programming the movie for people and what you assume about them?

Well, for people…It’s also for myself. I’m just trying to take a cold look at some stuff and trying not to blink.

At your anxieties?

Yeah, but they’re not anxieties about AI. They’re anxieties about people. I don’t think there’s much anxiety about artificial intelligence in the film. I think there’s a lot of anxiety about humans.

There’s anxiety about how Oscar Isaac’s character Nathan treats women…

Yeah, there is, but that is also ambiguous as I said. Certainly no one is being invited to think that he’s treating these female-appearing robots in a good way. Right? The film is in no way inviting you to approve of these behaviors. What it is doing is questioning whether this is actually his behavior or if it’s a fake version according to the terms of his test. But it’s also possible because it’s something people do. It’s both at the same time. He’s caricaturing something that is actually there. He does want to subjugate and fuck these machines that he’s made look like girls in their early twenties.

You present these questions as something of a Rorschach Test, where people project things based on assumptions.

The only problem with calling it a test is that that then presupposes that there’s a right and a wrong answer, and that I’m the possessor of a right and a wrong answer as a sort of examiner. That I don’t think is right. What it really is, is that it’s got a set of questions that I personally find interesting. And then it has some implied answers, but people can take them or leave them, or not even bother to address the questions at all. The danger, from my point of view, and because some of the questions are quite provocative, is if the nature of the question is misunderstood and it feels like a position that the film is taking, as opposed to a question that the film is asking–those are very very different things. To make a sort of absurdist analogy, if you were to show a national socialist, a Nazi, in a film, does that mean it’s a Nazi film?

You’re afraid people will arrive at pat answers?

I know it has happened. At points I’ve been accused by some people of doing exactly what I’m trying not to do, but that is in the nature of putting a narrative out into the public domain. So I get that. It can be frustrating partly to be accused of not having thought about something that I’ve actually thought about really hard. Because there’s an assumption in the accusation that I have blindly and unwittingly fallen into certain kinds of tropes.

But you are making this film and trying to convey the implications of this technology for an audience. Don’t you have to write for their understanding of it?

You cannot, cannot, write clearly enough to do that. It’s impossible. You’ve got a Supreme Court in this country that is still in the business of doing this stuff, even when the people who wrote the stuff that it’s interpreting tried as best as they could to be as clear as possible. This can be with single lines of interpretable legal structure, let alone a flowing narrative with a bunch of deliberate ambiguities, you know? So to try and be clear in the context of deliberate ambiguities feels like a fool’s errand when you can’t even be clear, in a legalistic sense, when you’re going out of your way to attempt it.

You first programmed in BASIC and started making your own branching narrative games, correct?

Yeah, the ZX Spectrum, British home computer of the early 80s. Yeah, they were very simple text-based adventures. But also, the key was–and I don’t want to overstate my ability as a coder; I’ve a lousy ability as a coder–what the ZX Spectrum had was, burned into its keyboard, a set of BASIC language terms like ‘print’, so you’d hold down one key and press the ‘p’ button and that would be the print command. So it led you towards a simple understanding of BASIC, which is a brilliant thing by the designer Clive Sinclair to do. So yeah, there were some text-based adventures, but the real thing was much more simple than that. It’s the sort of stuff that anyone with a basic interest in coding does as a kid with their first computer, which is a sort of ‘Hello World’ type program: it says ‘Hi’ and then you can type back ‘Hello’ and it says ‘How are you?’ and you say ‘I’m fine, how are you?’ and then it says it’s fine, and it gives you this slightly electric sense that it’s alive when you run the program. But you know that it’s not alive because everything that it’s doing is what you told it to say. But there is this funny illusory moment, despite that knowledge. So yeah, it all stems back to that feeling that this little black keyboard might have a quality over and above what it actually had.
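The kind of scripted conversation Garland describes can be sketched in a few lines. This is a minimal, hypothetical reconstruction in Python rather than ZX Spectrum BASIC (the names and replies are invented for illustration): every response is a fixed lookup, which is exactly why the “aliveness” is an illusion.

```python
# A toy scripted "conversation" of the kind Garland describes writing in
# BASIC: every reply is pre-programmed, yet running it still gives a
# faint, illusory sense that the machine is responding.

REPLIES = {
    "hello": "How are you?",
    "i'm fine, how are you?": "I'm fine too.",
}

def respond(user_input: str) -> str:
    """Return the canned reply for a known prompt, or a fallback."""
    return REPLIES.get(user_input.strip().lower(), "I don't understand.")

def converse(user_lines):
    """Run the scripted exchange: the program 'speaks' first,
    then answers each user line from its fixed table."""
    transcript = ["Hi"]
    for line in user_lines:
        transcript.append(respond(line))
    return transcript
```

For example, `converse(["Hello", "I'm fine, how are you?"])` walks the one branch the author wrote; anything off-script falls through to the fallback reply, which is where the illusion breaks.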

So you wrote these branching narratives, and now you’re writing single-path narratives?

Single-path in terms of what I can write, but not single-path in terms of what other people bring to the narrative. Because they can lead it down some branch that I didn’t intend, or even know might be there. I’ve always had that happen with narratives that you offer out. Somebody will suddenly attribute a set of characteristics to one of the people in the film, a set of motivations behind some of their actions, that you simply had never considered. I just feel that it’s like a cork in the stream, you have to go with the flow. That’s what people do: you put a story out there and they’ll branch it. You do your branches and they’ll do theirs and the end result is the narrative. It’s a 50/50.

That’s ceding a lot of territory to the audience!

You could cede it or not cede it, it’s still going to happen. You could pretend it’s not happening. I think there are probably some stories where, I guess it’s a sliding scale where it’s more open or closed. I remember years and years ago reading this book, The Name of the Rose, where the guy spends an enormous amount of time from my point of view describing stuff that doesn’t need to be described. I felt that what he was doing was willfully closing down the options of my imagination. So I took a kind of opposite route. Say in the case of The Beach, a book I wrote in my mid-twenties, there would be a description of a really important character in the book, a French girl, and it would basically describe her as a beautiful French girl–and I wouldn’t think about her much more than that, like what her eyes were like or the way she picks up a glass or the color of her skin or the shape of her lips. You could do all of that but I just assume that people have their own sense of what a pretty French girl is like and they can introduce that into the narrative, no harm done.

What are you most afraid that people won’t come out of the movie with?

What I’m most afraid of is that they won’t come out with a sense that this is a thoughtful and reasonable discussion. That they’ll think this is a stupid, thoughtless, and unreasonable discussion, regarding whatever the thing is that they happen to be concerned about, whether that’s the nature of technology, or the future of AIs, or the nature of consciousness, or some issue to do with gender politics. They’ll think I just lazily slapped it all down instead of doing it to the best of my ability.