William Hurt is one of the most recognizable actors on the planet – and one of the best known mainstream stars to work a great deal in genre film. In AMC’s Humans (Sundays, 9/8C), he plays Dr. George Millican – one of the co-creators of the android/robot servants called Synths.
Hurt spoke with a group of journalists and bloggers about playing the character and what some of the ramifications of the existence of Synths might be – the ethical and practical possibilities.
Could you first talk about what it was that attracted you to this part and made you decide that you had to do it?
William Hurt: Well, initially it was just the title, because that’s my topic, you know. And then I realized that it was about human beings and machines but still titled Humans, which was intriguing. And it’s about a topic that I’ve been interested in most of my life.
And then I started reading it, and I realized it was full of character and good questions – like the nature of this interview today and the technology that we’re using to have it, which is so dislocating but at the same time pretty interesting. So this is an example of why the series interested me.
And out of curiosity, if Synths were real, do you think you would want one, or would you be too…
Hurt: Funny, I was just talking about that. I just had lunch with the writers and we remarked on how every single interviewer asks us that question. We don’t want to be boring, but it’s a hard thing to answer. We don’t want to answer the same thing every time. I certainly don’t. My answer is, I don’t know.
And I wouldn’t know without knowing a lot more. And I think that’s sort of the key here: to ask questions about this whole situation, in which human beings are incorporating technology, in the most literal sense, into their being.
Whether or not you would have a robotic in your home, and at what level – I should probably do this here – has a lot to do with what that robotic is and what it’s equipped to do.
What kind of relationship do you want to have with it? Those are questions I don’t have answers to yet. I’d have to interview…
The robotic?
Hurt: …you know, the respective employee.
Ever since Altered States, way back then, you’ve made a lot of movies that have had a science fiction element but where real science was at the core. Is that just a coincidence, or is it a subject you’ve always been interested in – how science relates to people and so on?
Hurt: Oh, no question. It’s been a fundamental interest of mine, the whole time, since I was young.
Okay, so talk a little bit more about that. What originally fascinated you about it, and what have you found fascinating as you’ve gotten into all these different roles?
Hurt: As I began to read science fiction – important science fiction, most especially Isaac Asimov – I began to realize that it wasn’t anywhere near as much fiction as people generally thought. It fired my imagination, you know, to red hot.
I just realized what they were talking about was anything but imaginary. And so I was enthralled and always have been.
Okay, and so particularly on this one, on Humans – because obviously it’s science and not very much fiction, since we’re very close to it – what parts of it particularly fascinate you?
Hurt: The thing about Humans that most interested me as a specific project was the stance from which the questions about the whole subject are posed. And the stance is our life today. So it’s not about the future being asked what it’s going to be from the future’s standpoint.
It’s the present being asked what the future’s going to be, by introducing that future to us now, to who we are now. So it really is a vivid way of posing the questions to viewers today.
What I mean is that we’re watching the television, and in that television is a family, and the family has a house, and in the house is a living room, and in walks the Synth. And that living room is like our living room. That kitchen is like our kitchen.
Those people are like our people, like us. And they’re going to ask the questions that we would ask if that happened right now. And that’s the most vivid way to pose questions about the help, the hindrance, the invasion, the furtherance of human beings.
What intrigued me about this show and your character is that he was a co-inventor of the Synths, but now he’s in a situation where he has almost an emotional attachment to one of the Synths, Odi, and then Vera comes into the picture. Describe to us what those relationships are and how your character’s life has really changed because of them.
Hurt: Well, George made a choice, an important, life-important choice, not to go forward with designing Synths – work he left prematurely. He was involved in the engineering of the mechanics of the bodies but not the so-called minds.
And what he did was make a choice to remain human in the most fragile sense of the word, the most vulnerable sense of the word, because he saw something in that experience – though it was fraught with the worst risk any of us face, the risk all of us face, mortality itself – of realizing the potential, his potential, as a human being.
So in other words, he went home and he lived with his wife. His wife passes away, and then he suffers an anomaly, a cerebral malfunction, and he loses some of his memory systems. That makes Odi, who was a robotic of the fundamental sort, not the sentient kind, part of what remains of the life that he had with his wife.

That robotic has all the memories of the events that took place while it was part of their life, and that becomes George’s relationship to his wife, because the Synth – and Synth means synthetic – remembers all those events in rudimentary fashion.
And that helps George continue the life of his relationship with his wife. And that’s why the emotional part exists. He knows that Odi is a machine, but he is also grateful to anything that helps keep his memories of his beloved alive.
And so he allows himself the responsible pleasure of projecting onto Odi some of those feelings, but at the same time he always differentiates between real and unreal.
So it’s an interesting question for all of us, you know, how much are we going to let ourselves feel about machinery when in fact the machinery is there to be an extension of a far more complex computer, which is the computer of our biochemistry, our bodies. It’s a big, big, big question.
Also can we, as humans, love a machine?
Hurt: I don’t think you can really love a machine in the way that you would love a human being, unless – and this may sound flippant to you, but I’ve used it in a couple of other interviews – unless the machine becomes human. So I think that’s our task.
If you want to have as fulfilling a relationship with a machine as you do with a human being, then you had better make sure that machine is as fulfilled, or potentially fulfilled, as a human being is, and that would be our task.
And I would love to be able to spend about ten minutes explaining what that means to me. I mean that if the machine were more human, it would make sense. And so, are we going to have the audacity to make machines more human, which means, of course, taking the very great risk of giving machines the capacity for suffering and surprise?
Yes, that’s great. I’m sure this series is going to bring up a lot of questions like these.
Hurt: I hope it does, and I hope it does for the American audience as it has already done so beautifully with the British, because they responded very, very loud and clear to it. And I’m hoping the Americans do the same – we Americans.
What was it you found challenging about portraying this character?
Hurt: I find it more challenging when I’m asked to play characters that aren’t so interesting, which I usually refuse. I can’t say that this was challenging, because I was so furiously in love with it, you know. So I just went to work very excited every day.
I didn’t feel challenged in the sense that I was worried or that it was an impediment. There was no impediment here, unless it was the standard impediment of not having enough time to prepare, which is a great one. So that would be the challenge.
The challenge would be the standard one of preparation, but in this particular case it was done in Britain, and with that comes the culture of theater, which I come from, so there was a lot more possible there for me, lots of levels of communication. Go ahead.
Was there anything then about Millican that you thought of that may not have originally been in the script for him, maybe something about his backstory?
Hurt: Yes, you know, you can add anything you want as long as it doesn’t contradict anything that’s there. That’s the rule. The rule is you can invent anything that doesn’t contradict the truth of the character as described.
And no character as it exists on the page, in any script I’ve ever read, is a large percentage of its potential, because in a good script they leave you creative room. So they didn’t write down his hairdo – I did that.
There are lots of things I invented about him, using my own personality traits and other ones that I invented for him, but I didn’t contradict anything on the page.
You’re such a fan of sci-fi, and you can spout off Asimov, even though your character, as a doctor, mainly worked on the hardware of the original ones.

How about this: you as a person or you as the character, either one – do you think the three laws of robotics that Asimov laid out are enough as machines develop, as you said, human qualities and become human?

Are those really enough, and is this show going to explore that – is your character going to personally explore it as kind of the creator of these Synths, whose very name gives them an elevation over robots or, I’d say, even androids?
Hurt: I don’t think the three are enough, no. I think they’re the most fundamental, general guidelines. I myself think that if Synths are to be allowed to become, or insist on becoming, sentient, that sentience will be a function of a consciousness that is in itself a function of very, very complex ethical interactions – ethics that are somehow or other transcribed into the root files of the hardware and software that go into the huge conference call of their mind, you know.
I think that the rudiments, the basic rules, are a great fundamental ethical guideline, but I think that the real interaction of ethics is as complex as a 1,500-year Buddhist conversation.
And I think that’s where consciousness actually comes from: every human being has thousands of voices in their mind and spirit, interacting to create, in a sense, the being of a human being.
And I think that if this singularity of sentience – whatever we’re somehow, I don’t know, (unintelligible) to really call it – takes place, it will take place as a matter of extraordinarily complex comprehension of all the interactive elements that go into the thing we call consciousness.
And I think that’s quite a long way off. I think it will include a perhaps as-yet-unforeseen dimension, or at least one unforeseen by some, which is that I believe the senses are very, very much a part of that interaction, that immense conversation that takes place within all beings.
And I think each one of those senses has, in an allegorical way, a mind of its own and comes to the party replete with its own genetic memory of everything that has happened through creation. So I think that smell and sight and sound will all be sitting at the great round table of consciousness.
As well as, you know, other theoretical components that may seem more complex but, in my understanding, are not. This thing that we’re calling the singularity, the technological singularity – there are other kinds of singularity.

They’re approaching, and I think the bare beginnings of the conversation about that are starting now, as the rudiments of an immature computer technology show us hints of the future that may be coming, or some of the vast questions about it that may be coming.
And so I think that as we map the mind – and by the mind I mean something very multidimensional in not only its physical pathways but its philosophical pathways – we’re going to run into a marvelous, demanding, challenging nest of components.
But I think Asimov, in his absolute brilliance, was able to reduce it to three principles that we can resonate with right now, and I’m glad for him. I’m glad for his existence. I’m grateful that I was alive to read him.
I’m grateful he had the imagination to give us a starting construct for something that he saw coming down the line way early on.
Hurt: Oh absolutely, without question I agree with that. You said it much more succinctly than I did, thank you.
Well, thank you, and it’s been an honor and a pleasure to talk to you. And I totally understand how it feels. I’ve been on your side of the table on these conference calls, trying to be human to people while you’re lined up in what I used to like to call the cattle call.
Hurt: Yes, but we’re doing okay so far.
I would like to ask real quickly: initially I was going to ask what drew you to play Dr. Millican, but I found an interview with you from a couple of years ago where you talked about how you don’t play people, you choose to go for the character.
Hurt: Right. I’m a character actor.
Would you elaborate on that statement now and how it pertains to your portrayal of Dr. Millican?
Hurt: Well, we were just talking about Asimov’s protocols and how they break down into three elegant, simple, vast ideas. I would add one more note to the comment that I made about character. I do go for character, but I go for the character as a function of the entire play, the entire screenplay.
So really, what I want when I’m reading a screenplay is to have the feeling, when I’m finished with it, that I would basically like to go and play any character they offer me, or even go for coffee on the film set. My feeling is that I want to be part of that project.
So that’s the first criterion for me: do I want to be part of the whole thing?
And then one last question real quick, kind of off topic: I understand that you’re a private pilot, sir?
Hurt: Yes, I was. I mean I haven’t flown for a while but I flew for about 30 years, yes.
Yes, sir. I was just wondering if any of that training ever helped you with your characters on screen?
Hurt: Yes, I mean, what helps you with your characters is inspiration in life. And the hobbies that I’ve chosen are the ones that connected me to life, and flying is certainly one of those things – it thrilled me for most of my life. I started out very young flying in unusual aircraft.
The first time I flew long distance was in 1951, when I flew from San Francisco to Hawaii in a plane called the Mars, which is larger than a 747. It was an amphibious airplane, prop driven, a double-decker with berths. It had been a military aircraft and then was converted to commercial use.
I flew in PBY Catalinas, DC-3s, DC-2s, DC-4s, you know, C-47s, all those things in the Pacific in the early ’50s. I think I was on the second transatlantic 707 flight, and I think I was on the sixth or seventh transatlantic Comet flight.
I’ve owned airplanes myself, starting with Cessna 180s. I had a part interest in a de Havilland Beaver. I had a Seneca V, I had a Cessna 206, a Bonanza, you know.
So I flew quite a bunch of stuff, and it inspired me no end to see the world from that point of view, from high up, but also doing the peacetime civilian job with the highest level of personal responsibility legally permitted.
Earlier you talked about how you got along well with the English because they basically have a theater background and so do you. I happen to live in Lansing, Michigan, where you got started in professional theater, so I just wanted to ask if you could elaborate a little bit more on how theater shaped you. I mean, you’ve done a zillion movies, but what is it about the theater start, at the BoarsHead or anywhere else, that kind of shapes you as an actor or a person?
Hurt: It’s great that you ask that question. Thank you very much for it. I don’t actually see myself as a film actor. I see myself as a theater practitioner. And I see all the different forms of expression in the theater as being what I do.
And it has lots of different parts, but it’s fundamentally the same basic art form, and I reduced it to its fundamental components a long time ago, the same way that Asimov tried to do with the protocols. And I see what I do as going to work at the art of the theater.
And that can happen in television. It can happen on film. It can happen on stage. It can happen lots of ways. But I do see it fundamentally as that art form.

The principles of drama, the principles of character development, and the principles of the relationship of character to destiny are its perennial questions, and those are the questions that interest me in life, and that’s why I do it.
And just one real brief one: they’ve referred to you as Sir William a couple of times, but you’re not really a sir, right, because you were born in America?
Hurt: No.
No, so does this happen to you a lot? Because obviously Sir John Hurt is a sir because he’s British, but does that happen to you a lot, where people confuse the two of you?
Hurt: They call me sir because I’m old.
But do you enjoy it sometimes when they do call you that, or not?
Hurt: Well, I mean, you know, if when they say sir they mean I’m a has-been, I don’t enjoy it. But when the sir is out of respect, I think that’s great.
Something that disturbs me a little bit about the show is, with your character and Vera, it’s almost like she’s not only running the household but running you a little bit. Talk about that aspect and how machines on the series play into that.
Hurt: I think it raises a big question about whether machines are going to be used to inflict a police state on us, I mean on the people who are not competent or in agreement with the use of them.
And in a police society, it is the responsibility of every civilian to resent the state over-controlling the individual’s existence, you know, in lots of different ways, at lots of different stages of life.
You know, just recently, for instance, in England, they agreed to allow three different portions of genetic coding to be included in one in vitro fertilization. That’s a law now. There are a lot of people protesting against that.
But the fight for it, which prevailed, was based on an argument that there would be less likelihood of painful mutation in the fertilized egg, because it reduced the chance of mutation, or handicap, by a very large percentage.
And at the same time, a lot of people are going, wait a second, are we going to be creating an animal husbandry state, a boutique genetics state, and are we allowing evolution, or, you know, God, to do the designing of our species?
So, you know, there are huge questions about this, and when she walks in the room she’s like the stereotype of the police state meddling in your life at a time when you’re losing some of your physical capacities but may not be losing them mentally.
Hurt: So, you know, I think the series is designed to raise a lot of questions that it doesn’t have the presumption to answer finally. I think that’s part of what makes it a good series.
Well I agree. I definitely had a 1984 moment with her a little bit.
Hurt: Yes, sure.
We’ve talked a lot about the show, Humans, and your character being an older, retired doctor with both physical and memory issues. I’ll get to the question here.
Do you think we’re going to have a conversation in this show – and of course you know better than we do – about the value of wisdom and age and how that reflects upon humanity?
And now, bringing in machines, which are all new and shiny, except yours, into that conflict we have of the new versus the old, especially in technology, in people and skill sets – how do you see that playing out for you?
Hurt: I don’t know how it’s going to go with the series, because I don’t know what they have planned for it. Anything this series does now or in the future, I’d want to be part of, if they ask me to be.
And I can’t imagine that the question you’re asking, which is an essentially important one, wouldn’t be part of the game. I can’t imagine that all-important question not being there. I think that’s really kind of what it’s all about.
Now, for you personally, do you think that’s the one thing that separates us from AIs today and always will, or will it not?
Hurt: Not necessarily always, if the components of consciousness and memory are brought together with as much reverence as nature created us with. If you ask carefully framed questions with a lot of information, you’re more likely to come up with something reasonable.
So I do think that the part of a potentially sentient machine that would probably be easier to accomplish than consciousness itself would be the library part, the history part, the access-to-information part.
I think that’s something that, right now, is more developed than the collation or the synthesis part – or, I mean, maybe synthesis is not the best word, but the collation or the interpolation part. In other words, I think accessing information is more possible at the moment than analyzing it well. I think the algorithms for analysis are the ones that you have to be most watchful about.
It’s as scary as big data is today.
Hurt: Yes, exactly right. And then you have this issue between notions of privacy, which wouldn’t necessarily be what someone in the NSA would be afraid it is, which is an indulgence, or a right for a few people to harm many people.
It could also be that without the essences known as privacy, you won’t be able to create the bubbles of quiet and freedom in which the human imagination can dare to go places it hasn’t been before. So that function of the notion of individuality hasn’t been talked about very much.
Most of the ones that are talked about are the ones that imply anarchic instincts or indulgent pleasures, the ones that are irreverent and irrelevant to human society.
But, you know, for that capacity to go where we haven’t been before, usually the vessel is a smaller vessel, the individual vessel, versus the mass level, which is a structural vessel.
They’re both necessary, but if a society is defined as security on the one hand and innovation on the other, or safety and love, love of the whole and love of the individual, I think you’re cutting off half of the horizon.