Using AVAnnotate to Teach Multimodal Rhetoric in the First-Year Writing Classroom

Annotating Radiolab's "Driverless Dilemma"

Below you'll find a media player featuring Radiolab's "Driverless Dilemma" podcast episode along with my annotations on it. You can navigate the annotations in a few different ways:

  1. Play the audio and follow along as the annotations auto-scroll according to the timestamps,
  2. Filter the annotations by their tags and categories to view specific annotations (e.g., select the "Rhetorical Situation" category to see all annotations in that group of tags, or select an individual tag, like "ethos," to view only the moments tagged as "ethos"), and
  3. Use the search bar to search terms in the annotations.

You can also explore my annotations by navigating to the project's Index, which will visualize all of the tags and categories across the media.


Annotations

00:00:15 - 00:00:15

Hey, it's Latif. Feels like all anyone is talking about these days is AI. We have a few new AI stories in the pipeline that I'm excited about, but in the meantime I wanted to play you this one. It's a rerun from the, you know, old Jad and Robert days, but it's actually not just a rerun, it's a rerun embedded in another rerun, which sounds confusing but it's actually fascinating because in addition to all the baseline interesting stuff in the episode about ethics and human nature, the experience of hearing these stacked reruns actually helps you feel the speed of technology.

Transcript
Latif

00:00:26 - 00:00:26

Like, how fast newfangled things become old hat, and then how despite that, despite all that rapid technological change, how the things we all need to think about, the kind of fundamental questions that we struggle with as humans pretty much stay the same. And that's—I don't know, that's sort of—it's kind of interesting to hear. And it's pretty humbling, actually. Take a listen, I hope you enjoy. This episode is called "The Driverless Dilemma."

Transcript
Latif

00:00:26 - 00:00:26

Intro music

Music

00:00:40 - 00:00:40

I'm Jad Abumrad.

Transcript
Jad
Communicator

00:00:46 - 00:00:46

I'm Robert Krulwich. And you know what this is.

Transcript
Robert
Communicator

00:01:36 - 00:01:36

This is Radiolab. [laughs]

Transcript
Jad

00:01:39 - 00:01:39

Yeah.

Transcript
Robert

00:01:40 - 00:01:40

Okay, so we're gonna play you a little bit of tape first just to set up the—what we're gonna do today. About a month ago, we were doing the thing about the fake news.

Transcript
Jad

00:03:06 - 00:03:06

Yeah, we were very worried about a lot of fake news—a lot of people are. But in the middle of doing that reporting, we were talking with a fellow from Vanity Fair.

Transcript
Robert

00:03:07 - 00:03:07

My name is Nick Bilton. I'm a special correspondent for Vanity Fair.

Transcript
Nick
Interviewee
Communicator

00:03:10 - 00:03:10

And in the course of our conversation, Nick—and this had nothing to do with what we were talking about, by the way—Nick just got into a sort of a—well, he went into a kind of nervous reverie, I'd say.

Transcript
Robert

00:03:13 - 00:03:13

Yeah, he was like, "You know, you guys want to talk about fake news, but that's not actually what's eating at me."

Transcript
Jad

00:03:16 - 00:03:16

The thing that I've been pretty obsessed with lately is actually not fake news, but it's automation and artificial intelligence and driverless cars. Because it's going to have a larger effect on society than any technology that I think has ever been created in the history of mankind. I know that's kind of a bold statement, but ...

Transcript
Nick
Interviewee
Message

00:03:21 - 00:03:21

[laughs]

Transcript
Robert

00:03:24 - 00:03:24

Quite bold!

Transcript
Jad

00:03:28 - 00:03:28

But you've got to imagine that—you know, that there will be in the next 10 years, 20 to 50 million jobs that will just vanish to automation. You've got, you know, a million truckers that will lose their jobs, the—but it's not—we think about, like, automation and driverless cars, and we think about the fact that they are going to—the people that just drive the cars, like the taxi drivers and the truckers, are gonna lose their jobs.

Transcript
Nick
Interviewee

00:03:36 - 00:03:36

What we don't realize is that there are entire industries that are built around just cars. So for example, if you are not driving the car, why do you need insurance? There's no parking tickets because your driverless car knows where it can and cannot park and goes and finds a spot and moves and so on. If there are truckers that are no longer using rest stops because driverless cars don't have to stop and pee or take a nap, then all of those little rest stops all across America are affected. People aren't stopping to use the restrooms. They're not buying burgers. They're not staying in these hotels, and so on and so forth.

Transcript
Nick
Interviewee
Logos

00:03:43 - 00:03:43

And then if you look at driverless cars to a next level, the whole concept of what a car is is going to change. So for example, right now a car has five seats and a wheel, but if I'm not driving, what's the point of having five seats and a wheel? You could imagine that you take different cars, so maybe when I was on my way here to this interview, I wanted to work out, so I called a driverless gym car. Or I have a meeting out in Santa Monica after this, and it's an hour, so I call a movie car to watch a movie on the way out there. Or office car, and I pick up someone else and we have a meeting on the way.

Transcript
Nick
Interviewee

00:03:46 - 00:03:46

And all of these things are gonna happen not in a vacuum, but simultaneously. This, you know—pizza delivery drivers are gonna be replaced by robots that will actually cook your pizza on the way to your house in a little box and then deliver it. And so kind of a little bit of a long-winded answer, but I truly do think that—that it's gonna have a massive, massive effect on society.

Transcript
Nick
Interviewee

00:03:57 - 00:03:57

Am I stressing you guys out? Are you—are you having heart palpitations over there?

Transcript
Nick
Interviewee

00:04:02 - 00:04:02

This is not good. This is not good.

Transcript
Robert

00:04:42 - 00:04:42

So that's a fairly compelling description of a—of a very dangerous future.

Transcript
Robert

00:04:43 - 00:04:43

Yes, but you know what? It's funny. One of the things that—I mean, we couldn't use that tape, initially at least.

Transcript
Jad

00:06:17 - 00:06:17

Right.

Transcript
Robert

00:06:20 - 00:06:20

But we kept thinking about it because it actually weirdly points us back to a story we did about a decade ago. The story of a moral problem that's about to get totally reimagined.

Transcript
Jad

00:06:26 - 00:06:26

It may be that what Nick is worried about and what we were worried about 10 years ago have now come dangerously close together.

Transcript
Robert

00:06:44 - 00:06:44

So what we thought we would do is we're—we're gonna play you the story as we did it then, sort of the full segment, and then we're gonna amend it on the back end. And by way of just disclaiming, this was at a moment in our development where there's just, like, way too many sound effects. It's just gratuitous.

Transcript
Jad

00:06:51 - 00:06:51

Well, you don't have to apologize for it. Those were great sound effects.

Transcript
Robert

00:07:07 - 00:07:07

No, I'm—I'm gonna apologize because there's just too much.

Transcript
Jad

00:07:08 - 00:07:08

[laughs]

Transcript
Robert

00:07:31 - 00:07:31

Just too much. And also, like, we—we talk about the MRI machine like it's this, like, amazing thing, when it was—it's sorta commonplace now. Anyhow, doesn't matter. We're gonna play it for you and then talk about it on the back end. This is—we start with a description of something called "the trolley problem." You ready?

Transcript
Jad

00:07:32 - 00:07:32

Yeah.

Transcript
Robert

00:07:44 - 00:07:44

All right. You're gonna hear some train tracks. Go there in your mind.

Transcript
Jad
Audience

00:07:45 - 00:07:45

Okay.

Transcript
Robert

00:08:00 - 00:08:00

[soft train horn sound effect]

Sound Effect

00:08:04 - 00:08:04

There are five workers on the tracks, working. They've got their backs turned to the trolley, which is coming in the distance.

Transcript
Jad

00:08:18 - 00:08:18

You mean they're repairing the tracks?

Transcript
Robert

00:08:24 - 00:08:24

They are repairing the tracks.

Transcript
Jad

00:08:31 - 00:08:31

This is unbeknownst to them, the trolley is approaching?

Transcript
Robert

00:08:33 - 00:08:33

They don't see it. You can't shout to them.

Transcript
Jad

00:08:42 - 00:08:42

Okay.

Transcript
Robert

00:08:45 - 00:08:45

And if you do nothing, here's what will happen: five workers will die.

Transcript
Jad
Pathos

00:08:59 - 00:08:59

Oh my God! [laughs] I—that was a horrible experience. I don't want that to happen to them.

Transcript
Robert

00:09:00 - 00:09:00

No, you don't. But you have a choice. You can do A) nothing. Or B) it so happens, next to you is a lever. Pull the lever, and the trolley will jump onto some side tracks where there is only one person working.

Transcript
Jad

00:09:09 - 00:09:09

So if the—so if the trolley goes on the second track, it will kill the one guy.

Transcript
Robert

00:09:12 - 00:09:12

Yeah, so there's your choice. Do you kill one man by pulling a lever, or do you kill five men by doing nothing?

Transcript
Jad

00:09:15 - 00:09:15

Well, I'm gonna pull the lever.

Transcript
Robert

00:09:17 - 00:09:17

Naturally. All right, here's part two. You're standing near some train tracks. Five guys are on the tracks, just as before. And there is the trolley coming.

Transcript
Jad

00:09:17 - 00:09:17

I hear the train coming in the—same five guys are working on the track?

Transcript
Robert

00:09:18 - 00:09:18

Same five guys.

Transcript
Jad

00:09:21 - 00:09:21

Backs to the train, they can't see anything?

Transcript
Robert

00:09:24 - 00:09:24

Yeah, yeah, exactly. However, I'm gonna make a couple changes. Now you're standing on a footbridge that passes over the tracks. You're looking down onto the tracks. There's no lever anywhere to be seen, except next to you, there is a guy.

Transcript
Jad

00:09:25 - 00:09:25

What do you mean, "there's a guy?"

Transcript
Robert

00:09:26 - 00:09:26

A large guy, large individual standing next to you on the bridge, looking down with you over the tracks. And you realize, "Wait, I can save those five workers if I push this man, give him a little tap."

Transcript
Jad

00:09:30 - 00:09:30

[laughs]

Transcript
Robert

00:09:39 - 00:09:39

He'll land on the tracks and stop the ...

Transcript
Jad

00:09:41 - 00:09:41

And he stops the train.

Transcript
Robert

00:09:51 - 00:09:51

[laughs] Right.

Transcript
Jad

00:10:04 - 00:10:04

Oh, yeah, I'm not gonna do that. I'm not gonna do that.

Transcript
Robert

00:10:05 - 00:10:05

But surely you realize that the math is the same.

Transcript
Jad

00:10:22 - 00:10:22

You mean, I'll save four people this way?

Transcript
Robert

00:10:31 - 00:10:31

Yeah.

Transcript
Jad

00:10:34 - 00:10:34

Yeah, but I'm—this time I'm pushing the guy. Are you insane? No.

Transcript
Robert

00:10:37 - 00:10:37

All right, here's the thing. If you ask people these questions—and we did—starting with the first.

Transcript
Jad

00:10:46 - 00:10:46

"Is it okay to kill one man to save five using a lever?" nine out of ten people will say ...

Transcript
Jad

00:10:47 - 00:10:47

Yes.

Transcript
Woman 1
Interviewee

00:10:52 - 00:10:52

Yes. [laughs]

Transcript
Woman 2
Interviewee

00:11:09 - 00:11:09

Yes.

Transcript
Woman 3
Interviewee

00:11:32 - 00:11:32

Yes.

Transcript
Woman 4
Interviewee

00:11:35 - 00:11:35

Yeah.

Transcript
Woman 5
Interviewee

00:11:42 - 00:11:42

But if you ask them, "Is it okay to kill one man to save five by pushing the guy?" nine out of ten people will say ...

Transcript
Jad
Logos

00:11:44 - 00:11:44

No.

Transcript
Woman 1
Interviewee

00:11:48 - 00:11:48

No.

Transcript
Woman 2
Interviewee

00:12:00 - 00:12:00

Never.

Transcript
Woman 3
Interviewee

00:12:01 - 00:12:01

No.

Transcript
Woman 4
Interviewee

00:12:06 - 00:12:06

No.

Transcript
Woman 5
Interviewee

00:12:13 - 00:12:13

It is practically universal. And the thing is if you ask people, "Why is it okay to murder"—because that's what it is—"murder a man with a lever and not okay to do it with your hands?" People don't really know.

Transcript
Jad

00:12:34 - 00:12:34

Pulling the lever to save the five—I don't know, that feels better than pushing the one to save the five. But I don't really know why, so that's a good—there's a good moral quandary for you. [laughs]

Transcript
Woman 1
Interviewee

00:12:41 - 00:12:41

And if having a moral sense is a unique and special human quality then maybe we—us two humans anyway, you and me ...

Transcript
Robert
Message

00:12:53 - 00:12:53

Yeah?

Transcript
Jad

00:12:55 - 00:12:55

... should at least inquire as to why this happens. And I happen to have met somebody who has a hunch. He's a young guy at Princeton University. Wild curly hair, bit of mischief in his eye. His name is Josh Greene.

Transcript
Robert

00:12:59 - 00:12:59

Alrighty.

Transcript
Josh Greene
Interviewee

00:13:11 - 00:13:11

And he spent the last few years trying to figure out where this inconsistency comes from.

Transcript
Robert

00:13:14 - 00:13:14

How do people make this judgment? Forget whether or not these judgments are right or wrong, just what's going on in the brain that makes people distinguish so naturally and intuitively between these two cases, which from an actuarial point of view, are very, very, very similar if not identical?

Transcript
Josh Greene
Interviewee
Audience

00:13:22 - 00:13:22

Josh is, by the way, a philosopher and a neuroscientist, so this gives him special powers. He doesn't sort of sit back in a chair, smoke a pipe and think, "Now why do you have these differences?" He says, "No, I would like to look inside people's heads, because in our heads we may find clues as to where these feelings of revulsion or acceptance come from." In our brains.

Transcript
Robert
Ethos

00:13:26 - 00:13:26

So we're here in the control room. Where you basically just see ...

Transcript
Josh Greene
Interviewee

00:13:30 - 00:13:30

And it just so happens that in the basement of Princeton, there was this, well ...

Transcript
Robert

00:13:37 - 00:13:37

A big circular thing.

Transcript
Robert

00:13:38 - 00:13:38

Yeah, it looks kind of like an airplane engine.

Transcript
Josh Greene
Interviewee

00:13:43 - 00:13:43

180,000-pound brain scanner.

Transcript
Robert

00:13:46 - 00:13:46

I'll tell you a funny story. You can't have any metal in there because of the magnet, so we have this long list of questions that we ask people to make sure they can go in. "Do you have a pacemaker? Have you ever worked with metal?" Blah, blah, blah, blah, blah ...

Transcript
Josh Greene
Interviewee

00:14:01 - 00:14:01

Have you ever worked with metal?

Transcript
Robert

00:14:04 - 00:14:04

Yeah, because you could have little flecks of metal in your eyes that you would never even know are there from having done metalworking. And one of the questions is whether or not you wear a wig or anything like that, because they often have metal wires in with that. And there's this very nice woman who does brain research here who's Italian, and she's asking her subjects over the phone all these screening questions ...

Transcript
Josh Greene
Interviewee

00:14:05 - 00:14:05

And so I have this person over to dinner. She says, "Yeah, you know, I ended up doing this study, but it asks you the weirdest questions. This woman's like, 'Do you have a hairpiece?' And—and I'm like, 'What does it have to do if I have herpes or not?'" [laughs] Anyway, and she said—you know, she asked, "Do you have a hairpiece?" But she—so now she asks people if you wear a wig or whatever.

Transcript
Josh Greene
Interviewee

00:14:12 - 00:14:12

Anyhow, what Josh does is he invites people into this room, has them lie down on what is essentially a cot on rollers, and he rolls them into the machine. Their heads are braced, so they're sort of stuck in there.

Transcript
Robert

00:14:14 - 00:14:14

Have you ever done this?

Transcript
Robert

00:14:19 - 00:14:19

Oh yeah. Yep, several times.

Transcript
Josh Greene
Interviewee

00:14:24 - 00:14:24

And then he tells them stories. He tells them the same two, you know, trolley tales that you told before.

Transcript
Robert

00:14:27 - 00:14:27

Mm-hmm.

Transcript
Jad

00:14:28 - 00:14:28

And then at the very instant that they're deciding whether I should push the lever or whether I should push the man, at that instant, the scanner snaps pictures of their brains. And what he found in those pictures was frankly, a little startling. He showed us some.

Transcript
Robert

00:14:31 - 00:14:31

All right, I'll show you some—some stuff. Okay, let me think.

Transcript
Josh Greene
Interviewee

00:14:32 - 00:14:32

The picture that I'm looking at is a sort of a—it's a brain looked at, I guess, from the top down?

Transcript
Robert

00:14:36 - 00:14:36

Yep, it's top-down. It's sort of sliced, you know, like—like a deli slicer.

Transcript
Josh Greene
Interviewee

00:14:45 - 00:14:45

And the first slide that he showed me was a human brain being asked the question, "Would you pull the lever?" And the answer in most cases was, "Yes."

Transcript
Robert

00:14:47 - 00:14:47

Yeah, I'd pull the lever.

Transcript
Man

00:14:53 - 00:14:53

When the brain's saying, "Yes," you'd see little kind of peanut-shaped spots of yellow.

Transcript
Robert

00:14:59 - 00:14:59

This little guy right here and these two guys right there.

Transcript
Josh Greene
Interviewee

00:15:03 - 00:15:03

The brain was being active in these places. And oddly enough whenever people said yes ...

Transcript
Robert

00:15:04 - 00:15:04

Yes. Yes.

Transcript
Woman

00:15:12 - 00:15:12

... to the lever question, the very same pattern lit up. Then he showed me another slide. This is a slide of a brain saying, "No."

Transcript
Robert

00:15:19 - 00:15:19

No, I would not push the man.

Transcript
Woman

00:15:22 - 00:15:22

"I will not push the large man." And in this picture ...

Transcript
Robert

00:15:23 - 00:15:23

This one we're looking at here, this ...

Transcript
Josh Greene
Interviewee

00:15:39 - 00:15:39

... it was a totally different constellation of regions that lit up.

Transcript
Robert

00:15:48 - 00:15:48

This is the "No, no, no" crowd.

Transcript
Robert

00:15:51 - 00:15:51

I think this is part of the "No, no, no" crowd.

Transcript
Josh Greene
Interviewee

00:15:55 - 00:15:55

So when people answer yes to the lever question, there are—there are places in their brain which glow?

Transcript
Jad

00:16:21 - 00:16:21

Right. But when they answer, "No, I will not push the man," then you get a completely different part of the brain lighting up.

Transcript
Robert

00:16:49 - 00:16:49

Even though the questions are basically the same?

Transcript
Jad

00:17:10 - 00:17:10

Mm-hmm.

Transcript
Robert

00:17:19 - 00:17:19

Well, what does that mean? What does Josh make of this?

Transcript
Jad

00:17:20 - 00:17:20

Well he has a theory about this.

Transcript
Robert

00:17:25 - 00:17:25

A theory—not proven, but I think—this is what I think the evidence suggests.

Transcript
Josh Greene
Interviewee

00:17:26 - 00:17:26

He suggests that the human brain doesn't hum along like one big, unified system. Instead, he says, maybe in your brain, every brain, you'll find little warring tribes, little subgroups. One that is sort of doing the logical sort of accounting kind of thing.

Transcript
Robert

00:17:27 - 00:17:27

You've got one part of the brain that says, "Huh, five lives versus one life? Wouldn't it be better to save five versus one?"

Transcript
Josh Greene
Interviewee

00:17:29 - 00:17:29

And that's the part that would glow when you answer, "Yes, I'd pull the lever."

Transcript
Robert

00:17:52 - 00:17:52

Yeah, I'd pull the lever.

Transcript
Man

00:18:15 - 00:18:15

But there's this other part of the brain which really, really doesn't like personally killing another human being, and gets very upset at the fat man case, and shouts, in effect ...

Transcript
Robert

00:18:29 - 00:18:29

No!

Transcript
Man 1

00:18:39 - 00:18:39

No!

Transcript
Man 2

00:18:41 - 00:18:41

It understands it on that level, and says ...

Transcript
Josh Greene
Interviewee

00:18:58 - 00:18:58

No!

Transcript
Man 1

00:19:09 - 00:19:09

No!

Transcript
Man 2

00:19:14 - 00:19:14

No. Bad! Don't do.

Transcript
Josh Greene
Interviewee

00:19:23 - 00:19:23

No, I don't think I could push... a person.

Transcript
Woman 1
Interviewee

00:19:41 - 00:19:41

No.

Transcript
Woman 2
Interviewee

00:20:00 - 00:20:00

Never.

Transcript
Woman 3
Interviewee

00:20:12 - 00:20:12

Instead of having sort of one system that just sort of churns out the answer and bing, we have multiple systems that give different answers, and they duke it out. And hopefully out of that competition comes morality.

Transcript
Josh Greene
Interviewee

00:20:19 - 00:20:19

This is not a trivial discovery, that you struggle to find right and wrong depending upon what part of your brain is shouting the loudest. This is—it's like bleachers morality.

Transcript
Robert

00:20:24 - 00:20:24

Do you buy this?

Transcript
Jad

00:20:30 - 00:20:30

Hmm. You know, I just don't know.

Transcript
Robert

00:20:36 - 00:20:36

Yeah.

Transcript
Jad

00:20:36 - 00:20:36

I've always kind of suspected that a sense of right and wrong is mostly stuff that you get from your mom and your dad and from experience, that it's culturally learned for the most part. Josh is kind of a radical in this respect. He thinks it's biological. I mean, deeply biological. That somehow we inherit from the deep past a sense of right and wrong that's already in our brains from the get-go, before Mom and Dad.

Transcript
Robert

00:20:38 - 00:20:38

Our—our primate ancestors, before we were full-blown humans, had intensely social lives. They have social mechanisms that prevent them from doing all the nasty things that they might otherwise be interested in doing. And so deep in our brain, we have what you might call basic primate morality. And basic primate morality doesn't understand things like tax evasion, but it does understand things like pushing your buddy off of a cliff.

Transcript
Josh Greene
Interviewee

00:20:43 - 00:20:43

Oh, so you're thinking then that the man on the bridge, that I'm on the bridge next to the large man, and I have hundreds of thousands of years of training in my brain that says, "Don't murder the large man."

Transcript
Robert

00:20:56 - 00:20:56

Right. Whereas ...

Transcript
Josh Greene
Interviewee

00:21:01 - 00:21:01

And even if I'm thinking, "If I murder the large man, I'm gonna save five lives and only kill the one man," but there's something deeper down that says, "Don't murder the large man."

Transcript
Robert

00:21:10 - 00:21:10

Right. Now that case, I think it's a pretty easy case. Even though it's five versus one, in that case, people just go with what we might call the "inner chimp." But there are other, but there ...

Transcript
Josh Greene
Interviewee

00:21:12 - 00:21:12

The "inner chimp" is your unfortunate way of describing an act of deep goodness.

Transcript
Robert

00:21:15 - 00:21:15

Right. Well, that's what's interesting.

Transcript
Josh Greene
Interviewee

00:21:28 - 00:21:28

It's the 10 Commandments, for God's sake! Inner chimp!

Transcript
Robert

00:21:31 - 00:21:31

Right. Well, what's interesting is that we think of basic human morality as being handed down from on high, and it's probably better to say that it was handed up from below, that our most basic core moral values are not the things that we humans have invented, but the things that we've actually inherited from other people. The stuff that we humans have invented are the things that seem more peripheral and variable.

Transcript
Josh Greene
Interviewee

00:21:38 - 00:21:38

But something as basic as, "Thou shalt not kill," which many people think was handed down in tablet form from a mountaintop from God directly to humans, no chimps involved ...

Transcript
Robert

00:21:44 - 00:21:44

Right.

Transcript
Josh Greene
Interviewee

00:22:56 - 00:22:56

... you're suggesting that hundreds of thousands of years of on-the-ground training have gotten our brains to think, "Don't kill your kin. Don't kill your ..."

Transcript
Robert

00:22:58 - 00:22:58

Right. Or at least, you know, that should be your default response. I mean, certainly chimpanzees are extremely violent and they do kill each other, but they don't do it as a matter of course. They, so to speak, have to have some context-sensitive reason for doing so.

Transcript
Josh Greene
Interviewee

00:23:09 - 00:23:09

So now we're getting to the rub of it. You think that profound moral positions may be somehow embedded in brain chemistry.

Transcript
Robert

00:23:12 - 00:23:12

Yeah.

Transcript
Josh Greene
Interviewee

00:23:19 - 00:23:19

And Josh thinks there are times when these different moral positions that we have embedded inside of us, in our brains, when they can come into conflict. And in the original episode, we went into one more story. This one, you might call the "Crying baby dilemma."

Transcript
Jad

00:23:21 - 00:23:21

The situation is somewhat similar to the last episode of M*A*S*H, for people who are familiar with that. But the way we tell the story, it goes like this: it's wartime ...

Transcript
Josh Greene
Interviewee

00:23:29 - 00:23:29

There's an enemy patrol coming down the road.

Transcript
Archive Clip
Sound Effect

00:23:33 - 00:23:33

You're hiding in the basement with some of your fellow villagers.

Transcript
Josh Greene
Interviewee

00:23:42 - 00:23:42

Let's kill those lights.

Transcript
Archive Clip
Sound Effect

00:23:44 - 00:23:44

And the enemy soldiers are outside. They have orders to kill anyone that they find.

Transcript
Josh Greene
Interviewee

00:23:46 - 00:23:46

Be quiet! Nobody make a sound until they've passed us.

Transcript
Archive Clip
Sound Effect

00:23:57 - 00:23:57

So there you are, you're huddled in the basement. All around you are enemy troops, and you're holding your baby in your arms, your baby with a cold, a bit of a sniffle. And you know that your baby could cough at any moment.

Transcript
Robert

00:24:03 - 00:24:03

If they hear your baby, they're gonna find you and the baby and everyone else, and they're gonna kill everybody. And the only way you can stop this from happening is cover the baby's mouth. But if you do that, the baby's going to smother and die. If you don't cover the baby's mouth, the soldiers are gonna find everybody and everybody's gonna be killed, including you, including your baby.

Transcript
Josh Greene
Interviewee
Pathos

00:24:05 - 00:24:05

And you have the choice. Would you smother your own baby to save the village, or would you let your baby cough, knowing the consequences?

Transcript
Robert

00:24:08 - 00:24:08

And this is a very tough question. People take a long time to think about it, and some people say yes, and some people say no.

Transcript
Josh Greene
Interviewee

00:24:17 - 00:24:17

Children are a blessing and a gift from God, and we do not do that to children.

Transcript
Woman 1
Interviewee

00:24:22 - 00:24:22

Yes, I think I would kill my baby to save everyone else and myself.

Transcript
Woman 2
Interviewee

00:24:35 - 00:24:35

No, I would not kill the baby.

Transcript
Man 1
Interviewee

00:24:36 - 00:24:36

I feel because it's my baby, I have the right to terminate the life.

Transcript
Woman 2
Interviewee

00:24:43 - 00:24:43

I'd like to say that I would kill the baby, but I don't know if I'd have the inner strength.

Transcript
Man 2
Interviewee

00:25:07 - 00:25:07

No. If it comes down to killing my own child, my own daughter or my own son, then I choose death.

Transcript
Man 3
Interviewee

00:25:28 - 00:25:28

Yeah. If you have to because it was done in World War II. When the Germans were coming around, there was a mother that had a baby that was crying, and rather than be found, she actually suffocated the baby, but the other people lived.

Transcript
Man 4
Interviewee

00:25:49 - 00:25:49

Sounds like an old M*A*S*H thing. No, you do not kill your baby.

Transcript
Woman 3
Interviewee

00:25:56 - 00:25:56

In the final M*A*S*H episode, the Korean woman who's a character in this piece, she murders her baby.

Transcript
Robert

00:25:59 - 00:25:59

She killed it. She killed it. Oh my God, oh my God! I didn't mean for her to kill it. [crying] I did not. I—I just wanted it to be quiet. It was, it was a baby. She—she smothered her own baby.

Transcript
Archive Clip
Sound Effect

00:26:10 - 00:26:10

What Josh did is he asked people the question, "Would you murder your own child?" while they were in the brain scanner. And at just the moment when they were trying to decide what they would do, he took pictures of their brains. And what he saw, the contest we described before, was global in the brain. It was like a world war. That gang of accountants, that part of the brain was busy calculating, calculating. "A whole village could die. A whole village could die."

Transcript
Robert

00:26:12 - 00:26:12

But the older and deeper reflex also was lit up, shouting, "Don't kill the baby! No, no! Don't kill the baby!"

Transcript
Robert

00:26:14 - 00:26:14

No!

Transcript
Man
Interviewee

00:26:16 - 00:26:16

Inside, the brain was literally divided: do the calculations, don't kill the baby. Do the calculations, don't kill the baby. Two different tribes in the brain literally trying to shout each other out. And Jad, this was a different kind of contest than the ones we talked about before. Remember before, when people were pushing a man off a bridge, overwhelmingly their brains yelled, "No, no! Don't push the man!" And when people were pulling the lever, overwhelmingly, "Yeah, yeah, pull the lever!"

Transcript
Robert

00:26:17 - 00:26:17

Right.

Transcript
Jad

00:26:24 - 00:26:24

There it was distinct. Here, I don't think really anybody wins.

Transcript
Robert

00:26:44 - 00:26:44

Well, who breaks the tie? I mean, they had to answer something, right?

Transcript
Jad

00:26:50 - 00:26:50

[laughs] Well, that's a good question!

Transcript
Robert

00:26:51 - 00:26:51

And now, is there a—what happens? Is it just two cries that fight each other out or is there a judge?

Transcript
Robert

00:27:11 - 00:27:11

Well, that's an interesting question. And that's one of the things that we're looking at.

Transcript
Josh Greene
Interviewee
Audience

00:27:17 - 00:27:17

When you are in this moment, with parts of your brain contesting, there are two brain regions ...

Transcript
Robert

00:27:26 - 00:27:26

These two areas here, towards the front ...

Transcript
Josh Greene
Interviewee

00:27:36 - 00:27:36

... right behind your eyebrows—left and right—that light up. And this is particular to us. He showed me a slide.

Transcript
Robert

00:27:37 - 00:27:37

It's those sort of areas that are very highly developed in humans as compared to other species.

Transcript
Josh Greene
Interviewee

00:27:38 - 00:27:38

So when we have a problem that we need to deliberate over, the light—the front of the brain, this is above my eyebrow, sort of?

Transcript
Robert

00:28:16 - 00:28:16

Yeah, right about there.

Transcript
Josh Greene
Interviewee

00:28:21 - 00:28:21

And there's two of them, one on the left and one on the right.

Transcript
Robert

00:28:29 - 00:28:29

Bilateral.

Transcript
Josh Greene
Interviewee

00:28:31 - 00:28:31

And they are the things that monkeys don't have as much of that we have?

Transcript
Robert

00:28:35 - 00:28:35

Certainly these parts of the brain are more highly developed in humans.

Transcript
Josh Greene
Interviewee

00:28:37 - 00:28:37

So looking at these two flashes of light at the front of a human brain, you could say we are looking at what makes us special.

Transcript
Robert

00:28:38 - 00:28:38

That's a fair statement.

Transcript
Josh Greene
Interviewee

00:28:42 - 00:28:42

A human being wrestling with a problem, that's what that is.

Transcript
Robert

00:28:47 - 00:28:47

Yeah, where it's both emotional, but there's also a sort of a rational attempt to sort of sort through those emotions. Those are the cases that are showing more activity in that area.

Transcript
Josh Greene
Interviewee

00:28:53 - 00:28:53

So in those cases when these dots above our eyebrows become active, what are they doing?

Transcript
Jad

00:28:56 - 00:28:56

Well, he doesn't know for sure, but what he found is in these close contests, whenever those nodes are very, very active, it appears that the calculating section of the brain gets a bit of a boost, and the visceral "inner chimp" section of the brain is kind of muffled.

Transcript
Robert

00:29:06 - 00:29:06

No! No. No ...

Transcript
Archive Clip
Man
Sound Effect

00:29:09 - 00:29:09

The people who chose to kill their children, who made what is essentially a logical decision, over and over, those subjects had brighter glows in these two areas and longer glows in these two areas. So there is a definite association between these two dots above the eyebrow and the power of the logical brain over the "inner chimp" or the visceral brain.

Transcript
Robert

00:29:11 - 00:29:11

Well, you know, that's the hypothesis. But it's gonna take a lot of more research to sort of tease apart what these different parts of the brain are doing, or if some of these are just sort of activating in an incidental kind of way. I mean, we really don't know. This is all—all very new.

Transcript
Josh Greene
Interviewee

00:29:12 - 00:29:12

Okay, so that was the story we put together many, many, many years ago, about a decade ago. And at that point, the whole idea of thinking of morality as kind of purely a brain thing, it was relatively new. And certainly, the idea of philosophers working with MRI machines, it was super new. But now here we are, 10 years later, and some updates. First of all, Josh Greene ...

Transcript
Jad

00:29:34 - 00:29:34

So in the long, long stream of time, I assume now you have three giraffes, two bobcats, and children?

Transcript
Robert

00:29:41 - 00:29:41

That's right. Yeah, so two kids, and we're close to adding a cat.

Transcript
Josh Greene
Interviewee

00:29:42 - 00:29:42

We talked to him again. He has started a family. He's switched labs from Princeton to Harvard. But that whole time, that interim decade, he has still been thinking and working on the trolley problem.

Transcript
Jad
Ethos

00:29:46 - 00:29:46

Did you ever write the story differently?

Transcript
Robert

00:29:48 - 00:29:48

Absolutely, so...

Transcript
Josh Greene
Interviewee

00:29:50 - 00:29:50

For years, he's been trying out different permutations of the scenario on people. Like, "Okay, instead of pushing the guy off the bridge with your hands, what if you did it, but not with your hands?"

Transcript
Jad

00:29:53 - 00:29:53

So in one version, we ask people about hitting a switch that opens a trapdoor on the footbridge and drops the person. In one version of that, the switch is right next to the person. In another version, the switch is far away. And in yet another version, you're right next to the person, and you don't push them off with your hands, but you push them with a pole.

Transcript
Josh Greene
Interviewee

00:29:53 - 00:29:53

Ooh!

Transcript
Robert

00:29:59 - 00:29:59

And to cut to the chase, what Josh has found is that the basic results that we talked about ...

Transcript
Jad

00:29:59 - 00:29:59

That's roughly held up.

Transcript
Josh Greene
Interviewee

00:30:02 - 00:30:02

It's still the case that people would like to save the most number of lives, but not if it means pushing somebody with their own hands—or with a pole, for that matter. Now here's something kind of interesting. He and others have found that there are two groups that are more willing to push the guy off the bridge: they are Buddhist monks and psychopaths.

Transcript
Jad

00:30:05 - 00:30:05

I mean, some people just don't care very much about hurting other people. They don't have that kind of an emotional response.

Transcript
Josh Greene
Interviewee

00:30:20 - 00:30:20

That would be the psychopaths, whereas the Buddhist monks presumably are really good at shushing their "inner chimp," as he called it, and just saying to themselves ...

Transcript
Jad

00:30:26 - 00:30:26

You know, I'm aware that this is—that killing somebody is a terrible thing to do. And I feel that, but I recognize that this is done for a noble reason, and therefore, it's—it's okay.

Transcript
Josh Greene
Interviewee

00:30:28 - 00:30:28

So there's all kinds of interesting things you can say about the trolley problem as a thought experiment, but at the end of the day, it's just that. It's a thought experiment. What got us interested in revisiting it is that it seems like the thought experiment is about to get real.

Transcript
Jad

00:30:32 - 00:30:32

That's coming up right after the break.

Transcript
Jad

00:30:52 - 00:30:52

Jad, Robert. Radiolab. Okay, so where we left it is that the trolley problem is about to get real. Here's how Josh Greene put it.

Transcript
Jad

00:30:59 - 00:30:59

You know, now as we're entering the age of self-driving cars, ah, this is like the trolley problem now finally come to life.

Transcript
Josh Greene
Interviewee

00:31:08 - 00:31:08

Oh, there's cars coming! Oh!

Transcript
Archive Clip
Sound Effect
Pathos

00:31:15 - 00:31:15

The future of the automobile is here.

Transcript
Archive Clip
Sound Effect

00:31:20 - 00:31:20

Oh, there's cars! Ah!

Transcript
Archive Clip
Sound Effect
Pathos

00:31:30 - 00:31:30

Autonomous vehicles. It's here.

Transcript
Archive Clip
Sound Effect

00:31:40 - 00:31:40

The first self-driving Volvo will be offered to customers in 2021.

Transcript
Archive Clip
Sound Effect

00:31:45 - 00:31:45

Ah! Ah! Oh, where's it going?

Transcript
Archive Clip
Sound Effect
Pathos

00:31:46 - 00:31:46

This legislation is the first of its kind, focused on the car of the future that is more of a supercomputer on wheels.

Transcript
Archive Clip
Sound Effect

00:31:48 - 00:31:48

Oh! Oh, there's a car coming!

Transcript
Archive Clip
Sound Effect
Pathos

00:31:49 - 00:31:49

Okay, so self-driving cars, unless you've been living under a muffler, they are coming. It's gonna be a little bit of an adjustment for some of us.

Transcript
Jad

00:32:09 - 00:32:09

Ah!

Transcript
Archive Clip
Sound Effect
Pathos

00:32:12 - 00:32:12

Hit the brakes! Hit the brakes!

Transcript
Archive Clip
Sound Effect
Pathos

00:32:15 - 00:32:15

No.

Transcript
Archive Clip
Sound Effect

00:32:16 - 00:32:16

But what Josh meant when he said it's the trolley problem ...

Transcript
Jad

00:32:23 - 00:32:23

... come to life ...

Transcript
Josh Greene
Interviewee

00:32:30 - 00:32:30

... is basically this. Imagine this scenario ...

Transcript
Jad

00:32:30 - 00:32:30

The self-driving car now is headed towards a bunch of pedestrians in the road. The only way to save them is to swerve out of the way, but that will run the car into a concrete wall and it will kill the passenger in the car. What should the car do? Should the car go straight and run over, say, those five people, or should it swerve and—and kill the one person?

Transcript
Josh Greene
Interviewee

00:32:31 - 00:32:31

That suddenly is a real-world question.

Transcript
Jad

00:32:33 - 00:32:33

If you ask people in the abstract ...

Transcript
Josh Greene
Interviewee
Audience

00:32:37 - 00:32:37

Like, what, theoretically, should a car in this situation do?

Transcript
Jad

00:32:38 - 00:32:38

They're much more likely to say ...

Transcript
Josh Greene
Interviewee

00:32:45 - 00:32:45

I think you should sacrifice one for the good of the many.

Transcript
Woman 1
Interviewee
Logos

00:32:57 - 00:32:57

They should just try to do the most good or avoid the most harm.

Transcript
Josh Greene
Interviewee

00:32:58 - 00:32:58

So if it's between one driver and five pedestrians ...

Transcript
Jad
Logos

00:33:01 - 00:33:01

Logically, it would be the driver.

Transcript
Man 1
Interviewee
Logos

00:33:06 - 00:33:06

Kill the driver.

Transcript
Woman 2
Interviewee
Logos

00:33:10 - 00:33:10

Be selfless.

Transcript
Man 2
Interviewee

00:33:11 - 00:33:11

I think it should kill the driver.

Transcript
Woman 3
Interviewee

00:33:20 - 00:33:20

But when you ask people, forget the theory ...

Transcript
Jad

00:33:22 - 00:33:22

Would you want to drive in a car that would potentially sacrifice you to save the lives of more people in order to minimize the total amount of harm? They say ...

Transcript
Josh Greene
Interviewee
Logos

00:33:43 - 00:33:43

No. I wouldn't buy it.

Transcript
Man 1
Interviewee

00:33:54 - 00:33:54

No. Absolutely not.

Transcript
Man 2
Interviewee

00:33:59 - 00:33:59

That would kill me in it? No.

Transcript
Woman 1
Interviewee

00:34:02 - 00:34:02

So I'm not gonna—I'm not gonna buy a car that's gonna purposely kill me.

Transcript
Man 3
Interviewee

00:34:07 - 00:34:07

Hell no. I wouldn't buy it.

Transcript
Man 4
Interviewee

00:34:17 - 00:34:17

[laughs] For sure, no. [laughs]

Transcript
Woman 2
Interviewee

00:34:26 - 00:34:26

I'd sell it, but I wouldn't buy it.

Transcript
Man 4
Interviewee

00:34:28 - 00:34:28

So there's your problem: people would sell a car—and an idea of moral reasoning—that they themselves wouldn't buy. And last fall, an exec at Mercedes Benz face-planted right into the middle of this contradiction.

Transcript
Jad

00:34:32 - 00:34:32

Welcome to Paris, one of the most beautiful cities in the world. And welcome to the 2016 Paris Motor Show, home to some of the most beautiful cars in the world.

Transcript
Archive Clip
Sound Effect

00:35:03 - 00:35:03

Okay, October 2016, the Paris Motor Show. You had something like a million people coming in over the course of a few days. All the major car-makers were there.

Transcript
Jad

00:35:06 - 00:35:06

Here is Ferrari. You can see the LaFerrari Aperta, and of course the new GTC4Lusso T.

Transcript
Archive Clip
Sound Effect

00:35:10 - 00:35:10

Everybody was debuting their new cars, and one of the big presenters in this whole affair was this guy ...

Transcript
Jad

00:35:33 - 00:35:33

In the future, you'll have cars where you don't even have to have your hands on the steering wheel anymore, but maybe you watch a movie on the head-up display or maybe you want to do your emails. That's really what we are striving for.

Transcript
Archive Clip
Sound Effect

00:35:50 - 00:35:50

This is Christoph von Hugo, a senior safety manager at Mercedes Benz. He was at the show sort of demonstrating a prototype of a car that could sort of self-drive its way through traffic.

Transcript
Jad

00:36:00 - 00:36:00

In this E-Class today, for example, you've a maximum of comfort and support systems.

Transcript
Archive Clip
Sound Effect
Christoph von Hugo

00:36:03 - 00:36:03

You'll actually look forward to being stuck in traffic jams, won't you?

Transcript
Archive Clip
Sound Effect

00:36:06 - 00:36:06

Of course, of course.

Transcript
Archive Clip
Sound Effect
Christoph von Hugo

00:36:19 - 00:36:19

[laughs]

Transcript
Archive Clip
Sound Effect

00:36:22 - 00:36:22

He's doing dozens and dozens of interviews through the show, and in one of those interviews—unfortunately, this one we don't have on tape—he was asked, "What would your driverless car do in a trolley problem-type dilemma, where maybe you have to choose between one or many?" And he answered, quote ...

Transcript
Jad

00:36:28 - 00:36:28

If you know you can save one person, at least save that one.

Transcript
Michael Taylor
Interviewee

00:36:37 - 00:36:37

If you know you can save one person, save that one person.

Transcript
Jad

00:36:42 - 00:36:42

Save the one in the car.

Transcript
Michael Taylor
Interviewee

00:36:56 - 00:36:56

This is Michael Taylor, correspondent for Car and Driver magazine. He was the one that Christoph von Hugo said that to.

Transcript
Jad

00:37:03 - 00:37:03

If you know for sure that one thing, one death can be prevented, then that's your first priority.

Transcript
Michael Taylor
Interviewee

00:37:06 - 00:37:06

Now when he said this to you ...

Transcript
Amanda Aronczyk
Interviewee
Producer

00:37:10 - 00:37:10

This is producer Amanda Aronczyk.

Transcript
Jad

00:37:22 - 00:37:22

... did it seem controversial at all in the moment?

Transcript
Amanda Aronczyk
Interviewee
Producer

00:37:26 - 00:37:26

In the moment, it seemed incredibly logical.

Transcript
Michael Taylor
Interviewee

00:37:36 - 00:37:36

I mean, all he's really doing is saying what's on people's minds, which is that ...

Transcript
Jad

00:37:54 - 00:37:54

No.

Transcript
Man 1
Interviewee

00:37:59 - 00:37:59

I wouldn't buy it, personally.

Transcript
Man 2
Interviewee

00:38:00 - 00:38:00

Who's gonna buy a car that chooses somebody else over them? Anyhow, he makes that comment, Michael prints it, and a kerfuffle ensues.

Transcript
Jad
Logos

00:38:03 - 00:38:03

"Save the one in the car." That's Christoph von Hugo from Mercedes ...

Transcript
News Clip
Sound Effect

00:38:05 - 00:38:05

But then when you lay out the questions, you sound like a bit of a heel because you want to save yourself as opposed to the pedestrians.

Transcript
Archive Clip
Sound Effect

00:38:10 - 00:38:10

Doesn't it ring, though, of, like, just privilege, you know?

Transcript
Archive Clip
Sound Effect

00:38:12 - 00:38:12

It does. Yeah, it does.

Transcript
Archive Clip
Sound Effect

00:38:16 - 00:38:16

Wait a second. What would you do? It's you or a pedestrian. And it's just—you know, I don't know anything about this pedestrian. It's just you or a pedestrian, just a regular guy walking down the street.

Transcript
Archive Clip
Sound Effect

00:38:43 - 00:38:43

Ah, screw everyone who's not in a Mercedes!

Transcript
Archive Clip
Sound Effect

00:39:34 - 00:39:34

And there was this kind of uproar about that—how dare you drive these selfish—you know, make these selfish cars? And then he walked it back, and he said, "No, no, what I mean is that just, that we—that we have a better chance of protecting the people in the car, so we're going to protect them because they're easier to protect." But of course, you know, there's always gonna be trade-offs.

Transcript
Josh Greene
Interviewee

00:39:35 - 00:39:35

Yeah.

Transcript
Robert

00:39:44 - 00:39:44

And those trade-offs could get really, really tricky and subtle. Because obviously, these cars have sensors.

Transcript
Jad

00:39:44 - 00:39:44

Sensors like cameras, radars, laser, and ultrasound sensors.

Transcript
Raj Rajkumar
Interviewee

00:39:51 - 00:39:51

This is Raj Rajkumar. He's a professor at Carnegie Mellon.

Transcript
Jad

00:39:52 - 00:39:52

I'm the co-director of the GM-CMU Connected and Autonomous Driving Collaborative Research Lab.

Transcript
Raj Rajkumar
Interviewee
Ethos
Communicator

00:39:54 - 00:39:54

He is one of the guys that is writing the code that will go inside GM's driverless car. He says yeah, the sensors at the moment on these cars ...

Transcript
Jad
Ethos

00:40:06 - 00:40:06

Still evolving.

Transcript
Raj Rajkumar
Interviewee

00:40:29 - 00:40:29

... pretty basic.

Transcript
Jad

00:40:33 - 00:40:33

We are very happy if today it can actually detect a pedestrian, can detect a bicyclist, a motorcyclist. Different makers have different shapes, sizes and colors.

Transcript
Raj Rajkumar
Interviewee

00:40:36 - 00:40:36

But he says, it won't be long before ...

Transcript
Jad

00:40:43 - 00:40:43

You can actually know a lot more about who these people are.

Transcript
Raj Rajkumar
Interviewee

00:40:48 - 00:40:48

Eventually they will be able to detect people of different sizes, shapes, and colors. Like, "Oh, that's a skinny person, that's a small person, tall person, Black person, white person. That's a little boy, that's a little girl."

Transcript
Jad

00:40:50 - 00:40:50

So forget the basic moral math. Like, what does a car do if it has to decide oh, do I save this boy or this girl? What about two girls versus one boy and an adult? How about a cat versus a dog? A 75-year-old guy in a suit versus that person over there who might be homeless? You can see where this is going. And it's conceivable that cars will know our medical records, and back at the car show ...

Transcript
Jad

00:41:07 - 00:41:07

We've also heard that term, "car-to-car communication."

Transcript
Archive Clip
Sound Effect
Interviewer

00:41:09 - 00:41:09

Well, that's also one of the enabling technologies in highly-automated driving.

Transcript
Archive Clip
Sound Effect
Christoph von Hugo

00:41:10 - 00:41:10

Mercedes guy basically said in a couple of years, the cars will be networked. They'll be talking to each other. So just imagine a scenario where, like, cars are about to get into accidents, and right at the decision point, they're, like, conferring. "Well, who do you have in your car?" "Me, I got a 70-year-old Wall Street guy, makes eight figures. How about you?" "Well, I'm a bus full of kids. Kids have more years left. You need to move." "Well, hold up. I see that your kids come from a poor neighborhood and have asthma, so I don't know."

Transcript
Jad

00:41:15 - 00:41:15

So you can basically tie yourself up in knots, wrap yourself around an axle. We do not think that any programmer should be given this major burden of deciding who survives and who gets killed. I think these are very fundamental, deep issues that society has to decide at large. I don't think a programmer eating pizza and sipping Coke should be making the call.

Transcript
Raj Rajkumar
Interviewee

00:41:26 - 00:41:26

[laughs] How does society decide? I mean, help me imagine that.

Transcript
Jad

00:41:28 - 00:41:28

I think it really has to be an evolutionary process, I believe.

Transcript
Raj Rajkumar
Interviewee
Message

00:41:29 - 00:41:29

Raj told us that two things basically need to happen. First, we need to get these robocars on the road, get more experience with how they interact with us human drivers and how we interact with them. And two, there need to be, like, industry-wide summits.

Transcript
Jad

00:41:29 - 00:41:29

No one company is going to solve that.

Transcript
Archive Clip
Sound Effect
Bill Ford Jr.

00:41:45 - 00:41:45

This is Bill Ford Jr. of the Ford company giving a speech in October of 2016 at the Economic Club of DC.

Transcript
Jad

00:41:54 - 00:41:54

And we have to have—because could you imagine if we had one algorithm and Toyota had another and General Motors had another? I mean, it would be—I mean, obviously you couldn't do that.

Transcript
Archive Clip
Sound Effect
Bill Ford Jr.

00:41:57 - 00:41:57

Because, like, what if the Tibetan cars make one decision and the American cars make another?

Transcript
Jad

00:42:05 - 00:42:05

So we need to have a national discussion on ethics, I think, because we've never had to think of these things before, but the cars will have the time and the ability to do that.

Transcript
Archive Clip
Sound Effect
Bill Ford Jr.

00:42:22 - 00:42:22

[speaking German]

Transcript
Archive Clip
Sound Effect
German Speaker

00:42:22 - 00:42:22

So far, Germany is the only country that we know of that has tackled this head-on.

Transcript
Jad

00:42:28 - 00:42:28

[speaking German]

Transcript
Archive Clip
Sound Effect
German Speaker

00:42:55 - 00:42:55

One of the most significant points the ethics commission made is that autonomous and connected driving is an ethical imperative.

Transcript
Archive Clip
Sound Effect
Translator

00:42:58 - 00:42:58

They—the government has released a code of ethics that says, among other things, that self-driving cars are forbidden to discriminate between humans in almost any way—not on race, not on gender, not on age, nothing.

Transcript
Jad

00:43:25 - 00:43:25

These shouldn't be programmed into the cars.

Transcript
Archive Clip
Sound Effect
Translator

00:43:34 - 00:43:34

One can imagine a few clauses being added in the Geneva Convention, if you will, of what these automated vehicles should do. A globally-accepted standard, if you will.

Transcript
Raj Rajkumar
Interviewee

00:43:50 - 00:43:50

How we get there to that globally-accepted standard is anyone's guess. And what it will look like, whether it'll be, like, a coherent set of rules or, like, rife with the kind of contradictions we see in our own brain, that also remains to be seen. But one thing is clear.

Transcript
Jad

00:44:12 - 00:44:12

Oh, there's cars coming! Oh! Oh there's cars! Ah!

Transcript
Archive Clip
Sound Effect
Pathos

00:44:20 - 00:44:20

Oh, there are cars coming ...

Transcript
Jad

00:44:24 - 00:44:24

Feel this!

Transcript
Archive Clip
Sound Effect
Pathos

00:44:27 - 00:44:27

... with their questions.

Transcript
Jad

00:44:30 - 00:44:30

Oh, dear Jesus! I could never! Ah! Ah! Oh, where's it going? Goddamn, Bill. Oh my God.

Transcript
Archive Clip
Sound Effect
Pathos

00:44:46 - 00:44:46

Okay, we do need to caveat all this by saying that the moral dilemma we're talking about in the case of these driverless cars is gonna be super rare. Mostly what'll probably happen is that, like, the planeloads full of people that die every day from car accidents, well that's just gonna hit the floor. And so you have to balance the few cases where a car might make a decision you don't like against the massive number of lives saved.

Transcript
Jad

00:45:10 - 00:45:10

I was thinking actually of a different thing. I was thinking even though you dramatically bring down the number of bad things that happen on roads, you dramatically bring down the collisions, you dramatically bring down the mortality, you dramatically lower the number of people who are drunk coming home from a party and just ram someone sideways and killing three of them and injuring two of them for the rest of their lives. Those kinds of things go way down, but the ones that remain are engineered. Like, they are calculated, almost with foresight.

Transcript
Robert

00:45:44 - 00:45:44

Mm-hmm.

Transcript
Jad

00:45:47 - 00:45:47

So here's the difference—and this is such an interesting difference. Like, "Ah, damn, that's so sad that happened, that that guy got drunk and da da da, and maybe he should go to jail." But, "You mean that the society engineered this in?"

Transcript
Robert
Logos

00:46:10 - 00:46:10

[laughs]

Transcript
Jad

00:46:14 - 00:46:14

That is a big difference. One is operatic and seems like the forces of destiny, and the other seems mechanical and pre-thought through.

Transcript
Robert

00:46:28 - 00:46:28

Premeditated, yeah.

Transcript
Jad

00:46:32 - 00:46:32

And there's something dark about a premeditated expected death. And I don't know what you do about that.

Transcript
Robert

00:46:42 - 00:46:42

Well, yeah, but in ...

Transcript
Jad

00:46:45 - 00:46:45

Everybody's on the hook for that.

Transcript
Robert

00:46:49 - 00:46:49

In the particulars, in the particulars it feels dark. It's a little bit like when, you know, should you kill your own baby to save the village?

Transcript
Jad

00:47:00 - 00:47:00

Right.

Transcript
Robert

00:47:03 - 00:47:03

Like, in the particular instance of that one child it's dark. But against the backdrop of the lives saved, it's just a tiny pinprick of darkness. That's all it is.

Transcript
Jad

00:47:19 - 00:47:19

Yeah, but you know how humans are. If you argue back that yes, a bunch of smarty-pantses concocted a mathematical formula which meant that some people had to die and here they are. There are many fewer than before! A human being, just like Josh would tell you, would have a roar of feeling and of anger and saying, "How dare you engineer this in! No, no, no, no, no!"

Transcript
Robert

00:47:50 - 00:47:50

And that human being needs to meditate like the monks to silence that feeling because the feeling in that case is just getting in the way!

Transcript
Jad

00:48:02 - 00:48:02

Yes and no. And that may be impossible unless you're a monk, for God's sake. [laughs]

Transcript
Robert

00:48:12 - 00:48:12

See, we're right back where we started now. All right, we should go.

Transcript
Jad

00:48:22 - 00:48:22

Jad, you have to thank some people, no?

Transcript
Robert

00:48:27 - 00:48:27

Yes. Oh, this piece was produced by Amanda Aronczyk with help from Bethel Habte. Special thanks to Iyad Rahwan, Edmond Awad and Sydney Levine from The Moral Machine Group, MIT. Also thanks to Sertac Karaman, Xin Xiang and Roborace for all their help. And I guess we should go now.

Transcript
Jad
Communicator
Ethos

00:49:01 - 00:49:01

Yeah. I'll um ...

Transcript
Robert

00:49:04 - 00:49:04

I'm Jad Abumrad.

Transcript
Jad
Communicator

00:49:06 - 00:49:06

I'm not getting into your car.

Transcript
Robert

00:49:10 - 00:49:10

[laughs]

Transcript
Jad

00:49:13 - 00:49:13

If you don't mind. Just take my own.

Transcript
Robert

00:49:18 - 00:49:18

I'm gonna rig up an autonomous vehicle to the bottom of your bed.

Transcript
Jad

00:49:24 - 00:49:24

[laughs]

Transcript
Robert

00:49:27 - 00:49:27

So you're gonna go to bed and suddenly find yourself on the highway driving you wherever I want.

Transcript
Jad

00:49:34 - 00:49:34

[laughs] No you won't.

Transcript
Robert

00:49:38 - 00:49:38

Anyhow, okay, we should go.

Transcript
Jad

00:49:42 - 00:49:42

Yeah.

Transcript
Robert

00:49:45 - 00:49:45

I'm Jad Abumrad.

Transcript
Jad
Communicator

00:49:47 - 00:49:47

I'm Robert Krulwich.

Transcript
Robert
Communicator

00:49:50 - 00:49:50

Thanks for listening.

Transcript
Jad

Project By: saamturner
This site was generated by AVAnnotate
IIIF Manifest