Annotating Radiolab's "Driverless Dilemma"
00:03:07
My name is Nick Bilton. I'm a special correspondent for Vanity Fair.
00:03:16
The thing that I've been pretty obsessed with lately is actually not fake news, but it's automation and artificial intelligence and driverless cars. Because it's going to have a larger effect on society than any technology that I think has ever been created in the history of mankind. I know that's kind of a bold statement, but ...
00:03:28
But you've got to imagine that—you know, that there will be in the next 10 years, 20 to 50 million jobs that will just vanish to automation. You've got, you know, a million truckers that will lose their jobs, the—but it's not—we think about, like, automation and driverless cars, and we think about the fact that they are going to—the people that just drive the cars, like the taxi drivers and the truckers, are gonna lose their jobs.
00:03:36
What we don't realize is that there are entire industries that are built around just cars. So for example, if you are not driving the car, why do you need insurance? There's no parking tickets because your driverless car knows where it can and cannot park and goes and finds a spot and moves and so on. If there are truckers that are no longer using rest stops because driverless cars don't have to stop and pee or take a nap, then all of those little rest stops all across America are affected. People aren't stopping to use the restrooms. They're not buying burgers. They're not staying in these hotels, and so on and so forth.
00:03:43
And then if you look at driverless cars to a next level, the whole concept of what a car is is going to change. So for example, right now a car has five seats and a wheel, but if I'm not driving, what's the point of having five seats and a wheel? You could imagine that you take different cars, so maybe when I was on my way here to this interview, I wanted to work out, so I called a driverless gym car. Or I have a meeting out in Santa Monica after this, and it's an hour, so I call a movie car to watch a movie on the way out there. Or office car, and I pick up someone else and we have a meeting on the way.
00:03:46
And all of these things are gonna happen not in a vacuum, but simultaneously. This, you know—pizza delivery drivers are gonna be replaced by robots that will actually cook your pizza on the way to your house in a little box and then deliver it. And so kind of a little bit of a long-winded answer, but I truly do think that—that it's gonna have a massive, massive effect on society.
00:03:57
Am I stressing you guys out? Are you—are you having heart palpitations over there?
00:10:47
Yes.
00:10:52
Yes. [laughs]
00:11:09
Yes.
00:11:32
Yes.
00:11:35
Yeah.
00:11:44
No.
00:11:48
No.
00:12:00
Never.
00:12:01
No.
00:12:06
No.
00:12:34
Pulling the lever to save the five—I don't know, that feels better than pushing the one to save the five. But I don't really know why, so that's a good—there's a good moral quandary for you. [laughs]
00:12:59
Alrighty.
00:13:14
How do people make this judgment? Forget whether or not these judgments are right or wrong, just what's going on in the brain that makes people distinguish so naturally and intuitively between these two cases, which from an actuarial point of view, are very, very, very similar if not identical?
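[Annotator's note: a minimal sketch of what "actuarially identical" means here, assuming a pure body-count view. The function name and numbers are hypothetical, for illustration only; under this bare accounting the lever case and the footbridge case reduce to exactly the same comparison, which is why the intuitive gap between them is interesting.]

```python
# Hypothetical sketch: score the two trolley cases purely by expected deaths,
# with no distinction between means, side effects, or how "up close" the act is.

def expected_deaths(act: bool, bystanders_on_track: int = 5, sacrificed: int = 1) -> int:
    """Return how many people die, counting bodies only."""
    return sacrificed if act else bystanders_on_track

for case in ("pull the lever", "push the man off the footbridge"):
    print(f"{case}: act -> {expected_deaths(True)} dead, do nothing -> {expected_deaths(False)} dead")
# Both cases print the same numbers: 1 dead if you act, 5 dead if you don't.
```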
00:13:26
So we're here in the control room. Where you basically just see ...
00:13:38
Yeah, it looks kind of like an airplane engine.
00:13:46
I'll tell you a funny story. You can't have any metal in there because of the magnet, so we have this long list of questions that we ask people to make sure they can go in. "Do you have a pacemaker? Have you ever worked with metal?" Blah, blah, blah, blah, blah ...
00:14:04
Yeah, because you could have little flecks of metal in your eyes that you would never even know are there from having done metalworking. And one of the questions is whether or not you wear a wig or anything like that, because they often have metal wires in with that. And there's this very nice woman who does brain research here who's Italian, and she's asking her subjects over the phone all these screening questions ...
00:14:05
And so I have this person over to dinner. She says, "Yeah, you know, I ended up doing this study, but it asks you the weirdest questions. This woman's like, 'Do you have a hairpiece?' And—and I'm like, 'What does it have to do if I have herpes or not?'" [laughs] Anyway, and she said—you know, she asked, "Do you have a hairpiece?" But she—so now she asks people if you wear a wig or whatever.
00:14:19
Oh yeah. Yep, several times.
00:14:31
All right, I'll show you some—some stuff. Okay, let me think.
00:14:36
Yep, it's top-down. It's sort of sliced, you know, like—like a deli slicer.
00:14:59
This little guy right here and these two guys right there.
00:15:23
This one we're looking at here, this ...
00:15:51
I think this is part of the "No, no, no" crowd.
00:17:25
A theory—not proven, but I think—this is what I think the evidence suggests.
00:17:27
You've got one part of the brain that says, "Huh, five lives versus one life? Wouldn't it be better to save five versus one?"
00:18:41
It understands it on that level, and says ...
00:19:14
No. Bad! Don't do.
00:19:23
No, I don't think I could push... a person.
00:19:41
No.
00:20:00
Never.
00:20:12
Instead of having sort of one system that just sort of churns out the answer and bing, we have multiple systems that give different answers, and they duke it out. And hopefully out of that competition comes morality.
00:20:38
Our—our primate ancestors, before we were full-blown humans, had intensely social lives. They have social mechanisms that prevent them from doing all the nasty things that they might otherwise be interested in doing. And so deep in our brain, we have what you might call basic primate morality. And basic primate morality doesn't understand things like tax evasion, but it does understand things like pushing your buddy off of a cliff.
00:20:56
Right. Whereas ...
00:21:10
Right. Now that case, I think it's a pretty easy case. Even though it's five versus one, in that case, people just go with what we might call the "inner chimp." But there are other, but there ...
00:21:15
Right. Well, that's what's interesting.
00:21:31
Right. Well, what's interesting is that we think of basic human morality as being handed down from on high, and it's probably better to say that it was handed up from below, that our most basic core moral values are not the things that we humans have invented, but the things that we've actually inherited from other creatures. The stuff that we humans have invented are the things that seem more peripheral and variable.
00:21:44
Right.
00:22:58
Right. Or at least, you know, that should be your default response. I mean, certainly chimpanzees are extremely violent and they do kill each other, but they don't do it as a matter of course. They, so to speak, have to have some context-sensitive reason for doing so.
00:23:12
Yeah.
00:23:21
The situation is somewhat similar to the last episode of M*A*S*H, for people who are familiar with that. But the way we tell the story, it goes like this: it's wartime ...
00:23:33
You're hiding in the basement with some of your fellow villagers.
00:23:44
And the enemy soldiers are outside. They have orders to kill anyone that they find.
00:24:03
If they hear your baby, they're gonna find you and the baby and everyone else, and they're gonna kill everybody. And the only way you can stop this from happening is cover the baby's mouth. But if you do that, the baby's going to smother and die. If you don't cover the baby's mouth, the soldiers are gonna find everybody and everybody's gonna be killed, including you, including your baby.
00:24:08
And this is a very tough question. People take a long time to think about it, and some people say yes, and some people say no.
00:24:17
Children are a blessing and a gift from God, and we do not do that to children.
00:24:22
Yes, I think I would kill my baby to save everyone else and myself.
00:24:35
No, I would not kill the baby.
00:24:36
I feel because it's my baby, I have the right to terminate the life.
00:24:43
I'd like to say that I would kill the baby, but I don't know if I'd have the inner strength.
00:25:07
No. If it comes down to killing my own child, my own daughter or my own son, then I choose death.
00:25:28
Yeah. If you have to because it was done in World War II. When the Germans were coming around, there was a mother that had a baby that was crying, and rather than be found, she actually suffocated the baby, but the other people lived.
00:25:49
Sounds like an old M*A*S*H thing. No, you do not kill your baby.
00:26:14
No!
00:27:11
Well, that's an interesting question. And that's one of the things that we're looking at.
00:27:26
These two areas here, towards the front ...
00:27:37
It's those sort of areas that are very highly developed in humans as compared to other species.
00:28:16
Yeah, right about there.
00:28:29
Bilateral.
00:28:35
Certainly these parts of the brain are more highly developed in humans.
00:28:38
That's a fair statement.
00:28:47
Yeah, where it's both emotional, but there's also a sort of a rational attempt to sort of sort through those emotions. Those are the cases that are showing more activity in that area.
00:29:11
Well, you know, that's the hypothesis. But it's gonna take a lot more research to sort of tease apart what these different parts of the brain are doing, or if some of these are just sort of activating in an incidental kind of way. I mean, we really don't know. This is all—all very new.
00:29:41
That's right. Yeah, so two kids, and we're close to adding a cat.
00:29:48
Absolutely, so...
00:29:53
So in one version, we ask people about hitting a switch that opens a trapdoor on the footbridge and drops the person. In one version of that, the switch is right next to the person. In another version, the switch is far away. And in yet another version, you're right next to the person, and you don't push them off with your hands, but you push them with a pole.
00:29:59
That's roughly held up.
00:30:05
I mean, some people just don't care very much about hurting other people. They don't have that kind of an emotional response.
00:30:26
You know, I'm aware that this is—that killing somebody is a terrible thing to do. And I feel that, but I recognize that this is done for a noble reason, and therefore, it's—it's okay.
00:30:59
You know, now as we're entering the age of self-driving cars, ah, this is like the trolley problem now finally come to life.
00:32:23
... come to life ...
00:32:30
The self-driving car now is headed towards a bunch of pedestrians in the road. The only way to save them is to swerve out of the way, but that will run the car into a concrete wall and it will kill the passenger in the car. What should the car do? Should the car go straight and run over, say, those five people, or should it swerve and—and kill the one person?
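[Annotator's note: a minimal sketch of the purely harm-minimizing rule that the next few answers endorse "in the abstract." The function and numbers are hypothetical and not how any real vehicle is programmed, as the engineers later make clear; the point is only that such a rule sacrifices the passenger, which is exactly what respondents then say they would not buy.]

```python
# Hypothetical sketch: the "just minimize total harm" rule applied to the
# swerve-or-go-straight dilemma.

def choose_path(deaths_if_straight: int, deaths_if_swerve: int) -> str:
    """Pick whichever option kills fewer people; ties default to going straight."""
    return "swerve" if deaths_if_swerve < deaths_if_straight else "go straight"

print(choose_path(deaths_if_straight=5, deaths_if_swerve=1))  # -> "swerve": kill the one passenger
print(choose_path(deaths_if_straight=1, deaths_if_swerve=1))  # -> "go straight" on a tie
```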
00:32:33
If you ask people in the abstract ...
00:32:38
They're much more likely to say ...
00:32:45
I think you should sacrifice one for the good of the many.
00:32:57
They should just try to do the most good or avoid the most harm.
00:33:01
Logically, it would be the driver.
00:33:06
Kill the driver.
00:33:10
Be selfless.
00:33:11
I think it should kill the driver.
00:33:22
Would you want to drive in a car that would potentially sacrifice you to save the lives of more people in order to minimize the total amount of harm? They say ...
00:33:43
No. I wouldn't buy it.
00:33:54
No. Absolutely not.
00:33:59
That would kill me in it? No.
00:34:02
So I'm not gonna—I'm not gonna buy a car that's gonna purposely kill me.
00:34:07
Hell no. I wouldn't buy it.
00:34:17
[laughs] For sure, no. [laughs]
00:34:26
I'd sell it, but I wouldn't buy it.
00:36:28
If you know you can save one person, at least save that one.
00:36:42
Save the one in the car.
00:37:03
If you know for sure that one thing, one death can be prevented, then that's your first priority.
00:37:06
Now when he said this to you ...
00:37:22
... did it seem controversial at all in the moment?
00:37:26
In the moment, it seemed incredibly logical.
00:37:54
No.
00:37:59
I wouldn't buy it, personally.
00:39:34
And there was this kind of uproar about that—how dare you drive these selfish—you know, make these selfish cars? And then he walked it back, and he said, "No, no, what I mean is that just, that we—that we have a better chance of protecting the people in the car, so we're going to protect them because they're easier to protect." But of course, you know, there's always gonna be trade-offs.
00:39:44
Sensors like cameras, radars, laser, and ultrasound sensors.
00:39:52
I'm the co-director of the GM-CMU Connected and Autonomous Driving Collaborative Research Lab.
00:40:06
Still evolving.
00:40:33
We are very happy if today it can actually detect a pedestrian, can detect a bicyclist, a motorcyclist. Different makers have different shapes, sizes and colors.
00:40:43
You can actually know a lot more about who these people are.
00:41:15
So you can basically tie yourself up in knots, wrap yourself around an axle. We do not think that any programmer should be given this major burden of deciding who survives and who gets killed. I think these are very fundamental, deep issues that society has to decide at large. I don't think a programmer eating pizza and sipping Coke should be making the call.
00:41:28
I think it really has to be an evolutionary process, I believe.
00:43:34
One can imagine a few clauses being added to the Geneva Convention, if you will, about what these automated vehicles should do. A globally-accepted standard, if you will.