If you thought Skynet was just a story, why is Google building an AI kill switch?
-
@scottalanmiller said in If you thought Skynet was just a story why is google building an AI kill switch:
@thwr said in If you thought Skynet was just a story why is google building an AI kill switch:
@scottalanmiller Full control: speed, brakes, steering, traction control, anti-skid braking system...
The article was in German I'm afraid, written by a German professor researching robotics and AI.
What vehicle is it controlling?
The machine is a self-piloting car, for example a modern BMW 5 series or an SUV.
-
@scottalanmiller Old lady on the left side of the road, the family is on the other side.
The AI has to stop the vehicle, but the only way to do it is to hit one person or a family.
Who should the AI kill?
-
@scottalanmiller said in If you thought Skynet was just a story why is google building an AI kill switch:
Where I am confused...
There is a driver with a heart attack in the car. There is an old lady in the road. Where does the machine or the family come into the picture?
It's about the machine having no choice. Someone will have an accident, either the family or the lady. The driver can't step in because he is suffering from a heart attack.
-
@thwr said in If you thought Skynet was just a story why is google building an AI kill switch:
The machine is a self-piloting car, for example a modern BMW 5 series or an SUV.
And the machine had a heart attack?
-
The driver of the car (the machine), not the machine.
-
@DustinB3403 said in If you thought Skynet was just a story why is google building an AI kill switch:
@scottalanmiller Old lady on the left side of the road, the family is on the other side.
The AI has to stop the vehicle, but the only way to do it is to hit one person or a family.
Who should the AI kill?
Well, it certainly shouldn't go off-roading just to hit people.
-
There is no wrong or right in my opinion. It's an ethical dilemma, but it is very well worth discussing.
-
@thwr said in If you thought Skynet was just a story why is google building an AI kill switch:
The driver of the car (the machine), not the machine.
I'm confused. There is a self-driving car, but there is a driver? Is it self-piloting but not being self-piloted? Why is a person driving a self-piloting car?
-
@scottalanmiller said in If you thought Skynet was just a story why is google building an AI kill switch:
And the machine had a heart attack?
No, the car has a driver, but it is capable of self-piloting. The car was in "human pilot mode" and the driver had a heart attack. The car senses "I'm not under the control of the human any more, time to take over."
But now it is out of control, and the only way to stop is to crash either to the left or to the right. Which should the AI choose?
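For what it's worth, the takeover described here boils down to a small state machine. Below is a minimal sketch, purely illustrative: the names (PilotMode, DriverStatus, next_mode) and the incapacitation checks are invented for this example and are not taken from any real driver-assistance system.

```python
# Hypothetical sketch of "driver incapacitated, time to take over" logic.
# All names and checks here are invented for illustration.
from dataclasses import dataclass
from enum import Enum, auto


class PilotMode(Enum):
    HUMAN = auto()  # "human pilot mode"
    SELF = auto()   # car is self-piloting


@dataclass
class DriverStatus:
    hands_on_wheel: bool
    responding_to_prompts: bool


def next_mode(current: PilotMode, driver: DriverStatus) -> PilotMode:
    """If the human no longer appears to be in control (e.g. a heart attack:
    no hands on the wheel, no response to prompts), switch to self-piloting."""
    if current is PilotMode.HUMAN and not (driver.hands_on_wheel or driver.responding_to_prompts):
        return PilotMode.SELF  # "not under the control of the human any more, time to take over"
    return current


# Example: the driver slumps over and the car takes over.
print(next_mode(PilotMode.HUMAN, DriverStatus(hands_on_wheel=False, responding_to_prompts=False)))
# -> PilotMode.SELF
```

The hard part, of course, is not this switch but what the car does next, which is exactly the dilemma being discussed.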
-
@thwr said in If you thought Skynet was just a story why is google building an AI kill switch:
There is no wrong or right in my opinion. It's an ethical dilemma, but it is very well worth discussing.
I think it's easy. The self-piloting car should not have let someone drive it, especially over the speed limit. If an old lady stepped into the road and could not be avoided at proper speeds, it is her fault that she is in the road. If there is a guilty party, they should be hit before an innocent one.
-
@DustinB3403 said in If you thought Skynet was just a story why is google building an AI kill switch:
No, the car has a driver, but it is capable of self-piloting. The car was in "human pilot mode" and the driver had a heart attack. The car senses "I'm not under the control of the human any more, time to take over."
But now it is out of control, and the only way to stop is to crash either to the left or to the right. Which should the AI choose?
That's way more explanation than was in the original.
-
@DustinB3403 said in If you thought Skynet was just a story why is google building an AI kill switch:
No, the car has a driver, but it is capable of self-piloting. The car was in "human pilot mode" and the driver had a heart attack. The car senses "I'm not under the control of the human any more, time to take over."
But now it is out of control, and the only way to stop is to crash either to the left or to the right. Which should the AI choose?
Like that, yes.
-
@thwr Now, in the explained example:
My logic would be that one life is worth far less than an entire family's, even if that life were of "Einstein" caliber.
-
Let's play it the other way around. A family is crossing the street, an old lady is on the side. Should the car kill the lady or the family?
Probably one of the kids will be the next president of [Insert country here] or will find a cure for [whatever].
-
@thwr said in If you thought Skynet was just a story why is google building an AI kill switch:
Let's play it the other way around. A family is crossing the street, an old lady is on the side. Should the car kill the lady or the family?
Probably one of the kids will be the next president of [Insert country here] or will find a cure for [whatever].
One life is still less than many.
If there is no other choice (like straightening out and pulling over safely), then hitting the one person is the only reasonable solution.
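As a purely hypothetical sketch, the rule being described here, plus the earlier "hit a guilty party before an innocent one" point, could be written down roughly like this. The Option type, its fields, and the example numbers are all invented for illustration.

```python
# Hypothetical sketch of the rule of thumb from this exchange:
# prefer a maneuver that harms nobody, otherwise harm the fewest people,
# breaking ties by preferring an at-fault party over an innocent one.
from dataclasses import dataclass
from typing import List


@dataclass
class Option:
    name: str
    people_harmed: int    # how many people this maneuver would hit
    at_fault_harmed: int  # how many of those entered the road against the rules


def choose_maneuver(options: List[Option]) -> Option:
    # 1) If any option harms nobody (e.g. straighten out and pull over safely), take it.
    safe = [o for o in options if o.people_harmed == 0]
    if safe:
        return safe[0]
    # 2) Otherwise minimize the number of people harmed; on a tie,
    #    prefer the option where the people hit are the at-fault party.
    return min(options, key=lambda o: (o.people_harmed, -o.at_fault_harmed))


options = [
    Option("swerve toward the old lady", people_harmed=1, at_fault_harmed=1),
    Option("swerve toward the family", people_harmed=4, at_fault_harmed=0),
]
print(choose_maneuver(options).name)  # -> "swerve toward the old lady"
```

Whether those two numbers are the right things to count is, of course, the whole ethical argument.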
-
@thwr said in If you thought Skynet was just a story why is google building an AI kill switch:
@DustinB3403 said in If you thought Skynet was just a story why is google building an AI kill switch:
@scottalanmiller said in If you thought Skynet was just a story why is google building an AI kill switch:
@thwr said in If you thought Skynet was just a story why is google building an AI kill switch:
What should the machine do in the following case:
A family is near the road. An old lady is just crossing the same road. At this very moment the driver has a heart attack and can't control the car. The speed is too high to come to a stop. What is the car supposed to do? Kill the family? Kill the old lady?
What control does the machine have in that question?
I would think the machine would have to make the decision to take out the elderly person. It's one life versus many lives.
In this case, a machine is deciding upon the value of a single life, and that's terribly dangerous. What if the old lady were someone like Einstein and the family had no jobs at all?
A human wouldn't know that in this situation either.
-
@thwr said in If you thought Skynet was just a story why is google building an AI kill switch:
@scottalanmiller said in If you thought Skynet was just a story why is google building an AI kill switch:
Where I am confused...
There is a driver with a heart attack in the car. There is an old lady in the road. Where does the machine or the family come into the picture?
It's about the machine having no choice. Someone will have an accident, either the family or the lady. The driver can't step in because he is suffering from a heart attack.
Would the driver somehow have a different choice?
-
I think we'll have 20 years of both self-driving and human-piloted cars on the road, and a time when a car can be either.
-
@Dashrender said in If you thought Skynet was just a story why is google building an AI kill switch:
I think we'll have 20 years of both self-driving and human-piloted cars on the road, and a time when a car can be either.
I doubt that humans in the driver's seat will be legal for that long. The cost of allowing human drivers once computer drivers are available is astronomical. So much of the value and safety in the automated driving system requires getting all of the other cars off the road.
-
@scottalanmiller said in If you thought Skynet was just a story why is google building an AI kill switch:
@Dashrender said in If you thought Skynet was just a story why is google building an AI kill switch:
I think we'll have 20 years of both self-driving and human-piloted cars on the road, and a time when a car can be either.
I doubt that humans in the driver's seat will be legal for that long. The cost of allowing human drivers once computer drivers are available is astronomical. So much of the value and safety in the automated driving system requires getting all of the other cars off the road.
Yeah, but they can't even release an Xbox right now without a lot of bugs. I wonder how bad the initial release will be and how many fatalities we will have. It scares me, actually.