Random Thread - Anything Goes
-
Here is the thing... all modern driving laws already totally cover self driving cars. The situation is already handled and handled well. There's no need or purpose for a change. The needs of driving remain the same. The only new laws needed are the ones that allow self driving cars onto the road, and those are for vendors not passengers. As people who buy or ride in cars, none of this changes for us in any way.
-
@scottalanmiller said in Random Thread - Anything Goes:
@nerdydad said in Random Thread - Anything Goes:
The car or the human/owner of the car?
You ticket the driver today, no reason to change the law. You don't ticket passengers for the driver's decision. Whether there is one passenger or ten doesn't matter; you don't ticket people who aren't driving.
That question was actually rhetorical, as I was getting to my point. But the point is, how do traffic violations get rectified? Or are we even going to need traffic laws anymore?
-
@scottalanmiller said in Random Thread - Anything Goes:
at the request of a city or police (parked cars needed to move for an emergency, for example.) Would you ticket the city or the
when there is no driver, the owner is the one to get the ticket, I would guess.
I do have a problem with this considering that the owner of the car, let's say it's Scott's car, is not the programmer of the system, nor can he mod it, so it seems kinda unjust to claim that Scott is to blame if the car is in an accident that is the car's fault.
-
@dashrender said in Random Thread - Anything Goes:
@scottalanmiller said in Random Thread - Anything Goes:
at the request of a city or police (parked cars needed to move for an emergency, for example.) Would you ticket the city or the
when there is no driver, the owner is the one to get the ticket, I would guess.
There is never a driver in a parked-car situation; there never has been. It has always been the owner who gets the ticket.
In all cases, just look at the current laws and they all work perfectly for driverless cars. These are all things that had to be figured out a century ago for regular cars.
-
@dashrender said in Random Thread - Anything Goes:
I do have a problem with this considering that the owner of the car, let's say it's Scott's car, is not the programmer of the system, nor can he mod it, so it seems kinda unjust to claim that Scott is to blame if the car is in an accident that is the car's fault.
Which is why it doesn't work that way. The driver is at fault in an accident, not the owner. The owner, however, is liable for a stationary car.
-
@nerdydad said in Random Thread - Anything Goes:
@scottalanmiller said in Random Thread - Anything Goes:
@nerdydad said in Random Thread - Anything Goes:
The car or the human/owner of the car?
You ticket the driver today, no reason to change the law. You don't ticket passengers for the driver's decision. Whether there is one passenger or ten doesn't matter; you don't ticket people who aren't driving.
That question was actually rhetorical, as I was getting to my point. But the point is, how do traffic violations get rectified? Or are we even going to need traffic laws anymore?
There aren't traffic laws with driverless cars. There are laws on the algorithms. Which really is how things are today, but you can regulate them differently.
-
@scottalanmiller said in Random Thread - Anything Goes:
@nerdydad said in Random Thread - Anything Goes:
@scottalanmiller said in Random Thread - Anything Goes:
@nerdydad said in Random Thread - Anything Goes:
The car or the human/owner of the car?
You ticket the driver today, no reason to change the law. You don't ticket passengers for the driver's decision. Whether there is one passenger or ten doesn't matter; you don't ticket people who aren't driving.
That question was actually rhetorical, as I was getting to my point. But the point is, how do traffic violations get rectified? Or are we even going to need traffic laws anymore?
There aren't traffic laws with driverless cars. There are laws on the algorithms. Which really is how things are today, but you can regulate them differently.
Next question is, would it be suggested for manufacturers to use open source or closed source software? I would assume that they use closed source, as they do now. However, I would suggest that they consider open source so as to gain the trust of the people. But most people are blind sheep who are probably going to crawl into the car anyway.
-
A big presumption with self driving cars is that they simply wouldn't violate the law. So it's unlikely that you would have a law-breaking issue. It's more likely that you would have a mechanical or software failure that causes an accident.
So today, if you're driving down the street and your tire comes off and causes an accident, then you are of course at fault.
But what if the software is what's at fault? Let's take the recent Tesla that drove under a semi-truck. The software couldn't tell the difference between the truck and the sky, so it thought nothing was there.
Now - discounting that it's probably not legal that the driver was sleeping (again, let's just completely remove this and assume it's a time when this IS legal) - whose fault is this? Is it the owner's or Tesla's?
-
@nerdydad said in Random Thread - Anything Goes:
@scottalanmiller said in Random Thread - Anything Goes:
@nerdydad said in Random Thread - Anything Goes:
@scottalanmiller said in Random Thread - Anything Goes:
@nerdydad said in Random Thread - Anything Goes:
The car or the human/owner of the car?
You ticket the driver today, no reason to change the law. You don't ticket passengers for the driver's decision. Whether there is one passenger or ten doesn't matter; you don't ticket people who aren't driving.
That question was actually rhetorical, as I was getting to my point. But the point is, how do traffic violations get rectified? Or are we even going to need traffic laws anymore?
There aren't traffic laws with driverless cars. There are laws on the algorithms. Which really is how things are today, but you can regulate them differently.
Next question is, would it be suggested for manufacturers to use open source or closed source software? I would assume that they use closed source, as they do now. However, I would suggest that they consider open source so as to gain the trust of the people. But most people are blind sheep who are probably going to crawl into the car anyway.
Lots use open source now. Some major manufacturer just switched to open source this past week, in fact.
-
@dashrender said in Random Thread - Anything Goes:
Now - discounting that it's probably not legal that the driver was sleeping (again, let's just completely remove this and assume it's a time when this IS legal) - whose fault is this? Is it the owner's or Tesla's?
Very hard to tell, and you left out a possibility - why was a truck blocking the road, and why wasn't that truck communicating its location to the Tesla? In a fully self driving world, the cars have a responsibility to talk to each other. So there are lots of failure points in this example.
-
@dashrender said in Random Thread - Anything Goes:
A big presumption with self driving cars is that they simply wouldn't violate the law. So it's unlikely that you would have a law-breaking issue. It's more likely that you would have a mechanical or software failure that causes an accident.
So today, if you're driving down the street and your tire comes off and causes an accident, then you are of course at fault.
But what if the software is what's at fault? Let's take the recent Tesla that drove under a semi-truck. The software couldn't tell the difference between the truck and the sky, so it thought nothing was there.
Now - discounting that it's probably not legal that the driver was sleeping (again, let's just completely remove this and assume it's a time when this IS legal) - whose fault is this? Is it the owner's or Tesla's?
Good question. IF the owner was awake and aware of his surroundings, would he be able to override Autopilot and take over control if he saw that the Tesla was about to make a bad decision?
-
@scottalanmiller said in Random Thread - Anything Goes:
@dashrender said in Random Thread - Anything Goes:
I do have a problem with this considering that the owner of the car, let's say it's Scott's car, is not the programmer of the system, nor can he mod it, so it seems kinda unjust to claim that Scott is to blame if the car is in an accident that is the car's fault.
Which is why it doesn't work that way. The driver is at fault in an accident, not the owner. The owner, however, is liable for a stationary car.
So how do you solve the driver problem? Who is the driver in a driverless car? Is it the manufacturer?
-
@nerdydad said in Random Thread - Anything Goes:
Next question is, would it be suggested for manufacturers to use open source or closed source software? I would assume that they use closed source, as they do now. However, I would suggest that they consider open source so as to gain the trust of the people. But most people are blind sheep who are probably going to crawl into the car anyway.
You're kidding, right? Normal people don't trust open source more than closed source because they don't know the difference. Sure, if you sit down and explain it, they might get it, but if you polled 1,000 random people on the street, I'd bet $1,000 that less than half would even know the difference. No, this is not a real invitation for someone to take a poll.
-
@scottalanmiller said in Random Thread - Anything Goes:
@dashrender said in Random Thread - Anything Goes:
Now - discounting that it's probably not legal that the driver was sleeping (again, let's just completely remove this and assume it's a time when this IS legal) - whose fault is this? Is it the owner's or Tesla's?
Very hard to tell, and you left out a possibility - why was a truck blocking the road, and why wasn't that truck communicating its location to the Tesla? In a fully self driving world, the cars have a responsibility to talk to each other. So there are lots of failure points in this example.
Absolutely true... and we're getting there, but still probably 10-20 years off from that.
-
@dashrender said in Random Thread - Anything Goes:
@scottalanmiller said in Random Thread - Anything Goes:
@dashrender said in Random Thread - Anything Goes:
I do have a problem with this considering that the owner of the car, let's say it's Scott's car, is not the programmer of the system, nor can he mod it, so it seems kinda unjust to claim that Scott is to blame if the car is in an accident that is the car's fault.
Which is why it doesn't work that way. The driver is at fault in an accident, not the owner. The owner, however, is liable for a stationary car.
So how do you solve the driver problem? Who is the driver in a driverless car? Is it the manufacturer?
Of course it is.
-
@nerdydad said in Random Thread - Anything Goes:
@dashrender said in Random Thread - Anything Goes:
A big presumption with self driving cars is that they simply wouldn't violate the law. So it's unlikely that you would have a law-breaking issue. It's more likely that you would have a mechanical or software failure that causes an accident.
So today, if you're driving down the street and your tire comes off and causes an accident, then you are of course at fault.
But what if the software is what's at fault? Let's take the recent Tesla that drove under a semi-truck. The software couldn't tell the difference between the truck and the sky, so it thought nothing was there.
Now - discounting that it's probably not legal that the driver was sleeping (again, let's just completely remove this and assume it's a time when this IS legal) - whose fault is this? Is it the owner's or Tesla's?
Good question. IF the owner was awake and aware of his surroundings, would he be able to override Autopilot and take over control if he saw that the Tesla was about to make a bad decision?
Today? Yes, he would.
-
@dashrender said in Random Thread - Anything Goes:
@scottalanmiller said in Random Thread - Anything Goes:
@dashrender said in Random Thread - Anything Goes:
Now - discounting that it's probably not legal that the driver was sleeping (again, let's just completely remove this and assume it's a time when this IS legal) - whose fault is this? Is it the owner's or Tesla's?
Very hard to tell, and you left out a possibility - why was a truck blocking the road, and why wasn't that truck communicating its location to the Tesla? In a fully self driving world, the cars have a responsibility to talk to each other. So there are lots of failure points in this example.
Absolutely true... and we're getting there, but still probably 10-20 years off from that.
It'll take one accident where a company is held liable for a death that it could not have prevented, and they go after someone without that technology as criminally culpable for putting such a dangerous vehicle on the road, and things will change overnight. That's how this stuff works. Once it is clear that NOT having a self driving car is worse than driving drunk... people will start petitioning for all drivers to be arrested.
-
@dashrender said in Random Thread - Anything Goes:
@nerdydad said in Random Thread - Anything Goes:
@dashrender said in Random Thread - Anything Goes:
A big presumption with self driving cars is that they simply wouldn't violate the law. So it's unlikely that you would have a law-breaking issue. It's more likely that you would have a mechanical or software failure that causes an accident.
So today, if you're driving down the street and your tire comes off and causes an accident, then you are of course at fault.
But what if the software is what's at fault? Let's take the recent Tesla that drove under a semi-truck. The software couldn't tell the difference between the truck and the sky, so it thought nothing was there.
Now - discounting that it's probably not legal that the driver was sleeping (again, let's just completely remove this and assume it's a time when this IS legal) - whose fault is this? Is it the owner's or Tesla's?
Good question. IF the owner was awake and aware of his surroundings, would he be able to override Autopilot and take over control if he saw that the Tesla was about to make a bad decision?
Today? Yes, he would.
Because it is not a self driving car, though.
-
@scottalanmiller said in Random Thread - Anything Goes:
@dashrender said in Random Thread - Anything Goes:
@scottalanmiller said in Random Thread - Anything Goes:
@dashrender said in Random Thread - Anything Goes:
Now - discounting that it's probably not legal that the driver was sleeping (again, let's just completely remove this and assume it's a time when this IS legal) - whose fault is this? Is it the owner's or Tesla's?
Very hard to tell, and you left out a possibility - why was a truck blocking the road, and why wasn't that truck communicating its location to the Tesla? In a fully self driving world, the cars have a responsibility to talk to each other. So there are lots of failure points in this example.
Absolutely true... and we're getting there, but still probably 10-20 years off from that.
It'll take one accident where a company is held liable for a death that it could not have prevented, and they go after someone without that technology as criminally culpable for putting such a dangerous vehicle on the road, and things will change overnight. That's how this stuff works. Once it is clear that NOT having a self driving car is worse than driving drunk... people will start petitioning for all drivers to be arrested.
I hear what you're saying, but the same can be, and is, said about guns. Cars don't kill people; people driving cars kill people.
I do tend to agree that eventually, maybe in my lifetime, we'll see driving yourself on common roads become a thing of the past.
-
@scottalanmiller said in Random Thread - Anything Goes:
@dashrender said in Random Thread - Anything Goes:
@nerdydad said in Random Thread - Anything Goes:
@dashrender said in Random Thread - Anything Goes:
A big presumption with self driving cars is that they simply wouldn't violate the law. So it's unlikely that you would have a law-breaking issue. It's more likely that you would have a mechanical or software failure that causes an accident.
So today, if you're driving down the street and your tire comes off and causes an accident, then you are of course at fault.
But what if the software is what's at fault? Let's take the recent Tesla that drove under a semi-truck. The software couldn't tell the difference between the truck and the sky, so it thought nothing was there.
Now - discounting that it's probably not legal that the driver was sleeping (again, let's just completely remove this and assume it's a time when this IS legal) - whose fault is this? Is it the owner's or Tesla's?
Good question. IF the owner was awake and aware of his surroundings, would he be able to override Autopilot and take over control if he saw that the Tesla was about to make a bad decision?
Today? Yes, he would.
Because it is not a self driving car, though.
It is self driving, just not fully autonomous. That's more when the steering wheel is removed.