This is not a view supported by the evidence.

The autonomous vehicles in use today are already much safer than human drivers, so technically we are past 100%.
I had to look it up, but apparently, yeah.

That long?
Are they better? How has that been determined?

And that's just the tip of the iceberg.
The autonomous vehicles in use today are already much safer than human drivers, so technically we are past 100%. The biggest hurdle now is public acceptance.
Recently a health insurance CEO was murdered, and many argued this was a good thing because by denying coverage the company was killing far more. I look forward to those people advocating the same for anyone spreading FUD about this lifesaving vehicle technology.
That's debatable. Evidence on this is extremely difficult to put together. There is evidence suggesting autonomous vehicles are safer and evidence that suggests they are not yet. Varying conditions and other variables make a definitive conclusion tough. But if it isn't yet, it is damn close.

This is not a view supported by the evidence.
Which is why the evidence doesn't support this view. It does support cherry-picking, however.

That's debatable. Evidence on this is extremely difficult to put together. There is evidence suggesting autonomous vehicles are safer and evidence that suggests they are not yet. Varying conditions and other variables make a definitive conclusion tough. But if it isn't yet, it is damn close.
I would not characterize the conclusions of either of those studies as supporting the view that autonomous vehicles are much safer than human drivers. They both reflect the conventional wisdom that the performance of autonomous vehicles depends on driving conditions. Autonomous vehicles are good at easy mode driving, bad at dealing with edge cases (which includes things like "rain").

The studies suggest they are. But with caveats.
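To make the cherry-picking point concrete, here is a minimal sketch with entirely invented numbers showing how an aggregate "crashes per million miles" comparison can favor autonomous vehicles simply because their miles are concentrated in easy conditions, even when they are no better (or worse) in the harder ones:

```python
# All numbers below are invented for illustration only -- not real crash data.
# The point: a single aggregate "crashes per million miles" figure can hide
# the fact that autonomous miles are driven mostly in easy conditions.

MILLION = 1_000_000

# (crashes, miles driven) per condition -- hypothetical
human      = {"clear": (400, 80 * MILLION), "rain": (300, 20 * MILLION)}
autonomous = {"clear": (60, 18 * MILLION), "rain": (8, 500_000)}

def per_million(crashes, miles):
    """Crash rate per million miles driven."""
    return crashes / (miles / MILLION)

for cond in ("clear", "rain"):
    print(f"{cond:>5}: human {per_million(*human[cond]):.1f}  "
          f"autonomous {per_million(*autonomous[cond]):.1f}")

def aggregate(fleet):
    """Headline rate across all conditions combined."""
    crashes = sum(c for c, _ in fleet.values())
    miles = sum(m for _, m in fleet.values())
    return per_million(crashes, miles)

print(f"  all: human {aggregate(human):.1f}  autonomous {aggregate(autonomous):.1f}")
```

With these made-up figures the autonomous fleet looks roughly twice as safe in aggregate while actually being slightly worse in rain; the headline number mostly reflects where the miles were driven, which is why per-condition comparisons and honest exposure data matter.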

I agree it's impressive, but I'm specifically responding to the claim that they're already much safer than human drivers. That's the view that I don't think is supported by the evidence.

I'm not saying they are there yet. But close. I have been in autonomous vehicles on the freeway and in city streets. This obviously wasn't in every possible condition. It's pretty damn impressive.
Which means that to gain widespread acceptance, they need to not only be better than human drivers in basically all conditions, but to not introduce errors that humans wouldn't make.

The problem is not so much a problem of safety but the perception of safety.
Every time a self-driving car is in an accident, some people will say, "See. Told you so. They are unsafe".
If/when things reach the point that there are demonstrably, statistically fewer accidents per whatever metric you measure, it will make no difference to such people.
I think they are pretty damn close. If Musk wasn't a turd, I would have bought a Tesla five years ago. And a huge selling point is the autonomous driving. That ability to let the vehicle take the wheel on long drives is awesome.

I agree it's impressive, but I'm specifically responding to the claim that they're already much safer than human drivers. That's the view that I don't think is supported by the evidence.
We aren't close to autonomous vehicles being much better.

I think they are pretty damn close. If Musk wasn't a turd, I would have bought a Tesla five years ago. And a huge selling point is the autonomous driving. That ability to let the vehicle take the wheel on long drives is awesome.
It's a rational answer. It's not an answer that will persuade people who don't have entirely rational reasons for opposing autonomous vehicles.

Sometime about 1960 the Chairman of BOAC was asked whether he would fly on a completely automated passenger plane flight (from takeoff to landing) and he replied he would if such had been demonstrated safer than a piloted flight.
A good answer then and a good answer now.
(I remember reading this at the time in either Flight or Aeroplane but cannot provide a reference other than my memory)
It's not a question of if.

The problem is not so much a problem of safety but the perception of safety.
Every time a self-driving car is in an accident, some people will say, "See. Told you so. They are unsafe".
If/when things reach the point that there are demonstrably, statistically fewer accidents per whatever metric you measure, it will make no difference to such people.
And forever after you are known as the Goat ◊◊◊◊◊◊

It's a rational answer. It's not an answer that will persuade people who don't have entirely rational reasons for opposing autonomous vehicles.
I mean, most of us here know the reasons why nuclear power is preferable to coal, but that has not led ineluctably to the success of nuclear power.
You ◊◊◊◊ one goat....
i agree, and i also think they need to be held to a higher degree of liability in an accident that causes some kind of injury or death. you can't ever get every single person to make good driving decisions, it's impossible. an ai program you can, so to a much greater degree accidents with self driving are much, much more preventable. barely better than the dumbest, most reckless people on earth isn't good enough imo

Which means that to gain widespread acceptance, they need to not only be better than human drivers in basically all conditions, but to not introduce errors that humans wouldn't make.
Which is why I think this task is more difficult than most people realize.