Responsible Tesla self-driving car kills driver.

https://www.youtube.com/watch?v=TfV-xBAWdY4

Here's Autopilot in action, really showing a major weakness of the system: steering out of an accident without significantly reducing speed. Really... this system is far from something anyone should want to rely upon, and I question the judgement of those who rely on it - even a little bit.

Had the Tesla started braking hard when that truck approached it, in addition to swerving to the right, it would have been a much safer move and outcome. Instead, the Tesla swerved (it seemed to slow just a bit) but pretty much maintained its original speed. When the Tesla got back in its lane, it was travelling too close behind the truck - and that's not wise, considering its driver doesn't seem to be too aware of the traffic around him.

In short, the Tesla Autopilot is all about the mechanics of driving; it seems to have no logic behind it that helps make the best choices.
I don't completely agree, nor do I completely disagree.

I just want to point out that the Tesla (or its driver) did nothing wrong. But the truck driver was reckless.

Here in Germany, had this manoeuvre been witnessed by traffic police, the truck driver would have had his license suspended for a few months, and would probably have lost his job because of that - in addition to a hefty fine.

ETA: It also appears that your last sentence is a pretty accurate description of what the system is intended to do. If you expected something different, you fell into the same trap as most of those who post stunt videos using it, or the two people involved in the accidents we discuss here.

Additionally, everything Tesla's Autopilot does can be had in high-end (and increasingly even lower-end) cars from almost all other manufacturers, sometimes with better technology and better results. The difference is that the driver-assistance systems of most other manufacturers are not big selling points, because those manufacturers failed to explain and market properly what their assistance systems could do, while everyone wanted Tesla's Autopilot. It's one of those things Tesla and Musk are hated for, and it is in no small part responsible for the bad press surrounding the accidents and Tesla.
 
I don't completely agree, nor do I completely disagree.

I just want to point out that the Tesla (or its driver) did nothing wrong. But the truck driver was reckless.

Here in Germany, had this manoeuvre been witnessed by traffic police, the truck driver would have had his license suspended for a few months, and would probably have lost his job because of that - in addition to a hefty fine.

ETA: It also appears that your last sentence is a pretty accurate description of what the system is intended to do. If you expected something different, you fell into the same trap as most of those who post stunt videos using it, or the two people involved in the accidents we discuss here.

Additionally, everything Tesla's Autopilot does can be had in high-end (and increasingly even lower-end) cars from almost all other manufacturers, sometimes with better technology and better results. The difference is that the driver-assistance systems of most other manufacturers are not big selling points, because those manufacturers failed to explain and market properly what their assistance systems could do, while everyone wanted Tesla's Autopilot. It's one of those things Tesla and Musk are hated for, and it is in no small part responsible for the bad press surrounding the accidents and Tesla.

"It wasn't my Fault!" - Famous last words often heard in Hospital Emergency Rooms, Morgues and Funerals. Point is this: that Tesla was not being driven defensively. And while there may be no legal blame on the Tesla owner's part, he's still a fool for not taking the precautions he could have taken.
 
This argument is garbage unless you believe that most people would find it acceptable for airplane pilots to engage autopilot and then never look at their instruments or the outside world until they're ready to disengage the system. Certainly no airline would find that in any way acceptable.

Pilots navigating on autopilot are the stuff of lore! The tales exist and are lauded in books, movies and the media in general.

If I have "auto-pilot", I can sleep on the way to my mission!

That is how THAT is presented to the public.
 
This argument is garbage unless you believe that most people would find it acceptable for airplane pilots to engage autopilot and then never look at their instruments or the outside world until they're ready to disengage the system. Certainly no airline would find that in any way acceptable.

"Acceptable"? No. But, people are keenly aware that legal and professional considerations are what make it unacceptable, that the equipment itself is perfectly capable of doing the job without being babysat, and warnings that it needs constant monitoring are primarily there for liability purposes.

You can call the argument "garbage" all you like; but surprise, there have been people filmed dead asleep at the wheel of their "autopiloting" Teslas. The very victim of the accident in question filmed himself only a month before, doing absolutely nothing to intervene as he consciously allowed his vehicle to reach an unsafe proximity to another at highway speed and watched the automatic crash-prevention features take over and do their work. People are regularly allowing this "safety feature" to do all of the driving on its own. And it's because Tesla has sold them on the idea that it is capable of doing so.
 
What Checkmite said.

Meanwhile, Tesla tells us that the system warns drivers who take their hands off the wheel, and starts to slow the car down. Does anyone know how loud/severe the warnings are? Something like a smoke alarm? If not, they need to be.

And can't the car detect the nature of the road from satnav/digital map data and disable the Autopilot on unsuitable roads? The Tesla S also has trouble if the lane markings are poor, and lane markings are (I imagine) always present on roads suitable for the Autopilot. Doesn't it know the lane markings are poor, by virtue of the fact that it isn't seeing them?
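The gating being asked about is simple enough to sketch. A minimal, purely hypothetical illustration (the road categories, confidence scores and the 0.8 threshold are all made up for the example; this is not Tesla's actual logic):

```python
# Hypothetical sketch of the suggested gating logic: only allow the
# assistant when the map says the road type is suitable AND the vision
# system reports high confidence in the lane markings it is tracking.
# Poor or missing markings (low confidence) would disable the system,
# exactly because the car "isn't seeing them".

SUITABLE_ROADS = {"motorway", "dual_carriageway"}

def autopilot_available(road_type: str, lane_marking_confidence: float,
                        min_confidence: float = 0.8) -> bool:
    """Return True only when both engagement conditions hold."""
    return (road_type in SUITABLE_ROADS
            and lane_marking_confidence >= min_confidence)

print(autopilot_available("motorway", 0.95))     # suitable road, clear markings
print(autopilot_available("motorway", 0.30))     # worn markings
print(autopilot_available("residential", 0.95))  # unsuitable road type
```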
 
Comparing airline pilots with someone with a driving license is a bit absurd.
Airline pilots receive significantly more training, and continuing checks, than your everyday driver. Even then, as Trebuchet points out, they still manage to get overconfident.
 
"It wasn't my Fault!" - Famous last words often heard in Hospital Emergency Rooms, Morgues and Funerals. Point is this: that Tesla was not being driven defensively. And while there may be no legal blame on the Tesla owner's part, he's still a fool for not taking the precautions he could have taken.

As I have said: I do not completely disagree with you.

But, given I have been in similar situations myself, and at least once reacted exactly like the Tesla, I cannot really decide if the reaction is correct or the best of all alternatives, or if the Autopilot should have reacted differently. It depends a lot on the situation, some of which we don't know. (For instance, we don't know what was behind the Tesla in the same lane. The Autopilot might have noticed another car close behind, and decided not to brake hard because of it.)
 
"Acceptable"? No. But, people are keenly aware that legal and professional considerations are what make it unacceptable, that the equipment itself is perfectly capable of doing the job without being babysat, and warnings that it needs constant monitoring are primarily there for liability purposes.

You can call the argument "garbage" all you like; but surprise, there have been people filmed dead asleep at the wheel of their "autopiloting" Teslas. The very victim of the accident in question filmed himself only a month before, doing absolutely nothing to intervene as he consciously allowed his vehicle to reach an unsafe proximity to another at highway speed and watched the automatic crash-prevention features take over and do their work. People are regularly allowing this "safety feature" to do all of the driving on its own. And it's because Tesla has sold them on the idea that it is capable of doing so.
That is highly debatable. Yes, Tesla has aggressively marketed the Autopilot, but there's at least one person (me) who knew exactly what the system is capable of.

As far as I know, all printed documentation Tesla gives to customers, and the verbal instructions during sales and the introduction to the car on delivery, explained the system as an assistance system, and did not sell it as capable of being used like in those videos.
 
"Acceptable"? No. But, people are keenly aware that legal and professional considerations are what make it unacceptable, that the equipment itself is perfectly capable of doing the job without being babysat, and warnings that it needs constant monitoring are primarily there for liability purposes.

You can call the argument "garbage" all you like; but surprise, there have been people filmed dead asleep at the wheel of their "autopiloting" Teslas. The very victim of the accident in question filmed himself only a month before, doing absolutely nothing to intervene as he consciously allowed his vehicle to reach an unsafe proximity to another at highway speed and watched the automatic crash-prevention features take over and do their work. People are regularly allowing this "safety feature" to do all of the driving on its own. And it's because Tesla has sold them on the idea that it is capable of doing so.
What you are making seems like a reasoned, logical point that includes instances of people sleeping while their robocar drives them someplace as a buttressing example.

What I am reading is... "There is a car that can drive me to work while I sleep!!!"

I want one... NOW.
 
What you are making seems like a reasoned, logical point that includes instances of people sleeping while their robocar drives them someplace as a buttressing example.

What I am reading is... "There is a car that can drive me to work while I sleep!!!"

I want one... NOW.

A Tesla isn't a car that can drive you to work while you sleep. This accident illustrates the exact reason Google went with full self-driving. There is a huge gap in the car-driver interface of driver-assisted cars. The car can warn you of dangers, but unless you are paying 100% attention, the delay between warning, recognition and response is huge. Autonomous cars have no such lag; the response is near-instantaneous. Watch the TED talk on Google cars if you want to understand why that is.
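To put rough numbers on that lag, a back-of-envelope sketch (the 1.5 s human delay and 0.1 s machine delay are illustrative assumptions, not measured figures):

```python
# Extra distance travelled during the warning-recognition-response delay,
# before any braking even begins. All delay figures here are assumptions
# chosen for illustration only.

MPH_TO_MS = 0.44704  # metres per second, per mph

def reaction_distance(speed_mph: float, delay_s: float) -> float:
    """Distance covered at constant speed while the driver/system reacts."""
    return speed_mph * MPH_TO_MS * delay_s

# At 70 mph: a human responding to a warning vs. an always-on automated response.
human = reaction_distance(70, 1.5)    # ~47 m before braking starts
machine = reaction_distance(70, 0.1)  # ~3 m
print(f"human: {human:.0f} m, machine: {machine:.0f} m")
```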

Driver assist has been out for years. Working in the collision repair industry, it is obvious to me that they do not work. We repair more cars that have hit things while parking that have parking assist than those that don't have it. I have never repaired a self-parking vehicle.

We have been conditioned to mistrust machines, but we are at the point where the machines are far superior to us. Self drive cars see equally well day or night. We don't. Self drive cars can handle huge amounts of information from various sensors. We can't. Self drive cars respond almost instantaneously to threats. We don't. Self drive cars are always alert. We aren't.

As soon as they become available from manufacturers I will be buying one. I regularly drive large distances at all hours and recognize that there is a much safer way to do that. My grandkids are worth it.
 
We repair more cars that have hit things while parking that have parking assist than those that don't have it. I have never repaired a self-parking vehicle.

I'm not sure that I understand this point, and in this case, I think it might be an interesting point, so I'm going to ask for clarification.



Are there cars that are sold as "self-parking", versus other cars that are sold with "parking assist"?

I think, and I seek correction here if I am wrong, that cars today are sold with "parking assist", and those cars tend to get into frequent accidents while parking.
 
I glanced at Musk's defensive tweets yesterday. They were kind of pathetic. Lots of "The media is lying about my cars" kind of stuff. Almost Trumpian.
Musk should never comment on anything negative regarding his enterprises (which are very unlike Trump's in that they're actually innovative and useful to society). He too often ventures into that territory.
 
I'm not sure that I understand this point, and in this case, I think it might be an interesting point, so I'm going to ask for clarification.



Are there cars that are sold as "self-parking", versus other cars that are sold with "parking assist"?

I think, and I seek correction here if I am wrong, that cars today are sold with "parking assist", and those cars tend to get into frequent accidents while parking.
Yes, there are. Note: it usually only does parallel parking.

The Tesla is one. The Tesla also has a "summoning" feature which allows it to drive itself (with no driver inside!) into and out of a garage or a tight parking spot.

Self parallel-parking systems are by now sold by most manufacturers. Some of them do only the steering, while the driver operates the gas, brake and gears (which automatically shifts the responsibility for collisions to the driver). Tesla, Mercedes and Audi are three I know of whose systems do everything by themselves. On the Tesla, you can activate it for city driving: the car alerts you if it passes a fitting spot, then you stop, tap a button, and the car moves itself into the spot.

It used to be a high-end accessory, but here in Germany it's now advertised for cars down to the low end.

I have no idea how often they get into collisions compared to normal parking. From all I've read, there's a learning curve for the driver, in particular for the systems that only steer but leave everything else to the driver. If you interfere with the steering, most systems will shut off, which some drivers won't necessarily notice. The car will also fill the free space more tightly than most human drivers would, since it uses distance sensors, leading to drivers intervening and shutting the system off (and then not being able to get out of the spot they're in, for which there is no "auto"). And you need to learn to listen for the car telling you when to shift from reverse back to forward (and you hit something when you get it wrong). Note also that a lot of drivers simply avoid parallel parking routinely, and don't remember how the car should move.

My car right now is a VW Golf, which has only ultrasonic distance sensors (the primary input sensors for all such systems). The sensors do have some trouble with dirt. But so far they have only produced false positives (alerting me to an object that isn't there); I have not yet caught them in a false negative (failing to alert me to an object that IS there).
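For anyone unfamiliar with the terms, the two failure modes can be spelled out in a few lines (a terminology sketch only, not any real sensor API):

```python
# The two failure modes of an obstacle-alert sensor, as described above.
def failure_mode(object_present: bool, sensor_alerted: bool) -> str:
    if sensor_alerted and not object_present:
        return "false positive"   # alert, but nothing there (annoying)
    if object_present and not sensor_alerted:
        return "false negative"   # something there, no alert (dangerous)
    return "correct"

print(failure_mode(object_present=False, sensor_alerted=True))
print(failure_mode(object_present=True, sensor_alerted=False))
```

The asymmetry is why false positives from a dirty sensor are tolerable while false negatives would not be.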
 
A Tesla isn't a car that can drive you to work while you sleep. This accident illustrates the exact reason Google went with full self-driving. There is a huge gap in the car-driver interface of driver-assisted cars. The car can warn you of dangers, but unless you are paying 100% attention, the delay between warning, recognition and response is huge. Autonomous cars have no such lag; the response is near-instantaneous. Watch the TED talk on Google cars if you want to understand why that is.

Driver assist has been out for years. Working in the collision repair industry, it is obvious to me that they do not work. We repair more cars that have hit things while parking that have parking assist than those that don't have it. I have never repaired a self-parking vehicle.

We have been conditioned to mistrust machines, but we are at the point where the machines are far superior to us. Self drive cars see equally well day or night. We don't. Self drive cars can handle huge amounts of information from various sensors. We can't. Self drive cars respond almost instantaneously to threats. We don't. Self drive cars are always alert. We aren't.

As soon as they become available from manufacturers I will be buying one. I regularly drive large distances at all hours and recognize that there is a much safer way to do that. My grandkids are worth it.

Yeah... that Autopilot Tesla crashed into that van pretty slick! I'm not so sure I could have pulled off quite the boner.

https://www.youtube.com/watch?v=qQkx-4pFjus
 
Self drive cars see equally well day or night. We don't.

Except when it's raining, foggy or snowing, or when there are autumn leaves in the gutter? Google itself freely admits that its car's perception systems are severely lacking at certain times. And it can't (yet) tell the difference between a dark patch on the road and a pothole. True self-driving is a long way off.
 
Except when it's raining, foggy or snowing, or when there are autumn leaves in the gutter? Google itself freely admits that its car's perception systems are severely lacking at certain times. And it can't (yet) tell the difference between a dark patch on the road and a pothole. True self-driving is a long way off.

Well we know that in this case the driver was no better than the car.

Tesla's Autopilot system was meant for short distances on private property, essentially for the car to park, and retrieve, itself. An autonomous car could never make the gross error of judgement the human driver made. Its default in poor conditions is to slow down and not overdrive its ability to monitor its environment. Humans do not follow that safety procedure very well, which is why you will have hundreds of human-driven cars in a pile-up in poor conditions. That wouldn't happen with autonomous cars.
 
