Responsible Tesla self-driving car kills driver.

Well we know that in this case the driver was no better than the car.

ETA: Actually the human is far worse, because the car was not designed to do any of the things the dead driver decided he would allow it to do. It's like putting a blind person behind the wheel and then blaming them for the inevitable crashes they are involved in.

Luckily for this guy's family, insurance companies do insure for stupid. I hope they enjoy the money.
 
Well we know that in this case the driver was no better than the car.

Tesla's Autopilot system was meant for short distances on private property, essentially for the car to park and retrieve itself. An autonomous car could never make the gross error of judgement the human driver made. Its default in poor conditions is to slow down and not overdrive its ability to monitor its environment. Humans do not follow that safety procedure very well, which is why you will have hundreds of human-driven cars in a pile-up in poor conditions. That wouldn't happen with autonomous cars.

No. That's the summoning feature, where the car drives itself without a driver inside. IIRC, it is currently limited to 10 m max (or was it 10 ft = 3 m?) at very low speed (kinda like automatic-transmission creep). It uses the ultrasonic sensors to detect obstacles. There's a video of the first released version, tested by a private customer, which seems to show that, at the time, the obstacle sensing did not work overly well (the bumper touched the test objects, like plastic buckets and stuffed animals). Not that it would hurt anyone, but not well enough to satisfy dissenters.
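
For illustration, here's a minimal sketch of the creep-and-check loop such a feature implies. Everything in it (the function names, the 0.5 m stop threshold, even picking 10 m as the limit) is my own assumption, not Tesla's actual code or parameters:

Code:
# Hypothetical sketch of a "summon"-style creep loop (not Tesla's code).
# Assumed interface: nearest_obstacle_m() returns the closest ultrasonic
# reading in metres; creep(d) inches the car forward d metres; stop() halts.

MAX_TRAVEL_M = 10.0      # the roughly-10 m travel limit mentioned above
STOP_THRESHOLD_M = 0.5   # assumed clearance; too small and bumpers touch

def summon_forward(nearest_obstacle_m, creep, stop, step_m=0.05):
    travelled = 0.0
    while travelled < MAX_TRAVEL_M:
        if nearest_obstacle_m() < STOP_THRESHOLD_M:
            stop()             # obstacle too close: halt immediately
            return "obstacle"
        creep(step_m)          # move forward at walking pace
        travelled += step_m
    stop()                     # distance budget exhausted
    return "limit_reached"

On that reading, the bumper touching the buckets and stuffed animals would mean the effective stop threshold was too small, or the ultrasonic sensors simply didn't register small, soft objects.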

Autopilot is Tesla's name for all of its driving assistance systems. It is usually meant for situations where the car keeps in its lane, makes lane changes autonomously on driver request, and regulates its speed depending on other traffic. It requires a driver, and it does check, intermittently and dynamically, that a driver is present (by monitoring steering wheel movements). Other car manufacturers require, for similar systems, that the driver's hands be present on the wheel, monitored by touch sensors.
 
No. That's the summoning feature, where the car drives itself without a driver inside. IIRC, it is currently limited to 10 m max (or was it 10 ft = 3 m?) at very low speed (kinda like automatic-transmission creep). It uses the ultrasonic sensors to detect obstacles. There's a video of the first released version, tested by a private customer, which seems to show that, at the time, the obstacle sensing did not work overly well (the bumper touched the test objects, like plastic buckets and stuffed animals). Not that it would hurt anyone, but not well enough to satisfy dissenters.

Autopilot is Tesla's name for all of its driving assistance systems. It is usually meant for situations where the car keeps in its lane, makes lane changes autonomously on driver request, and regulates its speed depending on other traffic. It requires a driver, and it does check, intermittently and dynamically, that a driver is present (by monitoring steering wheel movements). Other car manufacturers require, for similar systems, that the driver's hands be present on the wheel, monitored by touch sensors.

Exactly my point. This guy died because of his own stupidity, not because there was a problem with the car. Calling the car self-driving, or stating that this is proof that machines are less reliable than humans, is just ignorance talking. This accident is a perfect illustration of why driver-assist cars are a hazard and why autonomous cars should be the goal.
 
Exactly my point. This guy died because of his own stupidity, not because there was a problem with the car. Calling the car self-driving, or stating that this is proof that machines are less reliable than humans, is just ignorance talking. This accident is a perfect illustration of why driver-assist cars are a hazard and why autonomous cars should be the goal.

Ah. Ok. I understand now.

As a goal, I certainly agree, but I think even with today's technology some sort of collision-avoidance system could work well.

There should definitely be a clear delineation between what the car can do and what it cannot, so that it is nearly impossible to think the car is going to be looking out for you.
 
He was speeding - or do you say the Tesla was speeding, since it was driving?

DailyMail said:
The driver killed when his Tesla sedan crashed while in self-driving mode was traveling at 9 mph above the speed limit just before hitting the side of a tractor-trailer, federal accident investigators said Tuesday.

Data downloaded from the Tesla Model S shows the vehicle was traveling at 74 mph in a 65-mph zone on a divided highway in Williston, Florida, near Gainesville, the National Transportation Safety Board said in a preliminary report...


http://www.dailymail.co.uk/news/article-3709368/Investigators-say-Tesla-car-speeding-time-crash.html
 
Seems like Tesla Autopilot does not do a great job at avoiding pedestrians. Here's a test of the system where a pedestrian nearly gets run over:

http://www.autoblog.com/2016/06/27/tesla-diy-pedestrian-avoidance-test/

Here's a description of the Autopilot's "Desire to Drive". Even though the car was totaled and the driver dead, the Tesla Autopilot kept on driving after it went under an 18-wheeler:

http://www.abcactionnews.com/news/l...ot-crash-say-car-drove-hundreds-of-yards-with

Which... it is strange that the Tesla should broadside an 18-wheeler on Autopilot, because 18-wheelers are so big. Take a look at the 18-wheeler that the Tesla hit:

http://www.9news.com.au/world/2016/...rash-that-killed-driver-during-autopilot-test
 
Seems like Tesla Autopilot does not do a great job at avoiding pedestrians. Here's a test of the system where a pedestrian nearly gets run over:

http://www.autoblog.com/2016/06/27/tesla-diy-pedestrian-avoidance-test/

Here's a description of the Autopilot's "Desire to Drive". Even though the car was totaled and the driver dead, the Tesla Autopilot kept on driving after it went under an 18-wheeler:

http://www.abcactionnews.com/news/l...ot-crash-say-car-drove-hundreds-of-yards-with

Which... it is strange that the Tesla should broadside an 18-wheeler on Autopilot, because 18-wheelers are so big. Take a look at the 18-wheeler that the Tesla hit:

http://www.9news.com.au/world/2016/...rash-that-killed-driver-during-autopilot-test

I don't understand why you are having so much trouble understanding this: Teslas are not autonomous cars and thus CANNOT drive themselves. "Autopilot" is merely a software upgrade. Autonomous vehicles have not been cleared for sale to the public. Driver-assist cars have. Teslas are driver-assist vehicles, so it makes sense they will crash when idiots expect them to drive themselves.
 
qayak, I think you should say "may not" drive itself. It obviously *can* under many conditions, which is why it is dangerous when people use it incorrectly.
 
qayak, I think you should say "may not" drive itself. It obviously *can* under many conditions, which is why it is dangerous when people use it incorrectly.

No, sorry. This car could not pass a driving test. It cannot drive itself. It requires human attention and input. An autonomous car requires nothing from a human except a destination.
 
No, sorry. This car could not pass a driving test. It cannot drive itself. It requires human attention and input. An autonomous car requires nothing from a human except a destination.

It should be obliged to get human input, and should stop safely in the absence of it.
In this case the car didn't know about the lack of input, or didn't respond correctly. Even when the driver was dead. It's a failure of the manufacturer, born of a tendency to hype its product in a way that's bound to give owners the wrong idea about the car's abilities.
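
As a minimal sketch of what that obligation could look like, assuming a simple timer-based watchdog (the names and the 10 s / 30 s timings are illustrative assumptions, not any manufacturer's actual values):

Code:
# Hypothetical driver-input watchdog (illustration only, not Tesla's logic).
# Assumed interface: idle_seconds() is the time since the last steering or
# pedal input; warn() alerts the driver; safe_stop() begins a controlled stop.

WARN_AFTER_S = 10.0   # assumed: alert the driver after 10 s without input
STOP_AFTER_S = 30.0   # assumed: pull over after 30 s without input

def watchdog_tick(idle_seconds, warn, safe_stop):
    idle = idle_seconds()
    if idle >= STOP_AFTER_S:
        safe_stop()   # driver unresponsive: slow down and stop safely
    elif idle >= WARN_AFTER_S:
        warn()        # escalate with audible and visual prompts to re-engage

Run every control cycle, that gives exactly the behaviour described above: no input eventually means a safe stop, dead driver or not.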
 
Seems like Tesla Autopilot does not do a great job at avoiding pedestrians. Here's a test of the system where a pedestrian nearly gets run over:

http://www.autoblog.com/2016/06/27/tesla-diy-pedestrian-avoidance-test/

Here's a description of the Autopilot's "Desire to Drive". Even though the car was totaled and the driver dead, the Tesla Autopilot kept on driving after it went under an 18-wheeler:

http://www.abcactionnews.com/news/l...ot-crash-say-car-drove-hundreds-of-yards-with

Which... it is strange that the Tesla should broadside an 18-wheeler on Autopilot, because 18-wheelers are so big. Take a look at the 18-wheeler that the Tesla hit:

http://www.9news.com.au/world/2016/...rash-that-killed-driver-during-autopilot-test

I won't comment on the latter two links, as they are badly written references to the event that triggered this thread, and enough has been written about that.

Re. the first link: A little analysis reveals that the "test" those two dumbasses perform is worth about zilch. It does not test the ability of the Tesla's emergency braking feature. Here's a little physics: at the 18 mph they use, the braking distance is about 13 ft. That's very close, and most drivers would react long before that. Given that they use a live person (which is why they're dumbasses, mostly), the driver is likely to react early, and is almost certain to interrupt the emergency braking system before it can properly engage. Things would change at higher speed, or if the driver just let the system go, but even those dumbasses realize how dangerous that is without switching to inanimate obstacles. There are other problems, such as that we do not know exactly what the driver is doing. Where does he have his hands and feet? He doesn't show us, he doesn't tell us. If he has his hands on the wheel, that adds another way to interfere with the system.
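
For reference, the arithmetic behind that 13 ft figure, assuming roughly 0.8 g of braking deceleration on dry pavement (the friction value is my assumption, not from the post):

\[
d = \frac{v^2}{2\mu g} = \frac{(8.05\ \mathrm{m/s})^2}{2 \times 0.8 \times 9.81\ \mathrm{m/s^2}} \approx 4.1\ \mathrm{m} \approx 13.5\ \mathrm{ft},
\]

where \(v = 18\ \mathrm{mph} \approx 8.05\ \mathrm{m/s}\). A gentler, more realistic deceleration only lengthens the distance, so about 13 ft is the best case.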

Because, if you go just one link further (a link provided on the page you link to), you'll find a quote from the Tesla manual, which describes the system a bit more. In particular, it points out that the system turns OFF any time the driver does ANYTHING: steers, applies the brake pedal, or APPLIES THE ACCELERATOR PEDAL. The latter might sound insane to people who misunderstand the system and think of it as an autonomous driving system, but it's a perfectly fine design for a driving assistance system: one that assumes an attentive and capable driver (except for those times when he is not), that cannot produce a full virtual image of the surrounding area, and that is not capable of identifying the full shape and nature of obstacles.
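
That design rule, any driver input immediately wins, is easy to state as a sketch (the boolean inputs and function names are my assumptions for illustration, not from the manual):

Code:
# Hypothetical sketch of "driver input always overrides" (not Tesla's code).
# Assumed per-cycle boolean flags read from the car's controls.

def assist_cycle(steering_moved, brake_pressed, accelerator_pressed,
                 run_assist, disengage):
    if steering_moved or brake_pressed or accelerator_pressed:
        disengage()   # ANY driver action, including the accelerator,
                      # hands control straight back to the human
    else:
        run_assist()  # no driver input: keep lane and regulate speed

Under such a rule, a driver touching the accelerator cancels the automatic braking, which fits the earlier point about the test driver being almost certain to interrupt the system before it engages.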
 
qayak, I think you should say "may not" drive itself. It obviously *can* under many conditions, which is why it is dangerous when people use it incorrectly.

No, sorry. This car could not pass a driving test. It cannot drive itself. It requires human attention and input. An autonomous car requires nothing from a human except a destination.

I disagree - as GlennB has pointed out, a drive-safely feature would be as he describes.

It should be obliged to get human input, and should stop safely in the absence of it.
In this case the car didn't know about the lack of input, or didn't respond correctly. Even when the driver was dead. It's a failure of the manufacturer, born of a tendency to hype its product in a way that's bound to give owners the wrong idea about the car's abilities.
The Tesla currently has a system that can be abused as an autopilot, and its inadequate performance isn't immediately apparent as it can perform many manoeuvres that are required for driving.

The fact that the car would be unable to pass a driving test is evidence of its unsuitability, but that in itself isn't evidence that it can't be used as an autopilot - it has been used as an autopilot. My local paper often carries court reports of people who are banned from driving or driving without a license. These people may not drive, but they do so sufficiently often to keep a regular section of the local newspaper full. They are often caught due to their bad driving.

It is forbidden to drive itself; unfortunately, it can be used like that.
 
It should be obliged to get human input, and should stop safely in the absence of it.
In this case the car didn't know about the lack of input, or didn't respond correctly. Even when the driver was dead. It's a failure of the manufacturer, born of a tendency to hype its product in a way that's bound to give owners the wrong idea about the car's abilities.

Exactly! And it's the reason I think Tesla is a bit of a stupid company.
 
It should be obliged to get human input, and should stop safely in the absence of it.
In this case the car didn't know about the lack of input, or didn't respond correctly. Even when the driver was dead. It's a failure of the manufacturer, born of a tendency to hype its product in a way that's bound to give owners the wrong idea about the car's abilities.

Why?

Once more, it's a driving assistance system. That means it should not interfere with human actions, because it does not (and with the current sensor suites, cannot) get full knowledge of its environment the way the human can, but tries to supplement the human's sensing where it's lacking.

There's the notion that assistance systems are baby steps towards a fully automated car, but I tend to disagree. Driving assistance and autonomous driving are, right now, different paradigms. One intends to help the driver (warn him, make small adjustments if he doesn't, and take extreme measures like a full stop only as a final resort) but REQUIRES the driver to be the last line of control, while the other tends to make the driver obsolete (essentially, turn him into cargo), with the driver staying away from the controls completely.

Confusing the two is what leads to those accidents. Pretty much all accidents were abuses of the assistance system to make it do things it was not intended to do.
 
I disagree - as GlennB has pointed out, a drive-safely feature would be as he describes.

The Tesla currently has a system that can be abused as an autopilot, and its inadequate performance isn't immediately apparent as it can perform many manoeuvres that are required for driving.

The fact that the car would be unable to pass a driving test is evidence of its unsuitability, but that in itself isn't evidence that it can't be used as an autopilot - it has been used as an autopilot. My local paper often carries court reports of people who are banned from driving or driving without a license. These people may not drive, but they do so sufficiently often to keep a regular section of the local newspaper full. They are often caught due to their bad driving.

It is forbidden to drive itself; unfortunately, it can be used like that.

And around we go. The entire issue is that they require a driver to be completely engaged but are the very reason drivers disengage. Tesla addresses those two facts in their user instructions for the system.

Tesla is also being investigated because they failed to disclose the accident and sold a bunch of stock between the time of the accident and when they announced it, after which their shares plummeted.

Not exactly a socially responsible company.
 
Elon Musk's push for autopilot unnerves some Tesla employees.


CNN said:
Even before Tesla reported the first known death of a driver using its autopilot feature, some employees worried the car company wasn't taking every possible precaution.

Those building autopilot were acutely aware that any shortcoming or unforeseen flaw could lead to injury or death -- whether it be blind spots with the car's sensors or drivers misusing the technology.
But Tesla founder and CEO Elon Musk believes that autopilot has the potential to save lives by reducing human error -- and has pushed hard to get the feature to market.

The team's motto is "not to let the perfect be the enemy of the better," according to a source close to Tesla. For Musk specifically, the source says his driving force is "don't let concerns slow progress."
However, Tesla CEO Elon Musk brushed aside certain concerns as negligible compared to autopilot's overall lifesaving potential.

Some Tesla (TSLA) employees struggled with this balance, according to interviews CNNMoney conducted with five current and former Tesla employees, including several from the autopilot division, most of whom spoke on condition of anonymity...

Musk pushed back against employees who raised concerns about autopilot that he viewed as "overly cautious," according to one former Tesla executive who worked closely with the CEO...

Raj Rajkumar, an autonomous car pioneer at Carnegie Mellon, frequently meets with employees from auto companies at conferences and research events. According to Rajkumar, Tesla employees he has met "say it's an understatement to say [Tesla] is hyperaggressive."

When Rajkumar has raised concerns with those Tesla employees about autopilot's technical limitations, the response is they have to "wash their hands of it" because "it's a business decision."

Multiple employees told CNNMoney that numbers and data matter above all else for winning arguments with Musk and other top execs...


http://money.cnn.com/2016/07/28/technology/elon-musk-tesla-autopilot/index.html
 
And around we go. The entire issue is that they require a driver to be completely engaged but are the very reason drivers disengage. Tesla addresses those two facts in their user instructions for the system.

Tesla is also being investigated because they failed to disclose the accident and sold a bunch of stock between the time of the accident and when they announced it, after which their shares plummeted. Not exactly a socially responsible company.

Oh, I didn't know that; such things tend to be frowned upon. Even if one were completely amoral, I'd have thought the ratio of risk to reward was too great.
 
One less clown with more money than brains; the car did the world a favor.
Darwin adds another to his list.
I don't see the point in a self-driving car that requires you to sit there, vigilant, prepared to drive at any instant.
It's like the autopilot in aircraft. The more you require the pilots to use it, the less they fly, and the more their perishable skills atrophy. (See Air France flight 447 for one of many examples.)
Smarter machines make stupider operators.
Correct.
AP & Reuters are now saying that the driver had a portable DVD player with him and was watching Harry Potter.
Reason enough to thank Darwin.
 
