Responsible Tesla self-driving car kills driver.

From what I read about Tesla's Autopilot, this is not a situation it is designed to handle. In short, the Autopilot shouldn't have been engaged on a highway that has intersections, and that is not a determination the Autopilot can make for itself.

I disagree with the last part. The technology is good enough, today, to recognize that he was not on the sort of road the autopilot was meant for. It could have shut itself off safely.
 
I disagree with the last part. The technology is good enough, today, to recognize that he was not on the sort of road the autopilot was meant for. It could have shut itself off safely.

Is it so? The tech that is on the road today is what was developed, tested, and marketed years ago. And remember, Autopilot is a misnomer: it was not supposed to be used without attention and is activated at the behest of the driver. It is not supposed to be fully autonomous, and it is not supposed to verify which road it is on, AFAIK.

It is more akin to a better cruise control.

People comparing this to an autonomous car are out of their minds, or have an axe to grind.
 
elgarak, the Tesla is still going to be dangerous on any highway like the Autobahn if it cannot "see" white trucks or other such vehicles. We really don't know what else the Tesla is blind to. You can certainly have a white truck perpendicular to you on the Autobahn. In America we call it a "jack-knifed semi truck", and it can occur in a variety of ways, including an accident pile-up. Would snow or snowfall actually make matters worse and cause the Tesla to be even more blind?
 
People comparing this to an autonomous car are out of their minds, or have an axe to grind.

Care to put a name to these quotes?

the Model S [is] “probably better than humans at this point in highway driving”

the car [is] “almost able to go [between San Francisco and Seattle] without touching the controls at all”.
 
This auto-mobile invention is doomed to failure! Why, at such dangerously high rates of speed, a man would either asphyxiate or combust! The good old reliable horse and buggy will continue to be man's most reliable source of conveyance for all time! Mr. Ford and his ilk will soon find themselves in debtor's prison where they belong!
 
Wrong. FFS, study the subject, watch the videos, whatever... It can park itself, so if it can't handle the pedestrians who are very likely to be around a parking area, it has no business being allowed to park itself, right?

It can't see red lights, it can't navigate, it can't turn a corner. Therefore it isn't meant for urban driving. Self-parking is not self-driving.

Maybe it can slam on the brakes if a pedestrian runs out into the highway, I don't know. But it is not a self-driving car. It's a fancy cruise control.
 
Those 130 million autopilot miles are mostly "easy" miles.

Still, it's an impressive record.
We still don't know the statistics for non-fatal accidents using Tesla Autopilot within those 130 million miles. If Brown had ducked his head at the very last instant, he might be alive, and there would still be no fatalities at 130 million miles in spite of the Tesla not seeing the truck at all.

The autopilot shouldn't have been engaged in this circumstance.
What? Musk would say that this was a perfect highway for the use of the Tesla Autopilot.

Moreover, it should not have been possible to engage it in this circumstance.
How would something like that be incorporated into the Tesla software?
 
elgarak, the Tesla is still going to be dangerous on any highway like the Autobahn if it cannot "see" white trucks or other such vehicles. We really don't know what else the Tesla is blind to seeing. You can certainly have a white truck perpendicular to you on the Autobahn. In America we call it a "jack-knifed semi truck" and it can occur from a variety of ways including an accident pile-up. Would snow or snowfall actually make matters worse and cause the Tesla to be even more blind?

Encountering a semi traveling perpendicular is a common experience on a highway with intersections. It's highly uncommon to encounter one on the interstate.

What you describe is the general problem of autonomous driving. That's why Tesla's Autopilot is still in beta, and that's why the Google car cannot be bought and is not yet allowed to drive in most jurisdictions.

My point being: one doesn't ban new technology because there are idiots who don't understand it and use it in situations where one shouldn't. I certainly understood that the situation Brown was in was not one the Autopilot is capable of handling, and I've never even driven a Tesla.

That is pending an investigation, and pending whether Tesla themselves oversold the capabilities of their Autopilot. I doubt it, considering what they have said repeatedly. It doesn't matter that there are tons of videos of people taking their hands off the wheel while on Autopilot -- none of them are following the advice of their car's manufacturer. Though I could see Tesla having oversold a "go out and try it" angle in sales negotiations. If so, they are partially to blame, though the main blame is on the driver. From what I have read so far, this is the equivalent of a reckless driver; his behavior in some of his other videos certainly was.
 
Am I the only one wondering if the guy was trying to make another "Autopilot saved me" video?

And regarding the Harry Potter story, the truck driver, who is ultimately at fault anyhow, is trying to cover his backside.
 
Like Cruise Control, there are more times when you shouldn't use it than one would think.
I've seen people use it (CC) during rain and snow, in suburban traffic (speed limit 35 and up), on rural roads, and at the end of a long day.
If you fall asleep, hit a slick spot, come to a sharp turn, or are on a very winding road, it can get you into deep **** in a hurry.
The next step was lane monitoring--an alert when you start to drift. Then adaptive cruise control, which tries to maintain a set distance between you and the car ahead.
Automatic parking is a gimmick, sure, but it is a next step toward autonomous vehicles: without intervention, it gauges and controls speed and position to safely put the vehicle where it needs to be.
Tesla is the logical next step, but it still needs a living, thinking brain behind it. It's not a full autopilot--think of it as an SAS (aircraft Stability Augmentation System). It's there to help, not take over.
Google and the like are jumping over this step, and actually doing it very well, but they are not the next step. People will only accept a little at a time.
We are getting there.
Lane monitoring-->lane correction
Speed control-->Adaptive speed control--> automatic braking
We have those now. The problem is combining them, as the sketch below tries to show.
These things get implemented in stages, not all at once.
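To make "combining them" a bit more concrete, here is a rough sketch in Python of what the arbitration might look like. Every name, gain, and threshold in it is invented for illustration; no claim that Tesla or anyone else does it this way.

```python
# Illustrative sketch only: the classes, gains and thresholds below are
# invented for discussion, not taken from any real driver-assist system.

from dataclasses import dataclass


@dataclass
class SensorFrame:
    lane_offset_m: float      # lateral offset from lane centre, metres
    lane_confidence: float    # 0.0-1.0, how well lane markings were detected
    lead_gap_m: float         # distance to the vehicle ahead, metres
    own_speed_mps: float      # current speed, metres/second


def lane_correction(frame):
    """Lane correction: a small steering nudge proportional to the drift."""
    return -0.1 * frame.lane_offset_m            # hypothetical gain


def adaptive_speed(frame, set_speed_mps, min_gap_m=40.0):
    """Adaptive speed control: hold the set speed unless the gap ahead closes."""
    if frame.lead_gap_m < min_gap_m:
        return max(0.0, frame.own_speed_mps - 2.0)   # ease off
    return set_speed_mps


def needs_automatic_braking(frame, ttc_limit_s=2.0):
    """Automatic braking: trigger if time-to-collision drops below a limit."""
    ttc = frame.lead_gap_m / max(frame.own_speed_mps, 0.1)
    return ttc < ttc_limit_s


def combined_step(frame, set_speed_mps):
    """The hard part: arbitrate between the helpers and hand control back
    to the driver whenever any of them loses confidence."""
    if frame.lane_confidence < 0.5:
        return {"handover": True, "reason": "lane markings unclear"}
    if needs_automatic_braking(frame):
        return {"handover": True, "brake": True, "reason": "obstacle ahead"}
    return {
        "handover": False,
        "steer": lane_correction(frame),
        "target_speed": adaptive_speed(frame, set_speed_mps),
    }
```

The individual helpers are trivial on their own; the hard part, and the point of the sketch, is that the combined step has to decide every cycle whether to trust them or hand control back to the human.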
 
Am I the only one wondering if the guy was trying to make another "Autopilot saved me" video?
Yep. His onboard video camera was not recording.

And regarding the Harry Potter story, the truck driver, who is ultimately at fault anyhow, is trying to cover his backside.
The DVD player and the Harry Potter movie disc were recovered by the police. The movie was still playing when the truck driver went to look at the Tesla right after the accident. That is how the trucker was able to say what movie Brown was watching. The sources for that information are the Associated Press and Reuters.
 
And remember, Autopilot is a misnomer: it was not supposed to be used without attention and is activated at the behest of the driver.
Which are attributes it shares with the autopilot found on airplanes -- it isn't supposed to be used without the attention of a pilot either.
 
Watch and learn. There's going to be a lot of Tesla Cheerleaders out there on the interwebs who are going to figure out how to blame this guy for his own death long before the facts are established. In a way, this reminds me of the Glock Handgun Owners who unfailingly find ways to blame other Glock Owners when they shoot themselves in the leg with their own Glocks (a condition known as "Glock Leg") :)
 
Watch and learn. There's going to be a lot of Tesla Cheerleaders out there on the interwebs who are going to figure out how to blame this guy for his own death long before the facts are established. In a way, this reminds me of the Glock Handgun Owners who unfailingly find ways to blame other Glock Owners when they shoot themselves in the leg with their own Glocks (a condition known as "Glock Leg") :)

Tesla is leading the way with their weasel-worded press releases.
 
What? Musk would say that this was a perfect highway for the use of the Tesla Autopilot.


How would something like that be incorporated into the Tesla software?

Maybe I've misunderstood what sort of road Brown was on.

What I know is that the car is equipped with vision sensors, and that those sensors are capable of seeing things such as stoplights, buildings, and traffic signs. Algorithms exist today that could be used to determine the nature of the street on which the car is driving. Is it an interstate highway? A thoroughfare with lots of stoplights and plenty of traffic? A residential street? A two-lane country road?

It could be programmed to make that determination, and not allow the system to be used in cases that are not recommended by the manufacturer.
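For what it's worth, here is a bare-bones sketch in Python of the kind of gate I mean. The road classes, feature names, and the "allowed" policy are mine for illustration, not anything taken from Tesla's software.

```python
# Bare-bones sketch of the gating idea. The road classes, feature names and
# the "allowed" policy are invented for illustration; this is not Tesla code.

def classify_road(features):
    """Guess the road type from things the vision system can already detect."""
    if features["stoplights_per_km"] > 0 or features["pedestrians_seen"] > 0:
        return "urban"
    if features["cross_traffic_intersections_per_km"] > 0:
        return "highway_with_intersections"
    if features["divided"] and features["controlled_access"]:
        return "interstate"
    return "unknown"


ALLOWED_FOR_AUTOPILOT = {"interstate"}   # hypothetical manufacturer policy


def may_engage(features):
    road = classify_road(features)
    return road in ALLOWED_FOR_AUTOPILOT, road


# A divided highway that still has cross traffic -- roughly Brown's situation:
ok, road = may_engage({
    "stoplights_per_km": 0,
    "pedestrians_seen": 0,
    "cross_traffic_intersections_per_km": 1.5,
    "divided": True,
    "controlled_access": False,
})
print(ok, road)   # -> False highway_with_intersections
```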
 
Yep. His onboard video camera was not recording.


The DVD player and the Harry Potter movie disc were recovered by the police. The movie was still playing when the truck driver went to look at the Tesla right after the accident. That is how the trucker was able to say what movie Brown was watching. The sources for that information are the Associated Press and Reuters.

So he never got to see the end of the movie? Bummer.
 
Maybe I've misunderstood what sort of road Brown was on.

What I know is that the car is equipped with vision sensors, and that those sensors are capable of seeing things such as stoplights, buildings, and traffic signs. Algorithms exist today that could be used to determine the nature of the street on which the car is driving. Is it an interstate highway? A thoroughfare with lots of stoplights and plenty of traffic? A residential street? A two-lane country road?

It could be programmed to make that determination, and not allow the system to be used in cases that are not recommended by the manufacturer.

Are roads classified in this way through the satnav system? If so, that would be an easy way to allow or disallow this system from kicking in.

Even then, there have been plenty of reports pointing out that the Tesla Autopilot gets confused if the lane markings are poor, so it would need to notify the driver that it was having trouble in that respect and get itself switched off.
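If the nav data did expose a road class, the check could be as simple as something like the sketch below. get_road_class() and the class names are placeholders; I have no idea what Tesla's map data actually provides.

```python
# Placeholder sketch: get_road_class() and the class names stand in for
# whatever the satnav/map data actually exposes; I don't know Tesla's API.

ALLOWED_CLASSES = {"motorway"}        # e.g. controlled-access roads only


def get_road_class(latitude, longitude):
    """Stand-in for a map lookup; a real system would query its nav database."""
    raise NotImplementedError


def autopilot_permitted(latitude, longitude, lane_confidence):
    """Allow engagement only on permitted road classes, and demand a handover
    when lane detection gets shaky -- covering both points above."""
    try:
        road_class = get_road_class(latitude, longitude)
    except NotImplementedError:
        return False, "no map data available"
    if road_class not in ALLOWED_CLASSES:
        return False, "road class '%s' not supported" % road_class
    if lane_confidence < 0.6:          # arbitrary threshold, for illustration
        return False, "lane markings unclear -- notify driver and disengage"
    return True, "ok"
```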
 
