
Responsible Tesla self driving car kills driver.

BStrong:
Obviously, this must be banned:

http://www.wsj.com/articles/tesla-d...pilot-feature-is-linked-to-a-death-1467319355

Joshua Brown, a 40-year-old Ohio owner of a Tesla Model S, died when his electric car drove under the trailer of an 18-wheel truck on a highway in Williston, Fla., according to local media and public records. Mr. Brown had earlier in the year recorded a video of his car’s autopilot avoiding a crash and posted it on YouTube.
 

I call that "Death Mode". But I got a strange sense of humor....
 
Said it before, I'll say it again: I want to see the car companies develop their software and install it on a wheelchair to interact with pedestrians before they put it into live traffic.

Because this is what happens when laboratory scale development is prematurely released to the wild.
 

You...using your sound reason and good engineering practice to make decisions....you ruin the fun for everyone.

For shame....I want to live in a Dream World!
 
Why must it be banned? Did it kill 50 people in an Orlando nightclub again?
 

All kidding aside, it's one thing to be an unpaid beta tester of some consumer item that has little to no possibility of injuring you, and another to be an unpaid beta tester of something that in a worst-case scenario could kill you, your family and any unfortunate passerby.

I'd no more purchase and drive a self-driving car at this point than I'd ride a motorcycle w/o helmet and safety gear.
 
From reading the reports, I get the impression that the lighting conditions were an issue, to the extent that the tester didn't even try to override the system--so he didn't even see the truck, either.
Need. More. Data.
 
Tesla says that you are never supposed to assume that the Autopilot will prevent an accident. The driver must remain vigilant and swerve or hit the brakes just as in any other car. It has sensors that require you to keep both hands on the steering wheel, or else it sounds warnings and begins to slow itself down.

This driver never touched the brakes even though the truck was right in front of him.
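
Tesla hasn't published its actual Autopilot logic, but the escalation described above (visual warning, then audible warning, then slowing down) is easy to picture. Here's a minimal Python sketch; the timings, and the torque_sensor/car interfaces and their methods, are all invented for illustration, not Tesla's real values:

```python
import time

# Hypothetical thresholds -- Tesla has not published its actual timings.
WARN_AFTER_S = 15.0    # hands-off time before the first visual warning
CHIME_AFTER_S = 30.0   # hands-off time before an audible chime
SLOW_AFTER_S = 45.0    # hands-off time before the car starts slowing down

def monitor_hands(torque_sensor, car):
    """Escalating response to hands off the wheel, as described above."""
    hands_off_since = None
    while car.autopilot_engaged():
        if torque_sensor.hands_detected():
            hands_off_since = None            # driver is holding the wheel: reset
        else:
            now = time.monotonic()
            if hands_off_since is None:
                hands_off_since = now
            elapsed = now - hands_off_since
            if elapsed > SLOW_AFTER_S:
                car.reduce_speed_gradually()  # final stage: slow the car
            elif elapsed > CHIME_AFTER_S:
                car.sound_chime()             # second stage: audible warning
            elif elapsed > WARN_AFTER_S:
                car.show_warning()            # first stage: visual warning
        time.sleep(0.1)                       # poll at 10 Hz
```

Note that nothing in a scheme like this can force the driver to look at the road: holding the wheel and paying attention are different things, which is presumably how you get a crash with no brake input at all.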
 
It was an automatic and it killed someone, why do you ask?
"It" killed the driver? Deliberately? Like in "Duel"?

Or was it the case that in an accident situation someone got killed. So what conditions caused that accident to happen? Could it have been prevented? If so, why not? What changes might be suggested? What have we learned?
 

Tesla expressly notes and tells drivers that this autopilot feature requires them to swerve and brake themselves, and that they should be in control at all times. The reason is that Tesla's system is not very advanced and relies on a few sensors which can be fooled (as they were in this case; there does not seem to be a radar or laser sensor to measure distances to obstacles, only a visual one).

As such, the system should not be banned. Again, in this case the element which ultimately failed is the human, who evidently decided that Tesla's advice was to be ignored and used the system for a use case it was not developed for.

Ban human drivers from the road. Have only automated systems which are truly developed to work without humans. Then we can talk.
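
Whatever sensors the car actually carries, the point about a single fooled sensor can be illustrated with a toy fusion rule in Python. Everything here (the Detection type, the 40 m threshold) is invented for illustration: requiring two independent modalities to agree before emergency braking removes the single point of failure, at the cost of missing obstacles that only one sensor reports.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Detection:
    """Output of one sensor: distance to an obstacle, or None if nothing seen."""
    distance_m: Optional[float]

def should_emergency_brake(camera: Detection, radar: Detection,
                           brake_range_m: float = 40.0) -> bool:
    """Brake hard only when two independent modalities agree.

    If a vision camera is the only distance sensor, a single
    mis-classification (say, a white trailer against a bright sky)
    defeats the whole system; a second, light-independent sensor
    removes that single point of failure.
    """
    camera_sees = camera.distance_m is not None and camera.distance_m < brake_range_m
    radar_sees = radar.distance_m is not None and radar.distance_m < brake_range_m
    return camera_sees and radar_sees

# Camera fooled by glare and nothing from the second sensor: no braking.
print(should_emergency_brake(Detection(None), Detection(None)))   # False
print(should_emergency_brake(Detection(30.0), Detection(32.0)))   # True
```

The AND rule is the conservative choice for avoiding false alarms (phantom braking at highway speed is dangerous too); real systems weight sensor confidence rather than demanding strict agreement.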
 


Even though the guy had his head chopped off?
 
It seems to me that the driver must not have seen the truck at all because he never touched the brake pedal. How could a person ever suppress their own subconscious reaction and NOT hit the brakes in such a situation? How could you think that the Autopilot is going to save you right up until the moment of impact? The answer is that you wouldn't and you would slam the brakes.

This is going to be a thing with any Autopilot car. People are not going to be able to suppress their own inputs (steering and braking) in spite of knowing that the car is supposed to do it all for you. I can imagine people getting Autopilot cars and trying them a bit and deciding they don't like it or it scares them. They will turn it off and never use it again.
 

He wasn't paying attention, despite the warnings that come with the car? Other Tesla drivers have posted video of themselves fooling around 'hands off' and even pretending to be asleep - clearly not in control of the car.

If you sell 'autonomous' cars you should fully expect some drivers to get overconfident. What bothers me is that the car seems not to have known that the light conditions constituted a hazard. Is Lidar good enough?

eta: an expert is right now chatting on BBC radio and pointing out that there are non-light sensors that should have picked up the truck no matter what the light conditions. He's speculating that the height of the trailer above the road fooled the sensors.
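
That speculation lines up with a known difficulty: a forward-looking sensor must discard returns from overpasses and overhead signs it can safely pass under, and the high, flat side of a semi trailer can fall into that rejected band. A toy Python illustration, with invented numbers (the 1.0 m cut-off is made up, not Tesla's value):

```python
# Toy illustration of overhead-object rejection -- not Tesla's actual logic.
# Returns whose bottom edge sits above a cut-off height are discarded as
# bridges or signs; a trailer side starting high above the road can land
# in that rejected band, which is one way a non-light sensor could miss it.

OVERHEAD_REJECT_ABOVE_M = 1.0  # hypothetical cut-off height

def is_braking_target(bottom_edge_height_m: float) -> bool:
    """Treat a return as a real obstacle only if it reaches down toward
    bumper height; anything higher is assumed to be something the car
    can drive under."""
    return bottom_edge_height_m < OVERHEAD_REJECT_ABOVE_M

print(is_braking_target(0.3))  # True: a car's rear bumper
print(is_braking_target(1.2))  # False: a high trailer side is filtered out
```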
 

Isn't the evidence so far showing that both the human and the car's sensors were fooled? I really don't see this as an issue of an autonomous car failing (because this isn't such a vehicle); it is, sadly, yet again human error.
 
