Responsible Tesla self-driving car kills driver.

Like cruise control, there are more times when you shouldn't use it than one would think.
I've seen people use it (CC) in rain and snow, in suburban traffic (speed limit 35 and up), on rural roads, and at the end of a long day.
If you fall asleep, hit a slick spot, come to a sharp turn, or are on a very winding road, it can get you into deep **** in a hurry.
The next step was lane monitoring--an alert when you start to drift. Then came Adaptive Cruise Control, which tries to maintain a set distance between you and the car ahead.
Automatic parking is a gimmick, sure, but it is another step toward autonomous vehicles: without intervention, it gauges and controls speed and position to put the vehicle safely where it needs to be.
Tesla is the logical next step, but it still needs a living, thinking brain behind it. It's not a full autopilot--think of it as a SAS (aircraft Stability Augmentation System). It's there to help, not take over.
Google and the like are skipping over this step, and actually doing it very well, but they are not the next step. People will only accept a little at a time.
We are getting there.
Lane monitoring --> lane correction
Speed control --> adaptive speed control --> automatic braking
We have those now. The problem is combining them (a rough sketch of what that might look like is below).
These things get implemented in stages, not all at once.
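
To make the "combining them" point a bit more concrete, here's a toy sketch of stitching two of those features together. Everything in it (the sensor fields, gains, and limits) is invented for illustration, not how any real system is tuned:

```python
# Toy sketch of "combining them": an adaptive-speed piece and a lane-correction
# piece each produce a command, and a supervisor has to arbitrate between them.
# All gains, limits, and sensor fields are invented for illustration.

from dataclasses import dataclass

@dataclass
class SensorFrame:
    gap_m: float           # distance to the car ahead, metres
    own_speed: float       # our speed, m/s
    lead_speed: float      # speed of the car ahead, m/s
    lane_offset_m: float   # lateral offset from lane centre, + = drifting right

def adaptive_cruise(frame: SensorFrame, set_speed: float, min_gap: float = 30.0) -> float:
    """Acceleration command (m/s^2): hold the set speed, back off when too close."""
    if frame.gap_m < min_gap:
        # Too close: brake harder the smaller the gap, eased by any closing-speed margin.
        return max(-5.0, -0.5 * (min_gap - frame.gap_m) + 0.3 * (frame.lead_speed - frame.own_speed))
    # Otherwise nudge toward the driver's set speed.
    return max(-2.0, min(2.0, 0.2 * (set_speed - frame.own_speed)))

def lane_correction(frame: SensorFrame) -> float:
    """Steering command (radians): a gentle proportional pull back to lane centre."""
    return max(-0.05, min(0.05, -0.1 * frame.lane_offset_m))

def assist_step(frame, set_speed):
    """One combined step: the hard part is arbitrating the pieces, not each piece alone."""
    accel = adaptive_cruise(frame, set_speed)
    steer = lane_correction(frame)
    if accel < -3.0:
        steer = 0.0   # e.g. suppress lane nudges while braking hard
    return accel, steer

# Closing on a slower car while drifting right of centre:
print(assist_step(SensorFrame(gap_m=20.0, own_speed=30.0, lead_speed=25.0, lane_offset_m=0.4),
                  set_speed=33.0))
```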

There are a whole bunch of steps needed to make fully autonomous driving possible, but at some point it cannot be ambiguous who is in control; that has to be totally binary, with no incremental stages toward it.
From the website http://www.driverless-future.com/?page_id=774
Misconception 1: Driver assistance systems will evolve gradually into fully autonomous cars
...
But this evolution contains one obvious discontinuity: All of the driver assistance systems which are in use today operate only for short times and in extremely limited settings. Auto-parking operates for a few seconds with the driver watching. Emergency braking kicks in at the last moment before an inevitable crash. Lane warning comes on briefly when a car veers out of its lane.

This changes drastically once the car drives itself continuously for minutes or hours. Here, gradual evolution is impossible: from the moment that a car drives continuously, there is no margin for error; no room for gradual improvement, learning by doing or evolution. It needs to be able to cope with all short-term eventualities and crisis situations that may arise on the spot.
...
I think Tesla's system is really pushing into this ambiguity where there should be none, and semi-autonomous is dangerous territory to be in.
 
Well, I think we have a Darwin Award candidate here, folks....

IMHO, a "self-driving car" that is actually safe is a long way in the future. And I am skeptical even about that.
 
Well, this convinced me: a safety feature failed, so we need to ban all safety features on automobiles. Let's start with banning seatbelts and airbags.
 
There are a whole bunch of steps needed to make fully autonomous driving possible, but at some point it cannot be ambiguous who is in control; that has to be totally binary, with no incremental stages toward it.
From the website http://www.driverless-future.com/?page_id=774

I think Tesla's system is really pushing into this ambiguity where there should be none, and semi-autonomous is dangerous territory to be in.
We had the same issues with fly-by-wire. The initial airplane (an F-8 Crusader) had to have both a force stick and conventional controls because none of the test pilots (or engineers) trusted the computers and control systems (1971-1973). The conventional system was a backup to save the airplane. At some point, the conventional controls had to be removed, after faith in the system was justified (although the pilot did have an opt-out in the handle between his knees), which eventually (in 1974) led to a new airplane where they were never installed at all (e.g., the F-16). It was a huge leap of faith at the time.
 
Well, this convinced me: a safety feature failed, so we need to ban all safety features on automobiles. Let's start with banning seatbelts and airbags.

It's not so much that the safety feature failed as that it was misused by the driver, who expected it to do something it was not designed to do: drive the car without any human supervision.
Watching a movie while driving a car is just plain stupid, period. As I stated, Darwin Award stupid.
 
And I love how some of the lefties here are putting most of the blame on Tesla, and letting the idiot driver off the hook. Of course, if you are coming from what is basically an "All Businesses Are Basically Evil" viewpoint..........

I am not saying Tesla is blameless, if they were overselling what the Autopilot system can be used for. But if the driver, as seems to be the case, was ignoring the operating instructions, which clearly state you have to be paying attention to the road, I think that pretty much lets Tesla off the hook for legal liability.
Some people just hate private business, and think it is basically evil.
 
It's not so much that the safety feature failed as that it was misused by the driver, who expected it to do something it was not designed to do: drive the car without any human supervision.
Watching a movie while driving a car is just plain stupid, period. As I stated, Darwin Award stupid.

It was a feature that was inevitably going to be misused.

I am not sure that anything other than emergency braking or steering would work with an autopilot, short of a full self-driving car, because of the problem stated in the old saw about the most dangerous component being the nut behind the steering wheel.
 
It was a feature that was inevitably going to be misused.

I am not sure that anything other than emergency braking or steering would work with an autopilot, short of a full self-driving car, because of the problem stated in the old saw about the most dangerous component being the nut behind the steering wheel.

I am skeptical as hell about an autopilot being a good idea for a car... at least at the current stage of technology... but I think hanging Tesla out to dry, like a couple of people are trying to do, while letting the driver off the hook for his stupidity is also wrong.
 
If you've got all those sensors available, there are a lot better things you can do with them than construct an "Autopilot", which is one of the stupidest things I have ever seen put on a car. Instead, those sensors could be used in conjunction with GPS to warn the driver if he's exceeding the speed limit, or coming up on a blind intersection as he's about to crest a hill, or to tell him that the next curve is a decreasing-radius curve... so be extra careful - and it can give a lot of other good and timely warnings.
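
Something like this is all I'm suggesting -- a rough, purely illustrative sketch (the map fields and thresholds are made up; a real system would pull them from a navigation database):

```python
# Rough sketch of the "warn the driver instead" idea: compare GPS speed against a
# map-supplied limit and flag a couple of upcoming hazards. MapSegment fields are
# invented for illustration only.

from dataclasses import dataclass
from typing import Optional

@dataclass
class MapSegment:
    speed_limit_mph: float
    blind_crest_ahead_m: Optional[float]   # distance to a blind crest/intersection, if any
    decreasing_radius_curve: bool          # next curve tightens as you go through it

def driver_warnings(speed_mph, segment):
    warnings = []
    if speed_mph > segment.speed_limit_mph + 5:
        warnings.append(f"Slow down: {speed_mph:.0f} mph in a {segment.speed_limit_mph:.0f} zone")
    if segment.blind_crest_ahead_m is not None and segment.blind_crest_ahead_m < 200:
        warnings.append("Blind intersection just over the crest ahead")
    if segment.decreasing_radius_curve:
        warnings.append("Next curve tightens - be extra careful")
    return warnings

print(driver_warnings(62, MapSegment(speed_limit_mph=45,
                                     blind_crest_ahead_m=150,
                                     decreasing_radius_curve=True)))
```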
 
My prediction.

Real self-driving vehicles will start downtown in cities within 5 years ... they will have no steering wheel and will operate at low speeds (30 km/h) during nice weather only.

They will be more like platforms than cars, with a cab on top for 4 to 6 people to sit in.

I predict they will have flashing white or yellow lights and will beep their horns at every little thing in their way half a block ahead.

The traffic laws will give them right of way at every opportunity ... and have their own lanes on major streets.

They may even be able to control traffic lights.

They will cost the same as a taxi fare.

You will call them with your cell phone, like Uber.

You will pay by cell phone before you board in tandem with calling them, no cash or credit card.

They will be VERY close to the ground, making it almost impossible for them to run over pedestrians.

Every trip will be recorded on video for liability reasons ... there will be laws making it illegal to get in their way.

It will be good and everyone will love and accept it! :)
 
Message deleted.

My automatic posting program crashed
 

I doubt big cities are going to rush to inconvenience 80% of their inhabitants for the sake of a traffic service.....
 
The details of this new accident aren't fully clear. It sounds like the car didn't lose traction on the winding two-lane road but instead made some incorrect steering move and failed to avoid the wooden stakes. These were probably the thick posts that are used like guardrails. Autopilot is supposed to keep you in your lane, but it always assumes that you are watching and have your hands on the wheel. This guy wasn't doing it right. Could it have hit pedestrians on the roadside?
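
For what it's worth, that "watching with hands on the wheel" assumption usually boils down to a simple escalation timer. A toy version, with thresholds I'm inventing here just to show the idea:

```python
# Toy version (thresholds invented) of the hands-on-wheel assumption: if no
# steering-wheel torque has been sensed for too long, escalate from a nag to
# dropping out of the assist.

def hands_on_check(seconds_since_torque):
    if seconds_since_torque < 15:
        return "ok"
    if seconds_since_torque < 30:
        return "visual nag: hold the steering wheel"
    if seconds_since_torque < 45:
        return "audible warning"
    return "disengage assist and slow the car"

for t in (5, 20, 40, 60):
    print(t, "->", hands_on_check(t))
```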
 
https://www.youtube.com/watch?v=TfV-xBAWdY4

Here's an Autopilot in action that really shows a major weakness of the system: steering out of an accident without significantly reducing speed. Really...this system is far from something anyone should want to rely upon, and I question the judgement of those who do rely on this system - even a little bit.

Had the Tesla started braking hard when that truck approached it, in addition to swerving to the right, it would have been a much safer move and outcome. Instead, the Tesla swerved (and seemed to slow just a bit) but pretty much maintained its original speed. When the Tesla got back in lane, it was travelling too close behind the truck - and that's not wise considering its driver doesn't seem to be too aware of the traffic around him.

In short, Tesla's Autopilot handles the mechanics of driving, but it seems to have no logic behind it that helps it make the best choices.
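
A quick back-of-the-envelope run shows why braking matters as much as the swerve. The numbers are invented, but the arithmetic makes the point:

```python
# Illustration (all numbers invented) of the point above: swerving alone keeps the
# closing speed high, while braking during the manoeuvre sheds speed and leaves a
# much healthier gap to the slower vehicle ahead.

def gap_after_manoeuvre(own_speed, lead_speed, gap, decel, duration=3.0, dt=0.1):
    """Simulate a few seconds: we decelerate (never below the lead's speed here),
    and the gap shrinks for as long as we are still the faster vehicle."""
    for _ in range(round(duration / dt)):
        own_speed = max(lead_speed, own_speed - decel * dt)
        gap += (lead_speed - own_speed) * dt
    return round(own_speed, 1), round(gap, 1)

# Doing 30 m/s, 40 m behind a truck doing 20 m/s:
print("swerve only:   ", gap_after_manoeuvre(30.0, 20.0, 40.0, decel=0.0))   # still fast, gap ~10 m
print("swerve + brake:", gap_after_manoeuvre(30.0, 20.0, 40.0, decel=5.0))   # matched speed, gap ~30 m
```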
 
And I love how some of the lefties here are putting most of the blame on Tesla, and letting the idiot driver off the hook.

"Lefties"? You believe people's opinions on an autopilot-involved wreck to be a liberal-versus-conservative issue?

We can sit here in this forum and insist that we superiorly-intelligent people naturally understand that the system is an "enhanced safety feature" that is meant merely to complement a driver who is actively operating the vehicle and fully engaged with the controls at all times. But Tesla did not name the system something like "Augmented Drive" or "Advanced Cruise Control", it named the system "Autopilot", after a system that the general public understands to be capable of autonomously controlling the attitude of an airplane at literally every phase of flight except touchdown (and that exception only because the law requires manual control in the last few feet before landing; otherwise they would be capable of that function as well). Tesla does bear at least some culpability for choosing a name for its system that is evocative of that autonomy, which many perfectly reasonable people are bound to make dangerous inferences from.
 
But Tesla did not name the system something like "Augmented Drive" or "Advanced Cruise Control", it named the system "Autopilot", after a system that the general public understands to be capable of autonomously controlling the attitude of an airplane at literally every phase of flight except touchdown (and that exception only because the law requires manual control in the last few feet before landing; otherwise they would be capable of that function as well). Tesla does bear at least some culpability for choosing a name for its system that is evocative of that autonomy, which many perfectly reasonable people are bound to make dangerous inferences from.
This argument is garbage unless you believe that most people would find it acceptable for airplane pilots to engage autopilot and then never look at their instruments or the outside world until they're ready to disengage the system. Certainly no airline would find that in any way acceptable.
 
Not that it's acceptable, but pilots have been known to fall asleep while on autopilot. I agree it's a poor choice of words on Tesla's part.

 
