Responsible Tesla self-driving car kills driver.

The truck driver said that the Tesla continued for hundreds of feet after going under his truck and even knocked down a pole and ended up in a lawn. It's possible that the car didn't even apply the brakes or properly steer after the roof was sheared off.

Perhaps the interior should be fitted with sensors to count the number of heads per passenger, and if the number becomes unequal at any point it would trigger the car to realize something was wrong.
 
Hmmm....what's the beeping?

Oh...the car can't drive it safely....why not?

Oh...****. (I'm counting on the autocensor here.)


There's no way that any car that can drive "safely" most of the time but occasionally needs an operator to avoid death should be allowed on the road. By the time the driver becomes alert and realizes what the problem is, someone will be dead.
 
Perhaps the interior should be fitted with sensors to count the number of heads per passenger, and if the number becomes unequal at any point it would trigger the car to realize something was wrong.

*speechless* :D
 
How does the Tesla work with pedestrians? Will it knock them down like bowling pins? How does it deal with children playing and running around near the road? Does it focus on their potential movements just like a human brain does?
 
How does the Tesla work with pedestrians? Will it knock them down like bowling pins? How does it deal with children playing and running around near the road? Does it focus on their potential movements just like a human brain does?

How does the cruise control on your car deal with those things now? This is only intended for use on highways. I don't think it can handle red lights or pedestrians.
 
How does the cruise control on your car deal with those things now? This is only intended for use on highways. I don't think it can handle red lights or pedestrians.

Wrong. FFS, study the subject, watch the videos, whatever... It can park itself, and pedestrians are very likely to be around a parking area; if it couldn't handle pedestrians, it would have no business being allowed to park itself, right?
 
AP & Reuters are now saying that the driver had a portable DVD player with him and was watching Harry Potter. After passing under the truck it travelled another quarter mile before hitting a telephone pole. The driver had 8 speeding tickets. Friends describe him as fearless and a speed demon.
 
"Your roof is ajar."

"General car catastrophe. You are advised to stop at your convenience"

Seriously, this looks like a massive dose of hubris from Tesla that the licensing authorities should have stamped on in a jiffy. "Beta" software on a car, ffs?? On public roads?

Tesla have set the business back years.
 
Maybe the Autopilot doesn't see telephone poles either. Or maybe it was a white pole against a pale sky. Thanks, Musk.
 
Press reports are saying that the Tesla system couldn't distinguish between the bright white side of the truck it hit and the brightly lit sky around it. Question: What is the Tesla mechanism that tells it when to brake? I imagined that it was some kind of radar. But if it depends on the unit comparing images of the world against a database, I don't see how it could be reliable.
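FWIW, systems like this typically have to combine a radar return with a camera detection before they're allowed to command hard braking, and the reported explanation (a white trailer against a bright sky) would defeat the camera half of that. A rough sketch of the general idea follows; this is emphatically not Tesla's actual logic, which isn't public, and every name and threshold in it is invented for illustration:

```c
/* Generic illustration of radar + camera fusion before automatic braking.
 * NOT Tesla's implementation -- all structures, names, and thresholds are
 * invented for the sake of the example. */
#include <stdbool.h>

typedef struct {
    bool   target_present;     /* radar sees something in our path        */
    double range_m;            /* distance to it, metres                  */
    double closing_speed_mps;  /* how fast we are approaching it, m/s     */
} radar_track_t;

typedef struct {
    bool   object_detected;    /* vision system classified an obstacle    */
    double confidence;         /* 0.0 .. 1.0                              */
} camera_object_t;

/* Brake only when both sensors agree and time-to-collision is short.
 * A bright trailer the camera cannot pick out of the sky defeats the
 * "agreement" requirement, so no braking gets commanded. */
bool should_emergency_brake(const radar_track_t *radar,
                            const camera_object_t *camera)
{
    if (!radar->target_present || !camera->object_detected)
        return false;                       /* require agreement */

    if (camera->confidence < 0.5)
        return false;

    if (radar->closing_speed_mps <= 0.0)
        return false;                       /* not actually closing */

    double time_to_collision_s = radar->range_m / radar->closing_speed_mps;
    return time_to_collision_s < 2.0;       /* arbitrary threshold */
}
```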
 
Clark Howard on the radio was talking about his Tesla and from memory he said he took his hands off the wheel, leaned back, and let it drive him down the freeway.

When these self-driving cars arrive, are we going to be sitting in the driver's seat in a state of near panic, ready to take over at the slightest sign of malfunction? I'd just as soon drive.

It is interesting to see that similar fears existed about Cruise Control when that came out. Snopes has the classic myth covered:

http://www.snopes.com/autos/techno/cruise.asp
 
Said it before, I'll say it again: I want to see the car companies develop their software and install it on a wheelchair to interact with pedestrians before they put it into live traffic.

Because this is what happens when laboratory scale development is prematurely released to the wild.

Elsewhere I linked to articles about Toyota's throttle software; it really doesn't seem like a good idea to have them make a fully driverless car without a complete overhaul of the way they approach software:
<cut for irrelevance to this subject>

Are you sure about the highlighted part?

http://www.edn.com/design/automotiv...ler-firmware--Bad-design-and-its-consequences

Barr's ultimate conclusions were that:
Toyota’s electronic throttle control system (ETCS) source code is of unreasonable quality.
Toyota’s source code is defective and contains bugs, including bugs that can cause unintended acceleration (UA).
Code-quality metrics predict presence of additional bugs.
Toyota’s fail safes are defective and inadequate (referring to them as a “house of cards” safety architecture).
Misbehaviors of Toyota’s ETCS are a cause of UA.

Hardware

Although the investigation focused almost entirely on software, there is at least one HW factor: Toyota claimed the 2005 Camry's main CPU had error detecting and correcting (EDAC) RAM. It didn't. EDAC, or at least parity RAM, is relatively easy and low-cost insurance for safety-critical systems.
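Where hardware EDAC isn't available, one common (and only partial) software mitigation is to store each safety-critical variable twice, once bit-inverted, and check the pair on every read. A minimal sketch of that idea, not anything Toyota-specific:

```c
/* Minimal sketch of "mirrored variable" storage: keep a critical value and
 * its bitwise complement, and treat any mismatch as corruption.  A cheap,
 * partial substitute for hardware EDAC/parity RAM; not Toyota's code. */
#include <stdint.h>
#include <stdbool.h>

typedef struct {
    uint16_t value;
    uint16_t value_inverted;   /* always kept equal to ~value */
} mirrored_u16_t;

static void mirrored_write(mirrored_u16_t *m, uint16_t v)
{
    m->value = v;
    m->value_inverted = (uint16_t)~v;
}

/* Returns false (and leaves *out untouched) if a bit flip is detected. */
static bool mirrored_read(const mirrored_u16_t *m, uint16_t *out)
{
    if ((uint16_t)(m->value ^ m->value_inverted) != 0xFFFFu)
        return false;          /* corruption: caller must fail safe */
    *out = m->value;
    return true;
}
```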

Other cases of throttle malfunction have been linked to tin whiskers in the accelerator pedal sensor. This does not seem to have been the case here.

Thousands and thousands

The Camry ETCS code was found to have 11,000 global variables. Barr described the code as “spaghetti.” Using the Cyclomatic Complexity metric, 67 functions were rated untestable (meaning they scored more than 50). The throttle angle function scored more than 100 (unmaintainable).
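For anyone unfamiliar with the metric: cyclomatic complexity is roughly the number of decision points in a function plus one, i.e. the number of independent paths you would need to test. A toy illustration (made up for this post, not from the Camry code):

```c
/* Toy illustration of cyclomatic complexity, not from the Camry code.
 * Each decision point (if, else-if, case, &&, ||) adds one to the metric;
 * complexity = decisions + 1.  This function has 4 decisions, so V(G) = 5.
 * The Camry throttle-angle function reportedly scored over 100. */
#include <stdbool.h>

int classify_pedal(int raw, int min, int max, bool sensor_ok)
{
    if (!sensor_ok)              /* decision 1 */
        return -1;
    if (raw < min)               /* decision 2 */
        return 0;
    if (raw > max)               /* decision 3 */
        return 100;
    if (max == min)              /* decision 4: avoid divide-by-zero */
        return 0;
    return (raw - min) * 100 / (max - min);
}
```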

Toyota loosely followed the widely adopted MISRA-C coding rules but Barr’s group found 80,000 rule violations. Toyota's own internal standards make use of only 11 MISRA-C rules, and five of those were violated in the actual code. MISRA-C:1998, in effect when the code was originally written, has 93 required and 34 advisory rules. Toyota nailed six of them.
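To give a flavour of what a MISRA-C violation looks like, here is a generic textbook example (not lines from Toyota's code) of a switch statement that breaks the "every case ends in a break, every switch has a default" rules, followed by a conforming version:

```c
/* Generic examples of common MISRA-C violations -- illustrative only,
 * not taken from Toyota's source. */

/* Violates the rule requiring every switch to have a default clause,
 * and the rule against fallthrough between cases: */
void set_mode_bad(int mode, int *out)
{
    switch (mode) {
    case 1:
        *out = 10;
    case 2:              /* unintended fallthrough from case 1 */
        *out = 20;
        break;
    }                    /* no default: unexpected 'mode' silently ignored */
}

/* MISRA-conforming shape of the same logic: */
void set_mode_good(int mode, int *out)
{
    switch (mode) {
    case 1:
        *out = 10;
        break;
    case 2:
        *out = 20;
        break;
    default:
        *out = 0;        /* defined behaviour for unexpected input */
        break;
    }
}
```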

Further reading here: http://embeddedgurus.com/barr-code/2013/10/an-update-on-toyota-and-unintended-acceleration/

In March 2014, the U.S. Department of Justice announced a $1.2 billion settlement in a criminal case against Toyota. As part of that settlement, Toyota admitted to past lying to NHTSA, Congress, and the public about unintended acceleration and also to putting its brand before public safety. Yet Toyota still has made no safety recalls for the defective engine software.

On April 1, 2014, I gave a keynote speech at the EE Live conference, which touched on the Toyota litigation in the context of lethal embedded software failures of the past and the coming era of self-driving vehicles. The slides from that presentation are available for download at http://www.barrgroup.com/killer-apps/.

Elsewhere in reading about this, it was stated that if a critical bit was flipped, the only way of resetting the throttle to stop it accelerating was to press it to full and then release it.
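One fail-safe that would mitigate a stuck-open throttle whatever the root cause (bit flip, task death, or anything else) is an independent brake-override: if the driver has been braking while the commanded throttle stays high, force the throttle closed. A sketch of the idea, with hypothetical names and thresholds rather than Toyota's actual architecture:

```c
/* Sketch of an independent brake-override fail-safe: if the driver has been
 * braking while the throttle command stays high, force the throttle closed.
 * Hypothetical names and thresholds; not Toyota's actual architecture. */
#include <stdint.h>
#include <stdbool.h>

#define OVERRIDE_AFTER_MS  200u   /* how long brake+throttle may coexist */

uint8_t apply_brake_override(uint8_t throttle_cmd_pct,
                             bool brake_pressed,
                             uint32_t *conflict_ms,   /* caller keeps this across calls */
                             uint32_t dt_ms)
{
    if (brake_pressed && throttle_cmd_pct > 20u) {
        *conflict_ms += dt_ms;
        if (*conflict_ms >= OVERRIDE_AFTER_MS)
            return 0u;            /* brake wins: close the throttle */
    } else {
        *conflict_ms = 0u;
    }
    return throttle_cmd_pct;
}
```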

And some more background:

http://www.eetimes.com/document.asp?doc_id=1319903&page_number=3

What's next for NHTSA
After the Oklahoma trial, what steps should the NHTSA be taking? Barr made some suggestions:

NHTSA needs to get Toyota to make its existing cars safe and also needs to step up on software regulation and oversight. For example, FAA and FDA both have guidelines for safety-critical software design (e.g., DO-178) within the systems they oversee. NHTSA has nothing.
Also, NHTSA recently mandated the presence and certain features of black boxes in all US cars, but that rule does not go far enough. We observed that Toyota's black box can malfunction during unintended acceleration specifically, and this will cause the black box to falsely report no braking. NHTSA's rules need to address this, e.g., by being more specific about where and how the black box gets its data, so that it does not have a common failure point with the engine computer.

There also seemed to be a mechanical problem with the floor mats causing the accelerator pedal to stick *as well*.
 
Clark Howard on the radio was talking about his Tesla and from memory he said he took his hands off the wheel, leaned back, and let it drive him down the freeway.

When these self-driving cars arrive, are we going to be sitting in the driver's seat in a state of near panic, ready to take over at the slightest sign of malfunction? I'd just as soon drive.


Humans are very bad at supervision; machines are far better.

It would be better if it were the other way round. A machine to take over in an emergency - with a manual override switch in case of a situation outside the programming parameters.
 
How does the cruise control on your car deal with those things now? This is only intended for use on highways. I don't think it can handle red lights or pedestrians.

Cruise control cannot be engaged (in all the cars I've driven that had it) below a certain speed, and if your car drops below that threshold, it disengages. The threshold is typically something like 35 mph / 50 kph, so it won't engage when you're driving through a city at city speed limits.
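That engage/disengage rule is simple to express. A hypothetical sketch of it (the thresholds are invented for illustration, check your own car's manual):

```c
/* Hypothetical sketch of a conventional cruise control's engage/disengage
 * rule around a minimum speed; thresholds are invented for illustration. */
#include <stdbool.h>

#define CRUISE_MIN_SPEED_KPH  50.0   /* ~35 mph, a typical minimum */

typedef struct {
    bool   engaged;
    double set_speed_kph;
} cruise_state_t;

void cruise_update(cruise_state_t *c, double speed_kph,
                   bool engage_request, bool brake_pressed)
{
    /* Refuse to engage below the minimum speed. */
    if (engage_request && !c->engaged && speed_kph >= CRUISE_MIN_SPEED_KPH) {
        c->engaged = true;
        c->set_speed_kph = speed_kph;
    }

    /* Drop out on braking or when speed falls below the threshold. */
    if (c->engaged && (brake_pressed || speed_kph < CRUISE_MIN_SPEED_KPH)) {
        c->engaged = false;
    }
}
```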

In the Brown crash, the truck crossed into Brown's lane at an intersection, traveling perpendicular to Brown. He crashed into/beneath the wide side of the trailer.

From what I read about Tesla's Autopilot, this is not a situation it is designed to handle. In short, the Autopilot shouldn't have been engaged on a highway that has intersections, which is not a determination the Autopilot can make. What it is designed for is interstate/German Autobahn-style driving, where there are no intersections and all cars travel more or less parallel in the same direction.

I don't use cruise control for pretty much the same reason: I seldom drive in situations where it can make my driving easier. It works pretty well on US interstates, because the speed variations there are not very broad, thanks mostly to a rather low general speed limit. On German Autobahns, where there is no general speed limit, the variations in speed between cars are so large, and enough drivers disregard posted limits, that cruise control doesn't pay off: you are constantly on alert to disengage it before you run into someone. This is exactly the driving situation Tesla's Autopilot is intended to solve, I gather. It's essentially keep-in-lane cruise control with variable speed based on observing other cars.
 
Considering this is the first fatality in 130 million autopilot miles driven when the national average is one fatality for every 93 million miles, that is a good sign the current assist technology is working well (though of course there is always room for improvement, like getting other humans to stop driving their cars into oncoming traffic).

Apples and oranges. The autopilot is only (supposed to be) engaged for interstate highway driving, which covers large expanses of very safe miles. Those 130 million autopilot miles are mostly "easy" miles.
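Just to put the two figures quoted above side by side, a back-of-the-envelope calculation and nothing more:

```c
/* Back-of-the-envelope comparison of the two fatality rates quoted above. */
#include <stdio.h>

int main(void)
{
    double autopilot_miles_per_fatality  = 130e6;  /* 1 death in 130M miles */
    double us_average_miles_per_fatality = 93e6;   /* 1 death in 93M miles  */

    printf("Autopilot:  %.2f deaths per 100M miles\n",
           100e6 / autopilot_miles_per_fatality);   /* ~0.77 */
    printf("US average: %.2f deaths per 100M miles\n",
           100e6 / us_average_miles_per_fatality);  /* ~1.08 */
    return 0;
}
```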

Still, it's an impressive record. Self driving cars will get there, and I believe will do so in my lifetime*, but it isn't ready yet. The autopilot shouldn't have been engaged in this circumstance. Moreover, it should not have been possible to engage it in this circumstance.


*I'm 53. I'm confident that by the time I retire, I will be able to let my car drive for me.
 
