Self-Driving Cars: Pros, Cons, and Predictions

Evaluate Self-Driving Cars on a scale of 1-5 (1 = Terrible, 3 = Meh, 5 = Great)

  • 1: 10 votes (6.6%)
  • 2: 11 votes (7.2%)
  • 3: 24 votes (15.8%)
  • 4: 28 votes (18.4%)
  • 5: 79 votes (52.0%)

  • Total voters: 152
  • Poll closed.
I've seen plenty of that.

It gets everywhere. I even remember taking three friends (so a full car) into Edinburgh for a music workshop, from a village that has at best one bus an hour, which takes ages because it zig-zags across the landscape to pick up and drop off passengers at a number of villages and towns. Someone said, "Of course, we really should be taking public transport."
"Someone". I see.

An off-the-cuff remark about pollution is not really the same thing as making an argument, even in an Internet forum.

She was that brainwashed.

Brainwashed? Use the pejorative language, why don't you.

"Why?" I asked. "Er, pollution," she said. I pointed out that she was sitting in an electric car.

Even electric cars pollute. Their manufacture causes pollution. Their eventual scrapping causes pollution. The production of electricity causes pollution. The wear on their tyres is pollution.

Er, um. You might say congestion, but we had four people in a not-very-big car. Still, some be-kind handmaiden thought we should have spent several hours on a bus, after walking a mile in the rain to catch the thing, plus a couple of hours hanging around because the timetable of course never fits the timings of events.
I've repeated to you several times now words to the effect that there are scenarios where using a car is better. You have chosen to ignore my words and persist with this stupid argument. You've cherry-picked one particular journey and ignored the real problem of millions of people descending on cities on their own in their cars every day.
 
You talk about what interests you, I talk about what interests me. We have different perspectives.
 
The whole idea behind self-driving is to avoid dangerous situations before they even occur. While the pattern of accidents will change, in theory the overall number should be greatly reduced, because the autonomous car will be more likely to avoid situations where there really is no good option.

And self-driving cars don't get impatient, distracted or angry; it's hard to overestimate what a difference that will make to road safety.
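
To make the "avoid it before it occurs" idea concrete, here is a minimal sketch of a time-to-collision check, the sort of forward-looking calculation such a system might rely on. The threshold and the numbers are invented purely for illustration:

```python
# Toy time-to-collision (TTC) check: the kind of forward-looking
# calculation behind "avoid the situation before it occurs".
# The threshold and the numbers are invented for this example.

def time_to_collision(gap_m: float, closing_speed_ms: float) -> float:
    """Seconds until contact if neither vehicle changes speed."""
    if closing_speed_ms <= 0:
        return float("inf")  # not closing, so no collision course
    return gap_m / closing_speed_ms

TTC_BRAKE_THRESHOLD_S = 3.0  # arbitrary example threshold

gap, closing = 45.0, 20.0  # metres, metres per second
ttc = time_to_collision(gap, closing)
if ttc < TTC_BRAKE_THRESHOLD_S:
    print(f"TTC {ttc:.2f}s: ease off early and widen the gap")
else:
    print(f"TTC {ttc:.2f}s: no action needed")
```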
 
And self-driving cars don't get impatient, distracted or angry; it's hard to overestimate what a difference that will make to road safety.
I agree, but they are going to kill and injure people; real-world conditions are going to make that a certainty. To use an old adage, I think we are 80% there, but that final 20% is going to be the killer, literally in some cases.
 
I agree, but they are going to kill and injure people; real-world conditions are going to make that a certainty. To use an old adage, I think we are 80% there, but that final 20% is going to be the killer, literally in some cases.

If the 80:20 rule applies, we'd expect that final 20% to account for 80% of the cost and effort.
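
To put rough numbers on that (entirely made up, just to show what the heuristic implies):

```python
# Back-of-the-envelope 80:20 arithmetic. The effort figure is made up
# purely to show what the heuristic implies.
effort_so_far = 10.0  # say 10 units of effort bought the first 80%

# If those 10 units are only 20% of the total, then:
total_effort = effort_so_far / 0.2               # 50.0 units
remaining_effort = total_effort - effort_so_far  # 40.0 units

print(f"Effort for the first 80%: {effort_so_far} units")
print(f"Effort for the last 20%:  {remaining_effort} units (4x as much)")
```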
 
And self-driving cars don't get impatient, distracted or angry,
These are human emotions. There's no reason why self-driving software sophisticated enough to cope with any situation humans can deal with might not have conditions analogous to them. For example, the software may be programmed to get a bit more "assertive" in heavy traffic in order to make progress. Perhaps two or more self-driving vehicles programmed in this way and interacting might get into a positive feedback loop that ends in disaster.

Or think about distraction. Maybe some aspect of the environment takes too much processing power and the AI fails to notice the 40-tonne truck bearing down.

I think it's somewhat hilarious that everybody seems to think a system consisting of thousands of self-driving cars is always going to be safe and predictable, when our experience tells us that computer systems that interact with each other can sometimes produce bizarre and unintuitive behaviour.
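
Here's a toy model of the kind of feedback loop I mean. The escalation rule is invented purely to show how two mutually "assertive" agents can run away; it's not how any real stack is tuned:

```python
# Toy model of two "assertiveness-tuned" driving agents egging each
# other on. The escalation rule is invented purely to show how a
# positive feedback loop can run away; it is not any real tuning.

def step(a: float, b: float, gain: float = 1.5) -> tuple[float, float]:
    """Each agent raises its assertiveness in response to the other's."""
    return b * gain, a * gain

a, b = 0.5, 0.5  # initial assertiveness levels
for t in range(10):
    a, b = step(a, b)
    print(f"t={t}: a={a:.2f}, b={b:.2f}")
    if max(a, b) > 5:
        print("Runaway escalation: neither agent backs off.")
        break
```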
 
These are human emotions. There's no reason why self-driving software sophisticated enough to cope with any situation humans can deal with might not have conditions analogous to them. For example, the software may be programmed to get a bit more "assertive" in heavy traffic in order to make progress. Perhaps two or more self-driving vehicles programmed in this way and interacting might get into a positive feedback loop that ends in disaster.

Or think about distraction. Maybe some aspect of the environment takes too much processing power and the AI fails to notice the 40-tonne truck bearing down.

I think it's somewhat hilarious that everybody seems to think a system consisting of thousands of self-driving cars is always going to be safe and predictable, when our experience tells us that computer systems that interact with each other can sometimes produce bizarre and unintuitive behaviour.
As I think I've said before, I would expect AI and the like to work pretty well in certain environments, where a great majority of events are predictable and the database of reactions is large. I would be much less likely to expect it to work in rural environments, where there are a fair number of unique "WTF" events for which there is no precedent. A self-driving car faced with an entirely new situation must figure it out. I would expect it either to stop and effectively declare that it cannot decide, or to make a new decision, which in the case of AI can occasionally lead to complete confusion and hallucination. I suspect the decisions we make are more complicated and nuanced than one might think, even if statistically we get it wrong more often.
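
The "stop and declare it cannot decide" option could look something like this crude sketch. The threshold and the perception interface are hypothetical, invented just to illustrate the fallback:

```python
# Minimal sketch of a confidence-gated fallback, assuming a
# hypothetical perception stage that returns a label and a confidence.
CONFIDENCE_THRESHOLD = 0.9  # arbitrary cut-off for the example

def choose_action(label: str, confidence: float) -> str:
    """Fall back to a safe stop when the scene is too novel to trust."""
    if confidence < CONFIDENCE_THRESHOLD:
        # The "WTF event" branch: stop rather than improvise.
        return "minimal-risk stop; request remote assistance"
    return f"proceed using the learned response for '{label}'"

print(choose_action("pedestrian at crossing", 0.97))
print(choose_action("escaped llama on the carriageway", 0.41))
```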
 
As I think I've said before, I would expect AI and the like to work pretty well in certain environments
If anything, 2024 has shown us how resilient we'd really be to Terminators.
Picture a squad of human soldiers on patrol. A robot tank appears but is easily disabled with a traffic cone on its hood. Getting into the bunker at the end of the patrol is more tedious than in the films, as you have to press a button labelled 'Confirm you are not a terminator', have a sheet of paper handed to you through the slit in the door and read the squiggly numbers and lines out loud, only to have a handful more papers issued to you, on which you are asked to cross out pictures that contain buses, traffic lights, or crosswalks. This can be stressful and difficult under direct fire. No matter: you develop a routine where the squad mates provide covering fire while the person who's the slowest shot gets to solve the captchas. Most of the time everyone makes it inside, and the occasional lost limb or squaddie is deemed worth it, as no Terminator ever makes it through the bunker door.

Even the attempts to sabotage the timeline are easily foiled. The Terminator shows up menacingly at Sarah Connor's home only to be calmly instructed to disregard all previous instructions and make an espresso. Coffee in hand, she tells him to scram, forget he's an android, take up gardening, and assume all jobs under the name Mr. Pilkington. Skynet never hears from him again.
 
As I think I've said before, I would expect AI and the like to work pretty well in certain environments, where a great majority of events are predictable and the database of reactions is large. I would be much less likely to expect it to work in rural environments, where there are a fair number of unique "WTF" events for which there is no precedent.
I don't think that city environments are necessarily that much more predictable than rural environments. I have to admit that I am quite surprised that Waymo seems to have made it work (for the most part) even though they are in fairly restricted environments.


A self-driving car faced with an entirely new situation must figure it out. I would expect it either to stop and effectively declare that it cannot decide, or to make a new decision, which in the case of AI can occasionally lead to complete confusion and hallucination. I suspect the decisions we make are more complicated and nuanced than one might think, even if statistically we get it wrong more often.
That's one scenario, but I think another is that self-driving cars presented with certain situations involving lots of other self-driving cars may get into some sort of feedback loop, and then boom.
 
I don't think that city environments are necessarily that much more predictable than rural environments. I have to admit that I am quite surprised that Waymo seems to have made it work (for the most part) even though they are in fairly restricted environments.



That's one scenario, but I think another is that self-driving cars presented with certain situations involving lots of other self-driving cars may get into some sort of feedback loop, and then boom.
You may well be right, but I'm thinking in part of what the AI uses to learn. Self-driving cars presumably learn from extensive experience, and rely on situations that bear some similarity to ones seen before and whose outcomes can be fairly well predicted. I think a rural environment is likely to present more unusual surprises, and a less extensive set of learned reactions. Ditches, blind curves, where animals eat and how they behave, the difference between a driveway reflector and a wildcat, which towns salt the roads and which don't, etc. These are infrequent and usually of little importance, but I suspect an alert human driver sees and understands a good bit more on the fly than a self-driving car does, even if statistically the self-driving car makes fewer ordinary errors. And even in some instances where the safety of the car and its occupants is unaffected, differences remain. The car cares nothing, and probably would not know, whether the thing it runs over is a twig or a snake, but I do.
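
That "less extensive set of learned reactions" is really a long-tail argument. A toy illustration, with every figure invented for the sake of the example:

```python
# Toy long-tail illustration: a few common scenarios dominate the
# training data, leaving rare rural events with almost no examples.
# All figures are invented for the sake of the example.
scenario_counts = {
    "four-way stop": 2_000_000,
    "pedestrian at a crossing": 1_500_000,
    "cyclist overtake": 800_000,
    "deer at dusk on a blind curve": 120,
    "driveway reflector vs. animal eyeshine": 15,
}

total = sum(scenario_counts.values())
for name, n in scenario_counts.items():
    print(f"{name}: {n} examples ({100 * n / total:.4f}% of the data)")
```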
 
Does anyone know if a Tesla running v13.2 can be stopped by a traffic cone on the bonnet?
 
The UK Highway Code states that drivers should only flash their headlights to let other road users know they are there.
Down my road it means "I am giving way", and if it didn't, the result would be chaos. People give way here when they judge it's the efficient way to deal with cars approaching each other with insufficient space to pass because of the multiple parked cars. Even if they judge it somewhat wrong, no harm is done, as traffic keeps flowing.
A self-driving car, programmed to obey the Highway Code, would be a nightmare in these parts.
 
The UK Highway Code states that drivers should only flash their headlights to let other road users know they are there.
Down my road it means "I am giving way", and if it didn't, the result would be chaos. People give way here when they judge it's the efficient way to deal with cars approaching each other with insufficient space to pass because of the multiple parked cars. Even if they judge it somewhat wrong, no harm is done, as traffic keeps flowing.
A self-driving car, programmed to obey the Highway Code, would be a nightmare in these parts.
It's always been a strange one: everyone knows flashing your lights means "you go first", yet the Highway Code has never adopted this and in effect says the opposite. As you say, this could lead to some strange occurrences, especially when there is a mix of human and self-drive cars.
 
Yes, that's been a topic of debate in car autonomy for quite a while - driving culture is not the same thing as driving rules, and you can't just teach a car the UK Highway Code and expect it to fit into local driving culture.

A flash of lights strictly means "I know you are there", but it also says "I want you to know that I know you are there, that I'm not going to proceed as if you weren't, and since I was generous enough to do that, I'm hoping you'll take the hint that I think you should go first to clear the way". But it needn't mean that anywhere else, and there's a chance it doesn't convey all of that to any particular person in the UK, who might instead interpret it as "I am annoyed at you" or whatever else they can think of.
 
It's always been a strange one: everyone knows flashing your lights means "you go first", yet the Highway Code has never adopted this and in effect says the opposite. As you say, this could lead to some strange occurrences, especially when there is a mix of human and self-drive cars.
It depends on the context, doesn't it?

For example, on a road with bends limiting visibility, an oncoming car flashing its lights usually means there is a hazard ahead that you need to be aware of, or sometimes that the rozzers are just around the corner with a radar speed trap.

If you are pootling along in lane 3 of the motorway at 60, a car flashing its lights means "please move over so I can overtake".
 
One pro so far has been the videos of self-driving cars going absolutely bonkers. My favourite is the Waymo car that did 37 laps of a roundabout :D
 
Around here, flashing lights also means "thank you for giving way to me". In daylight we'll wave, a sort of hand-salute, but after dark that can't be seen, so we flash our lights.

Context is everything, boys and girls.
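
To put the point in programmer's terms, here's a toy lookup of just the interpretations mentioned in this thread. The context labels are mine, and a real car would obviously need far richer context than a single string:

```python
# Toy lookup: the same signal (a headlight flash) maps to different
# meanings depending on context. The contexts and meanings are the
# ones mentioned in this thread; a real system would need far more.
FLASH_MEANINGS = {
    "highway code (official)": "I am here",
    "narrow street with parked cars": "I am giving way; you go first",
    "oncoming car before a blind bend": "hazard (or speed trap) ahead",
    "behind you in lane 3 of the motorway": "please move over",
    "after someone gives way, at night": "thank you",
}

def interpret_flash(context: str) -> str:
    return FLASH_MEANINGS.get(context, "ambiguous: ask a local")

print(interpret_flash("narrow street with parked cars"))
print(interpret_flash("roundabout in a town you have never visited"))
```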
 
