
Self-Driving Cars: Pros, Cons, and Predictions

Evaluate Self-Driving Cars on a scale of 1-5 (1 = Terrible, 3 = Meh, 5 = Great)

  • 1: 10 votes (6.6%)
  • 2: 11 votes (7.2%)
  • 3: 24 votes (15.8%)
  • 4: 28 votes (18.4%)
  • 5: 79 votes (52.0%)

Total voters: 152. Poll closed.
I've seen plenty of that.

It gets everywhere. I even remember taking three friends (so a full car) into Edinburgh for a music workshop, from a village that has at best one bus an hour, which takes ages because it zig-zags across the landscape to pick up and drop passengers at a number of villages and towns. Someone said that, of course, we really should be taking public transport.
"Someone". I see.

An off-the-cuff remark about pollution is not really the same thing as making an argument, even on an Internet forum.

She was that brainwashed.

Brainwashed? Use the pejorative language why don't you.

Why? I asked. Er, pollution, she said. I pointed out that she was sitting in an electric car.

Even electric cars pollute. Their manufacture causes pollution. Their eventual scrapping causes pollution. The production of electricity causes pollution. The wear on their tyres is pollution.

Er, um. You might say congestion, but we had four people in a not-very-big car. But still some be-kind handmaiden thought we should have spent several hours on a bus, after walking a mile in the rain to get the thing, and a couple of hours hanging around because the timetable of course never fits in with the timings of events.
I've repeated to you several times now words to the effect that there are scenarios where using a car is better. You have chosen to ignore my words and persist with this stupid argument. You've cherry-picked one particular journey and ignored the real problem of millions of people descending on cities alone in their cars every day.
 
You talk about what interests you, I talk about what interests me. We have different perspectives.
 
The whole idea behind self-driving is to avoid situations before they even occur. While the types of incident will change, in theory they should be greatly reduced because the autonomous car will be more likely to avoid situations where there really is no good option.

And self-driving cars don't get impatient, distracted or angry; it's hard to overestimate what a difference that will make to road safety.
 
And self-driving cars don't get impatient, distracted or angry; it's hard to overestimate what a difference that will make to road safety.
I agree, but they are going to kill and injure people; real-world conditions are going to make that a certainty. To use an old adage, I think we are 80% there, but that final 20% is going to be the killer, literally in some cases.
 
I agree, but they are going to kill and injure people; real-world conditions are going to make that a certainty. To use an old adage, I think we are 80% there, but that final 20% is going to be the killer, literally in some cases.

If the 80:20 rule applies, we'd expect that final 20% to account for 80% of the cost and effort.
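
A rough back-of-the-envelope, if the split held literally (the numbers are purely illustrative, not a forecast):

Code:
# If reaching 80% capability took effort E, and that is only 20% of the
# total work, the remaining 20% of capability costs another 4E.
effort_so_far = 1.0                # normalise effort spent so far to 1
share_of_total_work = 0.20         # Pareto: first 80% was 20% of the work
total_effort = effort_so_far / share_of_total_work   # 5.0
remaining_effort = total_effort - effort_so_far      # 4.0
print(f"Remaining effort: {remaining_effort:.0f}x what has been spent")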
 
And self-driving cars don't get impatient, distracted or angry,
These are human emotions, but there's no reason why self-driving software sophisticated enough to cope with any situation humans can deal with couldn't have analogous conditions. For example, the software may be programmed to get a bit more "assertive" in heavy traffic in order to make progress. Perhaps two or more self-driving vehicles programmed in this way and interacting might get into a positive feedback loop that ends in disaster.

Or think about distraction. Maybe some aspect of the environment takes too much processing power and the AI fails to notice the 40-tonne truck bearing down.

I think it's somewhat hilarious that everybody seems to think a system consisting of thousands of self-driving cars is always going to be safe and predictable, when our experience tells us that computer systems that interact with each other can sometimes produce bizarre and unintuitive behaviour.
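
A toy sketch of that feedback loop idea (pure illustration, not any real AV control logic; the coupling and damping constants are made up):

Code:
# Two vehicles each raise their "assertiveness" in response to the other's.
a = b = 0.1                  # starting assertiveness of each vehicle
excite, damp = 0.12, 0.05    # made-up coupling strength and decay-to-calm
for step in range(1, 21):
    a, b = a - damp * a + excite * b, b - damp * b + excite * a
    if step % 5 == 0:
        print(f"step {step:2d}: a = {a:.2f}, b = {b:.2f}")
# Because excitation (0.12) exceeds damping (0.05) the levels grow without
# bound; swap the two constants and they quietly decay back toward zero.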
 
These are human emotions, but there's no reason why self-driving software sophisticated enough to cope with any situation humans can deal with couldn't have analogous conditions. For example, the software may be programmed to get a bit more "assertive" in heavy traffic in order to make progress. Perhaps two or more self-driving vehicles programmed in this way and interacting might get into a positive feedback loop that ends in disaster.

Or think about distraction. Maybe some aspect of the environment takes too much processing power and the AI fails to notice the 40-tonne truck bearing down.

I think it's somewhat hilarious that everybody seems to think a system consisting of thousands of self-driving cars is always going to be safe and predictable, when our experience tells us that computer systems that interact with each other can sometimes produce bizarre and unintuitive behaviour.
As I think I've said before, I would expect AI and the like to work pretty well in certain environments, where a great majority of events are predictable, and the database of reactions large. I would be much less likely to expect it to work in rural environments, where there are a fair number of unique "WTF" events, for which there is no precedent. A self-driving car faced with an entirely new situation must figure it out. I would expect it either to stop and effectively declare it cannot decide, or make a new decision, which, it seems in the case of AI, can occasionally lead to complete confusion and hallucination. I suspect the decisions we make are more complicated and nuanced than one might think, even if statistically we get it wrong more often.
 
As I think I've said before, I would expect AI and the like to work pretty well in certain environments
If anything, 2024 has shown us how resilient we'd really be to Terminators.
Picture a squad of human soldiers on patrol. A robot tank appears but is easily disabled with a traffic cone on its hood. Getting into the bunker at the end of the patrol is more tedious than in the films, as you have to press a button labelled 'Confirm you are not a terminator', have a sheet of paper handed to you through the slit in the door and read the squiggly numbers and lines out loud, only to have a handful more papers issued to you, on which you are asked to cross out pictures that contain buses, traffic lights, or crosswalks. This can be stressful and difficult under direct fire. No matter; you develop a routine where the squad mates provide covering fire while the person who's the slowest shot gets to solve the captchas. Most of the time everyone makes it inside, and the occasional lost limb or squaddie is deemed worth it, as no Terminator ever makes it through the bunker door.

Even the attempts to sabotage the timeline are easily foiled. The Terminator shows up menacingly at Sarah Connor's home only to be calmly instructed to disregard all previous instructions and make an espresso. Coffee in hand, she tells him to scram, forget he's an android, take up gardening, and assume all jobs under the name Mr. Pilkington. Skynet never hears from him again.
 
As I think I've said before, I would expect AI and the like to work pretty well in certain environments, where a great majority of events are predictable, and the database of reactions large. I would be much less likely to expect it to work in rural environments, where there are a fair number of unique "WTF" events, for which there is no precedent.
I don't think that city environments are necessarily that much more predictable than rural environments. I have to admit that I am quite surprised that Waymo seems to have made it work (for the most part) even though they are in fairly restricted environments.


A self-driving car faced with an entirely new situation must figure it out. I would expect it either to stop and effectively declare it cannot decide, or make a new decision, which, it seems in the case of AI, can occasionally lead to complete confusion and hallucination. I suspect the decisions we make are more complicated and nuanced than one might think, even if statistically we get it wrong more often.
That's one scenario, but I think another is that self-driving cars presented with certain situations involving lots of other self-driving cars may get into some sort of feedback loop, and then boom.
 
I don't think that city environments are necessarily that much more predictable than rural environments. I have to admit that I am quite surprised that Waymo seems to have made it work (for the most part) even though they are in fairly restricted environments.



That's one scenario, but I think another is that self-driving cars presented with certain situations involving lots of other self-driving cars may get into some sort of feedback loop, and then boom.
You may well be right, but I'm thinking in part of what the AI uses to learn. Self-driving cars presumably learn from extensive experience, and rely on situations that have some similarity and whose outcomes can be fairly predicted. I think a rural environment is likely to present more unusual surprises, and a less extensive set of learned reactions. Ditches, blind curves, where some animals eat, how they behave, the difference between a driveway reflector and a wildcat, which towns salt the roads and which don't, etc. Infrequent, usually of little importance, but I suspect an alert human driver sees and understands a good bit more on the fly than a self-driving car does, even if statistically the self-driving car makes fewer ordinary errors. And even in some instances where the safety of the car and its occupants is unaffected, differences remain. The car neither cares, nor probably knows, whether the thing it runs over is a twig or a snake, but I do.
 
Does anyone know if a Tesla running v13.2 can be stopped by a traffic cone on the bonnet?
 
The UK Highway Code states that drivers should only flash their headlights to let other road users know they are there.
Down my road it means "I am giving way", and if it didn't then the result would be chaos. People give way here when they judge that it's the efficient way to deal with cars approaching each other and insufficient space to pass in the presence of multiple parked cars. Even if they're somewhat wrong then no harm is done, as traffic keeps flowing.
A self-driving car, programmed to obey the Highway Code, would be a nightmare in these parts.
 
The UK Highway Code states that drivers should only flash their headlights to let other road users know they are there.
Down my road it means "I am giving way", and if it didn't then the result would be chaos. People give way here when they judge that it's the efficient way to deal with cars approaching each other and insufficient space to pass in the presence of multiple parked cars. Even if they're somewhat wrong then no harm is done, as traffic keeps flowing.
A self-driving car, programmed to obey the Highway Code, would be a nightmare in these parts.
Always been a strange one - everyone knows flashing your lights means "you go first", yet the Highway Code has never adopted this and in effect says the opposite. As you say, this could lead to some strange occurrences, especially when there is a mix of human and self-driving cars.
 
Yes, that's been a topic of debate in car autonomy for quite a while - driving culture is not the same thing as driving rules, and you can't just teach a car the UK Highway Code and expect it to fit into local driving culture.

A flash of lights strictly means "I know you are there", but it also says "I want you to know that I know you are there, that I'm not going to proceed as if you weren't, and since I was generous enough to do that, I'm hoping you'll take the hint that I think you should go first to clear the way". But it needn't mean that elsewhere, and there's a chance it doesn't convey all of that to any particular person in the UK, who might instead interpret it as "I am annoyed at you" or whatever else they can think of.
 
Always been a strange one - everyone knows flashing your lights means "you go first", yet the Highway Code has never adopted this and in effect says the opposite. As you say, this could lead to some strange occurrences, especially when there is a mix of human and self-driving cars.
It depends on the context, doesn't it.

For example, on a road with bends limiting visibility, an oncoming car flashing its lights usually means there is a hazard ahead that you need to be aware of, or sometimes, the rozzers are just around the corner with a radar speed trap.

If you are pootling along in lane 3 of the motorway at 60, a car flashing its lights means "please move over so I can overtake".
 
One pro so far has been the videos of self-driving cars going absolutely bonkers. My favourite is the Waymo car that did 37 laps of a roundabout :D
 
Around here, flashing lights also means, "thank you for giving way to me". In daylight we'll wave, a sort of hand-salute, but after dark that can't be seen, so flash your lights.

Context is everything, boys and girls.
 
Around here, flashing your lights means "there are cops up the road".

You can also flash your lights to indicate to a large truck that is passing you that there is now space to merge back into your lane. They will flick their blinkers to acknowledge.
 
That's something we don't have to worry about in New Zealand. I don't run over anything, and I would expect a self-driving car to do the same unless you told it to.
Which would be fine unless you live where I do, where after a wind or ice storm (which is frequent) the road is littered with sticks and twigs, and a car that refuses to run over anything would end up stopping every few feet and demanding instructions. I suppose eventually we might get a self-driving car so sophisticated and well taught that it would be able to make the distinction, add to it the judgment of relative risks, and while at it decide, based on various criteria, whether it's snake season or not. In the meantime, I think I'll skip it.
 
Around here, flashing your lights means "there are cops up the road".

You can also flash your lights to indicate to a large truck that is passing you that there is now space to merge back into your lane. They will flick their blinkers to acknowledge.

That too, although I gather it's actually illegal. (Interestingly, Google Maps via Android Auto has recently added a feature that allows you to note the presence of cops on the road.)

Flashing lights can mean all sorts of things and it's up to a driver seeing them to figure out what. It takes a surprisingly long time for the message "it is dark but your lights are not on" to be understood, and as far as I can tell, "it is perfectly clear but your rear fog light is on" is never decoded.

I don't know how many people flashed me about my trailing undertray before the garage pointed it out. It was a lot.
 
That too, although I gather it's actually illegal. (Interestingly, Google Maps via Android Auto has recently added a feature that allows you to note the presence of cops on the road.)

Flashing lights can mean all sorts of things and it's up to a driver seeing them to figure out what. It takes a surprisingly long time for the message "it is dark but your lights are not on" to be understood, and as far as I can tell, "it is perfectly clear but your rear fog light is on" is never decoded.

I don't know how many people flashed me about my trailing undertray before the garage pointed it out. It was a lot.
If I'm understanding AI correctly, it can learn to figure it out the same way a human does.

Lots of data input and pattern recognition.
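
Something like this, at toy scale. A minimal, hypothetical sketch of "lots of data plus pattern recognition", where a fleet just counts which meaning most often followed a flash in each context (the contexts, labels and data are all invented):

Code:
from collections import Counter, defaultdict

# Invented (context, meaning) observations a fleet might have logged.
observations = [
    ("narrow_street", "giving_way"), ("narrow_street", "giving_way"),
    ("motorway_lane3", "move_over"), ("blind_bend", "hazard_ahead"),
    ("narrow_street", "giving_way"), ("motorway_lane3", "move_over"),
]

counts = defaultdict(Counter)
for context, meaning in observations:
    counts[context][meaning] += 1

def interpret_flash(context):
    # Return the meaning most often seen in this context, if any.
    seen = counts.get(context)
    return seen.most_common(1)[0][0] if seen else "unknown"

print(interpret_flash("narrow_street"))   # giving_way
print(interpret_flash("car_park"))        # unknown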
 
Sometimes I see a note that a particular hazard has been reported by Waze drivers, so I think they've joined forces.
 
If I'm understanding AI correctly, it can learn to figure it out the same way a human does.

Lots of data input and pattern recognition.
In which case the AI would be in breach of the Highway Code, some aspects of which have the force of law. A quick Google search comes up with:
"Yes, it is illegal to flash your headlights in the UK for anything other than to let other drivers know you are there."
 
In which case the AI would be in breach of the Highway Code, some aspects of which have the force of law. A quick Google search comes up with:
"Yes, it is illegal to flash your headlights in the UK for anything other than to let other drivers know you are there."
Well ackshually....

In fact the relevant Highway Code rules are

Rule 110

Flashing headlights. Only flash your headlights to let other road users know that you are there. Do not flash your headlights to convey any other message or intimidate other road users.

Rule 111

Never assume that flashing headlights is a signal inviting you to proceed. Use your own judgement and proceed carefully.


"Must" or "Must not" is not used so there is no specific offence wrt headlight flashing. However, and this is a point about the Highway Code that is often misunderstood, just because there is no specific offence related to the flashing of headlights does not mean you could not be charged with careless or inconsiderate driving for doing it.
 
The UK Highway Code states that drivers should only flash their headlights to let other road users know they are there.
Down my road it means "I am giving way", and if it didn't then the result would be chaos. People give way here when they judge that it's the efficient way to deal with cars approaching each other and insufficient space to pass in the presence of multiple parked cars. Even if they're somewhat wrong then no harm is done, as traffic keeps flowing.
A self-driving car, programmed to obey the Highway Code, would be a nightmare in these parts.
Would it ever be able to understand the idea of flashing the hazard lights for one or two cycles as thanks for allowing you to cut in?
 
I agree, but they are going to kill and injure people; real-world conditions are going to make that a certainty. To use an old adage, I think we are 80% there, but that final 20% is going to be the killer, literally in some cases.
They not only are, they have.

Human drivers kill far more. 40 to 45 thousand in the US every year. I personally do not have enough information to say whether we are 40% or 99% there. My guess is you don't know either.
 
Around here, flashing your lights means "there are cops up the road".

You can also flash your lights to indicate to a large truck that is passing you that there is now space to merge back into your lane. They will flick their blinkers to acknowledge.
Generally one flash of the high beams is "turn yours off, you're blinding me", two is "you come on, I'll stall here where I'm pulled in", and three is "peelers/speed van are running a trap, slow down". Though on the last one you'll get people flashing you from miles away with the van pointed the wrong direction.
 
A lot different, that. Elevators are a closed loop.
Different, but the same.

I don't know when self-driving vehicles will be able to be fully autonomous. But I'm 100 percent positive that they will be at some point. I'm also confident that people are skeptical and will continue to be. That is, until they aren't.

I've driven... actually, been driven... for thousands of miles, 99% of the time by a self-driving vehicle. It is totally strange at first. And eventually it becomes boring. And just like the elevator without an operator, you accept it as routine.
 
They not only are, they have.

Human drivers kill far more. 40 to 45 thousand in the US every year.
And that's just the tip of the iceberg.
Medically consulted injuries in motor-vehicle incidents totaled 5.2 million in 2022, and total motor-vehicle injury costs were estimated at $481.2 billion. Costs include wage and productivity losses, medical expenses, administrative expenses, motor-vehicle property damage, and employer costs.
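
Dividing those two quoted figures gives a rough per-injury average (a crude back-of-the-envelope only, ignoring how the costs are actually distributed):

Code:
injuries = 5.2e6        # medically consulted injuries, 2022
total_cost = 481.2e9    # estimated total motor-vehicle injury costs, USD
print(f"~${total_cost / injuries:,.0f} per injury")   # ~$92,538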

I personally do not have enough information to say whether we are 40% or 99% there. My guess is you don't know either.
The autonomous vehicles in use today are already much safer than human drivers, so technically we are past 100%. The biggest hurdle now is public acceptance.

Recently a health insurance CEO was murdered, and many argued this was a good thing because by denying coverage the company was killing far more. I look forward to those people advocating the same for anyone spreading FUD about this lifesaving vehicle technology.
 