Self-Driving Cars: Pros, Cons, and Predictions

Evaluate Self-Driving Cars on a scale of 1-5 (1 = Terrible, 3 = Meh, 5 = Great)

  • 1: 10 votes (6.6%)
  • 2: 11 votes (7.2%)
  • 3: 24 votes (15.8%)
  • 4: 28 votes (18.4%)
  • 5: 79 votes (52.0%)

Total voters: 152. Poll closed.
The actual seat belt usage achieved was ~60% in 1995, rising to 70% in 2000 and leveling off at ~90% by 2015. It's only taken 48 years to get there. Meanwhile cars today have multiple airbags, partly because some people still refuse to wear their seat belts.
Wouldn't it just be easier to say "I was wrong about this?" Mandatory seat belt usage laws didn't come into force until the mid-to-late 80s, when usage was around 10%. Getting to 60% usage by the 90s and 90% usage today constitutes a success, not a failure.
 
I would crash my car before hitting a pedestrian. Even though I have trained myself not to swerve into the incoming lane if a car pulls out, I probably still would if it was a person falling down on the road. Cars are designed to crumple when hit to protect the occupants; people aren't.

I'm surprised that the Audi driver was injured. You can see that the airbag deployed, so either it didn't catch sideways movement or the bag itself injured him. Either way this was a far better outcome than running over the pedestrian.

But hey, never let a chance to bash Tesla go by, right?
Maybe you could highlight the Tesla bashing in my post?

I shared the example because it had recently appeared on my FB feed, although it seems it happened earlier last year. It also seems unlikely, based on circumstantial evidence, that the car was self-driving at the time, but that was still the speculation in the post, even though enough time has passed that the facts might now be known.

Pointing out that avoiding the pedestrian was what the car did (whether controlled by the driver or the car itself) is hardly Tesla bashing, since it means driver safety wasn't the prime concern.
 
Maybe you could highlight the Tesla bashing in my post?

I shared the example because it had recently appeared on my FB feed, although it seems it happened earlier last year. It also seems unlikely, based on circumstantial evidence, that the car was self-driving at the time, but that was still the speculation in the post, even though enough time has passed that the facts might now be known.

Pointing out that avoiding the pedestrian was what the car did (whether controlled by the driver or the car itself) is hardly Tesla bashing, since it means driver safety wasn't the prime concern.
I'd have said it was an example of a self-driving car (if that was what it was) behaving as we'd expect a lot of humans to. Hardly a bashing of anything.
 
I'd have said it was an example of a self-driving car (if that was what it was) behaving as we'd expect a lot of humans to. Hardly a bashing of anything.
Yes, that's what I was trying to get across. If the car was self-driving (and it seems unlikely to have been), then it would have been choosing between hitting a soft pedestrian and a hard oncoming vehicle, and it chose the latter. That seems, from the human point of view, to have been the safest result, though not necessarily one that a car would be able to reason out. (Not saying that the human involved had time to apply reason, either, but if the car's going to be in charge, I want it to be doing a better job than a human can.)
 
@Roger Ramjets
Except I never said it was impossible, a la Lord Kelvin, nor that it had to be 100% risk free. Just that it has to get quite a bit better than it is today.

The thing is, we already have thing A that works and, yes, has a certain level of risk. There is hardly any reason to replace it with thing B that doesn't work as well in exactly the cases where it matters most. (Well, except if you're in the habit of driving drunk, without a permit, while texting, or all of the above.)

To use your nuclear power example, if we already could get the same power output for actually cheaper from something less risky (e.g., fusion), we wouldn't be building fission power plants.

WHEN thing B gets better than thing A, sure, go ahead. Until then, I'll not be using thing B. Is all I'm saying.
 
Except….
Except when it's controlling the points, and a trolley is rolling down and will splat one group or the other depending on the robot's decision. Did Asimov consider that dilemma?
 
Except when it's controlling the points, and a trolley is rolling down and will splat one group or the other depending on the robot's decision. Did Asimov consider that dilemma?
Yes - if I recall, it's in one of the shorts in the I, Robot anthology. You can view Asimov's earlier robot stories as exploring the logical consequences of the Three Laws.
 
In other news...

Waymo to test its autonomous driving technology in over 10 new cities
After testing the Waymo Driver in multiple cities, the company says the technology is adapting successfully to new environments, leading to the expansion. In addition to ongoing trips to Truckee, Michigan's Upper Peninsula, Upstate New York and Tokyo, the expansion includes testing in San Diego and Las Vegas, with more cities yet to be announced...

"During these trips, we'll send a limited fleet of vehicles to each city, where trained human autonomous specialists will be behind the wheel at all times," a spokeswoman for Waymo said. The testing will begin with manual driving through the densest and most complex parts of each city, including city centers and freeways.

Waymo plans to send less than 10 vehicles to each city, where they will be manually driven around for a couple of months, according to The Verge, which first reported the news.
 
Except I never said it was impossible, a la Lord Kelvin, nor that it had to be 100% risk free. Just that it has to get quite a bit better than it is today.
You said "Will that change in the future? Quite possibly. WHEN that happens, sure, we can go around and fully automate them. Or cars for that matter. But as long as that didn't happen, well, we don't."

That's equivalent to what Lord Kelvin said too, that heavier-than-air flight was 'impossible' with the technology of the day - when it had already been demonstrated. Similarly, autonomous driving has now been demonstrated many times. This makes your "laughed at Bozo the Clown" quote laughable itself.

I Used To Question Tesla FSD's Safety, But After This Latest Report The Numbers Don't Lie
This morning, Tesla announced on its X account: "We hit a new Q4 record for miles driven between accidents in 2024. Teslas using Autopilot technology drove 5.94 million miles vs US average of 0.70 million miles." Omead Afshar... elaborated, stating, "In the 4th quarter, we recorded one crash for every 5.94 million miles driven in which drivers were using Autopilot technology. For drivers who were not using Autopilot technology, we recorded one crash for every 1.08 million miles driven. By comparison, the most recent data available from NHTSA and FHWA (from 2023) shows that in the United States, there was an automobile crash approximately every 702,000 miles."
(Stupid forum software won't embed the graph. Follow the link above to see how much safer FSD is becoming compared to manual driving!)
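Taking the quoted figures at face value (they come straight from the Tesla statement above and haven't been independently verified, and Autopilot miles skew towards easier highway driving), the implied miles-per-crash ratios work out as:

```python
# Miles driven per crash, as quoted in Tesla's Q4 2024 Autopilot report.
autopilot_miles_per_crash = 5.94e6     # Teslas with Autopilot engaged
tesla_manual_miles_per_crash = 1.08e6  # Teslas without Autopilot
us_average_miles_per_crash = 0.702e6   # NHTSA/FHWA 2023 US average

# How many times more miles per crash Autopilot logs than each baseline.
ratio_vs_us = autopilot_miles_per_crash / us_average_miles_per_crash
ratio_vs_manual = autopilot_miles_per_crash / tesla_manual_miles_per_crash

print(f"Autopilot vs US average:    {ratio_vs_us:.1f}x more miles per crash")
print(f"Autopilot vs manual Teslas: {ratio_vs_manual:.1f}x more miles per crash")
```

So the quote's numbers imply roughly 8.5x the US average and about 5.5x Tesla's own manual-driving figure, though the comparison says nothing about the mix of road types in each sample.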
 
That got me wondering how FSD would cope with single-track roads with passing places, particularly when an oncoming driver fails to use a passing place he should have used, leaving two cars facing each other on a stretch of road with no way to pass.

There are a couple of YouTube videos, filmed last February by the account "Just get a Tesla", of him taking a fully manual Tesla Model Y over the Bealach na Bà and back. I doubt the current FSD system is trained on such roads. It would be interesting to see how it would tackle them.
 