[–] Neato@kbin.social 51 points 1 year ago (4 children)

Holy shit. If my car did that even once, I'd be a nervous wreck just thinking about using it again.

[–] Wrench@lemmy.world 25 points 1 year ago

I give Teslas more room because I've been brake-checked by them on empty roads before. These phantom-braking problems are prevalent.

[–] snooggums@kbin.social 16 points 1 year ago (1 children)

I've had the adaptive cruise control brake on multiple Hondas and Subarus in similar situations. Not slamming on the brakes, but braking firmly enough to confuse the hell out of me.

It was confusing every time, and now I just don't use it unless the road is open and clear.

[–] buran@lemmy.world 18 points 1 year ago* (last edited 1 year ago)

Honda’s sensing system will read shadows from bridges as obstructions in the road that it needs to brake for. It’s easy enough to accelerate out of the slowdown, but I was surprised to find that there is apparently no radar check to see if the obstruction is real.

My current vehicle doesn’t have that issue, so either the programming has been improved or the vendor for the sensing systems is a different one (different vehicle make, so it’s entirely possible).
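
For what it's worth, the failure mode described above (vision flags a "shadow obstacle" and nothing sanity-checks it against radar) is easy to sketch. The following is a minimal, hypothetical fusion gate, not any vendor's actual code; every name and threshold is invented:

```python
# Hypothetical cross-check: brake for a vision detection only if radar
# also returns an echo at roughly that range, or vision is near-certain.
# A flat shadow produces no radar echo, so it gets gated out.

from dataclasses import dataclass

@dataclass
class Detection:
    range_m: float            # estimated distance to the obstruction
    camera_confidence: float  # 0.0-1.0 vision classifier score
    radar_echo: bool          # did radar see an object near range_m?

def should_brake(det: Detection) -> bool:
    if det.radar_echo:
        return True  # solid object: both sensors agree
    # Vision alone must clear a much higher bar, so high-contrast
    # bridge shadows are less likely to trigger a phantom stop.
    return det.camera_confidence > 0.95

bridge_shadow = Detection(range_m=40.0, camera_confidence=0.7, radar_echo=False)
print(should_brake(bridge_shadow))  # False -> no phantom braking
```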

[–] KpntAutismus@lemmy.world 8 points 1 year ago* (last edited 1 year ago) (2 children)

i barely trust the lane-keeping assistant in my friend's car. imagine going 70+ km/h and the car suddenly jerking the steering to the left or right because you weren't exactly in the middle of your lane.

fuck modern assistants IMO. i can use the steering wheel just fine, and people have been able to for a hundred years.

[–] pennomi@lemmy.world 21 points 1 year ago (2 children)

Considering that driving is (statistically) the most dangerous thing the average person does, I wouldn't really say that people use the steering wheel just fine.

It’s just that computers are currently worse at it than humans.

[–] KpntAutismus@lemmy.world 11 points 1 year ago (1 children)

agreed. if "autopilot" becomes a better driver than the average person, then it has a right to exist.

[–] PlutoParty@programming.dev 3 points 1 year ago* (last edited 1 year ago) (1 children)

Despite autopilot's flaws, this is already true, if we are speaking statistically.

[–] pennomi@lemmy.world 10 points 1 year ago

I'm not entirely sure I trust the statistics that are available, for a couple of reasons (and feel free to correct me if I'm wrong):

  1. They are self-reported by the manufacturer.
  2. Systems like Autopilot revert to manual control when they detect a situation they can't handle, which gives them the luxury of "not being at fault during the crash" even if they caused the situation 5 seconds before (see the sketch after this list).
  3. They compare against all vehicles instead of just vehicles that have similar non-self-driving but effective safety features.
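
A toy illustration of point 2, with entirely made-up crash records: whether a crash "counts against" the automation depends on the attribution window you pick (NHTSA's Standing General Order, for what it's worth, counts crashes where the system was engaged within 30 seconds of impact).

```python
# Toy attribution model. All crash records are invented. "disengaged_s"
# is how many seconds before impact the system handed control back to
# the human (None = automation never engaged).

CRASHES = [
    {"id": 1, "disengaged_s": 2.0},   # handed back right before impact
    {"id": 2, "disengaged_s": 12.0},
    {"id": 3, "disengaged_s": None},  # automation never engaged
]

def attributed_to_automation(crash, window_s):
    """Count a crash against the system if it was engaged within
    window_s seconds of impact."""
    t = crash["disengaged_s"]
    return t is not None and t <= window_s

for window in (0, 5, 30):
    n = sum(attributed_to_automation(c, window) for c in CRASHES)
    print(f"window={window:>2}s -> {n}/{len(CRASHES)} crashes attributed")

# window= 0s -> 0/3, window= 5s -> 1/3, window=30s -> 2/3.
# The safety statistic moves with the bookkeeping rule, not the driving.
```
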
[–] FaceDeer@kbin.social 2 points 1 year ago

I wouldn't even say that without seeing statistics to back it up. The news doesn't cover routine traffic accidents, but one Tesla screws up once and the story is front-page news. Don't rely on anecdotes and emotions.

[–] merc@sh.itjust.works 2 points 1 year ago

> i can use the steering wheel just fine, and people have been able to for a hundred years.

People have been bad at it for a hundred years. I'm not saying that people should necessarily be using auto-steering that keeps them in the middle of their lanes, but they should at least be using systems that beep at them when they stray out of their lane.

The bar for self-driving technology isn't some amazing perfect computer that never makes a mistake. It's the average driver. The average driver is bad.

[–] burliman@lemm.ee 0 points 1 year ago (2 children)

That's the bar automated driving has to clear. It messes up once, you never trust it again, and the news spins the failure far and wide.

Your uncle doing the same thing just triggers you to yell at him; the guy behind flips him off, he apologizes, you're nervous for a while, and you continue your road trip. Even if he killed someone, we would blame that one uncle, or at worst some might blame his whole class of drivers. But we would not say that no human should drive again until the problem is fixed, the way we do with automated cars.

I do get the difference between those, and I do think they should keep making automated drivers better, but we can at least agree on the premise: automated cars are held to a seriously unreasonable bar. Maybe that's fair, and we will never accept anything less than perfect, but then we may never have automated cars. And as someone who drives alongside humans every day, that makes me very sad.

[–] Neato@kbin.social 4 points 1 year ago (1 children)

The difference is that Tesla calls it "Autopilot" when it really isn't one. It's also clearly not ready for prime time. And auto regulators have pretty strict requirements for reliability and safety.

While it's true that autonomous cars kill FAR fewer people than human drivers, every human is different. If an autonomous driver is subpar and that AI is rolled out to millions of cars, we've vastly lowered the safety of all of those cars at once. We need autonomous cars to be better than the best driver because, frankly, humans are shit drivers.

I'm 100% for autonomous cars taking over entirely. But Tesla isn't really trying to get there. They're trying to sell cars, and lying about those cars' capabilities. Because of that, Tesla should be liable for the deaths. We already hold them partially liable: this case caused a recall of the feature.
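
A back-of-envelope version of the fleet-scale point above; every number here is invented for illustration:

```python
# Why "slightly subpar AI, copied into millions of cars" is worse than
# one bad driver: small per-mile differences multiply across a fleet.

FLEET_SIZE = 3_000_000            # hypothetical cars running the same AI
MILES_PER_CAR_PER_YEAR = 12_000   # hypothetical annual mileage
HUMAN_FATALS_PER_100M_MI = 1.3    # invented average-human rate
AI_FATALS_PER_100M_MI = 1.5       # invented slightly-worse AI rate

fleet_100m_miles = FLEET_SIZE * MILES_PER_CAR_PER_YEAR / 100_000_000
extra = (AI_FATALS_PER_100M_MI - HUMAN_FATALS_PER_100M_MI) * fleet_100m_miles
print(f"~{extra:.0f} extra deaths per year")  # ~72 with these numbers

# One mediocre driver is a local problem; one mediocre driving model
# deployed fleet-wide is a population-scale one. Hence "better than the
# best driver", not just "better than average".
```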

[–] Staiden@lemmy.dbzer0.com 4 points 1 year ago* (last edited 1 year ago)

But the vaporware salesman said fully automatic driving was 1 year away! In 2018, 2019, 2020, 2021... he should be held responsible. The guy once said that to further technology, some people will die, and that's just the price we pay. It was a comment about going to Mars, but we should take it into account for everything he does. If I owned a business and one of my workers died or killed someone because of gross negligence, I'd be held responsible. Why does he get away with it?

Except Tesla's uncle has brain damage and doesn't really learn from the situation, so he'll do it again, and he has clones of himself driving thousands of other cars.