Ask Lemmy
A Fediverse community for open-ended, thought-provoking questions
Rules:
1) Be nice and have fun
Doxxing, trolling, sealioning, racism, toxicity, and dog-whistling are not welcome in AskLemmy. Remember what your mother said: if you can't say something nice, don't say anything at all. In addition, the site-wide Lemmy.world terms of service also apply here. Please familiarize yourself with them.
2) All posts must end with a '?'
This is sort of like Jeopardy. Please phrase all post titles in the form of a proper question ending with a '?'.
3) No spam
Please do not flood the community with nonsense. Actual suspected spammers will be banned on sight. No astroturfing.
4) NSFW is okay, within reason
Just remember to tag posts with either a content warning or a [NSFW] tag. Overtly sexual posts are not allowed, please direct them to either !asklemmyafterdark@lemmy.world or !asklemmynsfw@lemmynsfw.com.
NSFW comments should be restricted to posts tagged [NSFW].
5) This is not a support community.
It is not a place for 'how do I?' type questions.
If you have any questions regarding the site itself or would like to report a community, please direct them to Lemmy.world Support or email info@lemmy.world. For other questions check our partnered communities list, or use the search function.
6) No US Politics.
Please don't post about current US Politics. If you need to do this, try !politicaldiscussion@lemmy.world or !askusa@discuss.online
Reminder: The terms of service apply here too.
Logo design credit goes to: tubbadu
These autonomous vehicle trolley problems go just as deep as you want them to - and there are no real right or wrong answers for any of them.
What if the car is faced with a situation where it can either hit a pedestrian to save the passenger or drive over a cliff and save the pedestrian by sacrificing the passenger?
What if a collision is unavoidable, but it has the option to choose between hitting a child or hitting a granny? (The toy sketch after this comment makes concrete what such a choice looks like in software.)
It's only a matter of time until we have self-driving vehicles that are far safer drivers than humans - but they still won't be flawless, and accidents will keep happening. Can we live with there being no one to blame for it? Or do we just go back to human drivers with higher accident rates - at least then we have someone to point our fingers at?
OP is asking about an accident and responsibility, not a trolley problem.
You pretend AVs can't make mistakes, only face difficult choices. That's false.
You also pretend AVs are safer than human drivers. Nobody knows if that's true, so let's not pretend it is.
And even if they're safer than human drivers, someone (other than the victim, ideally) still needs to be responsible when they inevitably hit someone.
All this is pretty much true... but a company still did QA on the code/hardware combo. The cost of the deaths they caused should be part of the cost they pay. It's part of their negative externalities, even if those are lower than the human driver's (who ideally is charged with theirs).
I'm not claiming the legal system says this, or that it's likely to happen. Just that the logic doesn't seem complicated or ambiguous.
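As a back-of-the-envelope version of that argument (every number below is invented; only the shape of the calculation matters):

```python
# Invented toy numbers: the point is that expected liability per mile
# is a computable line item, not an imponderable.
av_fatalities_per_100m_miles = 0.8      # assumed AV fatality rate
human_fatalities_per_100m_miles = 1.3   # assumed human-driver rate
liability_per_fatality = 10_000_000     # assumed payout, in dollars

av_cost = av_fatalities_per_100m_miles / 100e6 * liability_per_fatality
human_cost = human_fatalities_per_100m_miles / 100e6 * liability_per_fatality

print(f"AV expected liability:    ${av_cost:.2f} per mile")
print(f"human expected liability: ${human_cost:.2f} per mile")
```

Charging the manufacturer those eight cents a mile is what internalizing the externality means; being cheaper than the human baseline doesn't zero the bill.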
We punish people for speeding, driving under the influence, or texting while driving because those behaviors are reckless and we want to deter them. That makes sense.
But what about a freak accident where the driver did nothing wrong? Should they still be punished just because it happened to be them in the wrong place at the wrong time? In my opinion, no. If they didn't do anything reckless or negligent, there's no reason to think punishment would teach them anything useful. At that point it just feels like we're satisfying our need for vengeance rather than serving any logical purpose.
With a self-driving car, every accident would basically fall into that "freak accident" category. The car wasn't distracted, drunk, or driving recklessly. Maybe you could argue the company should pay compensation to the victim's family or at least cover medical costs if the person survived as a gesture of good will - but I don't see how the company would be morally responsible in a way that justifies fining or punishing them.
Just thinking aloud here. I don't know what the actual answer is.
Accidents with driverless cars simply don't fall into the freak category like you claim. Thousands of hours of decisions led to that point, and they were all made by the manufacturer (or by the makers of the software and hardware plus whoever did final QA; for now that's the same company, but financial penalties aren't hard to split).
The legal system is there so that someone pays for the fact that a person who ideally would still be alive is not. It's not complicated to reason about what caused them to no longer be alive.
Again, the legal system may not come close to agreeing, and society may never either. Kind of like how I find it hard to imagine someone being fined for stealing candy from a baby, even though it's obvious there was harm and who caused it.
Actually... simple point.
You take a corner, there is a defect in your tires so you can't turn well, you hit someone, and the investigation shows the defect in the tires. Who pays / is to blame?
I'm not saying the trolley-problem-style arguments aren't true for driverless cars; society will need to adapt. I just think having the companies pay still gets us to safer roads, but with accountability and without society hiding the costs these companies impose.
We already have self-driving cars that are way safer than humans (Waymo).
But that bar is so low the devil would have to start digging to go under it.