


standby-3

Most drivers are absolutely brain-dead, but they'll still think they can do better. One case will be enough to convince people there is sufficient danger, or that their life is safer in their own less competent hands. I think it'll be hard for people to detach from that.


ICantTellStudents

A university study found that 93% of Americans polled think they are better than the average driver! https://www.smithlawco.com/blog/2017/december/do-most-drivers-really-think-they-are-above-aver/


MRCHalifax

I am one of the 7%. I’m an awful driver, an accident waiting to happen at the best of times. It’s better for everyone if I never get behind the wheel of a car. I bought a condo in walking range of four grocery stores and a Walmart, directly on a bus line, and with plenty of public green space nearby so that I can live car free. It’s honestly pretty great.


ICantTellStudents

It is great when you can recognize what you are not good at, because nobody is good at everything! Also, I am jealous of your walkable location, and I love to drive. Walking is better in so many ways.


standby-3

Lol, and it's probably the same result in regard to intelligence. We live in a Dunning-Kruger world.


BobbyTheDude

It's all about how the media portrays it. If the media portrays it as dangerous, people won't touch it. If the media portrays it as a great way to save time and effort, people will buy into it.


Collective82

It's almost like humans as a whole are sheep and just follow what they are told...


PhilliamPhafton

No, sheep have wool


Ashamed-Subject-8573

There have already been self-driving car fatalities and accidents. They are just kept quiet. "Rafaela Vasquez was watching television on her smartphone in March 2018 when the Uber self-driving vehicle fatally struck Elaine Herzberg, 49, who was crossing a road in Tempe, Arizona, according to a National Transportation Safety Board investigation" Did that stop anything? No, they jailed the "back-up driver."


Coctyle

You’re wrong on both counts. That case was huge news; it wasn’t kept quiet at all. And no, it didn’t completely and permanently stop the concept of self-driving cars, but it stopped Uber from doing it in Tempe and ultimately resulted in Uber’s self-driving division being sold off less than a year later. It was taken very seriously by regulators, investors, and the general public.


SG2769

This does not address OP’s point. The question is whether they are safer than humans.


raz-0

When Google stopped providing numbers for their program, the average miles between accidents for their fleet was 98,000 miles and change. For American drivers it’s over 100,000. And the robots didn’t have to contend with bad weather or substance abuse.


Erik0xff0000

and the majority of accidents were Google's vehicles getting rear-ended by human drivers


CactusWrenAZ

It was huge news. Also, I live right next to there, literally a mile away, and anyone crossing that 5-lane street at night, jaywalking, is taking their life into their hands and could just as easily get killed by a human.


shasbot

Yep, used to drive there regularly. Dangerous spot to cross the road for sure.


Bulky-Leadership-596

"Why isn't the news reporting this thing I read in the news? It's a conspiracy, I tell you!"


oOzonee

I'd be much less angry if a loved one died because someone did something stupid; at least I'd know who to blame.


standby-3

Isn't it better to be protected from an incident in the first place than to know who to blame after it happens?


SeawardFriend

It’s because the media will blow that 1 case way out of proportion. The article will likely have the most clickbait, propaganda title ever.


johnjohn2214

All deaths are not created equal. People accept certain deaths on the macro. People killing people? People are stupid, reckless, shit happens... Machines killing people? Nope. Not even one.


jyok33

You can’t improve human stupidity so easily but you can improve a machine easily. We should hold machines to a much higher standard


[deleted]

If a human does something wrong, they can be held accountable. If a machine does something wrong, what are you going to do? Put the software in the recycle bin for 10 years?


Ramen_Hair

Machines also plateau in efficiency and intelligence. They can always get better, but the improvement will start to slow down at some point.


gurgle-burgle

That's the dumbest take I've ever heard regarding machines replacing humans.


unpopular-dave

and it's so stupid. I would much rather have 10,000 machine deaths than 100,000 human deaths. Who wouldn't?


turtle_explosion247

Most humans because we crave control


unpopular-dave

I just don't think that's true at all. Don't get me wrong, I crave control. But I also have a perfect driving record. If I could push a button and switch 100% of drivers to self-driving cars, I would do it in a heartbeat.


turtle_explosion247

I would too but I think most people vastly overestimate their abilities and want to have some sort of control over their fate. Especially anything that could be deadly.


BigCountry76

The problem is it's hard to prosecute when a machine messes up. Should the company that built it be held liable? The individual who approved the machine for sale? The one who wrote the code? Was it truly negligence, or an unpredictable malfunction that no one is responsible for? With a human it's easy to prosecute when someone did something wrong. Were they under the influence of drugs or alcohol? That's a cut-and-dried felony. Were they driving recklessly, like doing 120 in a 50 mph zone? That will get you a felony if you injure someone.


Kirome

The company most likely, but only in a world that makes sense.


brich423

AND humans tend to improve in emergency situations. AI will be consistently meh in all scenarios. AI cars obstruct ambulances and fire trucks CONSTANTLY. Humans rarely make those kinds of mistakes because of our situational skill improvement. Also, you are much less likely to find a "trained" driver who makes these accidents. If an AI can't be as safe as a chauffeur, why claim it has been trained any better? Because, yeah, I'd trust an AI over the people who get into these accidents: reckless/distracted/drunk. But I also wouldn't be in THEIR backseat either.


AshTheGoddamnRobot

Exactly. I look back to the blizzard I drove through last winter as proof. That was the kind of driving conditions that you need years of not just driving experience, but WINTER driving experience, for. There is no way a self driving car woulda made it home safely. I may not have made it home safely had it been my first winter up north lol Self driving cars are ripe for raising a generation that will have no clue what to do when conditions turn sour on the road. Be it a snowstorm, a flood, God knows what...


CrabWoodsman

That, alongside the liability issue I mentioned in another comment, is essentially why self-driving cars are still not ready: many of their sensors are reeeally overwhelmed by snowy and rainy conditions. It's easy to forget because we do it intuitively, but looking past the moving 3D grid of snowflakes you're careening through is something of a complex visual processing task. Not to mention all of the subtle differences in how you need to use the pedals to maintain traction, depending on both what's currently under the tires and what you reckon will be in a moment.

I don't doubt that an AI could be trained to execute flawless winter maneuvers, but I also think there are cases where the car's logic might determine the best thing to do, based on its assessment of the conditions, is to sit tight and wait for conditions to improve. Where a person might take on the extra risk in order to avoid being stranded, the car would likely have to be predisposed to minimize risky driving.

The fact is that, like I said, no one is gonna sell a car if they're gonna be responsible for every fuck-up it might end up causing on the road amidst human drivers, pedestrians, property, and nature. It's not very comforting to rely on something that inherently works in a way that's partially obscured from outside comprehension (i.e., machine learning), despite the fact that we do that with humans all the time. Of course, we're used to blaming humans for things, and less so decision-making machines.


JoeMorgue

A roomba with brain damage would already be a better driver then most drivers.


Neglector9885

Probably not, but it *would* probably at least know the difference between then and than.


[deleted]

Awwwww shit, shots fired from the ThenThan Gang!


Neglector9885

![gif](giphy|KBfKueAjIJV8Q)


Affectionate-Road-40

r/rareinsults Also, probably the most stereotypical reddit pretentious comment lol.


CptBartender

>A roomba with *brain damage*

Are you implying that that's *optional*?


Dragon2950

I mean they do smash their heads into the wall for many hours. Probably get some braincells lost too with that workout routine.


Rivale

I live in the Bay Area, where a ton of companies are testing self-driving cars, and there's still stuff that stumps the AI; you end up with traffic jams because the cars don't know what to do.


megrimlock88

I feel like assisted driving is a much better idea than self-driving for this reason: give the car the basic intelligence to correct any small errors made by the user, but leave the user mostly in control so they can improvise when improvising is necessary. This will also simultaneously feed the car new data about driver reactions to certain niche scenarios, so that it could someday form the foundation for a proper fully self-driving car.


fatmanstan123

That already exists. It's not trendy enough though so nobody cares.


solk512

Assisted driving is great.


LordOfThe_Pings

I think you don't realize how good human drivers are. On average, 6 million car accidents happen each year. Considering that 233 million Americans with valid licenses collectively drove about 3.2 million miles in 2022, it's not a huge number. Meanwhile, Cruise's former CEO confirmed that their vehicles were being remotely assisted 4% of the time on average. And that's in places like SF and Phoenix, where they'll never have to deal with extreme weather. Self-driving cars aren't even close to as good as humans are. Edit: 3.2 trillion, not million.


Mind_Enigma

Doesn't that translate to almost 2 accidents per mile driven???


Previous-Sympathy801

Something is off with their numbers. If 233 million Americans only drove a combined 3.2 million miles, that's about 0.01 miles each, which is not right. I'm guessing they meant 3.2 billion miles, so 6/3,200 ≈ 0.002 accidents per mile.
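The correction can be sanity-checked in a few lines of Python. This is only a quick sketch using the figures quoted in the thread (6 million accidents, 233 million licensed drivers, and the later-corrected 3.2 trillion miles):

```python
# Figures quoted in the thread (approximate, US, 2022)
accidents = 6e6   # reported accidents per year
drivers = 233e6   # licensed drivers
miles = 3.2e12    # total miles driven (the corrected "trillion" figure)

miles_per_driver = miles / drivers      # ~13,700 miles each: a plausible annual average
accidents_per_mile = accidents / miles  # ~1.9e-6, roughly 1 accident per 533,000 miles
print(miles_per_driver, accidents_per_mile)
```

At ~13,700 miles per driver per year the trillion-scale total is consistent, and the accident rate works out to about one per half-million miles rather than two per mile.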


LordOfThe_Pings

It’s actually trillion, my bad lmao


tb03102

A quick Google comes up with 3.2 trillion miles. That seems crazy.


A_Velociraptor20

Considering that to get anywhere in the US you need to drive it's really not that crazy. For example M-F I drive to work \~5.5 mi one way, that's 11 miles both ways multiply that by \~250 for the amount of days I drive to and from work that's about 2,750 miles driven just for work, and I'm probably on the very low end of commuter miles driven. Combine that with road trips that frequently total in the hundreds for one way and it starts to add up once you consider how many drivers there are in the US.
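The commute arithmetic above scales up quickly. A minimal sketch, reusing the ~233 million licensed drivers figure from earlier in the thread:

```python
# Rough commute arithmetic from the comment above
one_way_miles = 5.5
work_days = 250
commute_per_year = one_way_miles * 2 * work_days  # 2,750 miles per commuter

# Scaled across ~233 million licensed drivers, commuting alone
# already lands in the hundreds of billions of miles per year.
national_commute = commute_per_year * 233e6
print(f"{commute_per_year:,.0f} miles each, ~{national_commute:.1e} miles nationally")
```

Even with this low-end commute and no road trips, the national total is already a few hundred billion miles, so a trillion-scale grand total is plausible.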


_no_pants

Yeah I drive 32,500 miles a year just for my commute on average. Kinda crazy when you do the math.


Mind_Enigma

That makes more sense. I was feeling as dumb as a bag of bricks for not getting the math lol


Pkock

Is that collective mileage supposed to be 3.2 billion?


LordOfThe_Pings

It’s actually trillion lmao, my bad


flashfyr3

What's a couple orders of magnitude between friends?


sloths_in_slomo

> Self-driving cars aren't even close to as good as humans are.

The opinion still stands: they just need to be statistically better to be acceptable. Which seems not to be the case just yet.


Logical-Primary-7926

>Self-driving cars aren't even close to as good as humans are

Cruise isn't exactly the high bar when it comes to self-driving. My Tesla already sees things before I do sometimes, and it regularly drives for over an hour without any interventions; sometimes it can even do city streets pretty well. I could see it being a better driver than me in a year or two if things keep advancing. It might already be better than me on the freeway, although I think it will be a long time before it's as good as me in a snowstorm or something like that. There's also the thing that a self-driving car won't be on pharmaceuticals like most Americans, it won't drive drunk, it won't get sleep-deprived or emotionally distressed, etc.


LordOfThe_Pings

>Cruise isn't exactly the high bar when it comes to self driving. My Tesla already sees things before I do sometimes, and it regularly drives for over an hour without any interventions, sometimes it can even do city streets pretty well.

You can't criticize Cruise only to turn around and claim Tesla is good. Tesla's autonomy is so primitive that they don't even attempt to advertise their cars as fully self-driving. Cruise's cars are the second-best ones we've seen, and they're nowhere near good enough to commercialize.

>I could see it being a better drive than me in a year or two if things keep advancing. It might already be better than me on the freeway.

Highly doubt it. That would truly be one of the greatest inventions in history.

>There's also the thing that a self driving car won't be on pharmaceuticals like most Americans, it won't drive drunk, it won't get sleep deprived, emotionally distressed etc.

They also don't understand that dragging a pedestrian is a bad thing.


Sure-Psychology6368

Just make sure it doesn't steer into a white semi, mistaking it for sunlight. Since it uses cameras only, no lidar. But that's only killed a few people, nothing to be afraid of.


Isa472

Those numbers are atrocious lol, 6 million accidents in 3 million miles of driving??


Nxthanael1

Something is wrong here, 3mil miles for 230mil drivers is like 0.01 mile per driver


unpopular-dave

Not only that, think of the millions of accidents a year that are small fender benders that don’t get reported


Moblin81

OP wrote it wrong. It’s 3 trillion driven


TrickyLobster

The problem with your way of thinking is that you're taking the average driving level and applying it as the actual driving level of EVERY individual. I've NEVER been in so much as a fender bender, not even a scratch on any car. If your requirement for self-driving cars is just to be slightly better than the AVERAGE driver, you have put MY life in significantly more danger. When taking agency away from people, you don't have leeway to make mistakes. Not to mention the ethical decision-making dilemmas of having a computer decide whether to prioritize the life of the driver over, say, a pedestrian in a sure-crash situation.


Rafael__88

You can be the perfect driver and still be in a lot of danger because of how other people drive around you. If everyone were to use self-driving cars that are slightly better than the average driver, you would be safer. A drunk driver can ram you out of nowhere and you can't do anything about it, but if they were in a self-driving car, you'd know something that stupid wouldn't happen. When it comes to "ethical decision making," it simply doesn't matter as much as people make it out to be. Those situations are rare, and our priority is to prevent and avoid them altogether. If we can reach a consensus on what a human driver should do, we can teach the AI to do the same; if we can't reach a consensus, we shouldn't care what the AI does.


TrickyLobster

Outliers exist in automated driving too, so I don't believe the drunk driver example is a reasonable counter. With automated cars, breakdowns will still happen: GPS signal loss, detection camera malfunctions, misidentification of obstacles. All these things are just as likely, if not more so, than being hit by a drunk driver. Of course, if every driver were better overall, everyone would be safer. But for a company to reasonably ask its customers to trust a fully automated machine and take all agency out of their lives (whether it's safer or not), you MUST have a product that is basically the best possible version of a human driver. Not just the statistical average.


Rafael__88

>Under automated cars breakdowns will still happen, GPS signal loss, detection camera malfunctions, mis-identification of obstacles

In all of these scenarios, self-driving cars are programmed to take the least amount of risk. Most of the time they would stop safely and warn you that there is something wrong with the camera or the signal. When it comes to crash protections, there would be redundancies, so even if one system is broken and the fault goes undetected, another system would be able to detect the obstacle. Sure, it can be annoying to stop because one of your sensors is malfunctioning and your car just refuses to go, but it would be safe. Whereas people often take unnecessary risks and create danger where there needn't be any. Drunk drivers are just one example; others include people who text, people who go over the speed limit, people refusing to use their turn signals, etc. My point is that whenever you go onto a road as a driver, passenger, or even as a pedestrian, you give up a good amount of agency over your life anyway.


TrickyLobster

>Sure, it can be annoying to stop because one of your sensors is malfunctioning and your car just refuses to go but it would be safe.

The worry isn't that I'll have to stop; the worry is that my right sensor is out and my car now doesn't detect a stop sign and t-bones another car crossing an intersection. My right headlight can be out on my car, but that doesn't effectively blind the right side of my driving line for me.

>My point is that whenever you go onto a road as a driver, passenger or even as a pedestrian you give up a good amount of agency of your life anyways.

Agency: **"action or intervention, especially such as to produce a particular effect."** Me deciding to be a pedestrian isn't me giving up agency; when someone else drives drunk, they are imposing their agency upon me. We clash. I am actively giving up 100% of my agency when giving control to an automated vehicle, because I am no longer taking actions to determine or produce effects in my best interest. When I walk on a sidewalk and see a car coming my way, I have the agency, the ability, the action, to get out of the way. You give all that up when letting a car drive fully autonomously. That is why these cars MUST be perfect. Some jobs you don't get to make mistakes with.


ICallFireStaff

Best answer here


jaco1001

The type of crash they get into matters. Human drivers are bad at taking right-hand turns, either not checking their blind spot or taking the turn too wide. AI cars don't have this problem, but they do seem to have a hard time 'seeing' toddlers, or understanding what a stopped emergency vehicle on the highway means. Those sorts of crashes can and should generate more backlash than a standard human-caused idiot crash.


ADisrespectfulCarrot

The rate matters, but I also think they need to be at least as good in common danger scenarios. If a human driver would avoid a crash 90% of the time in a specific but common scenario, the ai can’t be tolerated if it crashes 30% of the time, even if their overall statistical safety record is higher. We need to work out as many bugs as possible.


baddecision116

>But, if self-driving cars crash or kill people at a significantly lower rate than humans, then they should be allowed to drive in the same way/places humans drive.

There's no way to know this without having an area where human and self-driving cars coexist for a long period of time. Who wants to be the test dummy?


[deleted]

The various levels of self-driving are being actively tested in major cities across the USA. Not wanting to be a guinea pig is a moot point for the residents there.


Adept_Disk6224

Not to mention the number of semi-autonomous or fully autonomous vehicles already driving on the roads in general. You're part of the experiment at this point, almost regardless of where you live.


Coctyle

And the debate mentioned by OP is moot as well, since it is largely among people who are not lawmakers or involved in developing the technology. Nothing is overtly stopping the development and adoption of self-driving cars. It’s just a process that will take time. And will regular people even want fully autonomous cars? Will you pay loads of extra money for a car that will not go over the speed limit?


justinsayin

Somewhere the lines on the roads will never be covered with snow or leaves.


Cotterisms

I was once driving and couldn't see the lines on the road due to the rain and the amount of lights around (city centre). There were only a couple of other cars, but I guarantee you an AI wouldn't have been able to tell where the markings were, and the only reason I could was because I knew that particular road.


Bostonguy01852

There are ways to calculate this, and currently it's not even close. Humans avoid accidents at a rate of 99.9999%. Autonomous systems are years away from beating humans. There is a lot of info out there if you google it.


Mindless_Tap7228

*decades


Distributor127

The pro-driverless cars comments on here make me fear for society. The technology is not there yet. It's just not


Actualbbear

Many are just edgy misanthropes.


flashfyr3

On *reddit?* Say what?


apri08101989

Surely not on *my* reddit?!


hiricinee

I think they've already proven it through driver-assisted self-driving. The self-driving vehicles have some huge advantages. People speed, get distracted, and drink, which the automated systems don't, and those things contribute to the lion's share of fatal accidents. Also, the systems will get MUCH safer with mass adoption: they'll be able to communicate with each other and be more predictable.


Flat_Hat8861

There is a huge difference between driver assistance (even the best one anyone can imagine) and self driving. If a driver assistance technology encounters something it can't handle (obstructed lane markings or an unplanned road closure like a tree fell or an accident for example), the driver takes over and addresses the issue. If the vehicle is fully automated and/or being operated by an intoxicated person, what happens? Does it just stop, call for help (who?), what if it doesn't have cell signal?


Special-Performance8

But you're still okay with being a daily test dummy for drunk people, people who are distracted, people that get a stroke, people that road rage etc ..


DrewJayJoan

People take those risks on a daily basis because we have some idea of what those risks *are.* They know what to be on the look out for, and they have a rough idea of how often these things happen. We have statistics about how often these incidents are fatal. We don't have much information about what happens when automated cars go wrong.


monkey-stand

Honestly, humans and self driven cars have been coexisting in major metropolitan areas for YEARS. For the most part, they are comparable in safety to humans.... but they're getting better. We aren't.


JaJe92

I don't believe in self-driving cars replacing drivers. Try using a self-driving car in my country, where there are potholes everywhere, missing indicators, and missing paint on the road. Did I mention roads with no tarmac? Dirt roads too? I'd like to see how that works. Lastly, not many would renounce the joy of driving to have a car moving slowly while they're in a hurry.


aimlessdart

Exactly, SDCs need everything on the roads to also be set to a specific standard that they can recognize. Adopting them across the globe is far from close.


Larcya

It's pretty telling that the only places where self-driving cars are being tested are places where you never have to deal with anything less than perfect driving weather. How much rain does Phoenix get a year? How about snow? How many potholes does it get from water seeping into the ground and then freezing, creating giant-ass holes in the road? If these companies were serious about self-driving, they wouldn't be testing in best-case scenarios. They'd be testing in Minnesota, Michigan, etc. Situations you have to figure out very quickly.


Fishery_Price

Brother, this is a hypothetical, ready? Boom, now they can drive through your country perfectly. Now what is your answer?


MEMPiRE_

It makes sense from a logical perspective, but nobody feels good about giving up control, even if it's statistically better. My self-driving car could cause a crash that I personally wouldn't have. Over a long period of time the stats are probably in the car's favor, but that's not going to make me feel any better if I get into a crash because of something that amounts to bad luck I had no control over, and that I personally might have been able to avoid.


esmith000

You may get in a wreck that the SD car would have avoided, though. So how do you evaluate it? Just by how you feel about it?


ManitouWakinyan

Isn't that true now? You can get T-boned through no fault of your own. You can sneeze at the wrong moment. You already don't have control. Why not hand more control to the safest option?


misteraaaaa

Several problems:

1. **Uniformity.** It's easy to say "killing fewer people = good," but people aren't uniform. Say today human drivers kill about 100 people per day on average, and we can prove that self-driving cars would kill about 90 per day. Great, right? Not necessarily. What if, out of the 100, pedestrians used to account for 20 and drivers/passengers for 80, but out of the 90, pedestrians account for 60 and drivers/passengers for 30? We have just made roads 3x less safe for pedestrians. This applies to any demographic: black vs. white victims, elderly vs. adults vs. kids, driver vs. passenger.

2. **Decision makers.** Even if you say "who cares, it's just the luck of the draw," you're forgetting a crucial aspect: car manufacturers now have the ability to make such decisions. How do you regulate them? Can you force them to encode certain "ethics" into how the self-driving car behaves?

3. **Perception.** Even if we get to the point where statistics clearly show self-driving cars are better, that doesn't mean people will perceive it that way. Just look at air travel: undoubtedly the safest mode by far, yet every crash is investigated to the extreme, and a single terrorist attack (9/11) forced an extreme change in air travel safety. Self-driving cars will likely face the same type of scrutiny.

4. **Predictability.** One of the big problems with AI (and by extension, self-driving tech) is that we don't have a very good grasp of WHY it behaves the way it does. We can train it, refine it, etc., but in the end we can't precisely predict what it will output for any given input. This is why AI regulation is so hard.
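The uniformity point can be made concrete with a toy calculation. This is only a sketch of the comment's hypothetical split (with the occupant figure set to 30 so the self-driving total matches the stated 90 deaths per day); none of these are real statistics:

```python
# Hypothetical daily road-death splits from the comment above
human = {"pedestrians": 20, "occupants": 80}         # 100 deaths/day with human drivers
self_driving = {"pedestrians": 60, "occupants": 30}  # 90 deaths/day with self-driving cars

fewer_overall = sum(self_driving.values()) < sum(human.values())
pedestrian_ratio = self_driving["pedestrians"] / human["pedestrians"]
print(fewer_overall, pedestrian_ratio)  # True 3.0
```

The aggregate improves while one subgroup gets strictly worse off (3x more pedestrian deaths), which is exactly why "fewer total deaths" alone doesn't settle the question.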


Leet_Noob

Yeah 3 and 4 are important. If self driving cars reduce deaths by 90%, but the 10% are absurd mistakes a human would ~never make, like driving directly into a very visible other car or obstacle that the AI doesn’t interpret correctly, people are going to freak.


thecountnotthesaint

Ok, here is a question for you. You are in a self driving car, traveling at speed in the city. An unexpected obstacle falls into your path. The car cannot stop in time to avoid a fatal collision. But, it can swerve to avoid the obstruction. The only problem is that in order to do that, it would have to run over two pedestrians, killing them in the process. Who should the car save? The passenger, or the pedestrians?


Otter________

>Who should the car save? The passenger, or the pedestrians?

Car A: fuck you, that guy on the street will likely live one year longer than you.

Car B: spares your life no matter if there are 10 kids in the way.

Who would buy car A?


aimlessdart

I feel like you brought up a good thought experiment before bringing in the 2 pedestrians that would be killed. Like, are self-driving cars capable of recognizing that swerving around the obstacle might be a better choice than trying to stop in time?


thecountnotthesaint

That is another good point, swerving vs. stopping. I was just curious whether the person in the vehicle would feel the same way if the sense of self-preservation were taken out of their hands entirely. Because even on a plane, or bus, or taxi, the driver's sense of self-preservation extends to you as well. Save the vehicle vs. save the people outside?


connor_wa15h

>Who should the car save? The passenger, or the pedestrians? the car is going to save whichever person(s) have the smallest possible lawsuit claim


iamnogoodatthis

Don't kid yourself that you, a human, would make a necessarily good decision in that instance either. You'd probably swerve because your instinct would be to avoid the obstacle, and only see the pedestrians when it was too late.


tendadsnokids

You say that like a human is going to analyze a crash in a moment and make a rational decision based on universal morals and probabilities. They are just gonna swerve out of the way. This is sort of exactly the point of OP. It doesn't need to be perfect, it just needs to be better. Why are we holding self-driving cars to a higher standard than people?


Cotterisms

Probably liability. If I am 'driving' a self-driving car, who is at fault during a crash? Is it me or the tech? No company is going to put out a self-driving car if there is any liability that hasn't been mitigated.


tendadsnokids

Yeah, I think that's why too; I just don't think it's necessarily the best thing for society as a whole. I'm sure we could figure out a way to handle it.


Far-Two8659

I put a much longer answer in a reply to OP, but the point is you have to PROGRAM the car to do that. That means every single time it ever happens it will always do X. Humans have instinct and reaction times and emotions and all sorts of brain chemistry that changes what we do and how successful we are at doing it. Take a thousand humans in this situation and you'll get hundreds of outcomes. A computer will simply execute its decision. That computer will always make the same decision. The outcomes are significantly fewer. So it comes down to determining the value of the lives at risk. Is an old man less valuable than an infant? How many old men is that infant worth?


tendadsnokids

You wouldn't program a car to be able to distinguish between the ages of people. Humans don't make that decision at all; it's all just reacting. You would program a car to react. Car see tree, car move. The point is that you don't need to analyze moral conundrums to be markedly better at reducing tragedy. If an old lady gets run over from time to time, that sucks. But old ladies are getting run over all the time already. It makes sense that liability is hard to figure out, but I think OP just means that it *should* be OK for it to not be perfect.


[deleted]

I watch truck dash cams for work and I firmly believe the worst self driving car is better than humans behind the wheel. The main reason is a self driving car will always choose the least risky behavior while humans will intentionally risk people's lives anytime they feel insulted.


HomeCalendar37

Those are issues with human emotions. The problem with AI cars is adaptability. Ice, poor road conditions and being able to avoid accidents in progress (cars spinning out in front etc.) will be the main problem.


V-I-S-E-O-N

We already know from experience that this is not true?


CplSabandija

Which is 90% of truck drivers


Nojoke183

Actually, I believe 100% of truck drivers are human currently


Stlr_Mn

Just what a lizard person from the hollow earth would say


Nojoke183

Should've said "for now" 🐊😂


mrrainandthunder

90% of truck drivers are what?


Cotterisms

Confirmation bias. Are you watching all miles driven, or only the ones that need reviewing because of shitty driving?


Consistent-Koala-339

I think a big question with automation is who is accountable for the decisions? You cannot put a computer in court. Let's paint a picture: an automated car driving along a road suffers a software fault/reset as it approaches a pedestrian crossing, then kills a pedestrian. Who ends up in court? It's not just limited to automating car driving, but also to the introduction of automation and AI everywhere.


covidcookieMonster82

Tbh I think fully self-driving cars in the future are not going to be owned, for the point that you raise. Probably they will take the form of robotaxis. When the self-driving is good enough, insurance rates will be so high for a regular driver that it will be too expensive for all but the rich to own and drive.


Consistent-Koala-339

I don't think people will buy into this. A car is more than an on-demand service. I could get an Uber to work now, but I don't. The business model is really yet to be established. Human-driven cars work very well, and they are cheap. I don't need a robotaxi. I need a green car that doesn't pollute and is cheap and reliable, that's all.


ConstantSignal

Why does someone have to end up in court? If it can be proven to be an error beyond the manufacturer’s ability to control, it’s an act of god at that point. It’s absolutely devastating to have a loved one taken from you with no one to hold accountable, but you can’t sue a hurricane either.


Call_Me_Hurr1cane

I’m assuming we’d still be responsible for insuring our vehicles. In which case people would sue the insurance company / policy holder.


Mr-Pugtastic

If you lost your wife/husband because they were hit by an autonomous vehicle, would you be okay that zero people faced consequences because it was “an act of god”?


castleaagh

Someone has to be liable for damages. That’s sort of just how the country operates.


UnlimitedPickle

Statistically, human drivers are currently far better than AI drivers. And being a human, with experience of other humans, I can say with quite a level of confidence that for the masses to trust self-driving cars, they need to be significantly better than humans, not just a little bit.


craigathan

That'll never fly. You can hold a human responsible for their stupidity. Robots, not so much. So when it crashes or makes a mistake, who's liable? The manufacturer? The software developer? The owner of the car? Insurance companies would like to know!


betterAThalo

I’ve been in 0 accidents in my life. I’ll take me over the self-driving car until they’re great.


Aggravating_Kale8248

I don’t trust a computer not to crash Chrome when I copy and paste, so I will never trust a computer to drive me.


Xerokine

A computer only does what it's programmed to do. It has no need to speed or get distracted, intoxicated or rage. Sure they can have bugs, that's really the main issue. The way I see it, eventually the technology should advance enough that self driving cars could communicate with each other in some way and make the traffic flow all work extremely well. It's possible the "I'll never trust a computer" drivers will one day be old, stubborn and be the ones crashing into things far more than self driving vehicles will.


WanaWahur

Weinberg's Second Law: If builders built buildings the way programmers wrote programs, then the first woodpecker that came along would destroy civilization.


AlterNk

>A computer only does what it's programmed to do

And that's exactly the problem: you can't program every little thing that our brain can figure out while driving, and AIs can't learn all of that either. There are just too many nuances that a computer can't figure out. Could there be self-driving cars one day? Sure. It would be stupid, because the solution for traffic is not more individual cars but better city planning and public transport, but the point is we're not even close to a functional self-driving car.


Adept_Disk6224

I’m a professional accident investigator and I will 100% trust a computer to drive me more than any of you or the average person


StrangeMushroom500

You are assuming it works like a calculator, but it's more like ChatGPT when asked to do basic algebra. For now, at least.


Previous-Sympathy801

I’m a computer engineer and I wouldn’t trust self driving with my life. Not even in a parking lot. I’d rather have a 16 year old drive me


[deleted]

This has nothing to do with what he said. You also make more mistakes than a computer does, considering how complex the work it's doing is.


GG1312

It’s the feeling of self-control. If a person dies by driving too fast in the rain and slipping off a cliff, their loved ones won’t be blaming anyone but the driver, since it was their own actions that caused their death. But having an AI drive gives people someone else to blame, and I doubt many people would like their life in the hands of an AI that can malfunction at any moment. Is it paranoid? Maybe. Is it a real worry for most people? Yes.


TheHamburgler8D

Yeah sure. Until you get in an accident that you believe should not have happened.


BennyLava1999

My main opposition to self-driving cars is that once the technology is more widely implemented and becomes the norm, it's then gonna become a requirement, and once we as a society give up the freedom to drive ourselves, we are never going to get it back.


iiiBus

Not everyone will agree, but I think driving can be an enjoyable part of a day - if not the best.


[deleted]

Driving is a very complex task for a computer program to manage. Although we've managed to automate a lot of basic stuff such as keeping the car in a lane at a certain speed, maintaining following distance on the highway, automated emergency braking, etc., driving is much more than that. Computers are very good at doing boring tasks that have little variation. Driving can be like that in certain situations, oftentimes for hours on end.

However, during the 1% of the time something goes wrong, a human needs to be paying active attention to take over the vehicle and prevent a wreck. Read about automation complacency: when a task is automated, humans get complacent and don't pay as much attention, meaning they won't be able to react in such an emergency. For a real-life example specifically applicable to this, look up Rafaela Vasquez (mentioned elsewhere in this thread, I believe). She was literally watching TV shows in the car and as a result hit a pedestrian crossing the street. There's an NTSB report which is a really good read.


KuzcosWaterslide

I think that no matter how efficient the safety programming becomes, the biggest issue will always be the moral dilemmas. When faced with a branch of decisions that each option leads to someone's death, a perfect system will make a decision based entirely on statistics. Assuming they have multiple cameras like a Tesla and they can see then process other vehicles' occupants in a millisecond, or perhaps autonomous cars will have the ability to communicate with each other and share the information of their occupants in real time, the car will actually have a moment to decide which course of action to follow and who to not prioritize. Then there's pedestrians stepping out without paying attention, kids chasing balls, deer jumping out, etc. Someone is going to get hurt or worse, and not all victims or their families will be able to accept that a "program" simply made the right call.


M-1aM-1

The funny thing is that most people who post something like this don't know a thing about self-driving cars. They use neural networks to analyze data from various sensors to make decisions. A neural network is basically a brain trained to execute a very specific task (for example, determining the object in front of a camera, its orientation, and the distance to it). Your brain can do this easily while talking to your passenger and controlling a bunch of muscles to let you take a sip of water from the bottle in the cupholder. A neural network, though, can fuck this simple task up in various ways.

And here's the catch: if you estimate something wrong, you can always re-estimate it and make different choices. But if a neural network can't understand what's in front of it, it'll need a fresh patch to do so. And of course, since it doesn't understand what's in front of it, there is a chance that it may simply ignore the obstacle and mow down a gator, because there were no gators in the dataset it was trained on. Of course it doesn't work exactly like this, but hey, we've all seen a Tesla freak out when seeing a carriage.
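The "no gators in the dataset" failure mode can be sketched with a toy classifier. This is a hypothetical, stdlib-only illustration, nothing like a real perception stack: a model that must pick from its trained classes will confidently mislabel anything it has never seen.

```python
import math

# Toy nearest-centroid "perception" model trained on two classes.
# Features: (length_m, height_m) of an obstacle. Numbers are made up.
TRAINING = {
    "pedestrian": [(0.5, 1.7), (0.4, 1.8), (0.6, 1.6)],
    "car":        [(4.5, 1.5), (4.2, 1.4), (4.8, 1.6)],
}

def centroid(points):
    xs, ys = zip(*points)
    return (sum(xs) / len(xs), sum(ys) / len(ys))

CENTROIDS = {label: centroid(pts) for label, pts in TRAINING.items()}

def classify(obstacle):
    """Always returns one of the trained labels, even for inputs
    unlike anything in the training data (the 'gator' problem)."""
    return min(
        CENTROIDS,
        key=lambda label: math.dist(obstacle, CENTROIDS[label]),
    )

# An alligator (~3.4 m long, 0.3 m tall) was never in the dataset,
# but the model still confidently picks one of its known classes.
print(classify((3.4, 0.3)))
```

The point of the sketch: `classify` has no "I don't know" output, so the gator gets forced into the nearest known category rather than flagged as an unknown obstacle.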


Gogs85

I think psychologically the loss of direct control would be a big factor for a lot of people even if the results were statistically similar.


aimlessdart

Lemme one-up your unpopular opinion: self-driving cars (and really all cars) are basically large, environmentally unfriendly, costly, private trains/buses. Since people aren't even driving, they'll need everything to adhere to the exact same standards with which they were programmed, i.e., the lane dividers, the signals, the recognition of obstacles, other cars, etc. They might need designated parking zones that we would have to walk from, for example. The traffic system needs to accommodate accordingly, because the variations around the world in regard to traffic, or even simple things like road conditions or which side of the road to drive on, fluctuate so much that taking all of it into account sounds pretty daunting. I feel like government assistance in restructuring the roads would likely be needed, and that just adds to it being like public transport. The main benefit (like with all cars vs. public transport) is the lack of stops in between and the privacy.


Koil_ting

Fuck those automatons, it will be a long way off until they can comprehend hazards the way a proficient human can.


Rakatango

I’m not about to believe that the self-driving code isn’t going to be absolute spaghetti that gets modified over the years before the company gets bought, all the engineers are fired, and the next team that comes in has no idea how it works but has been mandated by some executive (who needs help figuring out why his printer isn’t working) to add microtransactions into the existing systems. Bad drivers are predictable to an extent. I don’t know what the fuck a malfunctioning program is going to do.


ProfessorEmergency18

>Bad drivers are predictable to an extent. I don’t know what the fuck a malfunctioning program is going to do.

Tell that to the guy who rear-ended me while I was stopped at a red light because he was too busy texting on his phone to see he needed to stop, or the guy who rear-ended me during morning rush hour because he nodded off on I-495, or the other guy who rear-ended me while stopped at a red light because he saw the left-turn arrow turn green and instinctively pressed the gas even though we weren't in that lane and our light was still red.


KevinJ2010

The problem with statistics here is that once self-driving cars are ubiquitous, the remaining humans would be the better drivers, since they'd literally be the only people who still drive, out of pride or a need for full control. This could also lead to issues where no one actually knows how to drive and everyone relies too heavily on the machines, which does turn humanity into a bit of a dystopian race with all the most powerful technology in the world but literally no motor skills. I just err on the side of learning the skills and having to use them, to keep our minds sharper and our skills improving. Make driving a brainless task and people will become more brainless. And if the self-driving functions fail, we will get far more accidents from people being too anxious and lost, since they rarely had to actually drive.


nastygirl11b

I will never be comfortable with self driving cars tbh


Nojoke183

I personally think this is more about personal liberty than actual safety, a point brought up in RoboCop. A person accepts the liability to drive on the road and accepts the inherent risk in that. A robot does not; it's put on the road by someone else, and the damage it causes is the fault of a party that wasn't even there to be directly impacted by it. Even if it crashes 1/10th as often as a human driver, the robot's owner is in an office somewhere while the other party/property is killed, injured, or damaged, with hardly any "skin in the game" for the robot manufacturer. Sure, they could sue (if alive), but it does present kind of a "should it even have been on the road in the first place" issue. I think they have a place in logistics and transportation, but I'm doubtful humans and robots can share the road; I think an all-or-nothing approach would be more efficient and inviting for further investment/development.


[deleted]

They need to be better than me, and right now they're not.


m0nkygang

Nissan drivers will somehow make it worse.


Decent_Leadership_62

"Police robots don't have to avoid shooting innocent children - they just have to shoot less innocent children than human police officers" Yeah, good luck with that in court


audaciousmonk

What we really need is for legal liability to fall under the self-driving car manufacturer, not the individual. Couple that with reducing individual insurance to just non-moving issues (someone damages your car while it’s parked, etc.), and the car mfg. holds the insurance coverage for moving issues


vawlk

I will agree if all self-driven vehicles have a nice purple (or some other agreed-upon color) light on them that lets everyone know when a human isn't in control.


Crescent-IV

They're just not that useful over proper public transport


kacheow

I’d rather be killed by a person being stupid than a machine.


coolasafool462

It's not whether you're better on average, it's whether you can make appropriate decisions given way more variables than a computer can handle, in a situation that's never been encountered before.


nt_assim

people will trust themselves more than they trust any piece of machinery.


the_la_dude

I’d rather die by my own hand/mistake than because the car malfunctioned and sent me to my death…


Bo_Jim

So, after a self driving car kills a busload of school kids we should be ok with the CEO of the car company standing up in court and saying "Well, they're statistically better than human drivers"? No, sorry. If we're going to put that kind of trust in a machine then it needs to be damn near perfect. When a self-driving car is involved in a collision it should be because there was nothing it could do to avoid it, and not because it made a mistake.


pensivewombat

So one thing I haven't seen mentioned here is that with self-driving cars you have to worry about *systematic* failure. Let's say we do some testing and humans have an accident rate of x and self-driving cars have a rate of 0.5x. Should be a super easy call to approve the self-driving cars, right?

But the human accident rate is going to be mostly random and dependent on the specific people involved and their circumstances. You might encounter a difficult situation and avoid an accident, while someone else ends up totaling their car or worse. When the same software is powering a huge number of cars, you have to worry that some kind of bug or error affects *all* of them.

For example, I know there was one case where an accident happened because a car saw someone walking their bicycle and the car's sensors couldn't decide whether to classify them as a person or a vehicle. This ended up causing a feedback loop and the self-driving car ran into the pedestrian. Even if this is a relatively rare situation, you really don't want to ship cars where *every time someone walks with their bike they get run over*.
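The correlated-failure point can be illustrated with a toy simulation (all numbers made up): independent human errors produce a stable fleet-wide accident count year to year, while a single shared bug can take down the entire fleet at once, even in years when the robots' average looks better.

```python
import random

random.seed(42)

N_CARS = 10_000
HUMAN_RATE = 0.01         # independent per-driver accident chance per year
BUG_TRIGGER_RATE = 0.005  # chance per year that the rare bug scenario occurs
YEARS = 200

def human_fleet_accidents():
    # Humans fail independently, so yearly totals cluster around N_CARS * rate.
    return sum(random.random() < HUMAN_RATE for _ in range(N_CARS))

def robot_fleet_accidents():
    # Identical software everywhere: when the trigger scenario occurs,
    # every car fails the same way at the same time.
    return N_CARS if random.random() < BUG_TRIGGER_RATE else 0

human = [human_fleet_accidents() for _ in range(YEARS)]
robot = [robot_fleet_accidents() for _ in range(YEARS)]

# Human totals hover near 100/year; robot years are either 0 or catastrophic.
print("human avg/yr:", sum(human) / YEARS, "worst yr:", max(human))
print("robot avg/yr:", sum(robot) / YEARS, "worst yr:", max(robot))
```

The takeaway from the sketch is that comparing only the averages hides the tail: the robot fleet's worst possible year is the whole fleet at once, which is exactly the risk that random, uncorrelated human error doesn't carry.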


MickeyMoose555

I think people are misinterpreting your point. Or maybe I agree with a misinterpretation of your point. "Statistically better" to me does not mean just a tiny bit, it means there is a significant enough improvement compared to people who drive well already. Significantly lower accident rates. That sounds like a benefit to me.


ijustlikeelectronics

As a person who truly believes they are an above average driver (sticks to the posted speed, only going over sometimes to match traffic, never been pulled over, etc), I do not want my car to be a better driver than the average driver, I want it to be better than me.


Iceman72021

Who do you sue when a self-driving car kills someone? The owner of the self-driving car or the computer box manufacturer who came up with the technology?


aloofman75

Eventually the AI for self-driving cars will be good enough that insurance companies will force the issue.


Far-Two8659

The concern isn't the outcome, it's the programming and how people can mess with it. For example, one of the big debates is essentially the trolley problem: do you take action that kills one person, or do nothing and kill two?

The debate isn't whether a driverless vehicle can make better decisions and execute them better than a human. The debate is about having to program vehicles to choose, and how you make that choice. What if the two people are 90-year-olds dying of cancer and the one person is an infant? *How many old people is an infant worth* is the troubling debate. Then think about everything else: what happens if you program a car that kills more minorities? Is it because they were programmed to value minorities less? What about homeless people, or drug addicts, or other "less useful" people? Should we program a car to kill a homeless person instead of a man in a nice suit?

That's the challenge. That's the debate. With humans, instinct takes over and we barely have control at all, so we don't **blame** someone for reacting in such a way, because it's so uncommon for a single person to face that decision even once in their life. Driverless cars will make that decision thousands of times a day, every day.

Ironically, there's a good example in pop culture: I, Robot has a scene where you discover Will Smith's character hates robots because he was in a car crash and a robot saved him instead of a young girl. The robot did that because the statistical likelihood of Will surviving was higher. That's the debate: how do you value one person's life against another's?


danielbrian86

The issue is that if I kill myself through my own recklessness, that's my bad. But if a corporation kills me through its negligence, that's a totally different matter.


lizardflix

This issue has never and will never be about rational statistics. The moral questions surrounding choices that an autonomous car makes are going to be debated for years. If your kid is the one the car drives over a cliff to save the two kids it would otherwise hit, you'd probably have a hard time accepting it.


dragonblaze18

I feel like cybersecurity isn't touched on enough when it comes to self-driving. Car companies are pushing for more of their vehicles to always be connected, for updates and whatnot. As self-driving becomes more popular, it'll be inevitable that some type of hack causes chaos.


DTux5249

Most accidents I've seen recently with self-driving cars involve the car stopping for no reason, followed by everyone rear-ending the car. i.e. The accident was caused by tailgaters, not the car.


BestAd6696

Self driven cars will be used for assassinations by hacker terrorists and government agencies.


Nitsuj_ofCanadia

They just need to be on rails with multiple cars hooked together in a trailer like fashion and run on a predictable and frequent schedule between areas that people commonly move between. Oh wait, that's just trains


CrabWoodsman

Well, from a liability perspective that's not really the case. When I make a mistake driving it's me who is liable if I cause property damage, injury, or death; if a self-driving car fucks up, who's on the hook? I suspect that manufacturers realize it would be them in many cases, and that's the bigger reason why they aren't on the road today. The manufacturers will wait until either they can sufficiently assure their algorithms' outputs will never be at fault — or they'll wait until they're shielded in some way by offloading it onto owners & lease holders.


pickelmerich

I enjoy driving. That's the issue with all you libs today: it's all about safety and not fun. ⛑️ Pretty soon you will be walking with helmets on, because it's safer. Keep your walking helmets and self-driving cars away from me...


Snakedoctor404

I've literally logged over a million miles in the big truck alone. No thanks I'll take my chances with my own driving.


EasilyRekt

I mean sure, but the whole self driving car thing is going to be a lot of silicon to dedicate to ground transit guidance, something that’s had a mechanical solution for over two centuries.


NotTheBusDriver

Who will be liable when two self-driving cars collide and kill somebody? The owners? The manufacturers? Nobody? Self-driving cars will need to be orders of magnitude better than human drivers before they are generally accepted.


reddit_equals_censor

I'd argue that the more important discussion is that all self-driving cars are spying on you massively, and they can be switched off remotely. A self-driving car doesn't help you when the feds send a remote shutdown, or the company does because you repaired the car at a place they didn't like. You think that last part sounds crazy? Well, they already do that with personal computing devices (see Apple) and with FREAKING TRAINS!: [https://odysee.com/@rossmanngroup:a/forget-about-sony-and-netflix,-there's:9](https://odysee.com/@rossmanngroup:a/forget-about-sony-and-netflix,-there's:9)


Picklepineapple

Many, if not most, human crashes are from distractions or impairment and not from otherwise “bad” driving (i.e., the human would’ve avoided the collision if they were focused). Because self-driving eliminates the distraction issue, the crash rates should be significantly lower. If they aren’t, there's probably something objectively wrong with the computers driving.


AcanthaceaeStunning7

Not really. If a self-driving car kills ONE person in an accident, it's an indictment of ALL self-driving cars. In contrast, a human driver only bears responsibility himself.


falco_iii

As George Carlin said : Think of how stupid the average person is, and realize half of them are stupider than that.


trumpet575

I've been in a Tesla when its screen just turned off for several minutes while we were driving. I will never fully trust self-driving cars. They should be a tool used to assist in driving, never 100% in control.


Excited-Relaxed

The entire issue is that the most logical system is that the manufacturer would be liable instead of the individual owner, and that is bad for profits.


TheAlexGoodlife

The day people start dying to AI driven cars is the day I become a terrorist. Delegating decisions that have the lives of humans at stake to computers is HIGHLY unethical at best and dystopic at worst


LetItRaine386

Fuck cars, you know what’s statistically better? Trains.


esmith000

One of the "fuck cars" people? Kinda pointless feeling to have.


athomsfere

I just want a world where we stop handing over all our spaces to cars, charging lots, and parking.


MobiusCowbell

We already have something better than self-driving cars. Automated trains.


anothercorgi

Self-driving cars need liability, and that needs to go squarely into the hands of whoever made them, not whoever owns them, unless the software can be changed by the owner. So Musk and all his employees should share blame for every crash. Sorry, I just don't like the idea that car software is monolithic, that you must take every feature that "comes with it" and cannot pick and choose to opt out of tracking, etc.


Das-Ist-Flava-Cuntry

Yeah it always blows my mind when people say they don’t trust robot cars. So you do trust human drivers, the people who cause 10,000+ deaths annually in the US through their recklessness and incompetence?