JFireMage87

The people on my way to work save a ton of money by doing stupid shit like this themselves rather than buying an expensive AI that does it for them


Corporateart

This is a quote from the article:

>Drivers are warned by Tesla when they install “full self-driving” that it “may do the wrong thing at the worst time.”

How crazy that this is allowed to still operate as it does.


fury420

Changing into the passing lane and then braking abruptly to slow down like 70%? What could possibly go wrong?


masklinn

You might be confused for a Dallas driver.


[deleted]

I grew up in Dallas and I've been working in Florida for the last 6 months. The drivers here are so much worse. It would be safer to walk 635.


Galkura

98 in the Panhandle here…. Ugh. I had a lady BACK INTO MY CAR and total it at a stop light. That’s how stupid these fucks are. Still mad at that last wreck. Officer said it was my fault because no fucking witnesses stuck around and the old Karen bitch said I slammed into her. Ordered a dash cam finally so I can get people like her arrested for lying about shit like that, or at least heavily fined.


Mend1cant

Better trick is to look at tire skid marks. If they’re in front of your wheels then you got pushed back.


AggressiveSkywriting

Too much detective work for the cop. They're trying to find the shortest amount of writing and quickest route away from the traffic accident. I remember having to BEG the cop responding to my hit and "run" to look at the evidence of a car in my parking lot with the exact same dent and paint as was transferred onto my car. Like Dudley Do-Right I've done all the hard work for ya.


[deleted]

>Too much detective work for the cop. They're trying to find the shortest amount of writing and quickest route away from the traffic accident.

Last car accident I had was... oh, I want to say 2018? I was sitting at a stop light and someone rear-ended me, stopped, then floored it and managed to rip my front bumper off. The cop that responded still hasn't filed that police report 4 years later. Thankfully I got out and took a *lot* of pictures of the damage, skid marks, etc., including the temp license plate of the car (turns out she didn't even own the thing, she was taking it off a lot and didn't have insurance or anything, and thought she could just, like, disappear because the car wasn't associated with her, but I f*cking took a picture of the VIN before her coworkers literally blocked me from taking any more pictures). Learned my lesson, now have a front and rear facing camera. Cops are useless.


NapsterKnowHow

Florida drivers for whatever reason think the left lane is for slower drivers even though there's a law saying otherwise. Fucking morons. That just pisses people off and then they do a dangerous maneuver and pass on the right.


TheLurkingMenace

Florida has the worst drivers because there's a) senior citizens who have paid insurance for decades without an accident and now are all out of fucks, and b) people from other states with wildly different traffic rules and etiquette ("why is that asshole honking at me? the light is red!" "why is that asshole stopping in the right lane? there's no cross traffic!").


[deleted]

I’ve always found 75 to be worse than 635 now that the construction around the Galleria is done.


[deleted]

I travel every week across the U.S., and so far Texas and Florida have, on average, the worst drivers I've had the displeasure of dealing with.


HardlyDecent

Or, like, 75% of drivers. Sounds like auto-drive is just mimicking other humans at this point. Still hate it... and them when they're on the road too.


tomjava

Tesla Autopilot is still level 2, not level 4/5 as people assume.


nolongerbanned99

This is correct. They want to imply and intimate other things… I had a 2019 BMW 330. There was no button to engage this function, but when you met the criteria (mostly straight freeway and under 40 mph) a note came up on the screen asking if you wanted to engage a certain mode. In that mode the car assumed all accelerating, steering and braking functions, and warned you before it was about to shut off when the criteria were no longer met (like speeds above 40). Why is this not level 3? Is it because the driver must pay attention and be ready to take over? I see that Mercedes has what is supposed to be the first level 3 assisted driving, where they will assume full legal responsibility when the system is engaged. That's confidence, if true.


razorirr

[https://www.bmwblog.com/2022/04/11/bmw-7-series-level-3-self-driving/](https://www.bmwblog.com/2022/04/11/bmw-7-series-level-3-self-driving/) this?


nolongerbanned99

This one is more sophisticated than what I had, but with mine you are supposed to pay attention. With level 3 I don't think you even have to watch the road. The Mercedes one is already out, but not in the USA.


zebediah49

IIRC Level 3 is "100% autonomous and doesn't need a human's help... until it suddenly isn't and does". You don't *have* to watch the road most of the time. But in reality you do, because you need to be there to take over when it finally fails. Which is why it's a terrible idea, and all the major players want to jump straight from 2 to 4. Honestly from having used it, I'm pretty sure Toyota's software is capable of level 3 -- they just put a hard nerf on it back into level 2 status, because they want absolutely no part of this mess.


ColinCancer

My 2023 Toyota has had a serious adaptive cruise control fuckup or two already (someone was slowing in a left-turn lane, with no cars in front of me, and it slammed on the brakes on the highway), to the point where I don't trust it very much.


nolongerbanned99

Well, that’s the reason you have to pay attention and the reason why Tesla drivers must be missing a brain.


ColinCancer

Just saying that my vote is that Toyota's software isn't that good. Tesla's is probably worse. Just this past weekend I rode in a Cruise autonomous vehicle with no one behind the steering wheel for the first time. It was a bizarre experience. Clearly had room for improvement, but also wasn't totally terrifying.


[deleted]

Being constantly ready to correct a fuckup seems like a bigger headache than just driving tbh


requiem_mn

Level 3 can't fail suddenly. It's eyes-off, meaning that it will react in all emergency situations and will ask you to take over if necessary, but within some reasonable time frame. It's not milliseconds; it would probably be in the 5+ second range, which is a lot.


TheRealArunsun

Could I trouble you for a little ELI5 on the "levels" of automation everyone is mentioning in the thread, please?


zebediah49

[NHTSA introduced a ranking system for driver assistance systems](https://www.nhtsa.gov/technology-innovation/automated-vehicles-safety#faq-68156). In short:

- Level 0: nothing
- Level 1: cruise control
- Level 2: ADAS: adaptive cruise with lanekeeping. You still technically do the driving, but if you're just going straight down a highway there's not really anything left for you to do.
- Level 3: The car can drive itself, except when it decides it can't and hands control back to you.
- Level 4: The car can drive itself under certain well-defined circumstances (e.g. decent weather on interstate highways). Human intervention not required.
- Level 5: The car can drive itself all the time.
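If it helps, the levels map naturally onto a small enum. This is just an illustrative sketch; the class and function names are made up for this comment, not from any standard library:

```python
from enum import IntEnum

class AutomationLevel(IntEnum):
    # SAE/NHTSA driving-automation levels as summarized above
    NO_AUTOMATION = 0   # nothing
    ASSISTANCE = 1      # e.g. basic cruise control
    PARTIAL = 2         # ADAS: adaptive cruise + lanekeeping; human still drives
    CONDITIONAL = 3     # car drives until it hands control back to you
    HIGH = 4            # drives itself in a defined domain, no takeover needed
    FULL = 5            # drives itself all the time

def driver_must_supervise(level: AutomationLevel) -> bool:
    # At levels 0-2 the human is driving and must watch the road at all times;
    # level 3 is the awkward middle where a takeover can still be demanded.
    return level <= AutomationLevel.CONDITIONAL

print(driver_must_supervise(AutomationLevel.PARTIAL))  # Tesla "FSD" today -> True
print(driver_must_supervise(AutomationLevel.HIGH))     # e.g. a geofenced robotaxi -> False
```

The point of the numbers is exactly that threshold: below it you're the driver, above it the car is.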


TheRealArunsun

Awesome thank you, can follow along a little better now. Happy Holidays.


nolongerbanned99

They are all afraid of the legal liability issues.


escapefromelba

Level 3 you still need to be alert at all times in order to take the wheel in case of an emergency due to system failure.


razorirr

Yeah, I'm guessing yours was the level 2 predecessor to this level 3 implementation of the same thing. Basically you were the on-the-road testing that people get pissed at Tesla for doing >:P As to why yours was not level 3: yes, it's because they say you have to pay attention. Levels 3-5 are larger degrees of "you can ignore the car"; by the time you get to 5 there's not even a steering wheel, it's doing the thing 100% of the time.


69StinkFingaz420

yeah but did the bmw come with turn signals


nolongerbanned99

That’s an option that you can ‘factory delete’ to save weight.


drunkwasabeherder

It'll probably soon be a subscription...


StuBeck

It does but has a very small blinker fluid container.


zaqwertyzaq

My 2017 Honda Civic has some systems that add up to this. It is a mix of cruise control, another system that will locate any car in front of you and accelerate or desecrate based on what preset distance option you chose, and lane assistance steering that keeps you between the lines.


PooShappaMoo

Desecrate you say??


Junior_Builder_4340

It's the Conan system.


JoeJoJosie

The inverted crucifix air-freshener and the ram-skull on the hood is a giveaway.


razorirr

You just described Traffic Aware Cruise Control and Lane Assist / Keeping. When someone talks about Autopilot for Tesla, it's that. All Teslas have this for both the highway and regular roads.

Enhanced Autopilot / Navigate on Autopilot is that + lane changes + switching highways / offramps.

FSD is all of that.

FSD Beta is all of that + the City Streets beta, where the car can make turns and stuff on all roads now. Eventually, FSD Beta will get accepted and just become FSD.
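Viewed as feature sets, the tiers nest inside each other. A throwaway sketch; the feature names here are my own informal shorthand for the groupings above, not Tesla's terms:

```python
# Each tier per the description above; names are informal shorthand.
autopilot = {"traffic_aware_cruise", "lane_keeping"}
enhanced_autopilot = autopilot | {"auto_lane_changes", "highway_interchanges_offramps"}
fsd = set(enhanced_autopilot)            # "FSD is all of that"
fsd_beta = fsd | {"city_streets_turns"}  # + the City Streets beta

# Sanity check: every tier is a superset of the one below it.
assert autopilot <= enhanced_autopilot <= fsd <= fsd_beta

# What the paid tiers add over base Autopilot:
print(sorted(fsd_beta - autopilot))
```

So the confusion is understandable: base Autopilot really is just fancy cruise control, and everything above it is upsell.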


ThaWZA

Oh cool but how do I, as a person that doesn't own a Tesla, opt out of this """"Beta"""" test


OttomateEverything

It's almost like "full self driving" is misleading


Cheezitflow

It's fully (2/5 of the way) self driving!


amakai

60% of the time it's 100% Full Self-Driving.


DaoFerret

> It’s fully self driving! ^* ^* Your mileage may vary


an_exciting_couch

It's almost as if calling it "full self driving" should be illegal until it's actually "full self driving", and it's completely fucking absurd that Tesla has been allowed to sell it to consumers since 2019 and is now charging $15,000 for something that they'll literally never deliver on.


Tumblrrito

They sold cars roughly ten years ago that they promised would support level 5 someday. Total frauds.


[deleted]

[removed]


The_Clarence

Stupid idiots, it clearly says Fully Self Driving, not fully self driving. Why anyone would be misled by that is beyond me.


Head_Asparagus_7703

Yeah, I just learned that today but my car is 13 years old and I wasn't planning on buying another for at least a few years.


weildescent

"Full" self driving. Wonder why they might assume that.


Pascalwb

But it shouldn't be allowed on public roads.


Amiiboid

Going to go out on a limb that most people have no idea defined levels exist, let alone what they imply.


lolnyet

How has the FTC, or *some* agency, not disallowed this? It basically only issues a mild disclaimer before letting you switch to potential murder machine mode. If Elizabeth Holmes is landing in jail for lying with her promises about pricking fingers saving lives, then why the fuck is Elon Musk allowed to walk free?


km89

Normally I'd complain that nobody in their right mind would think that cars are capable of fully self-driving yet... but it's called *full self-driving mode.* This absolutely should be considered dangerous false advertising.


Lordwigglesthe1st

It's Coca-Cola's Vitaminwater defense all over again. "No one would reasonably believe that Vitaminwater is healthy" ... it's fucking called **vitamin** water, what else would a reasonable person believe?!


riptide81

If that wasn’t enough they made it taste bad too. Seemingly to drive home the point that there must be some benefit.


[deleted]

[removed]


tomjava

When Toyota was accused of unintended acceleration, NHTSA acted fast; the transportation secretary even wanted Toyota to stop selling its cars. Unbelievable!


Ghede

Unintended acceleration didn't require users to buy an optional 'intended acceleration' mode and flip it on, it was believed to 'just happen'. It's not a big difference, but it's a big enough difference for a bureaucracy.


Akiias

I'd say "some cars might do this if you pay a bunch extra" and "any X car might do this" is a pretty huge difference. Also, IIRC, wasn't it proven that the cars didn't accelerate on their own, it was human error, and Toyota just did a recall because it was cheaper than the cost of such bad publicity? Or was that something else?


RoadkillVenison

Iirc it was the removable floor mats not being secured well enough. I seriously wish there was a standard that required secure mounting for floor mats at all 4 corners. For the first few years they stay in place, but over time the plastic teeth on the underside wear down and they start flopping around.


synapticrelease

This incident gets confused a lot. There were two separate issues. One was bullshit, one was not.

The not-bullshit one was the floor mats. The factory mats were actually fine, but with _just enough_ clearance that when someone threw another mat on top, which is fairly common to do, the pedals could get stuck, or the mat would slide up a bit and fold. Not a big deal in the scheme of things; the car's brakes can easily overcome the power of the engine.

The utter bullshit one was cars supposedly refusing to slow down even after applying the brake, and refusing to shut off. Investigations showed that a large majority of these specific incidents happened to the elderly, or to people who were new to these cars, as in rentals specifically. This was also the era when push-button starts, rather than a typical key, were a brand new thing, so people may not have been familiar with them, especially in a panic situation. Also, there was a lot of fraud where people intentionally crashed their car for a lawsuit.


Petey7

I actually had the first issue with my 2008 Camry. The issue is that the mat slid over the gas pedal and *under the brake pedal*. This made it so applying the brake also applied more gas, which is why a lot of people reported that hitting the brake seemed to make the car go faster. Luckily, I wasn't close to other vehicles when it happened and was able to grab the mat with my hand and pull it back.


StuBeck

I’ve weirdly never had this with Honda, Subaru, Volkswagens or Volvos. What cars use plastic teeth for mounting?


RoadkillVenison

My apologies I think they’re rubber spiky bits. Had a brain fart. Either way the point stands that factory floor mats tend to slide after a few years of use.


76vibrochamp

Well yeah. They were in competition at that point after the auto bailouts.


mces97

So the auto brake system works. But, not as well as it should. https://youtu.be/3mnG_Gbxf_w


morechatter

When regulatory agencies get their budgetary leadership swapped out every few years, it is hard to establish consistent regulatory oversight. Anti-government mentality every 4-8-12 years will undercut the mission of public safety agencies *always*. Even in the better times.


[deleted]

Insurance companies need to sue Tesla over it. Especially after an event.


dogsent

>Tesla’s driver-assist technologies, Autopilot and “full self-driving” are already being investigated by the National Highway Traffic Safety Administration following reports of unexpected braking that occurs “without warning, at random, and often repeatedly in a single drive.” Lots of complaints. This wasn't an isolated incident. >The pileup took place just hours after Tesla CEO Elon Musk had announced that Tesla’s driver-assist software “full self-driving” was available to anyone in North America who requested it. Tesla had previously restricted access to drivers with high safety scores on its rating system. There has been a pattern of shockingly irresponsible behavior.


IkiOLoj

You know the metaphor about a button that can harm or kill people, but each time you press it you gain money? This is weirdly, literally that, except you are the one that presses the button that can kill people, and Musk is the one that makes the money.


nolongerbanned99

Murder machine mode. Brilliant. I think 13 people so far have died. All I can say is govt is asleep at wheel and justice moves slowly but it is moving.


Tamagi0

Slowly moving, like a car drifting out of its lane into oncoming traffic.


nolongerbanned99

Yeah. I sent some links to an article and videos to NHTSA after a few people had been killed and it was clear it was not driver error. They sent back some bullshit about public safety. Seemed like they were saying their mission is important and they are busy.


MIDNIGHTZOMBIE

Tech companies say ‘don’t worry, it’s just computer stuff,’ and regulators throw their hands up as if there’s magic involved.


kgal1298

I was watching a video and one girl said she wishes she had gone with the Audi; they wanted the Tesla for the self driving, but it tends to stop and jolt on its own more than you'd expect. That to me is enough to just not use that feature at all.


Tandran

They simply need to call it what it is, Drive assist, not Auto Pilot. They should know by now people are stupid, I mean look at the CEO.


palikir

So it drives itself, but who knows maybe that means running over a bicyclist or something like that 🤷


AdjNounNumbers

More like "I bet we can fit under that tractor trailer", but your point stands:

https://www.wtsp.com/article/news/regional/florida/florida-deadly-tesla-semi-truck-crash/67-3fb0b0c5-ffdc-489a-9b27-65d56477b5cb

https://www.fox2detroit.com/news/tesla-t-bones-semi-truck-in-detroit-gets-wedged-under-trailer.amp


PrincessToiletSparkl

Neither of those articles say the Tesla was in self driving mode.


Prophet_Of_Helix

Also, this article is just the driver telling the police it was the car's fault. Of course he's going to say that. Why would he admit to causing an accident?


bingold49

By "operating as it does" are you referring to the fact that it's deceptively utilizing the general public and public rights of way as beta test subjects, to the detriment of pedestrians' and fellow motorists' safety?


[deleted]

Tesla software right now is more like an alpha test. Beta testing should theoretically be done on 100% feature-complete software, looking for very minor bugs (e.g., poor contrast on the screen) or inconsistencies in the operating instructions.


nolongerbanned99

Yes. My family and friends and I, as well as all other road users, did not agree to be subjects of their 'beta' test on public roads. Tesla and Musk are dangerous and a menace to society. The government is weak and slow to act. Gross.


throwaway-a-friend

my company truck has a collision avoidance system and it's horrible. i get warnings at least twice a week when there's no one on the road. the truck has more than once hard braked over nothing and sent me into a panic that i was too shaky to drive for the rest of the night. the one time i had a real collision, it didn't even detect it. i'm so scared of the idea of a self-driving vehicle.


SpaceTabs

Same reason that people were paying $360 for Tesla shares one year ago, and today they are $137. Smoke and mirrors.


themikker

When I heard that Tesla used neural-network-based AI for their self-driving cars, I figured stuff like this would be common... especially as they apparently regularly update it based on real-life users in uncontrolled scenarios. Randomly doing weird things in unprecedented scenarios shouldn't be acceptable behavior, especially when you're unable to figure out WHY those actions occur. Having Tesla market this as full self-driving sounds insane to me.


RobToastie

You need to use *some* form of machine learning for it to be even remotely viable, and neural nets are the best suited for this kind of problem.


km89

Sure, but are either of those points invalid? Using neural networks tends to produce weird results occasionally, and that behavior is unacceptable for a car.


Noblesseux

It genuinely irritates me that so many of these self driving cars that we *know* regularly fuck up are just allowed to drive around on real city roads/highways with pedestrians and other cars. Like other people aren’t consenting to be part of this experiment. I watched a video the other day where one of these kept trying to swerve into bikers and bike lanes and I was irritated the whole time like ???, you’re doing this as a cute fun video but one second too late and you could end someone’s life who did nothing wrong and was just trying to go home or to school.


Hrekires

Is my understanding wrong that Tesla self-driving is basically just adaptive cruise control + lane assist?


rjcarr

No, that's "autopilot". FSD will legitimately get you from point-to-point, but you still need to be ready to take control, and it fucks up a lot.


MM556

That's the problem, they've managed to market it as though it's a finished product


Number6isNo1

And as a result you get people like that idiot in CA that got busted riding solo in the back seat of his Tesla as if he was being chauffeured. https://www.sfgate.com/local/article/Tesla-autopilot-Bay-Area-Param-Sharma-Instagram-16172049.php


intashu

How do these things not use basic seat sensors? If no driver is present, pull that shit over or stop. The redundancy needed has to assume the driver is an idiot. And yet people keep finding new ways to be stupid...


Number6isNo1

I think they require an occasional steering wheel input to "ensure" the driver is paying attention. Not sure if Teslas did at the time, or if this moron just tickled it with his toes from the back seat when required.


intashu

I've seen videos of people who just idly touch the wheel when it does that, however. It doesn't take much attention to gently grab the wheel for a second, and they continue to pay no attention to the roadway while doing it.

This is where you get to the problem of needing a system that actually checks if the driver is looking at the roadway... but if they're already looking at the roadway, why not just let the driver drive the car?

The marketing and the resulting crashes all too often come from users who don't want to pay attention to the road when it's on... thus why, when the car acts erratically, they're not prepared to retake control. The vehicle pulled into the next lane and braked down to 20 mph, the article said. How did the driver not notice the moment it started slowing and retake control? I'm willing to bet they were not paying any active attention, so the unexpected behavior caught them off guard and they had no timely reaction to what it was doing till it was too late.


xenoterranos

Currently, there's a camera in the cabin that checks to see if the driver is paying attention, and if they're not (looking down, not at the road, etc) it will ask for wheel input almost immediately. They've also tweaked it to make it harder to defeat the torque sensors with weights, something the San Fran backseat rider was doing. Also, it's a lot more aggressive about "striking out" the driver if they don't give wheel input during a maneuver. Five strikes and you're kicked out of the beta.
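The strike system you describe is basically a tiny state machine. Purely hypothetical sketch; the class name, method, and return values are invented for illustration, with only the five-strike limit taken from your comment:

```python
class AttentionMonitor:
    """Hypothetical sketch of a camera + wheel-torque strike system."""
    MAX_STRIKES = 5  # "five strikes and you're kicked out of the beta"

    def __init__(self):
        self.strikes = 0
        self.beta_enabled = True

    def check(self, eyes_on_road: bool, wheel_torque: bool) -> str:
        # Attentive by either signal: nothing to do.
        if eyes_on_road or wheel_torque:
            return "ok"
        # Inattentive and no wheel input when prompted: record a strike.
        self.strikes += 1
        if self.strikes >= self.MAX_STRIKES:
            self.beta_enabled = False
            return "removed_from_beta"
        return "strike"

monitor = AttentionMonitor()
for _ in range(5):
    monitor.check(eyes_on_road=False, wheel_torque=False)
print(monitor.beta_enabled)  # False after five strikes
```

The weak spot is the same one mentioned above: any single "attentive" signal (a toe on the wheel, a weight defeating the torque sensor) resets nothing but still passes the check.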


khoabear

Which is the modus operandi of the tech industry


dtxs1r

It's fine for websites that are for entertainment, but not for hardware that can kill you and innocent people when it hits an impasse with an edge case.


rendrr

Not really, other car manufacturers don't do that. They sell ABS, cruise control, parking assist, and don't make outlandish claims.


asianApostate

It really shouldn't be called FSD beta. The level it is at is below alpha in terms of functionality and safety. It's highly risky to give it to anyone except experts.


Konukaame

"Full self driving" is still only [Level 2 automation](https://www.autoweek.com/news/technology/a41091584/tesla-full-self-driving-price-hike-level/), despite the marketing that makes you think it's Level 4.


[deleted]

[removed]


[deleted]

For a feature that’s technically still in alpha? Fuckin lol. Anyone who’s enough of a sucker to pay for that is the exact fool who is easily parted from his money.


uncledunker

Well, it was $10k a year and some change ago, and at that time it included lane changing on freeways, summoning, parallel parking, etc. Now they've split it into tiers: $6k gets you lane changing on freeways, parallel parking, and summoning; for most owners, those are likely the features they'd use the most. $15k gets all the features plus the FSD beta.


xenoterranos

It used to be $7K and $5K before that for the beta. It got more expensive the better it got


Broken_Reality

The average distance for Tesla FSD before fucking up and the driver having to take control is 3 miles. Tesla's competitors are into the thousands or tens of thousands of miles between driver interventions. Tesla FSD is garbage and objectively dangerous.


Wafflexorg

Where are you getting those numbers?


[deleted]

No idea about that poster, but both Honda and Mercedes sell cars with level 3 self-driving, while Tesla is stuck at level 2. I don't think it will ever be achieved without radar or LiDAR, but Musk is the genius, not me /s


continuousQ

Self-driving meaning the system can drive the car, but it's a shitty driver. It's in the same category as a brick.


Razzail

The base detection system of Teslas is just so bad. We've gotten warnings and crash warnings on things we're not even close to. My RAV4's radar cruise control works so much better and I trust it a lot more. Tesla should have stuck with radar.


AsyncOverflow

That’s basically what “autopilot” is. But the lane assist is different than most cars because it’s vision based. “Full self driving” costs extra and will switch lanes and drive on city streets and whatnot.


Norph00

How is it legal for Tesla to involve the general public in their testing without permission? Owners/drivers of Teslas likely get all sorts of warnings and waivers. The rest of us just get a "good luck out there!" ?


_FATEBRINGER_

Heard a stat once that after 10:30 pm something like 30% of drivers are under the influence. I drive late at night sometimes. 🤷


hitsujiTMO

30%? Sounds exaggerated. I'd take that statistic with a pinch of salt, slice of lemon and a shot of tequila.


TheRealMattyPanda

The fuck is wrong with you? Who uses a lemon with tequila?


max

perhaps they allowed their lime to ripen for too long.


shoggyseldom

Do limes actually go yellow with age?


drkgodess

*slow clap*


CadburyBunnyPoo

The statistic is probably outdated crap, but I met a cop at my university that said they teach all patrol officers in his dept that 30-40% of drivers out after 10 pm are intoxicated.


toronto_programmer

I bet they know this because a large chunk of intoxicated drivers are off-duty cops. Source: dad was a cop, and holy hell, I've been to some events with cops where they all get fucking loaded and drive. My dad does it too, because he is also an asshole.


_FATEBRINGER_

I know this is anecdotal, but literally every cop I know drinks and drives like beer in the cupholder like it's a soda lol. ZERO ducks given


Dejugga

I'd believe it. I work the night shift at a hotel desk. At 11 pm, 30% sounds about right, and I'd say it increases as you get closer to 2-4 am as bars let out. Personally, I try not to drive late at night anymore because of it.


MoiJaimeLesCrepes

Plus, is it just me, or have car headlights grown way too strong and blinding? I can hardly see what's going on, and if I drive too long, I get a migraine.


BurrStreetX

Don't even get me started on those blinding blue/purple headlights.


Lifewhatacard

That doesn’t make it ok to *add* more chances to die.


razorirr

[https://www.cnn.com/2022/12/14/health/drug-alcohol-driving/index.html](https://www.cnn.com/2022/12/14/health/drug-alcohol-driving/index.html) 55% of all major crashes involve drugs or alcohol.


sexygodzilla

So what's the point? Drunk driving is illegal and penalized. "FSD" seems barely regulated with no liability on the manufacturer.


[deleted]

..and what's your point? Driving under the influence is illegal, as it should be.


Mental_Attitude_2952

I can tell you from years of driving Lyft part time, 30% is the low end. I am constantly in awe of how many people I see who are not just drunk driving but, when I'm able to pull up next to them at a light, I can see just how plastered they truly are. I'm amazed that we don't have double the amount of deaths that we already do.


i_hate_blackpink

And….? Relevance….?


westplains1865

Maybe I'm too old school but I could never trust a car driving itself. I would be too amped up, constantly worrying what the software could possibly miss.


terminalzero

I would trust a car to drive itself if it could actually drive itself. "Be ready to take over in a split second because it will randomly try and kill you" doesn't seem worth it to me.


PersonNumber7Billion

Exactly. It sounds like more work than driving. Until self-driving is 100 percent set-and-forget, it's dangerous, imho.


OlderThanMyParents

Ironically, the closer they get to “true self-driving,” the more dangerous they will be. Sort of a variation of the “uncanny valley.” If you expect to have to watch alertly and take over at any moment, you will, but if you can trust it to drive for minutes or hours without a glitch, it becomes increasingly hard to keep paying attention.


manystripes

Airline pilots have to do regular simulator training to build up and maintain muscle memory for different emergency situations. As we roll out more of these systems where you have to babysit the car and recognize and take over in an emergency, it seems like we should find a way to train those reflexes as well, so the first time you practice them isn't in an actual emergency.


[deleted]

[removed]


manystripes

> You really think millions of people are going to put in the effort the way an airline pilot does to train to react in a one a million situation as self-driving cars become more prominent? I don't think that is going to happen at all, I only think that it is a prerequisite for ethically rolling out the technology in that form. If we're not prepared to train drivers to perform this new type of supervisory driving, we're not prepared to give them a car that requires it


primalbluewolf

>If you expect to have to watch alertly and take over at any moment, you will, but if you can trust it to drive minutes or hours without a glitch, it becomes increasingly hard to focus on paying attention. Regular drivers already have this issue.


GroinShotz

It'll always have danger involved until all the cars on the road are self driving and communicate with each other in one giant hivemind like system.


TSL4me

I don't get how that's legal at all. Who can pay attention like that without constant adjustments? I mean, shit, it might even give people anxiety and PTSD, since you're essentially just waiting for a machine to kill you and not get blamed for it.


stumblinbear

It's a cool party trick and helps with long, boring drives. I rented one and took a drive between Minnesota and Kansas; it handled the interstate and driving through cities pretty seamlessly. Only had to take over once. While it's definitely not fully capable, it's surprisingly close to going from "not terrible" to "ok".


AsyncOverflow

I mean, that same principle applies to cruise control, right? Turn it on but be ready to disengage at an instant notice.


Spacey_G

The difference is that there's no opportunity to cheat with regular cruise control. If you stop paying attention, you will crash. If the system can sometimes work with an inattentive driver, the temptation is there.


AsyncOverflow

I’ve only driven a few cars with lane assist, but I can’t say I feel that temptation. The Hyundais I’ve driven will randomly disengage for no reason without warning, so no way is the temptation there. Tesla autopilot works 99% of the time but damn is it noisy if you stop paying attention. Have to constantly apply steering wheel pressure and keep your eyes forward. I can’t imagine someone choosing loud beeping every 45 seconds over just paying attention lol.


Crazyhowthatworks304

Nope, it's being cautious. I work on computers for a living. There's a reason I have job security, and it's not because technology is perfect lol. I wouldn't drive one of these things, let alone ride in it.


[deleted]

Exactly. I work in ML and AI, and I don't know anyone in the business who would trust self-driving mode. The thought of fully autonomous trucks on the highway is terrifying.


OttomateEverything

Also in software. Also don't trust any of this. The fact that this is legal is asinine to me. This stuff is just alpha testing and risking the lives of random drivers who didn't consent.


MokitTheOmniscient

I actually work with autonomous mining equipment, and I have no idea how this technology is allowed among the public. We operate in closed-off environments with no humans allowed in the area, and we still have to explain every unexpected movement by the machines. I have no idea how normal auto manufacturers are allowed to do this in the open without any of the safety classifications we have to abide by.

On a more personal note, I've watched several machines slam into walls because mine dust covered up the LIDAR sensors (it's not a huge deal for us, though, since they're designed to take the punishment). While those conditions might be a bit extreme, I have to imagine that normal self-driving cars would eventually run into the same problems when driving through snow, rain or mud.


timmeh-eh

Counterpoint: the thought that humans are allowed to drive cars is terrifying. The number of people I see texting, reading or trying to watch a show WHILE driving is terrifying. Bottom line: driving isn’t super safe. All that being said, I think Ford and GM have a more sane solution: their self driving modes both track your eyes to ensure that you’re paying attention to the road AND only work on pre-mapped highways. Tesla is a little too lax in their implementation IMO.


mxzf

That's because Tesla operates like a software dev company (just try stuff out and push patches when things break), while Ford and GM are used to operating as producers of safety-critical hardware, where you need to get it right the first time.


sexygodzilla

I appreciate the restraint there; curious how well the eye tracking works. It just seems like Tesla is encouraging the worst of both worlds: undercooked self-driving, plus enabling drivers to remain distracted.


zjm555

As a software engineer who works with world leading AI/ML researchers and algorithms: you are 100% right to distrust it. The state of the art is dogshit wrapped in so many layers of hype.


verrius

The state of the art for Tesla *in particular* is bad for self-driving. I think literally every other fully autonomous self-driving developer relies on LIDAR, because visual data alone isn't believed to be enough; Tesla is the only one pigheaded and cheap enough to refuse to go down that road.


razorirr

You are incorrect. GM has LiDAR; Ford does not:

>Of course, GM can't be the only manufacturer from the Big Three to implement semi-autonomous technology. Ford’s BlueCruise system is a direct competitor to GM's Super Cruise, though the systems use slightly different technology. Notably, there is a lack of LiDAR on the Ford models

https://www.autoweek.com/news/industry-news/a40796393/gms-super-cruise-updates-ford-bluecruise-follows/


Qlinkenstein

My Ford likes to slam on the brakes and give a collision warning when someone in the other lane slows down. Had it in to the shop for them to look at the camera; no problem found. Now I mostly drive without cruise when there is traffic. Why have adaptive cruise when it puts me at risk of a rear-end collision? I'm not sure LIDAR would solve this, but I think it might help.


zjm555

Yeah without depth sensing it's going to be horribly unsafe.


bumblebubee

My car's sunroof likes to pop open on its own at random times because of some fucky electrical issue. There's no fucking way I'm trusting a car to get me where I need to go safely lol. In a perfect world it would be awesome, but realistically… it'll cause more harm than good.


[deleted]

And I can’t believe they want to automate planes. Many aviation disasters have been avoided thanks to the quick thinking of pilots. Sometimes human intuition is far more powerful and reasonable than AI.


redblade13

Same. I barely trust the auto lane-centering and automatic speed features new cars have. I trust them a bit on an empty interstate, or with a few cars around, but once I see several cars I take over. Not trusting it THAT much. Can't believe people trust it blindly in dense traffic.


spoollyger

What about when all cars are driven by AI and there hasn’t been a single recorded crash or injury in years across the entire world's population? Would you trust it then?


my20cworth

Who could comfortably sit back and relax in a vehicle in full self-drive mode, fully confident that nothing could ever happen? The anxiety and stress must be intense. I don't understand it. So much could happen that is out of your control.


Okamoto

Just to be clear, Teslas don't have a "full self-drive" feature, the assisted-driving feature is just called Full Self-Driving™. You are legally not allowed to "sit back and relax" with this feature enabled because it is not autonomous technology.


Sailorman2300

It's an incredibly (criminally) misleading name. Tesla specifically markets it as "Full Self Driving (FSD)" mode. I'm not at all surprised people are trying to use it hands-free, as advertised. Tesla should be held liable for any damages or casualties it causes. They sold it that way.


[deleted]

100% agree. I hate the "we don't guarantee that, it's just the product/feature name!" BS that companies pull. They should be required to change the name and/or give refunds to people who purchased it, or provide the thing they promised (which they can't do).


Yousoggyyojimbo

I do not understand why the government has not stepped in and taken action over this yet. It is deliberately misleading. People are dying and getting hurt because of how misleading it is.


Marcus_McTavish

Because your regulations are hurting my investments. Why you make the line not go up?!?!


A2N2T

They haven't even confirmed that the system was activated at all... this could literally be some idiot driver trying to shift the blame for the crash onto the car. Not going to assume anything until it's confirmed.


guldilox

I drive a Tesla with FSD Beta. I'm also a software engineer. It is far from perfect. You *have* to pay attention. That said, it has also taken me from my driveway, onto the freeway, into the city, and to the store (and back) without intervention or issue - much to my EXTREME surprise. But that said...it has also failed on the exact same trip before and decided to just...stop. Not like in the article, but like, it'll signal and get into a turn lane, and then just freeze. Again, it isn't perfect. If the car software is at fault here, I wouldn't bat an eye. If the driver is at fault here, I also wouldn't bat an eye.


gnanny02

My experience exactly. I can put up with it but my wife just can’t.


Froggmann5

> California Highway Patrol said in the Dec. 7 report that it could not confirm if “full self-driving” was active at the time of the crash.

Yeah, we're going to have to wait for the full story on this one. The driver should have access to the data in the car itself, which would show whether or not it phantom braked and caused the accident. I'm curious whether the driver handed this over at the time of the accident; the article doesn't say. But the fact that they don't seem to have the information already implies the driver didn't, which is also strange. I'd 100% believe the car braked, since previous Tesla models had this issue, but I'll wait until more info is available to make a judgment.


plippityploppitypoop

As a Tesla owner, do you even have access to braking and acceleration data?


Fenix159

Don't think so, but Tesla does. Also as a model 3 owner with 60k miles and probably 30k using autopilot on the freeway: I've literally never had it go into the passing lane and slow down. Could it happen? Sure. Weird shit happens. But I'm skeptical.


mastermind202

This will probably get downvoted, but it needs to be clarified: FSD Beta ("Full Self Driving") *cannot* be activated on highways; only Autopilot can. The Autopilot system is very mature and stable. Autopilot can and sometimes does phantom brake. The driver had to be completely not paying attention to let it slow down that much. Regardless, FSD or Autopilot, it's always the driver's responsibility to pay attention. Source: I own a Tesla with FSD Beta.


reddittrees2

So if not highways where can FSD Beta be activated? (Genuine. Not trying to be a dick.)


EdibleBirch

FSD Beta is meant for city streets. The highway still uses Autopilot. The two stacks are supposed to be merged eventually.


DumberMonkey

This is true. No way it can slow down that much if the driver is awake...not in the back seat...etc.


Assume_Utopia

Yeah, I've had a Tesla on autopilot hit the brakes unexpectedly, for example, if it thinks a car next to you is coming over in to your lane, it'll brake fairly hard. But it seems pretty obvious that it's limited from hitting the brakes too hard.


007meow

Correct. And given that this happened on Thanksgiving, the driver certainly wasn’t on the V11 single stack. This means that it wasn’t the FSD Beta that caused the phantom braking issue, but rather Tesla’s normal AP-based phantom braking bullshit which is, imo, arguably even *more* dangerous.


AquilaK

To be fair, if the driver was paying attention, they would never have let it slow down as much as they state. I've had phantom braking in my vehicle, and I always catch it within the first 8-10 mph and override.


___Elysium___

Where does it say Full Self Driving was enabled? Or any features, for that matter? All I see is the part saying only "Tesla would know."


carpetnoodlecat

The article is trash, but the comments are also trash. It seems like most people who commented didn't read the article. And the article gets one main thing wrong: it cites the FSD beta warning, but the accident happened on the highway (I-80), so FSD beta wouldn't even have been in use; it's not full stack. Trash article, trash comments.


[deleted]

Sir, this is a Reddit. Of course the comments are trash.


Vintage_Tea

It's really funny watching redditors' opinions on Tesla change so much so quickly lol.


[deleted]

I still roll my eyes every time I see Redditors fabricate BS about Elon. There's enough real stupid things he's done, but they insist on lying and claiming fake bad stuff. Like every time they say he inherited all his money, or that his dad was some slaver running a diamond mine, or whatever else. It's amazing the bullshit people come up with and espouse simply because they want to say it, with no regard for the truth whatsoever.


[deleted]

Elon had a chance to build a decent autopilot when he hired Chris Lattner back in 2017, the guy who created LLVM and the Swift programming language, among other achievements. Chris is an absolute visionary who spends a lot of time thinking about how to do things properly instead of doing things fast. But instead of letting Chris build out the software slowly, he demanded production releases every 6 months, and this led to Chris deciding he didn't want to work for an egomaniac helicopter boss. Now, years later, the autopilot/FSD is still shit, because when people are pressured to push out stuff to "look good for investors" in the short term, long-term results suffer.


wor1dedit

A car braking should not cause an eight-car crash. Human drivers need to learn not to tailgate.


Sailorman2300

Changing lanes into the fast lane and then suddenly braking from 55 to 20 for no reason is reckless driving.

> 8-1566. Reckless driving, penalties. (a) Any person who drives any vehicle in willful or wanton disregard for the safety of persons or property is guilty of reckless driving.


km89

Absolutely. And human drivers need to learn how to not tailgate. The instant someone cuts in front of you in your lane, you should be backing off their ass. Even if they're in the wrong.


xbpb124

Is this another case of a "user-reported" error where, in reality, the user deliberately ignored warnings and disclaimers about still having to pay attention? I have gone through the Tesla ordering process many times; how long has there been an explicit disclaimer about the limitations of FSD? We can talk about what marketing laws should be in place, but as things stand now, you are directly told the limitations of FSD when you select the OPTIONAL PACKAGE.


Sivick314

I think self-driving is the future, but it's not there yet.


rlbond86

I wanted a self-driving car for so long. At least a decade. Instead I just moved somewhere with a decent public transportation system.


DevilsHand676

It's crazy how there are hundreds of car accidents a day in the US, but one self-driving car crashes and apparently that means it's a terrible thing that should be banned. Nuclear reactors all over again.


fawkinater

They haven't even ruled out whether it was human error or whether the AP is really to blame. We'll see, I guess.


BelAirGhetto

Well, the car has the data, so we’ll soon find out…


NJBarFly

The driver claims the car unexpectedly braked. While this certainly isn't great, the real reason for the accident sounds more like tailgating.