Electrek: "Tesla FSD Beta tried to kill me last night"
skyspydude1 @lemmy.world · 1 Post · 250 Comments · Joined 2 yr. ago
Is this how they make Ultraporn?
Oh really? Is that why, for years now, the front page for Autopilot on Tesla's site featured the infamous "Paint It Black" demo, where in the first 10 seconds it says the driver is only there for legal reasons and the car is driving itself? What do you think is going to stick in the mind of a potential buyer: that video of the car "driving itself" right on the Tesla website, or the generic 5-line page you'll see in basically every car with a satnav these days saying, "Please operate the car safely"?
Regardless of how much people like you love to get into the technicalities and differences between Autopilot and Full Self-Driving, chime in with "ACKSHUALLY", and trot out the same tired responses about how autopilot works on aircraft or what the documentation says, none of it changes how Tesla has shaped the public perception of the system and how people are going to attempt to use it.
Stop defending their shitty practices. Literally everyone else has figured out how to prevent people from abusing these systems, Tesla won't even bother, because people like you will step in and defend it every time for some fucking reason, and as a bonus it saves them money.
Imagine content creation that was done purely for the fun of creating content and sharing info, albeit with literally zero hope of receiving any money. Better in some ways, worse in others.
Oh, you're looking for a part number for something relatively common? No can do. However, I'm sure you'd be interested in pages of Chinese phone numbers that carry 3 digits in a similar order to your search.
The main issue is that they market it like a fully autonomous system, and made it just good enough to lull people into a false sense of security that they don't need to pay attention, while also having no way to verify that they are, unlike other systems from BMW, GM, or Ford.
Other systems have their capabilities intentionally hampered to ensure that you're not going to feel it's okay to hop in the passenger seat and let your dog drive.
They are hands-on driver assists, and so they are generally calibrated in a way that they'll guide you in the lane, but will drift/sway just a bit if you completely take your hands off the wheel, which is intended to keep you, y'know, actually driving.
Tesla didn't want to do that. They wanted to be the "best" system, with zero safety considerations at any step other than what was basically forced upon them by the supplier so they wouldn't completely back out. The company is so insanely reckless that I feel shame for ever wanting to work for them at one point, until I saw and heard many stories about just how bad they were.
I got to experience it firsthand working at a supplier, where production numbers were prioritized over key safety equipment. While everyone else was willing to suck it up for a couple of bad quarters, Tesla pushed on anyway, and I'm sure it indirectly resulted in further injuries and potentially deaths.
Sure, and we do hundreds of thousands of miles of simulation on each SW build before it'll be okayed for even driving on site, then it has to pass additional tests before it's allowed on public roads with a test driver.
As someone who works on AV SW for a living, it's really not a big deal, assuming you've got certain limits already in place.
However, unlike Tesla, we're not just handing this out to random people who clicked "I agree" on the screen. We've got tons of dedicated training and have to demonstrate we can react to stuff and take over under worst-case conditions, and take incidents like this really seriously.
It's funny that he says "Oh, this is why it's not released to the public," because I did some driving with a Model 3 on the latest version of FSD within the last few weeks, and in a 1-hour drive had plenty of "oh shit" moments just like this. So yeah, they'll totally release garbage like this to the public, no doubt about it.
They're basically the only one. Even Mobileye, who is objectively the best in the ADAS/AV space for computer vision, uses other sensors in their fleet. They have demonstrated camera-only autonomy, but realize it's not worth risking people's lives to save $1000 in sensors.
They removed the radars, and they've never used LiDAR because Elon considered it "a fool's errand", which translates to "too expensive to put in my penny-pinched economy cars". Also worth noting that they took the radars out purely to keep production and the stock price up, despite knowing well in advance that performance would take a massive hit without them. They just don't give a shit, and a few pedestrian deaths are 100% worth it to Elon with all the money he made from the insane spike in the stock's value during COVID. They were the one automaker who maintained production through the shortages because they just swapped in whatever random parts they could find, nothing properly tested or validated, rather than suck it up for a bad quarter or two like everyone else.
And yet FSD is still worse than the one time I got in the car with an exchange student who had never driven a car before coming to the US and thought her learners permit was the same as a driver's license.