
A 2025 Tesla Model 3 in Full Self-Driving mode drives off a rural road, clips a tree, loses a tire, flips over, and comes to rest on its roof. Luckily, the driver is alive and well and able to post about it on social media.

I just don't see how this technology could possibly be ready to power an autonomous taxi service by the end of next week.

[–] Buffalox@lemmy.world 1 points 8 months ago (3 children)

The car made a fatal decision faster than any human could possibly correct it. Tesla’s idea that drivers can “supervise” these systems is, at this point, nothing more than a legal loophole.

What I don't get is how years of this false advertising haven't bankrupted Tesla already.

[–] echodot@feddit.uk 1 points 8 months ago (1 children)

Because the US is an insane country where you can straight up break the law, and as long as you're rich enough you don't even get a slap on the wrist. If some small startup had done the same thing, it would have been shut down.

What I don't get is why Teslas aren't banned all over the world for being so fundamentally unsafe.

[–] Buffalox@lemmy.world 1 points 8 months ago

What I don't get is why Teslas aren't banned all over the world for being so fundamentally unsafe.

I've been arguing this point for the past year: there are obvious safety problems with Teslas even without considering FSD.
Like the turn-signal buttons on the steering wheel, manual door releases that are hard to find in an emergency, and common operations buried in on-screen menus instead of on directly accessible buttons. With Autopilot the cars also tend to brake for no reason, even on the autobahn with a clear road ahead, which can create dangerous situations.

[–] NikkiDimes@lemmy.world 0 points 8 months ago (1 children)

Well, because 99% of the time, it's fairly decent. That 1%'ll getchya tho.

[–] ayyy@sh.itjust.works 1 points 8 months ago (2 children)

To put your number into perspective: if it failed just once every hundred miles, it would kill you multiple times a week at the average commute distance.
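
To make that arithmetic concrete, here's a minimal sketch in Python. The ~40-mile round-trip commute is an assumed figure for illustration, not something from the thread:

```python
# Back-of-the-envelope check of the "1% failure rate" claim above.
failures_per_mile = 1 / 100      # reading "1%" as one failure per 100 miles
commute_miles_per_day = 40       # assumed average round-trip commute (illustrative)
workdays_per_week = 5

failures_per_week = failures_per_mile * commute_miles_per_day * workdays_per_week
print(f"Expected failures per week: {failures_per_week:.1f}")  # -> 2.0
```

Even a failure rate that sounds small comes out to roughly two failures per week under these assumptions, which is the point being made.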

[–] KayLeadfoot@fedia.io 1 points 8 months ago (1 children)

Someone who doesn't understand math downvoted you. This is the right framework for understanding autonomy: the failure rate needs to be astonishingly low for the product to have any net-positive value. So far, Tesla has not demonstrated that value in a credible way.

[–] bluewing@lemm.ee 0 points 8 months ago* (last edited 8 months ago) (1 children)

You are trying to judge the self-driving feature in a vacuum, and you can't do that. You need to compare it to the alternatives, and for automotive travel the alternative to FSD is to keep having everyone drive manually. It turns out most clowns doing that are statistically worse at it than even FSD, as bad as it is. So FSD doesn't need to be perfect; it just needs to be a bit better than what the average driver can do manually. And the last figures I saw showed FSD was statistically that "bit better" than you.

FSD isn't perfect. No such system will ever be perfect. But the goal isn't perfect; it just needs to be better than you.

[–] echodot@feddit.uk 1 points 8 months ago (2 children)

FSD isn't perfect. No such system will ever be perfect. But the goal isn't perfect; it just needs to be better than you.

Yeah, people keep bringing that up as a counter-argument, but I'm pretty certain humans don't swerve off a perfectly straight road into a tree all that often.

So unless you have numbers suggesting that humans are less safe than FSD, you're being equally obtuse.

[–] bluewing@lemm.ee 1 points 8 months ago

A simple Google search (which YOU could have done yourself) shows it's about 1 accident per 1.5 million miles driven with FSD vs. 1 per 700,000 miles driven for manually driven cars. I'm no Tesla stan (I think they are overpriced and deliberately aimed at rich people only), but that's an improvement, a noticeable improvement.

And as an old retired medic who worked his share of car accidents over nearly 20 years: yes, humans do swerve off of perfectly straight roads and hit trees and anything else in the way, and they do so at a higher rate.
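
Taking those two figures at face value, here's a minimal sketch in Python of what the comparison implies. Whether the figures are actually comparable is an assumption; FSD miles skew toward easier highway driving, and the FSD number comes from Tesla's own reporting:

```python
# Comparing the two accident rates quoted above.
fsd_miles_per_accident = 1_500_000    # quoted FSD figure
human_miles_per_accident = 700_000    # quoted figure for manually driven cars

ratio = fsd_miles_per_accident / human_miles_per_accident
print(f"FSD covers ~{ratio:.1f}x more miles per reported accident")  # -> ~2.1x
```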

[–] jamesjams@lemmy.world 1 points 8 months ago (1 children)

Humans do swerve off perfectly straight roads into trees; I know because I've done it!

[–] echodot@feddit.uk 2 points 8 months ago (2 children)

Can you confirm that to the best of your knowledge you are not a robot?

[–] jamesjams@lemmy.world 1 points 8 months ago

Bleep bleep bloop indeed human, affirmative, am human, ...thinking... Well to the best of my knowledge anyway

[–] KayLeadfoot@fedia.io 1 points 8 months ago

This little subthread looks like this.

[–] NikkiDimes@lemmy.world 1 points 8 months ago

...It absolutely fails miserably fairly often, though, and would likely crash that often without human intervention. Not to the extent seen here, where there wasn't even time for human intervention, but I frequently had to take over back when I used it (post v13).

[–] FreedomAdvocate@lemmy.net.au -1 points 8 months ago (1 children)

What false advertising? It’s called “Full Self-Driving (Supervised)”.

[–] Buffalox@lemmy.world 0 points 8 months ago* (last edited 8 months ago) (1 children)

For many years the "(Supervised)" was not included; AFAIK Tesla was forced to add it.
And in this case "supervised" isn't even enough, because the car made an abrupt, unexpected maneuver instead of asking the driver to take over in time to react.

[–] FreedomAdvocate@lemmy.net.au -1 points 8 months ago

The driver isn’t supposed to wait for the car to tell them to take over lol. The driver is supposed to take over when necessary.