this post was submitted on 17 Feb 2026
906 points (99.5% liked)

Technology


Self-driving cars are often marketed as safer than human drivers, but new data suggests that may not always be the case.

Citing data from the National Highway Traffic Safety Administration (NHTSA), Electrek reports that Tesla disclosed five new crashes involving its robotaxi fleet in Austin. The new data raises concerns about how safe Tesla’s systems really are compared to the average driver.

The incidents included a collision with a fixed object at 17 miles per hour, a crash with a bus while the Tesla vehicle was stopped, a crash with a truck at four miles per hour, and two cases where Tesla vehicles backed into fixed objects at low speeds.

[–] dogslayeggs@lemmy.world 18 points 19 hours ago (3 children)

It's important to draw a distinction between what Tesla is trying to do and what Waymo is actually doing. Tesla has a crash rate roughly 4x higher than human drivers; Waymo's is lower.

[–] merc@sh.itjust.works 13 points 14 hours ago (2 children)

Not just lower, a tiny fraction of the human rate of accidents:

https://waymo.com/safety/impact/

Also, AFAIK this includes cases where the Waymo car isn't even slightly at fault. Like, there have been 2 deaths involving a Waymo car. In one case a motorcyclist hit the car from behind, flipped over it, then was hit by another car and killed. In the other case, ironically, the car actually at fault was a Tesla driven by a human who claims he experienced "sudden unintended acceleration". It was going 98 miles per hour in downtown SF and hit a bunch of cars stopped at a red light, then spun into oncoming traffic and killed a man and his dog who were in another car.

Whether or not self-driving cars are a good thing is up for debate. But, it must suck to work at Waymo and to be making safety a major focus, only to have Tesla ruin the market by making people associate self-driving cars with major safety issues.

[–] TooManyGames@sopuli.xyz 5 points 14 hours ago (1 children)

I immediately formed a conspiracy theory that Teslas automatically accelerate when they see Waymo cars

[–] merc@sh.itjust.works 4 points 13 hours ago (1 children)

And it's not out of aggression. It's just that their image recognition algorithms are so terrible that they match the Waymo car with all its sensors to a time-traveling DeLorean and try to hit 88 mph.... or something.

[–] TooManyGames@sopuli.xyz 2 points 6 hours ago

They crash for the memes. Sounds about right considering who's in charge.

[–] ThirdConsul@lemmy.zip 3 points 13 hours ago (2 children)

> Not just lower, a tiny fraction of the human rate of accidents:

https://www.iihs.org/research-areas/fatality-statistics/detail/state-by-state

Well, no. Let's talk fatality rate. According to the linked data, human drivers are at:

> 1.26 deaths per 100 million miles traveled

Versus Waymo at 2 deaths per 127 million miles :)
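To put those two figures on the same footing, here's a quick back-of-the-envelope conversion (the numbers are just the ones quoted above, and whether those deaths should count against Waymo at all is disputed below):

```python
# Rough comparison of the two fatality rates quoted in this thread.
# These are the figures cited above, not an authoritative dataset.
human_rate = 1.26            # deaths per 100 million miles (IIHS, all US drivers)
waymo_deaths = 2             # deaths in collisions that involved a Waymo vehicle
waymo_miles = 127_000_000    # Waymo driverless miles cited above

waymo_rate = waymo_deaths / (waymo_miles / 100_000_000)
print(f"Waymo-involved rate: {waymo_rate:.2f} deaths per 100M miles")  # ~1.57
print(f"Human driver rate:   {human_rate:.2f} deaths per 100M miles")  # 1.26
```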

[–] merc@sh.itjust.works 5 points 12 hours ago (1 children)

Well, Waymo's really at 0 deaths per 127 million miles.

The 2 deaths are deaths that happened near Waymo cars, in collisions involving a Waymo car. Not only did the Waymos not cause the accidents, they weren't even involved in the fatal part of either event. In one case a motorcyclist was hit by another car, and in the other one a Tesla crashed into a second car after it had hit the Waymo (and a bunch of other cars).

The IIHS number takes the total number of deaths in a year and divides it by the total distance driven in that year; it includes all vehicles and all deaths. If you wanted the denominator to be "total distance driven by brand X in the year", you wouldn't keep the numerator as "all deaths", because that wouldn't make sense, and "all deaths that happened in a collision involving brand X" would be of limited usefulness. If you're after the safety of the passenger compartment, you'd want "all deaths of occupants/drivers of a brand X vehicle", and if you're after the car's safety for all road users, you'd want something like "all deaths where the driver of a brand X vehicle was determined to be at fault".
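To make the numerator choices concrete, here's a rough sketch using the counts as they're described in this thread (2 deaths in collisions a Waymo was involved in, 0 at-fault deaths, 0 occupant deaths; the variable names are mine, purely for illustration):

```python
# Illustrative only: the same ~127M miles with three different numerators,
# matching the rate definitions discussed above. Counts are the ones claimed
# in this thread, not official statistics.
miles = 127_000_000

deaths_involved_in_collision = 2   # any death in a crash the vehicle was part of
deaths_at_fault = 0                # deaths where the vehicle was determined at fault
deaths_of_occupants = 0            # deaths of the vehicle's own occupants

def per_100m_miles(deaths: int) -> float:
    return deaths / (miles / 100_000_000)

print("involved-in-collision:", per_100m_miles(deaths_involved_in_collision))  # ~1.57
print("at-fault:             ", per_100m_miles(deaths_at_fault))               # 0.0
print("occupant:             ", per_100m_miles(deaths_of_occupants))           # 0.0
```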

The IIHS does have statistics for driver death rates by make and model, but they use "per million registered vehicle years", so you can't directly compare with Waymo:

https://www.iihs.org/ratings/driver-death-rates-by-make-and-model

Also, in a Waymo it would never be the driver who died, it would be other vehicle occupants, so I don't know if that data is tracked for other vehicle models.

[–] hector@lemmy.today 0 points 3 hours ago

I seem to recall a homeless woman who got killed like right away when they released these monstrosities on the road, because why pay people to do jobs when machines can do them for you? I'm sure that will work out for everyone with investment income.

[–] 73ms@sopuli.xyz 3 points 13 hours ago (2 children)

When there are two deaths total, it's pretty obvious that there just isn't enough data yet to judge the fatal accident rate. Also, FWIW, as was said, neither of those was in any way the Waymo's fault.

[–] hector@lemmy.today 1 points 3 hours ago (1 children)

That's the problem: you can't trust these companies not to use corrupt influence to blame others for their mistakes. It's you versus a billion-dollar company with everything at stake, one that owns (senior-tiered leasing rights) your politicians locally, at the state level, and federally, and by extension the regulators up and down the line.

Do you not know how things work in this country? Given their outsized power, we don't want them involved in determining blame for accidents. Dash cam footage or no, we've seen that irrefutable evidence is no guarantee of justice, even if it's handed to you.

[–] 73ms@sopuli.xyz 1 points 2 hours ago (1 children)

Well, Waymo isn't the one assigning blame; that's a third-party assessment based on the information released about those accidents. The strongest point remains that fatal accidents are rare enough that there simply isn't enough data to claim any statistical significance for these events. The overall accident rate, for which there is sufficient data, remains significantly lower than the US average.
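As a rough illustration of the "not enough data" point, here's a sketch of an exact Poisson confidence interval for a rate estimated from only two events (treating fatal crashes as roughly Poisson and using scipy are my own assumptions, not anything from the thread):

```python
# Sketch: exact (Garwood) 95% CI for a Poisson rate with 2 observed events
# over ~127 million miles. Not a rigorous analysis, just a sense of scale.
from scipy.stats import chi2

events = 2
exposure = 127_000_000 / 100_000_000   # exposure in units of 100M miles

lower = chi2.ppf(0.025, 2 * events) / (2 * exposure)
upper = chi2.ppf(0.975, 2 * (events + 1)) / (2 * exposure)
print(f"95% CI: {lower:.2f} to {upper:.2f} deaths per 100M miles")
# Roughly 0.19 to 5.69 -- wide enough to include the 1.26 human average,
# which is why two events can't support a meaningful rate comparison.
```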

[–] hector@lemmy.today 1 points 2 hours ago

They have influence with the police, the regulators, and the insurance companies to avoid blame.

They operate on limited routes, at lower speeds, so of course they won't have a higher fatality rate. If you compared human drivers on those same stretches of road, it would also be zero. You can't compare human drivers on expressways during rush hour with a Waymo's trip between the airport and the hotels on a mapped-out route that doesn't go on the expressway.

[–] ThirdConsul@lemmy.zip 1 points 6 hours ago

The "fault" means nothing to a "deaths per mile" statistic, though?

[–] ThirdConsul@lemmy.zip 7 points 14 hours ago* (last edited 13 hours ago) (2 children)

Isn't Waymo's rate better because they are very particular about where they operate? When they are asked to operate in slightly less than perfect conditions, it immediately goes downhill: https://www.researchgate.net/publication/385936888_Identifying_Research_Gaps_through_Self-Driving_Car_Data_Analysis (page 7, Uncertainty)

Edit: googled it a bit, and apparently:

> Waymo vehicles primarily drive on urban streets with a speed limit of 35 miles per hour or less

Teslas do not.

[–] 73ms@sopuli.xyz 3 points 13 hours ago* (last edited 12 hours ago)

We are talking about Tesla robotaxis here, and they too only drive in very limited geofenced areas. While Waymo now goes on freeways (only in the Bay Area, and the option is offered to only some passengers), Tesla robotaxis currently don't go on any freeways at all. In fact, Tesla only has a handful of cars doing any unsupervised driving, and those are geofenced to a small area around a single stretch of road in Austin.

Tesla robotaxis also currently cease operations in Austin when it rains, so Waymo is definitely the more flexible one when it comes to less-than-perfect conditions.

[–] dogslayeggs@lemmy.world 1 points 13 hours ago

That is certainly true, but they are also better than humans in those specific areas. Tesla is (shockingly) stupid about where it chooses to operate. Waymo understands its limitations and chooses to operate only where it can be better than humans. It is expanding its range, though, including driving on the 405 freeway in Los Angeles... which usually moves at less than 35 mph anyway!!

[–] jabjoe@feddit.uk 2 points 16 hours ago (2 children)

Because Waymo uses more humans?

[–] Ilovethebomb@sh.itjust.works 8 points 15 hours ago (1 children)

Because Waymo doesn't try and do FSD with only cameras.

[–] jabjoe@feddit.uk 1 points 13 hours ago (1 children)

Are they doing FSD if there are human overseers? Surely that is not "fully".

So human overseers and not only cameras.

[–] 73ms@sopuli.xyz 2 points 12 hours ago (1 children)

All these services have the ability for a human to step in and solve issues if the self-driving system disengages. That doesn't mean they're not driving on their own most of the time, including full journeys. The remote assistance team is just ready to jump in if something unusual causes the Waymo Driver to disengage, and even then they don't usually control the car directly; they just give the driver instructions on how to resolve the situation.

[–] jabjoe@feddit.uk 1 points 3 hours ago

I think Waymo are right to do what they do. I just wouldn't call it "fully". If Tesla are doing the same and still doing badly, or should be doing the same and aren't, it still makes them worse than Waymo either way.

[–] dogslayeggs@lemmy.world 1 points 13 hours ago (1 children)
[–] jabjoe@feddit.uk 1 points 3 hours ago

Searching for "Waymo human overseers" brings up results about it. Doing the same for Tesla doesn't find anything. Also, I've not heard about it for Tesla like I have with Waymo. I don't think Waymo are wrong to do this at all. It not making a decision when unsure is safer.