this post was submitted on 21 Aug 2023
577 points (94.9% liked)

Technology
Tesla knew Autopilot caused death, but didn't fix it: Software's alleged inability to handle cross traffic central to court battle after two road deaths

you are viewing a single comment's thread
[–] [email protected] 1 points 1 year ago (2 children)

Well, if it's just the lane-assistance Autopilot that is causing this kind of crash, I'd agree it's likely user error. The reason I say "if" is that I don't trust journalists to know, or report on, the difference.

I am still concerned that the FSD beta is "out there", though. I do not trust normal users to understand what "beta" means, and of course no one is going to read the agreement before clicking agree. They just want to see their car drive itself.

[–] [email protected] 2 points 1 year ago

If it were about the FSD implementation, things would be very different. I'm pretty sure FSD is designed to handle cross traffic, though.

I do not trust normal users to understand what beta means

Yeah, Google kinda destroyed that word in the public consciousness when they ran their search with a beta flag for more than a decade while growing into one of the biggest companies on Earth with it.

When I first heard about it, I was very surprised that the US even allows vehicles with beta self-driving software on public roads. That's like testing a new fire truck by randomly setting buildings on fire in a city and then trying to put them out with the truck.

[–] [email protected] 0 points 1 year ago* (last edited 1 year ago)

Yeah, I don't trust a machine that has been trained for millions of hours, has simulated every possible traffic scenario tens of millions of times, has millisecond reaction times, and sees the world in a full 360 degrees. A system that never drives drunk, distracted, or fatigued. You know who's really good at driving, though? Humans. Perfect track record, those humans.