this post was submitted on 14 Aug 2023
503 points (96.7% liked)

New Footage Shows Tesla On Autopilot Crashing Into Police Car After Alerting Driver 150 Times::Six officers who were injured in the crash are suing Tesla despite the fact that the driver was allegedly impaired

[–] [email protected] -1 points 1 year ago (2 children)

The video is very thorough and goes into the hazy video caused by the flashing lights being one of the issues.

[–] [email protected] 5 points 1 year ago (2 children)

That's not the main problem; it's more of an excuse. The main problem is explained in the video right before that:

Their radar is bad at recognizing immobile cars on the road. That applies to all stationary objects. Every obstacle in your lane!

Emergency vehicles just happen to be the most frequent kind of obstacle you'll encounter stopped on the road.

The fallback to the camera is a bad excuse anyway, because the radar is what should detect obstacles first: the camera will usually pick them up later (i.e., at closer distance) than the radar does.

The even better solution (Trigger warning: nerdy stuff incoming) is to fuse the results of all sensor types at an early stage in the processing software. European car makers have done this from the beginning, but Tesla is way behind in their engineering. Their sensors still work independently, each doing its own processing. So every shortcoming of one sensor produces a faulty detection result that has to be corrected later (read: seconds later, not milliseconds) by the other sensor types.
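To illustrate the difference, here's a toy sketch (purely illustrative, not Tesla's or anyone's actual pipeline, and the thresholds are made up): in late fusion each sensor makes its own call and fusion only reconciles verdicts, so one sensor's miss can veto the other; in early fusion the raw evidence is pooled before any decision, so two weak hints can add up.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Detection:
    distance_m: float   # range to the object
    confidence: float   # the sensor's own confidence, 0..1

def late_fusion(radar: Optional[Detection], camera: Optional[Detection]) -> bool:
    """Each sensor decides independently; fusion only reconciles verdicts.
    A weak reading from one sensor can veto the other."""
    radar_says_brake = radar is not None and radar.confidence > 0.8
    camera_says_brake = camera is not None and camera.confidence > 0.8
    return radar_says_brake and camera_says_brake  # requires agreement

def early_fusion(radar: Optional[Detection], camera: Optional[Detection]) -> bool:
    """Pool the raw evidence before deciding: two weak hints
    pointing at the same spot add up to a confident detection."""
    total = sum(d.confidence for d in (radar, camera) if d is not None)
    return total > 0.8

# A stationary car that each sensor sees only weakly:
radar_hit = Detection(distance_m=120.0, confidence=0.5)
camera_hit = Detection(distance_m=120.0, confidence=0.5)
print(late_fusion(radar_hit, camera_hit))   # False: neither sensor alone is sure
print(early_fusion(radar_hit, camera_hit))  # True: combined evidence suffices
```

Real fusion stacks combine point clouds and pixels probabilistically, not scalar confidences, but the failure mode is the same: decide per-sensor first and you throw away exactly the weak-but-corroborating evidence that matters here.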

[–] [email protected] 4 points 1 year ago

Their radar is bad at recognizing immobile cars on the road. This means all objects. All obstacles on your road!

Teslas don't use radar, just cameras. That's why Teslas crash at way higher rates than real self-driving cars like Waymo.

[–] [email protected] 2 points 1 year ago

Their radar is bad at recognizing immobile cars on the road. This means all objects. All obstacles on your road!

I feel like this is bad tech understanding in journalism (which is hardly new). There's no reason radar couldn't see stationary vehicles. In fact, quite specifically, they are NOT stationary relative to the radar transceiver in a moving car. Radar would see them no problem.

My actual suspicion here is that Tesla actively ignores stationary vehicles that are not directly in its path (it can tell they're stationary by adding its own speed to the relative speed the radar reports). On normal streets this makes some sense (at least for the non-driver's side). Do you pay attention to every car parked by the side of the road when driving? You're maybe looking for signs of movement, or lights on, etc., but you're not tracking them all, and neither will the autopilot. However, on a highway, more than one vehicle on the shoulder every now and then should make you wonder what else is ahead (and I'd argue even a single car on the shoulder is a risk worth watching). A long line of them should definitely make you slow down.
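The arithmetic behind that parenthetical is simple (illustrative numbers, not real sensor data): the radar's Doppler return gives relative speed, and adding the ego vehicle's own speed recovers the target's ground speed. The catch is that naive clutter filtering drops everything with a ground speed near zero:

```python
def absolute_speed(ego_speed_mps: float, doppler_relative_mps: float) -> float:
    """Ground speed of a radar target.

    doppler_relative_mps is the closing rate the radar measures:
    negative means the gap to the target is shrinking.
    """
    return ego_speed_mps + doppler_relative_mps

# Ego car at 30 m/s (~67 mph); a parked car ahead closes at -30 m/s.
parked = absolute_speed(30.0, -30.0)    # ground speed 0.0 -> stationary
lead_car = absolute_speed(30.0, -5.0)   # ground speed 25.0 -> moving traffic

# Naive clutter filtering rejects anything (near-)stationary on the ground:
# signs, bridges, guardrails -- and, unfortunately, a parked police car too.
is_clutter = abs(parked) < 0.5
print(parked, lead_car, is_clutter)
```

So the radar physically sees the stopped car just fine; the question is whether the tracking software classifies that return as an obstacle or discards it with the roadside clutter.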

I think human drivers do this, and I think an autopilot should be considering what kind of road it is on and whether it should treat the scenarios differently.

I also have another suspicion, but it's just a thought. If this Tesla really was using radar as well as cameras, haze or not, it should have seen that stationary vehicle sooner than it did. Since newer Teslas don't have radar, and coming from a software development background, I can see a logical (in corporate terms) reason to remove the radar code: they won't want to maintain it if they have no plans to return to radar. Think of it like this: after a few versions of augmenting the camera detection logic, it's unlikely to still work with the existing radar logic. Do they spend the time making the two work together for the older vehicles, or only ship the camera-based AI on newer software versions? I'd suspect the latter would be the business decision.

[–] [email protected] 2 points 1 year ago

The question here is: could you see there was a reason to stop the car significantly (more than 3 seconds) before the autopilot did? If we can recognize it through the haze, the autopilot must too.

Moreover, it now needs to be extra good at spotting vehicles in bad lighting conditions, because the other sensors have been removed on newer Teslas. It only has cameras to go on.