E. Lon Musk. Supah. Geenius.
Not The Onion
Welcome
We're not The Onion! Not affiliated with them in any way! Not operated by them in any way! All the news here is real!
The Rules
Posts must be:
- Links to news stories from...
- ...credible sources, with...
- ...their original headlines, that...
- ...would make people who see the headline think, “That has got to be a story from The Onion, America’s Finest News Source.”
Please also avoid duplicates.
Comments and post content must abide by the server rules for Lemmy.world and generally abstain from trollish, bigoted, or otherwise disruptive behavior that makes this community less fun for everyone.
And that’s basically it!
Wank E. Cuckyote
MEEP MEEP
I hope some of you actually skimmed the article and got to the "disengaging" part.
As Electrek points out, Autopilot has a well-documented tendency to disengage right before a crash. Regulators have previously found that the advanced driver assistance software shuts off a fraction of a second before making impact.
It's a highly questionable approach that has raised concerns over Tesla trying to evade guilt by automatically turning off any possibly incriminating driver assistance features before a crash.
That's so wrong holy shit
Don't get me wrong, Autopilot turning itself off right before a crash is sus, and I wouldn't put it past Tesla to do something like that (I mean, come on, why don't they use lidar?). But maybe it's so the car doesn't try to power the wheels or something after impact, which could potentially make the crash worse.
On the other hand, they're POS cars, and the autopilot probably just shuts off because of the poor assembly, standards, and design that come from cutting corners.
If it can actually sense that a crash is imminent, why wouldn't it be programmed to slam the brakes instead of just turning off?
Do they have a problem with false positives?
I've been wondering this for years now. Do we need intelligence in crashes, or do we just need vehicles to stop? I think you're right: it must have been slamming the brakes on at unexpected times, which I'm sure is unnerving when you're driving.
So they had an issue with the car slamming on the brakes at unexpected times, caused by misidentifying cracks in the road or glare or weird lighting or w/e. The solution was to make the cameras ignore anything they can't recognize at high speeds. This resulted in Teslas plowing into the back of firetrucks.
As the article mentioned, other self-driving cars solved that with lidar, which Elon himself is against because he says AI will just get so good and 2D cameras are cheaper.
This is from 6 years ago. I haven't heard of the issue more recently.
https://www.washingtonpost.com/technology/interactive/2023/tesla-autopilot-crash-analysis/
The Tesla did not consistently detect that the thing in front of it was a truck, so it didn't brake. Also, this describes a lot of similar cases.
I remember a youtuber doing similar tests, where they'd try to run over a fake pedestrian crossing or standing in the road at low speed, and then high speed. It would often stop at low speed, but very rarely stopped or swerved at high speed.
Wouldn't it make more sense for Autopilot to brake and try to stop the car instead of just turning off and letting the car roll? If it's certain enough that there will be an accident, just applying the brakes until the user overrides would make much more sense.
Normal cars do whatever is in their power to cease movement while facing upright. In a wreck, the safest state for a car is to cease moving.
Rober seems to think so, since he says in the video that it's likely disengaging because the parking sensors detect that it's parked because of the object in front, and it shuts off the cruise control.
I see your point, and it makes sense, but I would be very surprised if Tesla did this. I think the best option would be to turn off the features once an impact is detected. It shutting off beforehand feels like a cheap ploy to avoid guilt.
… It shutting off beforehand feels like a cheap ploy to avoid guilt
that's exactly what it is.
It always is that way: fuck the consumer, it's all about making a buck.
It's a highly questionable approach that has raised concerns over Tesla trying to evade guilt by automatically turning off any possibly incriminating driver assistance features before a crash.
So, who's the YouTuber that's gonna test this out, since Elmo has pushed his way into the government to quash any investigation into it?
It basically already happened in the Mark Rober video: it turns off by itself less than a second before impact.
My $500 robot vacuum has lidar, meanwhile these $50k pieces of shit don't 😂
Holy shit, I knew I'd heard this word before. My Chinese robot vacuum cleaner has more technology than a Tesla hahahahaha
To be fair, if you were to construct a wall and paint it exactly like the road, people would run into it as well. That being said, Tesla shouldn't rely on cameras alone.
I'd take that bet. I imagine at least some drivers would notice something sus (due to depth perception, which should be striking as you get close, or the lack of ANY movement, or some kind of reflection) and either
- slow down
- use a trick, e.g. flicking lights or driving a bit to the sides and back, to try to see what's off
or probably both. But anyway, as others have already said, it's being compared to other autopilot systems, not human drivers.
To be fair, if you were to construct a wall and paint it exactly like the road, people would run into it as well.
This isn't being fair. It's being compared to the other, better autopilot systems that use both lidar and radar in addition to visible-light and infrared cameras to sense the world around them.
Teslas only use visible light and infrared. Neither lidar nor radar systems would have been deceived.
The video does bring up human ability too with the fog test ("Optically, with my own eyes, I can no longer see there's a kid through this fog. The lidar has no issue.") But, as they show, this wall is extremely obvious to the driver.
The Tesla would lose its shit if it saw this.
They already have enough trouble with trucks carrying traffic lights, or with speed limit signs on them.
and fire trucks. and little kids. and, uh, lots of things really.