The headline makes it sound like Tesla is trialing a new 'fatality' feature for its Autopilot.
Well, someone has to invent the suicide booths featured in Futurama. Might as well be him.
The reality is that they didn't trial it at all; they just sent it straight to production. In this case, it successfully achieved a fatality.
With how Elon has been acting, this is a distinct possibility.
It would probably scream "Xterminate!" before running you over.
Autopilot is not safe.
https://www.washingtonpost.com/technology/2023/06/10/tesla-autopilot-crashes-elon-musk/
Isn't it a glorified cruise control/lane guidance system, rather than an actual automated driving system? So it would be about as safe as those are, rather than being something you can just leave alone to handle its own business, like a robotic vacuum cleaner.
The main issue is that they market it as a fully autonomous system, and made it just good enough to lull people into a false sense of security that they don't need to pay attention, while also having no way to verify that they are, unlike other systems from BMW, GM, or Ford.
Other systems have their capabilities intentionally hampered to ensure that you're not going to feel it's okay to hop in the passenger seat and let your dog drive.
They are hands-on driver assists, and so they are generally calibrated in a way that they'll guide you in the lane, but will drift/sway just a bit if you completely take your hands off the wheel, which is intended to keep you, y'know, actually driving.
Tesla didn't want to do that. They wanted to be the "best" system, with zero safety considerations at any step other than what was basically forced upon them by the supplier so the supplier wouldn't completely back out. The company is so insanely reckless that I feel shame for ever having wanted to work for them, at least until I saw and heard many stories about just how bad they were.
I got to experience it firsthand, too, working at a supplier, where production numbers were prioritized over key safety equipment. While everyone else was willing to suck it up for a couple of bad quarters, Tesla pushed on, and I'm sure it has indirectly resulted in further injuries and potentially deaths.
The second trial, set for early October in a Florida state court, arose out of a 2019 crash north of Miami where owner Stephen Banner’s Model 3 drove under the trailer of an 18-wheeler big rig truck that had pulled into the road, shearing off the Tesla's roof and killing Banner. Autopilot failed to brake, steer or do anything to avoid the collision, according to the lawsuit filed by Banner's wife.
Is this the guy who was literally paying no attention to the road at all and was watching a movie whilst the car was in motion?
I legit can't find information on it now as every result I can find online is word for word identical to that small snippet. Such is modern journalism.
I know people like to get a hard-on over the word "autopilot", but even real pilots with real autopilot still need to keep an eye on things when the system is engaged. This is why we have two humans in the cockpit on those big commercial jets.
The way Musk marketed it was as a "self-driving" feature, not a driving assist. Yes, with all current smart assists you need to carefully watch what they're doing, but that's not how it was made out to be. Because of that, I'd still say Tesla is responsible.
There are also two pilots, because they know people are people. And then don't brand it "self driving" and "full self driving".
It seems like an obvious flaw that's pretty simple to explain. The car is trained to interpret collision information within a set height range, so the opening between the wheels of a truck's trailer could be treated by it as free space. It's a rare situation, but if it's confirmed and reproducible, it at least raises the question of how many other glitches drivers will discover by surprise.
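To illustrate the guess above, here's a toy sketch (purely hypothetical, not Tesla's actual perception code) of how a filter that only treats returns inside a fixed height band as obstacles could classify the gap under a raised trailer bed as drivable free space:

```python
# Hypothetical illustration of a height-windowed obstacle filter.
# It only flags detections whose height falls inside a fixed band
# above the road; anything above the band is ignored as "overhead".

def is_drivable(points, min_h=0.2, max_h=1.2):
    """points: list of (distance_m, height_m) detections.
    Returns True if no point falls inside the obstacle height band."""
    return not any(min_h <= h <= max_h for _, h in points)

# A trailer bed suspended ~1.3 m above the road: every return sits
# above max_h, so the naive filter reports the space ahead as clear.
trailer = [(10.0, 1.3), (10.0, 1.6), (10.0, 1.9)]
print(is_drivable(trailer))  # True -> gap under the trailer looks "free"

# A passenger car ahead produces returns inside the band.
car = [(10.0, 0.5), (10.0, 0.9)]
print(is_drivable(car))  # False -> obstacle detected
```

The band boundaries and the two-value point format are invented for the sketch; the point is just that any perception rule keyed to a fixed height window will, by construction, miss objects that sit entirely above it.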
This is the best summary I could come up with:
SAN FRANCISCO, Aug 28 (Reuters) - Tesla Inc (TSLA.O) is set to defend itself for the first time at trial against allegations that failure of its Autopilot driver assistant feature led to death, in what will likely be a major test of Chief Executive Elon Musk's assertions about the technology.
Self-driving capability is central to Tesla’s financial future, according to Musk, whose own reputation as an engineering leader is being challenged with allegations by plaintiffs in one of two lawsuits that he personally leads the group behind technology that failed.
The first, scheduled for mid-September in a California state court, is a civil lawsuit containing allegations that the Autopilot system caused owner Micah Lee’s Model 3 to suddenly veer off a highway east of Los Angeles at 65 miles per hour, strike a palm tree and burst into flames, all in the span of seconds.
Banner’s attorneys, for instance, argue in a pretrial court filing that internal emails show Musk is the Autopilot team's "de facto leader".
Tesla won a bellwether trial in Los Angeles in April with a strategy of saying that it tells drivers that its technology requires human monitoring, despite the "Autopilot" and "Full Self-Driving" names.
In one deposition, former executive Christopher Moore testified there are limitations to Autopilot, saying it "is not designed to detect every possible hazard or every possible obstacle or vehicle that could be on the road," according to a transcript reviewed by Reuters.
The original article contains 986 words, the summary contains 241 words. Saved 76%. I'm a bot and I'm open source!
I can't understand how anyone is even able to let the car do something on its own. I drive an old Dacia Logan and a Renault Scénic, but at work we have a Škoda Karoq, and I can't even fully trust its beeping backing sensors or automatic handbrake. I can't imagine the car steering, accelerating, or braking without me telling it to.