this post was submitted on 02 May 2025
1064 points (98.6% liked)

top 50 comments
[–] [email protected] 8 points 8 hours ago

The obvious answer is to autopilot into ICE and Trump Regime officials. Elon pays the fine, the world is rid of MAGATs, and one less Tesla on the road. D, D, D.

/s.

[–] [email protected] 40 points 20 hours ago (2 children)

I'd imagine you are always responsible for what you do when you're driving, even if a system like autopilot is helping you drive.

[–] [email protected] 5 points 11 hours ago

If you are in the driver's seat, you are responsible for anything the car does unless there was a provable mechanical failure.

[–] [email protected] 34 points 20 hours ago (1 children)

Especially because autopilot disengages right before the accident, so it's technically always your fault.

[–] [email protected] 4 points 16 hours ago

Yup gotta read the fine print

[–] [email protected] 54 points 22 hours ago (1 children)

Except the autopilot will modify its data to show it was turned off right at the moment it hits people...

[–] [email protected] 42 points 21 hours ago (2 children)

Nah, it just disengages a fraction of a second before impact so they can claim "it wasn't engaged at the moment of impact, so not our responsibility."

There were rumours about this for ages, but I honestly didn't fully buy it until I saw it in Mark Rober's vision vs lidar video and various other follow-ups to it.

[–] [email protected] 7 points 12 hours ago

It's not about responsibility, it's about marketing. At no point do they assume responsibility, as with any level 2 system. It would look bad if it was engaged, but you are 100% legally liable for what the car does when on autopilot (or the so-called "full self driving"). It's just a lane-keeping assistant.

If you trust your life (or the lives of others) to a lane-keeping assistant, you deserve to go to jail, be it Tesla, VW, or BYD.

[–] [email protected] 7 points 21 hours ago (2 children)
[–] [email protected] 12 points 20 hours ago* (last edited 20 hours ago) (2 children)

It turns off, but it's likely so the AEB system can kick in.

AP and AEB are separate things.

Also, all L2 crashes that involve an airbag deployment or fatality get reported if the system was on within something like 30 seconds beforehand, assuming the OEM has the data to report, which Tesla does.

Rules are changing to lessen when it needs to be reported, so things like fender benders aren't necessarily going to be reported for L2 systems in the near future, but something like this would still be, and always has been.

[–] [email protected] 1 points 8 hours ago (1 children)

What's AEB? Automatic Energetic Braking?

[–] [email protected] 2 points 7 hours ago

I'm guessing automatic emergency braking

[–] [email protected] 1 points 11 hours ago (1 children)

Ok, but if Tesla's using that report to get out of liability, we still have a damn problem

[–] [email protected] 1 points 2 hours ago* (last edited 2 hours ago)

If it's an L2 system the driver is always liable. The report just makes sure we know it's happening and can force changes if patterns are found. The NHTSA made Tesla improve their driver monitoring based on the data, since that was the main problem. The majority of accidents (almost all) involved drunk or distracted drivers.

If it's an L4 system Tesla is always liable; we'll see that in June in Austin, in theory, for the first time on public roads.

The report never changes liability; it just lets us know what the state of the vehicle was for the incident. Tesla can't say the system was off just because it disengaged one second before impact, because we'll know it was on prior to that. But that doesn't change liability.
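
To make that reporting rule concrete, here's a minimal Python sketch of how such a check could work. The field names, the 30-second window, and the overall shape are assumptions for illustration, not the actual NHTSA reporting schema:

```python
# Hypothetical sketch of the L2 crash-reporting rule described above.
# Field names and the 30-second window are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Crash:
    airbag_deployed: bool
    fatality: bool
    seconds_since_adas_disengaged: float | None  # None = still engaged at impact

def must_report(crash: Crash, window_s: float = 30.0) -> bool:
    """A crash is reportable if it was serious (airbag or fatality) and the
    driver-assist system was on at impact or within the window before it,
    so disengaging one second early doesn't hide it."""
    serious = crash.airbag_deployed or crash.fatality
    adas_recent = (crash.seconds_since_adas_disengaged is None
                   or crash.seconds_since_adas_disengaged <= window_s)
    return serious and adas_recent

# The "disengage right before impact" trick still gets reported:
assert must_report(Crash(airbag_deployed=True, fatality=False,
                         seconds_since_adas_disengaged=1.0))
```

The point of the window is exactly what's described above: a system that switches itself off a second before the crash still counts as "on" for reporting purposes.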

[–] [email protected] 280 points 1 day ago (3 children)

Ha, if only. Autopilot turns off right before a crash so that Tesla can claim it was off and blame it on the driver. Look it up.

[–] [email protected] 3 points 12 hours ago

The driver is always to blame, even if it was on. They turn it off for marketing claims.

PS: fuck elon

[–] [email protected] 96 points 1 day ago (2 children)

I didn't know this, but I'm not shocked, or even a little bit surprised.

[–] [email protected] 43 points 1 day ago (1 children)

Mark Rober did a video testing the autopilot of several cars, and he used his own Tesla. The car turned off the autopilot when he crashed through a styrofoam wall.

[–] [email protected] 38 points 1 day ago (1 children)

This is how they claim autopilot is safer than human drivers. In reality, Tesla has one of the highest fatality rates, but magically all of those happen when autopilot was "off".

[–] [email protected] 3 points 18 hours ago

It turns it off with the parking sensor, 2 ft before the accident.

[–] [email protected] 27 points 1 day ago* (last edited 1 day ago) (1 children)

Holy shit I did indeed look it up, and it's true. Dunno if it'll hold up but it's still shady as shit

[–] [email protected] 13 points 1 day ago (3 children)

Most states apply liability to whoever is in the driver's seat anyway. If you are operating the vehicle, even if you're not controlling it at that moment, you are expected to maintain safe operation.

That's why the Uber self-driving car that killed someone was considered the test driver's fault, leaving Uber mostly off the hook.

Not sure how it works for the robo taxis, though.

[–] [email protected] 6 points 1 day ago

Yeah that's gonna be tricky with those. I live in Vegas where they're already operating. No steering wheel at all.

[–] [email protected] 22 points 21 hours ago (1 children)

In my country it's always your fault. And I'm very glad.

[–] [email protected] 1 points 16 hours ago (1 children)

Here in the US, even if the driver is found responsible, it is often only a small fine ($500) and maybe a 30-day suspension for killing someone.

[–] [email protected] 3 points 8 hours ago (1 children)
[–] [email protected] 132 points 1 day ago

Autopilot will turn off a few milliseconds before impact either way

[–] [email protected] 38 points 1 day ago* (last edited 1 day ago) (21 children)

TL;DR: Take the train and be safe.

Rant: In the EU, you are 35x more likely to die in a car crash than in a train crash. The Union has created the so-called Vision Zero program, which is designed to reach zero road deaths by some arbitrarily chosen date in the future. And of course it talks about autonomously driving cars. You know, crazy idea, but what if, instead of betting it all on some hypothetical magic Jesus technology that may or may not exist by the arbitrarily chosen date, we focused on the real-world solution we already have? But well, the car industry investors would make less money, so I can answer that myself. :(

Edit: Also, Musk is a Nazi cunt who should die of cancer.

[–] [email protected] 2 points 12 hours ago

Well, there is no train station at my house. Or Aldi. Or my kids' Kindergarten. And I live in Germany, where public transport is excellent by global standards (memes about Deutsche Bahn aside).

Cars will be necessary for the foreseeable future. Let's make them as safe as possible while investing in public transport; the two are not mutually exclusive.

PS: fuck Elon.

[–] [email protected] 11 points 1 day ago (1 children)

Speaking as a German: There are fewer train-related deaths because the trains don’t drive.

[–] [email protected] 6 points 23 hours ago

We have Vision Zero in the US, too. They lowered speed limits in a couple of neighborhoods from 25 mph to 20, and all the LED road signs show annually aggregated deaths from car crashes; once the number is greater than zero, someone wrings their hands, says "Welp, we did what we could, guess people just like dying", and goes on vacation. (Source: me. I made up the spokesperson who gets scapegoated, but all the other stuff is observationally evident where I live.)

[–] [email protected] 91 points 1 day ago* (last edited 1 day ago) (7 children)

Unironically, this is a perfect example of why AI is being used to choose targets to murder in the Palestinian Genocide, in cases like DOGE attacking the functioning of the U.S. government, in US healthcare companies' claim denials, and in landlord software colluding to raise rents.

The economic function of AI is to abdicate responsibility for your actions so you can make a bit more money while hurting people, and until the public becomes crystal clear on that, we are in a wild amount of danger.

Just substitute for Elon the vague idea of a company that will become a legal and ethical scapegoat for brutal choices by individual humans.

[–] [email protected] 30 points 1 day ago (1 children)

Which is why we need laws about human responsibility for decisions made by AI (or software in general).

[–] [email protected] 3 points 20 hours ago

I did an internship at a bank way back, and my role involved a lot of processing of spreadsheets from different departments. I automated a heckton of that with Visual Basic, which my boss was okay with, but I was dismayed to learn that I wasn't saving anyone's time except my own, because after the internship was finished, all of the automation stuff would have to be deleted. The reason was a rule (I think a company policy rather than a law) that required any code to be in the custody of someone, for accountability purposes — "accountability" in this case meaning "if we take unmaintained code for granted, then we may find an entire department's workflow crippled at some point in the future, with no-one knowing how it's meant to work".

It's quite a different thing from what you're talking about, but in terms of the implementation, it doesn't seem too far off.

[–] [email protected] 3 points 21 hours ago

It reminds me of how firing squad executions apparently used to have only some of the guns loaded with live rounds, and the rest with blanks. This way, the executioners could do some moral gymnastics to convince themselves that they hadn't just killed a person.

[–] [email protected] 39 points 1 day ago (3 children)

Made by someone who's able to THINK like a Tesla owner.

Brake pedal? Unthinkable

[–] [email protected] 10 points 1 day ago (1 children)
[–] [email protected] 40 points 1 day ago (1 children)

At best Tesla pays a fine, not Elon.

[–] [email protected] 24 points 1 day ago

Don't worry, DOGE will just fire the investigators before that happens.

[–] [email protected] 8 points 23 hours ago (1 children)

Oh. A fine. How will Musk survive that financially?

[–] [email protected] 4 points 22 hours ago

He won't be fine...

[–] [email protected] 24 points 1 day ago (1 children)

Wow. That’s a staggeringly apt update

[–] [email protected] 39 points 1 day ago (3 children)