The obvious answer is to autopilot into ICE and Trump Regime officials. Elon pays the fine, the world is rid of MAGATs, and one less Tesla on the road. D, D, D.
/s.
I'd imagine you are always responsible for what you do when you're driving, even if a system like autopilot is helping you drive.
If you are in the driver's seat, you are responsible for anything the car does unless there was a provable mechanical failure.
Especially because autopilot disengages right before the accident, so it's technically always your fault.
Yup gotta read the fine print
Except the autopilot will record in its data that it was turned off right at the moment it hits people...
Nah, it just disengages a fraction of a second before impact so they can claim "it wasn't engaged at the moment of impact, so not our responsibility."
There were rumours about this for ages, but I honestly didn't fully buy it until I saw it in Mark Rober's vision vs lidar video and various other follow-ups to it.
It's not about responsibility, it's about marketing. At no point do they assume responsibility, like with any level 2 system. It would look bad if it was engaged, but you are 100% legally liable for what the car does when on autopilot (or the so-called "full self driving"). It's just a lane keeping assistant.
If you trust your life (or the life of others) to a lane keeping assistant you deserve to go to jail, be it Tesla, VW, or BYD.
For fucking real??
It turns off, but it's likely so the AEB system can kick in.
AP and AEB are separate things.
Also, all L2 crashes that involve an airbag deployment or a fatality get reported if the system was on within something like 30 seconds beforehand, assuming the OEM has the data to report, which Tesla does.
The rules are changing to narrow what needs to be reported, so things like fender benders aren't necessarily going to be reported for L2 systems in the near future, but something like this still would be, and always has been.
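Roughly how that reporting trigger works, as a minimal sketch: a crash is reportable if it was severe (airbag deployment or fatality) and the L2 system was engaged at any point in a lookback window of roughly 30 seconds before impact. The field names, log format, and exact window length here are my assumptions for illustration, not any OEM's or NHTSA's actual schema.

```python
from dataclasses import dataclass

# Approximate lookback window cited in the comment above (assumption).
ENGAGEMENT_LOOKBACK_S = 30.0


@dataclass
class Crash:
    impact_time: float                     # seconds, vehicle clock
    airbag_deployed: bool
    fatality: bool
    adas_log: list[tuple[float, bool]]     # (timestamp, engaged) samples


def adas_engaged_in_window(crash: Crash, window_s: float = ENGAGEMENT_LOOKBACK_S) -> bool:
    """True if the L2 system was engaged at any point in the lookback window."""
    return any(
        engaged and (crash.impact_time - window_s) <= t <= crash.impact_time
        for t, engaged in crash.adas_log
    )


def is_reportable(crash: Crash) -> bool:
    """Severity trigger (airbag or fatality) AND engagement within the window."""
    severe = crash.airbag_deployed or crash.fatality
    return severe and adas_engaged_in_window(crash)


# Disengaging one second before impact doesn't help: the log still shows
# engagement inside the lookback window, so the crash is still reportable.
crash = Crash(
    impact_time=100.0,
    airbag_deployed=True,
    fatality=False,
    adas_log=[(70.0, True), (95.0, True), (99.0, False)],  # off 1 s before impact
)
print(is_reportable(crash))  # True
```

The point of the lookback window is exactly this: "off at the moment of impact" isn't enough to keep a crash out of the data, as long as the system was engaged shortly before.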
What's AEB? Automatic Energetic Braking?
I'm guessing automatic emergency braking
OK, but if Tesla's using that report to get out of liability, we still have a damn problem.
If it's an L2 system, the driver is always liable. The report just makes sure we know it's happening and can force changes if patterns are found. The NHTSA made Tesla improve their driver monitoring based on that data, since that was the main problem: the majority of accidents (almost all) involved drunk or distracted drivers.
If it's an L4 system, Tesla is always liable; in theory we'll see that in June in Austin for the first time on public roads.
The report never changes liability, it just lets us know what the state of the vehicle was for the incident. Tesla can't claim the system was off just because it disengaged one second before impact, because we'll know it was on prior to that. But that doesn't change liability.
Ha only if. Autopilot turns off right before a crash so that Tesla can claim it was off and blame it on the driver. Look it up.
The driver is always to blame, even if it was on. They turn it off for marketing claims.
PS: fuck elon
I didn't know this, but I'm not shocked, or even a little bit surprised.
Mark Rober did a video testing the autopilot systems of several cars, and he used his own Tesla. The car turned off the autopilot when he crashed through a styrofoam wall.
This is how they claim autopilot is safer than human drivers. In reality Tesla has one of the highest fatality rates, but magically all of those happen when autopilot was "off".
It turns it off with the parking sensor 2ft before the accident.
Holy shit I did indeed look it up, and it's true. Dunno if it'll hold up but it's still shady as shit
Most states apply liability to whoever is in the driver seat anyway. If you are operating the vehicle, even if you're not controlling it at that moment, you are expected to maintain safe operation.
That's why the Uber self driving car that killed someone was considered the test driver's fault and left Uber mostly off the hook.
Not sure how it works for the robo taxis, though.
Yeah that's gonna be tricky with those. I live in Vegas where they're already operating. No steering wheel at all.
In my country it's always your fault. And I'm very glad.
Here in the US, even if the driver is found responsible it is often only a small fine ($500) and maybe a 30 day suspension for killing someone.
What if you kill a CEO?
Autopilot will turn off a few milliseconds before impact either way
Tldr: Take the train and be safe.
Rant: In the EU, you are 35x more likely to die from a car crash than from a train crash. The union has created the so-called Vision Zero program, which is designed to reach zero driving deaths by some arbitrarily chosen date in the future. And of course it talks about autonomously driving cars. You know, crazy idea, but what if, instead of betting it all on some hypothetical magic Jesus technology that may or may not exist by the arbitrarily chosen date, we focused on the real-world solution that we already have? But well, the car industry investors would make less money, so I can answer that myself. :(
Edit: Also, Musk is a Nazi cunt who should die of cancer.
Well, there is no train station at my house. Or Aldi. Or my kids' Kindergarten. And I live in Germany, where public transport is excellent by global standards (memes about Deutsche Bahn aside).
Cars will be necessary for the foreseeable future. Let's make them as safe as possible while investing in public transport, they are not mutually exclusive.
PS: fuck Elon.
Speaking as a German: There are fewer train-related deaths because the trains don’t drive.
We have Vision Zero in the US, too. They lowered speed limits in a couple neighborhoods from 25mph to 20, and all the LED road signs show annual aggregated deaths from car crashes until the number is greater than zero, then someone wrings their hands and says "Welp, we did what we could, guess people just like dying" and then goes on vacation. (Source: me, I made up the spokesperson who gets scapegoated, but all the other stuff is observationally evident where I live)
Unironically, this is a perfect example of why AI is being used to choose targets to murder in the Palestinian genocide, in cases like DOGE attacking the functioning of the U.S. government, in US healthcare companies' claim denials, and in landlord software colluding to raise rents.
The economic function of AI is to abdicate responsibility for your actions so you can make a bit more money while hurting people, and until the public becomes crystal clear on that we are under a wild amount of danger.
Just substitute for Elon the vague idea of a company that will become a legal and ethical scapegoat for brutal choices by individual humans.
Which is why we need laws about human responsibility for decisions made by AI (or software in general).
I did an internship at a bank way back, and my role involved a lot of processing of spreadsheets from different departments. I automated a heckton of that with Visual Basic, which my boss was okay with, but I was dismayed to learn that I wasn't saving anyone's time except my own, because after the internship was finished, all of the automation stuff would have to be deleted. The reason was a rule (I think a company policy rather than a law) requiring that any code be in the custody of someone, for accountability purposes — "accountability" in this case meaning "if we take unmaintained code for granted, then we may find an entire department's workflow crippled at some point in the future, with no-one knowing how it's meant to work".
It's quite a different thing than what you're talking about, but in terms of the implementation, it doesn't seem too far off.
It reminds me of how apparently firing squad executions used to have only some of the guns loaded with live rounds, and the rest with blanks. This way, the executioners could do some moral gymnastics to convince themselves that they hadn't just killed a person.
Made by someone who's able to THINK like a Tesla owner.
Brake pedal? Unthinkable
At best Tesla pays a fine, not Elon.
Don't worry, DOGE will just fire the investigators before that happens.
Oh. A fine. How will Musk survive that financially?
He won't be fine...