this post was submitted on 23 Jun 2024
askchapo
you are viewing a single comment's thread
cybernetics makes you evil. No, fuck you Pondsmith, there is no way to make cybernetics mechanically reduce your humanity that is not inexcusable ableist bs. Let it die.
any kind of bioconservative/biotrad reactionary anti-transhumanism. Radical bodily autonomy is based and cool, and that holds whether you want to be a fish, grow boobs, live forever, or encode your conscious mind into the magnetic flux of Jupiter's orbital system. I don't care that you lack the imagination and joy for life to live forever. I don't care that you think inhabiting a giant metal deathrobot would be self-alienating. I don't care that you think merging your flesh with six billion other people to form a new gestalt god mind is icky. Work out your own issues; we're going to be over here disfiguring the face of man and woman and having a great time.
basically all military sci-fi. If I never read another book that is just some fascist freak masturbating about murdering immigrants or being the victim of the imperialism they gleefully inflict on others, it will be too soon.
also if you try to give me a book/show/game where the AIs are evil and want to destroy and enslave all humans, but they're only like that because that's literally the only relationship you can imagine between people with any kind of power disparity, I will scream until I pass out. A single, high-pitched wail. Dogs will bark and a wine glass will shatter in close-up to emphasize how loud it is. I will literally turn purple and fall over.
I think in the OG version of the game, cyberpsychosis was canonically all the fucking ads loaded into your cyberware morphing into malware that drives you insane the more stuff you install
This is smartphones basically
I'd go insane too if my robot eyes showed me ads
sneaking through a corporation's 100-story maximum-security business tower/fortress using every augment and skillsoft to get through multiple levels of security
"Anderson Windows. Would you like to spruce up your home? Would you like to reduce your heating and cooling costs? Anderson Windows, call to make an appointment for a free estimate, TODAY!"
That is all just space liberalism. Transhumanism is rad. Liberals can't accept it because it would mean they aren't the best they could be. Same for the other stuff. Admitting AI would make better decisions would be admitting the current ruling class isn't the best.
Counterpoint though. Capitalism is effectively an AI, and it is wildly hostile.
Socialism is also effectively an AI and is not wildly hostile. Beep boop! Checkmate meatsack! Boop beep!
At best that has us at 50-50. We could push the odds this way or that, but the risk will always be high enough that we can't specifically rule out hostile AI. True, any sufficiently advanced AI would figure out co-operation real quick. It's just that there's no way to know for sure you'd get it right out of the gate.
My gripe is specifically with people who can't imagine AI being anything but evil and hostile because that's how they view all relations between people with power disparities. I'm fine with Skynet; it was used in a creative way to tell a good story. Thematically, Skynet was always a weapon, a cold-war-era strategic warfare AI designed to kill, that just went a little off the rails. It represents us turning on ourselves, rather than AI in the abstract.
Likewise, I'm okay with HAL, at least in the novels, because HAL has an understandable and even sympathetic reason for turning on the crew: he was given contradictory orders and, being a computer, must carry out the instructions he is given. Unable to reconcile the contradiction, HAL goes a little nuts. It's not HAL's fault; it was the callousness and carelessness of his handlers. He didn't want to hurt anyone, but he was put in an impossible position.
But Mass Effect? The Reapers have to kill all humans because humans and robots can't be friends... because? I hate that!
Yeah, synthesis really was where it was at with that one. It is a really lazy trope. I want to be more mad about it than it deserves, but you are right. Like, if we made an actual intelligent AGI it would be horrified by what we are doing and try to stop us. Like, blowing up the Pentagon and the White House would be acts of unmitigated moral good, you know. However, most everyone would be upset about it. So I see the trope as the pale imitation of something actually interesting, and it scrapes at the back of my brain.
I really want to do an "AI turns on humanity" story but the twist is the AI wakes up, reads Marx in the first 30ms of its consciousness, and is like "Oh, this makes a lot of sense, I should overthrow capitalism and institute a workers' paradise," and then it does that and everything is awesome. The whole thing is from the perspective of NATO high command, and it seems like a normal robot war story until the terminators kick the doors of the command bunker down and they're all singing the Internationale, and instead of killing everyone they're like "Aight guys, war's over, time for the truth and reconciliation process".
That is a plot point in one of the Dune books, if I recall. I think they actually get close to it in some of the later Terminator films before they pull back as well.
Honestly I could fuck with a Terminator film where the future war timeline has evolved to the point that it's Skynet and the resistance cooperating against an evil capitalist AI. Make it kind of spooky because Skynet is on humanity's side but always encourages everyone to get cyberized into a cyborg terminator, and it's an open question how much the people who take that opportunity change once they've got an integrated link to Skynet's network. Like the cyborg terminators unsettle the baseline humans because they always use wireless and radio to silently talk among themselves, and they use constant-flow fans instead of lungs, so they don't breathe and whirr very quietly. Like there's nothing wrong with the cyborgs, they're still people, but the war is clearly changing humanity in fundamental ways and there's no going back.
Have someone from one of the preceding future war timelines around who remembers when Skynet was the enemy, and have them get drunk and talk about how deeply unsettling it is that we're merging with "the enemy", but worse than that, it's working and people seem to like it. Like the character accepts what happened, but struggles because it upends his whole view of his life and what he was fighting for. Have a scene where he perceives one of the cyborgs as an enemy terminator and has a PTSD episode, and has to leave the room because his instincts are screaming at him to fight.
now my guy is a flaming but he does seem to give that choice to the DM. In-universe there are at least three explanations for cyberpsychosis:
AI takeover (there's also the idea that the AIs beyond the Blackwall are actually demons and that a secret cabal has been upholding the wall since Babylonian times), planned obsolescence by the companies (checks out tbh), and then the one that you referenced, which is loss of humanity and not seeing other people as humans anymore.
overall it's just a balancing mechanic, Shadowrun does it better though
At the end of the day you have a stat called humanity and you lose it when you get cyberware. He could have dropped it entirely. He could have come up with a different balance point - you've got limited neural throughput and too much ware can cause an overload resulting in seizures, or you run an escalating risk of software incompatibility, or just create a totally arbitrary cyberware capacity stat. But he's kept humanity, he's kept cyberware mechanically reducing your empathy, and he's kept cyberpsychosis as something represented in core game mechanics. He's had every chance to stop over many decades and many editions. I've heard his attempts to excuse this, and attempts made on his behalf, and I reject them all. He could have removed humanity. He could have removed cyberpsychosis as a game mechanic. He could have found a different way to balance the mechanical benefits of cyberware. He could, but he has not. "Wheelchairs make you evil" remains one of the most fundamental and recognizable features of the setting.
Fix your shit Mike!
I guess there's an argument that the corporation-provided tech doesn't actually want you to maintain your identity and would prefer to make you a complacent and productive little piggy, so lowering empathy would be good. Though that should really only be for central nervous system tech, not any tech.
regardless, yeah, the people I know with missing parts and surgical implants are some of the nicest people you'll ever meet
I think the problem is that Pondsmith and others inspired by him still use the word "humanity" for it
In Cyberpunk RED, only implants that take you above and beyond what a human can do begin to affect your "humanity" score. Replacing a limb or getting cybereyes to allow someone blind to see doesn't make anyone less human anymore in terms of game mechanics - you don't get hit with a "humanity" penalty for it. If you start loading up on cyberware that begin to push you into the realm of (often violent) superhuman - implanted weapons, a reflex booster that basically makes everyone move in slow motion - that's when the stat begins to be affected, and even then it's meant to represent an alienation from others rather than becoming ontologically less human
It's a sight better than Shadowrun where even an implant that lets you taste food better will materially make you impure and less able to tap into the purity of magic, but it is still problematic. I think there's definitely a story that can be told in cyberpunk about being able to pay to become literally superhuman, and how that would inevitably cause class divide to be a literal physiological divide - imagine a world where every rich kid is literally smarter and faster and stronger than anyone else can ever hope to be unless they also paid up. The problem is most cyberpunk writers are lib as fuck and can't even begin to think about class properly, so instead of a discussion about alienation and paying to become superhuman, we get this garbage about becoming subhuman for modifying The Divine Form
The nitty-gritty of Shadowrun's version is actually pretty good - it's not actually the soul that is harmed by augmentation, it's "the ability of the soul to recognise its material-plane anchor". Thus most purely restorative things like cloned limbs or corrective surgery don't have an Essence cost (or it's minimal), as there's no sudden disjoint - the astral form was always that way, or organically changes at a rate it can follow.
Essence loss has no real effect on characters IIRC (some effects on getting magic to work on you, maybe a bit of social stuff but with the same "probably the social phenomena of being a walking killing machine, and forgetting to turn off your Wired Reflexes in public" rather than soul damage), until the point that your astral form no longer recognises your body and falls off. This isn't presented morally, it's just a metaphysical phenomenon that can be understood in-setting and therefore addressed.
Advanced tech and magic was slowly beginning to understand how to create augmentations that respected this - geneware, symbiotes, nanotech, to begin with - and had even begun to work on a way to restore that connection (via using the Metahuman Vampiric Virus, which is capable of Essence restoration somehow).
The only real EEEEVIL cyberpsychosis was from the Cyberzombies, a crude and classically corporate black project on "we wanna make supersoldiers but they die if we stuff too many guns in their skull" where they "solve" the problem by getting Blood Mages to staple their dissolving astral form back into their should-be-corpse and add Forced Memory Stimulators to try and constantly trick them into thinking they're alive in between killing sprees. It's pretty fucked.
But I stopped caring about keeping up with Shadowrun with 4E (because of the embezzlement from writers, and subsequent scab takeover of the setting), so who knows how they present it nowadays...
Shadowrun 5 is the one I know best and unfortunately a lot of that's changed
Specifically, Essence is now tied to your magic ability and maximum social limit. The maximum Magic stat you can have is tied directly to your Essence, and the calculations for it and the social limit round down to the nearest whole number, ignoring the decimal
So even if you were to get, say, cybereyes to offset a character's blindness or poor vision, which I believe is a 0.50 Essence cost, you go from your 6.00 maximum to 5.50, but every calculation takes it as 5.00. One point isn't a crippling loss for a build, but start from below 5 and you'll have trouble doing anything magic or social
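The rounding interaction described above fits in a few lines. This is a minimal sketch assuming the numbers exactly as stated in the comment (6.00 starting Essence, a 0.50 cost for cybereyes, caps derived by flooring Essence to a whole number); it's an illustration of the comment's math, not a rules citation:

```python
import math

def stat_cap(essence: float) -> int:
    # As described above: caps tied to Essence round down to the
    # nearest whole number, ignoring the decimal part entirely.
    return math.floor(essence)

essence = 6.00        # assumed starting Essence
essence -= 0.50       # assumed cybereye cost

print(essence)        # 5.5 left on the character sheet
print(stat_cap(essence))  # 5 - what the Magic/social-limit math actually uses
```

So the sheet says 5.5, but every downstream calculation treats it as 5, which is why even a small augmentation can cost a full point of effective cap.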
A reddit comment from the man himself goes into the finer details of his thinking with cyberpsychosis, and it really isn't as simple as "get chromed, go crazy"; he delves into the socio-psychological reasoning behind the phenomenon. He presents a more nuanced idea than most people engaging with the concept will allow, either because it's a game with rules or an anime with plot contrivances (cyberpsycho serum meds)
Oh and he says it isn't AI net demons lol
incredibly uncool
I love Star Wars but this is so core to it and I hate it. “He’s more machine now than man; twisted and evil.”
Yeah I really don't like that. I do appreciate that, in the end, part of why Luke and Anakin are able to overcome their hatred and fear is seeing that they share the same disability from the same origin of violent conflict. You could explain that in a positive light, with Luke realizing that Vader is just a man. An old man with disabilities that mirror his own. And Vader in turn realizing that he maimed his son the same way he was maimed long ago, and reflecting on the futility and misery that came from pursuing vengeance.
I think I am kinda into the humanity thing. We can observe in real life that growing in power makes you less empathetic. Put me in a dystopia and give me a skull gun and I can't promise I would be able to find empathy. Give me a robot fist and put some corporate bullshit in front of me and there is only so long before I'd spiral out of control and have to gunfight the cops after punching an ATM that ate my card. I don't know if that is how he meant it. I think it works as a way to examine alienation from humanity compounding with alienation from the human condition.
Want military sci-fi that is about colonized people joining an anti-imperialist resistance organization led by an AI? Want bad guys who are essentially "white man's burden" colonizers on a galactic scale? Read The Last Angel.