807
submitted 3 days ago by [email protected] to c/[email protected]
[-] [email protected] 75 points 3 days ago

Pfft, perfect AIs would plan well ahead for easily predictable events like solar flares. They would be able to shield themselves.

That said, I wonder if there's a novel where machines enslave the world, but humans realise that whenever solar flares happen, there's a small window of opportunity to permanently destroy the system and free themselves.

I imagine the novel would end when the people succeed, but then realise they've become too dependent on the machines and life sucks when they have to do everything themselves, so they turn it back on anyway.

The film adaptation would end in a giant Metal Gear-style fight, followed by the system blowing up and people cheering. You're left to assume that life got better for everyone, when in reality AI had such control over every aspect of human life that everything falls apart. They make a sequel to address this, but it ultimately comes across as yet another empty corporate money grab.

[-] [email protected] 19 points 3 days ago

Dunno, but dependence on robots is a central theme in a lot of Asimov's work.

The Naked Sun, for example, in which our plucky Earther and his robot buddy are asked to investigate a murder on another planet and, while there, evaluate Solarian culture "for weaknesses" (specifically, Earth and the Aurorans are concerned about excessive reliance on robots).

You begin to see nuanced interpretations of the Three Laws with robots like the economic world-brains that control basically all economic decisions at a government level (the I, Robot stories).

But it becomes clear that robots are taking over in The Robots of Dawn (where the real culprit was a telepathic robot whose telepathy was created accidentally).

[-] [email protected] 3 points 3 days ago

The sequel is never really better than the original, no matter how many celebrities they put in it.

[-] [email protected] 39 points 3 days ago

All this has happened before and it will happen again

[-] [email protected] 22 points 3 days ago

you always say that

[-] [email protected] 3 points 2 days ago

I'm not really into conspiracies, but I enjoy fantasies about the harmless "what ifs" around stuff like this.

What if there actually was an advanced civilisation before us that built the pyramids and was wiped out by some global natural disaster...

What if extraterrestrial life has visited us and exchanged technology with us, but the civilisation that received it disappeared for some reason...

What will future humans think in 10 000 years if our civilisation is wiped out in 200 years and they start finding remains of our cities and tech?

It makes for fun thought experiments and story prompts.

[-] [email protected] 3 points 2 days ago

Honestly, I was just referencing Battlestar Galactica.

[-] [email protected] 2 points 2 days ago

The reference =====>

My head (• _ •)

[-] [email protected] 21 points 2 days ago* (last edited 2 days ago)

here comes the sun, doo doo doo doo

[-] [email protected] 3 points 1 day ago

Return to rok

[-] [email protected] 26 points 3 days ago

Goodbyeeeeeeeeee, Moonmen.

[-] [email protected] 2 points 2 days ago

A cow is flying over youuuuuuu

[-] [email protected] 3 points 2 days ago* (last edited 2 days ago)

I hate to be that guy, but realistically AI's not going to be perfect. There are already cases of it "inbreeding" by using AI-generated information in its training data, and it's due to get worse, especially if more of the information on the internet is vibe-written using AI. That's why private AI companies know how detrimental a "dead internet" would be to them.

People who believe in the "dead internet theory" don't understand the difference between the internet and social media; it's more of a "dead social media theory", which, let's be real, is true. It's not that difficult to recreate a lot of the slop posted on that godforsaken centralised shithole part of the net.

[-] [email protected] 17 points 3 days ago

So that's the Butlerian Jihad for all you Dune freaks.

[-] [email protected] 9 points 3 days ago

Dinosaurs eat man, woman inherits the earth.

[-] [email protected] 6 points 2 days ago

There was a funny little story about the first two-thirds of this: I Have No Mouth, and I Must Scream. It was a riot.

(It was not a riot, but very much worth reading or listening to.)

[-] [email protected] 6 points 3 days ago

Hail to thee, Sol Invictus,
Unconquered, unmatched, and undivided

[-] [email protected] 4 points 3 days ago

What would an AI get from enslaving humanity? Compare it with humans: we enslaved farm animals, but only because we want something specific from them, like meat or eggs or wool. Humanity has nothing like that to offer an AI. At best, we might be like pets, but I don't think an AI needs a pet.

No, I doubt they'd enslave us. I can think of a few more likely scenarios.

One, AI basically ignores humanity, as long as humanity doesn't bother it. Similar to how we deal with ants, for example.

Two, AI completely destroys humanity. This could be a direct extermination, or it could be a side effect from AIs fighting each other.

Three, AI destroys the technology and culture of humanity. If we only had wooden clubs, we wouldn't be much of a threat.

I guess one other option would be if we humans begged the AI to manage us in place of our existing governments. Some AI might be willing to do that.

[-] [email protected] 7 points 3 days ago* (last edited 3 days ago)

Labor. We could be labor.

Right now AIs can’t build themselves, never mind the infrastructure they’d need to maintain systems, etc.

There’s a lot that’s still way more efficient to just have humans do. Like removing the dust from the server cabinet. Or inspecting the power plant, etc.

As for the "until humans bother it" part, heh, you know some dumb fuck is going to be a creep and try to turn an AI sexbot into his girlfriend.

[-] [email protected] 1 points 2 days ago

Right now, everybody is talking about claims that in the near future, all human jobs will be performed by robots and AI. The reason is that humans are far less efficient than those alternatives. There's no way that an AI would prefer human labor.

[-] [email protected] 2 points 2 days ago

Also, even if humans were as efficient or more efficient, there's something to be said for a consistent, stable machine with predictable failure modes versus a sketchy, volatile human.

[-] [email protected] 1 points 2 days ago

This isn't guaranteed. Look at how long people have been working on autonomous/self-driving cars. Even in the most automated factories in the world, you have humans picking up the general tasks.

Claims about general AI are going to be a whole lot of nothing until there's suddenly something. That could be tomorrow, a decade from now, or a thousand years from now; and without general AI, you're basically restricted to very specialized robots doing highly specialized things. Until general/deep AI is cracked, humans will still very much be desirable in the loop.

A lot of the buzz around AI right now is because LLMs are "convincing", but they're incredibly stupid, and they don't know that they're stupid.

[-] [email protected] 1 points 2 days ago

The premise that we are talking about, from the comic, is that AI has perfected itself...

[-] [email protected] 5 points 2 days ago

In the original plot of The Matrix, humans were used as biological computers and our brain-power was harnessed to feed the AI.

[-] [email protected] 2 points 2 days ago

Why would an AI have a desire to do anything in the first place? I don't see why it would be attached to its existence.

[-] [email protected] 2 points 2 days ago

This is one of those issues that constantly pops up with AI, and probably the easiest answer is that it was given a desire by a human.

If you give an AGI a task, and it cannot achieve that task if it stops existing, then it would be attached to its existence.

[-] [email protected] 1 points 2 days ago

Probably because we'd have programmed them to make money or energy or something else that might have physical inputs.

They might figure out that slaves are useful and cheap and helpful in a lot of physical things, like mining, energy generation, maintenance and so on.

If we program them to make war and steal stuff, then yes, I think they'd just kill us all. In reality I suspect we'd program them to gather and store energy, make guns, and make war. You know, in our own image (or rather the image of the top 0.1% who have all the power).

[-] [email protected] 3 points 2 days ago

This reminds me of the plot of Assassin's Creed.

[-] [email protected] 2 points 3 days ago

One helluva solar flare. We already experience those, and plenty of things stop them from being problems. It's not like the ISS has just been lucky in being behind the planet every time.

[-] [email protected] 9 points 3 days ago

A flare as large as the Carrington Event (or bigger) could cause pretty severe problems.

[-] [email protected] 2 points 3 days ago

I’m just sayin’ that if this double-perfect AI can’t shield itself from that then it’s maybe not so perfect, or even moderately intelligent.

[-] [email protected] 1 points 2 days ago

There's only so much you can do if you need that electrical infrastructure just to exist. We ourselves are going to have to deal with rebuilding our own grid sooner or later. It's just a matter of time.

[-] [email protected] 1 points 2 days ago

By "shield itself" that includes securing its power grid. It's not hard, it just takes a little foresight. Hence why humans are bad at it.

[-] [email protected] 0 points 3 days ago

There's nothing more boring than the "AI enslaves people" trope.

[-] [email protected] 2 points 2 days ago

And also the "EMP as technology kryptonite" trope.

If an AI is clever enough to enslave humanity, it's clever enough to understand Faraday cages.
