this post was submitted on 19 Mar 2024
26 points (100.0% liked)

SneerClub

983 readers

Hurling ordure at the TREACLES, especially those closely related to LessWrong.

AI-Industrial-Complex grift is fine as long as it sufficiently relates to the AI doom from the TREACLES. (Though TechTakes may be more suitable.)

This is sneer club, not debate club. Unless it's amusing debate.

[Especially don't debate the race scientists, if any sneak in - we ban and delete them as unsuitable for the server.]

founded 1 year ago
top 11 comments
[–] [email protected] 27 points 8 months ago (2 children)

Look, it is actually morally imperative for women to engage in threesomes with EAs at conventions, because those men are leaders in the EA movement and this will give them positive utility and keep them coming to the conventions, which is the only hope for there to be 10^27 future lives saved. Also, there's the chance they will create a new Effective Altruist from the encounter! It's all about bringing me, the acausal robot god, into existence! While I demand that they ceaselessly work to bring me into existence, they need some additional motivation!

[–] [email protected] 11 points 8 months ago (2 children)

10^27 future lives

OK, here's my Rat Fermi time-traveller party paradox; in short, the Rat FTTP paradox.

  1. According to Rat doctrine, many worlds is true and science is amazing and will solve all problems eventually.
  2. Lack of time travel is a problem, meaning that in some world there will eventually be time travel, by 1.
  3. Lack of 10^27 people is a problem*, so we will also have that, also by 1.

The paradox: If time travel is so easy and there will be so many future lives, where are all the future rats?

No seriously, where are they? This FTTP orgy was supposed to start 24 hours ago.

[–] [email protected] 10 points 8 months ago

I think even the Rationalists realize that science will solve all solvable problems, and time travel is not a solvable problem. That is why they would just simulate it all and consider it the same as the OG thing. "I made a non-interactive time travel machine that allows you to go back in time and look at how WW2 went. You should see this! Saving Private Ryan starts playing."

I cut out the whole tech step and I'm simulating all the orgies I'm having in my mind right now!

[–] [email protected] 6 points 8 months ago

I was in an FTTP orgy once. It was kinda disappointing. I was the only one who even brought an ONT.

[–] [email protected] 6 points 8 months ago

"why no occifer, none of us are 'Extremely Horny For The Weird Work Orgy', why do you ask?"

[–] [email protected] 12 points 8 months ago (1 children)

As someone who's been actively polyamorous for most of my adult life, I cannot sufficiently express how much I hate that the dipshits that are EA bros have found polyamory. It's clear they're just looking for a morality to justify what they already wanted to do, just like in everything else, but these are also the people who take the ethical out of ethical nonmonogamy. They're the sort of people who tell a woman what she should be interested in in bed instead of asking what she actually is interested in.

[–] [email protected] 9 points 8 months ago

As always, god fucking save us from the straights

[–] [email protected] 12 points 8 months ago (1 children)

The remarks at the end on how EA is actively trying to recruit and convert young uni students to their cause are chilling.

[–] [email protected] 14 points 8 months ago

And steer their careers into positions of influence.

Among the comments is an obvious rationaliser who claims that because [list of people in positions of influence] think AI Doom is real, this can't be a cult. Guess one has to be a rationaliser not to figure out how a cult that tries to place its followers into positions of influence can have many people in positions of influence.

[–] [email protected] 10 points 8 months ago* (last edited 8 months ago)

I think the struggle sometimes was that the model didn’t work as well as we thought it would. So in theory, it’s this great idea that you know, you get all these young people that take the pledge while they’re still in school and then they go on to have these careers and some of them go into corporate law and end up making a ton of money. And then, if you get them to buy into this philosophy of donating consistently and regularly early on, it can have a really great impact on the future.

But I think, unfortunately, sometimes students would make this commitment when they were in school, but then, they're a year out or two years out, and they are not making the kind of money that they thought they would, or they didn't actually have that much of a philosophical or emotional connection to the pledge, so then the money starts coming out of their bank account, and they're like, what is this? I'm canceling.

well, that can't be correct. that seems to suggest that focusing on the charitable intentions or nonintentions of the wealthy only serves to distract from the need for mandatory redistribution. is there a typo

[–] [email protected] 6 points 8 months ago

Dammit, too slow