[-] [email protected] 2 points 3 months ago

As a fellow Usenet junkie from way back, now I'm curious which newsgroups Yarvin hung out in.

[-] [email protected] 5 points 3 months ago* (last edited 3 months ago)

Yeah, it was a brain fart.

[-] [email protected] 5 points 3 months ago

I always tune into Casey Newton and Kevin Roose's podcast to get my latest fix of AI hype, now that they've moved on from crypto hype and multiverse hype. Can't wait to see what the next hype cycle will bring!

[-] [email protected] 4 points 5 months ago

I should probably mention that this person went on to write other comments in the same thread, revealing that they're still heavily influenced by Bay Area rationalism (or what one other commenter brilliantly called "ritual multiplication").

[-] [email protected] 5 points 5 months ago

The story has now hopped to the orange site. I was expecting a shit-show, but there have been a few insightful comments from critics of the rationalists. This one from "rachofsunshine", for instance (points 1 and 2 are made concrete in the quick sketch after the quote):

[Former member of that world, roommates with one of Ziz's friends for a while, so I feel reasonably qualified to speak on this.]

The problem with rationalists/EA as a group has never been the rationality, but the people practicing it and the cultural norms they endorse as a community.

As relevant here:

  1. While following logical threads to their conclusions is a useful exercise, each logical step often involves some degree of rounding or unknown-unknowns. A -> B and B -> C means A -> C in a formal sense, but A -almostcertainly-> B and B -almostcertainly-> C does not mean A -almostcertainly-> C. Rationalists, by tending to overly formalist approaches, tend to lose the thread of the messiness of the real world and follow these lossy implications as though they are lossless. That leads to...

  2. Precision errors in utility calculations that are numerically-unstable. Any small chance of harm times infinity equals infinity. This framing shows up a lot in the context of AI risk, but it works in other settings too: infinity times a speck of dust in your eye >>> 1 times murder, so murder is "justified" to prevent a speck of dust in the eye of eternity. When the thing you're trying to create is infinitely good or the thing you're trying to prevent is infinitely bad, anything is justified to bring it about/prevent it respectively.

  3. Its leadership - or some of it, anyway - is extremely egotistical and borderline cult-like to begin with. I think even people who like e.g. Eliezer would agree that he is not a humble man by any stretch of the imagination (the guy makes Neil deGrasse Tyson look like a monk). They have, in the past, responded to criticism with statements to the effect of "anyone who would criticize us for any reason is a bad person who is lying to cause us harm". That kind of framing can't help but get culty.

  4. The nature of being a "freethinker" is that you're at the mercy of your own neural circuitry. If there is a feedback loop in your brain, you'll get stuck in it, because there's no external "drag" or forcing functions to pull you back to reality. That can lead you to be a genius who sees what others cannot. It can also lead you into schizophrenia really easily. So you've got a culty environment that is particularly susceptible to internally-consistent madness, and finally:

  5. It's a bunch of very weird people who have nowhere else they feel at home. I totally get this. I'd never felt like I was in a room with people so like me, and ripping myself away from that world was not easy. (There's some folks down the thread wondering why trans people are overrepresented in this particular group: well, take your standard weird nerd, then make two-thirds of the world hate your guts more than anything else, and you might be pretty vulnerable to whoever will give you the time of day, too.)

TLDR: isolation, very strong in-group defenses, logical "doctrine" that is formally valid and leaks in hard-to-notice ways, apocalyptic utility-scale, and being a very appealing environment for the kind of person who goes super nuts -> pretty much perfect conditions for a cult. Or multiple cults, really. Ziz's group is only one of several.
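
To make points 1 and 2 concrete, here's a rough numeric sketch. It's mine, not the quoted commenter's, and the step probability (0.95) and the utilities are made up for illustration:

    import math

    # Point 1: suppose each "almost certainly" step holds with probability
    # 0.95. Even granting independence, the chained implication decays fast.
    p_step = 0.95
    for n in (1, 5, 10, 20, 50):
        print(f"{n:2d} steps: P(whole chain holds) ~ {p_step ** n:.3f}")
    # 50 steps of "95% certain" leave you below 8% certain end to end.

    # Point 2: admit an infinite (dis)utility anywhere and expected-value
    # comparisons stop discriminating: any nonzero probability dominates.
    p_tiny = 1e-12
    ev_speck_forever = p_tiny * math.inf  # dust speck "for eternity"
    ev_murder = 1.0 * 1e9                 # any finite harm, however large
    print(ev_speck_forever > ev_murder)   # True: epsilon times infinity wins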

[-] [email protected] 3 points 6 months ago

One thing to keep in mind about Ptacek is that he will die on the stupidest of hills. Back when Y Combinator president Garry Tan tweeted that members of the San Francisco board of supervisors should be killed, Ptacek defended him to the extent that the mouth-breathers on HN even turned on him.

[-] [email protected] 4 points 1 year ago

Sorry for the off-topic rant, but WTF is Emile Torres doing on twitter? Anytime I see someone creating content for that Nazi hellsite, I start looking at them differently.

[-] [email protected] 5 points 1 year ago

I'd really like to know the back story on this interview too. I realize weirdness isn't exactly distinctive when it comes to rationalists, but Zack is in a league of his own.

[-] [email protected] 4 points 2 years ago

The first comment by the first commenter is "Can we suspend Godwin's Law for a moment?" followed by an explanation of the ways in which The Protocols of the Elders of Zion is an accurate description of reality.

Libertarianism is never far from Nazism. The Venn diagram is two concentric circles; the only question is which circle contains the other.

[-] [email protected] 3 points 2 years ago* (last edited 2 years ago)

Roko's authoritative-toned "aktshually..." response to Annie's claims has me fuming. I don't know why. I mean, I've known for years that this guy is a total boil on the ass of humanity. And yet he still manages to shock with the worst possible take on a topic -- even when the topic is the sexual abuse of a child. If, like Roko, I were to play armchair psychiatrist, I'd diagnose him as a sociopath with psychopathic tendencies. But I'm not. So I won't.

[-] [email protected] 2 points 2 years ago* (last edited 2 years ago)

This is good:

Take the sequence {1,2,3,4,x}. What should x be? Only someone who is clueless about induction would answer 5 as if it were the only answer (see Goodman’s problem in a philosophy textbook or ask your closest Fat Tony) [Note: We can also apply here Wittgenstein’s rule-following problem, which states that any of an infinite number of functions is compatible with any finite sequence. Source: Paul Boghossian]. Not only clueless, but obedient enough to want to think in a certain way.
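
A throwaway illustration of that rule-following point (my sketch, not Taleb's; the continuation 42 is arbitrary): for any finite sequence, you can exhibit a perfectly lawful function that continues it however you like.

    def lagrange(points):
        # Build the unique interpolating polynomial through the (x, y) points.
        def p(x):
            total = 0.0
            for i, (xi, yi) in enumerate(points):
                term = float(yi)
                for j, (xj, _) in enumerate(points):
                    if i != j:
                        term *= (x - xj) / (xi - xj)
                total += term
            return total
        return p

    # A "lawful" rule under which the sequence 1, 2, 3, 4 continues with 42:
    f = lagrange([(1, 1), (2, 2), (3, 3), (4, 4), (5, 42)])
    print([round(f(x)) for x in range(1, 6)])  # [1, 2, 3, 4, 42]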

Also this:

If, as psychologists show, MDs and academics tend to have a higher “IQ” that is slightly informative (higher, but on a noisy average), it is largely because to get into schools you need to score on a test similar to “IQ”. The mere presence of such a filter increases the visible mean and lowers the visible variance. Probability and statistics confuse fools.
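
The filter effect is easy to simulate. Here's a quick sketch with made-up numbers (mine, not Taleb's): IQ-style scores at mean 100, sd 15, with an admissions cutoff at 120.

    import random
    import statistics

    random.seed(0)
    population = [random.gauss(100, 15) for _ in range(100_000)]
    admitted = [x for x in population if x >= 120]  # the admissions filter

    for name, group in (("population", population), ("admitted", admitted)):
        print(f"{name:10} mean={statistics.mean(group):6.1f} "
              f"sd={statistics.stdev(group):5.1f}")
    # Selection alone makes the admitted group look smarter and more uniform:
    # roughly mean 127, sd 6, versus mean 100, sd 15 in the population.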

And:

If someone came up w/a numerical “Well Being Quotient” WBQ or “Sleep Quotient” SQ, trying to mimic temperature or a physical quantity, you’d find it absurd. But put enough academics w/physics envy and race hatred on it and it will become an official measure.
