this post was submitted on 12 Mar 2024
31 points (100.0% liked)

SneerClub

983 readers

Hurling ordure at the TREACLES, especially those closely related to LessWrong.

AI-Industrial-Complex grift is fine as long as it sufficiently relates to the AI doom from the TREACLES. (Though TechTakes may be more suitable.)

This is sneer club, not debate club. Unless it's amusing debate.

[Especially don't debate the race scientists, if any sneak in - we ban and delete them as unsuitable for the server.]

founded 1 year ago

The New Yorker has a piece on the Bay Area AI doomer and e/acc scenes.

Excerpts:

[Katja] Grace used to work for Eliezer Yudkowsky, a bearded guy with a fedora, a petulant demeanor, and a p(doom) of ninety-nine per cent. Raised in Chicago as an Orthodox Jew, he dropped out of school after eighth grade, taught himself calculus and atheism, started blogging, and, in the early two-thousands, made his way to the Bay Area. His best-known works include “Harry Potter and the Methods of Rationality,” a piece of fan fiction running to more than six hundred thousand words, and “The Sequences,” a gargantuan series of essays about how to sharpen one’s thinking.

[...]

A guest brought up Scott Alexander, one of the scene’s microcelebrities, who is often invoked mononymically. “I assume you read Scott’s post yesterday?” the guest asked [Katja] Grace, referring to an essay about “major AI safety advances,” among other things. “He was truly in top form.”

Grace looked sheepish. “Scott and I are dating,” she said—intermittently, nonexclusively—“but that doesn’t mean I always remember to read his stuff.”

[...]

“The same people cycle between selling AGI utopia and doom,” Timnit Gebru, a former Google computer scientist and now a critic of the industry, told me. “They are all endowed and funded by the tech billionaires who build all the systems we’re supposed to be worried about making us extinct.”

top 36 comments
[–] [email protected] 33 points 8 months ago (3 children)

This was such a chore to read, it's basically quirk-washing TREACLES. This is like a major publication deciding to take an uncritical look at Scientology, focusing on the positive vibes and the camaraderie, while smack in the middle of Operation Snow White, which in fact I bet happened a lot at the time.

The doomer scene may or may not be a delusional bubble—we’ll find out in a few years

Fuck off.

The doomers are aware that some of their beliefs sound weird, but mere weirdness, to a rationalist, is neither here nor there. MacAskill, the Oxford philosopher, encourages his followers to be “moral weirdos,” people who may be spurned by their contemporaries but vindicated by future historians. Many of the A.I. doomers I met described themselves, neutrally or positively, as “weirdos,” “nerds,” or “weird nerds.” Some of them, true to form, have tried to reduce their own weirdness to an equation. “You have a set amount of ‘weirdness points,’ ” a canonical post advises. “Spend them wisely.”

The weirdness is eugenics and the repugnant conclusion, and abusing Bayes' rule to sidestep context and take epistemological shortcuts to cuckoo conclusions while fortifying a bubble of accepted truths that are strangely amenable to letting rich people do whatever the hell they want.

Writing a 7,000–8,000-word insider exposé on TREACLES without mentioning eugenics even once should be all but impossible, yet here we are.

[–] [email protected] 15 points 8 months ago* (last edited 8 months ago)

Inside the Strange World of the Uwu Smol Beans: An Exposé of a Quirky Community with No Racists Whatsoever

[–] [email protected] 13 points 8 months ago* (last edited 8 months ago) (2 children)

quirk-washing TREACLES

I can’t wait to be quirk-washed, I’m ready to hang up my pick-me hat and let the new yorker do the work for me

[–] [email protected] 8 points 8 months ago* (last edited 8 months ago) (3 children)

speaking of, saw this this morning: https://www.vox.com/future-perfect/2024/2/13/24070864/samotsvety-forecasting-superforecasters-tetlock

2 within a handful of days trying to reputation-wash after they got their filthy little selves exposed last year as the shitgremlins they are. hopefully it's just a coincidence in timing, but guess we'll have to see

[–] [email protected] 10 points 8 months ago (1 children)

I'm probably not saying anything you didn't already know, but Vox's "Future Perfect" section, of which this article is a part, was explicitly founded as a booster for effective altruism. They've also memory-holed the fact that it was funded in large part by FTX. Anything by one of its regular writers (particularly Dylan Matthews or Kelsey Piper) should be mentally filed into the rationalist propaganda folder. I mean, this article throws in an off-hand remark by Scott Alexander as if it's just taken for granted that he's some kind of visionary genius.

[–] [email protected] 3 points 8 months ago

yep aware. didn't care too much about the article itself, was more observing the coincidence in timing. but you have a point there with the names, I really should make that a standing mental ban

[–] [email protected] 5 points 8 months ago (1 children)

Had to stop reading that. My eyes were rolling too much.

[–] [email protected] 7 points 8 months ago

uwu smol-bean number starers, lovable little group of misfits from checks notes fucking RAND

[–] [email protected] 3 points 8 months ago (1 children)

What happened to Samotsvety last year? I missed that.

[–] [email protected] 5 points 8 months ago

I meant more the general state of the things in the TREACLES umbrella catching unfavourable public attention over the last while

[–] [email protected] 7 points 8 months ago

you gotta be white cis and loathsome or they won't do it

[–] [email protected] 12 points 8 months ago

God I always forget about the repugnant conclusion. It's baffling that it's being taken as anything but a fatal indictment of utilitarianism.

[–] [email protected] 23 points 8 months ago* (last edited 8 months ago) (1 children)

Yet another news story that omits how the science in HPMOR, the Sequences and the flagship e/acc blog is just wrong. Like, failing junior-high biology wrong.

The A.I., trying to access a Web site, was blocked by a Captcha, a visual test to keep out bots. So it used a work-around: it hired a human on Taskrabbit to solve the Captcha on its behalf.

Wait, didn't that turn out to be bullshit?

[–] [email protected] 14 points 8 months ago

Yeah, a lot of these TESCREAL exposés seem to lean on the perceived quirkiness while completely failing to convey how deeply unserious their purported scientific and philosophical footing is, like virgin tzatziki with impossible gyros unserious.

[–] [email protected] 12 points 8 months ago (2 children)

Wat

[Grace's] grandfather, a British scientist at GlaxoSmithKline, found that poppy seeds yielded less opium when they grew in the English rain, so he set up an industrial poppy farm in sunny Australia and brought his family there.

To grow opium???

(OK I guess for medicinal purposes but maybe point that out)

[–] [email protected] 11 points 8 months ago

We should have known the English rain was trouble when it started giving people tans

[–] [email protected] 7 points 8 months ago (1 children)

I wonder how much of that family fortune has found its way into EA coffers by now.

[–] [email protected] 7 points 8 months ago (1 children)

In another part of the article, it states that Grace grew up "semi-feral", so perhaps the fortune was smoked away in the Tasmanian opium dens (those exist, right?)

[–] [email protected] 12 points 8 months ago* (last edited 8 months ago)

In yet another part of the article:

She had found herself in both an intellectual community and a demimonde, with a running list of inside jokes and in-group norms. Some people gave away their savings, assuming that, within a few years, money would be useless or everyone on Earth would be dead.

More totally normal things in our definitely not a cult community.

[–] [email protected] 11 points 8 months ago (1 children)

I have a bad feeling these people are going to waltz into even more power.

[–] [email protected] 9 points 8 months ago (2 children)

I dunno. At least in the US, these people are decidedly outside the mainstream. Their views on religion and sexual mores preclude any popular appeal, and they'd be similarly handicapped were they to try to infiltrate existing power structures.

Basically their only hope is that an AI under their control takes over the world.

[–] [email protected] 11 points 8 months ago

Basically their only hope is that an AI under their control takes over the world.

They are pretty dominant in the LLM space and are already having their people fast tracked into positions of influence, while sinking tons of cash into normalizing their views and enforcing their terminology.

Even though they aren't trying to pander to religious Americans explicitly, their millennialism-with-the-serial-numbers-filed-off worldview will probably feel familiar and cozy to them.

[–] [email protected] 8 points 8 months ago* (last edited 8 months ago) (1 children)

Come on, you’re talking about America, when did mainstream popular appeal ever limit anyone with money?

[–] [email protected] 9 points 8 months ago (2 children)

You still need to lever that money by "buying" the people in power.

Right now there are really no mainstream politicians 100% on board with the weirdness of TESCREALs:

  • mainstream Democrats - too wary of corporations, too eager to regulate
  • pre-Trump GOP - maybe, but they're losing influence fast
  • current Trump GOP - literally crazy, way too easy for TESCREALs to be painted as a satanic cult

[–] [email protected] 5 points 8 months ago

Maybe. The current EA strategy is to take over all the technocratic positions in government/business one level down from the ostensible policy-makers. The idea is that if they are the only ones qualified to actually write the reports on "alignment" for DoD/NIST/etc., then ultimately they get to squeeze in some final control over the language, regardless of what Joe Senator wants. Similarly, by monopolizing and brainwashing all the think-tank positions, even the Joe Senators out there end up leaning on them to write the bills and executive orders.

[–] [email protected] 4 points 8 months ago

I’ve finally got around to replying to this but it’s been burning a hole in my subconscious

I think that’s a naive interpretation of the interests in play here.

Altman aptly demonstrated that a yes/no on regulations isn’t the money’s goal here, the goal is to control how things get regulated. But at the same time Democrats are hardly “eager to regulate” simpliciter, and the TESCREALs/Silicon Valley can hardly be said to have felt the hammer come down in the past. It may be part of some players’ rhetoric (e.g. Peter Thiel) that the Republicans (both pre- and post-Trump) are their real friends insofar as the Republicans are eager to just throw out corporate regulations entirely, but that’s a different issue: it’s no longer one of whether you can buy influence, it’s a matter of who you choose to buy influence with in the government, or better yet which government you try to put in power.

It should be noted at this point that mentioning Thiel is hardly out of court, even if he’s not in the LessWrong stream: he shares goals and spaces with big elements of the general TESCREAL stream. He’s put money into Moldbug’s neo-reaction, which is ultimately what puts Nick Land sufficiently on the radar to find his way into Marc Andreessen’s ludicrous manifesto.

And why should the TESCREALs fear being painted as a satanic cult in the first place? Has that been a problem for anybody but queer people and schoolteachers up to this point? It seems unlikely to me that anyone involved in OpenAI or Anthropic is going to just stop spending their absolute oceans of capital for fear that LibsOfTikTok is going to throw the spotlight on them. And why would Raichik do that in the first place? The witch hunters aren’t looking for actual witches, they’re looking for political targets, and I don’t see what’s in it for them in going after some of the wealthiest people on the West Coast except in the most abstract “West Coast elites” fashion, which as we all know is just another way of targeting liberals and queers.

[–] [email protected] 11 points 8 months ago (1 children)

Oh, good, ex-incel Scott is in a polycule now, the wonders of the cult lifestyle.

[–] [email protected] 9 points 8 months ago (1 children)

Wasn't he supposed to be a romantic asexual at some point?

[–] [email protected] 8 points 8 months ago (2 children)

After all I’ve heard, I believe that was a bald-faced lie.

[–] [email protected] 7 points 8 months ago

Maybe he's the guy who goes to the orgy just to hold hands.

[–] [email protected] 6 points 8 months ago (1 children)
[–] [email protected] 5 points 8 months ago

That he was hooking up with dudes at rationalist meet ups.

[–] [email protected] 10 points 8 months ago

he dropped out of school after eighth grade, taught himself calculus

Lmaou, gonna need a citation on this one chief. This the same guy who said we need people monitoring for 'sudden drops' in the loss function? I'm supposed to believe this poser understands what a derivative is now?

[–] [email protected] 10 points 8 months ago

Non-paywall link.

[–] [email protected] 6 points 8 months ago

"Socialists think we’re sociopathic Randroid money-obsessed Silicon Valley hypercapitalists."

No, Scott, we just think you're a coward and a racist

[–] [email protected] 5 points 8 months ago

Life is weird when you're living in a racist polycule