[-] Soyweiser@awful.systems 5 points 10 hours ago

Before they could ask Grok how to stop a process, it was already too late.

Not that it mattered, as Grok's advice to become the Reichschancellor didn't actually fix this problem.

[-] Soyweiser@awful.systems 2 points 10 hours ago

Thanks for the deep dive on stuff that is wrong with it (also the others).

[-] Soyweiser@awful.systems 6 points 1 day ago

Yeah, was quite disappointed by that, and also by the anthropomorphization of AI at the end.

[-] Soyweiser@awful.systems 4 points 1 day ago* (last edited 9 hours ago)

Article on the ick generated by AI, from a woman's perspective: "They Built Stepford AI and Called It 'Agentic'". It talks about how women adopt it less, and gives a reason why this might be so.

On a personal note (I'm a man, for the record): while I normally get the uncanny valley effect a lot less than normal people, I do notice it a lot with AI-generated people. Really odd experience, that.

(The author does seem to be a pro-AI person, however.)

E: thanks everybody for being so critical about it; I should have read the whole article (and not ignored the Substack red flag) before posting it here so uncritically.

[-] Soyweiser@awful.systems 3 points 1 day ago

Well, just don't use your real name online.

[-] Soyweiser@awful.systems 3 points 2 days ago* (last edited 2 days ago)

The start, with the weird bit against people with mental health issues, had me on edge already, and when he let all the 'these things are for women/my ex' stuff slide, I was not thinking good things of the author.

Note how nobody he talks to seems to be a woman, despite all the techbros talking about women quite often.

(The author's apparent MeToo history comes as no shock (I didn't look into that, so don't quote me on it).)

[-] Soyweiser@awful.systems 5 points 2 days ago

Also, a little bit of context for the people who know nothing about all this: the O9A is one of those very scary groups, linked to various murders and things like that.

IIRC, some anti-extremism people used to avoid mentioning them much, as they didn't want to give them more attention by platforming them even a little, and they were scared of drawing the group's attention personally.

[-] Soyweiser@awful.systems 9 points 2 days ago

with the right bootstrap instructions and a bit of tools.

Is doing a lot of work.

[-] Soyweiser@awful.systems 2 points 2 days ago* (last edited 2 days ago)

I assume a lot of people are using this moment to do the 'I never liked him' hate.

I disagree with Tante on the second article, btw. I don't think people drop others on a dime; I think it is a slower process where someone you look up to does more and more small things you dislike (or you reread and start to realize you perhaps had too rose-colored glasses on), and then your opinion turns. (With some exceptions, of course; a lot of people have a few things they consider red lines, like a lot of leftwingers not being fans of sex crimes, or people on the right not being fans of treating people of color like equals.)

E: I do have a hit skeet on bsky saying 'Guess even Doctorow must eventually enshittify'; hope this didn't trigger this blog post. (I meant it both as 'he got worse', but also I'm using 'enshittify' intentionally wrong, because Cory said a very weird thing about how being anti-AI was neoliberal purity culture, which I also think is misusing terms.)

[-] Soyweiser@awful.systems 9 points 3 days ago* (last edited 3 days ago)

Yeah, not even halfway in and it is just madness. Also not unlikely the Roy guy just made things up.

Guess the author didn't think of asking about the inconsistencies in the man's story, because they both bonded over disliking unhoused people. (The horrible unhoused people who mumble incoherently vs the chad founder who shouts 'will you be a cofounder with me?' at people.)

But nope, just post the blackpiller's words uncritically. Do not mention that this bold truthteller, who doesn't like to be told what to do or he gets enraged, spent a year at home to save his parents' business (and admits to that damaging their business).

Alexander is one of the leading proponents of rationalism

Is he? Or is he just calling himself that? Claiming to be a Rationalist is easier than actually being one, of course.

For rationalists, the divide between truth and falsehood is very important;

Only for the outgroup. (Saying this in relation to Scott 'Secret NRx'/'I didn't read the book I reviewed' is something.)

“Racing cum is definitely interesting.” I found Eric very hard not to like.

Might want to reflect on that a bit, and on why this is more a PR piece than journalism. (Did he even check that all these people got kicked out of their high schools?)

Re: Donald boat.

Why didn't people just block him? Why doesn't the author talk about this?

I told Donald the theory I’d been nursing

This explains it: the author wants to be them.

[-] Soyweiser@awful.systems 16 points 4 days ago

Tante.cc writes about Cory using a 'Drunk Uncle'-style argument to defend his LLM usage (and going after the left using strawmen).

(To counter one of Cory's arguments: if disliking LLMs was just about the people who run them, the people against it would have stayed in SneerClub.)

[-] Soyweiser@awful.systems 9 points 5 days ago* (last edited 5 days ago)

the Gen-Z men shut out by elite institutions often join their grandfathers and turn toward MAGA, or worse, into Groypers.

IIRC that is not as often true as people claim it is. But yeah, not gonna click Unherd to see if they have a source. Because blergh, Unherd.

15

Via reddit's SneerClub. Thanks, u/aiworldism.

I have called LW a cult incubator for a while now, and while the term has not caught on, it's nice to see more reporting on the problem that LW makes you more likely to join a cult.

https://www.aipanic.news/p/the-rationality-trap is the original link, for the people who don't like archive.is. I used the archive because I don't like Substack and want to discourage its use.

22
submitted 6 months ago* (last edited 6 months ago) by Soyweiser@awful.systems to c/sneerclub@awful.systems

As found by @gerikson here, more from the anti-anti-TESCREAL crowd. How the antis are actually R9PRESENTATIONALism. Ottokar expanded on their idea in a blog post.

Original link.

I have not read the bigger blog post yet, btw; I just assumed it would be sneerable and posted it here for everyone's amusement. Learn about your own true motives today. (This could be a troll, of course; boy, does he drop a lot of names and think that is enough to link things.)

E: alternative title: Ideological Turing Test, a critical failure

15
submitted 6 months ago* (last edited 6 months ago) by Soyweiser@awful.systems to c/sneerclub@awful.systems

Original title: 'What we talk about when we talk about risk'. The article explains medical risk and why the polygenic embryo selection people think about it the wrong way. Includes a mention of one of our Scotts (you know the one). Non-archived link: https://theinfinitesimal.substack.com/p/what-we-talk-about-when-we-talk-about

11
submitted 9 months ago* (last edited 9 months ago) by Soyweiser@awful.systems to c/sneerclub@awful.systems

Begrudgingly Yeast (@begrudginglyyeast.bsky.social) on bsky informed me that I should read this short story called 'Death and the Gorgon' by Greg Egan, as he has a good handle on the subjects we talk about. We have talked about Greg before on Reddit.

I was glad I did, so I'm going to suggest that more people do it. The only complaint you can have is that it gives no real 'steelman' airtime to the subjects it is being negative about. But well, he doesn't have to; he isn't the Guardian. Anyway, not going to spoil it, best to just give it a read.

And if you are wondering, did the lesswrongers also read it? Of course: https://www.lesswrong.com/posts/hx5EkHFH5hGzngZDs/comment-on-death-and-the-gorgon (Warning, spoilers for the story)

(Note: I'm not sure this PDF was intended to be public; I did find it on Google, but it might not be meant to be accessible this way.)

12
submitted 2 years ago* (last edited 2 years ago) by Soyweiser@awful.systems to c/sneerclub@awful.systems

The interview itself

Got the interview via Dr. Émile P. Torres on twitter

Somebody else sneered: 'Makings of some fantastic sitcom skits here.

"No, I can't wash the skidmarks out of my knickers, love. I'm too busy getting some incredibly high EV worrying done about the Basilisk. Can't you wash them?"'

https://mathbabe.org/2024/03/16/an-interview-with-someone-who-left-effective-altruism/

3

Some light sneerclub content in these dark times.

Eliezer compliments Musk on the creation of Community Notes (a project which predates the takeover of Twitter by a couple of years; see the join date: https://twitter.com/CommunityNotes ).

In reaction, Musk admits he never read HPMOR, and he suggests a watered-down Turing test involving HPMOR.

Eliezer invents HPMOR wireheads in reaction to this.

