[-] Soyweiser@awful.systems 3 points 14 hours ago* (last edited 13 hours ago)

freshwater

This reminded me of a few old comic stories where eventually the robot/computer turned out to be partially running on blood.

(One of them was a Judge Dredd one where they had vampire robots who iirc used the blood to keep a president alive in suspended animation. Snap, Crackle and Pop; it had a surprisingly wholesome ending for a Dredd comic).

[-] Soyweiser@awful.systems 1 points 15 hours ago

Think it is a geek/nd failure mode tbh.

I have also done something like that in some messy situations, and it was eye-opening when people just told me 'dude, I don't want to be involved, and thus don't care'. Obv a bit different from this case (the people here do seem to care), but it made me realize I was stupidly trying to explain myself to people, hoping that would help with the issue/the emotional fallout from that issue.

Hell, part of this 'people will see my side if I explain it' attitude is one of the reasons some instances got defederated.

[-] Soyweiser@awful.systems 6 points 15 hours ago* (last edited 15 hours ago)

Lol, yeah, didn't check awful for a day and this escalated. Well done to all the mods for keeping us informed about what is going on btw.

What a great example from the db0 crew of how not to react when caught in a controversy.

[-] Soyweiser@awful.systems 3 points 2 days ago

"We made GoldTrumpCoin just for you."

[-] Soyweiser@awful.systems 5 points 2 days ago

Ow god, Vox Day. That dweeb.

[-] Soyweiser@awful.systems 7 points 2 days ago

the phrase “magic dirt” sounds real familiar.

Same for me, but I also can't recall the reference.

[-] Soyweiser@awful.systems 6 points 2 days ago

Yeah, I saw Galloway described as, let's keep it civil, a big opportunist.

[-] Soyweiser@awful.systems 8 points 2 days ago

A quick glance at segfault's reactions suggests to me that he operates on 'if I just explain it enough, people will agree with my side, and if they don't, they have not properly heard all the facts'. He (the dox people dropped seems to imply male pronouns) also seems to really begrudge friends/people he knows irl for disagreeing with him. Which doesn't seem the healthiest place to be in a conflict like this.

What a shitshow; always sad to see somebody have an online episode like that. (As an outsider I obv have no idea what is going on, and I'm not going to dig into all that.)

[-] Soyweiser@awful.systems 9 points 2 days ago

How is this possible? We included 'do not treat sarcasm as serious' in the prompt.

Damn prompt goblins again!

[-] Soyweiser@awful.systems 17 points 3 days ago

Turns out sneerclub is the superpredictor. 10/10 on calling 'this is a bad idea'.

[-] Soyweiser@awful.systems 6 points 4 days ago

I enjoyed a different type of takedown than our more usual method of pointing out the weird nested assumptions.

[-] Soyweiser@awful.systems 8 points 4 days ago* (last edited 4 days ago)

it has become painfully clear that there is some sort of intrinsic value to the truth of the art, of the experience that creates it, to the backstory that connects it to reality.

This has been a thing for a while now, even before gen AI in gaming: for a lot of people there was often a preference for playing with others, as in various games any NPC AI or randomly generated events (like Skyrim's dynamic quests) were seen as less interesting than real human interactions/created things.

Think this is a typo btw:

Hidu pantheon

The argument goes that we're seeing an exponential increase in the rate of technological development.

Also interesting to note that the evidence so far is at most N=1.

131
submitted 1 month ago* (last edited 1 month ago) by Soyweiser@awful.systems to c/sneerclub@awful.systems

As we don't have a top level post about this already (nor on reddit), I thought why not make one. Archive.is

Extremely likely the guy was a lesswronger, or at least radicalized by that sort of thinking.

But not much else seems to be known as far as I can tell. Corbin also posted about the HN reactions in the stubsack.

And remember, no fed posting.

Edit: looks like his house also got shot at. Archive (after the speculation in this thread, makes you wonder if this was a follow-up false flag, as the bottle didn't break last time).

15

Via reddits sneerclub. Thanks u/aiworldism.

I have called LW a cult incubator for a while now, and while the term hasn't caught on, it's nice to see more reporting on the problem that LW makes you more likely to join a cult.

https://www.aipanic.news/p/the-rationality-trap is the original link for the people who don't like archive.is. I used the archive because I don't like substack and want to discourage its use.

22
submitted 9 months ago* (last edited 9 months ago) by Soyweiser@awful.systems to c/sneerclub@awful.systems

As found by @gerikson here, more from the anti anti TESCREAL crowd. How the antis are actually R9PRESENTATIONALism. Ottokar expanded on their idea in a blog post.

Original link.

I have not read the bigger blog post yet btw, just assumed it would be sneerable and posted it here for everyone's amusement. Learn about your own true motives today. (This could be a troll of course; boy does he drop a lot of names and think that is enough to link things.)

E: alternative title: Ideological Turing Test, a critical failure

15
submitted 9 months ago* (last edited 9 months ago) by Soyweiser@awful.systems to c/sneerclub@awful.systems

Original title 'What we talk about when we talk about risk'. The article explains medical risk and why the polygenic embryo selection people think about it the wrong way. Includes a mention of one of our Scotts (you know the one). Non-archived link: https://theinfinitesimal.substack.com/p/what-we-talk-about-when-we-talk-about

11
submitted 1 year ago* (last edited 1 year ago) by Soyweiser@awful.systems to c/sneerclub@awful.systems

Begrudgingly Yeast (@begrudginglyyeast.bsky.social) on bsky informed me that I should read this short story called 'Death and the Gorgon' by Greg Egan, as he has a good handle on the subjects/subjects we talk about. We have talked about Greg before on Reddit.

I was glad I did, so I'm going to suggest that more people here do it. The only complaint you could have is that it gives no real 'steelman' airtime to the subjects/subjects it is being negative about. But well, he doesn't have to; he isn't the Guardian. Anyway, not going to spoil it, best to just give it a read.

And if you are wondering, did the lesswrongers also read it? Of course: https://www.lesswrong.com/posts/hx5EkHFH5hGzngZDs/comment-on-death-and-the-gorgon (Warning, spoilers for the story)

(Note I'm not sure this pdf was intended to be public; I did find it on google, but it might not be meant to be accessible this way.)

12
submitted 2 years ago* (last edited 2 years ago) by Soyweiser@awful.systems to c/sneerclub@awful.systems

The interview itself

Got the interview via Dr. Émile P. Torres on twitter

Somebody else sneered: 'Makings of some fantastic sitcom skits here.

"No, I can't wash the skidmarks out of my knickers, love. I'm too busy getting some incredibly high EV worrying done about the Basilisk. Can't you wash them?"

https://mathbabe.org/2024/03/16/an-interview-with-someone-who-left-effective-altruism/

3

Some light sneerclub content in these dark times.

Eliezer compliments Musk on the creation of community notes. (A project which predates the takeover of twitter by a couple of years; see the join date: https://twitter.com/CommunityNotes ).

In reaction, Musk admits he never read HPMOR and suggests a watered-down Turing test involving HPMOR.

Eliezer invents HPMOR wireheads in reaction to this.

