14
submitted 1 month ago by [email protected] to c/[email protected]

Just for the record, I'm not suicidal.

29
submitted 6 months ago by [email protected] to c/[email protected]
34
submitted 6 months ago* (last edited 6 months ago) by [email protected] to c/[email protected]
101
submitted 1 year ago by [email protected] to c/[email protected]

The article doesn't mention SSC directly, but I think it's pretty obvious where this guy is getting his ideas

[-] [email protected] 45 points 1 year ago

When they made an alt-right equivalent of Patreon they called it "Hatreon". This stuff is like a game to them.

43
submitted 1 year ago by [email protected] to c/[email protected]

An old post from Caroline Ellison's tumblr, since deleted.

[-] [email protected] 15 points 1 year ago* (last edited 1 year ago)

Here is the document that mentions EA as a risk factor; some quotes below:

Fourth, the defendant may feel compelled to do this fraud again, or a version of it, based on his use of idiosyncratic, and ultimately for him pernicious, beliefs around altruism, utilitarianism, and expected value to place himself outside of the bounds of the law that apply to others, and to justify unlawful, selfish, and harmful conduct. Time and time again the defendant has expressed that his preferred path is the one that maximizes his version of societal value, even if it imposes substantial short term harm or carries substantial risks to others... In this case, the defendant’s professed philosophy has served to rationalize a dangerous brand of megalomania—one where the defendant is convinced that he is above the law and the rules of the road that apply to everyone else, who he necessarily deems inferior in brainpower, skill, and analytical reasoning.

[-] [email protected] 11 points 1 year ago

No, they're able to grasp the near-term risks; they just don't want that to get in the way of making money, because they know they're unlikely to be affected.

[-] [email protected] 11 points 1 year ago

Yudkowsky is pretty open about being a sexual sadist

54
submitted 2 years ago by [email protected] to c/[email protected]

I somehow missed this one until now. Apparently it was once mentioned in the comments on the old sneerclub but I don't think it got a proper post, and I think it deserves one.

12
submitted 2 years ago by [email protected] to c/[email protected]
20
submitted 2 years ago by [email protected] to c/[email protected]

From Sam Altman's blog, pre-OpenAI

[-] [email protected] 11 points 2 years ago

Looking forward to LW articles with titles like "Ashkenazification via engineered viruses as a solution for African poverty: here's why it might work"

[-] [email protected] 18 points 2 years ago* (last edited 2 years ago)

the cell’s ribosomes will transcribe mRNA into a protein. It’s a little bit like an executable file for biology.

Also, because mRNA basically has root level access to your cells, your body doesn’t just shuttle it around and deliver it like the postal service. That would be a major security hazard.

I am not saying pleiotropy doesn't exist. I'm saying it's not as big of a deal as most people in the field assume it is.

Genes determine a brain's architectural prior just as a small amount of python code determines an ANN's architectural prior, but the capabilities come only from scaling with compute and data (quantity and quality).

When you're entirely shameless about your Engineer's Disease

[-] [email protected] 11 points 2 years ago* (last edited 2 years ago)

Old news obviously, but I think it's worth documenting the organizations and dollar amounts

even Steven Pinker is coming out against EA now: https://twitter.com/sapinker/status/1732114240666743102

14
submitted 2 years ago by [email protected] to c/[email protected]
17
submitted 2 years ago by [email protected] to c/[email protected]

Image taken from this tweet: https://twitter.com/softminus/status/1732597516594462840

post title was this response: https://twitter.com/QuintusActual/status/1732615870613258694

Sadly the article is behind a paywall and I am loath to give Scott my money

15
submitted 2 years ago by [email protected] to c/[email protected]

I was wondering if someone here has a better idea of how EA developed in its early days than I do.

Judging by the link I posted, it seems like Yudkowsky used the term "effective altruist" years before Will MacAskill or Peter Singer adopted it. The link doesn't mention this explicitly, but Will MacAskill was also a lesswrong user, so it seems at least plausible that Yudkowsky is the true father of the movement.

I want to sort this out because I've noticed that recently a lot of EAs have been downplaying the AI and longtermist elements within the movement and talking more about Peter Singer as the movement's founder. By contrast, the impression I get about EA's founding, based on what I know, is that EA started with Yudkowsky and then MacAskill, with Peter Singer only getting involved later. Is my impression mistaken?

[-] [email protected] 14 points 2 years ago

I still find it amusing that Siskind complained about being "doxxed" when he used his real first and middle name.

[-] [email protected] 12 points 2 years ago* (last edited 2 years ago)

I highly suspect the voice analysis thing was just to confirm what they already knew; otherwise it would have been like looking for a needle in a haystack.

People on twitter have been speculating that someone who knew him simply ratted him out.

[-] [email protected] 25 points 2 years ago* (last edited 2 years ago)

“Our goal is really to increase the scope and scale of civilization as measured in terms of its energy production and consumption.”

old and busted: paperclip maximizer

new hotness: entropy maximizer

30
submitted 2 years ago by [email protected] to c/[email protected]

At various points, on Twitter, Jezos has defined effective accelerationism as “a memetic optimism virus,” “a meta-religion,” “a hypercognitive biohack,” “a form of spirituality,” and “not a cult.” ...

When he’s not tweeting about e/acc, Verdon runs Extropic, which he started in 2022. Some of his startup capital came from a side NFT business, which he started while still working at Google’s moonshot lab X. The project began as an April Fools joke, but when it started making real money, he kept going: “It's like it was meta-ironic and then became post-ironic.” ...

On Twitter, Jezos described the company as an “AI Manhattan Project” and once quipped, “If you knew what I was building, you’d try to ban it.”

[-] [email protected] 10 points 2 years ago* (last edited 2 years ago)

E/acc comes across to me as run-of-the-mill libertarianism dressed in sci-fi clothes, and the idea that it's in any way related to thermodynamics is bait.

[-] [email protected] 17 points 2 years ago

If anyone finds pictures of the wooden unaligned AI effigy they should post them.

