[-] [email protected] 1 points 2 months ago

> We should be trying to stop this from coming to pass with the urgency we would try to stop a killer asteroid from striking Earth. Why aren’t we?

Wait, what are we trying to stop from coming to pass? Superintelligent AIs? Either I'm missing his point, or he really agrees with the doomers that LLMs are on their way to becoming "superintelligent".

[-] [email protected] -1 points 1 year ago

I wouldn't know anything about the thread, as it's impossible for me to read without a twitter account. Yet another reason why the site is trash.

But by all means, go on generating content for a bigoted fascist who will use everything you write to increase engagement with his platform and give it undeserved credibility and $$$.

(And no, I won't block you.)

[-] [email protected] 0 points 2 years ago

One of the easiest ways to get downvoted on the orange site is to say anything even mildly critical of Scott Alexander Siskind. It's really amusing how much respect there is for him there.

[-] [email protected] 1 points 2 years ago

I see him more as a dupe than a Cassandra. I heard him on a podcast a couple months ago talking about how he's been having conversations with Bay Area AI researchers who are "really scared" about what they're creating. He also spent quite a bit of time talking up Geoffrey Hinton's AI doomer tour. So while I don't think Ezra's one of the Yuddite rationalists, he's clearly been influenced by them. Given his historical ties to effective altruism, this isn't surprising to me.

[-] [email protected] 1 points 2 years ago

My attention span is not what it used to be, and I couldn't force myself to get to the end of this. A summary or TL;DR from the original author would have been helpful.

What is it with rationalists and their inability to write with concision? Is there a gene for bloviation that also predisposes them to the cult? Or are they all just mimicking Yud's irritating style?

[-] [email protected] 0 points 2 years ago

What's it like to be so good at PR?

[-] [email protected] 0 points 2 years ago

That reminds me. If the world is about to FOOM into a kill-all-humans doomscape, why is he wasting time worrying about seed oils?

[-] [email protected] 0 points 2 years ago

Random blue check spouts disinformation about "seed oils" on the internet. Same random blue check runs a company selling "safe" alternatives to seed oils. Yud spreads this huckster's disinformation further. In the process he reveals his autodidactically obtained expertise in biology:

> Are you eating animals, especially non-cows? Pigs and chickens inherit linoleic acid from their feed. (Cows reprocess it more.)

Yes, Yud, because that's how it works. People directly "inherit" organic molecules totally unmetabolized from the animals they eat.

I don't know why Yud is fat, but armchair sciencing probably isn't going to fix it.

[-] [email protected] 1 points 2 years ago* (last edited 2 years ago)

Now that his alter ego has been exposed, Hanania is falling back on the "stupid things I said in my youth" chestnut. Here's a good response to that.

[-] [email protected] 1 points 2 years ago* (last edited 2 years ago)

In theory, a prediction market can work. The idea is that even though many uninformed people are placing bets, their bad predictions tend to cancel each other out, while the subgroup of experts within the crowd converges on a good prediction. The problem is that prediction markets only work under ideal conditions. As soon as the bettor pool is skewed by a biased subpopulation, they stop working. And that's exactly what happens with the rationalist crowd. The main benefit rationalists get from prediction markets and wagers is an unfounded confidence that their ideas have merit. Prediction markets also have a long history in libertarian circles, which probably helps explain why rationalists are so keen on them.
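
To illustrate the error-cancellation point with a toy example (my own sketch, with made-up numbers, not data from any real prediction market): independent, zero-mean noise largely averages out over a big crowd, but a correlated bias shared by even part of the bettor pool drags the whole estimate with it.

```python
import random

# Toy illustration of the "errors cancel out" argument. All values here
# (true probability, noise level, bias size) are invented for the example.

TRUE_PROBABILITY = 0.30   # hypothetical true chance of the event being bet on
N_BETTORS = 10_000

def crowd_estimate(biased_fraction: float, bias: float) -> float:
    """Average many noisy guesses; some fraction of bettors share a bias."""
    guesses = []
    for i in range(N_BETTORS):
        noise = random.gauss(0.0, 0.15)  # independent, zero-mean error
        shared_bias = bias if i < biased_fraction * N_BETTORS else 0.0
        guesses.append(TRUE_PROBABILITY + noise + shared_bias)
    return sum(guesses) / N_BETTORS

# Unbiased crowd: independent errors mostly cancel, estimate lands near 0.30.
print(round(crowd_estimate(0.0, 0.0), 3))

# Skewed pool: 40% of bettors share a +0.3 bias, and the "market" moves with them.
print(round(crowd_estimate(0.4, 0.3), 3))
```

The averaging step is the whole "wisdom of crowds" mechanism; the second call shows how little it takes for a like-minded subpopulation to turn the market price into a measure of its own bias.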

