Solving the is-ought problem is super easy when you change what "ought" means.
sinedpick
even technical communities use discord now. It's SO fucking over.
The first footnote makes me want to give myself a lobotomy with a no. 2 pencil:
I wish to note here that Richard took this “as evidence that John would fail an intellectual turing test for people who have different views than he does about how valuable incremental empiricism is”. Of course I couldn’t just ignore an outright challenge to my honor like that, so I wrote a brief reply which Richard himself called “a pretty good ITT”.
If this guy doesn't masturbate his successful polemicizing every 1.5 paragraphs, he'll go into septic shock.
More like "People want things and hurt if they don't get them. Also, look at me saying things like utility function! Function is math! Math is smart. I am smart! Isn't that so cool?"
Phase 1: (January 2024) Sell to biohackers in Prospera for $20,000.
Huh? oh. This prospera. https://www.astralcodexten.com/p/prospectus-on-prospera
Does anyone have any updates on wtf is even going on there? Is it actually the libertarian utopia that I was promised? Or is it just a place for people with nice comfy remote jobs to live large while exploiting cheap Honduran labor?
It would do you a lot of good to actually read about communism and political theory in general instead of acting as a conduit of brain rot.
Absent from this analysis: reduction in QALYs resulting from perfectly good men being denied a child bride by some evil meddling NGO. Will someone please think of the poor men?
I'm actually somewhat surprised that the rationalists didn't bring this up.
I have nothing interesting to say about the article, but I got a kick out of what orange site thinks:
https://news.ycombinator.com/item?id=38221178
Why does someone who writes great sci-fi suddenly have social capital to weigh in on industry and politics, two things firmly outside of his wheelhouse?
How absolutely dare someone comment about the perceived impact of their work?
but at what point did our hatred of capitalists (note: I don't hate capitalists) decide to overshadow our, you know, lifelong lust for the stars?
* pauses sentence to perform a quick act of fellatio *
Every time I read a technologist's screed against Musk or Bezos or Zuckerberg (three people whose combined lifetime works do not even scratch a fraction of the economic value incinerated by the US military in 40 weeks) all I can see is sour grapes and ad hominem.
Maybe take off your Musk-sperm-tinted glasses then?
These people did not create nor perpetuate the attributes of the dystopia you claim to reside in (that was the CIA). (It's also not actually a dystopia, or anything resembling one; ask any of the two billion people lifted out of dirt poverty (largely due to technology!) in the last three decades.)
No no no, it wasn't a system of misaligned incentives and lack of accounting of negative externalities that has created the dystopic world we live in today, it was the CIA! Wait, it's not actually a dystopia!
The old planet will go to hell in its own way from its own inhabitants. I'd rather live in space where it's safer. (Also, how cool would it be to escape before Earth is finally fully conquered? This would mean that humans as a species successfully avoid a total hierarchy.)
[The forces that are destroying the planet]
[The people trying to get to space]
They're the same picture.
This is hard for me to digest because I have a lot of respect for Karpathy. His famous article on the potential of LSTMs as language models absolutely blew my mind. The innovations that led to LLMs stand on his shoulders, so how can he have such a poor take here?
I had a long sneer typed out but my shitty Lemmy app deleted my draft. Fuck the orange site and its sophists. The whole prison subthread is probably one of the most disgusting things I've read from that shit hole in recent memory.
ow fuck! my toe!
What happened with SBF will happen with an AI given a similar target, in terms of having misalignments that start out tolerable but steadily grow worse as capabilities increase and you face situations outside of the distribution, and things start to spiral to places very far from anything you ever would have intended.
Ah yes, one day someone will accidentally install the "I'm sorry, I can't let you do that Hal" plugin. Oops, I let the nuke launch AI override all of our control mechanisms, silly me!
I fucking hate x-risk people so much.
More details on the rust thing? I can't find it by searching keywords you mentioned but I must know.