swlabr

joined 2 years ago
[–] [email protected] 7 points 1 year ago

I absolutely agree.

[–] [email protected] 10 points 1 year ago

RE: your original comment. Reminded me of this LW post from half a year ago (discussed here)

RE: the followup edit. God, that is sad, and par for the course. Removed from context, I resonate with the youthful hopefulness of thinking you'll change the world, followed by the slightly less youthful hopelessness of realizing that changing the world in any meaningful way is much harder than the quoted post makes it sound. Staying in the orbit of LW, NRx, and other right/far-right corners of the blagosphere is definitely not setting oneself up for success.

Also, yes: in their attempts to moderate and elevate their level of discourse, they've hamstrung themselves in many ways, not least in being able to tell this dude to stop and get some help. It's like 10% of why they seem so humorless, self-serious, and unable to change (the other 90% is because they are humorless, self-serious, and unable to change).

[–] [email protected] 17 points 1 year ago (1 children)

Reasoning about future AIs is hard

“so let’s just theorycraft eugenics instead” is like 50% of rationalism.

[–] [email protected] 13 points 1 year ago* (last edited 1 year ago) (3 children)

I hate this phrase but this is “saying the quiet part out loud” in action.

[–] [email protected] 20 points 1 year ago* (last edited 1 year ago)

Nothing is sacred to anyone who is willing to consume or make this kinda thing. 100% of showcases of AI capability are just AIs copying something humans do. Sometimes it’s chess. Other times it’s copying Monet, or Van Gogh, or in this case, Carlin.

This is exactly the kind of thing that the WGA was striking against and what big media corporations want to have happen. As shown by some of the comments in this thread, there are people who are absolutely fine with facsimile as art. It’s all bad and I hate it. I especially hate how nostalgia for the classics is gonna drive this.

[–] [email protected] 10 points 1 year ago* (last edited 1 year ago)

Best case, in their inability to recognise that they’re already in a cult, they create a schism that eats up the rat community once and for all. Unfortunately rats seem to be schism-resistant tho.

I mean what’s a sex cult without a little sects?

[–] [email protected] 6 points 1 year ago* (last edited 1 year ago) (3 children)

Ugh.

In order to make this kind of thing, here’s what you’ve gotta do:

  1. Write a bunch of jokes in the style of George Carlin
  2. Record someone performing the jokes with delivery similar to Carlin’s
  3. Train an AI on Carlin’s voice
  4. Use the AI to swap Carlin’s voice onto the recording (rough sketch below)

3 and 4 are just a matter of time with any celebrity.
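For the morbidly curious, here’s roughly what 3 and 4 can look like with off-the-shelf tooling. A minimal sketch using Coqui’s open-source XTTS v2 voice-cloning model; the library choice and file names are my assumptions, not whatever the actual special used:

```python
# Minimal voice-cloning sketch (steps 3 and 4), assuming Coqui TTS is installed:
#   pip install TTS
from TTS.api import TTS

# XTTS v2 does zero-shot voice cloning: instead of training a model from
# scratch, you hand it a short reference clip of the target voice.
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

tts.tts_to_file(
    # Stand-in for step 1: a joke "in the style of" the target comedian.
    text="Here's a bit written in the style of, but legally distinct from, a dead comedian.",
    # Hypothetical file: a clean recording of the voice being cloned.
    speaker_wav="carlin_reference.wav",
    language="en",
    file_path="fake_special_line.wav",
)
```

Delivery, timing, and crowd work are exactly the parts a reference clip doesn’t give you, which is why step 2 (a human performing it) still does most of the heavy lifting.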

Given that Carlin is one of the most influential standups of all time, a lot of comedians are already doing 1 and 2, in some sense. I’d love to romanticise comedy and say that most of the people doing this are doing it to hone their craft rather than to make this kind of cheap shit, but that’s not the case. I love comedy and listen to enough of it to know that if comedians aren’t being hacky or cringe, they are probably doing the most soul-crushing, cynical, humiliating things in the name of humor instead. So this is kind of par for the course.

[–] [email protected] 7 points 1 year ago

This is a great article, not only as a primer on the ways in which our thinking about the future of technology is flawed, but also as a nuanced approach to speculation that is pessimistic but not doomful (oops, just invented a word). Thanks for finding and posting.

[–] [email protected] 3 points 1 year ago

24b, sort of. The problem came down to “hey, do you remember how to do linear algebra?” and the answer was: dawg, I barely know normal algebra. I had solved the vibes of the problem, but none of the details.

[–] [email protected] 3 points 1 year ago (2 children)

Sorry for the necropost: I have completed all the problems! One of them completely stumped me and I had to cheat. Not going to do a writeup unless requested :)

[–] [email protected] 11 points 1 year ago

Sociological Claim: the extent to which a prominence-weighted sample of the rationalist community has refused to credit the Empirical or Philosophical Claims even when presented with strong arguments and evidence is a reason to distrust the community’s collective sanity.

Zack my guy you are so fucking close. Also just fucking leave.

[–] [email protected] 13 points 1 year ago (1 children)

Here’s how I sorta think about it, which might be a bit circular: I think the long content is a gullibility filter of two kinds. First, it selects for people who are willing to slog through all of it, eat it up, and defend their choice in doing so. Second, it selects for people who like the broad-strokes ideas and don’t want to read all the content, but are able to pretend as if they have.

The first set of people are like Scientologists sinking into deeper and deeper levels of lore. The second group are like the actors on the periphery of Scientology groups trying to network.
