this post was submitted on 23 Nov 2023
226 points (99.6% liked)

the_dunk_tank


Literally just mainlining marketing material straight into whatever’s left of their rotting brains.

[–] [email protected] 23 points 1 year ago (2 children)

I don't know where everyone is getting these in-depth understandings of how and when sentience arises.

It's exactly the fact that we don't know how sentience forms that makes acting like fucking ChatGPT is on the brink of developing it so ludicrous. Neuroscientists don't even know how it works, so why are these AI hypemen so sure they've got it figured out?

The only logical answer is that they don't and it's 100% marketing.

Hoping that computer algorithms built to superficially mimic neural connections will somehow become capable of thinking on their own if they just get powerful enough is a complete shot in the dark.

[–] [email protected] 5 points 1 year ago* (last edited 1 year ago) (1 children)

The philosophy of this question is interesting, but if GPT-5 is capable of performing all intelligence-related tasks at an entry level across all jobs, it would not only wipe out a large chunk of the job market but also stop people from ever reaching senior positions, because the entry-level positions would be filled by GPT.

Capitalists don't have 5-10 years of forethought to see how this would collapse society. Even if GPT-5 isn't "thinking", it's its capabilities that will make a material difference. Even if it never reaches the level of advanced human thought, it's already spitting out a lot of unreliable information. Make it slightly more reliable and it'll be on par with entry-level humans in most fields.

So I think dismissing it as "just marketing" is too reductive. Even if you think it doesn't deserve rights because it's not sentient, it'll still fundamentally change society.

[–] [email protected] 4 points 1 year ago

So I think dismissing it as "just marketing" is too reductive.

And I think buying into the hype enough to say that LLMs are imminently going to match and outpace living organic brains in all of their functions is too credulous.

it'll still fundamentally change society

With the current capitalist system, and given who owns and commands that technology, it's changing society all right, for the worse.

[–] [email protected] 4 points 1 year ago (1 children)

The problem I have with this posture is that it dismisses AI as unimportant, simply because we don't know what we mean when we say we might accidentally make it 'sentient' or whatever the fuck.

Seems like the only reason anyone is interested in the question of AI sentience is to determine how we should regard it in relation to ourselves, as if we've learned absolutely nothing from several millennia of bigotry and exceptionalism. Shit's different.

Who the fuck cares if AI is sentient; it can be revolutionary or existential or entirely overrated independent of whether it has feelings or not.

[–] [email protected] 4 points 1 year ago

I don't really mean to say that LLMs and similar technology are unimportant as a whole. What I have a problem with is this kind of Elon Musk-style marketing, where company spokespeople and marketing departments make wild, sensationalist claims and hope everyone forgets about them in a few years.

If LLMs are to be handled in a responsible way, we need to have an honest dialogue about what they can and cannot do. The techbro mystification about superintelligence and sentience only obfuscates that.