this post was submitted on 28 Sep 2024
126 points (100.0% liked)

TechTakes


Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community


daniellamyoung · 3h

Unpopular opinion: you only hate chat gpt because it makes it harder to stack rank and discriminate against people.

So what everyone can write well now? great it's a tool! Just like moving faster because you drive a car.

The good news is you'll be easily able to hire for that writing job you need. The bad news is you won't be able to discriminate against candidates who are not as good with the written word.

Also, an obsession with the written word is a tenant of white supremacy [salute emoji]

Ian Rennie
‪@theangelremiel.bsky.social‬

Man, this probably hits really hard if you're fuckin stupid.

[–] [email protected] 14 points 3 months ago (1 children)

Okay, show me a system that was trained only on data given with explicit permission (and, ideally, compensation) and I'll happily be fine with it.

But that isn't what these capitalists, tech obsessives, etc. have done. They take, take, take and give nothing back.

They neither understand nor care about consent; that's the crux of the issue.

I'd have no objection at all if the training data were consensual.

[–] [email protected] 15 points 3 months ago (1 children)

But even if there were an LLM trained only on ethically sourced data, it would still need massive amounts of energy for training and use, so until we're 100% renewable and the whole world gets as much of that energy as it needs ...

[–] [email protected] 7 points 3 months ago

This is fair. I was more responding to the point from the person in the picture, that we only object because people who lack the skills or perhaps the ability to write now can, when no, that's not really the problem.

But you do raise a good point.