this post was submitted on 14 May 2025
942 points (96.4% liked)

Fuck AI

Source (Bluesky)

[–] [email protected] 0 points 1 week ago (2 children)

I think your manager has a skill issue if his output is coming out badly formatted like that. I'd tell him to include a formatting guideline in his prompt. It won't solve his issues, but I'll gain some favor. Just gotta make it clear I'm no damn prompt engineer. lol

[–] [email protected] 6 points 1 week ago (1 children)

I didn't think we should be using it at all, from a security standpoint. Let's run potentially business-critical information through the plagiarism machine that Microsoft has unrestricted access to. So I'm not going to attempt to help make its use better at all. Hopefully, if it's trash enough, it'll blow over once no one reasonable uses it. Besides, the man's derided by production operators and non-kool-aid-drinking salaried folk. He can keep it up. Lol

[–] [email protected] 0 points 1 week ago (1 children)

Okay, then self host an open model. Solves all of the problems you highlighted.

[–] [email protected] 4 points 1 week ago

Or, you know, don't use LLMs. That solves all those problems too, costs less, and won't hallucinate your way into lawsuits or whatever.

[–] [email protected] 3 points 1 week ago (1 children)

Nobody is a "prompt engineer". There is no such job, for all practical purposes, and there can't be one, given that the degenerative AI pushers change their models more often than healthy people change their underwear.

[–] [email protected] 3 points 1 week ago

Right, I just don't want him to think that, or he'd have me tailor the prompts for him and give him an opportunity to micromanage me.