115
submitted 5 days ago by [email protected] to c/[email protected]
top 32 comments
[-] [email protected] 91 points 5 days ago

One of the worst things about coding is having to pick apart someone else's broken code.

So why the fuck would I want to accelerate my work to THAT point?

You know why designers and PMs like AI code? Because they don't know what the fuck they are doing, they don't have to try to stitch that junk into 15 years of legacy code, and they don't have to debug that shit.

"Actually Darkard, I ran this request into GPT and it came back with this? It's only short and most of it has already been done here, so I think your story point estimate is wrong?"

Fuuuuuck oooooooffffffff

[-] [email protected] 31 points 5 days ago

It's okay, bro. Let it out.

[-] [email protected] 27 points 5 days ago

I just need Gareth to stop telling me that he knows how to code when he thinks a git push is what you do when you want grandpa's inheritance early.

[-] [email protected] 9 points 5 days ago

I’d much rather pick apart my own broken code.

[-] [email protected] 8 points 5 days ago* (last edited 5 days ago)

is having to pick apart someone else's broken code.

I agree, but also I do find that AI's broken code is generally waaay less annoying to pick apart than my colleagues' code. I'm not sure exactly why. Probably partly because it's better at commenting code and naming variables so it's easier to follow?

I think also partly it's because reviewing other people's code is usually done during code review, where you can't just directly edit the code to fix it - you have to start a conversation convincing them to do it differently. That's quite annoying and doesn't happen with AI generated code.

[-] [email protected] 11 points 5 days ago

I still don't understand why we're using humans to review AI code. Shouldn't AIs be reviewing the code?

We're letting AIs do the fun part (coding) and forcing humans to do the worst part: reviewing reams more janky code.

[-] [email protected] 7 points 4 days ago

AIs get the fun part of everything right now. AIs get writing, humans get editing. AIs get drawing, humans get fixing hands and details, etc., etc.

[-] [email protected] 4 points 5 days ago

AIs aren't smart enough yet. But plenty of people are also using them to review code.

[-] [email protected] 1 points 5 days ago

What are the results if you take code written by one brand of AI and then have another brand of AI review it? Like, use ChatGPT to write code, then ask Copilot if the generated code has any errors and will work as intended?

[-] [email protected] 4 points 4 days ago

I don't know, if I put my hand in a fan and then put that mutilated hand into another brand of fan, do you think that might fix it?

[-] [email protected] 2 points 4 days ago

Interesting idea, I've never tried that. I feel like it wouldn't be a silver bullet but you might get slightly better results I guess.

[-] [email protected] 3 points 4 days ago

Huh, I feel the complete opposite. Human-written code follows some sort of causality: for example, when you see complex code or a strange detail changed, the author had a reason to write it that way and likely tried other solutions first. With AI-generated code it feels a lot more like I have to rate each changed line in isolation, which is exhausting.

But yeah, I don't know, we don't typically do code reviews. So far I've only been in that situation when I had significantly more knowledge of the project and language, so there were rarely discussions beyond trying to teach them.

[-] [email protected] 1 points 5 days ago

I'm not a programmer so I don't know if this makes sense, but I wonder if it's easier to retool AI code because AI code is janky in a similar-ish way most of the time, while human code is janky in different ways all the time? Whadda ya think?

[-] [email protected] 5 points 5 days ago

I disagree with the premise.

AI is good at making things that LOOK right. Pictures. Words. Whatever. Actually makes errors harder to find IMO.

[-] [email protected] 2 points 5 days ago

Yeah definitely could be. I also think when AI gets things wrong it gets it so obviously wrong you have to delete it and do it yourself (and not worry about offending someone). It rarely seems to make the same kinds of trivial mistakes humans do (like copy/paste errors for example). It either does a pretty decent job that's easy to fix up, or it totally fails and you do it yourself.

[-] [email protected] 4 points 4 days ago* (last edited 4 days ago)

You know why designers and PMs like AI code? Because they don't know what the fuck they are doing,

I just want to highlight this for any designers or PMs reading along.

In the same breath, I want to invite my designer colleagues to try out this amazing designing script I wrote. It'll save them a ton of time, I bet. (This is sarcasm.)

I actually respect the difficulty of designers' jobs.

Even while many of them don't respect the difficulty of mine.

Oh well. I'll get paid either way, in the end, because this shit all breaks when it's done wrong.

[-] [email protected] 37 points 4 days ago

My bosses keep trying to get me to do this. They don't understand that the work they give me is complicated and niche. The chatbots do not help at all, ever. It's shit.

I don't understand why these numbskulls pay me if they think my job is so easy that a fucking chatbot can do it.

[-] [email protected] 6 points 3 days ago

They want you to model your work processes so the AI can duplicate you.

[-] [email protected] 15 points 4 days ago

They need to see if it can be done before they fire you. But if you're not trying, they can't start seeing. So you need to at least try, for them to see.

[-] [email protected] 6 points 4 days ago* (last edited 4 days ago)

I don't understand why these numbskulls pay me if they think my job is so easy that a fucking chatbot can do it.

Call me paranoid, but it feels like having professionals consistently brute force the output from these systems into something usable would provide a rich resource to improve them to the point that they can.

[-] [email protected] 2 points 4 days ago

Yeah, every so often I do want to give it a try, to see what's possible, but I immediately bounce off trying to explain my task in a prompt. There's so much context one needs to be aware of, I'd have to write a whole essay in there.

And that's when I'm lucky, when I myself know what needs to be done. More often than not, I start implementing and only then find out what the steps are. Finding that out upfront, so I could write it into a prompt, takes an insane amount of effort. Especially when I can't rely on the instructee thinking logically and catching my mistakes.

[-] [email protected] 12 points 4 days ago

Because some people think it's all about the prompts. If you get all of your highly talented, highly paid, irreplaceable people to write their jobs out as prompts so that the AI can do it, you can fire them and keep the prompts, then just have an entry-level person use those same prompts to do the same work...

Of course, that's the dream; reality is one hallucination away.

[-] [email protected] 13 points 5 days ago

Psycho manager.

[-] [email protected] 11 points 4 days ago

I tried to get ChatGPT to help me write FvwmScript a week or so ago, and literally all of the output it gave was hallucination; none of it was valid or even made any sense in context. It vaguely looked like FvwmScript, but that was the only similarity it had to real code.

[-] [email protected] 9 points 5 days ago

I used AI today for a short period, then gave it up because it kept giving me stupid ideas, and I just did it myself. I have yet to be impressed by literally anything other than the sinister abilities of its uses: as a surveillance tool, as a tool in the US nuclear program, and in the dragnet.

Even Palantir, Peter Thiel's little monster, is inaccurate and costly. His little tool scrapes Twitter and uses OSINT accounts, and it ended up getting a bunch of civilians killed. The person who owned the OSINT account donated money to charity because they felt guilty.

So coming from just a business perspective, from fiscal realities, it is clear that most of this AI bullshit is just a bunch of buzzwords and nonsense: a way for big business to lower the bar, lower standards, crush small business, and hollow out America. AI has yet to show me something I find valuable. The bubble is gonna pop.

[-] [email protected] 7 points 4 days ago

I hope that one day he orders a sandwich from a machine-generated app, and the machine puts allergens in his sandwich, and he dies, so we fucking understand the consequences of this fuckery with technology.

[-] [email protected] 1 points 3 days ago

AI: but sir you said no peanuts and tree nuts. This is just the peanuts. Have a good day!

[-] [email protected] 6 points 5 days ago

What is this "Warp"? I want to know so I can be sure not to use their software...

[-] [email protected] 5 points 5 days ago
[-] [email protected] 4 points 5 days ago

Hm, actually not the worst idea, but no way I would enter some AI-gibberish in the console.

[-] [email protected] 10 points 4 days ago

I don't understand why it's closed source. Who would ever use a closed-source terminal when there's a nearly infinite number of open ones?

[-] [email protected] 0 points 4 days ago

Neat idea. Idiot CEO.

this post was submitted on 05 Jun 2025
115 points (99.1% liked)
