this post was submitted on 09 Nov 2023
9 points (100.0% liked)
TechTakes
Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.
This is not debate club. Unless it’s amusing debate.
For actually-good tech, you want our NotAwfulTech community
you are viewing a single comment's thread
relatedly, I recently had to work on a project in a language I've barely touched before. also happened to notice that apparently I have free copilot "because of my open source contributions" (??? whatever)
so I tried it out
it was one of the use cases people keep telling me it's ideal for. unfamiliar territory! a new language! something I have little direct experience in but could possibly navigate by leaning on my other knowledge and using this to accelerate!
well, it managed to perform just about exactly to my expectations! I didn't really try hard to take down numbers, but I'd say easily 30%+ of suggestions had simple (and possibly subtle?) errors, and 60%+ were completely wrong. the mechanic for the latter is the same bullshit they push with prompts: "choose the one you like best", which in practice means cycling through the completion suggestions
observed things like logic inversions and incorrect property references, both of which are errors I don't know whether a learning-to-program person (someone using it in the "this is magical! I can just make it type code for me!" sense) would be able to catch without some amount of environment tooling or zen debugging (and the latter only if/when they get into the code-reading mindset). at multiple points, even when I provided extremely detailed prompts about what to generate, it would just fail to synthesise working biz logic
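to illustrate the kind of errors I mean, here's a made-up python-ish sketch (not the actual language or code from the project, and not copilot's literal output, just the shape of the failures):

```python
# hypothetical examples of the two failure modes

# asked for: "return only the active users"
def active_users(users):
    # logic inversion: the condition is flipped, so this returns the inactive ones
    return [u for u in users if not u["is_active"]]

# asked for: "get the display name from the profile"
def display_name(profile):
    # incorrect property reference: the intended field is "display_name",
    # but this grabs a similarly-named field and returns the wrong value
    return profile["username"]
```

neither of those trips a syntax check; you only catch them by actually reading the code, or via tests/debugging, which is the whole problem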
and that was all for just simple syntax and language stuff. I didn't even try to do things with libraries or such. I'm gonna bet that my previous guesses are also all fairly on point
all in all: underwhelming. I remain promptdubious.
Yeah, that always perplexed me. Copilot is a terrible tool if you're using it in a domain you aren't already proficient in, because it will mess up. And even worse, it'll make subtle mistakes that most people wouldn't make themselves. You need to know what you're doing to babysit it and ensure it doesn't fuck everything up.