I’ve had the most luck using ChatGPT to troubleshoot my existing code. I lean more towards creative coding, and if I give it my source code and a casual explanation of the issue, it can often explain how to change things the way I want.
I’ve relied on it a lot less for code generation and found it much more useful as a tutor for concepts I can then rework myself. I haven’t spent much time with Copilot, since most of my projects aim for uncommon goals.
Where I’ve found it less useful for code generation is when I get caught in a loop: it tries an approach I’m not familiar with, I feed the errors back and hope it can solve them on its own, but it rarely can.
I don’t code professionally, but based on what I’ve experienced, I’d hesitate to use it for anything running in production.