Ask Lemmy
A Fediverse community for open-ended, thought-provoking questions
Please don't post about US Politics.
Rules:
1) Be nice and have fun
Doxxing, trolling, sealioning, racism, and toxicity are not welcome in AskLemmy. Remember what your mother said: if you can't say something nice, don't say anything at all. In addition, the site-wide Lemmy.world terms of service also apply here. Please familiarize yourself with them.
2) All posts must end with a '?'
This is sort of like Jeopardy. Please phrase all post titles in the form of a proper question ending with a '?'.
3) No spam
Please do not flood the community with nonsense. Suspected spammers will be banned on sight. No astroturfing.
4) NSFW is okay, within reason
Just remember to tag posts with either a content warning or a [NSFW] tag. Overtly sexual posts are not allowed; please direct them to either [email protected] or [email protected].
NSFW comments should be restricted to posts tagged [NSFW].
5) This is not a support community.
It is not a place for 'how do I?' type questions.
If you have any questions regarding the site itself or would like to report a community, please direct them to Lemmy.world Support or email [email protected]. For other questions, check our partnered communities list or use the search function.
Reminder: The terms of service apply here too.
Partnered Communities:
Logo design credit goes to: tubbadu
Talked about this a few times over the last few weeks but here we go again...
I am teaching myself to write and had been using ChatGPT for super basic grammar assistance. Seemed like an ideal use: toss a sentence I was iffy about into it and ask what it thought. After all, I wasn't going to be asking it some college-level shit. A few days ago I asked it about something I was unsure of. I honestly can't remember the details, but it completely ignored the part of the sentence I wasn't sure about and told me something else was wrong. What it said was wrong was just... not wrong. The 'correction' it gave me was some shit a third grader would look at and say 'uhhhhh... I'm gonna ask someone else now.'
That's because LLMs aren't intelligent. They're just parrots that repeat what they've heard before. Selling this stuff as "AI" with any "intelligence" is extremely misleading and is causing people to think it can do things it can't.
Case in point, you were using it and trusting it until it became very obvious it was wrong. How many people never get to that point? How much has it done wrong before then? Etc.
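To make the "parrot" point concrete, here's a deliberately tiny sketch in Python. It's my own toy illustration, not how ChatGPT actually works (real LLMs are transformer networks trained on vastly more data): a bigram model that can only recombine word pairs it has already seen, with no concept of whether a sentence is grammatically correct, only of what tends to follow what.

```python
import random
from collections import defaultdict

# Toy "parrot": learn which word follows which in a tiny training text,
# then generate by repeatedly picking a word that has been seen after
# the current one. There is no notion of grammar or correctness here,
# only "what usually comes next".

training_text = "the cat sat on the mat the cat ate the fish"

follows = defaultdict(list)
words = training_text.split()
for current, nxt in zip(words, words[1:]):
    follows[current].append(nxt)

def generate(start: str, length: int = 8) -> str:
    """Generate text by chaining words that followed each other in training."""
    out = [start]
    for _ in range(length):
        options = follows.get(out[-1])
        if not options:  # nothing ever followed this word in training
            break
        out.append(random.choice(options))
    return " ".join(out)

print(generate("the"))  # e.g. "the cat sat on the mat the cat ate"
```

Real models are enormously more capable than this, but the failure mode above is the same basic shape: the system produces likely-looking text rather than checking the specific thing you asked about.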