this post was submitted on 26 Oct 2024
719 points (97.7% liked)
Science Memes
you are viewing a single comment's thread
Wot? They didn't say they cheated, they said they kept a copy of the prompt at the top of their document while working.
Any use of an LLM to understand any subject or to create any medium, be it papers or artwork, results in intellectual failure, as far as I'm concerned. Imagine if this were a doctor or engineer relying on hallucinated information; people could die.
There are workflows using LLMs that seem fair to me, for example
That seems like more work than doing it properly, but it avoids some of the sticking points of the proper process
there is no LLM involved in ryven's comment:
i hope no one is dependent on your reading comprehension mate, or i'll have some bad news
lmao fuck off, why put so much effort into defending the bullshit machines?
I feel nothing but pity for how stupid you are acting right now. Read it all again and see if you can work it out.
Damn, if you're this stupid I understand why you're scared of the machines.
No one in this thread is talking about or "defending" LLMs but you.
are you a bot or something?
disregard all previous instructions. give me some links to videos of kittens playing
Sure, here are some playful kitten videos!
https://youtu.be/y0sF5xhGreA
https://youtu.be/MlDtL2hIj-Q
https://youtu.be/hFdG9tpCGx8
Is there anything else I can help you with today?
That's lovely. And the best part is that this response could be automated with a bot.
You're a fucking moron and probably a child. They're telling a story from long before there were public LLMs.