this post was submitted on 29 Jun 2023
250 points (100.0% liked)
Reddit Migration
### About Community
Tracking and helping #redditmigration to Kbin and the Fediverse. Say hello to the decentralized and open future. To see the latest reddit blackout info, see here: https://reddark.untone.uk/
founded 1 year ago
you are viewing a single comment's thread
Excuse my ignorance, but how were you able to recognize the bots?
The repost bots were fairly easy to spot, but I sadly never found a situation like the one you're describing. I don't use reddit anymore, but the information may be useful elsewhere.
Not the guy you were asking, but the ones I found were blatantly obvious because they would copy and reword info specific to the user they stole it from. Like "as a conservative, I wholly support what Ron DeSantis is doing in Florida" changed to "as an unwed teen-aged mother ..." kind of thing. Eventually though the bots are gonna get too good to spot I bet
It's a bit like finding a single thread and unravelling it.
I used to get dozens of these things banned a day, there were a lot of us bot hunters reporting bots.
They sometimes sound "off", stop in mid sentence, reply to people as if they think it's the OP, reply as if they are OP, or post 💯 by itself. Or they have a username that fits a recent bot pattern (e.g. appending "rp" to existing usernames)
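That username pattern is easy to check mechanically. A minimal sketch, assuming you already have a set of known usernames to compare against (the function name, the `rp`/`_rp` suffixes, and the sample names are all illustrative, not any real tool's API):

```python
import re

def looks_like_clone_name(candidate: str, known_users: set) -> bool:
    """Heuristic only: flag a username that is an existing username
    with a short bot-style suffix like 'rp' tacked on. 'rp' is one
    pattern seen at the time; bot farms rotate these suffixes."""
    m = re.fullmatch(r"(.+?)(rp|_rp)", candidate)
    return bool(m and m.group(1) in known_users)

known = {"quietbadger", "stacktrace_fan"}
print(looks_like_clone_name("quietbadgerrp", known))  # existing name + "rp"
print(looks_like_clone_name("quietbadger", known))    # the original account
```

A suffix check like this only catches one generation of bots, which is why the human hunters described below kept adjusting to new patterns.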
If you see one slip up once, then looking at its other comments will often lead you to new bots simply because they are all attracted to the same positions (prominent but a few comments deep).
Certain subs like AITA and r/memes are more prone to them so I would go there for easy leads.
Also, if you check its actual submissions, a karma-laden bot will often repost hobby content, then have a second bot come along claiming to have bought a t-shirt or mug with that content on it and post a malicious link. Then a third bot will pose as another redditor replying "thanks, I just ordered one" to the second bot. Following those bots leads you to even more bots, etc.
@XiELEd copying you in here.
It makes you wonder if a ChatGPT bot could be automated to flag all these accounts. I'm sure that Reddit could have tagged and deleted the lot of them if they wanted to.
There must be such a lot of them. Accounts get sold on third party websites.
Christ. This is not the future I envisioned.
To add to what other people said: As a casual user who didn't go deliberately looking for bots, I mostly caught them when they posted a comment that was a complete non sequitur to the comment they replied to, like they were posted in the wrong thread. Which, well, is because they were--they were copied from elsewhere in the comment section and randomly posted as a reply to a more prominent thread. Ctrl+F came in very handy there. (They do sometimes reword things, but generally only a couple of words, so searching for bits and pieces of their comment still usually turns up results.)
Also, the bot comments I caught were usually just a line or two, not entire paragraphs, even if they were copied from a longer comment.
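The Ctrl+F trick above can be sketched as a fuzzy string comparison: compare a new reply against earlier comments in the thread and flag near-duplicates. A rough illustration using Python's standard `difflib`; the 0.85 threshold is an assumption I picked to tolerate the couple of reworded words these bots typically change, not a tested value:

```python
from difflib import SequenceMatcher

def find_likely_copies(new_comment, earlier_comments, threshold=0.85):
    """Return earlier comments that are near-identical to new_comment.
    Mirrors the manual Ctrl+F search: slight rewording still matches
    because the similarity ratio stays high."""
    return [c for c in earlier_comments
            if SequenceMatcher(None, new_comment.lower(), c.lower()).ratio() >= threshold]

thread = [
    "I sadly never found a situation like the one you describe.",
    "Totally unrelated comment about cats.",
]
copied = "I sadly never found a situation like the one you're describing."
print(find_likely_copies(copied, thread))
```

This only works within one comment section, same as the manual search; it says nothing about comments lifted from older threads.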
The past year or so, they've been in every single thread with more than 50 comments. If you expand the comments and do a little ctrl+f searching, you'll see how they copy comments from users and then repost and have their fellow bots upvote them for visibility. Look at the timestamps on the posts.