submitted 2 weeks ago* (last edited 2 weeks ago) by [email protected] to c/[email protected]

It's impossible. I set up this instance just to browse Lemmy from my own instance, but no, it was slow as hell the whole week. I got new pods, put postgres on a different pod, pictrs on another, etc.

But it was still slow as hell. I didn't know what the cause was until a few hours ago: 500 GETs in a MINUTE from ClaudeBot and GPTBot. Wth is this? Why? I blocked those user agents with a blocking extension for NGINX and now it works.
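For anyone wanting to do the same without an extension, plain nginx can do user-agent blocking with a `map`. A minimal sketch (the UA patterns and the 403 response are illustrative; match them against what actually shows up in your access logs):

```nginx
# In the http {} context: classify requests by User-Agent.
map $http_user_agent $blocked_ua {
    default        0;
    ~*ClaudeBot    1;   # Anthropic's crawler
    ~*GPTBot       1;   # OpenAI's crawler
}

server {
    # ... existing Lemmy proxy config ...

    # Reject classified bots before they reach the app.
    if ($blocked_ua) {
        return 403;
    }
}
```

This only stops bots that identify themselves honestly in the User-Agent header, which is why the thread below also discusses caching and challenge proxies.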

WHY? So Google can tell you to eat glass?

Life is hell now. If before at least anyone could put a website online, now even that is painful.

Sorry for the rant.

all 35 comments
[-] [email protected] 90 points 2 weeks ago* (last edited 2 weeks ago)

You can enable Private Instance in your admin settings, which means only logged-in users can see content. This will prevent AI scrapers from slowing down your instance, since all they'll see is an empty homepage, so no DB calls. As long as you're on 0.19.11, federation will still work.

[-] [email protected] 43 points 2 weeks ago

Enabled, thanks for the tip!

[-] [email protected] 53 points 2 weeks ago

At some point they're going to try to evade detection to continue scraping the web. The cat and mouse game continues except now the "pirates" are big tech.

[-] [email protected] 33 points 2 weeks ago* (last edited 2 weeks ago)

They already do. ("They" meaning AI generally, I don't know about Claude or ChatGPT's bots specifically). There are a number of tools server admins can use to help deal with this.

See also:

[-] [email protected] 13 points 2 weeks ago

These solutions have the side effect of making the bots stay on your site longer and generate more traffic. It's not for everyone.

[-] [email protected] 2 points 2 weeks ago

https://zadzmo.org/ is already dead, and Ars Technica is writing about them, so...

[-] [email protected] 39 points 2 weeks ago

Use Anubis. That's pretty much the only thing you can do against bots that they have no way of circumventing.

[-] [email protected] 15 points 2 weeks ago

Yeah, going to install it this week, but the nginx extension seemed to solve the issue.

[-] [email protected] 17 points 2 weeks ago

Patience, the AI bubble will burst soon.

[-] [email protected] 8 points 2 weeks ago
[-] [email protected] 2 points 2 weeks ago* (last edited 2 weeks ago)

It won't crash soon, sorry Charlie. Maybe in like 2 to 5 years, but honestly I don't think there will ever be a "crash", just fewer AI buzzwords in everything.

[-] Mwa 14 points 2 weeks ago* (last edited 2 weeks ago)

You can use either Cloudflare (proprietary) or Anubis (FOSS).

[-] [email protected] -1 points 2 weeks ago
[-] Mwa 9 points 2 weeks ago
[-] [email protected] 3 points 2 weeks ago

Because it harms marginalized folks' ability to access content while also letting evil corp (and their fascist government) view (and modify) all encrypted communication with your site and its users.

It's bad.

[-] [email protected] 1 points 2 weeks ago

For clarity, you are referring to Cloudflare and not Anubis?

[-] [email protected] 1 points 2 weeks ago

I am referring to CF, but I would expect Anubis to be the same if it provides DoS fronting.

[-] [email protected] 4 points 2 weeks ago

Anubis works in a very different way than Cloudflare.

[-] [email protected] 1 points 2 weeks ago

How well does it work in tor browser in strict mode?

[-] [email protected] 9 points 2 weeks ago

So I just had a look at your robots.txt:

User-Agent: *
Disallow: /login
Disallow: /login_reset
Disallow: /settings
Disallow: /create_community
Disallow: /create_post
Disallow: /create_private_message
Disallow: /inbox
Disallow: /setup
Disallow: /admin
Disallow: /password_change
Disallow: /search/
Disallow: /modlog
Crawl-delay: 60

You explicitly allow bots to crawl your content (everything except those few paths)... That's likely one of the reasons why you get bot traffic.
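If you want to at least signal a ban, you can add per-bot stanzas for the crawlers named in this thread. A sketch (compliance is voluntary, and as pointed out below, many AI crawlers ignore robots.txt entirely):

```
User-Agent: GPTBot
Disallow: /

User-Agent: ClaudeBot
Disallow: /
```

Treat this as documentation of intent, not enforcement; pair it with blocking at the proxy if the traffic continues.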

[-] [email protected] 7 points 2 weeks ago

AI crawlers ignore robots.txt. The only way to get them to stop is with active counter measures.

[-] [email protected] 9 points 2 weeks ago

Cloudflare has pretty good protection against this, but I totally understand not wanting to use Cloudflare

[-] [email protected] 4 points 2 weeks ago* (last edited 2 weeks ago)

Just cache. Read-only traffic should add negligible load to your server. Otherwise you're doing something horribly wrong.

[-] [email protected] 5 points 2 weeks ago

They're pods with 1 CPU and 1 GB of RAM; postgres goes to 100% CPU at 500 requests per minute. After I added the NGINX extension, it dropped to at most 10%. On weaker servers these bots create hell on earth; it's not the config.

[-] [email protected] 5 points 2 weeks ago

If it's hitting postgres it's not hitting the cache. Do you have a caching reverse proxy in front of your web application?

[-] [email protected] 1 points 2 weeks ago* (last edited 2 weeks ago)

I don't have a cache, but the problem is solved now, I can browse Lemmy haha.

[-] [email protected] 5 points 2 weeks ago

The nginx instance you have in front of your app can perform caching and avoid hitting your app. The advantage is that it will improve performance even against the most stealthy of bots, including those that don't even exist yet. The disadvantage is that the AI scum get what they want.
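A minimal sketch of what that caching could look like (zone name, paths, and timings are illustrative, and the `jwt` cookie check assumes Lemmy's login cookie; verify against your setup before relying on it):

```nginx
# In the http {} context: a small on-disk cache for anonymous reads.
proxy_cache_path /var/cache/nginx/lemmy levels=1:2
                 keys_zone=lemmy_cache:10m max_size=1g inactive=10m;

server {
    location / {
        proxy_pass http://lemmy_ui;
        proxy_cache lemmy_cache;
        proxy_cache_valid 200 1m;        # serve cached responses for 60s
        proxy_cache_use_stale updating;  # absorb bursts while refreshing
        # Skip the cache for logged-in users so they see fresh content.
        proxy_cache_bypass $http_authorization $cookie_jwt;
    }
}
```

Even a 60-second TTL collapses 500 identical bot GETs per minute into roughly one backend hit per cached URL.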

[-] [email protected] 2 points 2 weeks ago

Oh, cool. I'm going to look at it!

[-] [email protected] 2 points 2 weeks ago

If that doesn't work for you, also look at Varnish and Squid.

[-] [email protected] 1 points 2 weeks ago

Load should be near zero for reads.

[-] [email protected] 1 points 2 weeks ago

I highly recommend using Anubis as a proxy for your entire instance. It's a little complicated to get going, but it stops any and all AI scrapers with a denial of access. Having a robots.txt works, but only so much, because some of these bots do not respect it. And, honestly, with the way Sam Altman talks about the people he's stolen and scraped from, I don't think anyone should be surprised.

But, I have Anubis running on my personal website and I've tested to see if ChatGPT can see it, and it cannot. Good enough for me

this post was submitted on 08 Jun 2025
215 points (98.2% liked)

Fuck AI

3238 readers
871 users here now

"We did it, Patrick! We made a technological breakthrough!"

A place for all those who loathe AI to discuss things, post articles, and ridicule the AI hype. Proud supporter of working people. And proud booer of SXSW 2024.

founded 1 year ago
MODERATORS