this post was submitted on 07 Aug 2023
57 points (96.7% liked)

Lemmy.ca Support / Questions


Support / Questions specific to lemmy.ca.

For support / questions related to the lemmy software itself, go to [email protected]

founded 3 years ago

Right now, robots.txt on lemmy.ca is configured this way:

User-Agent: *
  Disallow: /login
  Disallow: /login_reset
  Disallow: /settings
  Disallow: /create_community
  Disallow: /create_post
  Disallow: /create_private_message
  Disallow: /inbox
  Disallow: /setup
  Disallow: /admin
  Disallow: /password_change
  Disallow: /search/
  Disallow: /modlog

Would it be a good idea privacy-wise to deny GPTBot from scraping content from the server?

User-agent: GPTBot
Disallow: /
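For what it's worth, the effect of that rule can be sanity-checked with Python's standard `urllib.robotparser` before deploying it (a quick sketch; the post path is made up for illustration):

```python
from urllib.robotparser import RobotFileParser

# The proposed addition: block GPTBot everywhere, leave other agents alone.
rules = [
    "User-agent: GPTBot",
    "Disallow: /",
]

parser = RobotFileParser()
parser.parse(rules)

# GPTBot is denied on every path...
print(parser.can_fetch("GPTBot", "/post/12345"))        # False
# ...while crawlers not named in the file are unaffected by this rule.
print(parser.can_fetch("SomeOtherBot", "/post/12345"))  # True
```

Note that this only checks the rule's syntax and scope; whether the crawler actually honors it is up to the crawler.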

Thanks!

all 22 comments
[–] [email protected] 21 points 1 year ago

Yes, please.

We can't stop LLM developers from scraping our conversations if they're determined to do so, but we can at least make our wishes clear. If they respect our wishes, then great. If they don't, then they'll be unable to plead ignorance, and our signpost in the road (along with those from other instances) might influence legislation as it's drafted in the coming years.

[–] [email protected] 19 points 1 year ago (1 children)

I'm on board for this, but I feel obliged to point out that it's basically symbolic and won't mean anything. Since all the data is federated out, they have a plethora of places to harvest it from, or more likely they'll just run their own ActivityPub harvester.

I've thrown a block into nginx so I don't need to muck with robots.txt inside the lemmy-ui container.

# curl -H 'User-agent: GPTBot' https://lemmy.ca/ -i
HTTP/2 403
[–] [email protected] 3 points 1 year ago

I imagine they rate limit their requests too so I doubt you'll notice any difference in resource usage. OVH is Unmetered* so bandwidth isn't really a concern either.

I don't think it will hurt anything but adding it is kind of pointless for the reasons you said.

[–] [email protected] 18 points 1 year ago (2 children)

Yes. Ban them.

if ($http_user_agent = "GPTBot") {
  return 403;
}
[–] [email protected] 6 points 1 year ago (3 children)

Probably want == instead, or else we will all be forbidden

[–] [email protected] 3 points 1 year ago* (last edited 1 year ago)

I would have thought so too, but == failed the syntax check

2023/08/07 15:36:59 [emerg] 2315181#2315181: unexpected "==" in condition in /etc/nginx/sites-enabled/lemmy.ca.conf:50

You actually want ~ though, because GPTBot is just a substring of the user agent, not the full string.
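Putting those two corrections together, a substring match in nginx would look something like this (a sketch; `~` is a case-sensitive regex match, use `~*` if you want it case-insensitive):

```nginx
# Match GPTBot anywhere in the User-Agent header, not just an exact match.
if ($http_user_agent ~ "GPTBot") {
    return 403;
}
```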

[–] [email protected] 2 points 1 year ago

Strangely, = works the same as == with nginx. It's a very strange config format...

https://nginx.org/en/docs/http/ngx_http_rewrite_module.html#if

[–] [email protected] 1 points 1 year ago

Look at me! I'm the GPTBot now!

[–] [email protected] 4 points 1 year ago

Thanks for empowering my laziness =)

[–] [email protected] 11 points 1 year ago

1000% yes. Please block them.

[–] [email protected] 8 points 1 year ago (1 children)

Is this even possible without all federated instances also prohibiting them?

[–] [email protected] 14 points 1 year ago

You take action where you can ;)

[–] [email protected] 6 points 1 year ago
[–] [email protected] 5 points 1 year ago (2 children)

Are they even respecting those files?

But yeah, sure, it's worth trying!

[–] [email protected] 3 points 1 year ago* (last edited 1 year ago)
[–] [email protected] 1 points 1 year ago

Worth trying for what reason?

[–] [email protected] 2 points 2 weeks ago

Yes, please prevent them from using our conversations.

[–] [email protected] 2 points 1 year ago (1 children)

Just out of curiosity, why is everyone so up in arms about this? I mean, sure, it's just another corp, but are there any other reasons?

[–] [email protected] 6 points 1 year ago (1 children)

Server load spent on a bot scraping our contributions, which are then used to make someone else money.

There's so much there that it's gonna offend someone.

[–] [email protected] 1 points 1 year ago

Wouldn't it just be scraped once (per company)? That doesn't sound like such a problem.

[–] [email protected] -2 points 1 year ago* (last edited 1 year ago)

No, definitely not. Our work posted in the open is done so because we want it to be open!

It is understandable that not all work wants to be open, but access would already be appropriately locked down for all robots (and humans!) who are not a member of the secret club in those cases. There is no need for special treatment here.