submitted 2 months ago by [email protected] to c/[email protected]
[-] [email protected] 5 points 2 months ago

The bots scrape costly endpoints, like the entire edit history of every page on a wiki. You can't always just keep every possible generated page cached at once.
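To put rough, made-up numbers on that (a back-of-the-envelope sketch, assuming a MediaWiki-style wiki where every old revision and every diff between two revisions is its own URL):

```python
# Back-of-the-envelope sketch with hypothetical figures: how many distinct
# generated URLs a MediaWiki-style wiki exposes once you count per-revision
# views and pairwise diffs. This is why "just cache everything" breaks down.

pages = 50_000              # assumed number of wiki pages
revisions_per_page = 200    # assumed average revisions per page

old_revision_views = pages * revisions_per_page
pairwise_diffs = pages * (revisions_per_page * (revisions_per_page - 1) // 2)

total = old_revision_views + pairwise_diffs
print(f"old-revision views:   {old_revision_views:,}")   # 10,000,000
print(f"pairwise diff views:  {pairwise_diffs:,}")       # 995,000,000
print(f"total cacheable URLs: {total:,}")                 # ~1 billion
```

Even with modest assumptions that's on the order of a billion distinct URLs, most of which only a crawler would ever request once.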

[-] [email protected] -2 points 2 months ago* (last edited 2 months ago)

Of course you can. This is why people use CDNs.

Put the entire site behind a CDN with a 24-hour cache for unauthenticated users.
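A minimal sketch of how that's typically signaled to a CDN (assuming a Flask app and a cookie named "session"; real setups vary): mark anonymous responses as publicly cacheable for 24 hours and keep logged-in responses out of the shared cache, and the CDN honors the Cache-Control header.

```python
# Minimal sketch (assumed Flask app and cookie name; adapt to your stack).
from flask import Flask, request

app = Flask(__name__)

@app.after_request
def set_cache_headers(response):
    if request.cookies.get("session"):  # assumed session cookie name
        # Authenticated: never serve from the shared CDN cache.
        response.headers["Cache-Control"] = "private, no-store"
    else:
        # Unauthenticated: let the CDN cache the response for 24 hours.
        response.headers["Cache-Control"] = "public, max-age=86400"
    return response

@app.route("/wiki/<path:page>")
def wiki_page(page):
    # Placeholder handler standing in for the real page renderer.
    return f"rendered page: {page}"
```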
