Most bots and scrapers I've seen are already using (headless) full browsers, and hence executing JavaScript, so I think anything that slows them down or increases their cost can reduce the traffic they generate.
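For context, a minimal sketch of what such a headless-browser scraper might look like, assuming Puppeteer (the URL is a placeholder). The point is that the page's JavaScript, including any challenge script a site serves, runs for the bot exactly as it would for a human visitor:

```typescript
import puppeteer from "puppeteer";

(async () => {
  // Launch a full Chromium instance without a visible window.
  const browser = await puppeteer.launch({ headless: true });
  const page = await browser.newPage();

  // Any JavaScript on the page (including a proof-of-work or other
  // challenge script) executes here, just like in a normal browser,
  // so the bot pays whatever CPU cost the site imposes.
  await page.goto("https://example.com", { waitUntil: "networkidle0" });

  const html = await page.content();
  console.log(html.length);

  await browser.close();
})();
```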
Source? I strongly disagree. It's not hard to change your browser characteristics to get a new canvas fingerprint every time; some browsers, like Firefox, even have built-in options for it.
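As a rough illustration of what gets randomized: a canvas fingerprint is typically a hash of pixel data the browser renders, which varies subtly with GPU, driver, and font stack. Anti-fingerprinting modes (e.g. Firefox's privacy.resistFingerprinting) add noise to those pixel reads, so the hash changes between sessions. A sketch, with arbitrary drawing commands chosen only to exercise text rendering:

```typescript
// Derive a canvas fingerprint: render some text/shapes, hash the result.
async function canvasFingerprint(): Promise<string> {
  const canvas = document.createElement("canvas");
  canvas.width = 200;
  canvas.height = 50;
  const ctx = canvas.getContext("2d")!;

  ctx.textBaseline = "top";
  ctx.font = "16px Arial";
  ctx.fillStyle = "#f60";
  ctx.fillRect(0, 0, 100, 25);
  ctx.fillStyle = "#069";
  // Rendering differs slightly per GPU/driver/font stack, which is
  // what makes the resulting hash usable as a fingerprint.
  ctx.fillText("fingerprint test", 2, 2);

  const data = new TextEncoder().encode(canvas.toDataURL());
  const digest = await crypto.subtle.digest("SHA-256", data);
  return Array.from(new Uint8Array(digest))
    .map((b) => b.toString(16).padStart(2, "0"))
    .join("");
}
```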
@refalo @sudo If Proof of Work gets widely adopted, I foresee a future where bot-running data centers can out-compute humans to visit sites, while old devices owned by users in poorer countries struggle for hours to compute the required task … Or is that fear misguided?
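For reference, the kind of challenge usually meant here is hashcash-style: the client must find a nonce whose hash meets a difficulty target, which the server verifies with a single hash. A rough sketch (the difficulty scheme is assumed, not taken from any particular deployment); solving takes on the order of 16^difficulty hashes on average, which is exactly the asymmetry the comment worries about, since a data-center CPU clears it far faster than a decade-old phone:

```typescript
// Hashcash-style proof of work: find a nonce so that
// SHA-256(challenge + nonce) starts with `difficulty` zero hex digits.
async function solve(challenge: string, difficulty: number): Promise<number> {
  const prefix = "0".repeat(difficulty);
  for (let nonce = 0; ; nonce++) {
    const data = new TextEncoder().encode(challenge + nonce);
    const digest = await crypto.subtle.digest("SHA-256", data);
    const hex = Array.from(new Uint8Array(digest))
      .map((b) => b.toString(16).padStart(2, "0"))
      .join("");
    // Solving is brute force; the server only needs one hash to verify.
    if (hex.startsWith(prefix)) return nonce;
  }
}
```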