Such a stupid title, great software!
Btw, how about limiting clicks per second/minute to counter distributed scraping? A user who clicks more than 3 links per second is not a person. Nor are they if they do 50 in a minute. And if they're then blocked and switch to the next IP, the bandwidth they can occupy is still limited.
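Roughly what I mean, as a sketch (the 3-per-second and 50-per-minute thresholds are the ones above; the key and function names are just made up for illustration, and as the reply below points out, keying it on a single IP is exactly the weak spot):

```python
import time
from collections import defaultdict, deque

WINDOW_1S, LIMIT_1S = 1.0, 3      # no more than 3 requests in any one second
WINDOW_60S, LIMIT_60S = 60.0, 50  # no more than 50 requests in any minute

history = defaultdict(deque)  # client_key -> timestamps of recent requests

def allow(client_key: str, now: float | None = None) -> bool:
    now = time.monotonic() if now is None else now
    q = history[client_key]
    # Forget anything older than the longest window we care about.
    while q and now - q[0] > WINDOW_60S:
        q.popleft()
    last_second = sum(1 for t in q if now - t <= WINDOW_1S)
    if last_second >= LIMIT_1S or len(q) >= LIMIT_60S:
        return False  # over quota: block, throttle, or hand off to the tarpit
    q.append(now)
    return True
```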
They make one request per IP. Rate limiting per IP does nothing.
Ah, one request, then the next IP makes one, and so on, rotating? I mean, they don't have unlimited addresses. Is there no way to group them into an observable group and set quotas? I mean for the purpose of defending against AI DDoS, not just for hurting them.
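One common way to build that kind of "observable group" is to key the quota on something coarser than a single address, e.g. a /24 prefix, or the ASN the address belongs to. A hedged sketch under that assumption (the prefix length and function name are illustrative; an ASN lookup would need an external dataset, not shown here):

```python
import ipaddress

def quota_key(ip: str, prefix: int = 24) -> str:
    # Collapse a single address into its surrounding /24 (hypothetical choice;
    # IPv6 would want a much shorter prefix, e.g. /48).
    return str(ipaddress.ip_network(f"{ip}/{prefix}", strict=False))

# Reusing allow() from the sketch above, but keyed per prefix instead of per IP:
# allow(quota_key("203.0.113.7"))  # counts against the whole 203.0.113.0/24
```

That way a scraper rotating through one provider's range shares a single budget instead of getting a fresh one per address.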
The Ars Technica article: AI haters build tarpits to trap and trick AI scrapers that ignore robots.txt
AI tarpit 1: Nepenthes
AI tarpit 2: Iocaine
Thanks for the links. The more I read of this, the more based it is.
Wait… I just had an idea.
Make a tarpit out of subtly reprocessed copies of classified material from Wikileaks. (And don't host it in the US.)
Nice one, but Cloudflare does it too.
The Ars Technica article in the OP is about two months newer than Cloudflare's tool.
Really cool