• MonkderVierte@lemmy.ml · 10 hours ago

    Btw, how about limiting clicks per second/minute as a defense against distributed scraping? A user who clicks more than 3 links per second is not a person, and neither is one who does 50 in a minute. And if they do get blocked and switch to the next IP, the bandwidth they can occupy is still limited.
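
    Roughly what I mean, as a sketch only (the limits, names, and in-memory storage are made up for illustration; in practice this would sit in the reverse proxy rather than the application):

```python
# Sketch of a per-client request limiter with two windows:
# at most 3 hits per second and 50 per minute.
import time
from collections import defaultdict, deque

PER_SECOND = 3
PER_MINUTE = 50

# client IP -> timestamps of its recent requests
hits = defaultdict(deque)

def allow(client_ip: str, now: float | None = None) -> bool:
    now = time.monotonic() if now is None else now
    q = hits[client_ip]
    # drop entries older than one minute
    while q and now - q[0] > 60:
        q.popleft()
    last_second = sum(1 for t in q if now - t <= 1)
    if last_second >= PER_SECOND or len(q) >= PER_MINUTE:
        return False  # over quota: block or throttle this client
    q.append(now)
    return True
```

    A deque of timestamps per client keeps the check cheap, and pruning anything older than a minute bounds the memory each client can take up.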

      • MonkderVierte@lemmy.ml · 9 hours ago

        Ah, one request from one IP, then the next IP does one, and so on, rotating? I mean, they don’t have unlimited addresses. Is there no way to group them together into an observable group and set quotas? I mean, for the purpose of defending against AI DDoS, not just to hurt them.
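
        Something like this is what I’m picturing for the grouping, as a rough sketch (the /24 prefix and the per-group quota are placeholders; grouping by ASN instead would catch wider rotation):

```python
# Sketch: count requests per /24 network instead of per single address,
# so a scraper rotating through a block of addresses still hits one
# shared quota. Numbers are placeholders.
import ipaddress
import time
from collections import defaultdict, deque

GROUP_PREFIX = 24              # aggregate clients by /24 subnet
GROUP_QUOTA_PER_MINUTE = 100   # shared budget for the whole subnet

group_hits = defaultdict(deque)

def group_key(client_ip: str) -> str:
    # map e.g. 203.0.113.57 -> "203.0.113.0/24"
    net = ipaddress.ip_network(f"{client_ip}/{GROUP_PREFIX}", strict=False)
    return str(net)

def allow_group(client_ip: str, now: float | None = None) -> bool:
    now = time.monotonic() if now is None else now
    q = group_hits[group_key(client_ip)]
    while q and now - q[0] > 60:
        q.popleft()
    if len(q) >= GROUP_QUOTA_PER_MINUTE:
        return False  # the whole subnet is over quota
    q.append(now)
    return True
```

        For IPv6 you’d probably want to aggregate on a much larger prefix, since individual clients commonly get a whole /64 or more.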