Anubis isn’t really about reclaiming the public internet, though: it’s about excluding some internet users. It has its reasons, of course, but it’s fundamentally about making the internet not a commons.


From my perspective, Anubis (and Iocaine, etc.) is about keeping misbehaving load generators from overwhelming small-scale "classic internet" sites. So yeah, it's exclusionary, like keeping semi trucks from taking shortcuts through a schoolyard.


It would probably be better if it excluded problematic behavior rather than excluding classes of clients.


Sure, but there's a very high correlation between classes of client and problematic behavior, and said classes of clients go to great lengths to mask their problematic behavior.
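For what it's worth, "excluding problematic behavior" would look something like throttling by observed request rate rather than by what the client claims to be. A rough sketch, with made-up window and budget numbers:

    import time
    from collections import defaultdict, deque

    WINDOW_SECONDS = 10        # hypothetical sliding window
    MAX_REQUESTS = 50          # hypothetical per-address budget in that window

    _history = defaultdict(deque)   # source address -> recent request timestamps

    def allow_request(addr: str) -> bool:
        now = time.monotonic()
        q = _history[addr]
        # drop timestamps that have aged out of the window
        while q and now - q[0] > WINDOW_SECONDS:
            q.popleft()
        if len(q) >= MAX_REQUESTS:
            return False    # behaving like a load generator: reject or challenge
        q.append(now)
        return True

The catch is exactly the masking problem above: a crawler that spreads its requests across many addresses never trips a per-address budget, so the bad behavior is hard to pin on anyone.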


For the purposes of analogy, imagine the commons as a park where you can have a picnic or play frisbee, and AI crawlers are people ripping it up on their dirt bikes.


While I get your point about Anubis excluding some users, its purpose is to protect the "commons" from those who would abuse and destroy it: big tech crawlers that do not respect robots.txt files.
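For anyone unfamiliar, "respecting robots.txt" just means checking a site's published crawl rules before fetching anything. A minimal sketch using Python's standard urllib.robotparser (the bot name and URLs are placeholders):

    import urllib.robotparser

    rp = urllib.robotparser.RobotFileParser()
    rp.set_url("https://example.com/robots.txt")
    rp.read()   # fetch and parse the site's crawl rules

    # a well-behaved crawler asks before every fetch and honors the answer
    url = "https://example.com/some/page"
    if rp.can_fetch("ExampleBot", url):
        delay = rp.crawl_delay("ExampleBot")   # may be None if unspecified
        ...   # fetch url, pausing `delay` seconds between requests
    else:
        ...   # skip it: the site has opted out

The complaint upthread is that the crawlers in question skip this check entirely, which is why sites reach for tools like Anubis instead.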



