I feel like neither the article nor the HN comments reflect actual experience with Anubis. Especially the recent posts from Codeberg about their "AI"-induced downtime show the clear need for it.
From the link inside the article:
At the time of writing, Anubis has 11,508 GitHub stars. (The blog author is using stars to approximate the number of Anubis deployments in the wild.)
It looks like we can test about 2^21 hashes every second, perhaps a bit more if we used both SMT sibling cores. This amount of compute is simply too cheap to even be worth billing for.
So (11508 websites * 2^16 sha256 operations) / 2^21, that’s about 6 minutes to mine enough tokens for every single Anubis deployment in the world. That means the cost of unrestricted crawler access to the internet for a week is approximately $0.
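To make that arithmetic concrete, here is a minimal sketch of the calculation. The figures (11,508 deployments, ~2^16 SHA-256 attempts per challenge, 2^21 hashes per second on a single core) come straight from the quoted article; the rest is just bookkeeping.

```python
# Back-of-the-envelope cost of solving the Anubis proof-of-work
# for every known deployment, using the figures quoted above.

deployments = 11_508          # GitHub stars used as a proxy for deployments
hashes_per_challenge = 2**16  # expected SHA-256 attempts per Anubis token
hash_rate = 2**21             # hashes per second on a single core

total_hashes = deployments * hashes_per_challenge
seconds = total_hashes / hash_rate

print(f"total hashes:   {total_hashes:,}")
print(f"time on 1 core: {seconds:.0f} s (~{seconds / 60:.1f} minutes)")
```

Running this gives roughly 360 seconds, i.e. the "about 6 minutes" cited above.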
The point was that it's trivial to defeat. You'd be better off putting hidden links in the index page and, if something followed them, adding that client's IP to fail2ban.
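For illustration only, here is a rough sketch of that honeypot idea, not the commenter's actual setup: a link no human ever sees (and that robots.txt disallows) is served from the index page, and any client that requests it gets logged so that fail2ban, or any firewall script, can ban it. The path and log file names are made up.

```python
# Hypothetical honeypot server: /trap is linked invisibly from the index page
# and disallowed in robots.txt, so only a crawler that ignores both requests it.
from http.server import BaseHTTPRequestHandler, HTTPServer

TRAP_PATH = "/trap"
BAN_LOG = "/var/log/honeypot-bans.log"   # a fail2ban filter could watch this file

INDEX = b'<html><body>hello<a href="/trap" style="display:none">.</a></body></html>'

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == TRAP_PATH:
            # Record the offending IP; fail2ban (or a cron job) can then ban it.
            with open(BAN_LOG, "a") as f:
                f.write(f"honeypot hit from {self.client_address[0]}\n")
            self.send_response(403)
            self.end_headers()
        else:
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(INDEX)

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), Handler).serve_forever()
```

A fail2ban jail would then just need a failregex matching "honeypot hit from <HOST>" over that log file to ban the offending addresses.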