This post was submitted on 02 Nov 2025

Hacker News


Posts from the RSS feed of Hacker News.

The feed sometimes contains ads and posts that have been removed by the mod team at HN.

2 comments
[–] mormund@feddit.org 3 days ago

I feel like neither the article nor the HN comments reflect actual experience with Anubis. The recent posts from Codeberg about their "AI"-induced downtime in particular show the clear need for it.

[–] ikidd@lemmy.world 2 days ago

From the link inside the article:

At the time of writing, Anubis has 11,508 GitHub stars. (The blog author is using stars to approximate the number of Anubis deployments in the wild.)

It looks like we can test about 2^21 hashes every second, perhaps a bit more if we used both SMT sibling cores. This amount of compute is simply too cheap to even be worth billing for.

So (11,508 websites × 2^16 SHA-256 operations) / 2^21 hashes per second works out to about 6 minutes to mine enough tokens for every single Anubis deployment in the world. That means the cost of unrestricted crawler access to the internet for a week is approximately $0.
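To make the quoted arithmetic concrete, here is a minimal sketch of an Anubis-style proof of work in Python. The 16-zero-bit difficulty is an assumption chosen to match the 2^16 figure above, and the challenge string and function names are illustrative rather than Anubis's actual wire format:

```python
import hashlib
import time

# Minimal sketch of an Anubis-style proof of work (illustrative, not
# Anubis's exact format): find a nonce such that SHA-256(challenge + nonce)
# starts with `zero_bits` zero bits. With 16 zero bits this takes about
# 2^16 attempts on average, matching the figure quoted above.
def solve(challenge: bytes, zero_bits: int = 16) -> int:
    target = 1 << (256 - zero_bits)  # digests below this have the leading zeros
    nonce = 0
    while True:
        digest = hashlib.sha256(challenge + str(nonce).encode()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

# Time one solve to estimate this machine's single-threaded hash rate,
# then redo the blog's arithmetic with the quoted numbers.
start = time.perf_counter()
nonce = solve(b"example-challenge")  # challenge string is made up
elapsed = time.perf_counter() - start

hash_rate = (nonce + 1) / elapsed    # hashes actually tried per second
total_hashes = 11_508 * 2 ** 16      # one token per known deployment
print(f"~{total_hashes / hash_rate / 60:.1f} min at {hash_rate:,.0f} hashes/s")
# At the blog's measured 2^21 hashes/s this comes out to roughly 6 minutes.
```

Even interpreted Python with hashlib typically lands within an order of magnitude of the blog's 2^21 hashes per second, which is the point: the work is far too cheap to deter a crawler.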

The point was that it's trivial. You'd be better off putting hidden links in the index page and, if something follows them, adding it to fail2ban.
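For illustration, here is a minimal sketch of that trap idea using only Python's standard library. The trap path, page markup, and log wording are hypothetical; in practice you would pair the logged line with a fail2ban filter and jail:

```python
# Minimal sketch of the hidden-link trap (trap path, markup, and port are
# hypothetical): real visitors never see the link, so any client requesting
# it is almost certainly a crawler, and a fail2ban filter matching the
# "TRAP HIT" log line can then ban the source IP.
from http.server import BaseHTTPRequestHandler, HTTPServer

TRAP_PATH = "/secret-archive/"  # also list this under Disallow: in robots.txt

PAGE = (
    "<html><body>"
    f'<a href="{TRAP_PATH}" style="display:none" rel="nofollow">.</a>'
    "<p>Real content here.</p>"
    "</body></html>"
)

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path.startswith(TRAP_PATH):
            # Lands in stderr with the client address; point fail2ban at it.
            self.log_error("TRAP HIT by %s", self.client_address[0])
            self.send_response(403)
            self.end_headers()
            return
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(PAGE.encode())

HTTPServer(("", 8080), Handler).serve_forever()
```

A browser rendering the page normally never requests the hidden path; a crawler that blindly follows every href does, and gets banned on its first visit.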