
Exactly. I don't understand what computation you can afford to do in 10 seconds on a small number of cores that bots running in large data centers cannot.


The point of Anubis isn't to make the scraping impossible, but to make it more expensive.


By how much? I don't understand the cost model here at all.


AIUI the idea is to rate-limit each "solution". A normal human's browser only needs to "solve" once. An LLM crawler either needs to slow down (= objective achieved) or solve the puzzle n times to get n × the request rate.
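
For concreteness, here's a minimal sketch of the kind of find-a-nonce puzzle this implies, assuming a SHA-256 proof-of-work challenge (the challenge string and difficulty below are made up for illustration, not Anubis's actual parameters):

    import hashlib
    import itertools
    import time

    def solve(challenge: str, difficulty: int) -> tuple[int, float]:
        """Find a nonce such that SHA-256(challenge + nonce) starts with
        `difficulty` hex zeros; return the nonce and the wall-clock time spent."""
        target = "0" * difficulty
        start = time.time()
        for nonce in itertools.count():
            digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
            if digest.startswith(target):
                return nonce, time.time() - start

    # Hypothetical values: each extra hex zero multiplies expected work by 16.
    nonce, elapsed = solve("example-challenge", 5)
    print(f"nonce={nonce} found in {elapsed:.1f}s")

Under that model the cost scales with identities, not pages: if one solution costs ~10 seconds of CPU and the resulting token is rate-limited to r requests, fetching N pages faster than the rate limit allows costs on the order of N/r solutions' worth of compute per crawler identity, while a human pays the 10 seconds once.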


Let's say that adding Anubis does the job of adding 10 seconds of extra compute for the bot when it tries to access my website. Will this be enough to deter the bot/scraper?


Empirical evidence appears to show that it is ¯\_(ツ)_/¯



