Xe Iaso b640c567da
feat(lib): ensure that clients store cookies (#501)
* feat(lib): ensure that clients store cookies

If a client is misconfigured and does not store cookies, it can get
into a proof-of-work death spiral with Anubis. This fixes the problem
by setting a test cookie whenever the user gets hit with a challenge
page. If the test cookie is not present at challenge pass time, the
client is blocked. Administrators also get a log message explaining
that the user intentionally broke cookie support and that this
behavior is not an Anubis bug.
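
As a rough sketch of that flow, using hypothetical handler and cookie
names rather than the actual Anubis identifiers:

```go
// Minimal sketch of the test-cookie handshake, assuming hypothetical
// handler and cookie names; this is not Anubis's actual code.
package main

import (
	"log"
	"log/slog"
	"net/http"
)

const testCookieName = "anubis-test" // hypothetical name

// serveChallenge renders the challenge page and drops a throwaway test
// cookie so we can later tell whether the client stores cookies at all.
func serveChallenge(w http.ResponseWriter, r *http.Request) {
	http.SetCookie(w, &http.Cookie{
		Name:     testCookieName,
		Value:    "1",
		Path:     "/",
		HttpOnly: true,
		SameSite: http.SameSiteLaxMode,
	})
	w.Write([]byte("challenge page goes here"))
}

// passChallenge validates a solved challenge. If the test cookie never
// came back, the client is not storing cookies, so it is blocked and the
// operator gets a log line pointing at the client, not Anubis.
func passChallenge(w http.ResponseWriter, r *http.Request) {
	if _, err := r.Cookie(testCookieName); err != nil {
		slog.Warn("client does not store cookies; blocking",
			"remote", r.RemoteAddr, "user_agent", r.UserAgent())
		http.Error(w, "your client must store cookies", http.StatusForbidden)
		return
	}
	// ... verify the proof-of-work solution and set the real auth cookie ...
	w.Write([]byte("challenge passed"))
}

func main() {
	http.HandleFunc("/challenge", serveChallenge)
	http.HandleFunc("/pass-challenge", passChallenge)
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```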

Additionally, this ensures that clients being shown a challenge support
gzip-compressed responses by serving the challenge page at gzip
compression level 1. This level is intentionally chosen to keep the
CPU cost of compression as low as possible.
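
For illustration, a level-1 gzip response can be produced with Go's
standard compress/gzip package (gzip.BestSpeed is level 1); this snippet
is a standalone sketch, not code from this change:

```go
// Illustration of serving a response at gzip level 1 (gzip.BestSpeed)
// with the standard library; standalone example, not Anubis's code.
package main

import (
	"compress/gzip"
	"log"
	"net/http"
	"strings"
)

func challengePage(w http.ResponseWriter, r *http.Request) {
	body := []byte("<!doctype html><title>Checking your browser...</title>")

	// Clients shown a challenge are expected to support gzip, but only
	// compress when they actually advertise it.
	if !strings.Contains(r.Header.Get("Accept-Encoding"), "gzip") {
		w.Write(body)
		return
	}

	// Level 1 is the cheapest compression level, keeping CPU cost low.
	gz, err := gzip.NewWriterLevel(w, gzip.BestSpeed)
	if err != nil { // only happens for an out-of-range level
		http.Error(w, "internal error", http.StatusInternalServerError)
		return
	}
	defer gz.Close()

	w.Header().Set("Content-Encoding", "gzip")
	w.Header().Set("Content-Type", "text/html; charset=utf-8")
	gz.Write(body)
}

func main() {
	http.HandleFunc("/", challengePage)
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```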

The ClearCookie function is made more generic by taking the cookie
name as an argument. A corresponding SetCookie function was also added
to make it easier to set cookies.
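
Something in this spirit, where the exact signatures and cookie defaults
are assumptions rather than the functions added here:

```go
// Sketch of generic cookie helpers in the spirit described above; the
// real signatures and defaults in Anubis may differ.
package cookie

import (
	"net/http"
	"time"
)

// SetCookie sets a cookie with defaults suitable for a challenge flow.
func SetCookie(w http.ResponseWriter, name, value string, expiry time.Time) {
	http.SetCookie(w, &http.Cookie{
		Name:     name,
		Value:    value,
		Path:     "/",
		Expires:  expiry,
		HttpOnly: true,
		SameSite: http.SameSiteLaxMode,
	})
}

// ClearCookie expires the named cookie immediately. Taking the name as
// an argument lets the same helper clear the auth cookie, the test
// cookie, or anything else.
func ClearCookie(w http.ResponseWriter, name string) {
	http.SetCookie(w, &http.Cookie{
		Name:    name,
		Value:   "",
		Path:    "/",
		MaxAge:  -1,
		Expires: time.Unix(0, 0),
	})
}
```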

* chore(lib): clean up test code

Signed-off-by: Xe Iaso <me@xeiaso.net>

---------

Signed-off-by: Xe Iaso <me@xeiaso.net>
2025-05-16 13:03:40 -04:00

Anubis

A smiling chibi dark-skinned anthro jackal with brown hair and tall ears looking victorious with a thumbs-up


Sponsors

Anubis is brought to you by sponsors and donors like:

Distrust, Terminal Trove, canine.tools, and Weblate

Overview

Anubis weighs the soul of your connection using a proof-of-work challenge in order to protect upstream resources from scraper bots.
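
The proof of work is roughly of the hashcash flavor: the client must find a nonce whose hash meets a difficulty target, which costs the client CPU time but is cheap for the server to verify. The sketch below shows the general idea in Go; it is not Anubis's exact challenge format or difficulty encoding.

```go
// Generic hashcash-style proof of work, to illustrate the concept only;
// Anubis's actual challenge format and difficulty encoding differ.
package main

import (
	"crypto/sha256"
	"encoding/hex"
	"fmt"
	"strings"
)

// solve finds a nonce such that SHA-256(challenge + nonce) starts with
// `difficulty` hex zeroes. The client burns CPU here; the server only
// needs one hash to verify the answer.
func solve(challenge string, difficulty int) (nonce int, digest string) {
	prefix := strings.Repeat("0", difficulty)
	for {
		sum := sha256.Sum256([]byte(fmt.Sprintf("%s%d", challenge, nonce)))
		digest = hex.EncodeToString(sum[:])
		if strings.HasPrefix(digest, prefix) {
			return nonce, digest
		}
		nonce++
	}
}

// verify recomputes the single hash the server has to check.
func verify(challenge string, difficulty, nonce int) bool {
	sum := sha256.Sum256([]byte(fmt.Sprintf("%s%d", challenge, nonce)))
	return strings.HasPrefix(hex.EncodeToString(sum[:]), strings.Repeat("0", difficulty))
}

func main() {
	nonce, digest := solve("example-challenge", 4)
	fmt.Println("nonce:", nonce, "digest:", digest, "ok:", verify("example-challenge", 4, nonce))
}
```

A scraper that opens many fresh sessions has to pay that cost over and over, while a normal browser solves the challenge once and keeps the resulting cookie.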

This program is designed to help protect the small internet from the endless storm of requests that flood in from AI companies. Anubis is as lightweight as possible to ensure that everyone can afford to protect the communities closest to them.

Anubis is a bit of a nuclear response. This will result in your website being blocked from smaller scrapers and may inhibit "good bots" like the Internet Archive. You can configure bot policy definitions to explicitly allowlist them, and we are working on a curated set of "known good" bots to allow for a compromise between discoverability and uptime.

In most cases, you should not need this and can probably get by using Cloudflare to protect a given origin. However, for circumstances where you can't or won't use Cloudflare, Anubis is there for you.

If you want to try this out, connect to anubis.techaro.lol.

Support

If you run into any issues running Anubis, please open an issue. Please include all the information I would need to diagnose your issue.

For live chat, please join the Patreon and ask in the #anubis channel of the Patron Discord.

Star History

Star History Chart

Packaging Status

Packaging status

Contributors

Made with contrib.rocks.

Description
Weighs the soul of incoming HTTP requests using proof-of-work to stop AI crawlers
README · MIT license · 18 MiB
Languages
Go 87.4%
JavaScript 5.4%
Shell 4%
templ 1.9%
CSS 0.7%
Other 0.5%