
Anubis

Sponsors
Anubis is brought to you by sponsors and donors like:
Diamond Tier

Gold Tier
Overview
Anubis is a Web AI Firewall Utility that weighs the soul of your connection using one or more challenges to protect upstream resources from scraper bots.
This program is designed to help protect the small internet from the endless storm of requests that flood in from AI companies. Anubis is as lightweight as possible to ensure that everyone can afford to protect the communities closest to them.
Anubis is a bit of a nuclear response. It will block smaller scrapers and may inhibit "good bots" like the Internet Archive. You can configure bot policy definitions to explicitly allowlist them, and we are working on a curated set of "known good" bots to allow for a compromise between discoverability and uptime.
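As a sketch, an allowlist entry in a bot policy file might look like the following. The field names and values here are illustrative assumptions; consult the bot policy documentation for the actual schema and defaults.

```yaml
bots:
  # Hypothetical entry letting the Internet Archive's crawler through.
  - name: internet-archive
    user_agent_regex: archive\.org_bot
    action: ALLOW
  # Everything else gets challenged before reaching the upstream.
  - name: everyone-else
    path_regex: .*
    action: CHALLENGE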
In most cases, you should not need this and can probably get by using Cloudflare to protect a given origin. However, for circumstances where you can't or won't use Cloudflare, Anubis is there for you.
If you want to try this out, connect to anubis.techaro.lol.
Support
If you run into any issues running Anubis, please open an issue. Please include all the information I would need to diagnose your issue.
For live chat, please join the Patreon and ask in the #anubis channel in the Patron Discord.
Star History
Packaging Status
Contributors
Made with contrib.rocks.