Mirror of https://github.com/TecharoHQ/anubis.git, synced 2025-08-03 09:48:08 -04:00.

* feat: add robots2policy CLI utility to convert robots.txt to Anubis challenge policies
* feat: add documentation for robots2policy CLI tool
* feat: implement crawl delay handling as weight adjustment in Anubis rules
* feat: add various robots.txt and YAML configurations for user agent handling and crawl delays
* test: add comprehensive tests for robots2policy conversion and parsing
* fix: update example URL in usage instructions for robots2policy CLI
* Update metadata check-spelling run (pull_request) for json/robots2policycli
  Signed-off-by: check-spelling-bot <check-spelling-bot@users.noreply.github.com>
  on-behalf-of: @check-spelling <check-spelling-bot@check-spelling.dev>
* docs: add crawl delay weight adjustment and deny user agents option to robots2policy CLI
* Update cmd/robots2policy/main.go
  Co-authored-by: Xe Iaso <me@xeiaso.net>
  Signed-off-by: Jason Cameron <jasoncameron.all@gmail.com>
* Update cmd/robots2policy/main.go
  Co-authored-by: Xe Iaso <me@xeiaso.net>
  Signed-off-by: Jason Cameron <jasoncameron.all@gmail.com>
* fix(robots2policy): use sigs.k8s.io/yaml
  Signed-off-by: Xe Iaso <me@xeiaso.net>
* feat(config): properly marshal bot policy rules
  Signed-off-by: Xe Iaso <me@xeiaso.net>
* chore(yeetfile): expose robots2policy in libexec
  Signed-off-by: Xe Iaso <me@xeiaso.net>
* fix(yeetfile): put robots2policy in $PATH
  Signed-off-by: Xe Iaso <me@xeiaso.net>
* Update metadata check-spelling run (pull_request) for json/robots2policycli
  Signed-off-by: check-spelling-bot <check-spelling-bot@users.noreply.github.com>
  on-behalf-of: @check-spelling <check-spelling-bot@check-spelling.dev>
* style: reorder imports
* refactor: use preexisting structs in config
* fix: correct flag check in main function
* fix: reorder fields in AnubisRule struct for better alignment
* style: improve alignment of struct fields in AnubisRule and OGTagCache
* Update metadata check-spelling run (pull_request) for json/robots2policycli
  Signed-off-by: check-spelling-bot <check-spelling-bot@users.noreply.github.com>
  on-behalf-of: @check-spelling <check-spelling-bot@check-spelling.dev>
* fix: add validation for generated Anubis rules from robots.txt
* feat: add batch processing for robots.txt files to generate Anubis CEL policies
* fix: improve usage message and error handling for input file requirement
* refactor: update AnubisRule structure to use ExpressionOrList for improved expression handling
* refactor: reorganize policy definitions in YAML files for consistency and clarity
* fix: correct indentation in blacklist and complex YAML files for consistency
* test: enhance output comparison in robots2policy tests for YAML and JSON formats
* Revert "fix: improve usage message and error handling for input file requirement"
  This reverts commit ddcde1f2a326545d3ef2ec32e5e03f55f4f931a8.
* fix: improve usage message and error handling in robots2policy
  Signed-off-by: Jason Cameron <git@jasoncameron.dev>

---------

Signed-off-by: check-spelling-bot <check-spelling-bot@users.noreply.github.com>
Signed-off-by: Jason Cameron <jasoncameron.all@gmail.com>
Signed-off-by: Xe Iaso <me@xeiaso.net>
Signed-off-by: Jason Cameron <git@jasoncameron.dev>
Co-authored-by: Xe Iaso <me@xeiaso.net>
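The converter's mapping can be read off the fixture below: per-agent Disallow lines become CHALLENGE rules keyed on path, Crawl-delay becomes a WEIGH rule that adjusts request weight, wildcard paths become path.matches() regexes, and fully blocked agents become DENY rules. A robots.txt roughly like the following would plausibly produce that fixture. This is an illustrative reconstruction inferred from the output, not the actual test input; the crawl-delay values and the deny mechanism are guesses:

# Hypothetical input; reconstructed from the fixture, not taken from the repo.
User-agent: *
Crawl-delay: 5
Disallow: /admin/
Disallow: /private/
Disallow: /api/internal/

User-agent: Googlebot
Crawl-delay: 5
Disallow: /search/

User-agent: Bingbot
Crawl-delay: 5
Disallow: /search/
Disallow: /admin/

User-agent: BadBot        # denied outright, presumably via the deny user agents option
Disallow: /

User-agent: SeoBot
Crawl-delay: 5
Disallow: /

User-agent: TestBot
Disallow: /*/admin        # "*" wildcards become ".*" in path.matches() regexes
Disallow: /temp*.html
Disallow: /file?.log      # the "?" handling here is a guess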
The generated policy fixture (72 lines, 1.9 KiB, YAML):
- action: WEIGH
  expression: "true"
  name: robots-txt-policy-crawl-delay-1
  weight:
    adjust: 5
- action: CHALLENGE
  expression: path.startsWith("/admin/")
  name: robots-txt-policy-disallow-2
- action: CHALLENGE
  expression: path.startsWith("/private/")
  name: robots-txt-policy-disallow-3
- action: CHALLENGE
  expression: path.startsWith("/api/internal/")
  name: robots-txt-policy-disallow-4
- action: WEIGH
  expression: userAgent.contains("Googlebot")
  name: robots-txt-policy-crawl-delay-5
  weight:
    adjust: 5
- action: CHALLENGE
  expression:
    all:
      - userAgent.contains("Googlebot")
      - path.startsWith("/search/")
  name: robots-txt-policy-disallow-6
- action: WEIGH
  expression: userAgent.contains("Bingbot")
  name: robots-txt-policy-crawl-delay-7
  weight:
    adjust: 5
- action: CHALLENGE
  expression:
    all:
      - userAgent.contains("Bingbot")
      - path.startsWith("/search/")
  name: robots-txt-policy-disallow-8
- action: CHALLENGE
  expression:
    all:
      - userAgent.contains("Bingbot")
      - path.startsWith("/admin/")
  name: robots-txt-policy-disallow-9
- action: DENY
  expression: userAgent.contains("BadBot")
  name: robots-txt-policy-blacklist-10
- action: WEIGH
  expression: userAgent.contains("SeoBot")
  name: robots-txt-policy-crawl-delay-11
  weight:
    adjust: 5
- action: DENY
  expression: userAgent.contains("SeoBot")
  name: robots-txt-policy-blacklist-12
- action: CHALLENGE
  expression:
    all:
      - userAgent.contains("TestBot")
      - path.matches("^/.*/admin")
  name: robots-txt-policy-disallow-13
- action: CHALLENGE
  expression:
    all:
      - userAgent.contains("TestBot")
      - path.matches("^/temp.*\\.html")
  name: robots-txt-policy-disallow-14
- action: CHALLENGE
  expression:
    all:
      - userAgent.contains("TestBot")
      - path.matches("^/file.\\.log")
  name: robots-txt-policy-disallow-15
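The commits mention marshalling these rules through sigs.k8s.io/yaml, with an ExpressionOrList type behind the expression field. Below is a minimal sketch of how that shape could be produced; AnubisRule and ExpressionOrList are named in the commits, but the field layout and MarshalJSON logic here are assumptions inferred from the fixture, not the actual Anubis definitions:

package main

import (
	"encoding/json"
	"fmt"

	"sigs.k8s.io/yaml"
)

// Weight mirrors the weight:/adjust: block in the fixture above.
type Weight struct {
	Adjust int `json:"adjust"`
}

// ExpressionOrList is an assumed stand-in for the type named in the
// commits: either a single CEL expression or an all: list of them.
type ExpressionOrList struct {
	Single string
	All    []string
}

// MarshalJSON emits a bare string for a single expression and an
// {"all": [...]} object otherwise. sigs.k8s.io/yaml converts values
// through encoding/json, so this also controls the YAML shape.
func (e ExpressionOrList) MarshalJSON() ([]byte, error) {
	if e.Single != "" {
		return json.Marshal(e.Single)
	}
	return json.Marshal(map[string][]string{"all": e.All})
}

// AnubisRule is an assumed sketch of one generated rule.
type AnubisRule struct {
	Action     string            `json:"action"`
	Expression *ExpressionOrList `json:"expression,omitempty"`
	Name       string            `json:"name"`
	Weight     *Weight           `json:"weight,omitempty"`
}

func main() {
	rules := []AnubisRule{
		{
			Action:     "WEIGH",
			Expression: &ExpressionOrList{Single: "true"},
			Name:       "robots-txt-policy-crawl-delay-1",
			Weight:     &Weight{Adjust: 5},
		},
		{
			Action: "CHALLENGE",
			Expression: &ExpressionOrList{All: []string{
				`userAgent.contains("Googlebot")`,
				`path.startsWith("/search/")`,
			}},
			Name: "robots-txt-policy-disallow-6",
		},
	}

	out, err := yaml.Marshal(rules)
	if err != nil {
		panic(err)
	}
	fmt.Print(string(out)) // YAML in the same shape as the fixture
}

Because sigs.k8s.io/yaml round-trips values through encoding/json, the json tags and MarshalJSON drive the YAML output, and map keys come out sorted, which would explain the alphabetical action/expression/name/weight ordering seen in the fixture.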