docs(blog/v1.20.0): how did CI not catch this?
Signed-off-by: Xe Iaso <me@xeiaso.net>
commit d47a3406db
parent ff5991b5cf
````diff
@@ -171,7 +171,7 @@ If the false positive rate of this challenge turns out to not be very high in pr
 
 Anubis was created because crawler bots don't respect [`robots.txt` files](https://www.robotstxt.org/). Administrators have been working on refining and crafting their `robots.txt` files for years, and one common comment is that people don't know where to start crafting their own rules.
 
-Anubis now ships with a [`robots2policy` tool](http://localhost:3000/docs/admin/robots2policy) that lets you convert your `robots.txt` file to an Anubis policy.
+Anubis now ships with a [`robots2policy` tool](/docs/admin/robots2policy) that lets you convert your `robots.txt` file to an Anubis policy.
 
 ```text
 robots2policy -input https://github.com/robots.txt
````
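The blog excerpt above shows `robots2policy` converting GitHub's `robots.txt` into an Anubis policy. A minimal sketch of what such a mapping could look like, assuming a hypothetical `Disallow: /search` rule and the bot-policy fields (`name`, `path_regex`, `action`) used in Anubis configuration; the tool's actual output format may differ:

```yaml
# Hypothetical robots.txt input:
#   User-agent: *
#   Disallow: /search
#
# Illustrative Anubis policy entry for that rule (not the tool's verbatim output):
- name: robots-txt-disallow-search
  path_regex: ^/search
  action: DENY
```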