zlacker

1. kmoser (OP) 2023-05-10 15:57:03
Not only do we already have lots of ways to include structured metadata, but if you want directives about what should or shouldn't be scraped, and by whom, we already have robots.txt.

In other words, there's no need to create an ai.txt when the robots.txt standard can just be extended.
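For example, a site that wanted to keep AI scrapers out could add per-agent rules to the robots.txt it already serves; the crawler name below is just a placeholder for whatever user-agent an AI scraper announces:

    # "ExampleAIBot" is a placeholder user-agent, not a real crawler
    User-agent: ExampleAIBot
    Disallow: /

    # everyone else keeps crawling as before
    User-agent: *
    Allow: /

Any crawler that already honors robots.txt can apply per-agent rules like these with existing parsers, so nothing new has to be specified.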
