Not only do we already have plenty of ways to include structured metadata, but if you want directives about what should and shouldn't be scraped, and by whom, we already have robots.txt. In other words, there's no need to create an ai.txt when the robots.txt standard can simply be extended.
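In fact, this extension is already happening in practice: AI crawlers announce their own user-agent tokens, which ordinary robots.txt rules can target per-crawler. A minimal sketch (GPTBot and CCBot are the published tokens for OpenAI's and Common Crawl's crawlers; the paths here are illustrative):

```
# Keep OpenAI's GPTBot off the site entirely
User-agent: GPTBot
Disallow: /

# Let Common Crawl's CCBot fetch only the public docs (path is illustrative)
User-agent: CCBot
Disallow: /
Allow: /public/

# All other crawlers: behavior unchanged
User-agent: *
Disallow:
```

No new file, no new standard: the existing user-agent mechanism already expresses "who may scrape what," and per-crawler rules slot in next to the ones sites have today.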