Aside from the other comments here: robots.txt works to some extent because it tells the crawler something the crawler itself may find useful to know. If you have blocked it from crawling part of your site, it can actually be in the crawler's interest to honor that restriction (to be a good citizen), because if it doesn't, you might block it outright after seeing its user agent show up in a part of the site it shouldn't.
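For reference, the kind of restriction being described is just a couple of lines in a robots.txt file at the site root (the path here is a made-up example):

```
# robots.txt — ask all crawlers to skip a section of the site
User-agent: *
# hypothetical section the operator doesn't want crawled
Disallow: /private/
```

A crawler that ignores this and starts fetching /private/ URLs is easy to spot in server logs by its user agent, which is exactly the enforcement feedback loop described above.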
AI.txt doesn't give the AI any comparable feedback that would improve it. It also seems likely that users would have reasons to lie in it.