Here's an example of a site that was retroactively excluded at the webmaster's request but was later kept (forcibly?) in the searchable archive:
http://blog.archive.org/2018/04/24/addressing-recent-claims-...
edit: to answer my own question, it seems that retroactive exclusion has, at least since 2007, not been interpreted as a mandate for actual data removal:
https://archive.org/post/133690/robotstxt-only-gives-tempora...
> Hello, I want all old content from immortal ia.com REMOVED permanently from The Wayback Machine. So I read the exclusion policy, placed up a robots.txt file and requested the Alexa bot go to my website. Then checking The Wayback Machine, I got a notice that the site was blocked by the robots.txt file. But after I removed the robots.txt file, the archived pages reappeared. Is there a way to permanently purge all old pages of a website so that they will NEVER reappear in The Wayback Machine? Am I obligated to keep the robots.txt file in place forever?
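For context, the exclusion mechanism the poster describes is a plain robots.txt rule aimed at the Archive's crawler, which historically identified itself as ia_archiver (the Alexa bot). Something like this at the site root is presumably what produced the "blocked by robots.txt" notice:

    # robots.txt served at the site root (e.g. /robots.txt)
    # Historically, the Wayback Machine honored this rule and hid
    # all archived snapshots of the site while it stayed in place.
    User-agent: ia_archiver
    Disallow: /

The point of the linked answer is that this only suppresses display: the snapshots themselves are retained, and they reappear as soon as the rule is removed.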
Even spam stays in.