Even if it passed, I find it hard to believe that a bunch of individuals couldn't collaborate via distributed training, which would be almost impossible to prohibit. Anyone could mask their traffic or connect to an anonymous US VPN to circumvent it. The demand would be strong enough to outweigh the risk.
Unfortunately this isn't really a thing. E.g., the latency of synchronizing gradients (and batch-norm statistics) across nodes every step leaves your GPUs sitting idle. Unless all your hardware is in the same building, training a single model is so inefficient that it's not worth it.
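A rough back-of-envelope sketch of why, with all numbers assumed for illustration (a 7B-parameter model with fp16 gradients, a ~100 Mbit/s residential uplink vs. a ~400 Gbit/s datacenter-class link, and ~1 s of actual compute per optimizer step):

```python
# Back-of-envelope: time to ship one step's gradients over different links.
# All figures below are assumptions, not measurements.

PARAMS = 7e9                 # assumed model size (parameters)
BYTES_PER_PARAM = 2          # fp16 gradients
GRAD_BYTES = PARAMS * BYTES_PER_PARAM

def sync_seconds(link_bits_per_s: float) -> float:
    """Lower-bound seconds to transfer one full set of gradients,
    ignoring protocol overhead, topology, and compression."""
    return GRAD_BYTES * 8 / link_bits_per_s

residential = sync_seconds(100e6)   # ~100 Mbit/s home uplink (assumed)
datacenter  = sync_seconds(400e9)   # ~400 Gbit/s fabric (assumed)

compute_per_step = 1.0  # seconds of GPU work per step (assumed)

for name, t in [("residential", residential), ("datacenter", datacenter)]:
    idle = t / (t + compute_per_step)
    print(f"{name:12s}: sync ~ {t:,.1f} s/step, GPUs idle ~ {idle:.0%}")
```

With those assumptions the home connection spends on the order of 20 minutes per step just moving gradients (GPUs >99% idle), while the datacenter link adds a fraction of a second. Gradient compression or less frequent syncing changes the constants, not the basic picture.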