zlacker

1. helloj+(OP) 2023-05-16 18:25:48
I thought we got away from knowledge-distribution embargoes via the 1A during the encryption era.

Even if it passed, I find it hard to believe that a bunch of individuals couldn't collaborate via distributed training, which would be almost impossible to prohibit. Anyone could mask their traffic or connect to an anonymous US VPN to circumvent it. The demand will be more than enough to outweigh the risk.

replies(1): >>NavinF+xN
2. NavinF+xN 2023-05-16 22:47:06
>>helloj+(OP)
> distributed training

Unfortunately, this isn't really a thing. E.g., the cross-node synchronization latency (batch norm statistics and gradients have to be synced between machines) leaves your GPUs sitting idle. Unless all your hardware is in the same building, training a single model is so inefficient that it's not worth it.
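
A rough back-of-the-envelope sketch of why (the model size, bandwidth, and latency figures below are assumptions picked for illustration, not measurements):

    # Back-of-envelope: per-step gradient sync time for data-parallel training
    # over home internet vs. a datacenter interconnect. All figures are assumed.

    def allreduce_seconds(params, bytes_per_param, bandwidth_Bps, latency_s):
        # A ring all-reduce moves roughly 2x the gradient payload, plus a fixed latency term.
        return 2 * params * bytes_per_param / bandwidth_Bps + latency_s

    PARAMS = 7e9   # assume a 7B-parameter model
    FP16 = 2       # bytes per gradient value

    # Assumed links: ~100 Mbit/s home uplink with ~50 ms RTT, vs ~400 Gbit/s datacenter interconnect with ~10 us
    wan = allreduce_seconds(PARAMS, FP16, 100e6 / 8, 0.05)
    dc  = allreduce_seconds(PARAMS, FP16, 400e9 / 8, 10e-6)

    print(f"sync over home internet: ~{wan / 3600:.1f} hours per step")
    print(f"sync in a datacenter:    ~{dc:.2f} seconds per step")
    # If forward+backward takes ~1 s, the GPUs in the WAN case sit idle >99.9%
    # of the time -- which is the "GPUs idle" problem described above.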
