they "steal" access to data because the LLM launders it on the other end
LLMs know the contents of books because those books are analyzed, reviewed, and discussed all over the web. Pick some obscure book that doesn't show up on any social media and ask about its contents. GPT won't have a clue.
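You can test this yourself. Here's a minimal probe sketch, assuming the OpenAI Python SDK and an API key in OPENAI_API_KEY; the model name, prompt wording, and helper function are illustrative, not a rigorous methodology:

```python
# Probe whether a model "knows" a given book (illustrative sketch).
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def probe_book_knowledge(title: str, author: str) -> str:
    """Ask the model to describe a specific book's contents."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # any chat model works for this probe
        messages=[
            {
                "role": "user",
                "content": (
                    f"Summarize the main argument and chapter structure of "
                    f"'{title}' by {author}. If you are unsure, say so."
                ),
            }
        ],
    )
    return response.choices[0].message.content

# Compare a heavily discussed bestseller against a genuinely obscure title.
# A confident, accurate answer for the former and a vague or hedged answer
# for the latter is consistent with the claim above.
```

One caveat with this kind of probe: models often confabulate plausible-sounding summaries rather than admitting ignorance, so check the answer against the actual book rather than trusting the model's confidence.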
What's your evidence to the contrary? That sounds like common sense on your part rather than inside knowledge.
It is harder to prove to a "should have known" standard than, say, buying stolen speakers from the back of a truck for 20% of the list price.