
bigbil (OP) | 2023-12-27 16:09:14
> But how can that be possible for an LLM?

They should have thought of that before they went ahead and trained on whatever they could get.

Image models are going to have similar problems; even if they win on copyright, there's still CSAM in there: https://www.theregister.com/2023/12/20/csam_laion_dataset/
