For something to show up verbatim in the output of a text-generating AI model, it typically needs to appear in the training data many times.
I wonder if the problem is not Copilot itself, but many people using this person's code without license or credit, with Copilot then being trained on those copies as well. Copilot may just be exposing an existing problem rather than creating a new one.
That said, I don't know much about AI, and I don't use Copilot.