This isn't a new policy and has been the case for at least a year.
GitHub Copilot is built on OpenAI's Codex model, a descendant of GPT-3, though.
Anyone putting anything into ChatGPT, or really any third-party tool, is taking a risk. That goes especially for LLMs/GPTs, because AI models are in some respects like immutable datastores: once data goes in, it never comes back out.
This also coincides with it being integrated directly into Windows. If there is a security issue, and I'm sure there are many, that could be majorly problematic for business intelligence and confidentiality.