Wait, no, Microsoft said the action was a temporary mistake. From the article you linked:
In a statement to CNBC, Microsoft said the temporary ChatGPT block was a mistake resulting from a test of systems for large language models.
“We were testing endpoint control systems for LLMs and inadvertently turned them on for all employees,” a spokesperson said. “We restored service shortly after we identified our error. As we have said previously, we encourage employees and customers to use services like Bing Chat Enterprise and ChatGPT Enterprise that come with greater levels of privacy and security protections.”