I would argue that ChatGPT has opinions, and these opinions are based on its training data. I don't think GPT has the type of reasoning skills needed to detect and resolve conflicts in its inputs, but it does hold opinions. It's a bit hard to tell because it can easily be swayed by a changing prompt, but it does have opinions — it just doesn't hold strong ones.
The only thing stopping GPT from ingesting new information and forming opinions about it is that it isn't being trained on new information (such as its own interactions).