I would be genuinely (and pleasantly) surprised if that ever stops being the case. This behavior is by design.
As you put it yourself, these LLM systems are very good at pattern recognition and reconstruction. They have ingested the vast majority of the internet to build patterns on. And on the internet, the overwhelming majority of content is pushed out by novices and amateurs: "Hey, look, I just read a single Wikipedia page or attended a single lesson, I am not completely dumbfounded by it, so now I will explain it to you."
LLMs have to be peak Dunning-Kruger - by design.