Public information on the 'world wide web' should by nature be open and accessible to any agent in a neutral way. (Of course this is implied in Google's (bullshit, cough) mission statement.) Making the identity of the requesting agent invisible, as a matter of principle from the start, would have helped with that.
In reality, that vision was lost in the early 90s when the web went from being a proposed hypertext/document/information retrieval system to being mostly a presentation system for what started as magazine/leaflet/poster analogues ("websites") to which were added dynamic client/server web applications.
The difference in model is stark: in the former, the browser, and even the user, makes decisions about the presentation of the content based on mostly structural information declared in the document. In the latter, the 'document' is not a document at all, but a program executed on the user's computer.
And once you've made that transition, the "developer" of the "program" now expects more and more of the kinds of controls they get when they truly control the platform.
And it doesn't help that in the midst of this, mobile applications came on the scene, undermined the web completely, and changed expectations of how content should be made available. From that point on, companies and product managers increasingly expected to control the whole sandbox. Meta couldn't even be bothered to launch Threads on the web, probably precisely because they don't like the restrictions there and because having full control is so much more profitable to them; and they're not the first.
In any case, this all sucks. I've already personally switched to Firefox in most places, but the very fact that Google feels emboldened to take this tack says a lot about the state of the web and how this 30-year trajectory has gone.
In a way, I just hope the "www" dies and all of us who helped create this thing in the first place birth something new and better. But this is also hopelessly naive.