These days Apple is promoting "progressive disclosure" as a feature of Swift. HyperCard was an excellent example of that!
The web isn't. Doing the same on the web requires at least a server, a programming language, and a database. Anyone can easily view a webpage, but you need a separate editing system to easily create one. Even then, there's probably no "View Source" for most of it. Without a big fancy editing system, there's a huge learning curve from "hello world" in HTML to sharing with your friend, creating a template shared between pages, or saving data between sessions.
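To make that gap concrete, here's a minimal sketch, assuming Node.js: the page itself is one line of HTML anyone can write, but persisting anything past a reload means standing up a server. The file name notes.txt and the port are made up for illustration.

    // hello.html is the easy part:
    //   <p>hello world</p>
    //
    // server.js is the cliff: a toy HTTP server that persists notes
    // to a plain file (notes.txt standing in for the "database").
    const http = require('http');
    const fs = require('fs');

    http.createServer((req, res) => {
      if (req.method === 'POST') {
        let body = '';
        req.on('data', chunk => { body += chunk; });
        req.on('end', () => {
          // body arrives URL-encoded; good enough for a sketch
          fs.appendFileSync('notes.txt', body + '\n'); // survives sessions
          res.end('saved');
        });
      } else {
        const notes = fs.existsSync('notes.txt')
          ? fs.readFileSync('notes.txt', 'utf8')
          : '';
        res.writeHead(200, { 'Content-Type': 'text/html' });
        res.end('<form method="POST"><input name="note"></form><pre>' +
                notes + '</pre>');
      }
    }).listen(8080);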
Of course, that's modern web architecture. The "original web architecture" had no XHR or even JS. It was 5 years before you could click somewhere on the screen and have it do anything other than "go to another HTML page".
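For anyone who never saw the transition, a rough sketch of the before and after (browser JS; the link and URL are placeholders, not anything from a real page): before XHR, a click could only navigate; with XHR, the same click can fetch data in the background and rewrite the page in place.

    // Pre-XHR: a click can only mean "go to another HTML page".
    //   <a href="other.html">click</a>
    //
    // With XHR: intercept the click, fetch in the background,
    // and update the current page instead of leaving it.
    document.querySelector('a').addEventListener('click', (e) => {
      e.preventDefault();              // don't navigate away
      const xhr = new XMLHttpRequest();
      xhr.open('GET', '/snippet.txt'); // placeholder URL
      xhr.onload = () => {
        document.body.textContent = xhr.responseText; // rewrite in place
      };
      xhr.send();
    });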
Once the JavaScript coders infected the web, it almost immediately turned into a write-only medium.
I worked at a browser vendor whilst all of this was happening. I was busy with the mechanics of building a particular browser, and I couldn't make sense of the JavaScript/web-standards stuff in detail, but I trusted that these people were smart enough. I did notice that the people running this stuff (they were making the formerly kinda static HTML/CSS web standards dynamic by defining vast amounts of JavaScript APIs) were all very young, probably a median age of 21 or so, with some "wunderkinds" around 17. The average age of the people actually implementing stuff in this company was closer to 28-30. Nothing against young people in general... they can bring new thoughts etc... but they do tend to lack experience.
In retrospect I regret that I didn't engage with this young crowd more. In general I wish there had been more discussion between these two camps.