I did my senior thesis/project in CS (we had to do several, it was anticlimactic) about visual programming and the basic paradigms that might be its future.
I ended up writing a missive about LabVIEW holding people back, because 2D planes suck at communicating information to people who otherwise read books and blogs and C# code.
My conclusion 15 years later is that we'll talk to LLMs and their successors rather than invent a great graphical user interface that works like a desktop or a <table> or even a REPL.
Star Trek may have inspired the iPad and terrible polygon capacitive touchscreens… but we all know that "Computer, search for M-class planets without fans of Nickelback's second album living there as of stardate 2024" is already basically a reality.
EDIT: I like this CPU experiment too! It is a great example of the thing I’m talking about. Realized after the fact that I failed to plant my context in my comment, before doing my graybeard routine.
So. Food for thought, our LLM overlords are just unfathomable spreadsheets.
Natural language programming is not going to happen either, because natural languages are too ambiguous. You can probably write code iteratively in a natural language in some kind of dialog, clarifying things as ambiguities arise, but using that dialog as the source of truth and treating the resulting code as a derivative output doesn't sound very useful to me. So if I had to bet, I would bet that text-based programming languages are not going anywhere soon.
Maybe one day there will be no code at all, everything will just contain small artificial brains doing the things we want them to do without anything we would recognize as a program today, but who knows, and it doesn't seem really worth speculating about to me.
In the nearer term I could see domain-specific languages becoming more prevalent. A huge amount of the code we write is technical detail, because we have to express even the highest-level business logic in terms of booleans, integers and strings. If we had a dozen different languages tailored to different aspects of an application, we could write a lot less code.
We have this to a certain extent: a language for code in general, one for querying data, one for laying out the user interface, one for styling it. But they are badly integrated and not customizable. The problem, of course, is that developing good languages and evolving them is hard, and lowering them into some base language is tedious work. But in principle I could imagine that progress is possible on this front and that this becomes practical.
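To make the "express business logic above booleans and strings" point concrete, here is a minimal sketch of an *embedded* DSL for pricing rules, one common halfway point between a general-purpose language and a dedicated one. The names (`Rule`, `run_rules`, the discount logic) are invented for illustration, not from any real library.

```python
# Hypothetical embedded DSL: business rules declared as data,
# interpreted by a tiny engine, instead of nested if-statements.

class Rule:
    def __init__(self, name, condition, action):
        self.name = name
        self.condition = condition  # order -> bool
        self.action = action        # order -> order

def run_rules(rules, order):
    """Apply each rule whose condition matches, in declaration order."""
    for rule in rules:
        if rule.condition(order):
            order = rule.action(order)
    return order

rules = [
    Rule("bulk discount",
         condition=lambda o: o["quantity"] >= 100,
         action=lambda o: {**o, "discount": 0.10}),
    Rule("loyalty bonus",
         condition=lambda o: o.get("loyal", False),
         action=lambda o: {**o, "discount": o.get("discount", 0.0) + 0.05}),
]

order = run_rules(rules, {"quantity": 120, "loyal": True})
```

The domain vocabulary (rules, conditions, discounts) lives at the top; the interpreter hides the boolean plumbing. A dedicated external DSL would go further and drop the Python syntax entirely, at the integration cost the comment above describes.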
Not saying graphical programming is a good idea, but its basic abstraction mechanism is to define new boxes, which you can look inside by opening them (like some VHDL modeling tools do). Even SICP says this is a bad idea and that it does not scale. But it is clear that the primitive is "the box", the means of combination is the lines between boxes, and the means of abstraction is making new boxes.
I think the real problem is that there is exactly one primitive, one means of combination, and one means of abstraction, and that seems to not be enough.
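The box/wire/new-box triad maps directly onto SICP's vocabulary in a text language: functions are the boxes, composition wires one box's output to the next one's input, and naming a composition makes a new box. A tiny sketch of that mapping (the function names are made up for illustration):

```python
def double(x):      # a primitive "box"
    return 2 * x

def increment(x):   # another primitive "box"
    return x + 1

def compose(f, g):
    # the "wire": feed f's output into g
    return lambda x: g(f(x))

# the means of abstraction: name the combination as a new box
double_then_increment = compose(double, increment)

print(double_then_increment(5))  # increment(double(5)) == 11
```

The contrast with a box-and-wire editor is that text languages don't stop at this one triad: they also give you data abstraction (records, classes), modules, and types as additional primitives and abstraction mechanisms, which is arguably what the single-mechanism graphical model is missing.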