zlacker

[return to "I built my own 16-Bit CPU in Excel [video]"]
1. b33j0r+i9[view] [source] 2024-01-28 07:57:51
>>SushiH+(OP)
I am super disappointed in the lack of evolution of dataflow, but am encouraged to see things like Airtable, and I guess Blender and others, using node-based interfaces for functional logic.

I did my senior thesis/project in CS (we had to do several, it was anticlimactic) on visual programming and the basic paradigms that might be the future.

I ended up writing a missive about LabVIEW holding people back, because 2D planes suck at communicating information to people who otherwise read books and blogs and C# code.

My conclusion 15 years later is that we’ll talk to LLMs and their successors rather than invent a great graphical user interface that works like a desktop or a <table> or even a REPL.

Star Trek may have inspired the iPad and terrible polygon capacitive touchscreens… but we all know that “Computer, search for M-class planets without fans of Nickelback’s second album living there as of stardate 2024” is already basically a reality.

EDIT: I like this CPU experiment too! It is a great example of the thing I’m talking about. I realized after the fact that I failed to plant my context in my comment before doing my graybeard routine.

So. Food for thought: our LLM overlords are just unfathomable spreadsheets.

2. danbru+2F[view] [source] 2024-01-28 13:14:54
>>b33j0r+i9
Graphical programming just does not work; it has been tried often enough. As soon as you step beyond toy examples, you need a hierarchical organization, functions calling functions calling functions. How do you represent that graphically? You put additional graphs side by side or allow some kind of drill-down. Now all your graphs are pretty trivial, and you have not gained much over a handful of lines of code, but you have reduced the density a lot with all the space between nodes and all the arrows.
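
To put "a handful of lines of code" in perspective, here is a made-up Python sketch: three levels of functions calling functions fit on half a screen, while the equivalent node graph is three separate diagrams you have to open one at a time.

    # Three levels of calls in a handful of lines (invented example);
    # as a node graph, each function is its own diagram to drill into.
    def normalize(order):
        return {**order, "total": round(order["total"], 2)}

    def validate(order):
        return normalize(order) if order["total"] > 0 else None

    def process(orders):
        return [o for o in (validate(o) for o in orders) if o]

    print(process([{"total": 19.999}, {"total": -5}]))  # [{'total': 20.0}]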

Natural language programming is not going to happen either, because natural languages are too ambiguous. You can probably write code iteratively in a natural language in some kind of dialog, clarifying things as ambiguities arise, but using that dialog as the source of truth and treating the resulting code as a derivative output does not sound very useful to me. So if I had to bet, I would bet that text-based programming languages are not going away anytime soon.

Maybe one day there will be no code at all; everything will just contain small artificial brains doing the things we want them to do, without anything we would recognize as a program today. But who knows, and it does not seem worth speculating about to me.

In the nearer term I could see domain-specific languages becoming a more prevalent thing. A huge amount of the code we write is technical detail, because we have to express even the highest-level business logic in terms of booleans, integers and strings. If we had a dozen different languages tailored to different aspects of an application, we could write a lot less code.

We have this to a certain extent: a language for code in general, one for querying data, one for laying out the user interface, one for styling it. But they are badly integrated and not customizable. The problem is of course that developing good languages and evolving them is hard, and lowering them into some base language is tedious work. But in principle I could imagine that progress is possible on this front and that this becomes practical.
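
To make that concrete, here is a toy embedded-DSL sketch in Python (the domain and all the names are invented): the rule is written in domain terms, and the boolean-and-string plumbing is hidden inside the combinators.

    # A toy embedded DSL for business rules (all names invented):
    # the policy reads at the domain level.
    class Cond:
        def __init__(self, f):
            self.f = f
        def __and__(self, other):
            return Cond(lambda o: self.f(o) and other.f(o))
        def __call__(self, o):
            return self.f(o)

    def total_at_least(n):
        return Cond(lambda o: o["total"] >= n)

    def ships_to(country):
        return Cond(lambda o: o["country"] == country)

    free_shipping = total_at_least(50) & ships_to("DE")
    print(free_shipping({"total": 80, "country": "DE"}))  # True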

3. f1shy+uO[view] [source] 2024-01-28 14:16:56
>>danbru+2F
>> you need a hierarchical organization, functions calling functions calling functions. How do you represent that graphically?

Not saying graphical programming is a good idea, but the basic abstraction mechanism is to define new boxes, which you can look inside by opening them (like some VHDL modeling tools do). Even in SICP it is said that this is a bad idea and does not scale. But it is clear that the primitive is “the box”, the means of combination are the lines between boxes, and the means of abstraction is making new boxes.
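
As a sketch of those three elements (hypothetical Python, nothing standard): the primitive is a box, combination is wiring box to box, and abstraction is wrapping a wired-up subgraph in a new box you can open.

    # Toy box-and-wires model (all names invented).
    class Box:
        """Primitive: a box with a function inside."""
        def __init__(self, name, fn):
            self.name, self.fn = name, fn
        def run(self, x):
            return self.fn(x)

    class CompositeBox(Box):
        """Abstraction: a box you can open; inside is its own graph."""
        def __init__(self, name, inner_boxes):
            self.name, self.inner = name, inner_boxes
        def run(self, x):
            for box in self.inner:  # combination: follow the wires
                x = box.run(x)
            return x

    double = Box("double", lambda x: x * 2)
    inc = Box("inc", lambda x: x + 1)
    double_then_inc = CompositeBox("double_then_inc", [double, inc])
    print(double_then_inc.run(5))  # 11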

I think the real problem is that there is exactly one primitive, one means of abstraction, and one means of combination, and that seems not to be enough.

4. danbru+451[view] [source] 2024-01-28 16:12:09
>>f1shy+uO
What else could you have? Whatever you are building, you will always have some primitives and ways of combining them, and that's practically it. To make things more manageable, you start abstracting, giving names to things and referring to them by name instead of by their structure. The next level up would probably be parameterization: instead of having a name for a thing, you have a name plus parameters for a family of things. Maybe before that you could get a bit more fancy with instantiation, allowing things like repetition. But that again is pretty much it: make parameterized instantiation a bit fancier and you will quickly create a Turing-complete meta layer capable of generating arbitrary constructs in the layer we started with.
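
Sketched in Python (all names invented, in the spirit of the spreadsheet CPU): level one is a name for one concrete structure, level two is a name plus parameters for a family, and the generator doing the repetition is already a Turing-complete meta layer emitting constructs in the base layer.

    # Level 1: a name for one concrete thing.
    FULL_ADDER = ("xor", "xor", "and", "and", "or")  # gates for one bit

    # Level 2: a name plus parameters for a family of things.
    def ripple_carry_adder(width):
        """Meta layer: plain Python generating a width-bit netlist."""
        netlist = []
        for bit in range(width):  # parameterized instantiation / repetition
            for gate in FULL_ADDER:
                netlist.append(f"{gate}_{bit}")
        return netlist

    print(len(ripple_carry_adder(16)))  # 80 gates for a 16-bit adder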