zlacker

[return to "I built my own 16-Bit CPU in Excel [video]"]
1. b33j0r+i9[view] [source] 2024-01-28 07:57:51
>>SushiH+(OP)
I am super disappointed in the lack of evolution of dataflow, but am encouraged to see things like Airtable, and I guess Blender etc., using node-based interfaces for functional logic.

I did my senior thesis/project in CS (we had to do several, it was anticlimactic) about visual programming, and basic paradigms that might be the future.

I ended up writing a missive about LabVIEW holding people back, because 2D planes suck at communicating information to people who otherwise read books and blogs and C# code.

My conclusion 15 years later is that we’ll talk to LLMs and their successors rather than invent a great graphical user interface that works like a desktop or a <table> or even a repl.

Star Trek may have inspired the ipad and terrible polygon capacitive touchscreens… but we all know that “Computer, search for M-class planets without fans of Nickelback’s second album living there as of stardate 2024” is already basically a reality.

EDIT: I like this CPU experiment too! It is a great example of the thing I’m talking about. Realized after the fact that I failed to plant my context in my comment, before doing my graybeard routine.

So. Food for thought, our LLM overlords are just unfathomable spreadsheets.

2. killco+qH2[view] [source] 2024-01-29 06:10:54
>>b33j0r+i9
I work on a product for building user interfaces for hardware devices. All the state management is done via incrementally updated, differential DataFlow systems. The interface is defined in code instead of graphically, but I think that's a feature, so that code can be version controlled.

I think there has been evolution in the underlying data computation side of things, but there are still unsolved questions about 'visibility' of graphical node-based approaches. A node-based editor is easy to write with, but hard to read with.
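To make the "incrementally updated dataflow" idea concrete: a minimal toy sketch (my own illustration, not the product's actual code — the `Node`/`Source` names are made up) where each node caches its value and recomputes only when something upstream changes, spreadsheet-style.

```python
# Toy incremental dataflow graph: nodes cache results and are lazily
# recomputed only after an upstream change marks them dirty.

class Node:
    def __init__(self, compute, *inputs):
        self.compute = compute        # function of the input values
        self.inputs = list(inputs)
        self.dependents = []
        self.dirty = True
        self.cache = None
        for node in self.inputs:
            node.dependents.append(self)

    def invalidate(self):
        # Propagate dirtiness downstream, stopping at already-dirty nodes.
        if not self.dirty:
            self.dirty = True
            for d in self.dependents:
                d.invalidate()

    def value(self):
        # Recompute only if dirty; otherwise reuse the cached value.
        if self.dirty:
            self.cache = self.compute(*(n.value() for n in self.inputs))
            self.dirty = False
        return self.cache

class Source(Node):
    """A leaf cell whose value is set from outside (like user input)."""
    def __init__(self, initial):
        super().__init__(lambda: initial)

    def set(self, v):
        self.compute = lambda: v
        self.invalidate()

# A tiny "spreadsheet" for a UI label: c = a + b, label = f"total: {c}"
a = Source(1)
b = Source(2)
c = Node(lambda x, y: x + y, a, b)
label = Node(lambda total: f"total: {total}", c)

print(label.value())   # total: 3
a.set(10)              # only c and label become dirty; b's cache is reused
print(label.value())   # total: 12
```

The point of defining the graph in code rather than in a node editor is exactly the version-control argument above: this whole graph is a diffable, reviewable text file.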

[go to top]