"Our goal is to enable robots to express their incapability, and to do so in a way that communicates both what they are trying to accomplish and why they are unable to accomplish it... Our user study supports that our approach automatically generates motions expressing incapability that communicate both what and why to end-users, and improve their overall perception of the robot and willingness to collaborate with it in the future."
I'm not as plugged into human-computer interaction work, but as a user, it seems like this is sorely missing and getting worse. I wish I could get a happy medium somewhere between a full stack trace and silent failure, e.g. when my iCloud documents won't sync.
https://mymakerspace.substack.com/p/another-look-at-infrastr...
(Previous HN discussion: https://news.ycombinator.com/item?id=24303832)
When I get back into the Windows game, I'll be seriously looking into ETW (Event Tracing for Windows).
It seems the best starting point for learning about ETW is https://randomascii.wordpress.com/2015/09/01/xperf-basics-re... and https://randomascii.wordpress.com/2015/09/24/etw-central/.
The second link above has a bunch of links to other pages, but it's a few years old. The old info is still relevant, but a quick poke around the blog's tags turns up the following newer posts, which also show ETW saving the day in a bunch of practical situations:
https://randomascii.wordpress.com/2017/07/09/24-core-cpu-and...
https://randomascii.wordpress.com/2019/10/20/63-cores-blocke...
https://randomascii.wordpress.com/2019/12/08/on2-again-now-i...
https://randomascii.wordpress.com/2021/02/16/arranging-invis...
https://randomascii.wordpress.com/2021/07/25/finding-windows...
https://www.brendangregg.com/blog/2019-12-22/bpf-theremin.ht...
It is fairly trivial to see all of main memory and to single-step the execution of a wasm program. If you run wasm3 inside wasm3, you can trace the inner interpreter as well. Check out the section on trace visualization.
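For anyone who wants to poke at this from a host program, here's a minimal sketch using the wasm3 Python bindings (pywasm3). The module path, the exported function name, and the get_memory() accessor for dumping linear memory are my assumptions rather than anything from the article; the inner module could just as well be wasm3 itself compiled to wasm.

    # Toy sketch: embed wasm3 via its Python bindings (pip install pywasm3).
    # "inner.wasm" and "fib" are placeholders; the module could be a build of
    # wasm3 compiled to wasm, giving the interpreter-in-interpreter setup.
    import wasm3

    env = wasm3.Environment()
    rt = env.new_runtime(4096)              # interpreter stack size
    with open("inner.wasm", "rb") as f:
        mod = env.parse_module(f.read())
    rt.load(mod)

    fib = rt.find_function("fib")           # exported function name is made up
    print(fib(24))

    # Dump the module's linear memory between calls to watch it change
    # (assumes pywasm3 exposes get_memory(); check the bindings you have).
    mem = rt.get_memory(0)
    print(len(mem), "bytes of linear memory")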
The website appears to still be available at:
Conway's Game of LIFE in a DEC PDP-7 w/ Type 340 Display
https://www.youtube.com/watch?v=hB78NXH77s4&ab_channel=Livin...
Early computer graphics -LIFE - 4 Gosper Glider Guns on a DEC PDP-7 Type 340 display
https://www.youtube.com/watch?v=JhvOw7vW4iA&ab_channel=Livin...
DEC PDP-7 w/ Type 340 display running Munching Squares and Spirograph
https://www.youtube.com/watch?v=V4oRHv-Svwc&ab_channel=Livin...
Also PDP-7 related (but with more melodic music), here's a video remix I made of an early CAD system called PIXIE (with the first known implementation of pie menus, using a light pen) running on a PDP-7 with a Type 340 display, networked with a Titan mainframe at the University of Cambridge (one of the first network-distributed graphics systems), set to music:
https://www.youtube.com/watch?v=jDrqR9XssJI&ab_channel=DonHo...
https://en.wikipedia.org/wiki/Calm_technology
>Calm Technology
>History
>The phrase "calm technology" was first published in the article "Designing Calm Technology", written by Mark Weiser and John Seely Brown in 1995.[1] The concept had developed amongst researchers at the Xerox Palo Alto Research Center in addition to the concept of ubiquitous computing.[3]
>Weiser introduced the concept of calm technology using the example of LiveWire, or "Dangling String". It is an eight-foot (2.4 m) string connected to a small electric motor mounted in the ceiling. The motor is connected to a nearby Ethernet cable. When a bit of information flows through that Ethernet cable, it causes a twitch of the motor. The more information flows, the faster the motor runs, making the string dangle or whirl depending on how much network traffic there is. It has aesthetic appeal; it provides a visualization of network traffic without being obtrusive.[4]
[1] https://web.archive.org/web/20190508225438/https://www.karls...
[3] https://web.archive.org/web/20131214054651/http://ieeexplore...
PDF: http://www.cs.cmu.edu/~./jasonh/courses/ubicomp-sp2007/paper...
[4] https://web.archive.org/web/20110706212255/https://uwspace.u...
PDF: https://web.archive.org/web/20170810073340/https://uwspace.u...
>According to Weiser, LiveWire is primarily an aesthetic object, a work of art, which secondarily allows the user to know network traffic, while expending minimal effort. It assists the user by augmenting an office with information about network traffic. Essentially, it moves traffic information from a computer screen to the ‘real world’, where the user can acquire information from it without looking directly at it.
https://en.wikipedia.org/wiki/Natalie_Jeremijenko#Live_Wire_...
>Natalie Jeremijenko
>Live Wire (Dangling String), 1995
>In 1995,[9] as an artist-in-residence at Xerox PARC in Palo Alto, California under the guidance of Mark Weiser, she created an art installation made up of LED cables that lit up relative to the amount of internet traffic. The work is now seen as one of the first examples of ambient or "calm" technology.[10][11]
[9] https://web.archive.org/web/20110526023949/http://mediaartis...
[10] https://web.archive.org/web/20100701035651/http://iu.berkele...
>Weiser comments on Dangling String: "Created by artist Natalie Jeremijenko, the "Dangling String" is an 8 foot piece of plastic spaghetti that hangs from a small electric motor mounted in the ceiling. The motor is electrically connected to a nearby Ethernet cable, so that each bit of information that goes past causes a tiny twitch of the motor. A very busy network causes a madly whirling string with a characteristic noise; a quiet network causes only a small twitch every few seconds. Placed in an unused corner of a hallway, the long string is visible and audible from many offices without being obtrusive."
[11] https://web.archive.org/web/20120313074738/http://ipv6.com/a...
>Mark Weiser suggested the idea of an enormous number of ubiquitous computers embedded into everything in our everyday life, so that we use them anytime, anywhere, without being aware of them. Today, ubiquitous computing is still at an early phase, as it requires revolutionary software and hardware technologies.
Also, each time it sounded the bell it would start at a higher and higher tone rising toward a fixed pitch, each bell starting higher and lasting less time than the last, so a lot of bells in a row would ramp up in tone and shorten into a high buzz, which made them less annoying. Then the pitch would decay back down after a few seconds without any bells. It was inspired by the way an excited guinea pig squeals for lettuce.
https://www.youtube.com/watch?v=5jfoxSeJzWo&ab_channel=It%27...
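Not the original code, obviously, but here's a toy model of that escalation/decay behavior, just to make the shape of it concrete. All the constants (frequencies, durations, quiet window) are invented.

    import time

    # Toy model of the bell behavior described above: each bell starts at a
    # higher pitch and lasts less time than the last, ramping toward a fixed
    # ceiling, and the starting pitch resets after a few quiet seconds.
    BASE_HZ, CEILING_HZ, STEP_HZ = 440.0, 1760.0, 120.0
    BASE_MS, MIN_MS, QUIET_RESET_S = 200.0, 40.0, 3.0

    start_hz, dur_ms, last_bell = BASE_HZ, BASE_MS, 0.0

    def on_bell(now=None):
        """Return (start_hz, duration_ms) for this bell, then escalate for the next."""
        global start_hz, dur_ms, last_bell
        now = time.monotonic() if now is None else now
        if now - last_bell > QUIET_RESET_S:              # quiet for a while: reset
            start_hz, dur_ms = BASE_HZ, BASE_MS
        tone = (start_hz, dur_ms)
        start_hz = min(start_hz + STEP_HZ, CEILING_HZ)   # next bell starts higher
        dur_ms = max(dur_ms * 0.8, MIN_MS)               # ...and lasts less time
        last_bell = now
        return tone

    for _ in range(6):                                   # a quick burst of bells
        print(on_bell())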
Also, the underline cursor floated up and down and up and down in the character cell, so it was very easy to see where it was, and it drew a wavy line in the phosphor as it moved across the screen!
> -B
> Sound the bell at the start of each (major) garbage collection.
> Oddly enough, people really do use this option! Our pal in Durham (England), Paul Callaghan, writes: “Some people here use it for a variety of purposes—honestly!—e.g., confirmation that the code/machine is doing something, infinite loop detection, gauging cost of recently added code. Certain people can even tell what stage [the program] is in by the beep pattern. But the major use is for annoying others in the same office…”
https://downloads.haskell.org/~ghc/latest/docs/html/users_gu...
From the Usenix (https://www.usenix.org/legacy/publications/library/proceedin...) abstract:
> We created a network monitoring system, Peep, that replaces visual monitoring with a sonic `ecology' of natural sounds, where each kind of sound represents a specific kind of network event. This system combines network state information from multiple data sources, by mixing audio signals into a single audio stream in real time. Using Peep, one can easily detect common network problems such as high load, excessive traffic, and email spam, by comparing sounds being played with those of a normally functioning network.
The SourceForge page is still up: https://sourceforge.net/projects/peep/
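If you want a feel for the idea without digging up the old code, here's a stdlib-only toy along the same lines: each kind of event gets its own sound, and everything gets mixed into a single stream. It is not Peep's implementation; Peep mixes recorded natural sounds in real time, while this just synthesizes sine tones with made-up frequencies and writes a WAV file.

    import math, struct, wave

    # Toy sonification in the spirit of the Peep abstract above: each event
    # kind maps to its own sound, and all of them are mixed into one stream.
    # Frequencies and event names here are arbitrary placeholders.
    RATE = 22050
    EVENT_TONES = {"high_load": 110.0, "incoming_mail": 660.0, "bad_login": 880.0}

    def tone(freq, seconds=0.5, amp=0.3):
        n = int(RATE * seconds)
        return [amp * math.sin(2 * math.pi * freq * i / RATE) for i in range(n)]

    def mix(streams):
        out = [0.0] * max(len(s) for s in streams)
        for s in streams:
            for i, v in enumerate(s):
                out[i] += v
        return out

    events = ["incoming_mail", "high_load", "incoming_mail"]   # pretend event feed
    mixed = mix([tone(EVENT_TONES[e]) for e in events])

    with wave.open("peepish.wav", "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)
        w.setframerate(RATE)
        w.writeframes(b"".join(struct.pack("<h", int(max(-1.0, min(1.0, v)) * 32767))
                               for v in mixed))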
This isn't as aesthetic as, say, the LiveWire[2] mentioned elsewhere in the comments. But it's readily available on almost all systems, and it's a very flexible ambient indicator (a rough sketch of driving it through sysfs follows after the link below).
There's a lot of really fun, good stuff in the comments here. Ambient is good, but I want computing that exposes the causal relationships of what is happening as it runs: "This button was clicked, so I'm trying to change the screen brightness now." All the entities of computing, the data, the user events, should be reified, made into a logged sequence of what is happening. From that basis, we'd all be free to explore computing and, EventSourcing style, extend the graph of computing as we see fit.
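Here's a minimal sketch of that reified-causal-log idea, nothing more than an illustration: every occurrence becomes an event record, and each record can point at the event that caused it, so the "clicked, therefore changing brightness" chain can be read straight out of the log.

    import itertools, time
    from dataclasses import dataclass, field
    from typing import Optional

    # Minimal sketch of a reified, causal event log (illustration only):
    # everything that happens becomes an Event, and each Event may point at
    # the Event that caused it.
    _ids = itertools.count(1)

    @dataclass
    class Event:
        kind: str
        detail: dict
        caused_by: Optional[int] = None
        id: int = field(default_factory=lambda: next(_ids))
        ts: float = field(default_factory=time.time)

    LOG = []

    def emit(kind, caused_by=None, **detail):
        ev = Event(kind, detail, caused_by)
        LOG.append(ev)
        return ev

    click = emit("button_clicked", button="brightness_up")
    emit("set_brightness_attempt", caused_by=click.id, target=0.8)

    for ev in LOG:
        print(ev.id, ev.kind, "caused_by", ev.caused_by, ev.detail)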
[1] https://github.com/torvalds/linux/tree/master/drivers/leds/t...
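Since [1] is the kernel's LED trigger code, here's a hedged sketch of using it as an ambient indicator via sysfs. LED names and available triggers vary wildly by machine and kernel config, so the LED path and the "netdev"/"eth0" values below are just examples, and writing them needs root.

    import pathlib

    # Ambient indicator via the kernel LED trigger interface [1].
    # The LED name, trigger, and interface below are examples only; list
    # /sys/class/leds/ on your own machine to see what you actually have.
    led = pathlib.Path("/sys/class/leds/input3::scrolllock")

    print((led / "trigger").read_text())        # available triggers; the active one is in [brackets]

    # Tie the LED to network activity with the netdev trigger (needs root and
    # CONFIG_LEDS_TRIGGER_NETDEV); the extra files appear once it's selected.
    (led / "trigger").write_text("netdev")
    (led / "device_name").write_text("eth0")    # interface name is an assumption
    (led / "rx").write_text("1")                # blink on receive
    (led / "tx").write_text("1")                # ...and on transmit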