It's great that people collect, restore, and publish valuable historical pieces like these.
Many web properties are no longer accessible due to M&A activity and small/solo publishers being unable or unwilling to maintain their assets. Archives like the Wayback Machine mitigate some of the loss of digital content, so long as the archives themselves are maintained.
Will spinning rust be as durable as microfiche?
The bar over a letter must mean that it's true upper-case. Cheesy, but it's what we did when characters were expensive.
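Roughly the idea, as a toy sketch (my own illustration, not how the terminal actually encoded it): keep a single case of glyphs and overstrike a combining macron on the letters that are "really" uppercase.

    # Toy rendering of the overbar convention: single-case output where a
    # combining macron (U+0304) marks letters that are "really" uppercase.
    def mark_case(text):
        return "".join(c.lower() + "\u0304" if c.isupper() else c for c in text)

    print(mark_case("Dunlop Demo"))  # -> "d̄unlop d̄emo"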
Not sure how long microfiche lasts, but someone posted a link here not long ago about how record companies embraced magnetic hard drives in the 1990s to store music masters and are now finding that the drives are no longer readable.
CDs and LaserDiscs are also seeing bitrot: the etched data layer does degrade over time. Error correction helps some, but a writable CD or DVD is only likely to last a decade or two. M-DISCs are optical discs designed to retain their data for about 1000 years and can be written by certain consumer drives. Not sure how long professionally pressed CDs last, but it's not that long.
The concept comes from NASA's Apollo Mission Control in the 1960s. These screens on the consoles were all just TV receivers. All the display data went onto a cable TV network. Any console could view any source. The network was remoted out, and displays outside the control room could look, too. Any display could be routed to the big screens, too.
The same technology was still in use in some USAF facilities well into the 1980s. (Long story. Short version: the 1970s upgrade project failed.)
That kind of switching remains a feature of military command and control centers. Some display may suddenly become important, and others need to look at it.
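That "any console, any source" setup is basically a crossbar router. A toy model of it (my own sketch, with made-up console and source names):

    # Toy crossbar video router: every display independently selects a source.
    sources = {"radar", "telemetry", "flight_plan"}
    routes = {}  # display -> currently selected source

    def route(display, source):
        if source not in sources:
            raise ValueError("unknown source: " + source)
        routes[display] = source

    route("console_14", "telemetry")
    route("big_screen_1", "telemetry")   # same feed on the big screen
    route("remote_annex", "telemetry")   # and remoted outside the room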
Accessing Dunlop's archives on the Xerox Star, which would not have been a stand-alone system, ended up requiring a Memorex machine accessed through multiple time-sharing CRT terminals. Piecing together the original audio in the archival footage meant moving the restored tape from an information management system into Engelbart's accelerated NLS database.
The dev team went out "into the field" to help roll out the software to the company. This also allowed us to see how others used the software.
At the end of the day, one of the devs reported back that one personal assistant would maximize the email app's window (back when 17" CRT monitors were large) and, after each email was processed, she'd print out the email and file it in the appropriate spot in a filing cabinet.
All the devs were like, "But... But... she can just file the email in an email folder in the program. Why does she need hardcopy? Email was supposed to save trees!"
Anyway, I remember we used to write our weekly emails on paper first and then type them into the computer. Your quote reminded me of that!
I'm also reminded of the Ashton-Tate software package Framework, which is one of my favorites from the 1980s. It's what they used to call "integrated software", which was a package of several productivity applications: word processor, spreadsheet, maybe a communications program or database or graphing capability, bundled together and sold as a unit. Unlike, say, Microsoft Works or DeskMate, Framework featured powerful versions of these tools and the ability to create composite documents, as well as a programming language with Lisp-like semantics to automate workflows. Because of this, Ashton-Tate pitched Framework as an executive decision-making tool, which was quite a bit different from how competitor programs like Lotus 1-2-3 were marketed:
What benefits do you, or others, see in looking back at these computer systems?
Thanks.
"real" video calling sort of snuck in through the back doors once people got webcams and MSN / Skype, and became mainstream / common in the 2000's with always-on internet, remote work, etc. And at one point the smartphone and mobile internet got in people's hands and (video) calls became casual.
I think the other part there is that it's normal people using them. What I mean is that in these videos, it's all very formal corporate people. The first people to really get interested in this kind of technology, or who have an interest in futuristic stuff, are / were the "nerdy" types (I am probably living in a bubble though). But it was the average Joe that normalised this technology.
I should do the same with anything I think is collectible / not trash / may end up in someone else's hands. For example, I've bought some LPs over time; I should at least document when and where I bought them. Maybe print out some information about the band / artist and include it, as the music itself is only part of the "product".
Kubernetes is awful about displaying secrets when you're sharing a live terminal to demo ops work.
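One workaround, sketched under my own assumptions (the redaction helper is hypothetical, not a kubectl feature): filter Secret manifests through a redactor before they reach the shared screen.

    import json, sys

    def redact_secret(obj):
        # Blank out base64-encoded values in a Kubernetes Secret manifest,
        # keeping the key names so the demo still makes sense.
        if obj.get("kind") == "Secret":
            obj["data"] = {k: "<redacted>" for k in obj.get("data", {})}
        return obj

    # Usage: kubectl get secret my-secret -o json | python redact.py
    print(json.dumps(redact_secret(json.load(sys.stdin)), indent=2))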
Anyone else reminded of A Deepness in the Sky?
[1]: https://www.grassvalley.com/products/routing/vega-100-series...
I wonder if it failed in practice because no boss had the patience to watch a programmer slowly write out a program. The video reminds me more of sci-fi computer interaction than actual programming. The boss's voice sounds like the robot cops beating the protagonist in THX 1138, or whatever it's called.
I definitely cannot imagine a more wonderful vision of going about my day job as a programmer :-D
yeah those are the ones I'm referring to -- if you're archiving something like family history or data that needs to be good for centuries (without having to re-copy and juggle), those are a better choice than just about anything else.
One of the chairs would read emails on his iMac, then would handwrite a return message and give it to my wife who would type it into email and send it as him. He didn’t want to type anything. This was around 2008 to give you an idea of timing. My wife didn’t stay for long, but my understanding is he was doing this until he retired sometime in the 20 teens.
Like, those who don't study history are doomed to repeat it. But, those who do are mostly doomed to watch from the sidelines as other people repeat it. And even the things that are possibly obviously bad ideas without historical analogues get done...
Alternatively, tell people that they can't store something and you're likely to find it robustly mirrored by many.
This was at the very edge of what engineers could do with the technology that existed, but even most executives, the type of user it was envisioned for, never knew anything about it, much less ever had anything like this much technology on their desks.
For everybody else, in the non-executive category, it was even more of a complete fantasy.
IOW the difference between what you see there vs now is minor, compared to the real "backward" state back then.
Even though things like transistor radios were already common, you have to realize that in a huge percentage of dwellings in the US, and way more in the rest of the world, there was still not yet a single transistorized product.
I was a young math & electronics geek and was aware of more stuff like this than average.
Along with all the much more mature people, like the extremely rare engineering students who might want to work for IBM or something, this was exactly the kind of thing that inspired the movie "2001: A Space Odyssey", which came out the next year.
Anyone who had any clue that something like this was already possible could basically agree on how cool it would be, and was really looking forward to the 21st century when it would be here.
If the world was not destroyed by nuclear war before the 21st century got here :\
The archiving software in this area is quite obnoxious and user unfriendly, so it happens every now and then that counties or government agencies decide to just print the lot of it on paper and put it in physical archives.
But I do remember going back to the 90s that there was at least one senior exec at a computer company I worked for who basically didn't touch his terminal as I understand it. His admin printed out and typed everything.
Uppercase characters are represented using a bar/macron over the top - I was a bit slow to work that out and I don't remember seeing that convention before.
Link just to video: https://youtu.be/UhpTiWyVa6k
Edit: pulvinar said "It's clearly a vector display". You can see a graph drawn with vector lines at 24:13, zooming at 20:50, and graphics mixed with text at 28:36.
I will concede that the aiming specifically isn't S-tier, but then again this is an "action-adventure" game, not a shooter, and everything else in the combat system more than makes up for that one less-than-perfect feature. Not to mention the fact that the game is much more than just the combat system. "Action" was just one of the characteristics I listed. The aesthetics and paranormal lore are reason enough to play it regardless of any combat.
It's incredibly satisfying to destroy the environment, throw objects and enemies around, levitate, dash in mid air... just thinking about it has me wanting to replay the whole thing even if I already know the mystery.
They're not entirely wrong in this regard - modern EMR web UIs are arguably inferior in many ways to some light-pen-driven systems of the 1970s-80s. I'm thinking especially of the old TDS system, which nurses (and the few docs that used it) loved because it was so easy and quick - replacing or "upgrading" it was like pulling teeth, and the nurses fought hard to keep it in every case I ever saw.
Old habits take a while to change. Managers and executives were used to reports and memos on paper. So when email arrived, it was very common for secretaries to print emails for their bosses to read. Even at one of my early jobs in the 1990s, changes deployed to production had to be documented in memo form, and a copy of the memo printed, along with diffs of the code changes, and filed in a filing cabinet.
We got there eventually. I'd say that for all but the oldest generation still working, printing any kind of document to hardcopy has become pretty rare, at least where I'm working.
> Dunlop’s 1968 video demonstration of the Executive Terminal and the Information Center proceeds in three acts.
The article doesn't make this clear but the linked videos are not a video demonstration but instead unedited B-roll shots without audio probably captured to be cut-aways edited into a narrated video demonstration. Unfortunately, that video demonstration isn't part of this collection (or was never created).
This insight helped me understand the mindset of the IBM executives, which I wouldn't have before; I'd have just dismissed it as wrongheaded pre-boomer silliness. The executives saw demeaning themselves with the scutwork of looking things up for themselves as an attack on their position, their dignity and worth as individuals, and the organization as a whole -- perhaps even society as a whole. Those filthy hippies with their (sissy voice) "collaborative work environments" and their "interactive terminals". They're working for the Reds, I tell ya, trying to unravel the nation from the inside!
I owe Le Guin a profound debt for opening my mind to mentalities vastly different from my own, yet still essential to the history of the computing world I live in.
Digital makes it cheap and easy to have multiple copies in many locations. While any one medium may fail, you still have a copy. I have on this computer all the data from whatever computer I was using 15 years ago (most of it I have not looked at in 20 years and could safely delete, but it is still here, and on my other backup systems too).
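A minimal sketch of keeping those scattered copies honest (my own illustration; the helper names are made up): checksum each copy and make sure they still agree.

    import hashlib
    from pathlib import Path

    def sha256_of(path):
        # Hash in chunks so large archives never need to fit in memory.
        h = hashlib.sha256()
        with Path(path).open("rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    def copies_agree(paths):
        # True when every copy of the file has the same digest.
        return len({sha256_of(p) for p in paths}) == 1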
They demanded that their 'engineers' be able to build out and manage both their own and their managed infra on AWS, but never write any code. In fact, they thought automation was outright dangerous: they said their engineers would never write any Terraform, CloudFormation, or similar, and that they wanted to become an MSP of cloud services, preferring to write everything down in runbooks... and print those runbooks out.
The managers would turn up to meetings with huge stacks of paper that were just AWS documentation converted to PDF and printed.
We refused to work with them and essentially walked out. I'm sure this is something that someone like an Accenture or Deloitte would and probably did jump on.
This was 2019.
These days your photos are probably backed up by Facebook, Google, or other such major players. (There are a lot of privacy concerns with the above, but they do tend to have good backups.)
Interesting UX fact: IBM researchers looked at user satisfaction on this system. They found that it wasn't poor response time that bothered people, but variability of response times. If users couldn't predict how long an operation would take, that bothered them. So they inserted delays so that average response times were maybe longer, but variance was lower. And users were happier.
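Roughly the idea, as a sketch (my own, not IBM's actual mechanism; the two-second floor is an arbitrary assumption): pad every operation out to a fixed minimum, trading a longer average for lower variance.

    import time

    RESPONSE_FLOOR = 2.0  # seconds; arbitrary illustrative value

    def respond_with_floor(operation):
        # Run the operation, then sleep so the reply never arrives faster
        # than the floor. Mean latency rises; variance drops.
        start = time.monotonic()
        result = operation()
        elapsed = time.monotonic() - start
        if elapsed < RESPONSE_FLOOR:
            time.sleep(RESPONSE_FLOOR - elapsed)
        return result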
https://www.scribeamerica.com/what-is-a-medical-scribe/
The TDS Health Care System had some unique advantages but unfortunately it was tied to obsolete technology and ultimately a dead end. Web UIs aren't necessarily a problem. Some of the most popular EHRs such as Epic use native thick client applications. The fundamental issue is that healthcare is inherently more complex than almost any other business domain, with every medical specialty needing a different workflow plus beyond the clinical stuff there are extensive documentation requirements imposed by payers and government regulators. Sometimes clinicians and administrators insist on certain functionality even when it makes no sense due to ego or ignorance. EHRs can be improved but I know from painful experience how expensive and time consuming it is to get everything right.
Have you read Stranger In A Strange Land? The alien word "grok" from that book has a similar way of being useful, and that one actually managed to make it into general speech somehow - at least by hacker types. In the book it's an alien word that literally means "to drink", though it really means something like "attain a real understanding of."
“I don’t know why we call it a mouse. Sometimes I apologise. It started that way and we never did change it”
Truly a sense of looking back into the past and seeing history in the making.
Thinking back, it wasn't at all unreasonable at that time not to have used one, but even then it seemed unfathomable.
It's one of those "the future is already here, it's just not evenly distributed" situations (paraphrasing William Gibson).
You make a good point about the poor durability and instability of many types of chemical photo processes (especially color negative and print processing). I do think many digital formats will be lost to time while a color transparency or b&w negative will still be viewable without much aid far into the future.
One of my favorite photo books is the Rephotographic Survey Project by Mark Klett. He went around re-capturing the exact locations (and camera position) of notable images of the American West from the early days of the US Geological Survey, when they had a plate photographer on the team. We are talking about a time period just after the US Civil War, so we see a landscape captured in time a century or more after the original.
I've been a pro photographer for over 30 years. All my earliest digital work is archived in RAW, so I have the original shooting data. It's all triple backed up, and I have a friend who lets me store one of my backups at his home. I've been amazed at how many photographers lost track of or threw away their older work. I'm still licensing my work hundreds of times a year, and some of this older material is becoming even more valuable simply due to scarcity. The redundancy of digital is great if you take archiving seriously.
Yet I still have drawers of original film from the late '80s to early 2000s. I'm scanning a few but will probably let many be disposed of...
When I was a kid my medical chart was paper. When I was around 13 years old the pediatrician’s office moved to an EMR.
It was more or less a digital version of the same chart.
As I have grown older, and with the benefit of having medical professionals in my family, I’ve seen how EMRs have changed from a distance. From an anecdotal perspective it seems like charting is more time consuming than it used to be. I’ve witnessed many different medical professionals using many different EMR platforms, and poor design seems to be a factor there.
They also deal with more information on a patient, and in a more aggregate form, than paper charts ever did. From what I've observed, I would venture a guess that more than a little of that is the result of neuroses and anal tendencies on the part of healthcare executives rather than quality improvement initiatives or research-oriented objectives. There are other externalities, like bad vendor implementations of CMS requirements, or the continued splitting of conditions into ever more granular ICD codes, which then need crosswalk databases and interfaces and cross-checks.
On the patient side, I’ve only ever truly been impressed by Epic’s portal. Every other one I’ve used is comparative garbage. I have recently been having a conversation with a manager at my doctor’s office trying to understand why and what changed so that chart data that used to be visible to me are now only visible to them, and why they can’t change that. It seems like the vendor implemented a forced change and I may just have to live with having ambiguously incomplete access to data I used to have access to, with no insight into what’s incomplete unless I already know.
With all of that said, at least there’s some access to one’s own health data. And comparing that to my birth records, which are functionally illegible (likely forever), at least what records are kept will be decipherable twenty years from now. Presuming they’re not mangled by a migration, which I’ve seen happen several times.
I really enjoyed working this way and kind of wish the same experience could be replicated between multiple machines and with more than two people. I would like it if anyone could drag an application onto a shared screen where multiple people could control/interact with it, while still having the option to take a window from the shared display back to a private display, i.e. passing an application from one system to another.