zlacker

[parent] [thread] 6 comments
1. moonch+(OP)[view] [source] 2023-04-05 19:49:05
This just reminds me of the constant "things ran so fast on my Windows 95 machine back in the day with 16MB RAM". Meanwhile any piece of software could crash your PC, and it did so regularly (I still spam save in software because of those days), and the internet was a Pandora's box.

I wonder how much overhead in modern OS/PC user experience comes from security/stability abstractions and tools.

replies(2): >>jacobs+02 >>dylan6+s2
2. jacobs+02[view] [source] 2023-04-05 19:59:09
>>moonch+(OP)
I think it mostly comes from the fact that computers are so fast now that people write apps without worrying too much about performance - apps have always grown to use whatever resources are available. But when your app had to run on a Pentium with 16MB of memory, you actually had to work hard on performance because you had such limited resources.
replies(2): >>moonch+56 >>flatir+dn
3. dylan6+s2[view] [source] 2023-04-05 20:00:45
>>moonch+(OP)
>(I still keep spamming save in software because of those days)

Muscle memory prevents me from being able to type a semicolon without cmd-s being the very next keys typed.

4. moonch+56[view] [source] [discussion] 2023-04-05 20:24:16
>>jacobs+02
Yes, but people view software from that era through nostalgic rose-tinted glasses - it was hot garbage that crashed all the time because they had so many constraints. Yeah, GC introduces a bunch of overhead - but it also means you don't get segmentation faults, memory corruption, etc.

Modern software is much more reliable than the software from that era, people nowadays complain when a button isn't working - back then a button could randomly freeze my entire PC.
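A toy Python sketch of that tradeoff (names here are illustrative, not from any particular app): once the last strong reference is gone, the runtime reclaims the object, and a stale handle reports None instead of handing back a pointer into freed memory - the class of bug that used to be a use-after-free crash.

```python
import gc
import weakref

class Buffer:
    """Stand-in for some heap-allocated object."""
    pass

buf = Buffer()
handle = weakref.ref(buf)   # weak reference: does not keep buf alive

del buf        # drop the last strong reference
gc.collect()   # force a collection pass (CPython frees it on del already)

# The runtime reclaimed the object; the weak handle now returns None
# rather than a dangling pointer into freed memory.
print(handle() is None)
```

In a manually managed language the equivalent stale pointer would still "work" until it corrupted something, which is exactly the randomness people remember from that era.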

replies(1): >>throit+go
5. flatir+dn[view] [source] [discussion] 2023-04-05 21:57:40
>>jacobs+02
And computers are so vastly different. We have these layers upon layers to deal with those differences. Back in the day it was just DOS and a 386/486, then optimize the crap out of it. Even Doom had its sound done through a compatibility layer. Nowadays you need to deal with multiple video cards, OSes, and processors. It's just easier to make a one-and-done solution and leverage it.
6. throit+go[view] [source] [discussion] 2023-04-05 22:03:13
>>moonch+56
> it was hot garbage that crashed all the time because they had so many constraints

Correlation != causation. I started using PCs heavily in the mid 90s, and yes, "Illegal Operations" abounded. However, the SDLC has also come a long way with testing, automated QA, etc. Back then there was a lot more "wild west" going on for both hardware and software. Generally, practices are much more mature by default nowadays.

replies(1): >>moonch+l11
7. moonch+l11[view] [source] [discussion] 2023-04-06 02:37:36
>>throit+go
But that's my point - the kinds of constraints they had back then are not at all how we build software nowadays.

I remember people debating using global variables back then - I haven't seen a team not using unit testing in years. Scaling code up to multiple contributors, standardizing abstractions, building for automated testing, etc. We've taken many tradeoffs in the direction of development scalability and stability/correctness at the expense of performance and simplicity.

I still see people praising the Visual Basic form builder - I think those were the kids who started doing dev with it and were impressed they could put dialogs on a screen. I think it would be extremely hard to find someone who maintained a nontrivial app with that code-behind shit and thought it was a good idea.
