I would disagree: The problem is developers (and users in general, but their lack of formal training is an excuse) being comfortable using interfaces and abstractions they don't fully understand.
Note that the result of this might sound like it makes the idea of a professional system administrator invalid, but that's not true: I think the better SAs of the past had a thorough understanding of what their tools did, and many of them probably even modified them. This contrasts with the current situation, where people are poking at things in PaaS GUIs and accidentally running up huge bills.
I am agreeing with what you wrote, though. Or at least I am trying to.
I have seen the results of that numerous times: for instance, a developer creating a tool decides that his interpretation of a bad requirement is satisfied in a poor way, or a sysadmin decides that the default configuration is good enough because he ran `mv conf.example conf` and it works.
I guess what I am trying to say is that the learning curve, given the complexity of software and systems nowadays, combined with the lack of judgement and training among the users / developers / sysadmins of those systems, results in decisions where risk is not taken into account.
I miss the time when people genuinely knew what they were doing, and had a mindset that allowed them to avoid or prevent risk in the decisions they made during their daily tasks.
Was there ever a time like that? I’m only in my 20s, but I can’t recall a time when most computer systems weren’t terrible. (I love computers, but honestly, they suck ass.)
Unverifiable working theory (I didn't live through the 80s myself):
I think computer systems were always terrible (I think our brains were only ever able to properly/completely grok the old 8- and 16-bit microcomputers and early game consoles), but because early networks and computer systems were built almost entirely on a combination of naivete and lack of awareness on the part of the large corporations, the sysadmins of the day got free rein to do whatever they wanted however they wanted it, pretty much by default.
With no "here, please build this highly technical thing.... with management's help" boring into your back making you question your own every move, and a status quo (read: world market) that simply didn't understand what was possible and how quickly it could be accomplished, the technical scene of the 80s and 90s was largely owned by the sysadmins, free to run everything at whatever pace they liked. You can imagine that genuine motivation and interest in maintaining mastery of the craft flourished in such an environment. So of course people knew what they were doing.
Lines of respect were drawn. (And script kiddies learned where the "real" sysadmins were and stayed away - lest they be out-pwned.)
Today, everything runs far too quickly to get the remotest handle on which way is up, let alone a competent understanding of all the interactions between everything. At the same time, a lot of early default assumptions (e.g., "UNIX is a universally good design, and C is the best universal programming language") are being thoroughly trounced because of scalability issues nobody would have dreamed of in 1994, and security issues that a bygone era was honestly too complacent to take seriously.
Incidentally, the vitriolic reaction to the introduction of systemd IMO makes for a good example of why conservatism is bad: when push came to shove, those same sysadmins that had free rein to go as fast - or as slow - as they wanted had grown so complacent and non-proactive that they were unable to coherently a) band together and b) argue (showing the "workings-out", not just the solution) against a cause none of them believed in. Instead there was vitriol, death threats and a peanut gallery louder than a fireworks display. TL;DR, that era did not age well.
(Note that I do not give the example above as a point of "oooo, they're utterly incompetent and incapable of anything" or "80s = bad". Black-and-white-ness is not implied by there being only one datapoint.)
I especially agree with your last note. Again, I’m young, but it seems to me that the shortcomings of humans have a largely homogeneous distribution through time and space.
I was born in 1991, so a lot of my understanding/worldview of this era is synthesized from hopefully-representative anecdata. Also, learning difficulties (gross oversimplification, and not realizing I'm misinterpreting something) add extra spin to some of my earlier models that I also have to correct for (and in some cases notice in the first place).
Haha, how wrong am I? "You're sorta heading in the right direction, if you squint right, but missing 99% of nuance/context"? Or "the turnoff for the correct forest was 50 miles back"? :)
If you have any suggestions for good references I could absorb that will (misinterpretation notwithstanding) present a decent capture of the nuance of 40 years ago, I'd love to hear them.
I think it's mostly because the Jargon File, BOFH stories, and similar lore only pick the interesting parts...and let's face it, parts that make the sysop narrator look good (as opposed to the "suit" antagonists).
Of course it is described as The Golden Age, because that's how all the stories of any golden age are curated.
In other words, I don't think that was a time that actually happened: we just remember the good parts and perhaps a few spectacularly bad parts, but never the mediocre hacks that always make up the remaining 90% of everything.
> being comfortable using interfaces and abstractions they don't fully understand.
I don't think it was an interface or abstraction that got the user in trouble here: they were using a pair of systems in ways that were fine on their own, but which, combined, led to an emergent vulnerability that they didn't even know to consider.
It may be sheer pedantry but I really do see this as a unique "systems" issue, and this type of 'emergent' property between separate self-contained programs is fully within that domain.
I think it is an abstraction: the one underlying both of those components that combined to create the vulnerability.
The abstractions provided by the OS compose in very surprising and hard to predict ways for humans. This is why newer systems don’t use the same abstractions (JavaScript and browser APIs), or else sandbox them much more thoroughly (iOS).
Those newer tools have newer problems, of course, but I think a lot of the churn and reinvention of tech that we complain about is really about trying to find abstractions that combine in more predictable and useful ways.
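Not from the thread itself, but a toy sketch of the kind of emergent composition being described: two OS abstractions (ordinary file writes and symlinks), each perfectly reasonable in isolation, combine into a classic symlink-clobber vulnerability. All paths and names below are invented for illustration, and the "attacker" and "privileged program" run in one process only to keep the demo self-contained.

```python
import os
import tempfile

# A sandbox directory so the demo touches nothing real.
sandbox = tempfile.mkdtemp()

# A file the "attacker" cannot meaningfully write to directly
# (pretend it's owned by root).
victim = os.path.join(sandbox, "victim.txt")
with open(victim, "w") as f:
    f.write("original contents")

# Attacker: plants a symlink at a predictable path the privileged
# program is known to use. Creating symlinks is a normal, safe
# operation on its own.
predictable = os.path.join(sandbox, "report.tmp")
os.symlink(victim, predictable)

# Privileged program: "just writes its own temp file" -- also a
# perfectly ordinary use of the file abstraction on its own.
with open(predictable, "w") as f:
    f.write("attacker-chosen contents")

# Emergent result: the write followed the symlink, so the victim
# file was clobbered even though neither abstraction is "broken".
with open(victim) as f:
    print(f.read())  # -> attacker-chosen contents
```

Mitigations like `open(..., O_NOFOLLOW)` or unpredictable temp names (`tempfile.mkstemp`) exist precisely because this interaction was discovered the hard way, which is rather the point: the danger lives between the abstractions, not inside either one.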
From this[1] insightful video essay by Kyle Kallgren:
>> Metaphor Shear -- That feeling all users experience when you realize the metaphor you are working in is bogus. When the computer fails you and you remember that there are a hundred translations between input and output. Codes and translations we don't have the time or patience to do ourselves. Intellectual labor that we've surrendered to a device.
>> The joke at the center of Douglas Adams' Hitchhiker's Guide To The Galaxy is about metaphor shear. The answer to an important question lost on its long journey from input to output. A computer glitch so huge, so strange and so embarrassing that its programmers have to make a computer the size of a planet to file a bug report.
I’m not sure I agree with this: there’s a point where you start infantilizing users and making it difficult or impossible for them to do useful things with their computers. iOS is still like this (although they may be headed in the right direction), and I’m fairly certain the web absolutely is.
I will say Apple absolutely got the built-in ssh client in iOS 13 right.