Harder Core Than Thou

We are still that hardcore.

Forty to fifty years ago, in the 1960s, I.T. professionals worked with systems which, by modern standards, were close to useless. (The term "information technology" didn't exist, but let's call the computer users of that era "I.T. professionals" regardless.) These systems were cutting edge for their era, which is respectable, but they lacked much which we take for granted. For example, output was sent to a printer, not a monitor, and thereby consumed ink, paper and a great deal of time. Any serious computer system occupied an entire room, if not a floor, of your building. It required a team of people to maintain. It was inscrutable and difficult to handle, because any concession to usability carried a real cost in processor cycles and this was the era when computer processor time was more valuable and difficult to replace than human programmer time. You were lucky to get one compile per day, so you wrote your code out by hand, punched it into cards by hand or with some kind of mechanical machine, and avoided dropping them on the floor because re-ordering them was such a monumental challenge. It was the Bronze Age of computing or whatever metaphor you want to use. Programming was difficult back then and you had to meticulously validate every line of code you wrote, and have it cross-checked, because a failed build cost serious time and ultimately pushed back serious deadlines.

We get it. This is just how computing -- of necessity -- worked in that era. And in order to accommodate these limitations, people wrote good code. Code that would run continuously without hiccup or malfunction for literally forty years -- provided the underlying hardware didn't change significantly. And they wrote that code first time. Because they had to. Code that was close to perfect. The equivalent of walking a tightrope.

Those were great people. No question. They accomplished great feats and created great software, some of which is still operating to this day.

The problem is that people point at the leftover COBOL and processes from that era and lament what software engineering has lost. The statement, whether implicit or explicit, is that, in this era of comforting, forgiving, error-tolerant, problem-catching IDEs, and of programming languages which were - horror of horrors - built to be more usable than they are efficient, we have somehow lost the capability to write code which compiles first time. That we have lost the ability to write code which will run continuously for forty years without a hitch. That we are no longer hardcore.

I contest the "forty years" statement. We are observing from a biased perspective. The only code from forty years ago which we can still see running is, by definition, the code which survived. How much code from that era is dead and erased by now, replaced with something more powerful and maintainable? Who knows? We don't remember that stuff. Because it was erased. What's the percentage, exactly?

Of course a computer program which runs without fault will continue to run without fault indefinitely. It just needs the hardware to support it. If the hardware manufacturers are prepared to continue to provide compatible hardware -- which they are -- then any proven program will be able to run indefinitely. It's not like some memory leak which was negligible after thirty-nine years will suddenly become fatally significant after forty. We work with basically mathematically perfect systems which, by now, have tolerance for hardware faults. If the program is good, and the hardware is good, there will be no problem.

And people write forty-year software all the time. Even right now. We have a tolerant approach to hardware failure these days, of necessity, but that doesn't mean that the software running on that metal is bad. People write software all the time which goes on to run, largely uninterrupted, for five years. Left alone, and properly supported, that software will run for another thirty-five years. No problem. It's not actually that difficult to write good software. The hard part is reserving the forty-year period needed to prove that it's good.

And I contest the "programming greats" statement -- the assertion that they don't exist anymore. There is a well known piece of prose about Real Programmers. The subject's name was Mel. (Mel wrote unmaintainable software. The technology to write unmaintainable software has not been lost. But that's beside the point.) Mel wrote highly efficient code, and was able to do so thanks to his intimate and frightening understanding of the underlying system. People don't write code like that anymore.

That doesn't mean that we can't.

The technology to be incredibly freaking smart has not been lost. People like Mel still exist, except that they fry bigger fish now, and create highly performant software on an entirely different level. The technology to make software which is effectively perfect has not been lost. People who can do it, first time, still exist. But the need to do that has gone. And software which is one hundred percent perfect is not what the customer wants, unless the customer is NASA. And NASA does have software guys. And they do deliver. The reality is that in the majority of situations, what the customer wants is 99% perfect software now, rather than 100% perfect software delivered five years from now at five times the price. The natural state of I.T. right now is one of continuous rapid change, and I.T. consumers are used to this concept. They're used to upgrading when the new fixes are ready. The requirements have changed, and the providers have changed to meet those requirements.

This is not weakness. It doesn't make modern software engineering an easier or wussier task. Don't confuse a relatively low quality of output with low effort or low ability. Look at the commensurate rapid pace and high complexity. It's a trade-off which is made consciously.

If we had to write the Right Stuff, we could. We, programmers as a society, are still That Good. We are still idealists, and the ideal is still attainable. But we simply choose not to, because our clients are pragmatists, and they are the ones for whom we work.

Discussion (14)

2010-12-31 23:20:05 by bbot:

Perhaps relevant to note that, even though the article is 15 years old, it's still perfectly relevant. The shuttle hardware is the same. The program is certainly the same.

2011-01-01 00:37:22 by skztr:

This reminds me of NASA vs the rest of the world. NASA engineers things using the purest science they can make use of. They take ten years and build a mathematical model of a perfect spaceship whose design very much resembles a perfect sphere. So they go out and they build a perfect sphere, with tolerances beyond any measurable precision. They need to hold it together with screws, so they come up with a model of a screw, determine exactly how much it needs to weigh, and spend $500 per screw to ensure that each one fits the model exactly. They know that if there is any deviation, the result will not fit the model, and so it may explode. The rest of the world can't spend thousands of dollars on a handful of screws, so they come up with a model which isn't so broken that it falls apart at the slightest deviation. That is the difference between "Computer Science" and "Programming", and it is the difference between working out the perfect solution to a problem and actually making a solution which works (especially once your boss casually tells you the real specs for the project when you're ninety percent done coding what he said six months ago)

2011-01-01 04:43:47 by YarKramer:

You know, I've always viewed phrases like "real man" or "real programmer" with a certain amount of disdain, especially once I grokked the "no true Scotsman" fallacy. This kind of thing is one of the reasons for that.

2011-01-01 07:01:00 by JoeCool:

I would like to state my disdain for the notion of hardcoreness in general. Rather: Is being hardcore awesome? Yes. Should you look down on people who aren't? No. Well okay it's fine if they think they're "t3h l33t h4x" but if they just want to use a computer then shove off. Mostly stop thinking you're hard core because you're using notepad instead of an IDE. You're not being clever, you're coding slower than you could be for no good reason. Genius programmers write genius code in whatever environment they're allowed. Genius python beats idiot x86 any day of any week.

2011-01-01 10:00:04 by Artanis:

@JoeCool I agree that IDE vs plain-text editor is kinda silly. It's a preference, and you can write good code in either. Personally, I take issue with the programmers that let the IDE write 90% of the program. One of my classes is using the WindowBuilder Eclipse plugin, and while the code generated is correct, I'm not sure anyone really understands it (I know I don't). I tried, I tried, but I couldn't get the professor to not make 9 *identical* anonymous classes to handle click events for a tic-tac-toe example. (no, no, make the handler here, assign it to a local variable. there. now pass that variable instead of--no, don't copy the class definition into all nine add handler calls. yes the variable is unused. you didn't use it. i give up.)
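
Artanis's complaint is easy to sketch in a language-neutral way. Here is a minimal Python analogue (the `Cell` class and handler names are mine, not the Java/WindowBuilder code being described): define the click handler once and hand the same object to all nine squares, instead of pasting nine identical anonymous classes into nine add-handler calls.

```python
class Cell:
    """A hypothetical tic-tac-toe cell that notifies a click handler."""
    def __init__(self, index, handler):
        self.index = index
        self.handler = handler

    def click(self):
        # The cell reports which square was clicked, so one handler
        # can serve the whole board.
        self.handler(self.index)

clicks = []

def on_cell_clicked(index):
    # One shared handler for every cell on the board.
    clicks.append(index)

# Nine cells, one handler object -- not nine copies of the same class.
board = [Cell(i, on_cell_clicked) for i in range(9)]
board[4].click()
board[0].click()
# clicks is now [4, 0]
```

The point is exactly the one in the comment: the variable holding the single handler is what you pass nine times; duplicating the class definition nine times produces the same behaviour while obscuring the design.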

2011-01-02 02:32:34 by green:

"what the customer wants is 99% perfect software now" Yeah, I am okay with 99% perfect software. But I get annoyed by the surplus of 85% perfect stuff. It is just good enough for the user to expect it to work most of the time.

2011-01-03 09:41:04 by AlexB:

I agree with Artanis. I've written in several different IDEs in a couple different languages (I prefer plain-text C, old habits die hard) and if you know what you're doing, you accomplish the exact same thing. The biggest thing is the idea. The power to be had in raw creation. It doesn't matter where you work, so long as your design, your plan, your idea, is solid.

2011-01-03 16:32:52 by HOLIERThanThou:

Can we please stop using the word "performant"? Thank you. That is all.

2011-01-04 21:55:50 by PropagandaPanda:

Wouldn't it be relevant to also point out that the programs of today are also pretty much always WAY more complex than the stuff of forty years ago? I mean the software that has to run a cell phone has to run more than a simple dial pad: it has to run the weather widget, the clock, the ring tone system and so on. The software that runs an e-commerce site has to keep track of who ordered what and how many and charge the right people and send it to the right place and do that a thousand times per minute and keep it all secure with an inventory of millions of items. Ok, yeah, it's sort of a ratio thing, we have better things to program with for more complex programs than 40 years ago, but I'm not sure I'd call it a 1:1 ratio.

2011-01-06 21:25:17 by Blank:

I don't think the level of code or programmers decreased. In fact, I think the opposite happened. Over the last century the whole concept of programming has gone through so much research and discovery that modern programmers can create quality, highly efficient code. Remembering the past better than it actually was is a common bias in the human brain. This ancient code you are talking about wasn't all that perfect, and the lack of proper debugging and development equipment didn't help. You must also consider the additional requirements that developed over the years. While in the past making a console application that would just do the job was enough, the market now demands every program to contain user-friendly UIs, to integrate with the operating system and its mechanisms (API, registry, drag&drop, message forms) and most importantly modern code must be secure and comply with many security standards. While today every little exploit gets found and abused and affects tens to thousands to millions of computers worldwide, I bet none of these 1960s developers really bothered to protect their code against all the common exploits known nowadays (I once saw a warning on a Python forum not to use tempfile.mktemp to generate a random temp file name, because by the time you created that temp file an attacker could possibly make the directory a symlink to the system directory and thus abuse your hypothetical su access to read system files, if you even had such access. Seriously, are you kidding me?). And look at the results. Look at the Linux kernel, at the GNU project, at Ubuntu. They are the meaning of modern coding. They write millions of lines of code. They have to deal with all that shit I mentioned before, and thanks to all the modern programming mechanisms, they are doing it wonderfully. I don't think anyone in the past ever managed to get to this level of quality code.
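
The tempfile race Blank describes is real, and the standard fix is short. A minimal sketch: `tempfile.mktemp()` only *names* a file, leaving a window between naming and opening it that an attacker with write access to the directory can exploit; `tempfile.mkstemp()` creates and opens the file atomically, so there is no window. (The prefix/suffix values here are arbitrary illustration.)

```python
import os
import tempfile

# mkstemp() atomically creates the file with mode 0600 and returns an
# already-open descriptor, so no attacker can pre-create or symlink the
# name between "pick a name" and "open the file".
fd, path = tempfile.mkstemp(prefix="demo-", suffix=".txt")
try:
    with os.fdopen(fd, "w") as f:
        f.write("created atomically, no name/create race\n")
finally:
    os.unlink(path)  # clean up when done
```

For temp files you want garbage-collected automatically, `tempfile.NamedTemporaryFile` or `TemporaryDirectory` make the same guarantee with less bookkeeping.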

2011-01-08 09:18:25 by Snowyowl:

Obligatory XKCD link which actually makes much the same point, though less in-depth: xkcd.com/378

2011-02-13 03:07:53 by KeaponLaffin:

I think there are a few false assumptions here. 'Back in the day' perfect programs were written for all the reasons above, plus because the platform was static. For long periods of time the architecture was always the same. Every change was momentous and often changed the whole paradigm. You programmed it small because 8-bit was all you had to work with and you could make it perfect because it'll always be running on the same machine or an exact copy. That's why NASA -can- demand that its programmers make it absolutely perfect. That program will always be running on that machine and no other. The specifications of the components are known intimately with high accuracy. Now we have different OSes, different versions, updates, multitasking, 3 new video cards since the project started, codecs, blah blah. So in that way 'we can't' write 'perfect' code in this modern world. It would be obsolete before your morning coffee. The 99% customer thing too. Customers are absolute geniuses at asking for that one little thing... that to 'cleanly' implement would require a major structure overhaul. So you hack it in, also because that's all they pay you for. Rinse and repeat. My personal favorite was when the customer asked that bills now be calculated to a millionth of a peso. I'll take your word that leftover COBOL is pretty or whatever. I'm a young COBOL programmer maintaining a program first written before I was born. Not one of the dozen on our team understands much of the thing and many have been around it a few decades. Some pieces of code haven't even been looked at since the mid 70s. That code sucks. I've even seen some GOTOs! The entire mess is spaghetti code like you wouldn't believe. I think it's in pain and hates all of us.

2019-12-09 17:43:29 by Filursiax:

I am not a programmer by any stretch of the term. I have, however, done very heavy-duty word processing work on a near-daily basis ever since the early 1980's, and have some very clear ideas about what a good word processor should be and not be. And since this seems to be a forum for some progressively-minded programmers, I am finally permitting myself a RANT, in hopes that someone will hear me and (oh vain hope!) do (or inspire someone else to do) something about it. So: On the basis of -very- considerable experience, I assert that there have been no improvements in word processing equipment since WordPerfect 5.1+. Indeed, from the point of view of the -user-, the programs have gotten worse: slower, more buggy, less versatile, with fewer useful features, and with ever more clunky and useless interfaces. Why is this? Why does no-one take up the challenge and do something about it? The once and future market leader - Word - is a joke. Aside from the long-overdue docx file format (still inferior to wpd) and enhancements to the spellchecker, there have been no substantial innovations at all. The new interface is pathetic. Open source programs (OpenOffice and Libre Office) are free, but otherwise simply mimic Word. So what was the big deal with WP 5.1+? (1) It was fast - incredibly fast. With a macro, it could do 20-30 complex search-and-replace operations on a 100+ page document in a few seconds (on a 486 machine!). It saved and retrieved large documents faster than anything on the market today. Menu display and complex macro execution (see below) were near instantaneous. (2) It was bug-free. In all my years of heavy use, I discovered one - 1! - bug (back in v. 5.0). I notified WP of it, received a friendly reply from the company, and it was corrected in the next version. The service was of course legendary. (3) Every new release pioneered new and genuinely useful features. 
The developers understood the needs of people working with large and complex texts, and strove actively to meet them. (4) Its formatting capabilities were on a professional level and superior to anything (aside from LaTeX-type programs) on the market today. (5) Most fundamentally, it had a macro system that was simple enough for a non-programmer like me to understand and use, and versatile enough to do just about anything I could imagine, including reprogramming the macros themselves. The program had two weaknesses (in addition to weaknesses that were generic to the DOS environment): (a) The interface was simple, but completely non-intuitive. And (b) the macro system was very poorly documented, which was unfortunate, since this was what really made WP 5.1+ shine. So what was so great about the macro system? Back in the 80's, my brother and I explored the system in depth, and as a result were able to design a completely new interface for WP as a whole. Its features included, for example, the following: (1) A menu system consisting of a single line at the top of the screen (invisible until you pressed F12), with drop-down menus that contained every single feature of the program (plus any macro you wanted). It would return you to the exact menu state it had when you last accessed it (great for repeating the same operation). It could be simply rearranged and added to at will. You could go back through the menu hierarchy step by step or directly to the editing screen. And - most notably - it could be navigated faster than by mouse, using nothing but the arrow keys on the numeric keypad (with enhancements). 
(2) On exiting WP, it could save every particular of the program's state on exit (last documents edited, cursor position in those documents, menu state(s) on exit, last directory accessed and cursor position in that directory) - by saving a number of simple variables to a text file, which were retrieved on restart, allowing you to continue your work at exactly the point where you left it. (3) A search/replace system that allowed you to specify a list of search and replace terms (with wildcards), and either replace multiple search terms with a single replace term (wholly automatic) or replace one or several search terms with multiple alternative replace terms (requiring manual selection of the replace term - using the arrow keys). All search/replace operations could be saved in the menu system for re-use. (4) An innovative and consistent text navigation system (using arrow keys only) which allowed ultra-fast movement through the text (faster than a mouse). Features included: 1-touch jump to the midpoint of a word, line or (vertically) screen; double-speed character-by-character or word-by-word motion; navigation with block on using the same keys with Ctrl-key depressed; the ability to adjust both ends of a blocked area without cancelling the block and starting all over again; and a number of 1-touch shortcut keys that performed common editing operations (e.g. toggle capitalization on/off, exchange position of adjacent letters or words (ot > to; if as > as if), etc.). All of this (except for the Ctrl key for blocking) was accessible to the right hand. (Another nifty detail simply turned off all illegal keys when typing file names, so you avoided the idiot message on saving that "x" is illegal in file names. Why nobody ever figured out that one since is beyond me!) There was a lot more. 
It was a simple, fast, and easy-to-understand replacement of the entire WP interface, which retained all of WP's strengths, while saving incredible amounts of time and completely bypassing mouse arm syndrome. We were just getting the installation procedures and help files ready, and were planning to donate the whole package to WordPerfect, when WYSIWYG and multitasking came and wrecked it all. But consider the flexibility of the program that allowed us to do all this! The dumbing down that has taken place since then is incredible. I have nothing against WYSIWYG or multitasking or long filenames, but for a non-programmer they spelled the end to the work we had been doing. What I cannot understand is why time- and trauma-reducing features of the kind we were working on have never since appeared in word processing programs (including WordPerfect itself). Instead, all you get is ever slower and more bloated programs, with fancy-looking but useless interfaces (cf. Word's "ribbons"), and not a single new feature of any note. It's depressing. P.S. I have been using Autohotkey to replicate parts of our old system, but my programming skills are inadequate for anything but the simplest purposes.
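
The multi-term search/replace Filursiax describes as feature (3) is easy to approximate today. A minimal sketch in Python (obviously not WP macro language; the function name is mine): replace a whole list of search terms with one replacement in a single pass, rather than running many separate search-and-replace operations.

```python
import re

def replace_many(text, terms, replacement):
    # Escape each term so it matches literally, and try longer terms
    # first so "abc" isn't shadowed by a shorter term "ab".
    pattern = "|".join(re.escape(t)
                       for t in sorted(terms, key=len, reverse=True))
    # One pass over the text replaces every listed term.
    return re.sub(pattern, replacement, text)

result = replace_many("colour flavour honour",
                      ["colour", "flavour", "honour"], "X")
# result == "X X X"
```

A single compiled alternation like this is also much faster than looping over the term list and rescanning the text once per term, which is presumably part of why WP's one-pass version felt so quick.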

2021-05-27 19:16:07 by Verisimilitude:

This article has aged poorly, with the rise of slovenly WWW code artisans. Most programmers, like most people, don't justify their existences from a purely utilitarian perspective, and only pretend to care due to money, resulting in disgusting and wasteful software. Ironically, many of the originators of such shun optimization, and yet pretend to care about environmental waste others produce. One thing computers were expected to do seventy years prior was work correctly, which is no longer true. The cult of worse is better hasn't been destroyed yet. Both Lisp and APL are from this supposedly barbaric era, and it's an indictment of the modern age that they've yet to be bested. What should be extremely disconcerting is that computing was already pulling itself up out of the low levels, from the hardware up, only to be pushed back in. People used more powerful languages decades ago modern programmers wouldn't now touch due to ``efficiency'' reasons. Look at the SCHEME-79 chip, the Lisp Machines, and the offerings from Burroughs. ``Mel wrote highly efficient code, and was able to do so thanks to his intimate and frightening understanding of the underlying system. People don't write code like that anymore.'' I disagree. I strive to write machine code like this, although I've not found others who do; I've yet to join the Demoscene, which seems like one of the only remaining havens for such. I've developed a specialized class of tools purely for such programming: http://verisimilitudes.net/2017-07-07 I want to share my favourite passage from ``Alan Turing's Electronic Brain'': ``Optimum coding - good or bad? It led to very fast routines but we could overdo it! I recall that when a built-in divider was available on DEUCE, I explored the usual iterative method for finding square roots to replace our digit-by-digit method learned at school. 
I found that for a floating point number x in the range 1/2 - 2, a starting guess of a + x/2 gave full accuracy through the range with, if I recall correctly, three iterations. The value for a was around 0.48. By chance, I spotted an instruction which when converted to decimal was about this size, so I used it to save a store. Some time later, a puzzled user had considerable difficulty in understanding my inadequately documented routine.'' It's much easier to use code as data than vice versa. How could it be that others could read of these feats and none feel compelled to walk that path? It isn't, because I'm one counter example. Unfortunately, most seem content with never stopping to question anything about computing, and instead walk into the open arms of cults. ``The reality is that in the majority of situations, what the customer wants is 99% perfect software now, rather than 100% perfect software delivered five years from now at five times the price.'' This supposes nearly-perfect is common, but I'd put it closer to fifty or seventy-five percent, and so unacceptable. If correctness ceases to be a goal, it will eventually become unobtainable. Fortunately, real programmers, using languages such as Ada and Common Lisp, can still deliver it. ``We are still idealists, and the ideal is still attainable. But we simply choose not to, because our clients are pragmatists, and they are the ones for whom we work.'' This idea of being capable and simply unwilling is a pathetic delusion. Part of the reason so much software is horrible is directly because of this attitude. An employee of a business demands characteristics of software he won't use, made by apathetic fools, and the true users suffer. So few understand for why machines exist, which is to automate and automate and automate and so on: http://verisimilitudes.net/2020-09-24 I'm a hacker, and the differences between us are the differences between mere coders and hackers.
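
The DEUCE anecdote quoted above is a concrete, checkable algorithm: Newton's method for the square root, seeded with the cheap linear guess a + x/2, a ≈ 0.48, on the range 1/2 to 2. A minimal sketch (the function name is mine; the trick of reusing an instruction word as the constant is of course not reproduced):

```python
def deuce_sqrt(x, a=0.48, iterations=3):
    """Approximate sqrt(x) for x in [0.5, 2.0] as in the DEUCE anecdote."""
    y = a + x / 2              # cheap linear starting guess
    for _ in range(iterations):
        y = (y + x / y) / 2    # Newton step: the error roughly squares
    return y
```

Three iterations really do suffice: the worst starting error on the range is about 0.066 (at x = 2), and Newton's quadratic convergence drives that to roughly 1e-13, which is "full accuracy" for any word length the DEUCE had.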
