Harder Core Than Thou

We are still that hardcore.

Forty to fifty years ago, in the 1960s, I.T. professionals worked with systems which, by modern standards, were close to useless. (The term "information technology" didn't exist, but let's call the computer users of that era "I.T. professionals" regardless.) These systems were cutting-edge for their era, which is respectable, but they lacked much that we take for granted. For example, output was sent to a printer, not a monitor, and thereby consumed ink, paper and a great deal of time. Any serious computer system occupied an entire room, if not a floor, of your building. It required a team of people to maintain. It was inscrutable and difficult to handle, because any concession to usability carried a real cost in processor cycles, and this was the era when computer processor time was more valuable and difficult to replace than human programmer time. You were lucky to get one compile per day, so you wrote your code out by hand, punched it into cards by hand or with a keypunch machine, and avoided dropping them on the floor because re-ordering them was such a monumental challenge. It was the Bronze Age of computing or whatever metaphor you want to use. Programming was difficult back then and you had to meticulously validate every line of code you wrote, and have it cross-checked, because a failed build cost serious time and ultimately pushed back serious deadlines.

We get it. This is just how computing -- of necessity -- worked in that era. And in order to accommodate these limitations, people wrote good code. Code that would run continuously without hiccup or malfunction for literally forty years -- provided the underlying hardware didn't change significantly. And they wrote that code first time. Because they had to. Code that was close to perfect. The equivalent of walking a tightrope.

Those were great people. No question. They accomplished great feats and created great software, some of which is still operating to this day.

The problem is that people point at the leftover COBOL and processes from that era and lament what software engineering has lost. The statement, whether implicit or explicit, is that, in this era of comforting, forgiving, error-tolerant, problem-catching IDEs, and of programming languages which were - horror of horrors - built to be more usable than they are efficient, we have somehow lost the capability to write code which compiles first time. That we have lost the ability to write code which will run continuously for forty years without a hitch. That we are no longer hardcore.

I contest the "forty years" statement. We are observing from a biased perspective. The only code from forty years ago which we can see still running is, by definition, the code which survived. How much code from that era is dead and erased by now, replaced with something more powerful and maintainable? Who knows? We don't remember that stuff. Because it was erased. What's the percentage, exactly?

Of course a computer program which runs without fault will continue to run without fault indefinitely. It just needs the hardware to support it. If the hardware manufacturers are prepared to continue to provide compatible hardware -- which they are -- then any proven program will be able to run indefinitely. It's not like some memory leak which was negligible after thirty-nine years will suddenly become fatally significant after forty. We work with basically mathematically perfect systems which, by now, have tolerance for hardware faults. If the program is good, and the hardware is good, there will be no problem.

And people write forty-year software all the time. Even right now. We have a tolerant approach to hardware failure these days, of necessity, but that doesn't mean that the software running on that metal is bad. People write software all the time which goes on to run, largely uninterrupted, for five years. Left alone, and properly supported, that software will run for another thirty-five years. No problem. It's not actually that difficult to write good software. The hard part is reserving the forty-year period needed to prove that it's good.

And I contest the "programming greats" statement -- the assertion that they don't exist anymore. There is a well-known piece of prose about Real Programmers. The subject's name was Mel. (Mel wrote unmaintainable software. The technology to write unmaintainable software has not been lost. But that's beside the point.) Mel wrote highly efficient code, and was able to do so thanks to his intimate and frightening understanding of the underlying system. People don't write code like that anymore.

That doesn't mean that we can't.

The technology to be incredibly freaking smart has not been lost. People like Mel still exist, except that they fry bigger fish now, and create highly performant software on an entirely different level. The technology to make software which is effectively perfect has not been lost. People who can do it, first time, still exist. But the need to do that has gone. And software which is one hundred percent perfect is not what the customer wants, unless the customer is NASA. And NASA does have software guys. And they do deliver. The reality is that in the majority of situations, what the customer wants is 99% perfect software now, rather than 100% perfect software delivered five years from now at five times the price. The natural state of I.T. right now is one of continuous rapid change, and I.T. consumers are used to this concept. They're used to upgrading when the new fixes are ready. The requirements have changed, and the providers have changed to meet those requirements.

This is not weakness. It doesn't make modern software engineering an easier or wussier task. Don't confuse a relatively low quality of output with low effort or low ability. Look at the commensurate rapid pace and high complexity. It's a trade-off which is made consciously.

If we had to write the Right Stuff, we could. We, programmers as a society, are still That Good. We are still idealists, and the ideal is still attainable. But we simply choose not to, because our clients are pragmatists, and they are the ones for whom we work.


Discussion (12)

2010-12-31 23:20:05 by bbot:

Perhaps relevant to note that, even though the article is 15 years old, it's still perfectly relevant. The shuttle hardware is the same. The program is certainly the same.

2011-01-01 00:37:22 by skztr:

This reminds me of NASA vs the rest of the world.

NASA engineers things using the purest science they can make use of. They take ten years and build a mathematical model of a perfect spaceship whose design very much resembles a perfect sphere. So they go out and they build a perfect sphere, with tolerances beyond any measurable precision. They need to hold it together with screws, so they come up with a model of a screw, determine exactly how much it needs to weigh, and spend $500 per screw to ensure that each one fits the model exactly. They know that if there is any deviation, the result will not fit the model, and so it may explode.

The rest of the world can't spend thousands of dollars on a handful of screws, so they come up with a model which isn't so broken that it falls apart at the slightest deviation.

That is the difference between "Computer Science" and "Programming", and it is the difference between working out the perfect solution to a problem and actually making a solution which works (especially once your boss casually tells you the real specs for the project when you're ninety percent done coding what he said six months ago).

2011-01-01 04:43:47 by YarKramer:

You know, I've always viewed phrases like "real man" or "real programmer" with a certain amount of disdain, especially once I grokked the "no true Scotsman" fallacy. This kind of thing is one of the reasons for that.

2011-01-01 07:01:00 by JoeCool:

I would like to state my disdain for the notion of hardcoreness in general. Rather: Is being hardcore awesome? Yes. Should you look down on people who aren't? No. Well okay it's fine if they think they're "t3h l33t h4x" but if they just want to use a computer then shove off.
Mostly, stop thinking you're hardcore because you're using Notepad instead of an IDE. You're not being clever; you're coding slower than you could be, for no good reason.
Genius programmers write genius code in whatever environment they're allowed. Genius python beats idiot x86 any day of any week.

2011-01-01 10:00:04 by Artanis:

@JoeCool I agree that IDE vs plain-text editor is kinda silly. It's a preference, and you can write good code in either. Personally, I take issue with the programmers that let the IDE write 90% of the program. One of my classes is using the WindowBuilder Eclipse plugin, and while the code generated is correct, I'm not sure anyone really understands it (I know I don't). I tried, I tried, but I couldn't get the professor to not make 9 *identical* anonymous classes to handle click events for a tic-tac-toe example. (no, no, make the handler here, assign it to a local variable. there. now pass that variable instead of--no, don't copy the class definition into all nine add handler calls. yes the variable is unused. you didn't use it. i give up.)
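[The fix Artanis describes, one handler shared by all nine cells instead of nine pasted copies, can be sketched in a few lines. This is Python rather than the Java/WindowBuilder code in question, and the made-up `handle_click` stands in for a real GUI click callback:]

```python
# A minimal sketch (not the actual WindowBuilder output): one click
# handler for the whole tic-tac-toe board, parameterized by cell
# position, instead of nine identical copy-pasted anonymous handlers.
from functools import partial

board = [[" "] * 3 for _ in range(3)]

def handle_click(row, col, player="X"):
    """Single handler for every cell; the cell identity is an argument."""
    if board[row][col] == " ":
        board[row][col] = player

# Wire up all nine cells with the SAME handler, not nine copies of it.
handlers = {(r, c): partial(handle_click, r, c)
            for r in range(3) for c in range(3)}

handlers[(1, 1)]()          # simulate a click on the centre cell
print(board[1][1])          # -> X
```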

2011-01-02 02:32:34 by green:

"what the customer wants is 99% perfect software now"

Yeah, I am okay with 99% perfect software. But I get annoyed by the surplus of 85% perfect stuff. It is just good enough for the user to expect it to work most of the time.

2011-01-03 09:41:04 by AlexB:

I agree with Artanis. I've written in several different IDEs in a couple different languages (I prefer plain-text C, old habits die hard) and if you know what you're doing, you accomplish the exact same thing. The biggest thing is the idea. The power to be had in raw creation. It doesn't matter where you work, so long as your design, your plan, your idea, is solid.

2011-01-03 16:32:52 by HOLIERThanThou:

Can we please stop using the word "performant"? Thank you. That is all.

2011-01-04 21:55:50 by PropagandaPanda:

Wouldn't it be relevant to also point out that the programs of today are pretty much always WAY more complex than the stuff of forty years ago? I mean, the software that runs a cell phone has to run more than a simple dial pad: it has to run the weather widget, the clock, the ring tone system and so on. The software that runs an e-commerce site has to keep track of who ordered what and how many, charge the right people, send it to the right place, do that a thousand times per minute, and keep it all secure with an inventory of millions of items.

Ok, yeah, it's sort of a ratio thing: we have better things to program with, and more complex programs to write, than 40 years ago, but I'm not sure I'd call it a 1:1 ratio.

2011-01-06 21:25:17 by Blank:

I don't think the quality of code or of programmers has decreased. In fact, I think the opposite has happened. Over the decades, the whole practice of programming has gone through enough research and discovery to allow modern programmers to create high-quality, highly efficient code.

Remembering the past as better than it actually was is a common bias of the human brain. That ancient code you are talking about wasn't all that perfect, and the lack of proper debugging and development tools didn't help. You must also consider the additional requirements that have developed over the years. While in the past a console application that just did the job was enough, the market now demands that every program have a user-friendly UI, integrate with the operating system and its mechanisms (API, registry, drag&drop, message forms) and, most importantly, modern code must be secure and comply with many security standards. While today every little exploit gets found and abused and affects tens to thousands to millions of computers worldwide, I bet none of those 1960s developers really bothered to protect their code against all the common exploits known nowadays. (I once saw a warning on a Python forum not to use tempfile.mktemp to generate a random temp file name, because by the time you created that temp file an attacker could possibly make the directory a symlink to the system directory and thus abuse your hypothetical su access to read system files, if you even had such access. Seriously, are you kidding me?)
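[For what it's worth, the mktemp warning is legitimate: tempfile.mktemp only invents a name, leaving a window between naming the file and opening it that an attacker can win. A minimal sketch of the safe alternative, tempfile.mkstemp, which creates and opens the file atomically:]

```python
import os
import tempfile

# Unsafe (deprecated): mktemp() returns only a *name*; the file does not
# exist yet, so an attacker can plant a symlink at that path before you
# open it:
#   path = tempfile.mktemp()
#   open(path, "w")   # may follow the attacker's symlink

# Safe: mkstemp() atomically creates and opens the file with owner-only
# permissions (mode 0o600), so there is no window for a symlink race.
fd, path = tempfile.mkstemp()
with os.fdopen(fd, "w") as f:
    f.write("no symlink race here")

with open(path) as f:
    contents = f.read()
os.remove(path)
print(contents)  # -> no symlink race here
```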

And look at the results. Look at the Linux kernel, at the GNU project, at Ubuntu. They are the epitome of modern coding. They comprise millions of lines of code. They have to deal with all that shit I mentioned before, and thanks to all the modern programming mechanisms, they are doing it wonderfully. I don't think anyone in the past ever managed to reach this level of code quality.

2011-01-08 09:18:25 by Snowyowl:

Obligatory XKCD link which actually makes much the same point, though less in-depth: xkcd.com/378

2011-02-13 03:07:53 by KeaponLaffin:

I think there are a few false assumptions here.

'Back in the day' perfect programs were written for all the reasons above, plus because the platform was static. For long periods of time the architecture was always the same. Every change was momentous and often changed the whole paradigm. You programmed it small because 8-bit was all you had to work with, and you could make it perfect because it would always be running on the same machine or an exact copy. That's why NASA -can- demand that its programmers make it absolutely perfect. That program will always be running on that machine and no other. The specifications of the components are known intimately and with high accuracy.

Now we have different OSes, different versions, updates, multitasking, 3 new video cards since the project started, codecs, blah blah. So in that way 'we can't' write 'perfect' code in this modern world. It would be obsolete before your morning coffee.

The 99% customer thing too. Customers are absolute geniuses at asking for that one little thing...that to 'cleanly' implement would require a major structure overhaul. So you hack it in, also because that's all they pay you for. Rinse and repeat.
My personal favorite was when the customer asked that bills now be calculated to a millionth of a peso.

I'll take your word that leftover COBOL is pretty or whatever. I'm a young COBOL programmer maintaining a program first written before I was born. Not one of the dozen on our team understands much of the thing, and many have been around it for a few decades. Some pieces of code haven't even been looked at since the mid 70s. That code sucks. I've even seen some GOTOs! The entire mess is spaghetti code like you wouldn't believe. I think it's in pain and hates all of us.