A few weeks ago, Mike Loukides of O’Reilly Media published a lengthy survey on the state of the tech industry. He used user data from O’Reilly’s learning platform to determine what’s currently hot — or not.
His study concentrates on the present and the immediate future, but it made me think about how the field has changed since the dot-com boom (and bust) — that is, over the last 20 years, which really constitute the Internet Age and the Modern Software Era so far.
Any Language But C
The single most profound change may simply be the fact that today most programs are not written in C (or C++). The problem is not C itself, or any one of its features (pointers, typing, preprocessor macros — you name it). The problem is that it’s not enough to have 32 keywords and a compiler. C provides almost no support infrastructure: no memory management, no proper string type, no resizable arrays, much less a hashmap implementation, and no runtime. (For anyone unfamiliar with C: “no runtime” here means that if anything goes wrong, you don’t get a more or less informative stack trace, you get an entirely useless “core dump”.)
I grew up with C, and I quite like C. But C requires programmers to spend an inordinate amount of their effort simply managing the health of their programs — never mind the actual application logic!
Computational Resources Are Unlimited
Until just a few years ago, almost every “serious” program I wrote was limited by resource constraints, usually time and/or space (that is, memory). This is no longer so: in the Cloud, resources are not exactly infinite, but they are essentially unlimited.
This is a big deal. Not so much because it makes things possible that previously were impossible, but because it makes things easy that previously were hard, and often effectively too hard.
As an example, I have occasionally had to deal with data sets that would not (comfortably, or at all) fit into memory — which made even simple operations like sorting them a major ordeal. You have to break the data set into partitions that fit into memory, sort each partition individually, and then merge the sorted partitions back into a single data set. The overhead of doing so is large enough that you seriously ask whether it’s all worth it. The answer may well be “No”, and the project simply doesn’t get done. “Too hard”, not enough value for the effort.
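The partition/sort/merge procedure just described is the classic external merge sort. Here is a minimal sketch in Python; the function names, the one-integer-per-line run-file format, and the chunk size are all illustrative choices, not anything from the original project:

```python
import heapq
import os
import tempfile


def _write_run(chunk):
    """Sort one in-memory chunk and write it out as a temporary 'run' file."""
    chunk.sort()
    fd, path = tempfile.mkstemp(suffix=".run")
    with os.fdopen(fd, "w") as f:
        f.writelines(f"{x}\n" for x in chunk)
    return path


def external_sort(values, chunk_size):
    """Sort integers that (notionally) don't fit in memory:
    sort fixed-size runs to disk, then k-way merge the runs."""
    run_paths = []
    chunk = []
    for v in values:
        chunk.append(v)
        if len(chunk) >= chunk_size:  # partition is full: sort it and spill to disk
            run_paths.append(_write_run(chunk))
            chunk = []
    if chunk:  # don't forget the final, partial partition
        run_paths.append(_write_run(chunk))

    files = [open(p) for p in run_paths]
    try:
        # heapq.merge lazily merges the k sorted runs, so only one
        # line per run needs to be in memory at any time.
        streams = [(int(line) for line in f) for f in files]
        return list(heapq.merge(*streams))
    finally:
        for f in files:
            f.close()
        for p in run_paths:
            os.remove(p)
```

Even this toy version shows where the overhead comes from: run management, file formats, and cleanup are all bookkeeping that a plain in-memory `sort()` never needs.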
Today, I spin up a large-memory VM for 30 minutes, do my analysis, and shut it down again. I don’t even think about it.
Most Projects Do Ship
These days, most projects actually succeed and “ship” whatever software they were building. Of course, the project will be over schedule and over budget — but that’s just because “nothing ever gets finished on time and within budget” (Cheops’ Law). But ship they do!
Not so long ago, this wasn’t the case. Well into this century, it was common for projects to get cancelled late in the game because, after months or years of work and millions of dollars spent, they had nothing to show for their efforts. No working code. (High-profile examples include MS Cairo, Apple’s Copland, and IBM’s and Apple’s Taligent effort. Many more such fiascos have occurred, but out of the public eye.)
If projects did ship, it was frequently only as a result of a “death march” experience — an extended period of extreme overwork and despair, all against the better judgment of the programmers, leading to massive burn-out and turnover.
This has stopped. If you embark on a project today, the odds are that it will finish, and that something will “ship”. It will probably be late, it may be buggy, and the experience may be more or less stressful. But you are not likely to toil desperately for years, only to have the entire effort cancelled because there is no working code.
I am not entirely sure why this is. I believe that more realistic expectations play a big part. Projects have gotten smaller; large projects are more actively broken into manageable chunks. Project visibility has improved drastically: all these daily stand-ups provide at least some level of insight and prevent entire project teams from “going dark” for extended periods of time with no effective oversight.
Ultimately, the Agile Movement, despite all the criticisms that may be leveled against it, has apparently prevailed.