Typing Speed
I have never seen a software project in which the limiting factor was the speed with which programmers could type.
I was recently reminded of the minimal set of project management artifacts and activities that are, in one way or another, simply indispensable.
None of this is news: all of it has been well established for at least 25 years (a quarter-century!), so I was surprised to find that it is apparently still not common knowledge or practice — at least not as common as one would wish.
They use a “real database”. They use “nice object-oriented libraries”. They use “nice C++ abstractions”. And quite frankly, as a result of all these design decisions that sound so appealing to some CS people, the end result is a horrible and unmaintainable mess.
Sharing files across computers on a local, “informal” (home) network is a recurring desire. Yet, at least in the Linux world, it is one that does not have an obvious, canonical, default solution. Email, cloud storage, or a USB stick all seem shockingly common workarounds.
In a moment of nostalgia, I picked up my copy of “The Art of UNIX Programming” by Eric S. Raymond (esr) and flipped through it again. It’s a book I’ve had since it came out in 2004, and one I’ve always been quite fond of. I was looking forward to a review of “the way the future was”, as viewed from the early 2000s. So it came as a bit of a surprise to find that the book seems to have aged rather poorly.
I occasionally see references to the HDF5 file format, but had never encountered it in the wild. Then a recent project generated multiple data sets simultaneously, in addition to metadata. Was there a better way than maintaining a collection of flat files? This prompted me to take a look at HDF5.
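To make the scenario concrete, here is a minimal sketch (using the common `h5py` library; the file name, group, and data set names are hypothetical) of what replacing a collection of flat files with a single HDF5 file might look like — multiple data sets grouped together, with metadata attached as attributes:

```python
import numpy as np
import h5py

# Two related data sets from a hypothetical experiment run.
temperatures = np.linspace(20.0, 25.0, num=100).reshape(50, 2)
counts = np.arange(50)

# Write both data sets, plus metadata, into one self-describing file.
with h5py.File("experiment.h5", "w") as f:
    run = f.create_group("run1")
    run.create_dataset("temperatures", data=temperatures)
    run.create_dataset("counts", data=counts)
    run.attrs["operator"] = "demo"   # metadata travels with the data
    run.attrs["samples"] = 50

# Read it back: data sets are addressed by path, like files in a directory.
with h5py.File("experiment.h5", "r") as f:
    temps_back = f["run1/temperatures"][:]
    samples = f["run1"].attrs["samples"]
```

The directory-like grouping and the attached attributes are exactly what a pile of flat files plus a separate metadata file struggles to provide.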
I recently came across a collection of old (1990s) “programming challenges”. I thought it might be amusing to tackle one of these challenges using technologies from the period in which they were posed, and compare the solution to one using contemporary techniques. In other words, do the same problem in C and Python.
I learned Unix almost 30 years ago, while attending graduate school in the early 90s, from a now long-obsolete book entitled “Unix for the Impatient”.
Some of the tools and commands I learned back then have long since become irrelevant (ftp, telnet, cvs, biff — remember biff?). Others, although long in the tooth, continue to serve me well every day (emacs, tcsh, cc). And yet a third group seems to be more important than ever (such as tar, which is the basis for Docker images).
A survey published by O’Reilly on the state of the tech industry made me reflect on how the field has changed since the dot-com boom (and bust) — that is, over the last 20 years, which really constitute the Internet Age and the Modern Software Era so far.