Since I'm a software engineer, you'd expect me to be alert to developments and trends in that field. You'd be wrong.
When it comes to the extremely faddish field of computer science, I'm a "late adopter." I have work to do, inadequate time to do it, and more demands on my attention than you might imagine. Keeping current with glamorous, strongly touted new offerings -- I call them "brights-and-shinies" -- is typically too time-consuming to be compatible with my obligations.
Alongside that, I'm economically minded. My time is a scarce resource. I spend it carefully, and only when I can get something for it that's worth the expenditure. The myriad fads and fascinations of others in this field, some of whom apparently have more free time than is good for them, usually trail off into the weeds without offering a positive return on investment.
Which is why I read this Wired article with a dismissive chuckle:
At Princeton and the University of California at Berkeley, two researchers are trying to shed some light on why some programming languages hit the big time but most others don't. In what they call a "side project," Leo Meyerovich and Ari Rabkin have polled tens of thousands of programmers, and they're combing through over 300,000 computing projects at the popular code repository SourceForge — all in an effort to determine why old languages still reign supreme.
"Why have we not been reliably able to improve on C?" Rabkin asks. In the thirty-five years since C was popularized, there have been enormous leaps in the design of software and operating systems, he says. But although C has been beefed up and other new languages have been very successful during that time, C is still a mainstay.
Yes, and why do cars still incorporate steering wheels, accelerator pedals, and brake pedals? The article does make some interesting observations, but it hits the grandest of all clinkers with this:
Most programmers learn three to four languages, the researchers say, but then stop. "Over time, you'd expect that as developers get older, they'd get more wisdom; they'd learn more languages," Meyerovich says. "We've found that's not true. They plateau."
Part of the problem is that by the time they hit 35 to 40 years old, they're often moving from hands-on coding to managing other programmers. At that point, there's little motivation to learn and implement new languages.
Those "researchers" should study humility. Perhaps they might stand before a mirror for fifteen minutes each day, and practice saying, "It's not you, it's me," until they can at least make it sound sincere.
Though I do have supervisory responsibilities, I'm still hands-on; I still participate in my group's undertakings as a senior technologist. If these "researchers" were to ask me why I haven't troubled to learn C#, or Ruby, or Python, or whatever the hot language fad is just now, I could tell them in a single sentence: They're irrelevant to my problem domain.
My group's problem domain is real-time simulation. For that domain, nothing has come along that bests C++ and the tools and libraries that support it. There are other domains, some of which are more efficiently addressed with other languages and support systems. (I wouldn't dream of addressing a database problem or an artificial-intelligence project with C++.) A good engineer strives always to use the right tool for the job before him.
No doubt there are clever fellows working on a spiffy new language that does address the real-time domain. Perhaps I'll hear about it soon. But I still won't rush to embrace it. A good engineer doesn't adopt an unproven technology for dollars-and-cents work; he waits until those too-much-time-on-their-hands types can show him how their bright-and-shiny really supersedes his older tools. To do that, they must address four principally nontechnological considerations:
- The weight of legacy maintenance;
- The writing-off of prior (intellectual) investment;
- The front-loaded expenditure of money and time required by the new tool;
- Network effects.
These are all economic considerations. Even network effects are about economics rather than technology: they arise from the mass adoption of a technology, such that the size of the user community itself becomes an asset to its individual users. The Department of Defense discovered the importance of network effects with its abortive, highly expensive attempt to impose a new programming language, Ada, upon the defense industry. Today, DoD strongly recommends the use of C++ in new embedded systems.
When a tool is developed with a well-defined problem domain in mind, it can be successful -- if it truly offers its prospective users a way of solving their problems that's superior, not just from a technological standpoint, but from the broader economic perspective. But IBM's PL/I disaster -- its attempt to obsolete all other programming languages with a single universal one -- should make it clear that there's no such thing as a tool that applies to every kind of problem. Even an attempt to address two disconnected domains with one tool usually produces an unsatisfactory compromise, which is why I don't reach for my Leatherman Multi-Tool when my regular toolbox is within range.
Researchers of the sort quoted in the Wired article seem to miss those points far more often than not -- and then prattle about their superior "wisdom" while remaining utterly unable to explain the contrary decisions of others. Well, perhaps that's inherent in the mindset one must adopt to deem oneself a "researcher." Now if you'll excuse me, a forty-foot oak tree fell across my fence last night, and it's time I went out back and disposed of it. I'm all ready for the challenge; I've got my Swiss Army knife right here in my hand.