A tongue-in-cheek blog posting (Desecrating Rails For A Brighter Tomorrow) laments the increasing popularity of Ruby on Rails because it is eroding the "competitive advantage Rails affords"; the "gorillas" ("legacy web companies that have payrolls 100x greater" than "a programmer or two") are catching on and becoming "nimbler", restoring order to the world. "The horror."
While entertaining, the post unintentionally illuminates an unspoken ugly truth about software development:
Many developers, if not most, are attracted to the software business because of exclusivity, that is, being able to do something that relatively few others understand how to do. This trait, however, is damaging the overall industry in many ways. Within the software business, developers engage in a continuous cycle of seeking out and immersing themselves in brand-new, leading-edge technologies to maintain and enhance that exclusivity, with little regard for any non-technical project considerations such as risk management, robustness, return on investment (ROI), etc.
The manifestation of this cycle is the churn of software tools, and it affects all stacks (Microsoft, LAMP, J2EE, et al.). Software developers are under (peer) "pressure" to upgrade, switch or otherwise drastically change their development tools every two or three years, or risk "falling behind", becoming "obsolete" or, worse, being seen as a "legacy" software developer (the horror). Sometimes they reason that it's for "job security" (an argument without any teeth), but it's really about "coolness", a component of which is exclusivity.
One of the most insidious side-effects is promotion of the "not invented here" syndrome, where some developers will spend tens of man-hours or more building something that has been built many times before and can be acquired, completely tested and supported, for little or no cost. The Open Source movement is largely a manifestation of this, under the guise of promoting ABEM (Anyone But Evil Microsoft), but it is really just a way to (1) avoid paying for tools and (2) create a very large sandbox for software developers to play in under their own rules. "But", you may say, "these tools are helping us manage objects and other reusable components better in team environments", which sounds great in theory, but there is little real evidence supporting this contention, as the continuing poor performance of the software development industry bears out.
Another side-effect is, simply, investment loss (lowered ROI). The most destructive churning occurs when the developer swaps out hard-won habits and knowledge for entirely new ones. There are many examples of this (the evolution of Visual Studio, for one), but we need look no further than Microsoft Office and the new ribbon bar. Over the years, many of us have learned that maximizing use of the keyboard over the mouse dramatically increases efficiency (even more so for software developers working in their IDE); the ribbon bar forces the user in the opposite direction.
Regardless of what you think about the ribbon bar, the point is this: it dramatically changed how I have to work in the tool; my investment in learning how to use the tool goes largely out the window. I have to spend (invest) significant mindshare thinking about how to do things that were second nature before. A major league shortstop does not have to think about every little possibility that could occur when a ball is hit to him; he has so much repetitive experience that he only has to react, and he nearly always reacts correctly (the all-time worst fielding percentage for a shortstop is .935, that is, a 6.5% error rate). What would happen to that error rate if the tools and rules of the game changed every three years, if the basepaths were extended, gloves were removed and grass was replaced with concrete? Think the error rate would go up?
With reduced ROI come reduced skill levels. A person who works day in and day out with a tool for six years is more than likely to significantly outperform a person who has worked in it half that time. Tool churn, however, never lets us get that far; few ever really get good at maximizing the use of their toolset. No wonder there's little predictability in the software development process.
All of this weakens the overall software development industry, especially the commercial side. While many developers hold their own skill levels in high esteem, in fact they are mostly amateurs because they never work with a given tool or set of tools long enough to really attain expert status; what they've really attained is expert status relative to their fellow developers. These "experts" are invariably those who get early looks at upcoming technology releases, giving them a temporal head start: they have simply had significantly more time with a product, earlier, than most developers.
Disregarding jokes about government projects, at least some of these folks get it. Take a look at just about any software development endeavor in the space program, where real money is at stake, as well as lives, political consequences and national prestige. Consider The Software Behind the Mars Phoenix Lander, which describes the antithesis of modern commercial software development practices. In the interview with the project manager comes this telling exchange:
"Pete McBreen, for example, had a great book a couple of years ago called Software Craftsmanship, where he said that about the only place where you can sit down and design all of your software up front is where you actually have the hardware; you know your exact constraints, and you know you're not going to be able to update the software once you deliver it. What's the process look like there for getting software to get 700+ pounds of metal and equipment to another planet?"
"Generally what we do with these proposals, particularly with these kinds of missions, is we try to rely on — in order to cut costs, we try to rely on software that's been previously established and proven to work. So in this case, this is part of a line of spacecrafts that Lockheed Martin has produced, and the software has heritage literally going back to the Mars Pathfinder mission."
The Pathfinder mission landed on Mars in 1997; surely software development on that project preceded the landing by many years. There's an interesting story about a software problem encountered during that mission here.
This is a project that used, as its base, software developed and tested probably nearly 20 years ago! How uncool is that!
How different would your approach be if it were very hard and expensive to change any code or hardware after putting the system into production? Would you take as many risks with unproven tools with which you have only a few years' experience, and bet your career on that?
Some commercial developers may say "well, we have tighter timeframes and market pressures, we couldn't do that", a specious argument at best; interplanetary space travel projects (where the laws of planetary physics rule the schedule) can apply more pressure than just about any commercial endeavor I know. The point is, software developers on any project face the same issues: how do we do more with less, knowing that problems are going to happen? How do we design systems that are truly fault-tolerant? The folks on the Mars missions have to do that; in the commercial space, it seems to be a choice, one in which developers themselves have a disproportionate say.
So, who's making the better business decision here? Who's taking the longer-term view? Who's protecting the large, costly investments made in mastering the software development toolset?
To me, the answer is clear: we have met the enemy, and he is us.
It's time to get the foxes, who have failed miserably and consistently, out of the henhouse; we need technical leaders who put business and project needs (way) ahead of technical considerations, and who extend the shelf life of the investment made in these very complex endeavors.
What do you think?