
Friday, May 28, 2010

How IT Got Boring (It's All About De-Commissioning)

I have been in and around big-corporation IT[1] for 20+ years. I started my career in 1987, fairly early in the PC era. I was attracted to computers in general for the technical, micro-scale problem-solving challenges, and to corporate IT in particular for the opportunity to build systems that solve problems, deliver efficiency, and enable innovation.

For the first 10+ years of my career, that worked out pretty well. The technology improved quickly, and there were numerous opportunities to apply technology and make a big impact quickly. Automating a manual process for the first time is usually a sure winner.

But something subtle started happening, and it became apparent to me personally in the post-Y2K, post-dot-com era: corporate IT was no longer a fast-moving, innovative, technology-driven profession. IT had reached "maturity". This maturity is manifest both in the technology and in the projects.

Regarding the Technology
In the 80s and well into the 90s, it was conventional wisdom that keeping up with technology was an exhausting rat-race. If you weren't careful, and allowed yourself to work in the same "old" technology for 5 years, you would fall hopelessly behind. For corporate IT, that has long since ceased to be true. The pace of technology turnover is way slower now. Information technology tools may develop and improve steadily, but they don't go through wholesale, next-generation replacement very fast at all. The web was probably the last pervasive new technology to break over corporate IT--and that wave hit 15 years ago. If you had been at the top of your game 5 years ago, sat out of the workforce since, and come back today, I think you would have no trouble picking up where you left off.

Regarding the Projects
At some point around the year 2000, everything that could be automated had been automated. At least once, sometimes more than once. Somewhere along the line, in the rush to automate stuff and gain all those efficiencies, we didn't think too much about maintenance[2]. We just built a bunch of systems. But we discovered that the care and feeding of those systems was becoming a crushing burden. As companies acquired other companies, created new product lines, or just generally modified business processes, they found it harder and harder to alter computer systems to keep up. Ironically, the software became the most rigid ingredient in business processes and capabilities.

This fact in turn led to behaviors which ultimately compounded the problem. Since it was already hard for the systems to keep up with business changes, there was an inevitable tendency to take shortcuts. In particular, older (aka "legacy") systems, when being superseded, were often not completely de-commissioned, but were left in place, performing some vestigial functions that the new system did not provide.

Other trends have only compounded this problem. SOX separation-of-duties requirements--while generally appropriate--slow down the pace at which small changes can be introduced. Data-privacy requirements create further burdens. Outsourcing, and turnover in general, also make the problem worse: it is an axiom that documentation for computer systems is always woefully incomplete and out of date, so work arrangements that eliminate the handing down of "tribal knowledge" further undermine the organizational understanding of how a system works and what is needed to take it out of production.

Conclusion
So the result of all this is that the typical large corporation, circa 2005, started to feel that it was choking on the complexity of its own systems environment. The single biggest focus of most IT departments, after keeping the lights on, is an increasingly frantic quest for environment simplification. However, because the problem is so complex, and because the target keeps moving, this is not easily done. Nothing goes quickly, and many of the results are close to invisible. In other words, corporate IT work has gotten kind of boring[3].

I will be interested to re-visit this post in a decade--my guess is that some progress will have been made, but this will still be a big concern.


Anyway, I thought it would be interesting to try to describe this problem for the "lay" person, as my sense is that most non-IT people have little awareness of this situation.


Notes
[1] This article pertains to the IT systems that are built for in-house use by typical large companies. The situation is likely different for other categories of IT systems--such as software products or embedded software. For a fuller treatment of those distinctions, see Joel Spolsky's essay "Five Worlds".
[2] Sort of like the original Y2K problem itself.
[3] It's not truly boring, any more than other kinds of complicated but slow-moving professional work are boring. In fact, once you adjust your mind-set, simplifying a complex environment is a very challenging problem to work on. But it is no longer intrinsically exciting, the way that only building shiny, new things can be. (And remember--this only applies to BigCo in-house software, as per note #1.)

2 comments:

  1. You state "The web was probably the last pervasive new technology to break over corporate IT--and that wave hit 15 years ago."

    Your recent comments on your Android phone could lead one to believe that major technology break-outs are best realized in hindsight.

    3G phones that access the internet at fast enough speeds, combined with cameras, applications for all kinds of special needs (think materials calculator for a contractor), and GPS navigation, seem like a technological leap from Y2K. If you add using that phone as a credit card to buy things from a vending machine or retail store--the possibilities expand geometrically.
