Monday, October 23, 2006

The Death of Information Security...

I was going over some graduate work I had done a year or so ago, and came across an article of mine on Nicholas Carr’s take on “The End of Corporate Computing”. Now – from an InfoSec perspective – where does that place us? I’m not sure yet. This diatribe is a little over a year old, and Carr’s source article a little older. Maybe the title should be “The End of Central IT Security…”… Who knows…

Reading Carr’s “The End of Corporate Computing” left me with several conflicting opinions and thoughts about the future of IT. In his paper, he outlines what are essentially four stages of a technological life cycle. Condensed down, these stages are: 1) introduction, 2) rapid build out, 3) vanishing advantage, and 4) commoditization. After my own readings, and those of other classmates, I agree with Carr’s illustrations – to a point. He argues that central IT is becoming irrelevant, but I would counter that this may not yet be the case. First, though, some confirmation that his theories hold, at least through the vanishing advantage stage, based on my own experiences.

I am probably in what will be called the “transitional generation” when it comes to IT technologies. I believe this gives me a unique (generational) perspective on Carr’s theories about Information Technology, its swing to “Utility Computing”, and the four stages, because I have lived through the first three. It may be argued that we’re already in the fourth stage of commoditization, but we’ll see…

Like all things in IT, it’s all about timing. My first experiences with IT were in the mid 1970s at a New England university. It was only halfway through the first semester that the university formally recognized Computer Science as a discrete discipline. For a school of over 8,000 students, there were fewer than two hundred timesharing terminals, and half of those were hard-copy. Most companies had never thought of owning their own computing infrastructures, and the image of technicians in lab coats was still prevalent. Phase one encompassed the development of the large, single-purpose computers in the late 1940s, followed by the general-purpose mainframes of the 1960s and early 1970s.

Skip ahead to the early 1980s and the rapid build out and availability of low-cost timesharing systems for smaller businesses. No longer were these devices limited to the Fortune 100 or educational institutions; any small business could afford a computer system of its own. Companies like IBM still produced the large monolithic systems that had served highly centralized IT groups until this time, but smaller, more agile companies were now filling the niches left by this expansion. My own experiences paralleled this expansion, as a system manager for a small regional auto parts distributor that, for the first time in its (then) 50-year history, could afford to digitize its operations.

This period of “rapid build out” was evidenced by companies like Wang, Data General, and Digital Equipment. They brought consumers systems that did not need large cooling plants, large staffs, or custom programming. Applications could now be bought off the shelf and customized to meet a company’s particular needs.

Unfortunately, the transition into the third phase, vanishing advantage, can be attributed to the development of the PC and lower-cost midrange server platforms. These systems were faster, smaller, and more flexible than the “mini” computers produced during the 1980s. This allowed the shift from hardware budgets to software development budgets. No longer were particular manufacturers’ systems the focus; it was all about software. In my own experience of this time frame, we shifted from canned software to highly customized versions of the same suite. This let us take advantage of several critical business opportunities, primarily “just in time” ordering from our suppliers and subsequent delivery of restocked items to our stores. This one change helped reduce order cycles from six months to six weeks, and store restocking cycles from four weeks to four days.

Furthering this decline of central IT was the introduction of the PC and the Internet in the mid-1990s. Now the power of programming and control could literally be placed in the hands of ordinary users. Central IT strove to retain control over this environment, but was generally consigned to setting standards, policy creation, and subsequent enforcement. It was at this point that I transitioned from a small company to a large one (over 25,000 employees). Evident at the time was the “tension” between IT and the business community. The genesis of Carr’s commoditization phase can be traced directly to this newfound “power” taken up by the business community.

Finally – to my opinion of whether or not Carr has the right idea… I would agree with Carr that IT as a function will become more decentralized as technologies such as the Internet become more pervasive. Bandwidth increases will generally occur as more companies light up the large amounts of dark fiber optic cable installed during the dot-com boom/bust cycle. However, I suspect that his view that this “irrelevance” is already happening is ill-timed, and I can point to several anecdotal situations where any attempt at making IT a commodity may be hindered.

First, a critical piece of infrastructure assumed necessary for this commoditization may not actually be available. The amount of unlit fiber optic network cable was generally thought to be larger than that of “lit”, or operational, cable. Articles as old as 2002 (http://www.cio.com/archive/101502/et_pundit.html) indicate that the cable may in fact already be close to capacity, and more importantly, where not lit, may be obsolete due to technological enhancements. Even more recent research (http://news.com.com/Dark+fiber+Businesses+see+the+light+-+page+2/2100-1037_3-5557910-2.html?tag=st.next) shows that where such dark fiber exists, it will not generate the commoditization envisioned by Carr, but will actually foster the continued centralized management of a corporation’s network and IT infrastructure.

The other basic flaw in any attempt at making IT irrelevant is the current state of software “engineering”. As stated (anecdotally) in previous postings, there is still no standard way to write software. The infrastructure may be heading toward commodity status, but the underlying application layer is far behind. The abundance of platforms, languages, and databases makes for too much differentiation in how applications can be architected. Because of this, any standardization will be difficult, if not impossible, to accomplish.

It will take the combined efforts of the various software industries to make this happen, if it happens at all. While this may seem cynical, it is unfortunately true that software is inherently unstable and prone to any number of problems, from intentionally bad design that circumvents controls to unintentional programmer error. Unlike the utility models (electrical and transportation) presented by Carr, software is the key to the ubiquitous computing model. Standard design, standard controls, and standard interfaces must all be in place before any of the Utility Computing models can be effected.
