Interesting article in today’s Computerworld feed. Something that I’ve been harping on to my peers for years – and to anyone who would listen – is the concept of Knowing the Business of Information Security. As Mary Brandel points out in her article, “The Team at the Top”, it is becoming increasingly important for technology leaders at the top tiers not only to be good Technologists, but also to be good Business people.
The thing that has always bugged me is that with stove-piped or otherwise narrowly defined organizational structures, the technology guys & gals are not really cognizant of what they are actually supporting. Sure – if you ask them what they “do”, they will say: “I’m a Network Engineer”, or “I’m a Security Analyst”, or “I’m a Software Development Manager”. What they all fail to indicate is their relationship to the Company.
What they actually should focus on is that relationship. When asked, the answer should always be: “I’m here to support the Business in meeting its goals within its requirements”. Anything you do to support that is a function of your job, but it’s not your job. Semantics? Maybe – but as Mary points out, top-level leadership is looking for individuals who can walk and chew gum at the same time. This means that you absolutely must be able to speak the language of Business, and secure / operate / maintain the IT infrastructure while keeping your focus on what you are really doing – helping the Business make money.
To remain solely a technological asset, at least at the upper tiers of administration, is not a forward-looking stance, and it will all but guarantee that you become irrelevant in both the near and the long term.
Daily musings (almost) from an Information Security Professional about the world of Systems and Information Security.
Tuesday, October 24, 2006
Monday, October 23, 2006
The Death of Information Security...
I was going over some recent graduate work I had done a year or so ago, and came across an article of mine on Nicholas Carr’s take on “The End of Corporate Computing”. Now – from an InfoSec perspective – where does that place us? I’m not sure yet. This diatribe is a little over a year old, and Carr’s source article a little bit older. Maybe the title should be “The End of Central IT Security…”… Who knows…
Reading Carr’s “The End of Corporate Computing” left me with several conflicting opinions and thoughts about the future of IT. In his paper, he outlines what are basically four stages of a technological life cycle. Condensed down, these stages are: 1) new introduction, 2) rapid build out, 3) vanishing advantage, and 4) commoditization. After my own readings, and those of other classmates, I agree with Carr’s illustrations – to a point. He indicates that Central IT is irrelevant, but I would argue that this may not yet be the case. First, though, some confirmation – based on my own experiences – that his theory holds, at least up to the point of vanishing advantage.
I am probably in what will be called the “transitional generation” when speaking about IT technologies. I believe this gives me a unique (generational) perspective on Carr’s theories on Information Technology, the swing to “Utility Computing”, and the four stages, because I have lived through the first three. It may be argued that we’re already in the fourth stage of commoditization, but we’ll see.
Like all things in IT, it’s all about timing. My first experiences with IT were in the mid 1970s at a New England university. It was only halfway through the first semester that the university formally recognized Computer Science as a discrete discipline. For a school of over 8,000 students, there were fewer than two hundred timesharing terminals, and half of those were hard-copy. Most companies had never thought of owning their own computing infrastructures, and the image of technicians in lab coats was still prevalent. Phase one encompassed the development in the late 1940s of the large, single-purpose computers, followed by the general-purpose mainframes of the 1960s and early 1970s.
Skip ahead to the early 1980s and the rapid build out and availability of low-cost timesharing systems for smaller businesses. No longer were these devices limited to the Fortune 100 or educational entities; any small business could afford a computer system of its own. Companies like IBM were still the sole producers of the large monolithic systems that had served the very highly centralized IT groups until this time, but smaller, more agile companies were now filling in the niches left by this expansion. My own experiences paralleled this expansion, as a system manager for a small regional auto parts distributor which, for the first time in its (then) 50-year history, could afford to digitize its operations.
This period of “rapid build out” was evidenced by the likes of Wang, Data General, and Digital Equipment. They brought consumers systems that did not need large cooling systems, large staffs, or custom programming. Applications could now be bought off the shelf and customized to meet a company’s particular needs.
Unfortunately, the transition into the third phase of vanishing advantage can be attributed to the development of the PC and lower-cost midrange server platforms. These systems were faster, smaller, and more flexible than the “mini” computers produced during the 1980s. What this allowed was the shift from hardware budgets to software development budgets. No longer were particular manufacturers’ systems the focus. It was all about software. In my own experience of this time frame, we shifted from canned software to highly customized versions of the same suite. This let us take advantage of several critical business opportunities, primarily “just in time” ordering from our suppliers and subsequent delivery of restocked items to our stores. This one phase helped reduce order cycles from six months to six weeks, and store restocking cycles from four weeks to four days.
Furthering this decline of central IT was the introduction of the PC and the Internet in the mid-1990s. Now the power of programming and control could literally be placed in the hands of normal users. Central IT strove to retain control over this environment, but was generally consigned to setting standards, policy creation, and subsequent enforcement. It was at this point that I transitioned from a small company to a much larger enterprise (> 25,000 employees). Evident at the time was this “tension” between IT and the Business community. The genesis for Carr’s commoditization phase can be traced directly to this newfound “power” that had been taken by the Business community.
Finally – to my opinion of whether or not Carr has the right idea… I would agree with Carr that IT as a function will become more decentralized as technologies such as the Internet become more pervasive. Bandwidth increases will generally occur as more companies light up the large amounts of dark fiber optic cable installed during the Dot Com boom / bust cycle. However, I suspect that his view that this “irrelevance” is already happening is ill-timed, and I can point to several anecdotal situations where any attempt at making IT a commodity may be hindered.
First, a critical piece of infrastructure assumed necessary for this commoditization may not actually be there. The percentage of unlit fiber optic network cable was generally thought to be larger than that of “lit” or operational cable. Articles as old as 2002 (http://www.cio.com/archive/101502/et_pundit.html) indicate that the cable may in fact already be close to capacity and, more importantly, where not lit, may be obsolete due to technological enhancements. Even more recent research (http://news.com.com/Dark+fiber+Businesses+see+the+light+-+page+2/2100-1037_3-5557910-2.html?tag=st.next) shows that where such dark fiber exists, it will not generate the commoditization envisioned by Carr, but will actually foster the continued centralized management of a corporation’s network and IT infrastructure.
The other basic flaw with any attempt at making IT irrelevant is the current state of software “engineering”. As stated (anecdotally) in previous postings, there is still no standard way to write software. The infrastructure may be heading towards commodity status, but the applications that run on it are far behind. The sheer variety of platforms, languages, and databases makes for too much differentiation in how applications can be architected. Because of this, any standardization will be difficult if not impossible to accomplish.
It will take the combined efforts of the various software industries to make this happen, if it happens at all. While this may seem cynical, it is unfortunately true that software is inherently unstable and prone to any number of problems, from intentionally bad design that circumvents controls to unintentional programmer stupidity. Unlike the utility models (electrical and transportation) presented by Carr, software is the key to the ubiquitous computing model. Standard design, standard controls, and standard interfaces must all be in place before any of the Utility Computing models can be effected.
Wednesday, October 18, 2006
Oracle patching Tsunami
Yeah – it’s been quiet here at InfoSecToday. I’ve been letting Mike and Steve run with our collectively demented thought processes. However, this one from Oracle, posted on Computerworld, piqued my interest this morning.
One thing that we’ve all learned (the hard way) is that you must keep up with vendor patches to the best of your collective abilities. Of course, this means that you do a proper risk assessment and only disrupt your business’s operations to the minimum level necessary. In the case of Microsoft, they got religion after Slammer and Blaster and now have a robust (really, they do) vulnerability assessment mechanism and patch distribution program. Earlier this week, Oracle finally admitted that they’ve drunk the patching Kool-Aid® and will be releasing patches not only on a regular basis, but with actual details on what they’re patching and why.
Hallelujah. But that was the easy part. The hard part will be getting all of the infrastructure and business application groups that have come to enjoy a significant level of “patch complacency” to start thinking in the “must patch regularly” mindset. This means that SLAs (service level agreements, for those not in the ‘enterprise’ spaces) need to change, and, horror of horrors, you’ll actually need to know what’s deployed in your environment – AND – at what versions.
Where am I going with this? It means that the burgeoning CMDB market will get a boost, existing Change Management (another Enterprise euphemism for doing what you said you’d do, when you said you’d do it) processes will get better, and overall this is a good thing. Now – if we can only get the other players, especially all of the layered products out there (Adobe, etc.), to play the same game, we’d all be much better off. Although some within my group may disagree, CVSS and CVE are good starts. The software industry as a whole needs to get on the same wagon, and not fall off when things get rough.
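To make that concrete, here is a minimal sketch of the kind of check a CMDB makes possible: join an inventory of what’s deployed (and at what version) against vendor advisories, and let the CVSS base score drive patch priority. Every host name, product, version, score, and CVE below is invented for illustration – this is not any particular vendor’s tool or API, just the shape of the idea.

```python
# Hypothetical sketch: prioritize patching by joining a CMDB-style inventory
# against vendor advisories carrying CVSS base scores. All data is made up.
from dataclasses import dataclass

@dataclass
class Advisory:
    product: str
    fixed_in: str      # first version containing the fix
    cvss_base: float   # CVSS base score published with the advisory
    cve: str

# Stand-in for a CMDB extract: host -> (product, installed version)
inventory = {
    "db-prod-01": ("oracle-db", "10.2.0.1"),
    "db-prod-02": ("oracle-db", "10.2.0.4"),
    "app-web-01": ("acrobat-reader", "7.0.8"),
}

advisories = [
    Advisory("oracle-db", "10.2.0.4", 7.5, "CVE-XXXX-0001"),
    Advisory("acrobat-reader", "7.0.9", 6.8, "CVE-XXXX-0002"),
]

def version_tuple(v: str) -> tuple:
    """Turn '10.2.0.1' into (10, 2, 0, 1) so versions compare numerically."""
    return tuple(int(x) for x in v.split("."))

def patch_worklist(inventory, advisories):
    """Return hosts missing a fix, highest CVSS score first."""
    work = []
    for host, (product, installed) in inventory.items():
        for adv in advisories:
            if adv.product == product and version_tuple(installed) < version_tuple(adv.fixed_in):
                work.append((adv.cvss_base, host, product, installed, adv.cve))
    return sorted(work, reverse=True)

if __name__ == "__main__":
    for score, host, product, installed, cve in patch_worklist(inventory, advisories):
        print(f"{score:>4}  {host:<12} {product} {installed} -> needs patch for {cve}")
```

The scoring and joining parts are trivial; the inventory side is the hard part – which is exactly why the CMDB market gets the boost.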
More to come…
Bill P
Wednesday, October 04, 2006
Nice NIST Notes
It’s been wild here in my organization for the last few weeks. My efforts to get InfoSecToday off the ground have been sidetracked more than a handful of times due to work, kids, old diesel cars, etc. For those of you who don’t know me directly, I work for a Fortune 100 (almost) Property & Casualty insurer in the Northeast US. Several weeks ago, not only did we lose our fearless leader in the Info Sec space, but we were also informed that our group is being reorganized into a yet-to-be-determined “matrix” structure. As the Flight Attendants always say – “Please be sure that your seat backs and trays are in the upright and locked position for Landing”… More to come…
Anyway – here is yet more really, really good reading from our friends down at NIST regarding Forensic best practices. If you don’t visit their site often, you should. Located there are sets of docs that can make a great foundation for almost any Standard you need regarding topics like forensics, device configuration, RFID (like we don’t have issues there), SCADA, performance metrics, and more.
One of the problems that I’ve consistently seen across a scattering of IT Security orgs is a lack of basic understanding of what really constitutes InfoSec best practices. Sites like NIST’s go a long way towards bettering our ability to craft meaningful policies and standards, as well as grounding them in something that is “external” to our respective organizations. That makes it much easier when Selling Security to our Business partners.
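As a rough illustration of that “external grounding”, even a simple crosswalk from internal standards to the NIST Special Publications they draw on is enough to answer the “says who?” question from the Business. The internal standard names below are invented for the example; the SP numbers are real NIST documents covering forensics, configuration checklists, and security metrics.

```python
# Hypothetical sketch: map internal standards to the external NIST docs they
# are based on, so policy decisions can point at something outside our org.
crosswalk = {
    "Incident Evidence Handling Standard": ["NIST SP 800-86"],  # forensic techniques in incident response
    "Server Hardening Baseline":           ["NIST SP 800-70"],  # security configuration checklists
    "Security Metrics Program":            ["NIST SP 800-55"],  # security metrics / measurement
}

def justify(standard: str) -> str:
    """Return a one-line justification naming the external references, if any."""
    refs = crosswalk.get(standard)
    if not refs:
        return f"{standard}: no external reference on file"
    return f"{standard}: grounded in {', '.join(refs)}"

if __name__ == "__main__":
    for name in crosswalk:
        print(justify(name))
```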
Lastly (for today) – thanks Mike for the “plog”. (Plugging my Blog)
Anyway –
Bill P