Monday, December 11, 2006

Ceci n'est pas un appareil.

Fine. I haven’t had anything of any consequence to chat about for a bit (other than trials and tribulations with a pair of teenage boys, and maybe my grease-car habits) – but this thread about Agents and Appliances is threatening to send me over the proverbial network edge.

So – a level-setting comment. I’m not knocking or promoting anyone’s particular technology or implementation. My comments here are solely those of someone who is sick and tired of issues and technologies being blended together under the guise of “low / no maintenance” (aka “appliances”) or of “low server impact” (a la “no agents to hog your CPU or memory resources”).

First – the concept of an IT/IS “appliance” is crap. Yes – quote me on that one. An appliance is something heavy, dumb, and generally bisque (hey – it’s not the 70’s anymore – no more Avocado) in color. Examples: a refrigerator, an oven, a freezer, or, if you want to get into some fancy programming, a microwave that just “knows” when the popcorn is done. What the vendors are pushing on unsuspecting IT staffers and users are actually intelligent systems that have: disks (hard drives or flash); memory; something along the lines of an operating system like BSD, NT4e, WinXPe, or worse – thinly veiled full Linux distros – or much, much worse – full-blown Windows 2000 / XP / 2003 OSes.

Guess what – it’s *not* an appliance anymore. Anyone who says it is should be taken out to the IT Wood Shed and severely thwacked about the ears with an RS232-C cable (with the D25 connectors still on). If it has an OS – it needs to be patched and maintained. If it’s on the network (otherwise – what’s the point!?!) it needs to play nice with all other systems and comply with standards. Since when do “appliances” have their own web servers and SMTP servers? Most of them do if you look under the covers deep enough. If it smells like an OS, looks like an OS, and talks like an OS, then it needs to be treated *like any other server or PC in your environment*. Period. End of discussion. Any vendor or IT/IS “specialist” who tells you otherwise is: a) an idiot, b) a fool, or c) looking for another job (they just don’t know it yet). Strong words? Yes – but the madness has to stop somewhere.
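If you want to see just how un-appliance-like one of these boxes really is, you don’t need the vendor spec sheet – a quick poke at a handful of ordinary server ports usually tells the story. A minimal sketch in Python (the host name is hypothetical – point it at whatever “appliance” is sitting on your bench network, with permission of course):

import socket

# Hypothetical address of the "appliance" under inspection.
HOST = "appliance.example.internal"
# Ports you would never expect a toaster to answer on.
PORTS = {22: "ssh", 23: "telnet", 25: "smtp", 80: "http", 443: "https"}

for port, name in sorted(PORTS.items()):
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.settimeout(2)
    try:
        s.connect((HOST, port))
        try:
            # Many services announce themselves; HTTP will just sit quietly.
            banner = s.recv(128).decode(errors="replace").strip()
        except OSError:
            banner = "(no banner)"
        print("%d (%s): open  %s" % (port, name, banner))
    except OSError:
        print("%d (%s): closed or filtered" % (port, name))
    finally:
        s.close()

If the “dumb” box answers on half of these, it’s a server, and it gets treated like one.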

Secondly – “Agents” – yes, no one wants them, but without them you have several very, very serious issues to reconcile. It’s easy – if an agent-less “appliance” needs to get to the innards of that application thingy, how does it do it? Simple – just give it an “application ID”. What do these IDs generally require, you ask? Easy – full access. Rarely is there any attempt to design in Least Privilege with these “safe” agent-less systems. Since it all works if you have Root (or the equivalent), they get it by design.
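Here’s a rough sketch (Python, with made-up account names and privileges) of the kind of sanity check I wish came bundled with every one of these agent-less wonders – compare what each “application ID” has actually been granted against the least privilege it genuinely needs:

# What a read-only monitoring ID should need (an assumption for illustration).
NEEDED = {"SELECT"}

# Hypothetical inventory of application IDs and what they have been granted.
GRANTED = {
    "monitor_svc":  {"SELECT"},
    "appliance_id": {"SELECT", "INSERT", "UPDATE", "DELETE", "DBA"},  # the usual "just give it root"
}

for account, privs in sorted(GRANTED.items()):
    excess = privs - NEEDED
    if excess:
        print("%s is over-privileged: %s" % (account, ", ".join(sorted(excess))))
    else:
        print("%s is scoped to least privilege" % account)

Trivial, yes – but I have yet to see a vendor ship anything like it.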

Where does this leave us now? We now have an unpatched/unpatchable operating system with little or no documentation of its innards, running around on our internal “secure” networks, gaining access to “secure” systems with absolutely full control. I ask you – what would you rather have? Agents, or agentless zombie-wannabes…

Back to lunch… (or to losing it)

Sunday, November 05, 2006

Low dose aspirin and red wine…

What does this have to do with InfoSec?!?!? Hear me out - and think about it...

We are all getting older, and the systems that make up our bodies age accordingly, not unlike the systems that we (try to) secure. Both were in great condition when first deployed, and as time went on, grew slower and more clogged with the detritus of the aging process. Both become set in their ways, more resistant to change, and less resilient to interruptions.

It occurred to me today that the current trend (maybe it will keep me alive longer) is to drink more red wine and take a low-dose aspirin daily. One has components from the skins of red grapes that appear to keep arteries clear, and the other keeps the platelets in our blood from clotting too easily (hence clogging up the works).

The interesting thing here is that these are, at the same time, both proactive and reactive. They are proactive in the sense that we can use relatively small amounts of a substance to ward off the effects of aging. And reactive in the way that these same substances appear to reverse, to a large extent, the effects of bad habits and changes in use-cases over the years.

So – it’s Sunday evening and where am I going with this? It’s simple, really. What drives us (at least some of us) to bulk up on cheap Cabernet and generic 81mg aspirin? It is the desire to live longer and get ourselves back to a state where we can be effective and healthy, while not having to make radical changes to our habits and ways. It can be the same with the systems that we manage, maintain, and secure.

We don’t have to go for the “big fix” or the dramatic diving save, but can instead take the slower, measured path with judicious application of process, policy, and personal changes. Not only does this help with the stability of our applications, but it gives us, as InfoSec professionals, the ability to work with systems that are predictable and tolerant of gradually increased controls.

Enough for today… Now – back to that Cab resting downstairs, I can hear it calling my name….

Tuesday, October 24, 2006

Know thy Business - really, I mean it...

Interesting article in today’s Computerworld feed. Something that I’ve been harping on to my peers for a number of years – and to anyone who would listen – is the concept of Knowing the Business of Information Security. As Mary Brandel points out in her article, “The Team at the Top”, it is becoming increasingly important for technology leaders at the top tiers to be not only good Technologists, but also good Business people.

The thing that has always bugged me is that with stove-piped or otherwise narrowly defined organizational structures, the technology guys & gals are not really that cognizant of what they are actually supporting. Sure – if you ask them what they “do”, they will say: “I’m a Network Engineer”, or “I’m a Security Analyst”, or “I’m a Software Development Manager”. What they all fail to indicate is their relationship to the Company.

What they should actually focus on is that relationship. When asked, the answer should always be: “I’m here to support the Business in meeting their goals within their requirements”. Anything you do to support that is a function of your job, but it’s not your job. Semantics? Maybe – but as Mary points out – top-level leadership is looking for individuals who can walk and chew gum at the same time. That basically means you absolutely must be able to speak the language of Business, and secure / operate / maintain the IT infrastructure, while at the same time maintaining focus on what it is you are really doing – helping the Business make money.

To remain solely a technological asset, at least at the upper tiers of administration, is not a forward-looking position, and will guarantee that you become irrelevant in both the near and long term.

Monday, October 23, 2006

The Death of Information Security...

I was going over some recent graduate work I had done a year or so ago, and came across an article of mine on Nicholas Carr’s take on “The End of Corporate Computing”. Now – from an InfoSec perspective – where does that place us? I’m not sure yet. This diatribe is a little over a year old, and Carr’s source article a little bit older. Maybe the title should be “The End of Central IT Security…”… Who knows…

Reading Carr’s “The End of Corporate Computing” left me with several conflicting opinions and thoughts about the future of IT. In his paper, he outlines what are basically four stages of a technological life cycle. Condensed down, these stages are: 1) new introduction, 2) rapid build-out, 3) vanishing advantage, and 4) commoditization. After my own readings, and those of other classmates, I agree with Carr’s illustrations – to a point. He indicates that Central IT is irrelevant, but I would argue that this may not yet be the case. First, though, confirmation that his theories are correct, at least up to the point of Vanishing Advantage, based on my own experiences.

I am probably in what will be called the “transitional generation” when speaking about IT technologies. I believe this gives me a unique (generational) perspective on Carr’s theories on Information Technology, the swing to “Utility Computing”, and the four stages, because I have lived through the first three. It may be argued that we’re in the fourth stage, Commoditization, but we’ll see.

Like all things in IT, it’s all about timing. My first experiences with IT were in the mid 1970’s at a New England university. It was only halfway through the first semester that the university formally recognized Computer Science as a discrete discipline. For a school of over 8,000 students, there were fewer than two hundred timesharing terminals, and half of those were hard-copy. Most companies had never thought of owning their own computing infrastructures, and the image of technicians in lab coats was still prevalent. Phase one encompassed the development in the late 1940s of the large, single-purpose computers, followed by the general-purpose mainframes of the 1960s and early 1970s.

Skip ahead to the early 1980’s and the rapid build-out and availability of low-cost timesharing systems for smaller businesses. No longer were these devices limited to the Fortune 100 or educational entities; any small business could afford a computer system of its own. Companies like IBM were still the sole producers of the large monolithic systems that had served the very highly centralized IT groups until this time, but smaller, more agile companies were now filling in the niches left by this expansion. My own experiences paralleled this expansion, as a System Manager for a small regional auto parts distributor who, for the first time in their (then) 50-year history, could afford to digitize their operations.

This period of “rapid build-out” was evidenced by the likes of Wang, Data General, and Digital Equipment. They brought consumers systems that did not need large cooling plants, large staffs, or custom programming. Applications could now be bought off the shelf and customized to meet a company’s particular needs.

Unfortunately, the transition into the third phase, vanishing advantage, can be attributed to the development of the PC and lower-cost midrange server platforms. These systems were faster, smaller, and more flexible than the “mini” computers produced during the 1980s. This allowed the shift from hardware budgets to software development budgets. No longer were particular manufacturers’ systems the focus; it was all about software. In my own experience during this time frame, we shifted from canned software to highly customized versions of the same suite. This allowed several critical business opportunities to be taken advantage of, primarily “just in time” ordering from our suppliers and subsequent delivery of restocked items to our stores. This one phase contributed to reducing order cycles from six months to six weeks, and store restocking cycles from four weeks to four days.

Furthering this decline of central IT was the pairing of the PC with the Internet in the mid-1990s. Now the power of programming and control could literally be placed in the hands of normal users. Central IT strove to retain control over this environment, but was generally consigned to setting standards, policy creation, and subsequent enforcement. It was at this point that I transitioned from a small company to a much larger one (> 25,000 employees). Evident at the time was this “tension” between IT and the Business community. The genesis of Carr’s commoditization phase can be traced directly to this newfound “power” that had been taken by the Business community.

Finally – to my opinion of whether or not Carr has the right idea… I would agree with Carr that IT as a function will become more decentralized as technologies such as the Internet become more pervasive. Bandwidth increases will generally occur as more companies light up the large amounts of dark fiber optic cable installed during the Dot Com boom / bust cycle. However, I suspect that his view that this “irrelevance” is already happening is ill-timed, and I can point to several anecdotal situations where any attempt at making IT a commodity may be hindered.

First, a critical piece of infrastructure assumed necessary for this commoditization may not actually be there. The amount of unlit fiber optic network cable was generally thought to be larger than that of “lit” or operational cable. Articles as old as 2002 (http://www.cio.com/archive/101502/et_pundit.html ) indicate that the cable may in fact already be close to capacity, and more importantly, where not lit, may be obsolete due to technological enhancements. Even more recent research (http://news.com.com/Dark+fiber+Businesses+see+the+light+-+page+2/2100-1037_3-5557910-2.html?tag=st.next ) shows that where such dark fiber exists, it will not generate the commoditization envisioned by Carr, but will actually foster the continued centralized management of a corporation’s network and IT infrastructure.

The other basic flaw in any attempt at making IT irrelevant is the current state of software “engineering”. As stated (anecdotally) in previous postings, there is still no standard way to write software. The infrastructure may be heading towards commodity status, but the underlying application infrastructure is far behind. The sheer variety of platforms, languages, and databases makes for too much differentiation in how applications can be architected. Because of this, any standardization will be difficult if not impossible to accomplish.

It will take the combined efforts of the various software industries to make this happen, if it happens at all. While this may seem cynical, it is unfortunately true that software is inherently unstable and prone to any number of problems, from intentionally bad design that circumvents controls to unintentional programmer stupidity. Unlike the utility models (electrical and transportation) presented by Carr, software is the key to the ubiquitous Computing model. Standard design, standard controls, and standard interfaces must all be in place before any of the Utility Computing models can be effected.

Wednesday, October 18, 2006

Oracle patching Tsunami

Yeah – it’s been quiet here at InfoSecToday. I’ve been letting Mike and Steve run with our collectively demented thought processes. However, this one from Oracle, posted on Computerworld, piqued my interest this morning.

One thing that we’ve all learned (the hard way) is that you must keep up with vendor patches to the best of your collective abilities. Of course, this does mean that you do a proper risk assessment and only disrupt your business’s operations to the minimum level necessary. In the case of Microsoft, they got religion after Slammer and Blaster and now have a robust (really, they do) vulnerability assessment mechanism and patch distribution program. Earlier this week, Oracle finally admitted that they’ve drunk the patching Kool-Aid® and will be releasing patches not only on a regular basis, but with actual details on what they are patching and why.

Hallelujah. But – that was the easy part. The hard part will be getting all of the infrastructure and business application groups that have come to enjoy a significant level of “patch complacency” to start thinking in the “must patch regularly” mindset. This means that SLA’s (service level agreements, for those not in the ‘enterprise’ spaces) need to change, and, horror of horrors, you’ll actually need to know what’s deployed in your environment – AND – what versions.
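Even a crude inventory beats no inventory. A minimal sketch of that “know what you have” step in Python – the host names, versions, and the “patched” baseline below are all made up for illustration:

# Hypothetical baseline: the minimum version the latest patch set requires.
MIN_PATCHED = (10, 2, 0, 3)

# Hypothetical inventory of hosts and the database version each one runs.
INVENTORY = {
    "db-finance-01": "10.2.0.3",
    "db-claims-02":  "9.2.0.7",
    "db-hr-01":      "10.2.0.1",
}

def parse(version):
    # Turn "10.2.0.3" into (10, 2, 0, 3) so versions compare numerically.
    return tuple(int(part) for part in version.split("."))

for host, version in sorted(INVENTORY.items()):
    status = "OK" if parse(version) >= MIN_PATCHED else "NEEDS PATCHING"
    print("%-15s %-10s %s" % (host, version, status))

Feed that from a real discovery process instead of a hard-coded dict and you’re most of the way to the CMDB territory I get into below.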

Where am I going with this? It means that the burgeoning CMDB market will get a boost, existing Change Management (another Enterprise euphemism for doing what you said you’d do, when you said you’d do it) processes will get better, and overall this is a good thing. Now – if we can only get the other players, especially all of the layered products out there (such as Adobe, etc.), to play the same game, we’d all be much better off. Although some within my group may disagree, CVSS and CVE are good starts. The software industry as a whole needs to get on the same wagon, and not fall off when things get rough.

More to come…

Bill P

Wednesday, October 04, 2006

Nice NIST Notes

It’s been wild here in my organization for the last few weeks. My efforts to get InfoSecToday off the ground have been sidetracked more than a handful of times due to work, kids, old diesel cars, etc. For those of you who don’t know me directly, I work for a Fortune 100 (almost) Property & Casualty insurer in the Northeast US. Several weeks ago, not only did we lose our fearless leader in the Info Sec space, but we were also informed that our group is being reorganized into a yet-to-be-determined “matrix” structure. As the Flight Attendants always say – “Please be sure that your seat backs and trays are in the upright and locked position for Landing”… More to come…

Anyway – here is yet more really, really good reading from our friends down at NIST regarding Forensic best practices. If you don’t visit their site often, you should. Located there are sets of docs that can form a great foundation for almost any Standard you need, on topics like forensics, device configuration, RFID (like we don’t have issues there), SCADA, performance metrics, and more.

One of the problems that I’ve consistently seen across a scattering of IT Security orgs is a lack of basic understanding of what really constitutes InfoSec best practice. Sites like NIST’s go a long way towards bettering our ability to craft meaningful policies and standards, as well as giving us a grounding in something that is “external” to our respective organizations. This makes it much easier to be Selling Security to our Business partners.

Lastly (for today) – thanks Mike for the “plog”. (Plugging my Blog)

Anyway –

Bill P

Thursday, September 28, 2006

New kid on the block...

I thought I'd better get on the blogosphere bandwagon before the music died down...

Most anything regarding Information Security (and related stuff) is appropriate for posting... One thing I've learned in my 27+ years in IT is that sharing amongst peers (as well as lesser mortals) is an absolute must if anyone is to learn from past mistakes.

Bill P