What you say is not untrue, but the larger issues (IMHO) are that:
1. Most people design such that they avoid trouble and confrontation.
2. Most IT auditors have no IT experience.
For #1, most people have lost the ability to rationally assess risk. No one wants to be the guy who says "I saved $xxxxx by spec'ing a lower-end box that will still handle the load," because that's the first decision that will be second-guessed if there is ever a problem. In most cases the IT department has lost touch with the business value it provides, so we get this proliferation of redundant servers and network gear that sits idle.
There is a hardware aspect to it, though. Developers tend to assume they are writing for a machine that executes instructions in zero clock cycles, has infinite memory, and sits on a network with zero latency and infinite bandwidth. Rather than try to correct these misunderstandings, IT will throw money at the problem to make it run and avoid blame.
For #2, I'm not sure what else has to be said. I have only met one auditor whom I respect and who actually gets these kinds of discussions. He explained to me that he understood some of these things made no technical difference, but the problem was convincing every other auditor. Sometimes it's easier just to bite the bullet and do things sub-optimally rather than spending several hours explaining it each time the (new) audit team comes around. Back to #1: the cost of being right is high and the benefits are almost nil.
With respect to your argument, you're mixing up data durability and data loss prevention. They are both aspects of security (e.g., mitigating risk), but I'm sure most IT departments would agree that they are more worried about a critical Excel spreadsheet getting into the hands of the media or a competitor than about Excel crashing because of a memory error. The cost and likelihood of the former dwarf those of the latter.
Sean
On Wed, Feb 17, 2010 at 10:20 PM, Adam Thompson <athompso@athompso.net> wrote:
<soapbox> That's because we don't, collectively, think about hardware. And we don't think about hardware being buggy. And we especially don't think about "hardware" having inherent security flaws.
(OK, yes, the security folks who crossed over *into* IT do. They aren't auditors, for better or worse.)
A Cisco router is "software" enough (and has had enough bugs :-) that it crosses into our conscious awareness regarding security, but their switches? Nah. Mature product, all hardware (despite running an OS), no bugs. It either works or it doesn't.
Bullshit.
Show me a hardware-accelerated device and I can show you half a dozen ways it could fail unnoticed, (potentially) compromising security as it goes.
Notice that we install local firewalls on every PC but don't use ECC memory to guard against random bit errors. (I do, BTW - even on my PC. It's one small part of why I don't have a laptop.) A HERF gun is a better DoS tool than any virus or worm, by several objective measurements.
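To make the ECC point concrete, here is a minimal sketch (not anything from Adam's setup) simulating the kind of silent single-bit error that ECC memory catches in hardware. Real ECC DIMMs use a Hamming-style SECDED code per 64-bit word; this toy uses a single parity bit, which can only detect (not correct) an odd number of flipped bits, purely to show the detection idea:

```python
def parity(data: bytes) -> int:
    """Even parity over all bits of the buffer (1 if the count of 1-bits is odd)."""
    p = 0
    for byte in data:
        p ^= byte
    # fold the accumulated byte down to a single parity bit
    p ^= p >> 4
    p ^= p >> 2
    p ^= p >> 1
    return p & 1

def flip_bit(data: bytes, bit_index: int) -> bytes:
    """Simulate a cosmic-ray upset: flip exactly one bit in the buffer."""
    buf = bytearray(data)
    buf[bit_index // 8] ^= 1 << (bit_index % 8)
    return bytes(buf)

original = b"critical spreadsheet cell: 1000000"
stored_parity = parity(original)

# One flipped bit: no crash, no log entry, just different data.
corrupted = flip_bit(original, 13)
assert corrupted != original
# The parity check notices; non-ECC RAM would hand the bad value straight to Excel.
assert parity(corrupted) != stored_parity
```

The point of the sketch is that the corruption itself is completely silent; without a checking code there is nothing to observe until the wrong number shows up somewhere downstream.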
The entire IT industry has its head stuck up... you know where, in so many different ways.
Yet, this isn't surprising. Humans want instant gratification, a free ride, and the illusion of control. Those things are all way easier with software than with hardware. (Contemplate the difference between "soft" and "hard", if you will, for a moment.)
Do I expect this to change any time before the heat death of the universe? No. But I sure wish auditors took a wider view of the world.
"Never attribute to malice that which can be adequately explained by stupidity." - Hanlon's Razor (among other attributions)
</soapbox>
-Adam