It's as true with computers as it is in the real world.
Case in point from the last day or so of work: I was enhancing/fixing code that used Secure Sockets Layer ("SSL" for short--it's what should be going on under your browser's hood when you're providing sensitive information such as credit card numbers). I promoted it up to the integration server, spot-checked things, then pushed it up to the Quality Assurance server, and spot-checked a few things more.
Because an inordinate number of clients still access the application with Internet Explorer 6, I've taken it upon myself to put up with the nuisance of working in that outdated--and, consequently, less "secure"--browser. Unfortunately, all Hades broke loose for those using Internet Explorer versions 7 and above. The result was a scramble to reverse my work, which is a right pain.
The heart of the issue was the browser's same-origin policy--the safeguard meant to prevent attacks such as cross-site scripting. Normally, if the JavaScript code on a web page delivered by one server attempts to access code on a web page from another server, browsers block it as potentially hazardous. However, in this case, the pages were being delivered from the same server--with the difference that one page was being encrypted by SSL and the other wasn't. Same server, same domain; only the scheme--and therefore the port--was different.
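To make the browser's logic concrete, here's a minimal sketch (in Python, with `example.com` standing in as a hypothetical domain) of the origin comparison at work: two URLs count as the same origin only if scheme, host, and port all match, so an http page and an https page from the very same server still fail the test.

```python
# A rough sketch of the same-origin check browsers apply.
# Two URLs share an origin only when scheme, host, AND port all match.
from urllib.parse import urlsplit

def origin(url: str) -> tuple:
    parts = urlsplit(url)
    # Fall back to the default port: 443 for https, 80 for http.
    port = parts.port or (443 if parts.scheme == "https" else 80)
    return (parts.scheme, parts.hostname, port)

def same_origin(a: str, b: str) -> bool:
    return origin(a) == origin(b)

# Same server, same domain -- but http vs. https means different
# scheme and port, so the browser treats these as different origins.
print(same_origin("http://example.com/page", "https://example.com/page"))  # False

# An explicit default port is still the same origin.
print(same_origin("http://example.com/a", "http://example.com:80/b"))  # True
```

In my situation, the first comparison is exactly what tripped the alarm: identical server, identical domain, but the mixed http/https delivery made the two pages strangers in the browser's eyes.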
To me, such browser paranoia is the equivalent of making toddlers take off their shoes at the airport. Frankly, I expect anyone--or anything--charged with "protecting" me to have as much brain as brawn. In the context of software, that ratio had better be even higher. Because the upshot of such overreaction is that anyone in a similar situation will have to find a workaround. (Which, by the bye, is a polite term for "hack.") The lessons of Hadrian's Wall, the Berlin Wall, the Great Wall of China and the Maginot Line should be pretty self-evident, yes?