Monday, August 31, 2009

The Pamplona china shop, part II

The best side effect of having a dishwasher in the break-room is that hardly any plastic utensils end up in the landfill. I'd finished lunch using one of the honest-to-pete metal office forks, and found myself carrying it back to the kitchen dangling from a loosely pinched thumb and finger. That's how my first grade class was taught to carry scissors (even the rounded-end ones). Which directly countermanded the instructions of kindergarten, where we were supposed to clutch the pointy end in our hands in case we fell.

I don't suppose that in the grand scheme of things, the contradiction matters. In either case, the central tenet is to minimize the damage you do if you fall while holding them, even if you're so egregiously reckless as to be running at the time. I don't know how the current crop of kindergarten and first grade students is being instructed in proper scissor-holding, and I really don't care, so long as they're not taught to fear scissors. Apart from the fact that a phobia of plastic scissors with rounded ends and blades (specifically engineered to take away the forbidden pleasures of cutting cloth and hair) is just, well, lame, that's not the point.

The point, of course, is learning to recognize and respect the power and the drawbacks of any tools that come into your hands. It's not enough to memorize the rules. Before you can know when it's better to break them, you have to understand why, and in what context, the rules were made. And you have to have some idea of what happens after they're broken.

In the context of software development, I used to think that this largely applied to the folks who wrote the software, more so than to their management and support. After all, if you make a point of hiring smart people, you won't have to worry about them making mistakes, right??? Well, then I completely deleted the production database by accident. Clearly I'm not as smart as I used to think. And when you tally the real brainiacs you know, you can probably think of at least one whose raw IQ and common sense are inversely proportional. Case in point: My husband's best friend in high school--the one who pulled a ridiculously high GPA despite being half in the bag while doing his homework (he claimed Calculus made more sense that way)--also managed to run himself over with his own snowmobile. And that's not the only story I can tell, trust me.

But expecting policies and procedures to save the organization from big mistakes is equally far-fetched. When I begged the senior DBA to please take away the administrator rights for my regular login, he just laughed and said, "You won't do that again, will you?" He was, of course, right. That wasn't the last "oops" I've made with a database, but that particular mistake has never been repeated.

The only silver bullet is to realize that there is no silver bullet, and simply to have the means and know-how on hand for when someone accidentally deletes data or screws up source code. Setting up situations where another set of eyes is involved can head off a world of hurt. Again from the software realm: requiring (or heavily incentivizing) programmers to integrate their work with others' means that you'll see far more small yellow flags than red ones big enough to hang outside a Perkins.
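If you want a concrete (and entirely hypothetical) picture of what one of those extra sets of eyes can look like when no human is handy, here's a minimal sketch in Python: a destructive operation that refuses to run against a protected environment until the operator retypes the name of the thing being destroyed. The drop_database stub and the environment names are invented for illustration; the idea is the speed bump, not the handful of lines of code.

    """Sketch of a 'retype the name' guard for destructive operations.

    Everything here is illustrative: drop_database() is a stub standing in
    for whatever dangerous action actually gets performed, and PROTECTED is
    a made-up list of environments that deserve extra friction.
    """

    PROTECTED = {"production", "prod_reporting"}  # hypothetical environment names


    def drop_database(name: str) -> None:
        # Stub: in real life this would call out to the database server.
        print(f"(pretending to) drop database {name!r}")


    def confirmed_drop(name: str) -> None:
        """Drop a database, but demand deliberate confirmation for protected ones."""
        if name in PROTECTED:
            typed = input(f"Type the database name to confirm dropping {name!r}: ")
            if typed.strip() != name:
                print("Confirmation did not match; nothing was dropped.")
                return
        drop_database(name)


    if __name__ == "__main__":
        confirmed_drop("scratch_copy")   # unprotected: runs immediately
        confirmed_drop("production")     # protected: demands the retyped name

The point isn't the particular guard; it's that the expensive mistake has to be made on purpose, which is most of what that senior DBA's raised eyebrow was buying anyway.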

I could go on, but instead I'll hope that I've made my point: that it's important to structure people, processes, and resources around the assumption that something will eventually fail in production. I really wish that the famed Apollo 13 line had been, "Giving up is not an option," rather than "Failure is not an option." It's one thing to fire someone for laziness, shoddiness, or willful ignorance. But failure on a managed (and hopefully small) scale should be merely the price of doing business. Particularly in industries like software (where we expect constant innovation), failure is largely an exercise in R&D. In that context, punishing failure is punishing the core process of trying new ideas, and that's the most self-defeating business practice I can think of.