Tuesday, March 31, 2009

Better decision-making with fewer facts?

I don't know about you, but I'm rather sick of the way that the word "fact" is twisted. I've already forgotten quite a bit of the statistics class that I took last semester, but one thing will stick with me for the rest of my life: Correlation does not equal causation.

For instance, you could randomly sample thousands of peanut butter sandwiches made in thousands of kitchens every day, and find a high correlation between peanut butter and jelly (together) as sandwich ingredients. Does this mean that peanut butter causes jelly? Of course not. I don't think that you will find anyone even remotely reasonable who would think that. But swap two other correlated things in for the PB and the J, and loads of people will immediately seize on the "evidence" of those two things being found in tandem and immediately connect the dots between cause and effect. Particularly if the cause-and-effect relationship involves losing weight, making money, or one-upping their ideological foes. Then, suddenly, peanut butter causing jelly is a "fact."

Me, I think that the world would be a lot saner if we were more stingy in awarding the term "fact." Personally, I keep the decision-making process much less cluttered by categorizing nearly everything as either "data" or "opinion."

Contrary to the popular catchphrase, the data do not (typically) speak for themselves. Data must be collected in a manner as unbiased as possible; they must be compiled in a manner as unbiased as possible; they must be contextualized fairly. (And it's all too easy/tempting to screw any one of those steps up.) And, in the end, someone has to put some (or all) of their credibility on the line by drawing a conclusion from both compilation and context--all the while trusting that the data are not already obsolete. And at that point, it becomes an opinion--an opinion, mind you, that can run the gamut from empirically-informed to out-of-the-ear ignorant, but an opinion nevertheless.

Yes, I realize that facts are safer, at least for one's peace of mind. If you cherry pick your facts carefully enough, you can weather all manner and magnitude of contrary data within the fortress of your own correctness. Personally, I pity anyone like that. Because facts aren't eternally true; more than a few never were.

I'm not saying that you have to review every last data point yourself before making a decision. (Unless, of course, such superciliousness is in your job description.) Neither is projecting your biases on someone else's conclusion a valid response. But I think that it's a good mental exercise to minimize the "facts" that you operate with. And be very, very suspicious of those that just happen to support the way that you want the world to be.

Monday, March 30, 2009

A question for the future of programming tools

I probably shouldn't admit this, but I made the same coding mistake in two separate languages today. Basically it boils down to a situation like this:

Instead of typing

foo = foo + bar

I fumble-fingered it as

foo = foo = bar

The first instance was a string concatenation in ASP classic. That stuck a literal "False" in the middle of the XML that the code was supposed to return. The second instance was in ActionScript, which merely kept the value of a counter variable "running in place," as it were.

Both compilers/interpreters were perfectly happy to let me do something like that without any scolding whatsoever, despite the fact that the second language (ActionScript) is fairly strict-schoolmarmish about grammar and punctuation.
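To make the failure mode concrete, here's a minimal sketch in JavaScript (standing in for its cousin ActionScript; the variable names are invented purely for illustration):

// Intended: accumulate a running total.
var total = 0;
var step = 5;
for (var i = 0; i < 3; i++) {
    total = total = step;  // fumble-fingered version of: total = total + step
}
// No error is raised--chained assignment is perfectly legal--so "total"
// just ends up equal to "step" (5) instead of 15: the counter runs in place.
// (In ASP classic/VBScript, the second "=" reads as a comparison instead,
// which is how a literal "False" landed in the middle of my XML.)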

But it kind of drove home a question I've been noodling for a while: Is the code development infrastructure having a mostly ignored tug-of-war with modern languages? Or is that even too simplistic? Tug-of-war implies two forces working in opposite directions; what if the tensions are actually more orthogonal to each other?

The tools that developers have available to them can be extremely "intelligent" in the sense that they handle indentation and can make reasonable (if not always contextually correct) guesses about what is intended. Some of them keep you on a short leash by scolding you for your mistakes even before you compile. Yet, at the same time, the explosion of programming languages in the last twenty-odd years has a downside. Almost anyone can invent a language and propagate it worldwide via online code repositories such as SourceForge. The predictable consequence is, of course, that some are quite lax--with sloppy compilers/runtimes to match. When they're the only language in their class and become a de facto standard (think JavaScript), that only compounds the problem.

(And one need not even invent a language ex nihilo, either. At one time or another, I've programmed in the C programming language, its object-oriented sibling C++, and at least four of their cousins: Java, JavaScript, PHP and ActionScript.)

As for the resolution of the aforementioned "tension" between languages and code development tools, I can't make much in the way of an intelligent guess at where it will eventually pull the whole universe of software development. What I do know is that the company that makes it easy to stay on top of a mainstream language--things like minimizing errors, boosting newbies up the learning curve, making it difficult to create bad design, and (most of all) making the tool itself as transparent as possible--is going to make an obnoxious amount of money in the years to come.

Sunday, March 29, 2009

Tweet and Retweet

I don't like to think that this is just me going all old-fogey in the "...we had to use the letter O" sense. But I mostly can't see the point of re-tweeting--at least not in any "shotgun" fashion. If a retweet's targeted at specific followers, yeah, that I get. Otherwise, the value-add's over my head. Then again, the dot-com coattail-surfers made such a travesty out of the concept of a "portal" that maybe I'm just looking at this as the proverbial old wine in new bottles.

Of course, there's always the possibility that my nieces and nephews are right and I'm just too uncool for this newfangled social media. After all, I did cut my dentures on Web 1.0, don'cha'know?

Saturday, March 28, 2009

A touchstone

I've told this story in a few different venues, so if you've heard it before, please forgive the repetition.

A few years back, my nephew's grade school was focusing on Presidents' Day. My brother-in-law was apparently trying to put history in context for said nephew by explaining, "When George Washington and Abraham Lincoln were President, there were no lights and no TV, and..," when my nephew--serious as a plane crash--interrupted with, "Stop it. You're scaring me." When my sister related the tale to me, I absolutely roared with laughter. But then I confessed--only half-jokingly--"If he'd said, '...and no internet,' I would have run screaming into the night."

To me, the true test of a technology is the answer to the question, "How did I live without this?" If it takes more than five seconds to envision the answer, it's a game-changer, in the major league sense of "game". Anything else is merely a frill, possibly just a fad.

Believe it or not, I do remember life before the Internet. Now instead of feng shui'ing my time around library hours and the city bus schedule (and hoping that at least one of the libraries has the information I need), I merely fire up a PC (another "How-the-heck-did-I-live-without-this?" game-changer, but that kinda goes without saying) and launch a search engine (which has also changed over the years).

I also remember life before what we used to call "walkman headphones." (The single hard plastic earbud for a transistor radio--AM only, even!--doesn't count.) Now I spend at least thirty hours a week with headphones in, just to focus on what needs doing. They've variously been plugged into my workstation's sound-jack, a few different MP3 players, and (once upon a time) a cassette player. But the headphones have always been there as an accessory for focusing. Since about 1984, anyway...

For answering machines, I was a little late to the party, not having one until the early nineties. My mother insisted that they were "rude." Me, I thought it was ruder still to make the people who want/need to talk to you run you to ground. And along came email, which is muuuuch less intrusive/obtrusive, with the bonus that you can include pictures or other files with it. And, for those folks whose email I want to give priority, there is forwarding to my El Cheapo pay-as-you-go cellphone via SMS.

I even remember life before VCRs (not to mention DVD and BluRay players), Web 2.0, microwaves, instant messaging and cable TV. All game-changers in their own right, in most cases spawning whole sub-industries. But you could take away any or all of them and I'd somehow manage to live a full life--personally as well as professionally.

So my challenge is to figure out what to create that most folks can't imagine living without. Or at least don't want to even imagine living without.

Friday, March 27, 2009

The Lake Wobegon (*) company

How many places have you worked that claimed, "We only hire the best"? A few, I'll be willing to bet. Now, if you've worked for various employers in the same industry, that's an interesting proposition. Hiring "the best" across the board must take a skill akin to that of Santa Claus, who manages to be in every mall you visit at Christmastime.

Nifty trick, that.

Not that I'm suggesting that you aren't, in fact, the best. For all I know, you are, and who am I to insist otherwise? Booyah: Go, you. Seriously. Me, I know darned well that I'm not. I merely have the proverbial Midwestern work ethic, and I've learned to time my (very) occasional prima-donna hissy-fit for maximum impact. For where I work, that is, mercifully, enough for most situations.

Yes, I understand that no one wants to tell their employees, "We hire losers," or even "We hire the B-list" or anything like that. That's just egregiously nasty--not to mention stupid. Yes, I also understand that managers want to encourage pride in a job well done and esprit de corps and all due etc. There's certainly nothing wrong with that.

But imagine what it would be like to actually have the best working for you. How would you treat the rock stars you're lucky enough to have on your payroll?

Rock star employees don't work in cube farms. 'Nuff said.

Rock star managers--I dare suggest--live in the hub of the workplace, keeping their finger on pulses, ears to the ground, their weather eyes peeled for trouble, yadayadayada. (Kind of like Twister, but with more sensitive body parts involved.) They do not wall themselves off in a corner office, with an administrative assistant for sentry duty.

Rock star anythings aren't paid "competitive" wages--everyone else is supposed to compete with them.

And, by the bye, real rock stars have roadies to do the (literal) heavy lifting. They most emphatically do not have to dodge the interference of the martinets or kowtow to the gatekeepers just to get their friggin' work done.

Maybe it's just me with a burr under my tail today, but I would really appreciate it if Corporate America would keep its Lake Wobegon fantasies to itself and stop insulting my intelligence with them. Because for me, at least, the repeated lie does not become the truth; it only makes me loathe the liar exponentially more with each repetition.

* For those who don't happen to follow Minnesota Public Radio's A Prairie Home Companion, Lake Wobegon is the "Gateway to Central Minnesota," where "...all the women are strong, all the men are good-looking, and all the children are above average."

Thursday, March 26, 2009

Flamebait

I feel like I'm channeling Bill Maher just now, because what's going through my mind is...

New Rule: Software that doesn't work straight out of the box is broken.

Granted, there are many and varying degrees of "broken." A whole continuum. Maybe even a multi-dimensional continuum.

There are not, however, any valid excuses for broken software, only acknowledgement.
  • If your software only supports certain hardware configurations and you don't document it, it's broken.
  • If your software does not run on certain operating systems and you don't document that, it's broken.
  • If your software is a web application, but you only optimized it for one browser, it's broken.
  • If your software relies on other programs that the user is responsible for obtaining and you don't make this process as brainless and painless as possible, it's broken.
  • If your software runs significantly slower on one OS than another, it's broken.
  • If your software whacks the user's data or system configuration while they're following your instructions for use (or even doing something that a reasonable layperson would consider intuitive), it's broken. Really, reeeally broken.
  • If your software doesn't toggle between similar file formats more or less gracefully, it's broken.
  • If your software isn't part of the operating system or a system service, but requires a system reboot after an upgrade, it's broken.
  • If the user has to hamstring or downgrade other software to use yours, it's broken.
  • Oh, and if you expect your users to work around these things (and call them stupid when they don't) because you're giving your work away for free, not only is your software broken, but you're also a wanker.
It's that last point that I want to hammer home. Frankly, I don't care how many hours of blood, sweat, toil and profanity went into any bit of software. Giving away software that doesn't work as advertised doesn't make you generous. Giving away functionality--no matter how useful--wrapped in a half-baked interface doesn't make you generous either. Caring about your users enough to put yourself in their proverbial shoes does. The caring bit applies to before, during and after you write the software.

Tonight's rant was brought to you by the never-ending great gobs of suckage that is video/audio on Ubuntu. Fortunately, no lasting damage was done, which is an improvement over, say, ReTune wiping out my iPod or Envy whacking my video configuration to the point where XWindows wouldn't start or OpenSUSE refusing to recognize a SATA drive after I changed motherboards. The only upshot is that I can't determine whether the program I have to write for school will actually play the .MIDI file that I downloaded to test it.

And now that I've named a few names, I expect to be flamed for the st00pid n00b or Windoz suxor or whatever it is I am for daring to suggest that Linux fanboys and fangirls stop beating their chests over how evil Microsoft and Apple are. Which is not unlike being lectured for my materialist ways by the hippies on the neighboring commune because I have this fetish for hot running water and toilets that flush.

Ubuntu is my OS of choice, hands down. But I refuse to make excuses for the absolute junk that makes it into the repositories. Because to do otherwise is flat-out insulting to the end user--who, by the bye, does not in fact have unlimited hard drive space, much less time for software that does not do what it says it does. Mercifully, Ubuntu also does a fantabulous job of scrubbing other people's lame software off the system.

And, now, if you'll excuse me, I need to switch over to the Windows side so that I can actually get some work done.

Wednesday, March 25, 2009

WWYFD?

James Herriot's All Creatures Great and Small portrays the Yorkshire farmer as virtually immune to bad news. The heartbreak of watching their backbreaking toil come to nothing is shrugged off with an "Ah. Well, t'ese t'ings happen."

I'm not like that. I've said many times that I'd be the one standing in the middle of a hail-flattened crop, shaking my fist and raging at the heavens--and I would, too. Even one or two beehives give me enough fretting over "livestock" to last the entire year, thank you very much. You don't last long in farming like that.

So I surprised myself a little tonight, shrugging off a lost freelancing gig--one big enough that I would have no life outside of it and my full-time job for at least two months. Not to mention that with it, I'm losing one heck of a chance to learn something new and important. Maybe I should start using "WWYFD" as a touchstone for these kinds of disappointments: What would a Yorkshire farmer do?

Fortunately (for my blood pressure, at least), freelancing isn't quite the same as farming. At least in the respect that the unlucky farmer is left solely with cleaning up the mess and biding time until the next season comes around. You, the freelancer, can immediately jump back into the thick of things: Pound the pavement for new business, become just a bit better at what you do, pick up a new angle (because you always learn so much just researching/writing the proposal), whatever. I'm just sorry that I was only an anonymous subcontractor on the afore-mentioned project. Given my 'druthers, I would be sending a polite "Thanks for considering us" note to the customer right now. Because people usually feel a little bad for having to say "no," and it would be nice to be able to let them know that I appreciated the opportunity to learn about them and their industry.

One other difference: You don't always have to cut your losses, either. In farming, dead cattle don't come back to life; neither do destroyed crops. But the competitor that undercut or out-promised you? They can fall flat on their face. I make a good slice of my salary working on second- and even third-hand code. My husband makes most of his from a client who turned down his firm's first proposal as too expensive, then returned after two other firms wasted countless hours and dollars and still couldn't deliver.

And so I'm off to research off-the-shelf software that could be customized for a different gig that could be out for bid shortly. Lots to learn from that project if it comes through, so better to know what I'm up against earlier than later.

Tuesday, March 24, 2009

From the Web 2.0 lexicon

Like most of my vocabulary-builders, I ran across something "edifying" while looking up the precise definition of something completely unrelated. In this case the serendipitous find was "facebookemon," courtesy of Urban Dictionary. For the record, I can boast 31 Facebook "friends" at the time of writing. No facebookemons in the lot, but, then again, that wasn't the point.

A friend once teased me with something along the lines of "If you're not a social butterfly, what sort of social insect are you?" I shot back: "I'm a social spider: I prefer that people come to me." The resulting flurry of banter led to him being challenged to a duel involving live wombats ("First blood only, or ... mortal wombat?"), wherein the "smackdown" (involving a single stuffed wombat) fizzled into a riff on Monty Python's "Dead Parrot" sketch. But that's another story.

Anyhoo. The underlying point is that there are no "good" or "bad" social insects. You just need to know what degree of signal-to-noise ratio you can tolerate. Mine's surprisingly low--"surprising" in light of the amount of distraction I've learned to cope with. As UrbanDictionary.com could tell you, YMMV.

Monday, March 23, 2009

Dinosaurs and mammals, Part II

I had a long conversation with half of a two-person business today. He was hoping to refinance his house, but the banks wouldn't touch him because he has only been working for himself for the last year or so, and they wanted two years of tax receipts. Now, understand that he's still paying for the house (at the higher interest rate) in the meantime. Moreover, understand that people with that kind of millstone hanging on their monthly budget typically do not go solo unless they think it's a pretty sure thing. Considerations which make the rational person want to beat her head against a wall for the sheer therapeutic value of accupercussion. Which should give you a solid idea of why I've never had much stomach for the so-called "science" of business.

But on the heels of the mortgage anecdote he said, "It's a great time to be a small business," and theorized that pinched purses enforce a certain "transparency." I took that to mean that customers insist on value for their dollar, rather than the cachet of doing business with a brand-name company. And without the overhead of the empire-builders and other parasites that are inevitable in anything larger than a smallish company--not to mention the spendy real estate they take up--I can completely see his point.

Granted, one entrepreneur does not an economy make. But I wonder whether we're seeing a sea-change in the attitude of the people who would normally gravitate toward large companies as places to work, places to buy from or sell to. If that's the case, it should be considerably less work to earn the trust of customers and vendors than it would be during boom times. If you can somehow scrape by without a private jet, million-dollar office refurbishments, three- and four-figure lunches, toadies by the platoon, and an au pair and driver for each of your children, well, I think we can fairly say that you have an edge on the clowns who tried driving the economy off a cliff. Who knows? Maybe some of them will be working for you someday. I hope you never let them forget it. ;-)

Sunday, March 22, 2009

A cocktail-party exercise

I don't think it's foolproof, but the answer to the question, "What's your ultimate comfort food?" can tell you quite a bit about your conversational partners at a cocktail party. Particularly if they've already had a couple drinks--much more particularly if you can find out why. (For the record, the weirdest response I've heard to date has been "swamp buck," meaning venison that lived around swamplands until it was shot and field-dressed. I am totally not making that up.)

See, I still subscribe to the adage, "It's three generations from shirt-sleeves to shirt-sleeves." In that light, someone's comfort food can tell you whether they came from money or are an up-and-comer. Which can tell you even more about them.

(In case you're curious, my answer is "Mac 'n cheese made with Velveeta, and canned kidney beans on the side." Why? Because Mom usually made that on the nights that she and Dad went to play Bingo at the Eagle's club, which meant that Flip Wilson would be on TV and my cousin Denise was coming over to babysit. Good times for a three or four year old, lemme tell ya...)

You?

Saturday, March 21, 2009

Excuses for success

There are probably more excuses for failure than the people who give them: No headline there. But what about excuses for success? Granted, every once in a while you'll hear some brave (successful) soul say something along the lines of "I was just lucky to be in the right place when the market took off."

To me, such candor is certainly better than claiming to be a genius. Big kudos to those folks for their humility. But the "Aw, shucks," self-deprecation ignores two basic considerations:
  1. Someone had their surfboard in the water when the wave started to crest, and
  2. Didn't (completely) freak out when they realized that the wave was, in fact, a tsunami.
What are your excuses for success? Persistence? Being plugged in to peripheral movement in your industry? Integrity? A spouse willing to take up the domestic slack while you bang on the keyboard into the wee hours? The aptitude to learn from mistakes before they do irreversible damage?

Whatever your excuse or excuses are, hang on to them. Not only are they a talisman against arrogance when you do conquer your mountain, they are part of the great story you will tell every time someone asks, "How did you do it?"

Friday, March 20, 2009

Too small to fail?

As AIG again headlines the news, I hope that the phrase "too big to fail" seems more patently ridiculous than it did during the panicked finish of 2008. Nothing--and I mean nothing--is too big to fail. Not T-rex, not the Roman Empire, not the Beatles. Nothing.

Similarly, it's impossible to be too small to fail. But I can't help but wonder whether the odds are better. What if, instead of trying to game the numbers four times a year to impress investors (who have a goldfish's memory--maybe even less), The Suits put that energy into actually tending their business--and the stakeholders who have a longer-term interest in its success?

In software development, we talk about whether a solution "scales"--i.e. if your usage spikes from a few hundred to several thousand or million, will it crash the entire system? Like software, human managers can only scale so much. When you're talking about organizations that make more money in a year than entire countries, and possibly employ more people than entire countries, that's a problem. Mere mortals--i.e. the CEO--or even a small cabal--the Board of Directors--can only spread themselves so thin. After that, it becomes a question of organization and motivation.

But the inherent problem is that if the organization is structured to maximize returns to a select few (shareholders, particularly institutional investors), the growth of the organization concentrates the rewards in a top-heavy fashion. And, predictably, it becomes a better economic proposition for those inside to scramble for the spoils, rather than focus on the activities that actually bring in the money.

True, companies with deeper pockets can--or should--take calculated risks that smaller ones can't afford. That's a huge advantage. But as you add more people, you add more inertia and its consequent group-think. That requires strong and smart leadership to overcome. Assuming that this will magically happen is one heck of a gamble. If the person heading the company is a founder, that's one thing--they have skin in the game. But someone who's guaranteed millions win or lose? Eh, not so much...

So, without anything more reliable than a knowledge of human nature to go on, I would bet on too small to fail before I'd bet on too big to fail.

Your thoughts?

Thursday, March 19, 2009

Application over-sharing

Like some people you know, computer applications can also over-share.

Seriously, how many years have computer users been griping about "splash screens"? Fifteen? Maybe twenty? And yet how many times does a barely disguised billboard bogart the middle of your screen, refusing to let you do anything with anything behind it until the featured program takes its place? As if I might just be a bit fuzzy on the concept of cause and effect: "Oh, look! That box says something about the Flibertiwidget program! What a coincidence: I just clicked on the Flibertiwidget icon! Small world, I tell ya..."

It's not only commercial products that do this. In fact, the most egregious one I can think of is the open source software development product called Eclipse. Not only does it grab your screen by the lapels, but it has to report on its loading progress the entire while. At least give me a cartoon before the main feature, already...

Believe me, I understand that the average programmer toils in anonymity. I do. For some, that annoying bit of screen-hogging chest-thumping is really all the glory to be had. But programmers--the best ones, anyway--understand the ethos and elegance of economy. Case in point: Email "toast." You can configure the Outlook or Thunderbird email programs, for instance, to make a summary fade in and out of the bottom corner of your screen when new mail arrives. It's just enough to catch the attention, but not obtrusive. Not like the Windows notification that parks itself in the lower corner to tell you what you should already know: "Your document was sent to the printer." (Booyah Windows: Take a couple victory laps. How about a Lambeau Leap?)

C'mon, software should act like the Victorian-esque butler: Invisible until you actually need to interact with it. Though perhaps in my case, Anthony Hopkins from "Remains of the Day" would be overkill. I probably need Sir John Gielgud from "Arthur" to occasionally slap the top of my head when I push the wrong buttons. But ultimately, computers and their software exist for my comfort and convenience. Anything else is a tiresome distraction. Believe me, I can find plenty of ways to distract myself. Even without the internet.

Wednesday, March 18, 2009

A new insight from an old acquaintance

It feels suspiciously like navel-gazing nostalgia, but I recently picked up Huey Lewis & the News' "Sports." Sometimes you can, after all, go home again: This was one of those times.

But in the quarter century between transferring it from vinyl to cassette (so that I could play it on my walkman) and transferring it from CD to an MP3 hard drive, my eyes have opened to the power of craftsmanship. Because that's what makes this album shine. These guys are so dang polished that you don't really notice how they slip between pop and rock and doo-wop and blues and even rockabilly.

In the proverbial nutshell, they make it seem easy to be good at what they do for a living. Something I thought--even well into college--that I could manage. Only now do I appreciate how much work and pure stubbornness goes into such "effortlessness." Which should be depressing (for a recovering slacker like m'self), except that there's a love of craft involved--not to mention the mule-headed refusal to be pigeon-holed. I like to think that was the case for the band. Particularly in the risk it took for a rock/blues/pop band to cover Hank Williams' "Honky Tonk Blues," albeit in a turbo-charged version.

And for that reason, that song's the one my adult self likes the best, even as the scraps of teenager left in me smile at the ones she remembers from Top 40 radio waaaaay too darned many years ago.

Tuesday, March 17, 2009

I should be happy, but...

Honestly, I don't quite know how I feel about this. "This" being the reported uptick in Computer Science majors/degrees, per the New York Times. Or, in the case of Bachelor's degrees, a decrease in the rate of decline, which isn't an increase per se, but is enough to be good news...in a math-y sort of way. (Pssst! That's the second derivative for you folks in Calculus this semester!)
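To put that math-y aside in concrete (and entirely made-up) numbers: if the count of degrees awarded went from 12,000 to 10,000 to 9,000 over three years, the count is still falling--the first derivative stays negative--but the drop itself shrank from 2,000 to 1,000, so the rate of decline decreased and the second derivative comes out positive. Still bad news, just less bad each year.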

Maybe it's the fact that I'm having a crisis of faith in education lately--and trust me, that's saying a lot. From my own experience, there seems to be no happy medium between the two flavors of Computer Science education.

In the University version (again, in my experience), you're taught concepts that come into play less than half the time. Honestly, most of my work life revolves around working with databases. The question of whether to use a binary sort vs. a bubble sort or whether the problem is NP-complete just does not rule my workaday life.

In the Tech. School version, you're mainly taught how to use the tools and the ins and outs of whatever languages the school specializes in. But in my subjective experience, you're largely left to hack out a solution with little thought for the underlying algorithms--assuming you even come into contact with the term "algorithm."

Although I have an ever-so-slight preference for the Tech. School version, both systems fall well short of the mark in what I think differentiates even the average developer from the person who's just making a living writing code. Issues of personality aside, three simple questions should suffice to answer the question of employment offer vs. rejection letter:
  1. Can this person communicate with techies and non-techies?
  2. How well does this person problem-solve?
  3. Does learning new things light this person up?
Point First: Communication. There is no substitute, and after a college degree, no excuse for not knowing how. A higher percentage of your development career will be spent working with humans, not computers. Technical brilliance is not a free pass. Frankly, I have no use for cowboy coders. Seriously. I don't care how brilliant they are. In the long run, they're more trouble than they're worth, even if you have a situation where you can isolate their work. (And that's even if they're not prima donnas.) Sooner or later you'll pay for it in the resentment of their more pedestrian team-mates who have to color inside the lines week in and week out. (Not to mention--horrors!--document stuff the whole time.)

Point Second: Problem-solving. This is where I feel that both approaches to computer science education fail in epic proportions. Basically, you spend somewhere between two to four years writing new code from scratch. That's horse-hockey. Not to put too fine a point on it, but "software development" is just shorthand for "fixing some idiot's crap code for a living." In a perfect world, every assignment would involve making a borked program work before you're allowed to add your little stamp of individuality via a mind-numbingly boring "enhancement."

Point Third: Learning. Yet again, from my limited experience, I don't think I've once been asked on an application (much less in an interview) what I'm working on in my spare time. Or even what I don't know but haven't made time to learn yet. Honestly, if it weren't for the brutal hours of the industry, I'd love to see college degrees in Comp. Sci. expire unless new coursework is taken. Other professionals lose their certifications without continuing education, and few things become stale faster than a technical degree.

So, as much as I'd like to see a healthier interest in software development--if only to make it tougher for the suits to buy more H1-Bs from our Congresscritters--I hope that the ultimate effect is to re-legitimize computer science as a viable craft. By my algebra, we've mostly lost about a quarter-generation of programmers. That can't be good.

Monday, March 16, 2009

A new axis?

Like any self-respecting consumer of free internet content and applications, I am exercising my inborn right to complain about them. In this instance, it's the revamped Facebook format. Leaving aside the chunkier look and feel, I miss the granular timeline. I mean, if I didn't want my cousins--and, inevitably, my Mom--knowing that I Kidnap'd a pal or threw a thong at my husband via Superpoke, I would have blown off their invitations. After that, does it really matter that they (much less Mom) know when?

But thinking about "when" makes me realize how much the question of timing has come to matter in 2009. Ten and fifteen years ago, the web experience was more one-way. Or if it was two-way, it was more structured (e.g. e-commerce). But "social media" is predicated on two or more people interacting, and timing is fundamental to that: Reading an answer before the question never makes sense, no matter how hard they tried on Jeopardy. And so I have to wonder if this shift--or even the Next New Thing after that--will revolve around the axis of time.

Content is still crucial, no question. But reflexively thinking of web applications in terms of time may not be the worst habit a programmer ever acquires.

Sunday, March 15, 2009

The single most important tool

I'm spending the balance of the evening playing catch-up with the emails that have come from the Derby user and development lists, and it occurs to me that the most important tool in computer programming is, without question, the internet itself.

When I think back to my toying with BASIC on the Apple IIe after school, it occurs to me that I had only a few resources, and those only some of the time:
  1. Asking the instructor(s), if they hadn't left for the day
  2. Making a nuisance of myself among the more serious student-coders
  3. Hoping that the FORTRAN book Mom picked up from UW-EC would at least point me in the direction of a lucky guess
I have to think that, despite the much reduced body of overall knowledge, many a programming career was cut short by sheer frustration. Fast-forward twenty five years, and I still feel like a poser. Except for my search engine querying--that's where I can claim to have mad skilz. ;-)

Saturday, March 14, 2009

Defining "Help"

Specifically, I mean "Help" files--the documentation that tells you how to use a bit of software. (Or at least that was the general idea before considerations of budget came into the picture. Or, worse, the Marketing and/or Legal Departments put on editorial airs.)

Being a recovering Tech. Writer, I have some strong--and, doubtless, heavily biased--views on that subject. Those will be aired in later installments.

But for the time being, I want to ensure that software or other widget-making folks understand that there are two basic flavors of "Help," and they are pretty much mutually exclusive:

  1. Concept-based, which is rather encyclopedic in nature. For instance, concept-based documentation will talk about the basic parts and/or functions of a widget.
  2. Process-based, which is a series of step-by-step instructions for performing specific tasks with the widget.

There are trade-offs between the two approaches, some of which are:

  • Concept-based Help is (or at least can be) comprehensive, but it can try the patience of the person who is just trying to accomplish something with your widget.
  • Process-based Help requires you (and your writers) to have some idea of what those who buy your product actually want to accomplish with it.
  • Chances are good that your engineers/designers/programmers will be more forthcoming with concept-based information than process-based information. However, what goes on under the proverbial hood may be completely extraneous for your customer.
  • To write process-based Help, the writer must be comfortable enough using the widget to invent those scenarios. This takes familiarity with the product, which in turn takes time.

The best of all widgets will have both sets of Help available, with the content weighted toward the process, supplemented by the concepts (e.g. in Appendix form) and hyperlinked within the step-by-step information as needed. That's a significant investment in usability.

But, as much as I would argue for making that investment, I would argue as forcefully for skipping the process entirely if you can't budget for doing it well. Simply put, no information is better than bad information.

Seriously. You, the maker, are the ultimate authority on the product. No excuses. If your Help is unintelligible, incomplete, obsolete, or just plain wrong, you're going to waste more of my time than if I had to Google everything. And if your corner-cutting is tripping me up when I need you most, well, that doesn't exactly inspire confidence in the product itself, now does it?

If, on the other hand, I know before downloading/buying that I'll have to Google my way to anything I need, that's just truth in advertising, IMO. So either fire your tech. writers or give them all the resources they need to make you shine. Otherwise, your "Help" will fall into one of the two categories defined by Shel Silverstein in the poem "Helping" (the one about Agatha Fry):

And some kind of help is the kind of help
That helpin's all about.
And some kind of help is the kind of help
We all can do without.

Friday, March 13, 2009

Just eat the whole d--n donut, already!

I had it from a reliable source--corroborated by Snopes in the spirit of trust-but-verify--that the legend of Van Halen's M & Ms was actually not a matter of rock star vanity, but, rather, a mechanism to keep them and their roadies from injury and possible death.

Now, I'm not at all positive that I will ever have anyone at all working for me, much less enough people to make bringing in a box of donuts anything but overkill. But if somehow that happens, I'm seriously leaning toward inserting a clause in the employment contract to the effect that leaving part of a donut in the box is grounds for termination.

True, it's partly a matter of the fact that this is a huge peeve of mine. And it's equally true that I harbor a dictatorial streak left over from being the de facto babysitter of my younger sister. You'll get no quibble from me on either score. But kindly keep your sprinkles on long enough for the rationale.

Similar to Van Halen's thinking, such a clause serves as a very detectable indication that the employment contract was not read--or at least not taken seriously--by the employee. As a reading-and-comprehension test, it's pretty darned cheap, even when you pick up something considerably better than the grease-sponges at the gas station on your way in to the office. And I think that anyone who's played hiring roulette can agree that it's better for all involved to show your mistakes the door sooner than later.

Please understand that I've lived in the upper Midwest all my life. Here, we tend to say "Please" and "Thank you" and stand in line patiently, try not to be a bother and all that. Trust me, I get it: It's in my DNA, fer' cryin' out loud. But the half-donut thing is not part of that ethos. Bisecting a donut, leaving it to dry out and (if applicable) hemorrhage its filling basically tells your co-workers that they can have your seconds, because their enjoyment of the treat takes a back seat to your delusions of not getting fat.

Now, if you and a co-worker agree to split the donut, wonderful. That's win-win, and I like to encourage that. Or if, after a bite, you decide that it's too greasy or otherwise objectionable, tell me straight-up and pitch it into my trashcan. That's decision-making and honesty, and I like to encourage that as well. But anything else is symptomatic of a self-involved and self-delusional personality, and I won't foist that on the people whose work pays the bills. Either eat the d--ned thing and take the consequences to the Stairmaster or show some restraint and stick with your coffee.

Thursday, March 12, 2009

A day late to the iParty

About a week ago, I dissed the standard iPod earbuds as part of a larger rant on suck in general. Yesterday, Apple announced an even smaller version of the Shuffle that packs 4 GB into about half the size of the previous version. This is made possible by the fact that they offloaded most of the controls to a widget on the headphone cord.

Now, lest I veer off into slamming a product I've yet to touch--and understand that I own two Shuffles and get more mileage out of them than the full-sized version--I want to jump back about a quarter-century in marketing.

You don't have to be female, merely d'un certain âge, to recall the phenomenon of designer jeans in the early '80s. Gloria Vanderbilt was the reigning Queen, with Jordache plotting its coup. I won't pay $40 for a pair of jeans even in 2009 money; expecting my underpaid (single) mother to come up with two-score 1982-ish clams would have been patently ridiculous--even the brat-child edition of me knew that.

At the time, Mom said something that's stayed with me, even as Gloria Vanderbilt gave way to Ocean Pacific, The Gap, Old Navy, and countless other brand-fads: "Why would you pay anyone that much money to advertise for them?"

Now, jump back to 2009 with me, and think about the process of listening to music--or at least blocking out the ambient inanity. Today, as in the 80s, headphones are as reliably black-clad as the audience at a Goth indie band gig. But Apple, in characteristic fashion, made their headphones white. Look at the iPod ads with the human silhouette. What stands out? Yep, the white earphones. Branding, and pretty obvious at that.

Silhouette people, though, aren't really doing anything besides standing there, so they can flash their iPod--in whatever color--for all to see 24/7/365. Unlike the rest of us, who will sooner than later need our hands to do something else. The case for my full-sized model has an optional clip, but I wouldn't trust it to stay clipped. So the iPod will be more securely stashed in my pocket. And so, because I find the headphones so tinny and uncomfortable and destructible and just too darned short, the Bose buds (though still too big for my ears) will be what snake from my backpack pocket to my ears. And so--for all the world knows--I could be using any old MP3 player. Heck, I could be plugged into my antediluvian Walkman knockoff. I just might look old enough for that, you know.

So the takeaway is that I am not effectively paying Apple to advertise their product. (Although, on second thought, anyone who knows how mortally un-cool I actually am may consider that a point in Apple's favor...) But I don't think that this is what their Marketing Department intended. So I hope--if only for the sake of the whole ecosystem of cottage industries that has sprouted up around the iPod family--that they clue in on this iteration. Because Mom's right: Paying someone for the privilege of advertising their product is pretty stupid.

Wednesday, March 11, 2009

Health vs. wealth

I'm plagiarizing this idea (from where I can't recall): We trade our health for money, then spend our money trying to regain our health. Granted, the exchange rate's more strongly slanted toward health in the USA than it is in most countries, so I won't carp that much. But the fact that money and health (rather like money and time) seem to be mutually exclusive still sucks.

Tuesday, March 10, 2009

Economies are ecosystems, too

This is not a warm, fuzzy, let's-see-how-far-we-can-stretch-the-analogy kind of sentiment. I mean it quite literally. All the important elements are there: Producers, consumers, adaptation to environmental pressures, competition, symbiosis, the perils of monoculture and on and on...

What I want to talk about is the ramifications of the analogy. Contrary to the testosterone-infused epic that some people make of "survival of the fittest," Nature isn't always about every critter for itself. Half an hour working with honeybees should put paid to that "always" for good. (Trust me on this, we've kept bees for years.) Similarly, an economy that exists to reward the most ruthless is not capitalism. It's not even worth the dignity of the term economy: It's basically an enforced Ponzi scheme. Or, in biological terms, the economic equivalent of factory farming, in the most exploitative sense of that phrase.

First, let's agree that you won't have to look too hard to find a regulation that's merely proof that some self-important bureaucrat had to justify her/his salary. But the dirty little secret that's swept under the rug is that many, many regulations were more or less bought and paid for by established business interests. Mainly to raise the barrier to entry for the upstarts who might otherwise force them to adapt. Or to limit the choices of the consumers who might demand changes. One need look no further than the Big Three fighting CAFE standards in the interest of higher profit margins on SUVs, as if peak oil and over-extended consumer credit would never come home to roost.

Similarly, folks in business resist adaptation by dint of buying smaller competitors and destroying their value through the process of assimilation. Or by "partnering" solely with companies of equivalent weight and treating smaller vendors as interchangeable commodities. The whole point of being in business is to find--as in actively seek--ways to manage risk to maximize returns. Yet so many so-called defenders of the free market treat their niches as sinecures. (Not unlike claiming to live by the law of the jungle while shooting tigers from elephant-back.)

The way I see it, the second you stop paying attention--and I mean really paying attention--to what's going on in your niche of the ecosystem because you assume that you can control it, you've lost the right to call yourself a free market capitalist.

And, moreover, paying attention also entails a certain amount of active cultivation, even when it doesn't yield dividends in the short term. Some basic illustrations:

  • The contract you toss--with crossed fingers--to the startup of today could be the difference between it going under...or becoming your own customer down the road.
  • The budget that allows an employee to squeak in a college class means that the process of solving your problems has fresh information to work with.
  • The schedule jiggling you have to do to allow key people to rub elbows with their peers at a conference gives you a chance to put your finger on the proverbial pulse.
  • The intern you find time to train could be the employee who hits the ground running straight out of school.
The bottom line is that none of us operate in a vacuum, any more than creatures in nature. And I can't help but think that if more people actively cultivated their niches (rather than drilling, clear-cutting and strip-mining everything from them with a covetous eye on their neighbor's niche all the while), we wouldn't be talking about the Recession of 2009, much less of 2010.

Monday, March 9, 2009

The power of positive squabbling

I think that a project actually stopped spinning its wheels a bit today. Why? Because the right people are getting angry. And showing it. Voices were sharp--just on the border of yelling. Two rather different universes collided, so it's to be expected. And I think that the person who's supposed to be holding it all together might just be getting the memo that it's not enough to break a project into chunks and put the smart people on the hard problems and the lesser talent on the gruntwork. Particularly with the personalities involved.

For me, it was kind of invigorating; for the more passive folks, maybe not so much. But however much you might dislike confrontation, I think we can agree that it's ultimately better to work with people who will disagree with you to your face (even vehemently), rather than those who will sabotage you behind your back.

Sunday, March 8, 2009

Where is your cave?

I’m totally stealing the “cave” meme from Michael Lopp’s book, Managing Humans: Biting and Humorous Tales of a Software Engineering Manager. Actually, he specifically calls it a “nerd cave.” In the proverbial nutshell, it is the place we go to focus on our best work.

In the best of all caves, we have everything we need to do what needs to be done, and anything more is a distraction. The determination of how much of the rest of the world is allowed past the cave entrance is a huge factor in our success. Too much and our final product is obviously seamed; too little and it is uninformed navel-gazing.

I imagine that, at the dawn of humanity, it was much the same. The cave was the place where we dragged our hard-won meat, roasted it in the fire built at the cave mouth for that and for defensive purposes, sharpened our spears, drew on the walls, mated, nurtured our young, and developed useful things like language: “Hey, be careful, there’s a stalactite over—” [Cruuuuunccchh!!!] “Never mind…” (And thus was profanity born…)

You probably already know where your cave is, but if you don’t, look for it. If you can’t find it, make it. Cherish it as the evolutionary refinement it is. And keep an eye out for stalactites. And stalagmites—they hurt just as much when you run into them.

Saturday, March 7, 2009

Mammals or dinosaurs?

I'd love to believe that our current recession is like the scene in "Fantasia" where the dinosaurs turn their heads to see the flash of the meteor that ends their era. I'd like even more to believe that the smaller, nimbler "mammal" businesses will replace "dinosaurs" like AIG and GM. That Citibank's carcass will be covered by the grass that feeds more Kiva- and Grameen-like forms of lending. That tiny start-ups will burst the seams of their towns' industrial parks desperately trying to stay ahead of orders for green vehicles.

At the moment, this is merely a basket of hopes. Deep pockets will win. And ruthless practices will be given a free pass in the name of "the times." But I firmly and fervently believe that one part of the game has changed: Everyone can safely consider him or herself just as much an "expert" as the MBAs who were at the wheel and the media who let them wrap the world economy around the light-pole.

Just as the dot-com bust stopped HTML coders from calling themselves "web programmers," I think it's a safe bet that we'll see a similar thinning in the ranks of investment bankers and financial analysts. Maybe some will even land jobs that (gasp!) actually contribute to the GDP, rather than play games with imaginary money.

Friday, March 6, 2009

Wastin' my minutes, American (Express) style

I had to look up something at UrbanDictionary.com, and ran across the phrase "wastin' my minutes," which is a response to someone saying something stupid, particularly on your airtime. So that seemed pretty apt when I read about Amex's Terms of Service changes, which I probably blew off in the rest of the junk that was stuffed into last month's bill. Following up on that, I caught wind that they are disposing of potentially toxic assets--in the parlance of our recessionary times--by paying cardholders to pay off their debt.

So let me see if I have this straight: AMEX is paying $300 a throw to shoo away customers who can pay their bill in full. At the same time it is risking the loyalty of paying customers with cheesy nickel-and-dime annoyances. (Not to mention the fact that people like me, who have been paying in full every month for twenty years, won't see that $300.) I've seen bridges burned in my time, but rarely from both ends.

But the encouraging news for anyone else in business is that you don't have to be this stupid, either in recruiting or retaining your own customers. One of the joys in being a fractional fraction of the size of American Express is that you have waaaay more control over choosing your relationships. True, sometimes you might have to go with the questionable ones to keep the lights on. But you make that decision with far more data--even "soft" data like the proverbial gut-check counts--than you can when credit scores are all you have to go on.

And more aptly, you have far, far more control over what your customers ultimately do to your brand.

A story to illustrate: I was in college in the '80s, when deregulation brought credit card companies out of the woodwork. Seriously, you heard news stories about cats being mailed pre-approved plastic: I am not making this up. I was working for barely above minimum wage--part time!--and carried more credit than my mother--maybe even my father--would have been eligible for ten years before. It was a joke: I made a game of collecting the stupid things, some of which I never once used.

And my American Express card was no exception. Diners Club aside, that was the "trendy" card, most especially when it came in metal colors. And lo and behold, I was still working for barely minimum wage right after college when AMEX called to ask me if I wanted to upgrade to Gold. I humored their rep. by answering all her questions before asking, "Okay, now this is the part where you ask me how much I make in a year, right?" And then she blew my mind by saying, "No, you're pre-approved."

That, friends and brethren, is the sound of a brand diluting itself. Cachet relies on exclusivity, and American Express flat-out blew it by thundering along with the rest of the plastic-money stampede. Mind you, I had a killer credit rating (for my age) b/c I'd always, ALWAYS paid all my balances in full every single month. But they blew it anyway. A twenty-three year old Midwestern liberal arts grad? Please. That's about as far from the Gold Card image as you can get without going completely ghetto--and even ghetto had more cool factor, even in the day when hip-hop was still called "rap music."

So, to reiterate in the wake of that long-winded anecdote, savor your power of choosing your customers. And rejoice that Wall Street-league stupid is far from mandatory.

Thursday, March 5, 2009

Describing your fortress

I shouldn’t admit this, but the Facebook game “Medieval Empires” is becoming an addiction. It’s a game of economics, rather than pure strategy, and the checks-and-balances are what makes it interesting to me. The perennial scourge of medieval kings was raising enough cash to pay their troops. Sure, they were owed service from their vassals, but like as not they were at war with at least some of them at any given time, so that didn’t work so well.

In this game, money equals a bigger army, which means that you can more successfully attack and defend against your fellow players in “skirmishes.” You can make money three ways:

  1. From your “conquered” cities, which is a steady income stream. You need progressively more powerful armies to conquer more energetically defended cities.
  2. From skirmishing with other players. This has the downside of increasing your experience level, which could push you into a higher class before your army’s power catches up.
  3. From buying more sources of revenue (trade caravans or thieves). These pay for themselves in roughly a month, so there is some delayed gratification.

So it boils down to two basic strategies: patiently allowing passive revenue to build up your attack over time, or actively skirmishing and taking your lumps (because you can, theoretically, lose money when you lose to another player). So far, I’ve been trying to skirmish my way to power, although I’m a rather small fish for my level. And that takes active participation over a not-inconsequential period of time, because you’re limited to ten skirmishes per ten minutes.
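
Just to make the trade-off concrete, here is a rough back-of-the-envelope comparison of the two strategies. Every figure in it is an assumption I made up for illustration, so treat it as a sketch of the shape of the decision rather than the real math.

// A toy comparison of the two strategies. All of these numbers are invented
// for illustration only; the real game's payouts are different.
public class EmpireStrategies {
    public static void main(String[] args) {
        int days = 30;

        // Strategy 1: buy a revenue source (say, a trade caravan) that pays for
        // itself in roughly a month, then let the passive income pile up.
        long caravanCost = 10000;        // assumed purchase price
        long caravanIncomePerDay = 350;  // assumed payout: about a month to break even
        long passive = caravanIncomePerDay * days - caravanCost;

        // Strategy 2: skirmish actively. Wins pay out, losses cost money, and
        // the ten-skirmishes-per-ten-minutes cap limits how many fit in a day.
        int skirmishesPerDay = 60;       // assumed
        double winRate = 0.6;            // assumed: I am, after all, a rather small fish
        long winPayout = 40;             // assumed
        long lossCost = 25;              // assumed
        long wins = Math.round(skirmishesPerDay * winRate);
        long losses = skirmishesPerDay - wins;
        long skirmish = (wins * winPayout - losses * lossCost) * days;

        System.out.println("Passive strategy after " + days + " days: " + passive);
        System.out.println("Skirmish strategy after " + days + " days: " + skirmish);
    }
}

Nudge the win rate down a notch or two and the skirmishing column stops looking so clever, which is more or less the gamble I am taking.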

The central problem is also another medieval one: Allies. I have one, and I don’t trust him as far as I could throw a war elephant. So I need to put a lot of distance between him and me. Fortunately, he’s been something of a slacker since I met him twenty-some years ago. ;-)

So the takeaway is that I'm squandering time I could use for more tangible empire-building, all because, even in the make-believe arena of "Medieval Empires," I still can't stand the thought of losing when just a little more diligence could give me that breathing room for another day, another week...maybe even forever if my "ally" loses interest and moves on to another game.

But it just hammers home that people don’t really work for money. Seriously. Money is just a means to an end. Even the most stone-hearted, backstabbing, miserly bastard in the world is ultimately looking for security. For him/her, money keeps the wolf from the door. And I believe that’s true even for good-hearted people. Most folks probably wouldn’t mind having oodles of cool stuff to enjoy inside that security, and more than likely someone to share it with. But the walls of our fortresses are ultimately built of cash. Walls mortared with the sweat of work and worry—no question. But the bricks themselves are pure cash.

So it seems to me that if that much work, sleeplessness, and time are involved, it’s important to describe your fortress to yourself before you start building it. And you may need to describe it to the people who are important to you, if it means that you will spend less time with them as part of that process.

For the record: My fortress is modest by business standards. It largely revolves around freeing me and mine from having to fret about money. It involves building a house off the grid on my own terms. And as a means to that end—partly an end in itself—I want to create a business that I want to work for—without having to turn into someone I don’t like in the process. Maybe I’ll want to add a porch or a bigger garage onto that fortress after I start feeling a little more comfortable inside those walls. I’ll worry about that when I’m at that point.

But, if you’ll excuse me, I have about a half-million dollars that needs to be converted into armed militia so that I can go conquer Kiev thirty thousand attack points from now…

Social media hits home

This isn’t navel-gazing, honestly. What happened was that I biffed the Tumblr-Twitter settings and yesterday’s oeuvre cross-posted to Twitter. Next thing I know, I have a new Twitter “follower” who’s into wine-making as well. He’s written a book about wine-making, specifically non-grape wine-making, which I assume was the reason I was pinged. I did give him fair warning that I probably won’t be tweeting about wine that often. Hopefully he’ll have the great good sense to un-follow me, because it’ll be a waste of his eyeballs otherwise.

But the whole episode really stands as a microcosm of how so-called social media—I say “so-called” because I think that it encourages a more superficial form of socialization—is furthering Web 1.0’s job of “balkanizing” those with access to it.

Mind, I don’t consider “balkanization” a bad thing, not by any means. (Except, of course, it didn’t work so well in the Balkans…) Mostly because I’m (at best) a reluctant leader—“reluctant” because I think we’d live in a much better world if people stopped delegating their thinking and judgement-making, if only by half. For all that, I understand that Mr. Rivard is building his “tribe”—to use the micro-marketing lexicon of Seth Godin and Gary Vaynerchuk and a raft of folks I’ve yet to encounter. Fruit-based winemaking is an extremely focused niche, and big ol’ props to the guy if he can make a living within it.

And, who knows, I may buy the book. We make enough fruit wine—even the odd vegetable wine—that having an extra resource certainly won’t hurt. And, more to the point (of this post anyway), the social contact was made respectfully. By being “followed” I know the terms—namely that I am free to ignore the attention. And there is no attempt made to trick me into engaging where I do not want to engage. I’ll take that over spam any day.

An unexpected parallel

You could fairly say that my husband and I are “into” winemaking: Fifteen gallons are fermenting in the living room as I write, with six or seven more (five-gallon) kits cooling their heels in the den. About two or three months ago, I decided to get a little more serious about the actual discipline of winemaking.

Mind you, the guy who runs the homebrew shop is completely awesome—I’d never think to question his word. But it was time to start understanding why I like what I like. And more to the point, to stop wasting money—even ten bucks at a time—playing whack-a-mole at the supermarket liquor department.

Winemaking and what I do for a living (nominally, programming computers) are similar in that there are a lot of posers out there. And a lot of hooey that passes for “conventional wisdom,” largely propagated by the know-it-alls. (We all know how that story ends…) Not to mention the annoying fads. (Stupid “Sideways”!)

So the only antidote is immersion in the subject matter. But here’s the deal: You don’t have to know, much less understand, everything while in that immersive state. I (maybe naively) registered for the mailing list of Derby, an open source Java database (brought into this world by IBM, before being fostered by the Apache Foundation). Thirty-one messages hit my inbox less than twenty-four hours later, and I made the commitment to myself to read them all before bedtime.

But that’s okay. See, I know I’m a poser when it comes to that particular database system. That being said, I need to know where the proverbial bodies are buried: I have two upcoming projects riding on that understanding. And there’s no other way to achieve that except to start now. There is enough time that I don’t need to scurry off to Wikipedia the first time I can’t zen something purely from context. (The third or fourth time may be another story…)
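
For what it’s worth, the hands-on part does not have to wait for those projects, either. Here is a minimal sketch of poking at an embedded Derby database; the database name and the little table in it are made up for illustration, and you would need derby.jar on the classpath.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class DerbyHello {
    public static void main(String[] args) throws Exception {
        // Load the embedded driver; Derby creates the database files in the
        // current working directory if they do not already exist.
        Class.forName("org.apache.derby.jdbc.EmbeddedDriver");
        Connection conn = DriverManager.getConnection("jdbc:derby:scratchdb;create=true");

        Statement stmt = conn.createStatement();
        stmt.executeUpdate("CREATE TABLE notes (id INT PRIMARY KEY, body VARCHAR(200))");
        stmt.executeUpdate("INSERT INTO notes VALUES (1, 'hello from the poser')");

        ResultSet rs = stmt.executeQuery("SELECT body FROM notes");
        while (rs.next()) {
            System.out.println(rs.getString("body"));
        }

        rs.close();
        stmt.close();
        conn.close();
    }
}

Nothing fancy, but it beats reading thirty-one mailing list messages cold.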

And wouldn’t you know it…four more messages just arrived. Time to skedaddle, folks.

Sick of suck

Are you sick of suck yet?

Straight-up suck like McDonald’s Worldwide failing to lift a finger for one of its own franchisees (and his employee). Half-baked suck like paying four hundred clams for an iPod Touch and being insulted by its lame, tinny, white headphones. Free-but-time-wasting suck like Firefox on Ubuntu not getting Flash or Java right straight out of the box.

Sucky suckity-sucking suckitude. By people who have convinced themselves—and maybe even others—that they don’t suck. Or, worse, that their suck is actually cool.

If you’re sick enough of suck to be mad, congratulations. That’s half the battle. Cherish that anger. Nurture it. Own it. There’s more suckage on the way. Now that Joe Everybody’s convinced himself that the economy’s circling the drain, he’s formally absolved everyone—including himself—of sucking.

So brace yourself for the coming Suckapalooza. Sure, you don’t want to get any suck on you, but you will. It’s inevitable. No one says you have to like it or bend over and take it. But however you choose to handle it, do not fight suck with more suck. Never, EVER give yourself even the slightest excuse to suck.

By all means, smile indulgently upon your great want of perfection and find your humility therein. Forgive yourself your failures of execution and missed opportunity and take your instruction from them. Make no apologies for modest goals and incremental improvements.

Just do not suck.

The upside of a downturn

Ignore the fact that his shirt’s half tucked in and half not. If you don’t work in technology, much less web application development, ignore that too. Ditto the effenheimers that make this keynote Not Safe for Work. This is the most electrifying thirteen minutes and change I’ve seen in quite a while. (No offense, Mr. President: I thought that the Inauguration speech was pretty darned spiffy, too.)

But if, by the end, you don’t want the recession to end, don’t say you weren’t warned.

I would—with the most humble presumption—add one thing to the overall idea that recessions are when you get ahead of the competition that’s either cowering in its bunker or has fled the field altogether.

Anyone—consumer, manager, entrepreneur—who’s been nicked by a nasty (think 1982) recession knows that it enforces the fiscal discipline and focus that s/he should have had during the comfortable years. Bad recessions make that socially acceptable, too.

But that’s a problem in itself.

Don’t get me wrong: I’m not saying that focus and fiscal discipline are bad. I’m saying that there are good and bad ways of regaining those things if you’ve lost them. The worst-case scenario is no different from crash-dieting to shoehorn yourself into a swimsuit by June. The flab comes back in spades in October, doesn’t it? Why? Because you crashed the muscle, not the fat. Cutting fat takes work and diligence and, perhaps most importantly, time. Things that it’s all too easy to convince yourself that you can’t afford right now.

If you can still pay the bills in this economy, congratulations: You’ve (almost) won the lottery. Now the question is, how much of a head-start can you get on the posers who will slink back into your turf when the pickings get a little better? Have you made their work any easier? In other words:

  1. How many customers did you piss off with your corner-cutting?
  2. How many of your key people are disgruntled from your draconian mandates?
  3. How many vendors did you alienate with price squeezes and late payments?
  4. How much has word of all that gone around?
  5. Most importantly, how have the rules of the game changed?

When—not if, when—you have to retool for the next round, you won’t be able to do it alone. Customers, employees, and vendors may all need to be brought up to speed if they are expecting the clock to just turn back to the fat times. You won’t be able to guide them across that bridge if they’ve closed their ears to you.

And that, to me, is where the rubber really hits the road for Gary Vaynerchuk’s keynote. It’s not enough to understand that (or even how) your game is changing. It’s how you help the people you depend on to understand. And this is where the messenger becomes much more important than the message. You, the instigator of change, have a choice to make. You can be the martinet force-marching your troops over the mountain range. Or you can be the prophet pointing out the Promised Land on the other side. I think it’s a solid bet which group will make the journey faster…and more intact.

So break out that staff and cloak now. (The snowy Charlton Heston beard is optional.)