Tuesday, November 16, 2010

A thought on generosity

At the state convention of the Wisconsin Honey Producers Association, Dennis had helped someone who was having issues getting their presentation to work on a loaner laptop. Despite the last-second nature of the "emergency" (and the typically sucky hotel wireless connection), the presenter was able to give her spiel.

To Dennis, it was just a matter of downloading and installing the correct software. Something both he and I do without blinking. (Booyah for open source!) To the presenter--and the folks who had come to cheer her on--he was the hero of the hour.

Sadly, I think we tend to underestimate what we have to give to others (and so sometimes don't bother). But it's rather like how "trade" in Economics 101 is supposed to work: Each side thinks that what the other has is, pound for pound, more valuable than what it is offering in return. If we accept that mutual imbalance in perception as truth, then when gratitude is the coin of the other side, it cannot devalue what we have to offer.

Be generous where you can--it will probably be worth more than you will know, at least at the time.

Monday, November 15, 2010

Process and communication

This morning's edition of the weekly dev. meeting had a little more animated back-and-forth than usual. Mainly it was about how to flag a line-item in the issue-tracking system to indicate that it was supposed to be pushed into production during off-hours.

That sort of software promotion is not at all the norm, though, so I couldn't help but think (at the time) that we were collectively over-thinking the whole thing. Then a little red flag went up in my brain and I thought: "Uh-oh: We're trying to use process to compensate for lack of communication, aren't we? Gack. Here we go again..."

Fortunately, it didn't turn out that way. But as I headed back to my cube a bit later, I realized that, to an extent, process is communication. In the sense of signalling what has been done, what there is still to do, and, normally, who is to do it. The trick, of course, is to understand the limits of process' vocabulary and grammar. And to always be wary of conflating the two.

Sunday, November 14, 2010

The cloud with a grey lining?

Maybe it's just me, but basic LAMP/WAMP stack web hosting has started to feel confining and kludgey lately. It's already a commodity, and as cloud computing gains traction, I wouldn't be surprised to see a two-caste system develop: Something like the current status quo (for non-profits, Mom-and-Pop storefronts, etc.) and scalable cloud resources (for the better-heeled clientele).

And I make that prediction because I think that the selling-point of cloud computing--meaning, that you only pay for the resources used--is actually a liability in the world of small budgets. Better to pay a monthly/yearly/bi-annual/whatever fee than be handed a sizeable tab if something goes viral. Budgets, after all, are All About the polite fiction that expenses are predictable.

Granted, some hosting providers ding their customers for exceeding bandwidth and/or disk space limits, but I think we can reliably expect commoditization to minimize that. Because, really, the only value-add to a commodity is to offer more of the same without raising prices--in other words, to become the digital version of Old Country Buffet.

And, yes, I understand that folks on shoestring budgets will tend to be the late adopters of cloud computing. But that doesn't mean I think that the purveyors of cloud hosting are shortening that curve with the a-la-carte, parking-meter pitch.

Saturday, November 13, 2010

The "Green Acres" paradox

Based on the recommendations of a couple of A-list types, I snagged a copy of Lisa Gansky's The Mesh and am about halfway through it. I'm already sick of her raving about ZipCar, but expect to polish off the book anyway, despite the fact that it doesn't apply to software development except to emphasize Web 2.0 and mobile apps as one of the underpinnings of mesh business development.

(Aside: It's a little weird reading about BP's oil spill in an actual book--particularly given how I'd just finished Crossing the Chasm, which--even in my "updated" edition--mentioned the Twin Towers as if they still stood.)

Trust me when I say that I'd like to believe that all Gansky's premises are true. Particularly the one about how the Great Recession has adjusted people's value systems to prefer part-ownership (or rental) of well-made things to full ownership of their shoddier, inhumanely- and unsustainably-made counterparts. But--with all due disrespect to the ethical fecklessness of the average American consumer--my cats have longer-term memories. Seriously.

That being said, one point in favor of Gansky's arguments is the increasing density of urban areas. When done right, it's certainly a "greener" way of living than that offered by the 'burbs. But that brings us to what I hereby dub "The 'Green Acres'* Paradox," by which I mean that "the dream" is still to carve out a slice of real estate that we can call "ours." Where we can remain blissfully ignorant of our neighbors' dreckish taste in music, their domestic woes, or even their pet ownership status. As my prof. for American History I and II in college put it, "There is no freedom like the freedom from the vices of one's neighbors." (Preach it, brother!)

Yet, somehow, just down the road (not even half-a-holler, y'all), there's supposed to be a full grocery store.

And a gas station.

And a WalMart.

And a Starbucks.

And a Gap.

And four bars on the 4G phone.

And this amazing little hole-in-the-wall Thai/Halal/Tapas/Tepanyaki/Dim Sum place.

And...you get the idea.

And what I take away from this is that the companies that best reconcile the paradox will win as big as they want to win. Not necessarily for the "right" reasons of sustainability and distaste for materialism. But because, as a culture, we're past masters at saying one thing and doing another.

P.S.: If anyone out there invents a technology to neutralize the histrionics of a spectacularly undisciplined beagle, call me.

- - - - -
* Excerpt from the "Green Acres" Theme Song

Oliver Douglas: Green Acres is the place to be:
Farm living is the life for me!
Land spreading out,
so far and wide.
Keep Manhattan,
just give me that countryside.

Lisa Douglas: New York
is where I'd rather stay:
I get allergic smelling hay.
I just adore a penthouse view.
Darling, I love you,
but give me Park Avenue.

Friday, November 12, 2010

Frivolous Friday, 11.12.2010: Curricula

I've been toying with the idea of taking actual instruction in ASP.NET that didn't involve Rolla U. No offense whatsoever intended to the celebrated "4 Guys"--if anyplace has a claim to be my alma mater of classic ASP, it's their website, and major props to them for everything they've given me over the years. WTC is offering a web application programming class that's a blend of VisualBasic.NET and ASP.NET; unfortunately, the time-slot turned out to be the deal-breaker.

Yet, as I perused the required courses for the program--pleased to find a de rigueur Ethics class--I was reminded of how many critical parts of a programmer's education must be left out of a two-year--or, indeed, even four-year--degree. In a perfect world--which we all know means one smart enough to put me in charge--a programming degree would not be complete without the following courses:

Abnormal Psychology in the Workplace - Students will learn to recognize and neutralize co-worker and managerial pathologies such as gatekeeping, passive-aggression, prima-donna and/or drama-queen hissy-fits, stonewalling, brown-nosing, empire-building, back-stabbing and toadying. (Note: The graduate level version of this course will emphasize pricking holes in the Management Reality Distortion Field.)

Sandbagging - Students will master the fundamentals of expectations management in the first half of the course, then progress to padding schedules and budgets against the inevitable but nevertheless unpredictable vagaries the real world will visit upon both.

Marketing Language I and II - Students will study the language of Marketing, currently believed to be a Managementese patois. By the end of the first semester, students will be expected to detect which words and phrases are harmless vs. those guaranteed to completely hose schedules and feature lists. By the end of the second semester, students will be able to communicate simple ideas with real Marketing personnel.

Beginning Cat-herding - Students will approach basic project management skills by learning to cultivate organizational buy-in...or at least temporarily neutralize apathy, managerial and budgetary neglect, sabotage, and open cynicism while simultaneously co-opting the naifish energy of idea hamsters and those with ADOS.

The Care and Feeding of QA - Students will familiarize themselves with the unique psychology of Quality Assurance. Labwork will emphasize gauging the correct balance of pre-emptive unit-testing, trench-camaraderie, favor-trading and outright bribery necessary for maximizing the number of significant bugs reported while minimizing the ones that can be kicked into the next software patch cycle.

Meeting Dynamics I, II, III, and IV - Over four semesters of lecture and lab work, students will build proficiency in detecting hidden agendas, thwarting hijackers, shutting down grandstanding and public spankings, enforcing accountability, wringing decisions from stake-holders, and generally minimizing the amount of CO2 sucked into conference room HVAC intake grills. (Note: Course curriculum, including tests, to be personally developed by Professor Emeritus @rands.)

Thursday, November 11, 2010

A geek's perspective on Veterans Day

Regardless of whether or not one is a pacifist, today is one properly dedicated to counting the cost of war and remembering those who paid in the dearest coinages: Life, limb, friendship, health, freedom, innocence, hunger, cold, homesickness, hopes for the future. A simple "thank you" seems laughable...until one considers how insulting it would be to say nothing at all.

And, in that general--no quasi-military pun intended--spirit, it would be worth recognizing the kick-start the military gave to computers and my own profession. Incomprehensible as it may seem, computers were not, in fact, developed with Farmville in mind. (I'm just as shocked as you.)

No, it seems that even in WWI, the notion of offloading ballistics computations to non-humans was considered worth pursuing. Why? Well, when artillery like the Big Bertha had a range of fifteen clicks (a.k.a. 9 miles and change)--meaning its operators couldn't necessarily see their targets--calculations mattered. Even at closer range, you had three options for hitting people who wanted to kill you:

  1. Dumb luck. (Not recommended.)
  2. Experience. (Good luck surviving long enough to get it.)
  3. Working through calculations that took into account factors such as:
  • Mass of the ordnance being fired
  • Force of the charge behind it
  • Recoil (a.k.a. our old friend Newton's Third Law)
  • Wind/Air resistance
  • The Quadratic Formula (Remember that from Junior High? Turns out, it had something to do with The Real World after all. Whoodathunkit?)
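The quadratic-formula step above can be sketched as a toy calculation. This is a minimal, assumption-laden sketch of my own, not any period fire-table method: it keeps only gravity (ignoring the drag, wind, recoil and charge factors the list mentions), and the function names are made up for illustration.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def time_of_flight(v, elev_deg, height=0.0):
    # Vertical motion: -0.5*G*t^2 + v*sin(theta)*t + height = 0.
    # Quadratic formula with a = -G/2, b = v*sin(theta), c = height;
    # keep the positive root (the one where the shell comes down).
    b = v * math.sin(math.radians(elev_deg))
    disc = b * b + 2.0 * G * height  # b^2 - 4ac
    return (b + math.sqrt(disc)) / G

def horizontal_range(v, elev_deg, height=0.0):
    # Horizontal speed is constant (no drag), so range = v_x * flight time.
    return v * math.cos(math.radians(elev_deg)) * time_of_flight(v, elev_deg, height)

# A 100 m/s shell fired at 45 degrees over level ground lands at
# v^2/G meters downrange -- roughly a kilometer.
print(round(horizontal_range(100.0, 45.0), 1))  # -> 1019.4
```

Even this drag-free version takes a few lines; add the real-world factors and it's easy to see why armies wanted the arithmetic offloaded to machines.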

Oh, and did I mention that, while cranking through all that Algebra, your target could be on the move and--by the bye--you might be under fire yourself? Yeah. Kinda makes it tough to remember to carry that two, dun'it?

But in computer history--just like on The History Channel--it's WWII that gets all the glory. Enter ENIAC and its successors. That was for the Army (where its services were, most sensationally, conscripted for the Manhattan Project). Not to be outdone--in computing as in football--the Navy partnered with Harvard University and IBM to create the Mark I, for much the same purposes. And, of course, there's Bletchley Park's Colossus, ignominiously burned in 1960. Because as much as weapons win battles, intelligence wins wars.

The Korean War didn't last long enough for IBM's 701 model to see much--if any--service, which can also be said for Big Blue's one-off NORC. However, by that time, businesses (and non-military government agencies), flush in the post-war boom of the 1950s, were already slavering for the breathtaking computing speeds such miracle-machines could give them.

As the final kick, let's not forget that the underpinnings of the internet itself originated with DARPA, intended to spread the risk of all-out attack by decentralizing the network.

And the rest, I'd say, is history. Save that Clio might just be the least glamorous of the Muses--valued only when she titillates...or provides those who attend to her with the smugness of precedent. So I would ask my gentle reader--as you receive a text or call, catch up on your peeps' Facebook statuses or tweets, become the Mayor of wherever, etc.--remember how that gadget at your fingertips got there. Thanks.

Wednesday, November 10, 2010

Context is King

I'm afraid I was a mite sharp with the new-ish QA person today because he--from my perspective--was just not getting the fact that records added to a certain database table are not created equal. Some are added mainly for tracking & possible troubleshooting; others do have an impact on the business logic that uses the data.

When I asked why he wasn't paying attention to the True/False field that flagged the record as either in-play or padding the roster, he seemed to backpedal a little bit. "Here," I insisted impatiently, popping open a fresh window in the database interface and punching in a quick query. "You need to pay attention to whether this field's value is True or False."
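For the curious, the kind of check at issue looks something like the sketch below. The table and column names are hypothetical (the post doesn't name the real schema); the point is just that a boolean flag column decides which rows the business logic actually sees.

```python
import sqlite3

# Stand-in for the real table: some rows drive business logic,
# others exist purely for tracking and troubleshooting.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE audit_records (id INTEGER, detail TEXT, is_live INTEGER)")
conn.executemany(
    "INSERT INTO audit_records VALUES (?, ?, ?)",
    [
        (1, "real transaction", 1),
        (2, "tracking only", 0),
        (3, "real transaction", 1),
    ],
)

# Only rows flagged is_live = True are "in play"; ignore the padding.
live = conn.execute("SELECT id FROM audit_records WHERE is_live = 1").fetchall()
print([row[0] for row in live])  # -> [1, 3]
```

Miss that WHERE clause (or, in his case, the column scrolled off the right edge of a laptop screen) and every padding row looks like a bug.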

As it turns out, he merely couldn't see those columns on his screen. Context: Some folks in our office have a desktop with two monitors. Others have a laptop with a second monitor. Not only is he one of the latter, he also was viewing the database interface on the much smaller laptop screen. The upshot was that it was effectively off the radar for him, and he apparently missed the horizontal scrollbar.

Theoretically, we could have squabbled over whether the feature he was testing is, in fact, "broken" by lobbing notes at each other in the issue-tracking software. But he happened to stop by my desk on the way out to tell me that my fixes hadn't fixed anything, and it was only then that we sorted out the mistake. (At least I hope it's a mistake, 'cuz I'm sick of looking at that code.) Alternatively, if I had gone over to his desk to tell him that he was obviously smoking crack, we would likewise have saved ourselves the time--and at least some stress.

Fortunately, it's not every day that one of us has to be standing over the other's shoulder to be on the same page. Hopefully, it'll become even less frequent as our software and way of doing things warps his mind into the proper extra dimensions. But for the life of me, I just cannot grok the value-add of off-site (mostly meaning offshore) testing of software features. Much less whole applications. Now, I can possibly understand outsourcing large batches of automated testing. But anything that involves a more than superficial understanding of the once-and-future product? Granted, I don't work in a large shop, but the necessarily longer feedback loop of outsourced testing strikes me as a serious liability as product cycles tighten.

Certainly, enough companies still consider it a viable option. After all, to the bean-counters, there are no ledger entries for time zone shifts, language barriers, reduced job satisfaction, unpaid overtime, employee turnover and the like. And conventional business wisdom has a simple fix for such frictions as ultimately find their way to the bottom line: Find another offshore firm with an even lower bid.

I hope that most owners and managers are smarter than conventional wisdom, anyway. Because there's only one thing stupider, and that is to not have any testers at all.