Friday, April 30, 2010

Frivolous Friday, 04.30.2010: The pancake workflow

Kindly bear with me, as my copy of Silverstein's Where the Sidewalk Ends went missing some years ago, and I'm quoting from memory:

Who wants a pancake,
Sweet and piping hot?
Good little Grace looks up and says,
"I'll take the one on the top."
Who else wants a pancake
Fresh from the griddle?
Terrible Theresa smiles and says,
"I'll take the one in the middle."

Granted, I often wonder whether my neural wiring isn't, in fact, the love child of OCD and ADD. But I doubt I'm alone in my Grace-like predilection for taking the topmost pancake whenever it's thrown on my stack. Even when I was already in the process of shifting the previous pancake onto someone else's plate. Hopefully improved with a pat of butter and/or pure Wisconsin Grade A maple syrup, of course. (Just don't expect bacon on the side: I might just be the only non-kosher, non-vegetarian person on the planet who doesn't believe that "Bacon makes everything better.")

So if your workdays too often seem like an exercise in working your way down to the bottom pancake, at least you have company. Maybe not the kind of company you'd want your family to meet on holidays (because she'll say odd things at the dinner table). But company nevertheless.

Thursday, April 29, 2010

Management FAIL, clear as glass

The "landlord" at work sent two contractors up to replace the cracked window above my podmate's desk. As Dennis and I are currently window- and siding-shopping, I asked one of the gentlemen whether they also did residential work, and when he replied that they did, requested a business card. "Do you have any business cards?" he asked his workmate. I didn't catch the entire reply, but it was definitely negative--both in fact and in attitude. Something along the lines of "D'ya actually think they'd give cards to me?"

And that was the last of it. The pair carted the original window and their tool-sets out of the pod, and I still have no idea which company's name is on their paychecks. Unless the quote we've seen so far is at the tip-top of the price range, we're not talking chump change, even for our little house. Yet two employees didn't think it important enough to scribble contact information on a piece of paper.

True, an anecdote is merely an anecdote, a single data point. But given the number of contractors in the area, the statistical likelihood remains that a local business lost a potential sale. All because someone doesn't understand that, regardless of job title, all employees are salespeople. (It's an idea I'm borrowing from Harvey Mackay, by the bye, but just as true now as it was when we first dipped our flippers into the shark tank.)

Wednesday, April 28, 2010

A 1980s flashback from a 21st century language

I know a couple of folks who've programmed in the Ruby on Rails language/framework, and they both seemed pleased enough with the tradeoffs between functionality and personality quirks. That being said, I couldn't squelch an evil chuckle at RoR creator Hansson's reaction to a survey that basically called his baby "ugly."
Hansson tells InfoWorld that the kind of developers that are Ruby on Rails' core constituency wouldn't even participate in such a survey, being far more likely to trash e-mails from organizations like Evans as spam. Moreover, the kind of people who use Rails certainly wouldn't stump up $595 to read a report on which framework to use.
All I could think of was Spinal Tap's manager rationalizing his band's threadbare popularity and D-List venues: "...I'd say that their appeal is becoming more selective."

Now, it's entirely possible that Hansson knows his posse that well. All the same, that's not what I'd consider the most politic response, nor the most likely to garner wider corporate patronage. If that's the ethos of the community, that's their choice. But the idea of a framework inventor who seems to think of his end-users as created in his own image? That just doesn't endear him--or his handiwork--to me.

Tuesday, April 27, 2010

Leftovers from dinner

Kudos to Dennis for finding Die Fälscher (The Counterfeiters). Just when you think World War II's potential for drama (and morbid irony) has been mined to exhaustion, there comes this. Just trust me and put it in your Netflix queue or borrow it, or what-have-you. I can't guarantee that you will (or won't) look at money in the same way. I merely know that I'll be carrying the "leftovers" to work with me for a while to come.

Monday, April 26, 2010

Debugging made easier by giving up

Not giving up on debugging itself--oh, no, no, no... (Don't rule out strolling away from the problem for five minutes, though--that can often do wonders.)

I mean giving up your pet nail-problem-hammer-solution theory. And giving up the reflexive instinct to blame your code...or their code--whichever suits your mentality. And giving up--whenever you have the luxury--the notion that you have to fight alone. And giving up the habit of reflexively trying every "solution" The Google sends your way.

(File permissions, on the other hand, are fair game for blame, as was demonstrated for me yet again tonight.)

The so-called "definition" of madness is doing the same thing over and over while expecting different results. In debugging terms, it's also the short road to madness itself. Don't get on that bus.

The irony is that, in code, there can often be one, maybe two, "optimal" ways of doing something; in debugging, it's all out on the table--maybe even some crazy stuff. That's one of the harder lessons to learn as a software developer, toggling between those two modes. But unless you prefer gremlin-chasing to doing useful stuff with code, it's the only way to roll.

Sunday, April 25, 2010

An unwelcome way of "finding myself"

Yesterday I decided to activate a credit card online rather than waste time dodging attempts to be "up-sold" via phone. Meep! I haven't been this creeped out since plugging my name into ZabaSearch five years ago.

The first two screens had me input the card number, security code, expiration date and standardized information for verification. In other words, the usual--no big deal there. But then I hit two rounds of multiple-choice questions that asked me to select the correct answer for items such as "What county did you reside in in 1994?" or "Which of these phone numbers did you previously have?" or "Which of these schools did you attend?" In all but two cases, at least one of the answers was bang-on...and the jury's out on one of those simply because I don't recall phone numbers from a decade ago.

On one hand, my parents picked a name weird enough to bring my odds of triggering a false positive on some post-9/11 watch list to statistical zero. But on the other, I can't help feeling that there's a little too much transparency in my personal history just now. If Big Brother is a real person and actually watching, I'm betting he's a corporate suit, not a G-Man.

Saturday, April 24, 2010

"Consultation" consolation prize

Even after a solid night's sleep, I'm still not thrilled about The Sales Call That Ate Friday Night. (Memo to La Crosse area contractors: You have an hour to measure the house and crank out a quote or I send you an invoice for my time.) Despite the fact that three and a half hours was too high a price to pay, it would be a more grievous waste if I hoarded the reminder that I did glean from it.

That reminder is that two basic components apply to the work of the contractor, regardless of whether the work is tangible (e.g. siding on a house) or digital (e.g. a website). The first component is competence in craftsmanship; the second is competence in process.

In the case of siding, the contractor is the one who's supposed to remember to check for any permits needed before the work begins, and not to kill the grass by forgetting to have the rollout dumpster hauled out of the front yard at the end of the job. The website developer is the one who's supposed to save you from your instinct to use the bargain-basement domain registrar and/or hosting company. Ditto making sure that your accountant has a way to drop more quarters into the internet machine so your online shopping cart (or what-have-you) doesn't stop working a year after registration.

Never forget that you have the right to expect both competencies, regardless of whether the deliverable is something you can touch, or something you see on a monitor.

Friday, April 23, 2010

Frivolous Friday, 04.23.2010: Code Fu

Just a random act of blogging tonight (following a 3.5-hour sales call from a siding/windows contractor--I wish I were making that up). But this thought cheered me up on the ride home tonight, after a bad half-hour capping a "Meh" week: Wouldn't it be awesome if programming languages were considered in the same light as Kung Fu styles?

(Aside: If you're in the mood for escapist--in some senses innocent--mayhem, I recommend Stephen Chow's Kung Fu Hustle. I mean, when Roger Ebert describes it as "like Jackie Chan and Buster Keaton meet Quentin Tarantino and Bugs Bunny," how can you possibly say "No"? When I'm talking programming Kung Fu styles, that's what I'm talking about. Not anything, like, actually real or serious. C'mon...this is Frivolous Friday an' all...)

Rather, I'm thinking that the disciplined, type-safe, syntax-strict languages would be more formalized; the variant-based, who-cares-about-capitalization languages not so much. So you kind of have an "old school" vs. "new school" tension baked right in to that. That's good for starters.

The problem with computer languages, though, is that they generally don't have tangible names. Yeah, Java has Duke...or maybe you could go the (well-trod) caffeinated route with a coffee bean meme. Even so, "Duke style Kung Fu"? (Immediate John Wayne connotations: Nix that!) "Bean style Kung Fu"? Eh...makes me think of either Rowan Atkinson or tofu--take your pick. (And don't even get me started on Apple's "Cocoa": Oh, please.)

At a minimum, we'd need to re-name languages to animal or plant equivalents (No worries, Python: You're grandmothered in, 'k? The jury's still out on Perl.). Yeah, definitely more drama there. Because--let's just face it--the "'Snake Strikes from its Tail' maneuver in Piranha style Kung Fu" sounds waaaaay cooler than "substr() function with a negative starting position in PHP." I'm sorry: You just can't argue with that. Even when haphazardly mixing-n-matching critters, yo.
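
For the non-PHP-fluent, here's roughly what that "maneuver" boils down to--a trivial sketch, nothing authoritative, just enough to show why the snake deserves the cooler name:

  <?php
  // "Snake Strikes from its Tail": a negative start position tells substr()
  // to count backward from the end of the string.
  $prey = 'piranha';
  echo substr($prey, -4);   // prints "anha" -- the last four characters
  echo substr($prey, 0, 3); // prints "pir" -- old-school, counting from the front
  ?>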

To be honest, I haven't thought all that much beyond that. But I do know that it makes Monday--with its bug reports, friction, and all the stuff you never thought you'd be paid to do as a coder--that much easier to face with a lighter heart. And so I pass it on.

Thursday, April 22, 2010

A tale of tribes

The tribe gathers in the cave that gives both light and warmth...and relative safety from the rival tribe somewhere outside. And perhaps, in that breathing-space, even gives a glimpse of hope. For it is time to pool the knowledge of the scouts, to decide what to do next. But the prospects are not especially rosy: The quarry they seek is unpredictable. They are still learning its ways.

As they chatter and squabble, they will cover the cave's walls with depictions of their quarry. The colors are bold and bright, with a dynamism to the lines that commands a respect beyond their child-like lack of skill. Passion and creative expression, so tightly fused, always do--regardless of individual taste in art.

One of the tribe has acquired a new tool and shows it to the others. They know that it will help the tribe, though they cannot hide their jealousy. But the tool-bringer is neither selfish nor particularly boastful, so their envy is not bitter. And they know that time wasted on envy will allow the quarry to elude them still--perhaps even benefit the other tribe. That cannot be allowed to happen; the tribe must prevail.

- - - - - -

Two teams of programmers are (independently) charged with creating an iPad application. None of them has worked with the platform. The team ("tribe") in question meets in the sunny out-of-the-way conference room whose walls are dry-erase whiteboards. Since the last meeting, one team mate has picked up an iPad, which helps clarify both the constraints and possibilities. And so the design and plan of attack are refocused, even renewed.

Plus ça change, hey?

Wednesday, April 21, 2010

Cat-herding in a different sense

My friend K.--a kitteh-hoomin extraordinaire--said of cats: "Never do anything more than twice that you don't want to do for the rest of your life." Later I came to realize that it also applies to hoomins. Hoomins who don't happen to be me, y'understand... [Insert self-deprecatory eyeroll]

When I forcibly "upgraded" the Ubuntu installation on my laptop and tried to install Sun's Java Development Kit (JDK), the installation failed because it couldn't find any mention of the JDK in its lists of add-on software.

Apparently, my Google-foo was not strong that day. As a result, I somehow managed to form the idea that Sun's new overlord (Oracle) had rebranded it to OpenJDK (the JDK having been open-sourced some time before). So I installed OpenJDK and merrily proceeded to install a couple Java code development environments over the top of that. (Because whyever would Ubuntu stop including the real JDK in repositories that have included it not just twice, but for years on end?)

Meow.

It was only today when I imported old (working!) code into those dev. environments and it failed to run in either that I realized that something wasn't kosher. Looking a little more closely at the error messages, I realized that I was working with the same imposter JDK that has caused me grief in the past by its...errrr..."selective" implementation of Java. Fortunately, Dan Washusen and Vivek Gite were on hand to set me straight.

Big shout-out to both for sharing their knowledge with two-legged, fur-less felines such as myself.

Tuesday, April 20, 2010

Transcending "what" vs. "whom" you know

Of the three employees in question, I'm the least informed. So let's consider me "Number 3."

Number 1 has been on the project so long that he can tell me which floor of which building accounts for the sixteen square feet of discrepancy between my database total and his CAD-calculated total.

Number 2 hasn't been around quite so long, but knows the incantations necessary to link what's in the database with what's in the CAD drawings.

But when the CAD software doesn't produce the data that Number 2 and Number 3 expect, I essentially have two choices:
  1. Shuttle repeatedly between Number 1 and Number 2 in some geeky, grown-up version of "telephone"
  2. Herd Number 1 and Number 2 into the same cubicle and bounce their brains off each other
Not at all surprisingly, the second option is where the magic happens--or, in Classical Managementese, the efficiencies are gained.

So the upshot is that both what you know and whom you know take a back seat to knowing who knows what you need to know...and (most importantly) to asking enough "right" questions to harness the various knowledge-sets for solving a single problem.

Monday, April 19, 2010

Not hypocrisy...no, *really*

Okay, so I have precisely zero data to back my gut feeling that the ratio of software features to the options that have to be selected/specified during installation has been growing over the years. But it wouldn't surprise me, given the little "gotcha" the alpha-programmer, the alpha-admin and I ran into today. Fortunately, the top three Google results for the error message we received couldn't have pointed more emphatically at the ultimate fix had they been jumping up and down with their hair on fire. One mind-bending addition to a configuration file, and all was again well. Until something new broke. Sigh...

Feature-bloat--poorly-implemented at that--strikes again.

It's one thing to rant about feature bloat--I'm hardly bleeding-edge on that. All the same, one feature for installed software that I think should be as non-negotiable as actually shipping it is collecting usage statistics. Assuming that the user approves, of course--let's get that straight. But tallying usage statistics and periodically phoning them home to the Mother Ship, IMO, could save us (meaning the consumer) from sacrificing great gobs of hardware potential to bells that will never ring and whistles that will never tweet. I say "could" rather than "would" because, at a certain level in any organization, the reality distortion field evolves an immunity to even bell curves embedded in a PowerPoint slide.

But I can dream, can't I? And the dreaming is at least grounded in the reality that there are only so many checkboxes, drop-down lists, wizard screens, etc. that the consumer will tolerate. Eventually, the gulf between what you thought you were installing and the B-movie monstrosity that's monopolizing your hardware will become unsustainable. Right? Riiiiight?

Sunday, April 18, 2010

Hollywood programming languages

A co-worker and I were on the subject of movies, and I asked whether he'd seen the impressive Mongol. He hadn't, although the light of recognition immediately appeared in his face: "It's about Genghis Khan, right?" "Yep," I affirmed. "With John Wayne?" he asked, and--as I spluttered in horrified disbelief--proceeded to Google it to show that he wasn't making that up. Sure 'nuff: The Duke swaggers his way across the steppes to win the love of Susan Hayward (Princess Bortei).

I'm sorry to report that I kinda flipped out. Minus the "kinda" part. Such are the downsides of having a History degree...

Apparently, there are similar hazards in having a programming degree...even a two-year one. The Vo-Tech--as it was called in Red Wing, MN--heavily emphasized the C and C++ languages in its curriculum (circa the late dot-com era, anyway). Since then, I haven't had much use for either C or C++...until now, when I find myself having to scramble up the learning curve of Objective-C, used on Macs & iProducts. And it honest-to-pete is like someone asked Hollywood to write the language, but was too cheap to retain Kernighan, Ritchie and Stroustrup as consultants. Apart from all the stuff I disliked in C/C++ (pointers, memory management), there's been precious little that gives its creators any right to name the language after its alleged ancestor.

And, frankly, that sucks. Just like making a "historical" epic that could pass for history about as well as Donna Reed could pass for Sacajawea. Yes, there's a legitimate need for "telescoping" in drama--the Henry V prologue* And All That.

But.


To create a language that claims to be a child or sibling to another and then make it nearly unrecognizable to a student/master of the original language? That's Hollywood-style revisionism, and the harm it does is not limited to offending cranky middle-aged programmers such as your faithful blogger.

Family history--biological or linguistic--carries certain "baggage." In the case of biology/psychology, it has to do with health and neural wiring. In the case of language, it has to do with syntax and conventions. Java and PHP each bear a strong resemblance to C/C++--ironically, more so than Objective-C does. But their designers had the decency to name them something else. There is, literally, no "C" in "Java" or "PHP." Which makes them easier to learn, in the sense that the C-savvy aren't so tempted to drag in all the assumptions of the language they already know.

See, whenever you learn a new language (human or computer), there are actually two processes at work: Forming new assumptions about the underlying patterns and unlearning old assumptions. Insisting that Objective-C is really just a variant of (or successor to) C seriously trips up the "unlearning" part of learning. To my way of thinking at least, that's a problem, and one for which I think the language's designers should be taken to task.

- - -

*
... For 'tis your thoughts that now must deck our kings,
Carry them here and there; jumping o'er times,
Turning the accomplishment of many years
Into an hour-glass: For the which supply,
Admit me Chorus to this history;
Who, prologue-like, your humble patience pray,
Gently to hear, kindly to judge, our play.

Saturday, April 17, 2010

Found another documentation rogue today

Unfortunately, the mug shot for the gallery is beyond even digital editing, all because I'm too mortally uncool to take a decent cellphone photo. So you'll have to bee-bop on over to the French Island Quillin's to check out the self-serve gas pumps. Affixed between the credit card swiper and the number pad is a sticker that says "Push Here." And it doesn't do anything. (To indulge my odd sense of humor, I actually tried: Trust me, there's nobody home behind that sticker.)

On further reflection, the only difference between bad documentation and a practical joke is intent. Except that, in either case, that distinction is completely lost on its victim.

Friday, April 16, 2010

No Friday post

For tonight at least, Offline Life has filled up my dance card: Waltzes, tangos, jitterbugs, and maybe even a Funky Chicken besides. Have an excellent start to your weekend, folks, and see y'alls tomorrow!

Thursday, April 15, 2010

Something they didn't teach in speech class

Another little "Ah-ha!" moment for me--one that may well already be obvious to others, but I'll pass it along anyway.

After @CyberCowboy's Amahi presentation for the La Crosse Linux Users' Group, I realized that a public speaker and the presentation itself should also be judged by the Q&A period. As I've been reminded the past two evenings, it's not just a matter of fielding questions with the right blend of knowledge and passion and focus. There's also a traffic-cop element to the art of keeping the inter-audience discussion on track.

And so for the second night in a row, I've come away richer for good content, a good presenter, and great participants. Spending tomorrow evening with only the internet to learn from...that seems almost like a let-down. Wow, am I spoiled. And not a little sad that public speaking's interactive skills weren't even a factor during the six or so years I spent on Forensics teams.

Wednesday, April 14, 2010

An "Ah-ha!" moment to share

First off, major props to @NateSchneider, one of the ringleaders of @devCoulee, for a very useful presentation on project estimating. Second off, thanks to everyone at the meeting for the highly thought-provoking discussion of generating client/user feedback. Pure bubble-bursting, mojo-revving awesomeness.

But if I had to pick the moment that was worth the price of admission--which includes two hours of the non-work time that's running at a steep premium this week--it would be the distinction between time estimates and target dates in software project development (a distinction that likely applies to many research & development projects as well).

That "Ah-ha! moment--was this: Project schedule estimates translate features into time; target dates translate time into features. In other words, it's the difference between these two questions:
  • Project Estimate: "How much time will it take to add these bells & whistles?"
  • Target Date: "How many bells & whistles can we add in this time-frame?"
This is not--I repeat, not--a matter of semantics. We are not splitting hairs, counting the angels that can dance on the head of a pin, or discussing anything remotely resembling philosophy here. These are merely the hard-nosed, real-world options. And you can only have one--no substitutions.

Tuesday, April 13, 2010

Good thing this Irony Meter goes to eleven

For reasons which shall remain unnamed, I have three weeks or so to morph into a reasonable facsimile of an iPhone programmer. Which is not a lot of time to learn how to:

  1. Unlearn enough Windows reflexes to use a Mac
  2. Learn my way around whatever IDE Apple supplies for iPhone/iPod/iPad development
  3. Dredge up enough decade-old memories of C/C++ to start stretching my brain around Objective-C.

The B&N reader application is still installed on my Windows desktops, so I bopped on over to the website for a bit of instant gratification. The search for "iPhone application" netted ten pages of results (the last page containing some splinters from the proverbial barrel-bottom). But of 100 results, how many were in digital format?

Two. One for a developer magazine (Independent Guide to the iPhone 3GS) and the other a book that might just be more appropriate in the dead-tree format (The iPhone Pocket Guide). Maybe I'm a statistical outlier, but it seems to me that the early adopters of digital books will be technical folks--people who don't want to buy one copy for the work office and another for the home office...and certainly don't want to lug single copies back and forth between the two.

True, a dead-tree book can sit off to the side of the work area, saving precious monitor real estate. But dead trees aren't quickly searchable. When you're scrambling up the learning curve, jumping between index and main text is horribly inefficient compared to CTRL-F. Plus, on-screen readers can always be minimized when they need to be out from underfoot. Granted, they don't allow you to copy and paste sample source code. But neither do dead-tree programming books, so that's more or less a wash.

The glaring misunderstanding of the demographic surprises me, but at the same time doesn't. In the short term, it's always more "efficient" to streamline your target audience/user. But at some point it's more work to round off the peg-corners than to drill square holes. Which is what I strongly suspect is happening here.

Monday, April 12, 2010

Time to put up or shut up?

If Vermont--among a handful of states--drags the concept of the corporation out of the 19th Century, it'll be a scary time for the apologists of Milton Friedman and mega-capitalism. Because after decades of arguing that capitalism most efficiently serves society's goals, there will be real-world data to confirm or refute that cherished notion. If the concept of the B-corporation falls flat, there'll be no wiggle-room for arguing that the threat of shareholder lawsuits (backed by legal precedent) "forces" corporations to choose between doing well and doing good.

I don't want to be cynical, but I expect the first round of B-corporations to fail miserably. Mainly because I expect the enabling legislation to be sabotaged by the vested interests, resulting in standards that are either:

A.) Impossibly contradictory, so as to tilt the playing field like a water slide toward purely capitalistic corporations, or
B.) Watered down to the point of irrelevancy.

Revolutions being what they are, I expect the idealistic first round of B-corporations will likely be noisy in their entry into the market, and no quieter in their spectacular exits. Or, in the event of lax standards and looser enforcement, the B-corporation will simply become a vehicle for in-kind embezzlement and/or tax-dodging. Either variety of flame-out will provide the Mr. Potters of this world (and their groupies) with more self-vindication than they'll ever need.

But if revolutionary history is any indication, the second generation will be more pragmatic, and far more effective. For the concept will probably remain on the books, although the requirements may well be tightened up. The Declaration of Independence certainly didn't form the United States; no one--saving complete pedants like George F. Will--cites the Articles of Confederation in political arguments. Hammering out the Constitution was hardly a glamor job, but guess what we're still working with two centuries and change later? You get the idea...

As little faith as I have in the dashing first run at this concept, I'm not panning B-corporations. Not in the least. For if we are stuck with the evil fiction that is corporate personhood, legally imbuing that "person" with a capacity for humanity is revolutionary indeed.

Sunday, April 11, 2010

Sometimes a difference in perspective makes no difference

This would normally be considered blasphemy chez fivechimera, but I was just reviewing the little bookshelf in my office (an "office" that looks suspiciously like a guest bedroom with a messy desk in the corner) with an eye to what can be culled. Dennis is more into Flex/Flash than I these days, so I flipped through Getting Started with Flex 3 for a sense of whether it's too elementary for what he's doing with the language.

Page 81 bore this gem:
Web 2.0 is all about the media; images and video. So, it's a good thing that Flex makes it so easy to build Flash applications that use heaping helpings of both.
I wish I were making that up. (Grammar-Nazi aside to Mr. Herrington, Ms. Kim and their clearly-asleep-at-the-wheel editor(s): Know the difference between colons and semi-colons. Also, commas are more effective when used sparingly.)

More to the point: Even as a programmer, I take exception to the idea that replacing words with images--animated or not--can be considered "revolutionary" in any sense. Measured against illuminated manuscripts or the block-prints that lived cheek-by-jowl with movable type in the hands of Gutenberg, Aldus Manutius, et al., there's absolutely no demarcation. In the case of the post-Usenet internet, history clearly repeats itself. The difference between 1.0 and 2.0 isn't about how many bytes you can stream at your users; it's about how many bytes they can stream back. And--more appropriately--how many bytes they can stream at each other.

I know that I should feel at least somewhat guilty for what boils down to straw-man argumentation. But given the number of Marketing PHB-types who first looked at Twitter, YouTube, Facebook, etc. and (essentially) thought, "Cool! I can use this to spam more people!", it brasses me off at least as much to see my very own cohorts committing an equivalent sin. All of you: Stop assuming that the web's about you and what makes your job seem more important. Web 1.0 wasn't about that. Neither is Web 2.0. And I seriously--yea, even mortally--doubt that Web 3.0 will be, either. In short, even when the belly-button fuzz is 100% silk, navel-gazing is still navel-gazing.

Saturday, April 10, 2010

When racing the Joneses can be a good thing

Today's grocery run entailed a certain amount of ridiculousness when I saw a "throwback" edition of a soda brand that didn't even exist when cane sugar was still in use. Heck, it wasn't even a twinkle in the marketing manager's eye in the '70s. But even the erstwhile History major who still lives inside me has a difficult time objecting to that kind of "revisionism" if it helps stem the tide of high-fructose corn syrup in the American diet. Tough call, that.

Friday, April 9, 2010

Frivolous Friday, 04.09.2010: More Murphy's Laws for programmers

Disclaimer: I actually don't have to take a whole lot of nonsense at work, so the following round of cynicism is largely brought to you by past experience--vicarious and non-.

The person who doesn’t “have time” to evaluate new software/features will be the same one complaining about being “left out of the loop” immediately following implementation.

The greatest percentage of wasted hours will be those spent implementing “quick-n-dirty” solutions.

(H/t to a former co-worker for the base idea.) The sequence for creating a custom web application is as follows:
  1. Make it work.
  2. Make it usable.
  3. Make it fast.
  4. Make it secure.
  5. Make it pretty.
  6. Make the logo bigger! (Obligatory, albeit NSFW)
All variations on the sentence “Our users will never need to...” are your cue to immediately—but discreetly!—add that feature to the scope.

(One for the System Admins): Computer viruses, like V.D., will never, ever be the fault of those infected. Less so when they’ve managed to infect others before they're forced to shout for help.

Death-march deadline work will be done under salary. When your body inevitably hands you the bill, the sick time will be charged to your PTO. (Because programmers just aren’t that into working out or eating right anyway, don’cha know...)

The documentation for the down-level version of third party software you have to support will have been taken off the website before you need to debug that component or wrap your brain around what-all the API does.

Any work that furthers your marketability or improves your working environment will be done on your own time and/or nickel. Accordingly, expect to be frowned-upon for wasting it on non-objectives.

No matter how much you tweak an IDE’s code completion/formatting settings, its “intelli-sense” will always train you to deal with its quirks more than you will ever train it to deal with yours.

When you’ve just come off a milestone release, your reward will be spending more--politically mandatory--time "celebrating" with the co-workers you've seen more of than your family and friends.

When the software that the office lives and dies by goes down, it will be down for long enough to impact your billable hours, but never for long enough to throw in the towel and take the rest of the day off. Unless, of course, it’s “your” software...

Your best chance of having your brain-child feature make it into the final product is to let your boss suspect that you adapted someone else’s idea.

Googling your billion-dollar idea is an exercise in discovering that: 1.) It’s already been implemented by someone else, 2.) as open source, and 3.) poorly enough to poison any market for it.

In the event of a downturn, scarce resources will be allocated to marketing, rather than quality or new features. In the event of further downward fiscal movement, still-scarcer marketing resources will be allocated to re-branding, thus wiping out any product visibility achieved by previous marketing.

(Bonus: Your firm, during its death-spiral, will have assumed more names than 007, prompting interviewers to ask why you couldn’t hold down a job with the same company for more than eighteen months straight.)

When a critical patch has to be scrambled into production, the change-sets will always be merged from the wrong branch, breaking even more features.

Planning for after-hours or weekend “quiet time” to concentrate on a thorny problem guarantees that a great gaggle of co-workers will also choose to come in at that time. With their kids in tow. Who will promptly clean out the local candy-dish. And wash it down with Mountain Dew from the soda machine. Meaning whatever Mountain Dew isn't spilled chair-jousting. You, by the bye, aren't trusted to work remotely because, you know, you’d only be distracted at home.

One programmer’s precision-targeted workaround is another programmer’s undocumented ugly hack. Guess which programmer you will be for most of your career.

Among database synchronization tools, intuitiveness and usability exist in inverse proportion to the carnage the tool can wreak.

Meetings will be added to the schedule until productivity improves!

Thursday, April 8, 2010

Recalibration

It doesn't happen often at all, but once in a while I've had to (ahem!) "educate" someone that the lack of a software feature they want does not constitute a "bug." To my way of thinking, that distance between "lack of" and "broken" features closed a bit late yesterday.

The backstory is that I was about to explain to a user two different ways of doing the same thing in a web application, but fortunately had a blinding moment of good sense and logged in as her. I could have sworn that both roads to that feature had been left open to that particular user profile's permission-set, but apparently someone had seen good reason to close one of them. In context, that's just silly, because if the user can change that data in one place, she certainly should be qualified to change it in the other.

So I'll fill out a (cough!) bug ticket and toss it on my "whenever" pile. Granted, this feature-lack hasn't been commented on in the three-plus years since this particular sub-system went live; it's doubtful anyone thinks of her/himself as cheated. But this web app. has been "my" baby for going on five years, and standards are standards. Even when recalibrated.

Wednesday, April 7, 2010

The root of a problem?

It was probably just as well that I'd dialed down the percentage of coffee relative to milk in this morning's first mug. Because there are few things that will get an open source geek's blood moving like the news that their window to the online world (a.k.a. the Firefox web browser) carries a fundamental security flaw. As it turns out, all was actually in order, save for some paper-trail housekeeping. (See Kathleen Wilson's note about 3/4 of the way through this thread.)

(A definition of "root certificates" for non-admins: Ever notice that when you're logging into your bank's website or buying something from an online store, how the first part of the web address switches from "http://" to "https://"? That extra "s" means that your browser and the server it's talking to are communicating by encrypting their communication so that no one in the middle can intercept data such as account numbers, passwords, etc. But before your browser pulls out its Magic Decoder Ring, it needs to know that the server it's talking to is legit. That's where security certificates come in, and trust me when I say that they don't exactly come in Cracker Jack boxes. A "root certificate," is the Momma--heck, make that Ancestral Matriarch--of the certificates used by thousands of descendants. So you can probably imagine the theft, fraud and outright mayhem that could occur if millions of copies of a web browser accepted an ersatz Matriarch--or, more aptly, her descendants--as The Real Deal.)

Thankfully, the scare was just a scare. But it started me thinking about how--at this level at least--we might just be making a mistake by modeling browser security on very human notions of trust. Actually, less-than-human notions of trust. Trust is, so I imagine, a shades-of-grey matter for most personalities. But it's binary (figuratively as well as literally) for most computerized systems. "Binary" as in: Oh, your certificate isn't vouched-for? No trust for you! In real-world terms, it's the difference between a bored/rushed TSA employee ticking off a checklist and a one-on-one chat with a trained El Al agent. (Disclaimer: El Al's history of racial profiling is most emphatically not endorsed here. Why (apart from the obvious human rights dimensions)? Because race just boils down to a checkbox on a list, and thus gives the agent an excuse not to use her/his think-meat. Which in security is always, always A Heinously Bad Thing.)
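
To make the binary-ness concrete, here's a minimal sketch of what that checklist looks like from the code side--PHP in this case, assuming the OpenSSL extension is available and that the trusted-root bundle lives at the (hypothetical) path below. Either the server's certificate chains up to a root in that bundle, or the conversation never happens; there's no "let me think it over" in between:

  <?php
  // Binary trust in action: verify_peer means "only talk to servers whose
  // certificate chains up to one of the roots in the CA bundle."
  $context = stream_context_create(array(
      'ssl' => array(
          'verify_peer'       => true,
          'cafile'            => '/etc/ssl/certs/ca-certificates.crt', // hypothetical path to the root bundle
          'allow_self_signed' => false,
      ),
  ));

  // Succeeds only if the chain of trust checks out; otherwise the handshake
  // is refused outright -- no shades of grey, no gut sense.
  $page = @file_get_contents('https://www.example.com/', false, $context);
  if ($page === false) {
      echo "No trust for you!\n";
  }
  ?>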

Conventional wisdom says that the human personality is the proverbial weak link in the security chain. Ironically, though, it can also be the strongest--but only if the humans in question are trained and allowed to use all their senses. Including, as appropriate, the somewhat nebulous gut sense. Fuzzy logic isn't yet mainstream enough to reliably help your browser decide which websites to trust. But today's scare over the legitimacy of root certificates, I think, highlights the weakness in the binary nature of the trust/distrust model of browser security. Do I have an alternative suggestion? Not really. I can only hope that the incident sparks more discussion--and, ultimately, more alternatives--from the security/cryptography community.

Tuesday, April 6, 2010

Keeping it real by faking it

We’ve had an issue batted back and forth in our bug-tracking system b/c of different behavior on Internet Explorer 7 and/or 8, compared to their older sibling IE6. It’s not the first time it’s happened, which means that I’ve learned not to automatically toss the issue back at QA with under-the-breath muttering about someone smoking crack. (Muttering that audibly is not advisable when QA’s on the other side of the fabric half-wall—and is probably not advisable at all, come to that.)

Our workstations aren’t set up as virtualized instances of Windows, so unit testing involves cadging a laptop from the System Admin and doing something like this:

  1. Making the code changes in the development environment shown on the left-hand monitor.
  2. Testing the code in the Internet Explorer 6 instance on the right-hand monitor.
  3. Testing the code in the Internet Explorer 7 instance on the laptop.
  4. Repeating as necessary until the code is well-behaved in both browsers.

Unfortunately, a laptop with IE8 wasn't available, or there would have been four monitors involved. (Or, on the other hand, I should say "fortunately," as I would have been scrambling for an extra outlet and network jack.)

Technically, you can sort of install more than one version of the browser on the same workstation, but it’s a hack and doesn’t always produce reliable results. And as my pod-neighbor just discovered, installing IE8 whacked the IE6 changeling, so I’m in no hurry to take that route. So what it boils down to is that for each version of IE (major and minor) that you need to support, you have to have enough hardware to run Windows (not negligible with Vista or Windows 7). Even virtualizing a PC (i.e. splitting its processing/memory/disk into separate instances of Windows all running at the same time on the same hardware) doesn't save all that much money b/c of the beefier hardware requirements and the VMWare license. And you still have to pay for every single copy of Windows either way.

Obviously, what’s needed from Microsoft is a reliable emulator for all reasonably recent versions (major and minor) of its browser. I realize this sounds startling coming from an open source software advocate, but I actually would not mind paying for that software. Not at all. When you compare the cost of supporting Internet Explorer 6, 7 and 8 in tandem, a few hundred bucks would be a screaming deal, even minus the I/T support. If an emulator can accurately show me how my web applications will behave (as well as look) under any reasonably current version, bring it: Tractor-beam my IDE straight into that Death Star, baby!

Alas, for all Steve Ballmer's sweaty cheerleading of "Developers! Developers! Developers! Developers!," it's pretty obvious that his sensibilities are still firmly rooted in the desktop application era. You know, the days when you could mail your user upgrades on a 5.25" floppy and blame them for their problems if they dawdled over installing your latest/greatest. The web (and the world in general) is just a weeeee bit more heterogeneous than that nowadays, which means that control over the platform has slipped. It's mostly a good thing, from the standpoint that monocultures are dangerous (mainly to their own long-term prospects). But Microsoft choosing to ignore the fact that--just like its own programmers--developers outside Redmond have to worry about backwards compatibility also condemns the third-party ecosystem to the dangers of monoculture.

Theoretically, you could make your website inaccessible to anyone not running the latest browser version. That's a nice, easy solution. And, after all, programmers are forever being cautioned against giving the user too many choices. No doubt you can Google lots of beans & rice recipes before your ISP cuts off your internet connection for lack of payment...although cooking them after the power company cuts the juice (and you can no longer afford gas for your chainsaw to cut down neighborhood trees) could be a tad problematic.

Monday, April 5, 2010

Digital driftnets

First, a bit of backstory: I had to add a bit of functionality to a web page written by someone else (and possibly modified by even more people). If you're not a web programmer, you might only care to know that there are two instances when a programmer should check the input that a user enters into any form:

  1. After the user submits the form, but before it's actually sent to the server.
  2. After the form is received by the server, but before it's allowed to affect the database or user's session.

This might seem redundant, but if you want to keep your data clean and your downstream code reasonably safe from careless data entry (if not out-and-out maliciousness), it's the only way to roll.
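
For the web programmers in the audience, the server-side half (#2) boils down to something like the sketch below--field names invented for illustration, not lifted from the actual page--while the browser-side half (#1) is essentially the same checks re-done in JavaScript before the form is ever submitted:

  <?php
  // Server-side re-check: never assume the browser-side validation ran,
  // or even that the request came from a browser at all.
  $errors = array();

  $quantity = isset($_POST['quantity']) ? trim($_POST['quantity']) : '';
  if ($quantity === '' || !ctype_digit($quantity) || (int) $quantity < 1) {
      $errors[] = 'Quantity must be a whole number of at least 1.';
  }

  $email = isset($_POST['email']) ? trim($_POST['email']) : '';
  if (filter_var($email, FILTER_VALIDATE_EMAIL) === false) {
      $errors[] = 'Please supply a valid e-mail address.';
  }

  if (count($errors) > 0) {
      // Re-display the form with the complaints; nothing touches the database.
  } else {
      // Only now is the input allowed anywhere near the database or the session.
  }
  ?>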

Thus, I had put code in both places. For the pre-form-submittal code (#1 above), I merely added it to an existing function appropriately named "Verify()." So there was a bit of head-scratching involved when it was discovered that the web page had actually been submitted to the server before the error-messages popped up to complain about bogus input. As it turns out, someone had re-written the form to entirely bypass Verify(), but hadn't bothered to remove the function.

Which is when the analogy of obsolete code as driftnets jumped out and bit me on the nose. I won't re-hash earlier rants about code "housekeeping" paying for itself in the long run, simply because this makes a fair illustration. Particularly when you consider that the code in question wasn't all that complex--i.e. it could have been much worse. Even in that case, it would have taken my predecessor mere seconds to search for any other references to the function in the page and, finding none, highlight that function and hit the "Delete" key. In contrast, QA and I wasted at least an hour by the time the issue was filed, I'd located the code, debugged the problem, fixed it, unit-tested, merged, committed, promoted and swatted the issue back into testing. That obsolete code (i.e. the driftnet cut loose in a moment of distraction--or, for all I know, apathy) added the completely gratuitous cost of an hour's billable time to the project.

Even at offshore billing rates, the difference between seconds and an hour isn't negligible. Particularly when you multiply it by the number of times it could happen if you keep even one slob on a team--or skew the incentives toward raw code/features produced. Now, I've worked in this code-base enough to know that this kind of thing is a statistical outlier, which is probably why it still chaps my hide, even hours later. Even so, I hope that the real-world illustration of the difference between seconds and an hour makes the case in a way that no amount of ranting about code hygiene will do.

Saturday, April 3, 2010

Conferences as "concerts"

For years, I've been annoyed by the idea that the inventors and makers of things that keep the economy humming are generally faceless drones, whereas the fifteen minutes of the over-packaged confections from the so-called "entertainment industry" seem intolerably long. Yet I've noticed one similarity between software and music--one that should have smacked me upside the head sooner.

By all rights, The Great Recession, coupled with long hours, the threat of offshoring, etc., should have all but squelched the software development conference industry. Yet from where I sit, this doesn't seem to be the case. Granted, I didn't pay too much attention during the dot-com era (mainly because I was in school and couldn't afford to travel anyway). But all that I can really recall (off the top of my head) are DefCon and Comdex...although the latter certainly spawned any number of tchotchke-fest knock-offs.

Fast-forward a decade and change, and it seems--at least to me--that many, many more conference options are available, and not just for a specific technology or platform (e.g. Microsoft, MySQL, Java, what-have-you). Mainly I'm talking about the broader context: FOWA, Business of Software, SXSWi, Startup Lessons Learned, CanSecWest, and a slew of jamborees built around Web 2.0. All with their headliners and opening acts that run the gamut of Top 40 A-listers to Who-in-Grethor-are-you indies.

The publishing industry--which I consider a cousin to entertainment--is nervous (as well it should be), and even Amazon/B&N/Apple interposing themselves as toll-collectors on digital books shouldn't foster any long-term comfort. That being said, it's still possible to make a tidy side income, perhaps a living--or even in rare cases a fortune--committing your stories and insights to paper and ink. That's where the rock stars of the entrepreneurial and/or software world cash in--and why I should have twigged into the conference/concert similarity a couple years ago.

It makes more sense when you consider that, for music "rock stars," concerts serve three major purposes:

  1. To supplement music sales
  2. To maintain street cred. (i.e. "They're not resting on their laurels")
  3. To bolster sales of the current work

In an age of dropping album sales--whether rightly attributed to piracy or not--the concert, so I understand, has become the safer investment. If I'm correct about the proliferation of software-related conferences, then I think at least some of it can be attributed to similar motivations on the part of the "rock stars" of the programming (and software startup) world. Swap "book" for "CD" and there's really no significant difference. Except maybe for the groupies and hand-sorted M&Ms bit. Then again, for all I know, Jason Fried, Joel Spolsky, Eric Sink, et al. haven't seen a brown M&M in years. And groupies? True, Mr. Fried is kind of a cutie-pie and all that, but...ummmm...maybe we just won't go there, 'k? Thanks.

I'm not out to slam either conferences or concerts. But one other similarity that I've noticed is that the value I've gained from either comes, in some part, from the quality of the other attendees. In terms of concerts, the most bang for my buck was a laid-back open-air deal headlining retreads from the 70s. The worst value was a Metrodome sell-out (seemingly) packed with idiots more interested in toking up than in the actual performance. The "rock star"'s motivation is pretty constant, but it's important to consider that their "performance" will be only part of the value of the ticket price.

Friday, April 2, 2010

Frivolous Friday, 04.02.2010: I/T irony

Working in information technology typically runs a bit more smoothly if you have a developed sense of humor. Okay, maybe gallows humor, but humor. It occurred to me this morning that if you stick around long enough, you should enjoy the beneficial side effect of a healthy inoculation against irony. Here are a few reasons why:

When you read books like L. Sprague de Camp's "The Ancient Engineers" or Frances & Joseph Gies' "Cathedral, Forge, and Waterwheel," you're struck by how many technologies were born as the toys of kings & emperors. (Astrology, for instance, eventually outgrew fortune-telling and personality assessment to become the astronomy that revolutionized our concept of the Universe.) In contrast, "modern" computers were developed for naval ballistics and core business functions such as accounting and payroll. Nowadays, conventional wisdom is that computing's envelope is being pushed by gaming and pr0n.

Microsoft based its fortunes on an operating system--i.e. a mechanism for managing files and running applications. Three decades later, Windows XP can't delete a zero-byte file without the notification icon hanging until I'm annoyed enough to click "Cancel."

And while I'm picking on Microsoft, there's always Bill Gates at the center of an urban legend that has him proclaiming that "640 kilobytes should be enough for anyone." Windows 7 requires over 1600 times that--over 3200 times if the processor is 64-bit. The much-despised Vista--widely panned for its bloatedness--had half the memory requirement (512MB for the Home Basic version; 1GB for everything else).

The popular image of the Mac user is still the free-spirited artist, despite the fact that Steve Jobs' design dictatorship is largely celebrated as a virtue.

I was testing the battery life on a brand-new netbook by keeping it on the kitchen counter as I worked on other things. "Are you storing your recipes on that?" my husband snarked in reference to the history of personal computing.

This isn't an original thought, but as long as we're on the subject of PC history, remember the prediction that the PC would be as simple to use as the telephone? Now whole books are published on the subject of how to use name brand smartphones.

The reason that the telephone was the model of ease of use in those days was, of course, that it had to be simple enough for your Mom & Grandma to use. Now, I'm sure that there are any number of exceptions, but I'm willing to bet that most of us who do informal PC support would much rather troubleshoot for them than any number of other "adults" we could mention. (Why? Because Mom & Grandma can be trusted to follow directions. Heck, Mom & Grandma probably even read the manual cover to cover, thus sparing us a goodly percentage of the calls we would otherwise have fielded.)

There are roughly one billion people online, yet certain oppressive governments [insert scowl in the direction of Beijing], corporations and loony-tunes fringe movements somehow believe that they can sway the discourse by creating fake identities and astro-turfing.

Thursday, April 1, 2010

Cookie vs. dough

Earlier today, I was thinking that applications (job applications, college admission applications, etc.) exist to establish a baseline--a lowest common denominator, if you will. But then I realized that it's usually more insidious than that. Applications are merely cookie cutters that define the bounds of critical evaluation--and figo to anything outside that.

The problem with cookie-cutters is not so much that they set a minimum standard as that they give too much latitude for ignoring any dough that happens to fall outside the prescribed shape. If--and that's a big "if"--the organization is exceptionally self-aware, the "cookie" merely defines the present; it's the quality of the dough (inside and outside the lines) that shapes the future. Never lose sight of that.