Friday, October 31, 2014

Frivolous Friday, 2014.10.31: The Programmer's In-House of Horrors

Now that software development has become more mainstream--yea, even glamourised--the stereotype of the introverted nerd in the remotest office-corner typing cryptic incantations to invoke the blessings of the Server Gods has finally been put to rest.  (And good shuttance!)

Mind you, I'm certainly not claiming that programmers have lost all the personality quirks that come with the trade.  In fact, after over two decades in the workforce, I can't help but think that a course in Abnormal Psychology should be mandatory for every college graduate.   For MIS/CS majors, this should probably be a minor.  But it's never going to happen, of course.  So for those still in programming school, here's the Rogues' Gallery that just might be lurking in a cubicle near you at any given point in your career.

"Charlie Victor" -  The nickname comes from the CTRL-C and CTRL-V keyboard shortcuts for cutting and pasting.  If you're finding code that is all over the map style-wise, I strongly recommend Googling a bunch of snippets.  Don't be surprised to find StackOverflow, W3CSchools, or possibly 4GuysFromRolla (depending on the language) at the top of the list.   In that case, you're not working with a real software developer; you're working with Red Green.

The Magpie - Can't stay away from shiny stuff.  Which is fantastic if they're scratching their own itches--more power to them, then.  But for a software shop with a legacy code-base, the platforms, languages, & APIs are typically chosen on the "second mouse gets the cheese" principle.  Rightly so, I might add.  The worst part of working with the Magpie, however, is that by the time everyone else on the team is up to speed with the new hotness, they've already moved on to the new new hotness.

The Baby Duck - For all practical purposes, the antithesis of the Magpie.  Covered elsewhere.

The Cowboy - When I worked in more of a startup scenario, I once had a co-worker directly editing files on the beta server while I was demo-ing them to a client.  'Nuff said.  (He's actually a pretty decent guy; it was more a product of us flying by the seat of our threadbare pants for years on end.  That's a management issue more than anything.)

The Warrior-Priest(ess) - I owe this spot-on analogy to a client who was also the loudest critic of my team's work.  He was referring in particular to the UNIX "warrior-culture, where one must strive and suffer for one's knowledge" (verbatim quote).   This was pre-Ubuntu, pre-ServerFault, mind you, so he totally had a point.  Alas, it's not only UNIX.  The Warrior doesn't care what you learned; they're only interested in how many scars it left.  (It's a frighteningly specific incarnation of the sunk cost fallacy.  You, on the other hand, are sane enough to only do that sort of thing with, like, your house or car.)
 
The Firehose - This is the polar opposite of the Warrior.  Ask a simple question, be digitally water-boarded with links to Wikipedia articles, YouTube how-tos, tutorial blog-posts, whatever.  Don't be surprised if you climb back out of that rabbit-hole as clueless as when you fell in.  (Strangely enough, I once worked with someone who could toggle between Warrior and Firehose on any given day.)

The Eternal Prototyper - Real world constraints?  Never heard of 'em.  Oh, you have to support varying levels of hierarchy in the live version of this navigation widget?  Sorry-not-sorry--they only hard-coded two.  But boy, did it ever look snappy in Safari on their Mac.  Sucks to be you watching it blow chunks in IE8.  But, hey, they're the bleeding-edge genius and you're just the bit-twiddling code-monkey, right?

The Goth - Like the Rolling Stones, this species of coder wants it all painted black.  At least the screen of their IDE, anyway.  Outside of I/T, it probably looks like a terror of the space-bar and "Enter" key....maybe even the "Tab" key.  To other coders, symptoms of the pathology include horrors like single-character variable names--all declared inline, nat'cherly.  Also?  Transcontinental method-chaining.  Nested callbacks with more layers than a matryoshka doll.  Code so dense you need shades and a lead hoodie for the Hawking Radiation.
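(For the uninitiated, here's a minimal--and purely made-up--TypeScript sketch of the first couple of symptoms.  Every name in it is hypothetical, and the real thing is usually forty lines longer:)

    // A hypothetical Goth specimen: single-character names, declared inline, everything chained.
    const d = [{ a: true, b: 2 }, { a: false, b: 3 }, { a: true, b: 5 }];
    const r = d.filter(x => x.a).map(x => x.b).reduce((p, c) => p + c, 0);

    // The same intent, with enough whitespace (and vowels) to breathe:
    const activeItems = d.filter(item => item.a);
    const weights = activeItems.map(item => item.b);
    const totalWeight = weights.reduce((sum, weight) => sum + weight, 0);
    console.log(r === totalWeight);  // true -- but only one version survives a 2 a.m. bug-hunt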

Pollyanna - "Validating input is just time that could be spent adding new features.  Hackers would never bother with our small-potatoes website.  Besides, our users are all internal--they know what they're doing, right?"
(Everyone Else:  Twitches uncontrollably whilst donning tin-foil hat.)

The Road-Tripping Parent - Remember "We'll get there when we get there!"?  Admittedly, there may be a time and a place for this kind of push-back.  Typically, it's during the triage that happens after someone feeds that cute, innocuous little feature request after midnight and it cannonballs into the swimming pool.  That being said, having a more accurate sense of status than a Windows progress bar is a core function of a developer's job.  And, as much as I'm most certainly not a card-carrying member of the Steve Jobs Fan Club (Team Woz, baby!), he nailed it:  "Real artists ship."

Captain Flypaper - You know the "Let's throw it against the wall and see if it sticks" schtick?  Yeah, well, some programmers apparently work in flypaper-upholstered cubicles, because they expect every bit of code they write to stick on the first throw.  (Mercifully, unit-testing + automated builds are becoming the norm, which should either push these folks into a more disciplined workflow...or another career.)

The Helicopter Parent - Unless the individual in question is a natural-born passive-aggressive control-freak, this coder is largely the product of working with some or all of the above.  And when they're the senior geek or team lead who's expected to save the Earth week-in-and-week-out, it's difficult not to sympathise.  Until, of course, you're on the receiving end of, "[sigh]...Well, that's not how I would do it..." Or you find that rewrites of your code have been quietly checked into the VCS.  Again, this sort of thing typically boils down to an original sin committed at management levels.  But it's still a heckuva morale-buster for any coder within its blast-radius.

Before wishing everyone a safe remainder to their Hallowe'en, I'd like to give credit for the idea of nick-naming programmer personality quirks to Michael Lopp, who runs the Rands in Repose website.  (Sample:  "Free Electron" + the comments.)  If you don't feel like wading through years and years of blogging on the subject of working with geeks, you can save yourself a bunch of time by just buying the two books he's written.  They're laugh-out-loud funny, and I re-read them every couple of years for grins.

Wednesday, October 29, 2014

Something I didn't expect to learn at Programmer School

There are any number of good things you can say about attending a small Programmer School like the one that made me a professional geek.  Alas, one of them was not the civic planning.  Specifically, the fact that the school library and computer lab were situated next to the common area (a.k.a. the cafeteria)...directly across from the day care.

Now, I've never needed help with being distracted.  (Squirrel!)  So I wasn't too thrilled with the arrangement.  I found little sympathy, however, when I grumbled to one of my professors, a father of three:  "Eh.  I figure if they're screaming, they're still alive," quoth he.

Sigh.  Nobody understands me.  Except maybe squirrels.

But I can admit that my prof. had a point, at least as it relates to project management.  As both an employee and a freelancer, I've never been known to sit on problems that come up once our tidy project design meets up with messy reality.  (Although I normally try to have at least one workaround in my back pocket before I actually raise the red flag.)

After a couple of hard lessons, I've also learned not to drop off the radar even when the project is hitting its milestones ahead of schedule.  Once upon a time, I considered weekly status reports a sign that my boss was a paranoid control-freak who didn't trust me to be the professional they were paying me to be.

As a freelancer, however, I've come to the opposite view.  Someone who isn't interested in status "because you're the one who understands all that technical stuff" is a red flag.  Because if you don't want to be bothered with good news, what happens if there's any bad news to handle?  Software, like any other (designed) product, is nothing more than the sum of the decisions made between the initial brainstorm and the final bytes.  Not all of those decisions can be made in the heady optimism of the kick-off meeting.  And some of those decisions could even be mid-project course-corrections.

A potential client who expects me to work in a vacuum and deliver exactly what s/he wanted makes me nervous.  But the flip side is that a software developer who expects to work that way should make you (as the potential client) more nervous still.  In a freelancer, that behaviour is symptomatic of someone afraid of criticism, who might just let the clock (and budget) run out until your decision boils down to take-it-or-leave it.

Look.  I know we're all busy.  But everyone on this road-trip is responsible for making sure we all arrive where we want to be, even when they're not technically driving.  Road signs matter.  Detour signs, in particular, do not exist to be ignored.  Once in a great while, we may even have to pull over for a minute and dig out the map and compass when we're not where we expect to be.  In the long run, we'll save time and gas.  And, unlike the Blues Brothers, our road trip won't end up in the hoosegow.

Monday, October 27, 2014

Change for time

For a couple of Frivolous Fridays, I've riffed on the subject "If geeks ran the world."  It was meant to be nerdy, goofy fun, with maybe some wistful wishful thinking thrown in.  Overall, I like to think that the world would be a better place:  More logical yet more experimental, and probably more caffeinated.

But there's one point about which I'm serious.  Like, global thermonuclear Armageddon serious.  And that would be in how we (meaning as a planet) would learn to deal with dates and times under a geek regime, particularly when the majority of those geeks are from I/T.

At some point in your programming career--hopefully sooner rather than later--you realise that working with dates and times will always suck.  No matter how good the language or framework or API you work with, the very best you can expect is that it will be less painful.  Not because there's less inherent suckage, but because the date-time APIs provide a topical anesthetic to numb you enough to finish the job.  Not unlike popping the first-string quarterback full of cortisone after a brutal sack so he can finish the game.

Boiled down to the essence, the concept of time revolved around the moment when the sun was at its highest point in the sky.  Dates, of course, were based on the length of day relative to the length of not-day.

Dates, at least in European history, have been an occasional pain.  If you're morbidly curious, look up "Julian Calendar" and "Gregorian Calendar."  Basically, from 1582 to 1918, the calendar date generally depended on who was in power in your country/territory during the late 16th century and, more importantly, how they felt about the Pope.  (Exception:  France tried decimalising the calendar during the Revolution...and again for a couple of weeks in 1871.  I wish I were making that up.)


Which, except for cross-border contracts, wouldn't have been an issue for most folks living in an agrarian economy.  At least not until the mid-19th century, when telegraph lines were strung across whole continents...and then between them.  As we all appreciate, the world has been shrinking ever since.

The study of timepieces from the sundial to the atomic clock is a fun bit of nerdery, I'll grant you.  (For instance, I hang an aquitaine off my belt at SCA events...and keep the wristwatch discreetly tucked under poofy Renaissance sleeves.)

 You can accuse me--perhaps rightly--of too much optimism in thinking that our species will manage to stop squabbling and money-grubbing and gawping at glowing rectangular screens long enough to venture permanently into other regions of the galaxy.   If and when that happens, the (occasionally wobbly) spinning of a water-logged, gas-shrouded rock in its decaying orbit around a flaming gas-ball will cease to be relevant.

But for our immediate purposes, I don't see any practical reason why the earth can't all be on the same time, all the time.  Let's say that GMT/UTC/Zulu time is the only time zone.  (Yeah, I know--that means that the British can still pretend that they're the centre of the world.  But it's a trade-off I'm willing to make, however much it riles my American-born-and-raised sensibilities...)

The upshot is that you break for lunch at five p.m. in NYC and eight p.m. in L.A.--is that legitimately a big deal?  I'd argue not.  No more having to hope that you and your client both made the same mental adjustments when you agreed on a meeting time.  For programmers and their friends who keep the servers running, no more having to convert everything to UTC to make stuff apples-to-apples.  And, best of all, good shuttance to the pointless nuisance that is Daylight Saving Time, while we're at it.
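(For the non-programmers wondering what the fuss is about, here's a minimal sketch--Node-flavoured TypeScript, with a made-up meeting time--of the conversion dance that a single planet-wide clock would mostly retire.  The values in the comments are approximate:)

    // One unambiguous instant, stored the way sane servers store it: in UTC.
    const meeting = new Date(Date.UTC(2014, 9, 31, 16, 0, 0)); // 16:00 UTC on Hallowe'en

    // What the humans in each office see on their local clocks today:
    const inZone = (tz: string): string =>
      new Intl.DateTimeFormat('en-CA', {
        timeZone: tz,
        hour: '2-digit',
        minute: '2-digit',
        hour12: false,
        timeZoneName: 'short',
      }).format(meeting);

    console.log(meeting.toISOString());          // 2014-10-31T16:00:00.000Z -- the one true record
    console.log(inZone('America/New_York'));     // roughly 12:00 EDT
    console.log(inZone('America/Los_Angeles'));  // roughly 09:00 PDT
    console.log(inZone('America/Moncton'));      // roughly 13:00 ADT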

'Course, the arm-twisting politics of making that happen will probably be even more cantankerous than the religious grudges that gave 16th - 18th century European history a lot of its uglier moments.  Alexander Pope wrote "'Tis with our judgements as with our watches; none go just alike, yet each believes his own."  Now, identical judgements can be a terrifying thing (think mob rule) and I'm not holding out for it in a world of billions anyway.  But our watches going just alike would make life easier in any number of ways.  And not just for us programmers who have to write code for a 24-timezone world.

Friday, October 24, 2014

No blog post tonight

Just finished up a promotion to the live server; now working on a status report for my client.

TCB, baby...TCB*.



* Image courtesy of the wallpaper downloads page at www.elvis.com. 

Wednesday, October 22, 2014

"Baby-ducking"

Even programmers have urban legends.  Or perhaps, more appropriately, Arthurian legends--at least in the sense that they may be based on something that happened at the misty edges of recorded history.  One of those legends concerns the origins of the term "rubber-ducking," which is short for "rubber duck debugging" or "rubber duck problem-solving."  The basic premise is that the mere act of describing (to someone else) the context of the problem, the problem itself, and what (thus far) hasn't fixed the problem forces you to slow down your thinking enough to arrive at the solution before you even finish your monologue.

I tend to be more an intuitive thinker than a methodically analytical one, so I can totally vouch for this method working.  And not only for computer problems, I should note. 

But there's a term not currently part of the programmer world's cant (though it should be), and that's "baby-ducking."  As in the way baby ducks are known to recognise the first larger thing they see as "Mom" when not reared by their own kind.  (Ducks aren't the only bird--or even animal--that does this, but somehow they have become synonymous with the concept of "imprinting.")

A similar phenomenon can be observed with new programmers where languages or frameworks or even tool-sets are concerned.  The first one they are taught is the Last Word On The Subject, and everything else is either a rapidly fossilising anachronism or an over-hyped flash in the pan.  Mercifully, very nearly all of them outgrow it and learn to accept programmers of other languages as siblings.  Mainly because they have no real choice--things just evolve too darned rapidly to stay employed/employable otherwise.   (That, and a good programmer is a curious programmer, which will soon tempt them beyond their comfort zone.)

There are, naturally, exceptions.  A programmer willing to learn and keep current with every last quirk of Structured Query Language (a.k.a. SQL) could conceivably become indispensable in a large enterprise for a decade or more.  Similarly, the C and C++ programming languages have been around since the 1970s and 80s, respectively.  Yet they're enjoying a new vogue on embedded platforms like Arduino.

In general, however, my considered opinion is that it is a Very Bad Idea to hire a new programmer who only knows a single language...unless you intend to train them in another.  While a prefabricated (and, let's face it, cheaper) skill-set might seem like a bargain, chances are you're creating your own monster.   So when your project suddenly has to retool for whatever curve-ball Apple or Microsoft or Adobe (or whomever) has just thrown, and the resistance to change ranges from passive-aggressive subversion to pitchfork-brandishing rebellion...  Welp, I don't think you need to look any farther than the mirror for the person to blame.  Sucks to be you.

Monday, October 20, 2014

Generations

Both my parents spent the majority of their careers working in a hospital environment.   (If you want a good working definition of corporate benevolence, it would be in how my Dad's supervisor said absolutely bupkis about how long it took Dad to return from his maintenance jobs while Mom and I were in the maternity ward and nursery, respectively.  'Nuff said.)

Both parents, however, are at the stage of life where they're more likely to experience hospitals from a customer's, rather than an employee's perspective.  I called Mom today to check in on her progress after surgery a couple months back.  (No, it's not the first time I've called her since then.  Even I'm not such a horrid child as that.)  For the record, she's back out in the yard, shelling walnuts, fully up-to-date on the doings of the wild critters and feral cats, etc.  Business as usual, in other words.

Mom mentioned that the hospital where she'd worked until a couple years back had asked her if she was interested in volunteering.  She said "no."  Which didn't surprise me--she has enough going on right now, even without recuperating from her third surgery in three years.  But then I had an ear-full about everything her former employer has outsourced since she started working there--which, for context, was during the Carter Administration.

Food service, IIRC, was the first to go.  Now housekeeping has been outsourced.  So has billing.  Because, of course, nutrition has nothing to do with health.  And neither does cleanliness.  (My Gentle Reader naturally picked up on the sarcasm there.)  And when, thanks to data-sharing limitations, my Mom is batted like a ping-pong ball between Accounts Receivable and at least two insurance companies when she's still half-whacked-out on painkillers, I'm going to take a dim view of outsourced billing.   [sharpens fingernails] [bares teeth] 

I have a lot of fond memories of visiting that place when I was growing up:  The smell of acetone, the superball-bouncy scrambled eggs in the cafeteria, the stately pace of the elevators, the brain-in-a-jar in the Histology lab (true story).  But I can also understand why Mom turned them down, too.   So far as I can tell, it's still a clean, orderly, almost nurturing place.  But the ghosts of the nuns who built and poured their devotion into it become more transparent with every contractor who slings an ID-card lanyard around their neck.

Fast-forward a generation--meaning me--and press the "zemblanity" button, and there's today's news about IBM selling its semiconductor business to GlobalFoundries.  It's certainly not unprecedented, given IBM's sell-off of its PC/laptop business to Lenovo a few years back and some of its server business earlier this year.  Except that this isn't your typical offshoring:

GlobalFoundries will take over IBM manufacturing facilities in New York and Vermont, and the company "plans to provide employment opportunities for substantially all IBM employees at the two facilities who are part of the transferred businesses, except for a team of semiconductor server group employees who will remain with IBM."

Thus, at least in the near term, GlobalFoundries will employ IBM expats at IBM facilities to make a profit at what, for IBM, was a money-pit.  And IBM's taking a sesqui-billion-dollar hit on the deal besides.   Slow-clap for IBM management there.  (Sarcasm again, btw.)

Granted, I haven't seen the inside of the Blue Zoo since Lou Gerstner was yanking the platinum ripcord on his golden parachute.  And, even then, being "contractor scum" insulated me from the insane (unpaid) overtime and pension-jigging and attrition-by-early-retirement and other assorted idiocies inflicted by the biscuit-salesman.  But my frustration really boils down to one similar to Mom's.  Namely, that the definition of "core competence" has become dangerously strict.

Now, I'm certainly not arguing in favour of vertical monopolies.  The fact that Monsanto is allowed to GMO a seed specifically optimised for the petro-chemical atrocities they market frankly blows my mind.  As Bruce Sterling put it, "Teddy Roosevelt would jump down off Mount Rushmore and kick our ass from hell to breakfast for tolerating a situation like this."  And he's absolutely right--even when he was talking about software monopolies.

Maybe I've just been out of the server space for too long.  For all I know, pushing mission-critical data off-site to cloud servers doesn't give CIOs the willies it would have given them a decade ago.  Maybe Microsoft has finally earned enough enterprise-computing street-cred to muscle out the Big Iron in the server-room.

But I do know that outsourcing always entails friction and a certain amount of bridge-burning.  In the case of Mom's ex-employer, it's orders of magnitude easier and less expensive to retrain (or, if necessary, fire) an under-performing employee than it is to cancel a contract and find a replacement when work isn't up to snuff.  When you're a technology company that's weathered two decades of commoditisation in both hardware and software by optimising one for the other, throwing away that balance makes no strategic sense to me.

When I look at the list of things IBM flags as higher-margin (cloud, data and analytics, security, social and mobile), there is not one of them that I would flag as being "owned" by Big Blue.  (Cloud?  Amazon.  Data?  Oracle, with Microsoft hot on their heels.  Analytics?  Everybody's a "big data" guru these days.  Security?  Nope.  Social?  Please...who isn't gunning for Facebook?  Mobile?  Are we seriously expecting an IBM phone?)

I owe IBM credit for making me a decent technical writer and for teaching me basic white-collar survival skills.  Oh, and for disabusing me of the notion that working inside the belly of the leviathan is more secure than working outside it.  But apart from my comrades and the cafeteria/coffee-shop/cleaning ladies, there's no love lost for the big blue behemoth on my end.

Yet it galls me to see a company that lionised its R&D department (and the patent lawyers who filed every brain-wave thereof) hitching their wagons to other people's horses.  Or, perhaps more aptly,  jumping onto other bandwagons.  Because bandwagon-passengers forfeit their right to drive, right?

Friday, October 17, 2014

Frivolous Friday, 2014.10.17: Feline Intelligence

The doors and windows were tightly shut, and the cracks of the window frames stuffed with cloth, to keep out the cold.  But Black Susan, the cat, came and went as she pleased, day and night, through the swinging door of the cat-hole in the bottom of the front door.  She always went very quickly so the door would not catch her tail when it fell shut behind her.

One night when Pa was greasing the traps he watched Black Susan come in, and he said: "There was once a man who had two cats, a big cat and a little cat."

Laura and Mary ran to lean on his knees and hear the rest.

"He had two cats," Pa repeated, "a big cat and a little cat.  So he made a big cat-hole in his door for the big cat.  And then he made a little cat-hole for the little cat."

There Pa stopped.

"But why couldn't the little cat--" Mary began.

"Because the big cat wouldn't let it," Laura interrupted.

"Laura, that is very rude.  You must never interrupt," said Pa.

"But I see," he said, "that either one of you has more sense than the man who cut the two cat-holes in his door."

- Laura Ingalls Wilder, Little House in the Big Woods

As Boing Boing reminded me today, the reputed inventor of the cat-door was none other than Sir Isaac Newton.  But the surprising part was that he, as legend has it anyway, was guilty of the dual-egress: One hole for the cat and the other for her kittens.

I suppose there are a number of ways that it could be true.  Maybe Laura was on the right track.  After all, I could totally see one of our cats blocking the door to effectively shut out the other.  (I'm looking at you, Rollie!)  Or if the step on the original full-size door was too high for a kitten, one cut lower to the ground would make a workable--if ugly--hack.

Then, too, we all probably know someone who demonstrates the stereotypical inverse relationship between brilliance and common sense.  And let's face it, the man who invented a new branch of mathematics more or less from scratch (true story) also made the mistake of reinvesting his profits from the South Sea Bubble's boom back into its bust, and lost his shorts.

Alas, the, ahem, cromulence of the tale falls short:  Both Newton's claim to the invention and his subsequent misuse thereof have been roundly debunked.

Innovation is, of course, driven by many factors other than genius.  We owe the discovery of penicillin, for instance, to straight-up carelessness.  Or we could subscribe to George Bernard Shaw's "The reasonable man adapts himself to the world; the unreasonable one persists in trying to adapt the world to himself.  Therefore all progress depends on the unreasonable man."  Or, when we find a door cut into a door for the convenience of felines, it's easy to suspect that the mother of invention is occasionally whiskered.

Now, if Sir Tim Berners-Lee had a cat around when he was inventing the HTTP protocol and thus the World Wide Web (which currently exists as a platform for cat photos/videos), I think we would be justified in our suspicions.   

Wednesday, October 15, 2014

Mirror, mirror

Normally, I'm skeptical of the argument that government should be run like a business.   First and foremost, businesses are intended to make a profit, and anyone who's ever walked past the Chapters "History" section should know that the profit motive has never been compatible with good government.  (I mean, does anyone seriously want tax collectors working on commission like they did in the Bad Old Days?)

Nevertheless, management principles remain largely the same, regardless of whether the proverbial cats being herded are public- or private-sector.

But let's back up a bit for the benefit of some of my Gentle Readers who reside south of the U.S.-Canadian border.   Because Canadian party politics tends to follow a more European model than an American one.  In the U.S., of course, it's possible that the President or the Governor of a State may belong to the Coke party while both houses of the Legislative branch are predominantly Pepsi.  In Ottawa and the Provincial seats, however, the Prime Minister and the Premiers are the heads of the winning parties in the national or provincial jurisdiction, respectively.

Certainly, the arrangement spares the Canadian electorate the unedifying Legislative vs. Executive spit-balling which the American public has come to expect from its "leaders."  But it does have the unfortunate side effect that the party in power has a worse-than-usual tendency to act as if it "owns" the government.  That extends to the executive ("deputy minister") jobs that are filled by appointment.

Sure, that sort of thing happens in the U.S. too.  The difference is that the holder of such a job in the U.S. is expected to find her/his own golden parachute, often in the form of a cushy job on Wall Street, K Street, with a (cough!) "think"-tank, or as a senior law partner.  Not in these parts, where the payout is straight cash when another party wins the latest election (and wants that earmarked paycheque to reward one of its own supporters...who wasn't elected this time).

Naked cronyism certainly can't help morale in the trenches, even at the best of times.  Certainly not at a time of downsizing (via attrition and layoffs) in the public service.  But that's not the only, or even the worst, blight on the profession.  A culture of sycophancy, not surprisingly, is another.  (Canada's notoriously muzzled scientists are shining exceptions.  Ditto the Parliamentary Budget Office.)

And that boils down to one of the Deadliest of Deadly Sins in Management--namely, the sin of hiring those like yourself.  (Or, far worse, people who could almost be you...if only they had your talent, charisma, work ethic, savvy, or whatever you think is your special sauce.)

Contrast that with Barbara Corcoran's delicious (if slightly NSFW) story of how she found her business partner.  (Note:  Scroll to the bottom of the page and click the "PEOPLE: Expanders or Containers" one.)  The salient point is that no one has the personality traits optimised for everything that needs to be done.  (You know, Tony Stark and Pepper Potts.  That sort of thing.)  And, too, the bigger the organisation, the bigger that "everything" becomes, amirite?

You hiring a lesser you (who in turn has the power to hire lesser thems) is no different, really, from holding a mirror to a mirror:  Each reflection of you is smaller.  And while it's a fun exercise...once...when you're six...it's no game for adults in the real world.  Much less managers or leaders.

Monday, October 13, 2014

Canadian Thanksgiving, 2014

Like the vast majority in the developed world, I have much to be thankful for.  Today, the brightest--though not the most valuable--coin in 2014's treasure-chest is a day warm and bright enough to leave the kitchen windows ajar while I work on dinner.  That's something you can't say every year--not even here, where the ocean moderates both the steamy highs and the bitter lows.

Bonne action de grâce, tout le monde!

Friday, October 10, 2014

Frivolous Friday, 2014.10.10: Not there yet

Author Erika Hall* makes her living observing people--mainly people using design, often in the form of websites and mobile applications.  So it should not come as a shock that her insights are pointed and, more often than not, wryly funny.

Case in point:  This tweet from a few minutes back (relative to the time of writing):


She's the professional people-watcher, obviously, but I can't help channelling Fargo's Margie Gunderson:  "Oh, I don't know if I agree 100% with your detective work there..."

After all:
  • Your smartphone wouldn't argue about splitting the bill for the appetiser platter because there were an odd number of cheese sticks and you technically had more than half.
  • Moreover, you can count on your smartphone's calculator app. to be able to compute the tip.  So unless you usually dine out with a math nerd, it's not apples-to-apples there, either.
  • And speaking of Apples...Siri knows that you're not really a gluten-free-paleo-vegan...and you're going to order dessert plus an extra glass of wine...because nobody's watching.  But Siri isn't going to tell anyone.  Siri doesn't judge.
  • Yeah, sure, you're still gonna totally Instagram that meal.  But there's enough of a stigma about solo dining that you wouldn't be so uncool as to Facebook a selfie, now would you?
  • Retina HD vs. standard "tea-light bath in the bat-cave" ambience of most restaurants after dark.  'Nuff said.

But maybe, someday, with the advances of AI or holographic Skype or some bad seed love-child of Google Glass and Google Hangouts, the practical differences between a dining companion with a battery and a dining companion with a pulse will actually be nil.  When that happens, I trust that there will also be an app. to automatically cancel your ZipCar and Uber you home when you try to drunk-text your ex.

- - - - -

* My thanks to Ms. Hall for graciously allowing a random person from Twitter to borrow her work for nefarious purposes.  She's a good sport.

Wednesday, October 8, 2014

The other side of the fence

I humour myself by thinking that my English degree and my years on the public speaking team at University have paid off.  Specifically, in the trust of my employers and clients that I can communicate like a normal person. 

Sometimes, however, my fond notions crash into the reality of other people's perceptions.  Such as when a client, while mentioning another web development studio he'd been talking to, said, "They speak the same language you do."

Ouch.

Yet, that can cut both ways.  Such as Monday, when I was attempting to get a fee-quote from a very, very bright information security guy.  I haven't had to be responsible for hacker-proofing anyone's servers in 9+ years, which would make me more than dangerous if I tried to do that now.

After a few crossed wires, we finally arrived at the understanding that I would arrange to set up the server, and he would find any security holes in the setup--and fix them.  Had I bothered to rummage through my info. security vocabulary, the phrase "harden a Linux server" would have come up earlier in the exchange and saved us a bunch of time.

I told him that that was exactly the info. I was looking for, thanked him for his time and patience, and promised I would keep that phrase in my back pocket to make things easier next time.   Because that's how polite, self-effacing Canadians are supposed to roll.  But I also sassed, "You infosec. kids and your crazy moon-language."  Because, technically, I'm still a wisenheimer American.  ;~P

I also owe Bright InfoSec Guy my thanks for the reminder that the most important ingredient of communication is not eloquence, but humility.   Specifically, the humility required to set aside your own vocabulary in favour of the other person's.  Because even the language of the I/T world has different dialects.  And living in the Chiac neighbourhood of l'Acadie, I should understand that experience better than most.

Monday, October 6, 2014

The fourth dimension of mobile app. design

My web development background is more in the unsexy database and backend server logic, rather than what you actually see on your cellphone or in your web browser.   That needs to change for a couple of projects I have on deck.  Now, the web is full of "Getting Started" tutorials for various technologies--and thank goodness for that.  But what I call "war stories" are somewhat rarer.   War stories bring you back to the real world when you're feeling that chest-thumping "Stand back, good people--I'm a programmer!" sense of invincibility after hacking together a proof-of-concept.

So when I was pointed to Luke Wroblewski's slim (78 page) book, Mobile & Multi-Device Design: Lessons Learned, I snagged it.  I was aware that Mr. Wroblewski had already written Mobile First, so I kind of knew what to expect.  Even so, some things were not only eye-opening, but almost eye-popping.  (The user interface overhaul required to support a user upgrading from iOS6 to iOS7 springs to mind.)  If you're not a software designer or developer and you want to understand why mobile app. development is so expensive--your answer's in this book.

And that answer is multi-dimensional, not unlike the real world where such applications have to work.

The first two dimensions that factor into mobile applications (and I include web pages that could be used on a mobile phone in the definition of "mobile applications") are the height and width of the screen.  Optimising for a screen the size of a credit card means that your website or app. looks chunky on a tablet...not to mention absolutely ridiculous on a large TV screen.   And, unsurprisingly, optimising for a full-size screen leads to something completely useless on a small smartphone.  We've all been there, right?
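(A minimal sketch of the usual guesswork, in browser TypeScript with made-up breakpoints: pick a layout from the viewport width, re-check when it changes, and hope the guess holds:)

    type Layout = 'phone' | 'tablet' | 'desktop';

    // Size-based guess: the only "dimension" most responsive sites actually consult.
    function guessLayout(): Layout {
      if (window.matchMedia('(max-width: 600px)').matches) return 'phone';
      if (window.matchMedia('(max-width: 1024px)').matches) return 'tablet';
      return 'desktop';
    }

    // Re-guess whenever the window changes; CSS keyed to [data-layout] does the rest.
    const applyLayout = () => { document.body.dataset.layout = guessLayout(); };
    window.addEventListener('resize', applyLayout);
    applyLayout();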

The third dimension has to do with getting input from the user of your mobile app. or web application, regardless of screen size (a.k.a. "responsive websites").  And here there be dragons.  Unfortunately for programmers (and the people who pay them), there's no reliable way for software to infer all capabilities of the device on which it's installed.  Screen size is typically used to make educated guesses, but there are limits.  The line between Lilliputian tablets and Brobdingnagian cellphones narrows by the nanosecond--or so it seems.  And even if the device in question is unquestionably a tablet, it might have a keyboard plugged in (e.g. Microsoft Surface) or it might not.  If there is a physical keyboard present, then displaying the on-screen touch keyboard whenever the user taps a data entry field wastes half the screen real estate...and annoys the user.
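(Another minimal sketch, same caveats as above: about the best the software can do is ask the browser for hints--coarse pointer?  hover support?--and accept that a Surface with its keyboard attached may still leave it guessing:)

    // Educated guesses only; none of these media queries proves a physical keyboard exists.
    function describeInput(): string {
      const coarsePointer = window.matchMedia('(pointer: coarse)').matches;  // finger, probably
      const canHover = window.matchMedia('(hover: hover)').matches;          // mouse/trackpad, probably
      if (coarsePointer && !canHover) return 'touch-first: plan around the on-screen keyboard';
      if (!coarsePointer && canHover) return 'desktop-ish: a physical keyboard is likely';
      return 'hybrid or unknown: assume nothing about how text will be entered';
    }

    console.log(describeInput());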

There's much more to it than just the examples above, of course.  But you get the idea.

Last Thursday night, I realised that there might be one more dimension to the question of responsive design.  That dimension is the context of time.  Last Thursday evening was significant in that my football team, the Green Bay Packers, played their divisional rivals the Minnesota Vikings.  Just now I can't justify the expense of the internet-syndicated broadcasts, so I follow the play-by-play on the packers.com website, typically while working on other things.

Here in the Atlantic Time Zone, night games (such as last Thursday's) start at around 9:30pm.  Dennis is an early riser, so keeping him awake with the tippity-tapping of my keyboard while bathing the room in the glow of the monitor would have been superlatively rude.  Technology to the rescue!  A tiny Nexus 7's touchscreen, plus the fact that I can dim it and angle the glow away from Dennis, eliminates that problem.  (That, plus insomnia, is the reason most of my reading & research are done after bedtime.)

Problem was, whomever the Green Bay Packers hired to redesign their website to adapt to different screen sizes eliminated the widget that shows the score, who has the ball, the down, time remaining in the quarter, etc. when the site is viewed on my little tablet.  So I switched over to the NFL's webpage for some of that info.  (This being the only game on at the time, it was only slightly less annoying to periodically refresh the screen for updates during the third and fourth quarters.)

That the game status widget would not be front-and-center during game-time really surprised me.  Because the statistical likelihood of the user visiting the website during those critical three and a half hours for something other than checking the score is practically nil.

Yes, I realise that there may well be a business decision here, and that the team doesn't want to cannibalise revenues from TV or internet streaming.  Also, there is a Green Bay Packers app. available for Android.  (No, I won't install it--it asked for waaaay too many permissions.  You're my football team forever, boys, but you really don't need to know all of those things to show me a score or sell me a cheese-hat, kthxbi.)  That being said, the Packers are arguably the most fan-oriented team in the NFL, so even the cynic in me is still surprised.

But for those of us creating applications to be used on small screens, the ruthless prioritisation necessary might still have to account for the fact that what brings the user to the site/app. may not be a constant.

Which is most definitely food for thought as I make the mind-shifts necessary to work in four dimensions.  Though painting my office door to look like the Tardis would probably be overkill, yes? 

Friday, October 3, 2014

Frivolous Friday, 2014.10.03: The Devil is a software designer

During a phone call today, a client pointed me at a website geared toward salespeople.  We were discussing acqui-hiring, and in fact the company in question had recently been acquired by Salesforce.com.  I won't mention names (much less URLs), but the website was your standard love-child of Marketing "Buzzword Bingo" and Legalese.  What little actual, y'know, information I could glean left me with the impression that the product being sold was no more than a fancy way for salespeople to shuffle email and contacts between Outlook folders.  All to avoid picking up the phone and calling people who might bruise their egos by rejecting the crap product they shouldn't be selling in the first place.

That's when it hit me that the market for that kind of product could be immense.  (If I had half the opportunism to which 23 of my chromosomes entitle me, I'd have been all over that market since, oh, the late 1980s or so.)  And, in this case, being acquired by Salesforce.com is a massive coup--the business equivalent of the mountebank from the traveling medicine show being elevated to a peerage.  Cynically, I can't help but tip my hat to the folks who have more or less tapped a limitless supply of energy third only to human stupidity and the impulse to one-up the neighbours.  Well played, nameless company acquired by Salesforce.com.

But the epiphany ran deeper than even that.  There's a lot--and I mean a LOT--of suckage out there in software.  Sometimes the suckage is purely naive and unintentional.  But too often there is too much money on the line and too many high-level meetings have been held for the possibility of accident.   Frankly, Verbal Kint had it all wrong:  The biggest trick the Devil ever pulled was convincing the world that software would enrich our lives. Cases in point: 
  • iTunes - Seriously, until it stops making me Google how to unscramble album covers on my music and podcasts, would the Apple fanbois and fangirlz kindly shut up about the transcendent superiority of design? kthxbi
  • Facebook - 'nuff said, amirite?
  • PowerPoint -  Kinda makes you nostalgic for the days when software was something that The Suits expected their (far more practical) assistants to know.
  • Comment sections - Fifteen-plus years of the likes of Slashdot and LiveJournal, and the software can't automatically hell-ban any user who refuses to learn the difference between "there," "their" and "they're"?  (Ditto "you're" and "your".)  Or won't stop using logical fallacies like slippery slope and ad hominem arguments?  Or wouldn't recognise real fascism if it shoved a swastika up their nose?  The internet hive-mind would be a lot smarter otherwise.  (Better yet, if the dating sites would do this kind of screening, we could breed the ungrammatical & illogical out of existence.  Get on that, Match.com!)
  • Adobe Flash plug-in - Oh, your browser just crashed?  Again?  I don't think we need to look any further for the smoking gun.
  • Mindless games - Idle hands are the Devil's own instruments, so the saying goes.  But the makers of Farmville, Angry Birds, etc. are clearly his brokers.  And when a loud & loutish percentage of gamers can't respond to criticism without astroturfing, doxxing, and physically threatening their critics, it doesn't reflect well on the community as a whole.
Doubtless, the intent is evil--namely, wasting time you could spend bettering the world.  Such as ranting about software in geeky blog posts, don'cha'know?  ;~P

Wednesday, October 1, 2014

Armchair organisational psychology and the making of monsters

Most of my programming work these days is done on a Debian laptop or an Ubuntu desktop, so I actually acquired a copy of Windows 8.1 the "old-fashioned" way--by which I mean pre-installed on a new computer.   Since my first PC ran DOS 3.3 applications off two 5.25" floppies in the early 90s, I've become pretty adaptable when it comes to user interfaces.  That doesn't mean that I don't hate the touch-optimised tiles.  When you make your living being the person who's "good with computers," it's kind of insulting to know that the "least common denominator" user Microsoft's designers obviously had in mind was a toddler who discovered the pretty bottles in the liquor cabinet.

In the Small Mercies Department, Microsoft bowed to backlash and brought back the "Start" button with the 8.1 version of Windows.  (Historical aside:  Remember when Windows 95 debuted and the Start button was the Worst. Thing. Ever.?  Yeah, me too.  Humans are so weird sometimes.)

So while yesterday's buzz mostly focused on Microsoft's numbering juke (intended to put perceived distance between Windows 8 and the newest new thing), the real news is how much the blow-back was apparently taken to heart.  At least for full-size monitors, Windows 10 will be 7-ish enough to appease corporate users.  It's a critical demographic, given how businesses would rather use the (dangerously) unsupported Windows XP than pay employees to thrash their way back to baseline productivity in Windows 8. 

True, whenever the interface for a workaday piece of software is radically changed, it rarely pleases anyone.  No news-flash there--if it ain't broke, don't fix it, amirite?  (Although it's always fun watching the religious war between Team Flat Design and Team Skeuomorphic Design whenever Apple toggles between the two.)  And while people expect more for less all the time, too many features decrease the usability (and thus the value) of pretty much anything, including software.

The reasons companies like Microsoft and Facebook jerk users around are probably as numerous as the reasons we keep coming back to them anyway.   But, apart from sheer sunk costs, it's important to keep in mind that software companies labour under one disadvantage not shared with most other industries.  And that is that they don't have the luxury of contracting out their software development.  After all, who'd trust any company that outsources its core competency?

Granted, in a Microsoft-sized company, you can shuffle people between products.  (And my gut feeling is that such cross-pollination is worth the disruption in nearly all cases.)  But the fact is that keeping people on staff through the slower periods is cheaper than boom-and-bust hiring and firing.  People always need time to "onboard" (in HR parlance).  Time to unsnarl the spaghetti code-base.  Time to trip over the land-mines in the home-grown tools.  Time to figure out which brain-sucking meetings can be avoided without political repercussions.  In short, time to absorb and internalise the company's own special dysfunctions. 

Most importantly, adding more people to an already late project makes it even later.

All of which makes a solid organisational case for staying fully-staffed for as long as it makes strategic sense.  But I have to wonder whether it also incentivises people to look busy--and thus to push pixels and twiddle bits that were fine just the way they were, spank-you-very-much.  It's easy to blame the pushers and the twiddlers themselves, but the fact is that any given change has to be approved by multiple hierarchies of someones before it goes on public view.  Because managers and executives worry about looking busy too.  Worse, some of them even worry about how they look to all the cool, edgy kids.

And, next thing you know, the monster escapes and kidnaps Fay Wray and tries to play volleyball with bi-planes from atop the Empire State Building.  (Which probably doesn't get a second look from any self-respecting native New Yorker--even in 1933 when there were no cellphones.  But, hey, for us hayseeds out here in the sticks...)

"Creature Feature," indeed.