Monday, February 22, 2016

When Empathy > Technology

Dennis's shirts hang on the left side of our shared closet, mine on the right.  The closet doors are the typical sliding ones, so that when one side is open, the other side is occluded by the door.  When the doors are open to my side, light from the window is largely eclipsed by anyone standing in front of it, leaving the job of illumination to anything coming from the left.  On the opposite side, it's the opposite story.

Thus, Dennis hangs up his shirts so that they face the right; I hang up mine so they face the left.  In a logical Universe, we would respect this light-optimised orientation when hanging up each other's shirts.  But even a dual-programmer household falls well short of the Spock/Sherlock ideal.  Alack.

The mayhem and havoc wreaked by misaligned clothing can be quantified in terms of extra fractions of a second required to select the t-shirt whose snark and/or geek culture in-joke best matches our mood-of-the-moment.  A #firstworldproblem if there ever was one, in other words.  But it illustrates the power of personal norms to trump logic (and the instinct to use it).  And, in a way, it makes me despair for human progress as driven by first world technology...or even most first world technologists (in whose number I count myself, btw).

Silicon Valley has been panned by folks as diverse as Valleywag and Startup L. Jackson for burning so many calories turning paper millionaires into paper billionaires while infantilising the twenty-something dudebros who are the face of its culture/ethos.  The first is just what shareholder capitalism is optimised to do.  (The second is just plain pathetic.)  Neither of them can be considered truly "disruptive"--at least not in the net positive sense their apologists would have you believe.  Sure, Silicon Valley is taking bites out of the taxi and hotel industries by socialising the costs of industries formerly held more accountable via regulation.  But, hey, you can't make a creative destruction omelette without breaking a few social contracts, amirite?

It's not even a private sector ailment.  NGOs can (and do) squander resources applying first world thinking outside the first world.  Case in point:  The first attempts to convince Cambodian families to add a lotus-shaped chunk of iron to their cook-pots to reduce/eliminate anaemia fell short.  Follow-up visits discovered the iron being used for other purposes, notably doorstops.  But casting the iron in the shape of a fish considered "lucky" by locals changed the game.  Anaemia has been eliminated in 43% of trial subjects, and a sustainable business model was spawned in the process. 

Moral of the story:  Sometimes it's the users, not the technologies, that have to be "hacked."  The catch is that those of us who are paid to be problem solvers have the instinct to hack technology first.  Don't get me wrong--I'm a huge proponent of usability.  The bigger a technology's side effects, the more incumbent it is upon its designers to make it as impossible as possible to misuse.  I get it.

But the slickest, most bulletproof interface in the 'verse means bupkis if it A.) Isn't solving a worthwhile problem, and/or B.) Is too expensive (in terms of cost, infrastructure support, externalised costs, etc.) for those who would most benefit from it to use.

So, to recap, to successfully "disrupt" anything, the designers/developers need to:
  1. Allow people to benefit their lives/families/communities in a way that was previously impossible
  2. Allow them to do it in a way that doesn't require huge (for them!) investments or later remediation
  3. Ensure that misuse is darned near impossible without anything beyond rudimentary training
Put together, that's a very tall order.  To pull all that off takes great wads of empathy--which starts with the proverbial exercise of putting oneself in someone else's head-space.  (But we geeks are supposed to be the "smart" people, right?)  Alas, empathy (or even self-awareness) is not something I expect to find thick on the ground in a place where more than one techbro has very publicly hated on the problems his very own Galt-couture has created.  If that's a representative sample of the "thought leadership" in the culture of disruption, that culture is bankrupt.  Part of me thinks that, in this case, this latest tech. bubble can't burst soon enough.  Except that it won't move the needle on the culture.  At best, that offers the cold comfort that it won't be so lionised by press and pundit.  With February winding down, I've had enough cold for one winter, thanks.

Thursday, February 11, 2016

Exceptional madness

Doubtless, my Gentle Reader has heard some form of the adage, "The definition of insanity is doing the same thing over and over while expecting different results."  It's not a bad rule for nearly all situations, actually.

Problem is, it doesn't necessarily work that way in Software Development, particularly during debugging.

See, when you're trying to reproduce a problem, you want to see the same results after doing the same thing over and over.  Anything else spells extra time and resources.  Second-to-worst case scenario is you'll end up essentially creating a VirtualBox-type simulacrum of the production environment so you can replicate the live conditions as closely as possible.  Maybe you'll end up dorking with the system date, or writing custom scripts to roll back test data to a specific point in time...repeatedly.  For sure, you'll be spelunking in the database, likely scribbling down ID numbers and combing through matrices of data.
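That "roll back test data to a specific point in time...repeatedly" chore almost always ends up scripted, because you'll be doing it a dozen times before lunch.  A minimal sketch of the idea (the file and table names are purely hypothetical), using Python's sqlite3 and a simple file-copy snapshot:

```python
import sqlite3
import shutil

TEST_DB = "test_copy.db"    # hypothetical working copy of the test database
SNAPSHOT = "known_good.db"  # state captured to match the bug report's conditions

def take_snapshot():
    """Freeze the current test data so every repro attempt starts identically."""
    shutil.copyfile(TEST_DB, SNAPSHOT)

def reset_to_snapshot():
    """Roll the working database back to the known-good state."""
    shutil.copyfile(SNAPSHOT, TEST_DB)

def attempt_repro(replay_steps):
    """Typical repro loop: reset, replay the reported steps, then go spelunking."""
    reset_to_snapshot()
    conn = sqlite3.connect(TEST_DB)
    try:
        replay_steps(conn)
        conn.commit()
    finally:
        conn.close()
```

Same thing, same way, every time--which, per the adage, is exactly the point.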

Worst-case scenario, though, is that you're never able to reproduce the error, EVER.  Nope, no matter how faithfully you re-create the conditions from the bug report, the gremlin never reappears.  That, mes amis, is The Short Road to Crazytown.

So if you've ever wondered why some programmers (and other I/T folks) have a slightly skewed perspective on the world, this is one of the reasons.  For us, madness lies in the exception, not the rule.  In more than one way.




Wednesday, February 3, 2016

A.B.H.

As with many rants, the thing at which I'm yelling is not actually the thing that cheesed me off in the first place.  For that reason, I'm keeping names out of this.

The Moncton tech. community, like any community, has what's known as "super-hubs" in its network.  These are the folks who pretty much know everybody (sometimes even before being introduced), and whose job--officially or not--is to connect people.  It would be difficult to overstate the time/resources/opportunities that would be wasted if not for these folks, and I look for ways in which I can repay what they've done for me over the last few years.

But with any community resource, its value is cheapened when any-old-body assumes that they can short-cut their way to clients, employees, etc. by basically asking a super-hub to spam her/his network.

As these requests go, today's email was comparatively defensible, and I'm honestly quite happy to note how tight-knit the Moncton and Fredericton tech. startup community is.  No silo-ing or turf wars in sight, and thank the FSM for that.

Alas, something that's been simmering for awhile boiled over.

For anyone who doesn't know me, I'm a freelancer building a fairly niche business and, frankly, I'm still in over my head on where I fit into the local economy...if indeed I fit at all.  And so I've been to a whack of after-hours mixers and lunch-n-learns and business-over-breakfast thingies lately.  And I'm afraid that in the context of such events, I've too often heard the lament that business "can't find" the programming talent it needs in Moncton.

Yet, strangely, the only people I see at the user group meetings are other programmers.  Ditto the classes sponsored by the Cybersocial.  Same deal at the MPL Makerspace/FabLab.  It's like all these non-technical managers/executives are afraid of programmers in packs.  Like there will be a West Side Story-esque suits-vs.-geeks rumble.  Or something.

There are a few notable exceptions.  I know of at least one Moncton tech. company that practically works hand-in-glove with a private college.  Frankly, such pipelining doesn't skeeve me out.  Far better that than the business that expects the government to provide a corporation-coddling tax climate and educated college grads.  And--bless his heart--Dan Martell's advice to would-be tech. entrepreneurs was to (gasp!) figure out where geeks hang out and go look for a technical co-founder in those places.  (Yes, I know I said I wasn't going to mention names, but credit's due where credit's due, yo.)

Clearly, some suits get it.  It's the ones who send out the emails that contain phrases like "We have an immediate need for _n_ developers..." that burn my bacon.

No.

You had that need as soon as you knew that you were opening a branch office.  Or as soon as you put together the project proposal you knew your current staff couldn't support.  Or as soon as you started lunching with VCs.  Basically, as soon as you were fantasising about how you were going to spend your share of the new profits, you, my friend, had that need.  If you now have an "immediate" need, it's loonies to Timbits that you chose to do other things until you absolutely couldn't ignore the most fundamental job a business has.  You're basically the kid who waits until 8:30 the night before the project is due to tell Mom that you have to go to CraftWorld for the supplies.

And so I frankly don't want to hear about the dearth of programmers that you could be standing knee-deep in if you'd bother to get to know them.  Best of all, the programmers who show up to places like Makerspaces and user groups are the motivated ones.  Sure, I've known plenty of very bright developers who prefer to pick up a new technology/framework by reading code and tinkering on their own.  Nothing wrong with that.  But the advantage of mixing with programmers among their own tribe is that you meet the ones who aren't embarrassed to ask questions in front of their peers. 

Look.  Most suits are inherently salespeople.  So I'm going to assume that part of their self-education involved the film Glengarry Glen Ross.  As such, they'll immediately recognise that I'm referring to that scene:  Alec Baldwin's profanity-laden verbal beat-down that includes the acronym "ABC," for "Always Be Closing."  That maps nicely--and without all the F-bombs--to "ABH" for "Always Be Hiring."

Hiring the best of the best means having the pick of the market.  Which means that you have to know the market in the first place.  Oh, you had your heart set on that 10X full-stack unicorn, but now s/he can't be poached from the once-in-lifetime start-up opportunity?  Womp-womp on you for not seeing if there were any more at home like her/him.  Maybe next time you'll create a job around a person you know will take you to the next level...before it's time to scramble up that hockey-stick curve.  Or, better yet, you'll actually have met all your potential hires face-to-face long before their phone screen.

Start-ups have the mantra, "Get out of the building," which means validating their ideas before they build.  "Get out of the building" also applies to hiring.  Because you can't say that your employees are your most valuable resource if you expect them to magically drop into your org. chart with the right skills at the right time.

Always.  Be.  Hiring.

And, if we happen to meet at any number of these mixers, bee-bop on over and say "Hi."  Now that I have this rant off my chest, I probably won't be too bitey.  Unless, of course, you ask me if I know anyone who's looking for a developer job.  :~/

Monday, January 18, 2016

Trading the creeper for the stalker

I'm on the "admin." email list for a volunteer group that organises monthly tech. talks.  I'm not the volunteer who orders the pizza, so normally I don't fuss too much over how many registrations have landed.  But I recruited this month's speaker, and wanted to give him a rough head-count (especially since it was higher than usual, which is always good to report).

One downside to flipping through all those EventBrite notifications, however, was the huuuuuuge and depressing preponderance of "anonymous" email accounts used to register for the talk.  Clearly, someone--well, pretty much everyone in our almost-big-enough-to-be-statistically-valid sample--has had their workaday email address trammeled by someone before.  And (more to the point) that's not news or even remarkable.  It's merely evolution in action, really.  In the sense that arms races can be considered "evolutionary," anyway.

Until now, I'd failed to see the extra irony in that response.  GMail and Hotmail (and the odd Yahoo) accounts are, of course, a prophylactic against having one's attention-stream repeatedly crashed by spammers and scammers.  Not unlike how single women will wear fake wedding/engagement rings as a prophylactic against being creeped-on.  (Needless to say, its deterrent effect is never 100%, but on balance it's worth the clunky el-cheapo jewelry.  Pro tip, ladies:  A layer of clear nail polish over the metal of a dime-store ring will extend its lifespan by months.  Trust me on this.)

So, in an attempt to preserve the online equivalent of personal space, people choose to trade their privacy and a certain amount of attention-span.  Because of course Google, Microsoft, and Yahoo are monetising both.  Behind the scenes, naturally.  Which apparently makes all the difference.  It's the difference between the creeper at the pub who won't take "go away" for an answer and the stalker who rifles through your garbage and eavesdrops at your window.  But the latter is at least discreet about it--incredibly polite, in fact--and (best of all) you're not its only target.  So it's not even personal, which almost makes it not-creepy, hey?

Please understand that I'm actually not slagging Google or Microsoft or Yahoo here.  Blocking all the spam and worse spawned in the underbelly of the internet is itself an arms-race.  Just handling the sheer amount of illicit email traffic chews up resources that your average I/T department can't afford.  Bayesian filters require constant "training" and updates to the code to keep up with the latest scams, viruses, and desperately incompetent marketing hacks.  Only huge corporations have the resources to A.) Scale up to the challenges, and B.) Pay for it all with targeted advertising revenue.  "Targeted," naturally, implies sniffing email for keywords and (more importantly) patterns and embedding compatible ads in the user experience.
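To give a sense of what that "constant training" involves, here's the core of a Bayesian spam filter in miniature.  The word counts are made-up, and a real filter trains on millions of messages (and retrains as the scams mutate), but the mechanism--comparing how often words show up in known-spam vs. known-legitimate mail--is the same:

```python
import math
from collections import Counter

# Toy word counts from already-classified mail.  These numbers are
# invented for illustration; real filters learn them from huge corpora.
spam_counts = Counter({"free": 40, "winner": 30, "meeting": 2})
ham_counts  = Counter({"free": 5,  "winner": 1,  "meeting": 60})

def spam_probability(words):
    """Naive Bayes with Laplace smoothing: roughly P(spam | words),
    assuming a 50/50 prior and word independence (the 'naive' part)."""
    spam_total = sum(spam_counts.values())
    ham_total = sum(ham_counts.values())
    vocab = len(set(spam_counts) | set(ham_counts))
    log_spam = log_ham = 0.0
    for w in words:
        # The +1 ("smoothing") keeps a never-seen word from zeroing everything.
        log_spam += math.log((spam_counts[w] + 1) / (spam_total + vocab))
        log_ham += math.log((ham_counts[w] + 1) / (ham_total + vocab))
    p_spam, p_ham = math.exp(log_spam), math.exp(log_ham)
    return p_spam / (p_spam + p_ham)
```

Feed it ["free", "winner"] and the score lands near 1; feed it ["meeting"] and it lands near 0.  Now imagine keeping those counts fresh against an adversary who rewords the scam every week.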

Within the confines of the free market, we're left with an imperfect system.  In essence, we're allowing the stalkers to protect us from (most of) the creepers.  To imagine any other outcome is reading History backward without remembering that it is lived forward.  And also to forget that people people--perhaps as much as the corporate people of which Mitt Romney famously spoke--have zero conscience when it comes to externalising costs for things you can't actually touch.  Maybe even negative conscience, given some of the rationalisations I've heard.  Internet security, naturally, ranks high on that list.  Which is precisely what criminals and griefers are banking on.  And, as if we could forget, Marketing's sins are both legion and legendary--regardless of medium.

If it weren't for those inconvenient truths, I'd feel less futile in wishing for a do-over on email at this late date.  Namely, a do-over that doesn't require the ghetto-isation of personal email.  Because a tool that's so critical to modern life shouldn't fit so awkwardly into our flows, on the clock or off.  To say that I'm fussy about my tools (in software development as well as elsewhere) is a massive understatement.  You take care of your tools, and they'll take care of you.  I believe that.

Predictions of email's demise are over a decade old.  (Just like predictions of the demise of many things.  Particularly when made by people too busy penning tech. articles to read any Geoffrey Moore.)  But let's imagine that the street-corner nut-job with the doomsday sign is correct and that The End is, in fact, Near.  That would be our last, best hope for owning up to how much #FAIL is baked into the current system and keeping it out of the Next Big Thing, yes?  Effectively, that means it's time to (finally!) put a price on "free" email that reflects all its costs:  The internalised costs of our own context-switching as well as the externalised costs of subsidising crime, giving viruses a vector for spreading, etc.

When the internet first went mainstream, we were treated to starry-eyed predictions of democratisation and broadened horizons and geysers of previously untapped human potential.  To some extent that's happened...along with other things less laudable or savoury.  But there's no excuse for not learning the lessons of the past two decades, and far less excuse for perpetuating its sins.  I'm at an age where I don't have soaring hopes for the future--after that whole flying cars and Mars vacations thing didn't pan out and all.  But I'm all-in for a "not-creepy internet."  Can we get it right next time?

Wednesday, January 13, 2016

"Day Two"

Bedtime reading the last two nights has been Erin Kissane's The Elements of Content Strategy, which (on the surface, anyway) is sorta-kinda tangential to what I do for a living.  Simply substituting "data" for "content" doesn't always map in an apples-to-apples way, but one concept definitely does resonate for the career of an application programmer.

That concept is what Kissane calls "Day Two."  It's shorthand for the time after the system (typically a company website) is launched, approved, and everyone settles back in to their "regular" jobs.  In the case of any contractors or freelancers, it's the next gig (or looking for it).  But in the case of employees--at least some--the definition of the "regular" job might have shifted.

Problem is, those shifts are not always recognised, much less budgeted/scheduled.  (And if the consultants didn't hammer that point home during the planning phases, you probably don't want to hire them again.)  Bottom line is that those extra hours of researching, writing, photographing, proofreading, vetting (by the Engineering/Marketing/Legal/Whatever functions of your organisation), etc. do not magically appear out of nowhere.  (Nor, equally shockingly, does the time/money for training people when the launch crew moves on.)

Bigger problem is, no one is surprised to read about those realities.  Yet too often the surprise comes when (through some evil influence or misalignment of the stars, no doubt):
  • The company blog has been dormant for a year, plus some fool lost the Hootsuite account credentials, so the Twitter/LinkedIn/Facebook accounts are stale, too.
  • News releases are being used as a political football between Legal and Marketing...and ultimately published (late) as convoluted fluff that no one with a shred of respect for the written word reads.
  • Whitepapers cite the previous version of the product...and still have the old logo/branding--embarrassing!
  • Calls to Customer Service (not to mention social media hissy-fits) creep up because the online help doesn't deal with new products and/or features.
  • No one has any idea whether email campaigns are working or not, because who has time to set up A/B testing in MailChimp anyway?
  • (My personal favourite) I/T receives cranky calls from Management because "No one's updating the website."  (Don't laugh.  It's soooooo not funny.)
Clearly, that Content Management System was a huge waste, amirite?  We need to find a consultant to re-do it for us--the right way this time, darnitalready!  [insert uber-sarcastic eyeroll here]

Over in my more data-driven world, I'm usually, um, "lucky" enough to see somewhat less acute symptoms of the same disease.  But it still means time devoted to talking clients out of wasting their money.

Because the bottom line is that information that can't be captured as part of the normal course of doing business won't be backfilled later.  It just won't.  Anyone who thinks otherwise is delusional.  That's the bad news.  The good news is that you will--or should--be talked out of that by any halfway decent/competent application developer.  (Why?  Because we frankly don't want the fallout and bad karma that comes from letting you do that to yourself.  For ourselves personally and for our industry in general.)

Case in point:  I worked for a company that had to make a whack of outgoing long distance calls--to the point of bringing in part-time help (and the owner's kid) three times a week for eight hours a day.  Before my time, the company had had to fire someone who abused the system by making long-distance personal calls.  For an hour.  Five days a week.  I know for a fact that it bugged the heck out of the company Accountant, because he told me the story twice.

For all that five hours of lost productivity plus the long-distance charges add up over 52 weeks, it would have been sheer profligacy to force each employee to log all calls--either on paper or in even the most user-friendly app. any programmer could devise.  Even requiring the Accountant to scan through the month's dead-tree phone bill isn't a negligible cost (especially as the company grew from 20 to 40 people and acquired two other companies in the process).

But when the afore-mentioned Accountant bee-bops into my office to mention that, hey, did I know that our new phone service provider supplies us with a downloadable plain-text version of our phone bill and is that maybe something I could parse and dump into a database so he can generate reports from it...now, that's a different equation entirely.

Sure, there was the up-front cost of my time (setting up the database, creating a user interface to allow the Accountant to upload the e-bill, and of course setting up the reports).  But after that point, the system could scale up to any number of employees with no extra incremental time required for the Accountant to segment and sort and total the data any old way he needed it. Including, if necessary, importing it straight into his own software. 

That, friends and brethren, is pretty much the gold standard of business automation.  By which I specifically mean that data that would have been prohibitively expensive to manually log was instead mechanically collected (by our vendor) in an easily consumable format.  Better yet, the only business process that had to change was the Accountant having to remember to download the .CSV file and then upload it to our system.  Trust me, he did not complain.
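For the curious, the guts of that kind of system are almost embarrassingly simple.  A minimal sketch (the column names and layout are hypothetical--every carrier's e-bill is different, and the parsing is the part you'd adapt):

```python
import csv
import sqlite3

def load_phone_bill(csv_path, conn):
    """Parse the carrier's .CSV bill and dump it into a reporting table,
    so the Accountant can slice it any way he likes."""
    conn.execute("""CREATE TABLE IF NOT EXISTS calls (
        extension TEXT, called_number TEXT, minutes REAL, charge REAL)""")
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            conn.execute(
                "INSERT INTO calls VALUES (?, ?, ?, ?)",
                (row["Extension"], row["Number"],
                 float(row["Minutes"]), float(row["Charge"])))
    conn.commit()

def charges_by_extension(conn):
    """The kind of per-employee rollup that used to mean a highlighter
    and a dead-tree phone bill."""
    return dict(conn.execute(
        "SELECT extension, ROUND(SUM(charge), 2) FROM calls "
        "GROUP BY extension"))
```

The up-front cost is a day or three of a developer's time; the incremental cost per report thereafter rounds to zero.  That's the whole equation.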

And thus, "Day Two" of that application was just "Day One" of Happily Ever After.  That's how you want your story to end.  And it should end that way if you're realistic about how you're going to capture the data you need to make decisions.  Yes, I realise that it's all too easy to be caught up in the fresh promise of hackathons and project kickoff meetings and wireframes and mockups loaded with Lorem Ipsum.  Just remember that software comes with an invisible fine print that reads "Data not included."

Thursday, December 10, 2015

Sane, rational paranoia

So I've been grinding away on a database implementation for a few days now, and today I passed a milestone peculiar to database geeks.  That's the one where the number of data tables is surpassed by the number of functions (and stored procedures) written to read, update, and delete that data.

It's the point where I start to feel safe--as safe as you can be allowed to feel, anyway--what with data hanging out on a web server. Once, I thought of that "safe"(r) feeling as mere superstition.  And, to a degree it is.  Just like any situation where you go all-in on one factor.

In short, this flavour of coding is the kind of thing logicians call "a necessary but not sufficient condition" for a reasonably safe application.  A lot of different skills and roles play into that.  And, even getting all the technical stuff right means bupkis if the human factor fails.  ("Hi, this is the County Password Inspector calling.  We need to verify that your password is strong enough..."  [insert involuntary twitch])

That being said, there's no excuse for not doing it.  Or for not automatically distrusting every bit of data that wants to be written to your database. 

Thing is, writing database code is just repetitive enough to be boring, but just quirky enough in its logic that I don't automate the process beyond cut-paste-edit.  And once that code is in place as the "gatekeeper" between the main application (web, desktop, API), the main application's code similarly walks the line between "boring" and "can't afford to mail this in." 

Worse, it's slow business.  And not simply because so much is being written from scratch instead of copied and adapted.  This is also when the nitty-gritties that fell through the cracks of the design documents have to be fleshed out.  Even when the developers were part of the design team, this means round-tripping back with the major stakeholders.  Which means they're in meetings rather than coding.  And I can pretty much guarantee you several "Uh-ohs" occasionally punctuated by the more serious "Oooooops."

Which is why it can seem frustrating to the (non-coder) manager that resolving questions last week spawned even more questions this week.  That means your developers will spend even more time in meetings and less time doing what they're nominally paid to do.  And as a developer watching the original milestones loom (and possibly slip), it can be awfully tempting to take the short-cuts.  By which I mean querying, and (far, far worse) updating and deleting data directly in the web code. 

Now, I don't have any proof whatsoever, but I would not be in the least bit surprised to learn that those sorts of shortcuts were behind last month's VTech hack.  Because the smoking gun that left the information of 6.4 million children and adults exposed was a SQL injection hack.  "What's a 'SQL injection hack'?" a non-coder might ask.  I'll let "explain xkcd", well, explain it, because A.) It does it better than I could, and B.) Talking about SQL injection and not linking to the iconic xkcd comic (which I really need to get on a t-shirt) is like going to RenFest and not making a single "Holy Grail" joke.  It just isn't done, mes amis.
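For the coders in the audience, the difference between the shortcut and the gatekeeper is a one-line change.  A sketch (hypothetical table, of course), with sqlite3 standing in for whatever database you fancy:

```python
import sqlite3

def find_user_unsafe(conn, name):
    """The bush-league version: user input spliced straight into the SQL.
    Feed it a 'name' like  x' OR '1'='1  and it returns every row."""
    return conn.execute(
        "SELECT id, name FROM users WHERE name = '%s'" % name).fetchall()

def find_user_safe(conn, name):
    """The gatekeeper version: a parameterised query.  The driver treats
    the input strictly as data, never as SQL to be executed."""
    return conn.execute(
        "SELECT id, name FROM users WHERE name = ?", (name,)).fetchall()
```

The safe version is no harder to write.  Which is exactly why shipping the unsafe one earns the eye-roll.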

Like I said, zero proof of my hypothesis.  Either way, though, that's bush-league work.  Let that sink in for a second:  I'm a freelance coder working out in the wilds of Grande-Digue, New Brunswick, and even I roll my eyes at that kind of sloppy naiveté.  Because SQL injection isn't half so much a technical problem as it is a human one.  And, honestly, if you have millions of customers, you can afford to hire solid, experienced, disciplined programmers, and let them do their jobs.  Which involves indulging the twitchy spider-senses developed from years of painful mistakes.  Normally, you'll only find that level of paranoia in ammosexuals and StormTrumpers.  But programmer twitchiness is the sanest and most rational paranoia you'll ever encounter.  Trust it.

Monday, November 23, 2015

Hacking the health of a planet

Today's edition of the Toronto Star carried an item that made me smile in three different senses:  Scientists Hack DNA to Spread Malaria-Resistant Gene.  On a purely topical level, this could be A Really Big Deal.  That makes me smile.

On a purely geeky level, the use of the term "hack" was encouraging.  I/T folks like myself have been trying for years to make the distinction between "hackers" (the DIY tinkerer types who have no time for Apple-level polish) and "crackers" (the folks who keep the credit monitoring firms in business...and regular I/T folks awake at night).  Alas, the rest of the world doesn't make that distinction, so even white-hat "hackers" are tarred with the same proverbial brush.

And yet...no one (geek or non-geek) likes mosquitoes, so who wouldn't get behind "hacking" their DNA, amirite? [grin]

But the question of semantics and PR is the least of the problems here.  Because I yet again have to smile--albeit wryly--at the huge obstacle common to hackers across all disciplines.  Namely, the gulf between a working prototype in the lab (or hackathon-occupied conference room) and full-scale adoption in the real world. 

Don't get me wrong--this is really-most-sincerely NOT schadenfreude.  Half a million lives a year are at stake--most of them children.  And the health of two hundred million more people each year hangs in the balance.  One would be very, very hard-pressed to overestimate the significance of eliminating mosquitoes as a vector for malaria.

There are 3,500 modern species of mosquitoes, and their lineage dates back 226 million years.  Before deliberate "gene drive intervention," humans were already triggering the development of new species by dint of pesticides.  "Hacking" every skeeter on the planet will be a knock-down, drag-out slog of decades.  But it's a worthy fight...and it sure beats the tar out of poisoning the ecosystem with whatever hell's broth Dow is cooking these days.

As a programmer/tinkerer, my goals aren't even a tenth so audacious and world-changing.  But that doesn't mean that I can't appreciate the scaling issues.  And the wherewithal it will take to surmount them.  All the best, my fellow hackers...you're going to need it.  And then some.