Monday, December 22, 2014

Holiday Hiatus

Okay, I'm just bowing to the inevitable and calling it a year (more or less) on blogging here.  Whichever holiday my Gentle Reader chooses to observe around the Winter Solstice, I hope it's a lovely one.

Catch y'all on the flip side!

Wednesday, December 17, 2014

It takes a village to raise a programmer

Today I was at the computer repair shop, having a fried hard drive replaced in a laptop.  When a client arrives with the diagnosis already made, repair folks are only being smart when they question it.  I assured the gentleman at the front counter that I had booted into the laptop's BIOS and found no hard drive listed.  Also that I had tried running two flavours of Linux off CDs/DVDs, and that both freaked out when they found no hard drive.

At that, he took my word for it, and I walked out with a new hard drive installed in less than fifteen minutes.  (For the record, the laptop is now running Ubuntu, with one PEBKAC* wireless issue since overcome.)
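
For anyone else who ends up playing coroner on a drive, the live-CD/DVD check really just amounts to asking the kernel which block devices it actually detected.  A minimal sketch of that check--in Python purely for illustration, reading the kernel's /proc/partitions listing from a Linux live session:

    # Minimal sketch: list the block devices the kernel detected,
    # e.g. from a live CD/DVD session.  Assumes a Linux /proc filesystem.
    def detected_devices(path="/proc/partitions"):
        devices = []
        with open(path) as f:
            for line in f:
                fields = line.split()
                # Data lines look like: major minor #blocks name
                if len(fields) == 4 and fields[0].isdigit():
                    devices.append(fields[3])
        return devices

    if __name__ == "__main__":
        names = detected_devices()
        print("Block devices the kernel can see:", ", ".join(names) or "none at all")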

The gentleman seemed surprised that I was a Linux user, and asked what had prompted me to choose that operating system.  Not wanting to drag politics into it, I explained that it makes web programming in the PHP/MySQL space a little more seamless when work is promoted to a server.

It turns out that he's just now starting to learn web programming, and that it isn't always coming easily.  So I handed him my business card and told him that he was perfectly welcome to call on me for help when he was stuck.  Just as I was able to call upon my elders in my times of need.  (And, 's'matter'o'fact, still do.)

While I savour the few opportunities I've had to repay my elders for what they've taught me, it's equally delicious to be able to smooth the path for someone else.  Sometimes it's not even a matter of catching a logic-bug or pointing to a more appropriate API or anything even remotely technical.  Sometimes just knowing that someone won't consider your question "stupid" is more important than the answer itself.

I do hope to hear from him.  Because that means that another craftsman is joining the trade, and auto-didacts who aren't afraid to ask for help are an asset to the guild.

- - - - -

* "PEBKAC" == "Problem Exists Between Keyboard And Chair"

Monday, December 15, 2014

Unwelcome magic

I'm trying to rebuild an old-ish (and rather banged-up) personal computer with a new operating system.  Because, you know, friends don't let friends use Vista.  [insert trollface]

So far, it's Gremlins: 4, Me: 0.  (I've had to break out the paperclip--'nuff said.)  I'm starting to sympathise a bit with the sub-species of programmer who despises hardware.  In my case, I was less than amused to realise that sometimes the System Administrator has to deal with their own version of "syntactic sugar."

For non-coders, "syntactic sugar" is a shortcut a programming language provides so that common operations can be written more tersely--with results or side-effects that can be counter-intuitive (at least to a newbie).  In the worst-case scenario, "it just works" despite seeming to be missing critical input or function invocations.
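
A quick illustration--in Python, which has nothing to do with tonight's hardware wrangling: the list comprehension below is pure sugar for the loop above it.  Less typing for those already in the know; one more incantation to memorise for everyone else.

    # The explicit version: every step visible, nothing hidden.
    squares = []
    for n in range(10):
        if n % 2 == 0:
            squares.append(n * n)

    # The "sugared" version: one line that "just works"...
    # ...provided you already know how the shorthand expands.
    squares = [n * n for n in range(10) if n % 2 == 0]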

Other coders--including my betters--will disagree, but I generally dislike that in a language.  It jacks up the slope of the learning curve, in my less-than-humble opinion.  Arthur C. Clarke aside, the magic of "it just works" is antithetical to the scientific reasoning process that's supposed to govern computer programming (and debugging).

Outside of the coding sphere, I tripped over something suspiciously similar.  My Debian installation DVD didn't include a hardware driver compatible with the USB wifi stick I was using.  Fortunately, I was able to find the missing driver files on the Debian website and copy them to a flash drive.  I plugged the drive into the back of the PC and continued.  From that point on, I had wireless access.

Convenient, right?  To me, it was actually kind of creepy.  First off, the installer never asked me to specify the location of the wifi driver.  Secondly, the driver was just one of many files bundled into a larger .DEB archive.

Magic.

Ugh.

I realise that there are times--particularly when it's not part of your core competency and the SysAdmin has already left for the day/week--when "stuff just working" is a good thing.  But it's unlikely that this is the last time this particular PC and I will tussle.  In which case "magic," however convenient, may actually be counter-productive in the long run.

That, and I flatter myself that a programmer who isn't interested in pulling back the curtain on the controls is no programmer at all.
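
For what it's worth, pulling back this particular curtain after the fact is easy enough.  Here's a minimal sketch in Python--assuming dpkg-deb is on hand, as it is on any Debian box, and with a stand-in filename rather than whatever archive the installer actually grabbed:

    # A small curtain-pull: list exactly what a .deb firmware bundle would
    # put on the filesystem, without installing anything.
    import subprocess

    def peek_inside(deb_path):
        """Print the file listing of a .deb archive."""
        result = subprocess.run(
            ["dpkg-deb", "--contents", deb_path],
            capture_output=True, text=True, check=True,
        )
        print(result.stdout)

    peek_inside("firmware-bundle.deb")  # stand-in filename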

Sunday, December 14, 2014

Silly Sunday, 2014.12.14: Tribalism in computing

Once upon a time (roughly coinciding with the Carter and Reagan Administrations in the U.S.), I actually liked to travel by airline.  By which I mean that there was a time when I flew at least twice a year and didn't understand why airline jokes were a staple of stand-up comedy.

Enter three decades of cost-cutting, capped by the collective bed-wetting in response to a "black swan" event like 9/11.

Part of the reason this de-evolution is particularly painful for yours truly has to do with my transition from Liberal Arts graduate to programmer.  The Liberal Arts bit pretty much ensures my awareness of the history of my profession.  To wit:  Twentieth-century computing was initially driven by things being launched from Point A to Point B.  At first, those things were ballistic in nature.  But eventually, the assumption was that the landing would be somewhat less...ahem..."dramatic."

One convenient aspect of visiting Washington, D.C., is that the major museums are (mostly) clumped in one place--namely, along the National Mall.  That includes the National Air and Space Museum.  There, behind glass, I was tickled to find a retro metal IBMonster that once crunched flight & reservation data.  Anyone who's discovered a new branch in their family tree understands that feeling.

And so it was that last week, when I was flying through YYZ (Pearson Airport in Toronto, ON), my proverbial cold comfort for a 4+-hour delay (other than an extra bag of almonds) was the knowledge that the blame rested entirely with the hardware side.

That lasted until one of my fellow passengers wryly informed the flight attendant that, according to the in-flight display, we were already en route to our final destination. 

D'oh!

And so, to my chagrin, I was reminded that what sociologists call tribalism can be just as rife in the left-brain-dominated world of computing as it is anywhere else.  And just as challenging to overcome as any other bias.  Thus do computers make us more human.  In this isolated instance, anyway.

Wednesday, December 10, 2014

Of Google and Gestalt

Maybe I have an odd definition of "amusing," but the near-juxtaposition of two Slashdot article picks today made me smile.

The first item is a thought-provoking rebuttal of Elon Musk and Dr. Stephen Hawking's warnings about the dangers of artificial intelligence (AI).  Its salient point is that intelligence is not autonomy--i.e. a machine cannot truly have free will.  Certainly, our reliance on AI--as with computers and networks in general--makes us especially vulnerable to its failure(s).  We're sometimes vulnerable to its successes, too.  (Think obsolete livelihoods, cybercrime, etc.)  And when some fool decides to abdicate critical decisions to an algorithm?  Yeah--I think that most of us know the end of that movie.

There's also the phenomenon known as "the uncanny valley," wherein computer-generated (human) images come so close to being life-like--but no cigar--that we actually react negatively to them, compared with something more cartoonish.  (Aside:  If you're among those who were creeped out by The Polar Express but think that the minions of Despicable Me are adorable, now you know why.)  In Star Trek: The Next Generation, the android Data notes that he has been programmed not only to blink, but to do so at semi-random intervals so as not to trigger that vague sense of unease associated with the uncanny valley.

And, even being a programmer, I have to admit to being creeped out myself by the accuracy of voice recognition in some automated phone systems.  In the end, it may well be that the market's response to the uncanny valley will forestall an AI bot takeover before the technology is even capable of making it a threat.

In short, we are (probably) a long, long way off from renegade replicants and time-travelling hit-men for a genocidal AI.  Or so The Matrix wants us to believe...  ;~)

At this point, it's tempting to congratulate ourselves for being such inimitably complex carbon-based beasties.  Until we consider the second Slashdot item, which brings home how easy it is to impersonate a human browsing a website.  And not only a human, but a wealthy human.  In related news, Google made headlines last week for stealing a march in the arms-race against the bots--or, more aptly, the people who code them.  (Though I do have to wonder whether the visually impaired will be the collateral damage of that escalation.)

That's the contrast that made me smile, albeit wryly.  To wit:  The bar for "humanity" is set so high in one area of software development, but so low in another.  (Embarrassingly, that latter area is my own.)

As Mr. Etzioni pointed out, part of our culture's freak-out over the threat of AI is the fear of irrelevance.  Or...do we also fear that we've passed some inflection-point where our lazy, self-fetishising parochialism leaves us open to a palace coup by our digital serfs?  Personally, machine learning doesn't worry me half so much as humans who refuse to learn.  But if my Gentle Reader is more in the Musk/Hawking camp, perhaps we can agree that the only viable response is to insist on a higher bar for humanity.

Monday, December 8, 2014

Snippets

Back in college (the first time), I was into what some might consider an "extreme" sport.  Oh, there was no athletic ability involved--not unless you count hauling cases of newspaper/magazine clippings or occasionally sprinting across a campus...in high heels.  Once in a full-on snowstorm.  High heels are great for digging into some kinds of snow--trust me on this.

Basically, I was on the speech team.  And while I joined thinking that my focus would be on interpretive readings of poetry/prose/drama, I was shanghaied into something called "Impromptu."  The rules for that type of speaking were straight-forward:
  1. You had absolutely no idea what you were going to talk about until it was your turn.
  2. You would be given a topic, typically in the form of a quotation or proverb.
  3. You had seven minutes to prepare and deliver your speech; you were docked for going over seven minutes--or running significantly under.
  4. While you were prepping, the judge for your round would call out elapsed time in 30-second increments, and give you a countdown of time remaining by hand while you were speaking.
Talk about an adrenaline rush.  But it quickly became my favourite.  By sophomore year, I managed to un-suckify my skills well enough (relative to everyone else in my district) to actually advance to Nationals (where I pretty much sucked relative to the rest of the country).

How does one train to face the unknown?  My team more or less had a structure for each speech, which (apart from the obvious benefit) helped you pace yourself when you were actually speaking.  But the Team Elders--which included the reigning national champion in Impromptu speaking (no pressure there)--also kept a full desk-drawer of note-cards containing quotations.  Sometimes it seemed like every sound-bite uttered by Every Famous Dead Person Ever was in that drawer.  To practice, you'd draw a card at random and find yourself riffing on anyone from Karl Marx to Groucho Marx.

My afore-mentioned Elders, as well as the official coaches, also liked to move the proverbial goal-posts.  For instance, after the first semester of my Freshman year, I wasn't allowed to use a note-card to jot down my outline--from then on, I had to keep it all in my head.  Still later in my career, the Assistant Coach cracked down on my prep time.  Most folks spent between one and two minutes of the allotted seven in preparation.  I wasn't allowed more than one in practice.  Of course, the consequence of having less time to prepare was that I had to fill that much more time with content to hit the seven-minute goal.  Eeep!

Like I said, it's one heck of an adrenaline-kick--even compared to public speaking when you have a prepared, memorised presentation.  (And keep in mind that PowerPoint hadn't even been invented at that point.)  Apart from reading, writing, and math, there are few skills as valuable as learning to hide your terror while you're on stage...or even while being put on the spot in a meeting.  And being able to riff while still keeping the salient points at your synaptic fingertips is a huge--yea, even ginormous--bonus to that basic skill.

But the afore-mentioned drawer of quotations wasn't just a means to an end, either.  The drawer was largely the handiwork of a couple of political junkies and a six-year student about to graduate with a Philosophy degree, so you can imagine the breadth of topics.  Combine that with the discipline of organising my thoughts on the fly and making them sound good when they came out, and the experience was a liberal arts education in microcosm.  And it is quite possibly the single most useful thing I gained from my four-year degree.  (Also, that's how I met Dennis, because he competed on the speech team of a neighbouring university.  If that's not a win, I don't know what is.)

Nearly three decades later, I'm fairly certain that this training has improved how I do my day-to-day work.  See, programming is no different from any other expertise in that the proverbial 10,000 hours of practice rule really applies.  But unlike, say, mastering the violin, the range of things you need to unlearn and learn anew to stay on top of the game doesn't diminish with time.  Moreover, much of that unlearning/relearning is done on the clock, on the spot.

That's where the Team Elders--in this case, the mavens of an unfamiliar (to me) technology/language/platform--come into play.  Their code snippets--typically by the time they've been blogged or accepted/up-voted on StackOverflow--are like the nuggets of received wisdom I once pulled out of the desk drawer.  Likewise, much of what I do afterwards is to fit them into the framework of the moment.  All while the clock is ticking, of course.  At the end of this process, if I've done my job correctly, there's the sense that I've added context and relevance to the snippet's pithy brilliance.  And I can vouch for being better off for the experience--maybe a little smarter, maybe a little more efficient, maybe even both.

So don't ever let anyone tell you that a Liberal Arts degree doesn't have practical value.  I'm living proof that it does.  Oh yeah, and I moonlighted on the Debate team, too--so I've been trained to argue for a reeeeallllly loooooong time.  ;~)

Wednesday, December 3, 2014

Digital rubbernecking

As a programmer, I write a fair amount of code that is paranoid--specifically, about anything coming in from the Web.  There is also the added overhead of dealing with sensitive data--passwords, email addresses and the like.  The rest I more or less leave to the folks who set up the servers on which my code and data live.  Ditto the folks who set up the routers and networks and who invent/improve the encryption algorithms.
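
By way of illustration, here's a minimal sketch of what "paranoid" means in practice--in Python for brevity, even though the day job is PHP/MySQL, and with function names that are mine rather than any framework's: whitelist whatever arrives from a web form, and store passwords as salted, deliberately slow hashes rather than anything you can read back.

    import hashlib
    import os
    import re

    def clean_username(raw):
        """Whitelist-validate a field coming in from a web form."""
        if not re.fullmatch(r"[A-Za-z0-9_.-]{1,32}", raw or ""):
            raise ValueError("rejecting suspicious input")
        return raw

    def hash_password(password):
        """Salted, slow hash--so a leaked table isn't a leaked password."""
        salt = os.urandom(16)
        digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
        return salt.hex() + ":" + digest.hex()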

That's not to say that I'm not fascinated by issues around security and cryptology.  It's just that I know that I have no aptitude for it--particularly what you'd call "thinking like a hacker."  And hacking post-mortems are like candy for me.

Which, in the wake of last week's Sony mega-hack, basically makes me a rubbernecker.  (In my defence, I don't rubberneck in real life; gawping at disasters online doesn't slow down the traffic or hinder first responders.)

Oh, and is this ever a train-wreck.  Partly, it's the sheer scope.  Thirty-eight million files stolen--some of those files whole databases:
  • Movies leaked to torrent sites before their release date
  • Scripts for movies not even in production yet 
  • Source code--presumably for games
  • Legal assets like contracts, non-disclosure agreements, etc.
  • Salaries, including highly embarrassing discrepancies in executive level pay
  • Human resources data, including social security numbers, addresses, birthdays, phone numbers, etc.
  • Sales and financial data going back years
  • I/T infrastructure maps, complete with security credentials
Worse, Sony's own Playstation-related Amazon cloud servers were apparently being used to distribute stolen data.  Ouch.

Also, though, there was the initial blank-wall response, and now the possibility of fingering the wrong wrongdoer.  North Korea was the prime suspect from the get-go.  That assessment has been disputed and even criticised by the infosec community, but that's Sony's story and they're sticking to it.  You have to admit, being targeted by a rogue government makes for better security theatre than falling victim to an inside job carried out by pissed-off plebeians.

Oh, and passwords weren't encrypted and the hackers managed to nab SSL root certificates that won't expire for years?  #headdesk

It's impossible not to look, right?  There are just so many flavours of "screwed" involved here--for the short-, medium-, and long-term:
  • Revenue lost to piracy
  • Further revenue loss if said pirated content sucks and no one wants to pay to see it
  • A pretty-much-unquantifiable loss of competitive advantage in the entertainment and gaming industries
  • Equipment and staffing costs for scanning, then scrubbing or replacing, every single computer currently on--or that has possibly touched--the network
  • Nothing short of an identity theft nightmare for thousands of employees and contractors:  Sony footing the bill for any reasonable amount of credit-monitoring and remediation will easily run into the millions of dollars
  • The productivity-killing morale-buster for employees now freaking out about their current jobs or their future credit ratings
  • Possible (probable?) massive class-action lawsuits, particularly if North Korea doesn't turn out to be the villain after all
  • The inevitable stock price bobbles, particularly as the after-shocks play out
One hopes that, with all those hits to both sides of the balance-sheet, Sony can scrape together the cash to build stronger, higher walls between its data-compartments.  (Translation:  One hopes that Sony--all historical evidence to the contrary--has learned its lesson.)  As the Forbes article mentioned, a breach for Sony's movie division should theoretically have had zero impact on its PlayStation division.  Unless this was a very carefully-timed parallel attack, that sort of information-bleed across departments (e.g. HR and Legal), not to mention across whole product divisions, is straight-up inexcusable.

If it sounds like I'm blaming the victim, I am--but only sorta-kinda.  Yes, it's tempting to see this as karma for a company that had no problem infecting paying customers with malware--basically using them as conscripts in their battle against piracy--thus leaving them open to other hackers.  And, honestly, Sony's response when the news broke might just be the douchiest thing you'll read all day...assuming you're not following Timothy Loehmann / Daniel Pantaleo apologists on Twitter, of course:
NPR was one of the first to report on the scandal on November 4, 2005. Thomas Hesse, Sony BMG's Global Digital Business President, told reporter Neda Ulaby, "Most people, I think, don't even know what a rootkit is, so why should they care about it?"


Obviously, I don't work for or at the company, so please don't think I'm speaking with any evidence-based authority here.   But the circumstantial evidence points to a management mindset in which security was viewed as an expense to be minimised, rather than an asset to be built and leveraged as a competitive advantage.

If true, investors and other stakeholders should take that cavalier attitude--toward their own crown jewels as well as the personal data of others--as a sign-post on the road to extinction. 

Because ultimately, Sony lives in a fully digital world.  Movies no longer exist as spools of celluloid.  Except for audiophiles, 21st century music is not served up on fragile black vinyl platters.  Most games do not play out with wooden/plastic/metal markers on cardboard these days.  The upside of that world is that copies of intellectual property can be made for mere fractions of pennies.  The downside of that world is that copies can be made for mere fractions of pennies.

Playwright George Bernard Shaw claimed that because the "reasonable man" adapts himself to his environment while the "unreasonable man" adapts his environment to suit himself, all progress must therefore be driven by the unreasonable man.  But that assumes two finite ends of a continuum--a continuum that ignores the possibility of unreasonableness shading into delusion.

Further back in time, the earliest tragedy tracked the ruin of the great, often precipitated by the hubris with which they met forces or events beyond their control.  You'd think that an entertainment company would take that wisdom to heart.  The warnings of Euripides and Sophocles ring true even today.  But companies like Sony bear no resemblance to the travelling companies of players in centuries past.  They're little more than accounting machines, slicing revenues into royalties and residuals. 

But you're smart enough to actually listen to your security folks, am I right, Gentle Reader?  Please tell me I'm right.  Because as much as I do enjoy a good hacking post-mortem (in the same way some people enjoy a good murder mystery), I'd really rather not be rubbernecking at the hacking of someone I know.  Thanks.

Monday, December 1, 2014

Two styles of product development

Whew.  That was a close call.  For a few minutes there, I thought that the house had also eaten James Webb Young's A Technique for Producing Ideas.  (Mind you, I can't find the Penguin History of Canada when I need it, so the house still has some 'splainin' to do.  But that's another story.)

I have a librarian friend who--unsurprisingly--organises her home bookshelves according to Library of Congress numbering.   I'm not so professional m'self, but even so, my History shelves are parsed according to a system...albeit one which likely makes no sense to anyone else.  And the delineation between "literature" vs. mere "fiction" admittedly tends to factor in book size to an inordinate degree.

But given my usual categorisation instincts, the fact that I didn't immediately think to look for Young's book in the "software development" neighbourhood is disgraceful, truth be told.  Particularly as that's also where Strunk & White and a dot-com-vintage Chicago Manual of Style live.  (Anyone who thinks that being a software developer starts and ends at the bytes is--in Web 2.0 parlance--"doin it rong."  So sayeth my bookshelf:  Your argument is invalid.)

A Technique for Producing Ideas is a slim slip of a book--almost a pamphlet on steroids, really.  It dates from the advertising world of the 1940s--notably before "the tee-vee" became the most valued fixture in America's living room.  But even a grab-your-belt-and-fly-by-the-seat-of-your-pants autodidact like Don Draper would have known it chapter-and-verse.  And it's no less relevant today, when all we First World pixel-pushing proles (allegedly) need to do to hose the backwash of globalisation off our blue suede shoes is "innovate."  (This is where I'd love to link to Rory Blyth's brilliant, scathing "Innovidiots" blog-post, but it looks like it's offline indefinitely.)

Absent Mr. Blyth's take on the subject, I think our biggest problem with the term "innovation" is its intimidating suggestion of the blank page.  And I don't think I'm making a straw-man argument when I say that, popularly, innovation is too often conflated with creating something ex nihilo.  Intellectually, you can look at the portfolio of the most valuable consumer products company on the planet (Apple)--graphical user interfaces, MP3 players, smartphones, tablet computers--and know that Woz/Jobs didn't invent a single one of them.

That insight doesn't necessarily help when you're staring into a looming abyss of enforced downtime--yay, holidays.  It helps even less to remember that Sir Tim Berners-Lee invented the HTTP protocol on Christmas Day.  No pressure there... [grumble]

So to bring things down to the scale of a manageable metaphor, you mainly just need to decide whether you're ultimately the waffle-iron or the crock pot when it comes to making something.

Waffles have the advantage of being very specific--anyone who's been by the grocery store freezer case should have the basic idea down.  But the parameters, to a certain extent, are relatively fixed:  Starch--typically flour--for body, eggs for structure, some sort of leavening (typically baking soda/powder or yeast) for loft, and milk to make the batter pourable.  Too much of one thing and you could have a weird-looking hockey-puck or (literally) a hot mess.  Moreover, modern electric/electronic waffle irons typically impose limits on temperature.

Within those basic parameters, however, you can make some amazing waffles.   (In my world, read "Dennis" for "you.")  Making a "sponge" of yeast-leavened batter the night before, and only adding the eggs in the morning, for instance, makes for a revelation in texture.  Likewise, eggs can be split into yolks for the main batter, while the whites are frothed and gently folded in afterwards.  A touch of vanilla or almond extract?  Yes, please.  Topped with lingonberry syrup (because you live close enough to Newfoundland/Labrador that it's a staple in Maritime grocery stores)?  Bring it.

Waffles are incremental innovation in a nutshell.  Evolution, y'understand.

In contrast, there's the crock pot.  True, milk and/or eggs probably won't be staples of most recipes.  But apart from those, you have a lot of latitude...assuming you respect the laws of Physics.  A crock pot will happily simmer organic vegetarian vegetable soup all day.  A crock pot will just as happily caramelise cocktail weenies and bottled BBQ sauce into artery-clogging, potentially carcinogenic ambrosia.  A crock pot doesn't judge.

In tonight's metaphor, that latitude is what pushes the crock-pot toward the "revolution" end of the invention spectrum.

I'm not particularly partial to either--in fact, I'm delighted when an idea that I consider commoditised is successfully re-invented/re-imagined.  LCD monitors, LED light bulbs, thermostats, etc. 

But whether you ultimately choose to make waffles or some slow-cooked goodness, the end-goal is the same.  Sure, maybe the first few attempts you'll end up feeding to the dog or what-have-you.  But ultimately, you have to muster the confidence to serve it to company.  Because just as there is no art without an audience, there is no invention without an end-user.