Saturday, November 29, 2014

Silly Saturday, 2014.11.29: Nerd vs. Snob

Dennis & I bottled a kit's worth of red wine last weekend, a blend of Shiraz and Cabernet Sauvignon.   Unlike (most) whites, reds typically need a little time to settle into their new digs.  Our future dinner-companion was no exception.  "The fruit and the tannins really haven't melded yet," I pronounced after a sampling sip.  "Does saying that make me a wine snob?"

"Yes," pronounced Dennis.  And I laughed, because I knew he was rattling my cage.  (He has standing orders to shoot me if I ever become a wine-snob, and I know darned well that he'd be too sneaky to let me know what was coming.)

I will cop to being a wine nerd...or at least a wanna-be wine nerd--no question.  But it wasn't until a bit later in the afternoon that the difference between wine snob and wine nerd really occurred to me.  What's more, I think that my nerd-snob distinction pretty much applies to anything about which one can be a nerd or snob.

Simply, this:  A nerd tries not to let their judgment interfere with learning; a snob tries not to let learning interfere with their judgment.

That's not to say that a snob won't keep up with the material; it's just that their grading-system is pretty well set for life.  And it's also not to say that a nerd doesn't have standards.  "Two-buck Chuck" is still plonk.  And over ice?  [insert uncontrollable twitching]

Wednesday, November 26, 2014

The flip-side of an old engineering adage

The description of software development as an "engineering" discipline is, to me, one of those "for lack of a better term" bits of taxonomy.  Sure, there are marked similarities.  In an increasingly networked world, it can truly be said that lives are riding on things working as designed--even when those "things" are merely electrical impulses in transit, or bits stored on silicon or magnetic platters.

There's another area where software engineers and all other flavours of engineers certainly do overlap.  That's in the common adage, "'Better' is the enemy of 'done.'"  In management, it's the mantra of those who have to keep an eye on cash-flow.  Below management level, it's the mantra of people who have this filthy habit of wanting to spend time with their families and/or significant others.

Don't get me wrong:  I'm totally about 40-hour workweeks and keeping the company solvent.   I consider anything less an #epicfail on the part of management.

Yet, what rarely (if ever) is addressed is that the adage has an equal-and-opposite truth:

"Done" is the enemy of "better."

If you didn't instinctively grok the essence of that, I can pretty much guarantee that you will the first time you have to get Version 2.0 out the door.  All those corners you had to cut?  They're still as sharp as they ever were.  All those values you hard-coded as 1:1 relationships because you didn't have time to populate association tables?  Consider them diamond-hard-coded now.   Yeah--have fun dynamiting those out.
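
To put a picture to that association-table example, here's a minimal sketch using sqlite3 from Python's standard library.  (The table and column names are invented purely for illustration--your cut corners will have their own names.)

import sqlite3

con = sqlite3.connect(":memory:")

# The Version 1.0 shortcut: the "one" category lives right on the product row,
# so every product gets exactly one category whether it likes it or not.
con.execute("CREATE TABLE product (id INTEGER PRIMARY KEY, name TEXT, category TEXT)")

# The Version 2.0 cleanup: a proper association table, so a product can carry
# any number of categories...but every 1.0 query that read product.category
# now has to be dynamited out and rewritten as a JOIN.
con.execute("CREATE TABLE category (id INTEGER PRIMARY KEY, name TEXT)")
con.execute("""
    CREATE TABLE product_category (
        product_id  INTEGER REFERENCES product(id),
        category_id INTEGER REFERENCES category(id),
        PRIMARY KEY (product_id, category_id)
    )
""")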

Now, I would never say that those first-iteration shortcuts were ill-judged.  After all, this is Version 1.0 we're talking about.  One-point-oh is a unique and invariably rude & mercurial beastie.  Version 2.0 is our attempt to domesticate it.  Flea-dip.  De-worming.  The dreaded trip to the vet's office.  If we play our cards right, it won't eat any (more) couch cushions.  If we're very lucky, it won't lick its nethers in the middle of the living room...at least not while company's over.

Problem is--and oh, Friends and Brethren, I confess that I also have the stain of this sin upon my soul--too many times we don't take Version 2.0 as seriously as we do 1.0.   First generation products are castles in the air that have been successfully brought to earth.  That's a massive and commendable achievement.   But thinking of Version 2.0 purely in terms of "features we didn't have time for in Version 1.0" is not a forgivable sin for anyone who aspires to the title of "Software Engineer."  After all, no sane engineer would dream of adding turrets and towers to a castle built on sand.

For programmers, the work authorisation for Version 2.0 is an opportunity to pay down some (technical) debt, not just wear the silver numbers off a brand-new debit card.  And in actuality, I'm preaching to the choir for the vast majority of programmers.  It's the folks who commission programmers to make new stuff for them that I'm hoping to convince here.

Clients:  Your programmers saved you a wad of cash up-front when you weren't sure whether that wild-haired idea you had on the StairMaster would actually pan out.  But its weekend of garage-tinkering in its boxer shorts is done; let's get this thing showered, shaved, and dressed for real work.  Don't be surprised when that takes extra money and time.  Whether you were aware of it or not, you're the co-signer on the afore-mentioned 1.0 debt.

That probably comes off as sounding brutal.  But I can assure you that it's a mother's kiss in comparison to the sound of a potential customer clicking on a competitor's product instead.

Monday, November 24, 2014

When sorcery and software don't mix

It's been nearly a decade since I had to wear the "Sys. Admin." hat full-time, but apparently the karma that goes with that role hasn't entirely worn off.  Today was the first time I realised that this can sometimes be a mixed blessing.

Let me back up for a bit and first define what I mean by "Sys. Admin. karma."  Let's say you work in an office environment and your computer is, for lack of a better term, "being stupid."  Maybe you've already rebooted, or maybe that would throw the proverbial monkey-wrench into your current workflow.  Either way, you're hosed, and it's time to call in someone whose job it is to un-hose you.

Back in the Day(TM), in another country, in another industry, that would have been me...when I wasn't babysitting servers or refurbishing workstations for the new folks being shoehorned into a rapidly-expanding staff.   Now, my office was tucked away from just about everyone else--probably because I shared it with four servers and, hoo-boy, were they loud.  So by the time I'd crossed my floor to the stairwell and trotted over to the far end of the lower floor, the problem had a good chance of fixing itself.  Memory/CPU usage had stopped spiking, a file lock had been relinquished, whatever.

Being Upper Midwesterners, my co-workers would typically apologise profusely for "bugging" me, usually after swearing up and down that the problem had been there just a minute ago, really-and-for-true.

That's Sys. Admin. karma.  The phenomenon is not limited to I/T of course--as anyone who has had their car's disconcerting squeak/rattle disappear on the way to the mechanic can attest.

When I changed jobs back to developer, I was spoiled for several years by having The Sys. Admin. Who Walks on Water there.  But for my own projects, particularly after going freelance, I'm pretty much on my own.  So it was today, fresh off a status call with a client.  We'd both noticed that there'd been no actionable traffic to/from his web app.  That's weird for a Monday.  But then again, it's a slow week in the U.S. due to the Thanksgiving holiday.

Or so I rationalised.

For a short while.

Inevitably, paranoia got the best of me, so I logged in to peek at the database.  Sure enough, data was still being crunched; it's just that nothing had tripped the required threshold.  So I emailed the client to let him know that, so far as I could see, everything was cool.

Not fifteen minutes later, the app. spit out a couple of emails indicating action items.

It was pure coincidence, of course.  (No, really.  Pinky-swear.)  Yet the human mind could easily translate the juxtaposition of me telling my client that everything was cool and the sudden appearance of app.-generated emails into a cause-and-effect relationship. 

Technically, that's synchronicity.

But--is that necessarily a bad thing?  From an outside perspective, I only had to log in, barely poke around, and the inscrutable Server Gods blessed the client with a couple of emails.  Magic!  w00t!  Five points for Hufflepuff!

Problem is, the root of magic is the audience seeing an action (or set of actions) result in something seemingly impossible...or at least counter-intuitive.  In the absence of complete information about inner workings, folks will construct their own narrative.  Professionally, the magician has two jobs:  1.) Concealing the actual process between the action(s) and the results, which includes 2.) preventing the audience from forming unwanted hypotheses about cause and effect.

But since the days when we stared into the darkness outside the firelight in hope that the darkness wasn't staring back at us, our species has mastered few skills quite like narrative-generation.  (Which probably explains why statistics--more honoured in the misuse than the use--have a bad name.) Thus, one person's magician is another's charlatan--or, worse, practitioner of the Dark Arts.

In my case, my client could suspect that I quietly fixed some bug under the guise of "sanity-checking" that the app. hadn't stalled out.   And, in the face of suspiciously close timing, I couldn't in fairness call that unreasonable.

Right now I'm trusting to nearly a year and a half's work with said client that he doesn't, in fact, suspect me of server-side sleight-of-hand.  Mind you, I do still occasionally take joy in finding the magic in what I do for a living.  But I know that I'll never have the marketing chops to peddle it.  Then again, if I can earn that kind of trust from someone with a very different skill-set, that's a higher form of magic than anything I could coax from a compiler, no?

Friday, November 21, 2014

Frivolous Friday, 2014.11.21: Maslow's Hierarchy* for programmers

          amirite?
            / \
           /   \
          /     \
         /       \
        /         \
       /           \
      /   Source    \
     /    Control    \
     -----------------
    /    Debugger     \
    -------------------
   /      Runtime      \
   ---------------------
  /      Compiler       \
  -----------------------
 /      Code Editor       \
 -------------------------

- - - - -

* If you weren't required to take something like Psych. 101 as part of your GenEd. requirements, here's the original: http://www.simplypsychology.org/maslow.html

Wednesday, November 19, 2014

Working "unplugged"

So there's been a bunch of chatter in recent years about reducing distraction at work...at least when we're not being all collaborative-y, getting loopy on dry-erase-marker fumes and all that knowledge-pollen we're sharing while singing "Kumbaya" in some corporate war-room.

Turning off your cell phone, powering down the IM client, signing out of social media, even checking out of the digital Hotel California otherwise known as email--that's how I usually understand "unplugged" working.

But what would happen if we also pulled the plug on our workstations?  (Assuming that they can run off batteries, of course--regular PCs don't take kindly to that sort of thing.)  Human value-systems shift drastically when something previously taken for granted becomes scarce.  I can't imagine that electrical current is any different.

Working on this 2009-vintage Dell, I'd be lucky to see half the battery life of, say, a new-ish MacBook Air.  But...would that make a difference in productivity?  That's the interesting question.

Maybe, when we know that we need to go heads-down on some chunk of work, we should think about turning that battery indicator into an hourglass.  (And, yes, I'm thinking about that scene from The Wizard of Oz.)  Scarcity clarifies...and while it's not the mother of invention, it is frequently the midwife.

Monday, November 17, 2014

Yin and Yang, web style

I don't know whether there's a French equivalent, but in my little corner of l'Acadie, I've been living the web development equivalent of the English adage that "the cobbler's children have no shoes."  Meaning that I have yet to put any content out on the business domain I've been hosting for over two and a half years.

That's finally changing.  Anyone who knows my "business" Twitter account (bonaventuresoft) recognises the sailing ship avatar.  It's a nod to the ships so closely entwined with New Brunswick's history.  (I just want to go on record now as saying that NB has the coolest flag of any of the Canadian provinces/territories.  Sorry, everybody else.)

But I digress.

To me, the sailing ship evokes the 18th century, and its Colonial-era aesthetic here on the East Coast, so I was trying to capture that look and feel as much as feasible for the eventual home of the bits that will make up www.bonaventuresoftware.ca.  Obviously, I don't want to completely replicate the newspapers and almanacs of those times in every respect.  Multi-column layouts and packed text are antithetical to the web browser experience.  And--for cryin' in yer syllabub--keep that stupid "s" that looks like an "f" out of the 21st century.

Now, calligraphy is one of my nerderies; alas, my grasp of its fundamentals is only applicable up to about the year 1600 or so.  But of the several favours I owe Tantramar Interactive, one is pointing me to the correct font-family for the job.

In Design is a Job, Mike Monteiro points out that the increasing sophistication of web browsers (and their sorta-kinda-maybe-lip-service-commitment to W3C standards) renews the demand for the skills of traditional print-oriented graphic artists.  In particular, programmers who code solely for newer browsers now have enough fine-grained control over layout, typography, gradients, transparency, etc. to put those skills to work.  And goodbye to "Any font you like, as long as it's Arial or Roman."  (Woo-hoo!)

Which is great news for those of us who use the web day-in and day-out.  For those of us who make things for the web, but do not have a graphics background, not so much.  And thus much of my time on this project has been spent fighting small skirmishes with things like indentation.  And tilting at windmills like aligning the Roman numerals used in a list. 

(This is why I'm always happy to hand layout work off to web designers/developers who actually grok cascading stylesheets (CSS).  And give them ample kudos for not only understanding the difference between box and inline flow, but also knowing where the sharp edges of each version of each browser are.)

But it's not all slogging on this project.  A closer look at the Atom syndication of this blog handed me an unexpected gift, namely that it includes the labels (e.g. "Software Development," "Innovation," etc.) that I can optionally apply to any blog post.  That gives me a way to automatically cross-post specific topics from this blog to the business site.  (In a shocking development, potential clients may not actually base their purchasing decisions on a freelancer's ability to filk tunes or scour the internet for Elvis photos.  Or even for their world-class sarcasm.  Who knew?)

What makes that possible is that, while HTML tags (and their attributes) are used by web developers to format content, Atom feeds (like their RSS forebears) use tags to give meaning to content.  (Yes, I know that, originally, HTML was intended to denote things like paragraphs, headings, tables, etc.  It didn't take too long before that was honoured more in the breach than the observance.  Now, with a little CSS, you can create lists that run horizontally rather than vertically.)
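
For the curious, the label-filtering half of that is about as small as code gets.  Here's a minimal sketch using the third-party feedparser package; the feed URL and the label below are placeholders, not the real thing:

import feedparser

FEED_URL = "https://example.blogspot.com/feeds/posts/default"  # placeholder

def posts_with_label(feed_url, label):
    """Yield (title, link) for entries tagged with the given Atom category."""
    feed = feedparser.parse(feed_url)
    for entry in feed.entries:
        terms = {tag.get("term", "") for tag in entry.get("tags", [])}
        if label in terms:
            yield entry.get("title", ""), entry.get("link", "")

for title, link in posts_with_label(FEED_URL, "Software Development"):
    print(title, "->", link)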

Thus, in the context of one very minor project, it's hard to ignore the two directions in which the web has been--and will continue to be--pulled.  The left brain prefers XML content that is what it says it is (titles, publication dates, etc.).  The right brain wants HTML5 that's pretty and friendly to any number of screen sizes.

And while I'm fairly disciplined about marking up my HTML to make it XML-like, I know by now that the promise of XHTML will never be realised.  But maybe convergence isn't the point.  Maybe it's a yin and yang thing...and maybe that's also fine.  Because the software trade needs both left and right brains.

Friday, November 14, 2014

Frivolous Friday, 2014.11.14: Live-blogging my NADD* navel-gazing

In software development, there's a term known as "yak-shaving," which refers to all the time-consuming stuff you didn't budget for, but that has to happen before you can get down to the serious business of coding.  Or maybe debugging.  Scott Hanselman's definition is the most cited.

Today, my client and I are basically in evaluation mode for code that's been rolled out in pieces over the last few weeks.  So far--knock on wood--nothing's gone kerblooey, so I set aside a "half-day" to get my house in order for upcoming development projects.

W00t!  New dev. tools!  It's Christmas morning!

... But first, I really should install all those system/software patches to which I've mostly been giving lip service (if that).

... But there are some kernel-level updates for the Debian laptop (my primary computer for development and emailing clients)--it'd be a good idea to back up email first.  In two places, because this is mission-critical stuff.  Okay, patched and turned off.

... But the brand-new Windows 7 installation is complaining that the standalone (not OEM) copy of Windows is not valid.  Some updates fail.  So does the online attempt to prove to MS that my copy came from Best Buy and not the back of a windowless van.  So I dig out the DVDs and re-type the activation code...no thanks to a certain Office Cat #1 who shall remain nameless.  Reboot, lather, rinse, reboot.

... Ubuntu workstation updated w/o any static.  Good baby.  [pats top of case]

... But the Windows 8 laptop is not prompting me to download and install updates, despite being off for well over a week.  Okay, where are they hiding Windows Update in Windows 8?  Found it.  Go do something else while those updates take their sweet time to download.  Install.  Reboot, which installs more patches.  Okay, you do that, Windows 8.  Sure, let's take Windows Defender out for a run while we're at it.

... So...time to turn this shiny Win 7 installation into a real software development box.  Geany?  Check.  Mercurial?  Check.  NetBeans?  Whoops--I need the JDK first.   Which fails on the first download.  Try again.  Okay.  Now NetBeans.  Cool.

... But out of the box, a lot of Geany's default settings are the polar opposite of my preferences.  Go away, message window and sidebar.  80 character line wrapping, please.  Show white-space, and tab-indent at three spaces, thank you.  That will be all for now, Windows7.

... But last night I learned that Microsoft had released a new freebie version of Visual Studio.  So let's figure out how to enable IIS on Windows 8 (no biggie).  Microsoft wants you to create an account before they'll give you free stuff.  Fair enough.  But, no, Dennis doesn't have a login I can borrow.  Dig out the password file to make sure I don't have one.  Go to account page; try to come up with an ugly but memorable (to me) password.

... Knowing how much Microsoft will email me, I select my usual spam email.  Sign in to said email account to activate account.  Decide that as long as I'm at it, I should delete all the cron job emails that landed when I was testing an enhancement last week.  Okay, now whack a bunch of other automated emails.  Account activated.

... But there's very little point in IIS + Visual Studio w/o a database, so off to find the download for the freebie edition of SQL Server, particularly since I'm already logged into my Microsoft account.

... SQL Server download is lickety-split.  Expected download time for Visual Studio climbs to over 24 hours.  Kill that.  Crud.  I already closed the download window in the browser.  Go find it again.

... Waitaminnit...isn't Firefox supposed to be dumping my history & cookies on every shutdown?  Go check settings on that.

... Dang.  I should have thought of the Visual Studio / SQL Server thing before I turned off the Windows 7 box.  Turn that box back on.  (Have I mentioned that I heart KVM switches?)

... Whoops, except that I heard Windows 7 make the startup beep, which means it rebooted itself, which requires me to (quickly) switch back to it on the KVM or it won't pick up the keyboard, mouse, and monitor and I'll have to crash it.  Okay, as long as I'm here, I might as well log in with my shiny new Microsoft account and get my freebies.  Some runaround from SQL Server...maybe I should have just sneakernetted the executables from Windows 8? 

... Um, no, I will not wait six days for a 6.x GB file that you're streaming through a digital eyedropper.  Yeah, sneakernet...something's not right.

... So....what, precisely, just happened after I installed what's supposed to be SQL Server 2014 Express?  There's absolutely nothing in Program Files, and I can't even find its daemon in Services.  Fine.  Uninstall, and hopefully I didn't clutter up the registry too badly.  Back over to Windows 8 to figure out what's going on here.

... Oh, for the love of Cthulu, how many updates does Windows 7 need to install?  Guess we're shutting down.  Again.

... Time for dinner now.  (Yeah, technically, I'm just sharpening my razor and not shaving the yak in this step.)

...  Okay, back at it.  Cool.  Now, where were we?  Right--figuring out what I actually installed when I thought I was installing SQL Server Express 2014.  Ah.  Got it.  Don't take the default options.  Let's try this again.  No, you can't contact me at my business phone number, Microsoft.  You're on the West Coast; you don't even know my time zone exists.

...  Oooof...this is going to take awhile to download.  Maybe this would be a good time to snag jQuery & jQuery Mobile, maybe even see what we can do about setting up for Apache Cordova development, since that looks like it will be in the cards shortly.

... jQuery Mobile was almost a no-brainer except for Windows freaking out about unzipping a file in the inetpub folder.  Now download and install Node.js (crazy-simple) and Git for Windows (because Cordova uses both under the hood).

... Oh, there's a free eBook for Git?  Groovy.  Gimme summa that goodness.  Fetch the tablet and cable and import the .EPUB into the Aldiko eReader.  (Bonus:  This is Windows -> Android.  Ergo, simple little file copy.)

... Command line...npm install -g cordova.  Oh, fun little retro touch of -/|\ spinner!  Totally brings me back to the 90s.  (Good times, the 90s...except for the part about graduating as a Liberal Arts major into that pre-dot-com recession.  But, hey, I could eat half a pan of brownies and burn off the calories drinking a pot of coffee.)

... Oh!  Looks like SQL Server Express (the bells-and-whistles version) is done downloading.  Let's use the Win 7 box as the guinea-pig on this.  Copy to USB drive...eject...copy to Win7.  Mostly take the defaults while installing.  Aaaaaannnnd wait...

... Meanwhile, back on Windows 8, let's snag MySQL and MySQL Workbench.  Oh, Workbench can be installed alongside MySQL.  Ossum.  Let's do that.

...  Or not.  Visual Studio (still downloading...allegedly for another hour to go) has MySQL connectors, and MySQL knows this.  It also wants Excel (not gonna happen) and Python (that we can do).  Long story short, though, this isn't going to all happen tonight. 

... Checking in with Windows 7, it's finished installing SQL Server Express 2014...plus a piece of SQL Server 2008.  It's taking awhile to launch, which I put down to initialisation issues--and toggle back to Windows 8 to install Python.

... While Python is installing, peek in at Windows 7, and find that SQL Server Management Studio will at least launch--if slowly.  Elect to install yet another round of updates (106 of them, as it turns out--I am not making this up) and shut that machine down for the night.

... The Visual Studio installer still--allegedly--has  a half-hour to go.

And now it's nearly 11:00, and it doesn't look like I'll get to rebuilding a Raspberry Pi (which--my bad--I more or less rooted b/c I was misinformed about the permissions one needs to patch Raspbian); that will have to wait until later this weekend.   Determining how well MacOS will run in a VMWare instance on Ubuntu is looking like it might even have to wait until next weekend--assuming I can finagle a legit copy, of course.  (Dirty secret:  To download a copy from The Mac Store, you have to use, well, a Mac.  Open-source, commodity hardware hippies like m'self have to do a little horse-trading, y'understand...)

A small part of me desperately misses the SysAdmin Who Walks on Water.  But the vast majority of me fully appreciates what an amazing time it is to be a developer here in the developed world.  As I hope that the above (bit)stream-of-consciousness fully demonstrates, the only real problem is the embarrassment of riches one has at the other end of one's broadband connection.

And I would be remiss if I didn't credit the commercial software behemoths for what they contribute to the ecosystem.  Microsoft is mentioned above...a lot...but Oracle--miraculously, despite every incentive--has yet to kill off MySQL.  Java/NetBeans are still free-as-in-beer in 2014 (also thanks to Oracle's noblesse oblige).  Apple has loosened the screws--a bit--on how one can generate the bits for an iOS app.  One hopes they will eventually have no choice but to come back to ground in other respects--particularly if they don't stop treating Mac developers like Untermenschen.

- - - - -

* NADD is a term coined by (the oft-quoted) author Michael Lopp, and stands for "Nerd Attention Deficit Disorder."  NADD stands in surreal contrast to the monomaniacal state of concentration we geeks are known to achieve when debugging or taking a firehose of interesting data straight to the brain.

Tuesday, November 11, 2014

A nerdy Remembrance

I decided last evening to break the Monday-Wednesday-Frivolous-Friday pattern of this blog to make a tech-related post relevant to Remembrance/Veterans Day.   Arguably, drone warfare is the ne plus ultra (and probably the reductio ad absurdum besides) of the earliest military apps--namely, target acquisition.  But having already covered that plus some origins of the wireless communication that also makes drone strikes possible, I was casting about for a fresh twist on the intersection of military and technological history.

WWII buff husband (and kindred geek soul) to the rescue!

I want to make it absolutely, Waterford-crystal-clear that my intent tonight isn't to glorify military applications of technology.  But when Dennis mentioned something called the Norden bomb-sight, I was intrigued.  Because, after all, the whole point of the contraption was ultimately to drop fewer bombs overall by having them land on people who (arguably) "deserved" them...as opposed to collateral civilian damage.  (Which, in turn, requires fewer missions in which "the good guys" will fly into harm's way.)  For that to have a reasonable chance of happening, altitude, airspeed, the vector of the aircraft (relative to the target), the speed and vector of the wind (relative to the aircraft) all have to be taken into account.  (Remember that the next time you're assigned quadratic equations by the dozen in Algebra.  Also, please be grateful that you don't have to solve them while being shot at.)
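
To get a feel for the arithmetic the bombardier was up against, here's a deliberately over-simplified sketch--vacuum trajectory only, no drag, and made-up numbers--of the core "where will it land?" question that the sight answered mechanically:

import math

G = 9.81  # gravitational acceleration, m/s^2

def forward_travel(altitude_m, ground_speed_ms):
    """Horizontal distance a falling bomb covers before impact (drag ignored)."""
    time_of_fall = math.sqrt(2 * altitude_m / G)
    return ground_speed_ms * time_of_fall

# e.g. released from 6,000 m at a ground speed of 70 m/s (airspeed already
# corrected for wind): roughly 2.4 km of forward travel to account for.
print(round(forward_travel(6000, 70)), "metres")

The real device, of course, had to solve a much messier version of that--continuously, and with drag and drift thrown in.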

What Dennis described over dinner frankly sounded like what you'd see nine months after a gyroscope, a telescope, and a slide-rule all woke up during a hazy weekend in Vegas.   That's not too far off in some ways, though later incarnations of the device also plugged into the plane's autopilot and even controlled the bombs' release because its precision was thrown off by human reaction-times.

Not surprisingly, this was top-secret stuff, at least until toward the end of The Good War.  Norden-made (and rival Sperry-made) bomb-sights cooled their gears in safes between flights, and were supposed to be destroyed in the event of impending capture--even at the cost of American lives.  British allies offered to trade the Crown Jewels--meaning sonar technology, rather than the shiny gew-gaws currently on display in the Tower of London--for its use.

Ultimately, however, it was an evolutionary dead-end in technology.  It was sold to the Nazis by spies, but never used by a Luftwaffe that preferred dive-bombing.  American bombers eventually adopted the carpet-bombing tactics of their RAF counterparts.  So why care?  (Well, besides the fact that I'm kind of a sucker for analog computing...though I have yet to learn how to use a slide-rule.  Bad me.)  Alas, it's also a grand textbook-quality example of a technology's life-cycle.  
  • Usability issues?  Check.  Earlier versions of the device almost required an advanced degree in Mathematics...and pure math nerds could arguably be more useful at Bletchley Park or Nevada.  (To its credit, such issues were addressed in future iterations.)
  • Prima-donna egos?  Check.   Its eponymous developer, Carl Norden, had originally worked with his future rival Elmer Sperry, but the two geniuses had parted ways before the onset of the First World War.
  • Over-hyped promises that didn't hold up under field conditions?  Check.  Jet-stream winds, cloud/smog cover, higher-than-anticipated altitudes (required to avoid detection and anti-aircraft fire) and a host of other issues put the lie to marketing claims of dropping bombs into "pickle-barrels," and users were forced to develop workarounds (literally) on-the-fly.  (Worse, failures have been too often blamed on operator error.  Oh-so not cool, yo.)
  • Engineering vs. Management?  Check.  Mercifully, the Navy paired up Army Col. Theodore Barth as yin to Norden's yang.  The two became not only a formidable combination but good friends.
  • Politics?   Check, check, and check.  Army vs. Navy. U.S. vs. U.K.  Not to mention Round II of Sperry vs. Norden, when the former was called on to take up the slack in the latter's ability to keep up with wartime demand.
  • Prolonged obsolescence due to bureaucratic inertia?  Check.  When last sighted (pun intended), Norden bomb-sights were dropping reconnaissance equipment in Vietnam.
None of the above is intended to cast the technology as a failure.  Even its incremental improvements during wartime were impressive.  Doubtless, its successes saved lives that would be difficult, if not impossible, to count.  And the impulse to minimise loss in an era of total war is nothing if not praiseworthy.

Then, too, as a programmer--particularly one married to a recovering manufacturing engineer--I flatter myself that I have some appreciation of the problems of scaling something new.  Sometimes it seems like the real world is nothing but edge cases.  At the same time, once it latches onto something better than the former status quo, it typically relinquishes it only after a bitter fight.  I sympathise--really, I do.

Yet, if the Norden example is how old men behave when they send young people into possible death and mayhem (with PTSD, addiction, divorce, homelessness, neglect, labyrinthine bureaucracy, and who-knows-what evils to come), the least we can do for current soldiers and future veterans is to give them better old men (and nowadays, old women). 

So do me a favour and keep your poppy handy for the next time you head to the polls, okay?  For those of us who don't work directly for veterans/soldiers week-in and week-out (i.e., most of us), that's the only kind of "remembrance" that truly matters.


Monday, November 10, 2014

Software innovation, security, and the chain of plus ça change

I've been thinking of sending a client a copy of Geoffrey Moore's Crossing the Chasm to give him an inside perspective on launching a new software offering.   Whenever I do that sort of thing, though, I always re-read the book myself, in case it comes up in discussion.  It's a fantastic book--don't get me wrong--but it's making me grind my teeth because my copy is the 2nd edition from 1998.  That the company/product references are stale isn't so bad--c'mon, I'm a History grad.  It's the feeling that I might be missing insights relevant to our broadband, mobile-driven, social media phase of the internet age.

Moore's non-tech references have sent me scurrying out to Wikipedia a few times so far.  One of those references was a quote by Willie Sutton, a prolific--though gentlemanly--bank-robber of the mid 20th century.  One of Sutton's nick-names was "the actor," because his preferred M.O. was impersonating people who would have a legitimate reason to be in the bank, jewelry store, etc. as a non-customer.  Additionally, one of his prison escapes involved dressing as a guard.  The true brazenness of that escape was in how, when he and his fellow escapees were caught in the glare of a searchlight as they were putting a ladder against the wall, Sutton shouted, "It's okay!" and the gang was allowed on its merry way.

Sutton caught my interest not because of his apocryphal quote, but because of his later career as a security consultant, writer, and general casher-in on his notoriety.  Toward the end of his life, he was even featured in advertisements for photo-IDed MasterCards, which (tongue-in-cheek) advised bank customers to "tell them Willie sent you."

It was impossible to miss the parallels with the only slightly less flamboyant Kevin Mitnick, over whom the media and law enforcement of the Clinton Administration worked themselves into a hysterical lather*.

Mitnick repeatedly stresses that his "successes" were more due to social engineering than software engineering. To quote an interview with CNN:

"A company can spend hundreds of thousands of dollars on firewalls, intrusion detection systems and encryption and other security technologies, but if an attacker can call one trusted person within the company, and that person complies, and if the attacker gets in, then all that money spent on technology is essentially wasted. It's essentially meaningless."

In other words, the art of impersonation strikes again.  Also like Sutton, Mitnick's career after "going straight" (in the parlance of gangster movies) involves hiring out his expertise to people who want to defend themselves against people just like him.  And, of course, writing books. 

Which--in the cluttered curiosity shop I fondly call a memory--calls to mind parallels even further afield in time and space.  My Gentle Reader will not be shocked to learn that the "father of modern criminology" and history's first private detective was a reformed criminal.  (Also unsurprising:  Vidocq's appeal to storytellers and novelists, which presumably didn't dent the sales of his own ghost-written autobiography.)

Then, too, in this part of Maritimes Canada, I only have to drive a few hours to view the remains of 17th- and 18th-century star forts in various states of preservation/restoration.  The star fort has its origins in the 15th century (as a response to the innovation of cannon).  But the example of Fort Anne in Annapolis Royal, Nova Scotia brings to memory the name of the Marquis de Vauban.  Vauban's career under Louis XIV was doubtless capped by his gig as Marshal of France.  But that career was made as an expert in both breaking and defending such fortifications. (In other words, he was a one-man arms race.  I'm sort of shocked that he didn't write an autobiography, too.)

Doubtless, My Lord de Vauban would strongly object to being compared with the above rogues, however they might have redeemed themselves to society.  Yet the parallel is undeniably apt, even for an age defended by earthen walls rather than firewalls.  The best defender is an accomplished (though hopefully reformed) offender, it seems.

Long--and meandering--story short, I'm probably fretting needlessly about missing any new insights on ideas that have been relevant since 1990 (when Crossing the Chasm was first published).  As we've seen, very rarely is there anything truly new under the proverbial sun.  But, hey, as long as I'm already making a trip to the bookstore anyway...

- - - - -

* "While in Federal custody, authorities even placed Mitnick in solitary confinement; reportedly, he was deemed so dangerous that if allowed access to a telephone he could start a nuclear war by just whistling into it." - Forbes. 2013.04.11

Friday, November 7, 2014

Frivolous Friday, 2014.11.07: What is your computer age?

It's probably a textbook case of priming, but after a Facebook exchange with pal Larry earlier this week, the "What is your real age?" (cough!) "quiz" (cough!) seemed to pop out of my time-line like baby Surinam sea toads hatching from their Mom's back.

Larry was taking exception to the fact that the cringe-worthy use of "literally" by people who really mean "figuratively" is receiving official recognition.  Doubtless, the Romans seeing Alaric's Visigoths on the horizon felt much the same. 

The English major who inhabits those corners of my soul still perfumed by old books and fresh ink is not unduly concerned.  After all, this is the natural order of things.  The person who lives where two languages blend smiles and agrees.  The History major squawks, "Just be thankful you're statistically likely to live long enough to witness it!"

My inner I/T Geek just rolls her eyes and thinks, "Oh, honey, please."

I'm already feeling d'un certain âge as it is.  Granted, I've thus far been spared the horror of hearing my all-time favourite tune from high-school re-booted by a dreckish pop-star/boy-band who won't be around long enough to be filked by Weird Al.  But it's bad enough hearing covers of crap that should have stayed buried alongside the first Reagan Administration.  (Ditto movies.  I mean, seriously-and-for-realz-y'all, was Footloose not actually bad enough the first time around???)

But compared to measuring age by computer advances, that pales to #FFFFFF.  Go back to the line in Apollo 13, where Tom Hanks' character talks of "computers that only take up one room."  I was born when they were still multi-room.  Gordon Moore had already made what must have seemed like pie-in-the-sky predictions about the computing capacity of the future, at least to the specialists in the know.

But advances in miniaturisation meant that permanent storage (a.k.a. hard drives) had actually been removable for several years.  What's more, you could actually own them instead of renting them from IBM, who needed a cargo plane to ship 5 megabytes to you a decade or so earlier.

My step-sisters had "Pong" in the late 70s, but it wasn't until the (very) early 1980s when the middle school computer lab teacher humoured me by actually letting me store my (admittedly) pathetic attempt at re-creating "Space Invaders" onto cassette tape.  Our TRS-80s and TRS-80 IIIs didn't actually have removable storage.  For normal programming assignments, you printed out your BASIC program and its output in 9-pin dot-matrix on 15cm wide silvery paper (that picked up fingerprints like nobody's business), stapled it to your flow-chart, and turned off the computer (losing your program for good).

By high school, we had the mercy of Apple IIes and (360 KB) 5.25" floppy drives--i.e. no retyping programs from scratch if you screwed up.  And 5.25" floppies it remained through college--CDs were what music came on...if you weren't still copying LPs onto cassette for your Walkman.  I carried two of them in my backpack.  One was the DOS boot disk, and the other the disk that held both my word processor (PC-Write) and my papers.  Later, I schlepped three whole 5.25" floppies.  That was after PC-Write freaked out and somehow sucked my Senior project into its Help section.  (True story.  The tech. in the University lab could only say, "Wow, I've never seen that happen before," and my BFF the MIS major quizzically enquired, "Didn't you have a backup?"  and I looked at her like, "What's a backup?"  And my boyfriend spent part of Spring Break helping me type it all back in.  I married that guy--small wonder, hey?)

Nowadays, I still carry a backpack.  It's green, not stressed denim.  It's slightly smaller, because I'm not toting tombstone-weight textbooks.  Yet it carries the equivalent of over 186,000 5.25" floppy disks.  (In case you're wondering, that thumb drive is actually a step-sibling to the one that lives in my purse.  So, yes, I have actually learned my lessons about having a backup.  Go, me. [eyeroll])  And that's not counting what's archived to cloud drives at speeds that would have been pure science fiction to the high school modems with their cradles for telephone receivers.  (Or, for that matter, even in the days when we were using AOL CDs for coffee-coasters.)
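
For anyone who wants to check my math:  that figure assumes--hypothetically--a 64 GB thumb drive and 360 KB double-density disks.

# Back-of-the-envelope check, assuming (hypothetically) a 64 GB thumb drive
# and 360 KB double-density 5.25" disks.
FLOPPY_BYTES = 360 * 1024
THUMB_DRIVE_BYTES = 64 * 1024**3

print(THUMB_DRIVE_BYTES // FLOPPY_BYTES)  # 186413 -- "over 186,000" indeed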

So, although I was born into a time that left the first quarter or so of my life oblivious to personal computing, that kind of obliviousness is now pretty much impossible for folks in the developed world...to say nothing of the constant churn computing introduces into daily life.  And, when you spend your workdays mostly under the hood, setting your clock to the pace of hardware, software, and networking evolution is a sure way to feel ancient in a hot hurry.  (And, for cryin' in yer Sleeman's, don't even think about measuring time by the lifespans of technology companies.)

Fortunately, for anyone who considers I/T a craft or a calling, rather than a ticket to a middle-class lifestyle, it's a wellspring of endless possibility.  Perhaps even a fountain of youth for those who opt to drink deeply enough.

Thursday, November 6, 2014

Treating attention as a resource

In "The Reichenbach Fall" episode of the BBC series Sherlock, Moriarty trash-talks, "In the kingdom of locks, the man with the key is King.  And, honey, you should see me in a crown!"

In the digital kingdom, locks mostly come in three forms:  firewalls, encryption algorithms, and of course username/password combinations.  Keeping the baddies' fingerprints off our email addresses, credit card numbers, nude selfies, what-have-you is the point. 

But, genius that he was (or is?--we won't know until 2016), Moriarty didn't mention another species of baddie to whom locks were also immaterial.  Namely the counterfeiter.  Throwing back to Sherlock Holmes, this time the original incarnation:  "...the counterfeiter stands in a class by himself as a public danger."  Why?  Counterfeiting is really a double-crime because, if undetected, it ultimately debases the value of the real deal.

You can make the case that spam, clickbait, and SEO shenanigans fall into this category, particularly when they're done convincingly.  And they will only become better at pick-pocketing our attention.  Doubtless, there are already children in pre-school born with immunities to NewsMax skeeziness coded into their DNA, so I shudder to think of the evolutionary counter-strike coming soon to a browser window near you...er, me...okay, all of us. ;~)

We demand an internet with locks in place to prevent our money (and any personal brand we might cultivate online) from being stolen.  We resent the bandwidth siphoned off our data plans by spam & ads.  "You wastin' my minutes" denotes the waste of both money and time.  But I find it absolutely bizarre that we do not guard our attention with the same jealousy we apply to our money and time.  To a degree, it's understandable.  Multi-tasking is a prized skill--has been since Julius Caesar's reputation for dictating four letters at once...all while other people were yakking at him, no less.

The conventional wisdom is that a knowledge economy is our future.  (Personally, I'm not buying it, but that's a whole 'nuther blog post for a whole 'nuther time.)  If you subscribe to that notion, though, you pretty much have to trade the adage that "time is money" for the more accurate "attention is money."  In an economy powered by three shifts of people standing in front of machinery, attention took care of itself.  Lack of attention on the part of the worker generally ended in maiming or death and possibly a starving family afterward.  That was a world of time-clocks and piece-rates.

Powering this more nebulous economy of knowledge-y thingamajigs, however, is not a series of discrete steps performed by interchangeable labour.  There can--and should--be a process in place, certainly.  Metrics, too, one hopes.  But the emphasis is on collaboration, not a waterfall assembly-line.  (Hence the dreaded open-office layout.)  And, at least in theory, a key differentiator will be the quality of worker--not only as individual talent, but also how well their atom bonds with other atoms in a team's molecule.

But, at some point, all that cross-pollination is supposed to gestate into some innovative-y, disruptive-y, paradigm-shift-y thing that will make the company the next Google or Apple.  And that absolutely, non-negotiably, requires focus--a.k.a. uninterrupted attention.  You know how those last couple hours on a Friday (when the office has mostly cleared out) can be more productive than the entire rest of the week?  Behold, the power of attention. 

Yet here we are in 2014, when the most politically important meetings are too often the useless ones.  In 2014, we still believe that Silicon Valley is "innovating" when apps. like Snapchat are handsomely rewarded for dumping still more navel-gazing bits into the internet.  (And somebody, for the love of Cthulu, pretty-please tell me that "Yo" is dead.  Please?)  In 2014, neither Twitter nor (especially) Facebook has added a "Snopes this before you look like a moron, m'kay?" button.  (Okay, maybe that last one's wishful thinking, and sites like Snopes and Politifact might not appreciate the surge in traffic anyway.)

Let's pretend for a minute that the knowledge economy isn't just another hand-wavy term made up by MBAs to make us believe that there's light at the end of the offshoring tunnel.  If that's actually true, then we need to treat attention--ours and others'--as the coin of the realm.  Prioritise the technologies that can boost signal and/or cancel out noise.

Given my druthers, I'd like to see the counterfeiters put out of business for good--specifically the greater good of the internet.  It would free up resources--including mine--to solve more pressing problems.   For the time being, however, here in the kingdom of firehoses, the woman with the sieve is Queen.

Monday, November 3, 2014

Of grey hairs and last leaves

Last week, I was teaching some old code new--or at least more sophisticated--tricks.   At one point, I tripped over a particular tactic that I thought I'd more or less "outgrown" by the time the code was originally written.  Every programmer who's been at a job long enough to replace the batteries in their cordless mouse knows this feeling.

Now, there was absolutely nothing wrong with the "old" code from either a functional or performance standpoint; it basically boiled down to style.  Alas, for all the wrong reasons--what future maintainers of this code might think heading the list--I "modernised" the syntax.  That's bad for two distinct reasons:

1.)  I'm basically lying to myself about myself.  I could write a whole blog about how not cringing at your old code is a sign that you've stagnated.  Doubtless, that's been done many times over.  So I won't.  (Though, for all I know, I already have.  I'm a little hazy on most of this blog's early history, truth be told.)

2.)  That change had to be tested to make sure I didn't introduce any errors or other unintended side effects into the code.   That wastes budget and time.

Don't get me wrong:  I believe in making code as easy as possible to skim.  That's no more than professional courtesy to future maintainers of that code...who will probably look an awful lot like a slightly greyer version of me.  Who, by the bye, should be old enough to appreciate what a little salt-and-pepper and laugh-lines around the eyes can add.  (Case in point--George Clooney.  A decade ago, I rolled my eyes at all the swooning. Now? Hawt.) In code, it's a signal that this isn't greenfield work; that's a valuable insight.

One more thought:  Trying to disguise the age of a code-base isn't doing newer programmers any favours, either.  The digital equivalent of an As-I-am-now-so-shall-you-be memento mori is good for perspective.  In our trade, that lesson's better internalised sooner than later.

And if I should live to be
The last leaf upon the tree
In the spring,
Let them smile, as I do now,
At the old forsaken bough
Where I cling.

- Oliver Wendell Holmes, Sr., The Last Leaf