Thursday, December 7, 2017

The limits of affordances

Just because I'm mostly a back-end (read:  database and business logic) developer doesn't mean that I don't care about UI or UX.  Mom drilled the ethic of putting myself in another person's proverbial shoes into me pretty early on.

For what I usually do, that plays out in well-organised data-sets, scads and scads of validation code, and fighting tooth-and-nail for the absolute rock-bottom minimum of data passing back and forth between client and server.  Mostly do-able...assuming you aren't double-teamed by Marketing and Legal.  (Pro Tip:  Know your bandwidth/hosting costs cold, have your calculator handy, and put the onus squarely on those nosey-parker byte-hoovering data-slobs to quantify the ROI.  Preferably in front of upper management.  Chances are, they can't and won't.  And they'll think twice before doing it again.  Alas, twice isn't always enough times.  Lather, rinse, repeat.)

While my world is largely ruled by, well, rules, front-end designers and developers wrestle with the container-of-fuzzy-spaghetti-from-the-back-of-the-fridge otherwise known as the modern human psyche.  More power to them.  I've wrestled heisenbugs a-plenty (losing more than a few rounds, mind you), and still would rather re-fight every single one of those battles than allow a random user to type a DateTime value as plain text.  Seriously. 

Thus I'm a starry-eyed fan of design folks like Don Norman, Mike Monteiro, Erika Hall, and Luke Wroblewski.  (Victor Papanek's magnum opus has been on my wish-list for years, but it's been out of print for so long.  Some day...)  Also behavioural psychologists like Dan Ariely and Daniel Kahneman.  Likewise Clay Shirky when it comes to pointing out the differences in how humans act individually vs. in groups (absolutely critical in this connected age).

The single most important take-away from all this reading is:  Thou shalt make it easy for thy user to do The Right Thing(TM).  (In an ideal world, thou shalt make it statistically impossible for thy user to do the wrong thing, but -- Mike Monteiro excepted -- nobody in this crowd is rocking anything like Charlton Heston's beard from The Ten Commandments, so...).

And yet, for all that reading--in most cases multiple readings because it's all crazy-good--I find its limits no farther away than the local co-op grocery store.

Several years ago, I had both the keep-you-on-your-toes challenge and the genuine pleasure (because they shouldn't be mutually-exclusive) of holding down the account for a household-name client.  The kind of client that was flush enough to send holiday gifts to their vendors.  As a matter of corporate policy, I donated gifts like the cheese-knives and cutting-board to the office kitchen.  (Being the Patron Sinner of the office's "liquid potlucks," what else would I do with that kind of thing, I ask you.)

The two shopping bags were another story.  Lovely things, those.  Two sets of handles, for shoulder or hand.  Constructed from the recycled (PET) plastic equivalent of Grade-A Wakandan vibranium, IMLTHO.  (Or adamantium -- take your pick, Marvel fans.)  Crossed with a Tardis to boot.  They're awesome.  Especially when compared to the branded "reusable" bags hawked by the local grocery stores.  Two of which (in our household, at least) have been known to blow their seams in less than a month...one on the very next use.

So I, being a back-end developer aspiring to minimise my UI/UX sins, attempt to do the right thing by making it easy for my user (a.k.a. the cashier) to do the right thing.  Which means the double-decker short cart rather than the monolithic monstrosities with the turning-radius of a battleship.  Heavy stuff on the bottom (because Physics, nat'cherly) and little and/or smooshable stuff on top.

At the checkout line, the first thing dropped onto the conveyor-belt is the bags.  With the awesomer ones unmistakably on top of the not-so-awesome ones.  Then the groceries, carefully graded (front to back on the belt) from heaviest to lightest.  The affordance couldn't be plainer:  Put the stuff closest to you in these-here bags up-top.

Yet, well over 90% of the time (in my observations), the cashier dumps the awesome bags off to the side and loads the flour, the shortening, the canned goods, etc. into the el-cheapo bags and resorts to the industrial-strength bags only when the others have run out.

Why?  Because when it's the difference between the known and the unknown, affordances don't mean jack.  As far as the average customer and cashier are concerned, all branded "reusable" (in my case, the quotation-marks are there for good reason) bags are churned out by the same no-name offshore factory.  (And, for all I know, they are.  Booyah, race-to-the-bottom globalism, yo.)  The default option -- i.e. the ultimate affordance -- is the known quantity.  It's the same reason that the "Save" button still looks like a 3.5" floppy disk and not a USB flash drive or an SD card.  The exact same reason.

Which, while it's frustrating as all get-out, is deeply humbling.  It's why I consider darned near every UI/UX developer on the planet grossly underpaid.  As much as non-programmers puzzle (perhaps shudder) at the inner workings of computers, I embrace them as fundamentally logical -- ultimately knowable.  In contrast, stepping into the world of human-computer interaction means trading the near-certainties of Boolean logic for the fuzzier approximations of statistical norms...clouded by the occasional chaos of lizard-brained mob-thought.  [shudder]

Alas, my patron-client must have custom-ordered the awesome bags, because they have no manufacturer's tag.  (They're a dark olive green, slightly beveled bottoms, two sets of handles -- can ya do a girl a solid here, Intertubes???  Halp.) Or I'd cheerfully solve my weekly problem by ordering a few more.  ("Thou shalt make it impossible for thy user to do the wrong thing," remember???)

The other, cheaper, solution, of course, would be to just tell the cashier to use the weirdo bags.  Except that cashiers are basically just trying to optimise for ringing up as many sales as possible without getting yelled at.  At the risk of sounding ancient, I'm old enough to remember when ringing up sales (and making small-talk about the weather) was all cashiers had to do.  Another person not only bagged your groceries, but also loaded them into numbered tubs so that a third person could help load them into your car (if you so chose).  Contrast that with the utter faceless dickishness of Home Depot, et al., and their self-checkouts, and you'll understand why I refuse to participate in any further dehumanisation.

Ultimately, computer programming, like any discipline, has to push the boundaries to stay relevant.  In the 1990s, we had the waggish adage of "Intel giveth and Microsoft taketh away."  And while I sometimes wonder whether an on-demand "serverless" world of ginormous anything-goes NoSQL databases and spaghetti-nets of asynchronous-microservices-du-jour will trash back-end coding as a legit. discipline, I have no such worries about UI/UX developers.  Those boundaries, however mystifying to a left-brainer like myself, are vast but relatively immutable.  And certainly no less challenging for all that.

Wednesday, November 8, 2017

"Experience is what you get when you were expecting something else."

That was the little bio. tag-line for someone on the Arduino forums.  (I didn't make a note of their handle.  Apologies!)  Granted, if you're trolling through the Arduino forums, you're already feeling that.  But boy howdy, that hit a little too close to home for me.

Backstory:  The chicken coop is almost finished from a structural standpoint.  For monitoring temperature, humidity, and ammonia (NH3) levels, I decided to take the cue from my neighbour and roll with a wifi-based interface vs. Bluetooth/Android. 

The wifi breakout board that I'm holding up below is the ESP8266:  Version 1.0 form-factor with the Version 12 firmware.  Go figure.

[Photo:  yours truly holding up the ESP8266 breakout board]

I've known for well over a year what a difficult little beastie it is.  
  • The above form-factor's pinout is downright breadboard-hostile. (Two rows of pins?  Just...whyyyyyyyyyy????) 
  • It runs on 3.3 volts (vs. 5 volts for most other things).
  • Like all electronics that transmit a signal, it's a PIG for current (~300mA peak).
  • Constructing and sending an HTTP GET request is super-fussy about syntax (including carriage returns) and timing -- there's a sketch of that dance just below.  At the time of writing, HTTP POST is above my pay-grade.
And as the proverbial cherry on top of all that lamesauce, my (Ubuntu 12.04) workstation has this nasty, sneaky tendency to quietly drop USB -> Serial connections.  When you're in the middle of debugging and already inclined to blame your code, this can waste scads of time.
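
To give a feel for that GET-request fussiness, here's a minimal sketch of the dance.  Hedge alert:  this is illustrative, not my production code.  It assumes the ESP8266 has already joined the wifi network (AT+CWJAP and friends), is wired to pins 2/3 through a 3.3-volt level-shifter, and talks at 9600 baud; the IP address and path are stand-ins.

    #include <SoftwareSerial.h>

    // ESP8266 TX -> pin 2; ESP8266 RX <- pin 3 (via the level-shifter!)
    SoftwareSerial esp(2, 3);

    // Send one AT command (the firmware insists on CR+LF), then give the
    // module time to chew.  Hard-coded delays are crude, but they show
    // just how timing-sensitive this dance is.
    void atCommand(const String &cmd, unsigned long waitMs) {
      esp.print(cmd + "\r\n");
      delay(waitMs);
      while (esp.available()) Serial.write(esp.read());  // echo for debugging
    }

    void setup() {
      Serial.begin(9600);
      esp.begin(9600);

      atCommand("AT+CIPSTART=\"TCP\",\"192.168.1.10\",80", 2000);

      // Every \r\n matters, and the blank line is what ends the headers.
      String req = "GET /coop?temp=21 HTTP/1.1\r\n"
                   "Host: 192.168.1.10\r\n"
                   "Connection: close\r\n\r\n";

      // CIPSEND wants the exact byte count.  Off by one, and the whole
      // exchange falls on the floor.
      atCommand("AT+CIPSEND=" + String(req.length()), 1000);
      esp.print(req);  // the payload goes out raw -- no extra CR+LF here
      delay(3000);
      while (esp.available()) Serial.write(esp.read());
    }

    void loop() {}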

But this, in addition to being my first full Internet of Things project (well, first non-proof-of-concept project, anyway), was only my second rodeo with a smaller microcontroller.  I've already gone a few rounds with the ATmega328 chip running on a breadboard.  It's the same chip used in the Arduino UNOs, which streamlines development (as much as development can be streamlined).

The squee little ATtiny85 is the baby of the Arduino family.  Its surface-mount form-factor is the brains of Adafruit's Trinket, so I'm already familiar with some of its...errrrr..."quirks."  (So, yeah, the "baby" can sometimes swing to more of the "Stewie Griffin" than the "Maggie Simpson" end of the spectrum.)

The biggest quirk is that peeking inside the head of the ATtiny chips is almost impossible.  (I've done it with another full Arduino and an FTDI/UART programmer, but the wiring alone would make Rube Goldberg shake his head.)  None of the usual protocols you'd lean on with a full-size Arduino chip (I2C, SPI, hardware Serial) is supported in hardware.  The best that the ATtinys can manage is the SoftwareSerial library.  Which looks all fancy-schmancy on the surface, but under the hood does something known as "bit-banging."

Now, without wading into the intricacies of two- and three-wire communication, let's just compare bit-banging to an intersection where the drivers only sorta-kinda follow the protocol of taking turns and where the cars don't always arrive at neat intervals.
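
For the curious, stripped of the interrupt-driven cleverness that the real SoftwareSerial library brings to the party, transmit-side bit-banging boils down to something like this sketch (8N1 framing, least-significant bit first):

    // Bit-banged transmit of one byte at 9600 baud (8N1, LSB first).
    // 1 / 9600 baud is ~104 microseconds per bit.  Every microsecond the
    // clock drifts, the receiver's sampling point drifts along with it.
    const int TX_PIN = 4;
    const unsigned int BIT_US = 104;

    void bangByte(byte b) {
      digitalWrite(TX_PIN, LOW);             // start bit
      delayMicroseconds(BIT_US);
      for (byte i = 0; i < 8; i++) {
        digitalWrite(TX_PIN, (b >> i) & 1);  // data bits, LSB first
        delayMicroseconds(BIT_US);
      }
      digitalWrite(TX_PIN, HIGH);            // stop bit
      delayMicroseconds(BIT_US);
    }

    void setup() {
      pinMode(TX_PIN, OUTPUT);
      digitalWrite(TX_PIN, HIGH);            // a serial line idles HIGH
    }

    void loop() {
      bangByte('U');  // 0x55: alternating bits, easy to eyeball on a scope
      delay(1000);
    }

The receiver re-syncs on the falling edge of each start bit, then samples the next nine bits on the assumption that both ends agree on what ~104 microseconds means.  Hold that thought.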

The bigger ATmega328 chips, as typically wired for Arduino duty, rely on an external "clock" (a crystal oscillator plus a couple of capacitors) to time the exchange of ones and zeros.  And while you have that option with the ATtinys, you do it at the cost of two pins -- of which there are only five on the ATtiny85.  The remaining (and default) option is to rely on the chip's 8 MHz internal clock.

As it turns out, that internal clock doesn't exactly run with the tightest tolerances.  According to one reply on the Arduino forums, the slop can be as high as +/- 10%.  Do the arithmetic:  over a ten-bit serial frame, a 10% drift walks the sampling point a full bit-width off target by the stop bit.  A couple of percent is survivable; ten is not.

Yeeeee-OWCH.

The upshot was that the hit-to-miss ratio of the HTTP GET request even making it to the router intact was absolutely abysmal.  By contrast, when I ported the code to a full Arduino UNO, the ratio flipped.  Still not perfect, but more than acceptable.  When I wired up an Atmega328 on a breadboard and uploaded the self-same code to it, the results were the same as the UNO.  The relative reliability of the external clock made all the difference in the world.

Pity.  I had high hopes for that little cutie-pie of a chip.  I still plan on wiring up and coding a Bluetooth-based proof-of-concept with it.  Mainly to get a handle on its reliability.  As I mentioned, HTTP GET is hella-finicky.  The AT-commands used with Bluetooth are simpler, so there is some hope there.
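
By way of comparison, here's roughly what the Bluetooth version's transmit side would look like, assuming a garden-variety HC-05 module (an assumption on my part -- I haven't picked the hardware yet).  Once the module is configured and paired, it's just a transparent serial pipe:

    #include <SoftwareSerial.h>

    // HC-05 TX -> pin 2; HC-05 RX <- pin 3 (again, through a level-shifter).
    // One-time configuration in the module's AT mode would be something like:
    //   AT+NAME=coop
    //   AT+UART=9600,0,0
    SoftwareSerial bt(2, 3);

    void setup() {
      bt.begin(9600);
    }

    void loop() {
      // No CIPSTART, no byte-counting, no connection choreography:
      // print the readings and the paired phone receives them.
      bt.println("temp=21.5,humidity=64");  // placeholder values
      delay(5000);
    }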

In the meantime, I'm going into production with a slightly over-engineered "Mark I."  I still need to pick up a replacement NH3 sensor from BJW in Moncton (because guess who hosed up soldering headers and then nearly ripped off a couple of pads trying to de-solder them?), burn it in, and kick its tires.  And, finally, knock together a screened enclosure to proof the whole contraption against, ahem, "re-wiring" by a trio of curious -- and probably bored -- chickens.  (I've seen the movie Chicken Run, y'all, and I would not put it past any of them.  Related:  Remind me never to let them watch Iron Man either.)

For the longer-term, I have a couple more sketches for the "library" of Arduino code accumulating in a Mercurial repository.  For this project I imported the temperature and humidity code lock, stock, and barrel, and it worked right out of the box.  Which is the whole point of a well-organised code-library.  And with a tenacious sinus infection knocking me flat in the middle of this project, don't think that I don't appreciate my past self for that!
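
To give a flavour of why that import was so painless, the sensor-reading piece amounts to little more than the following.  (A simplified stand-in for my actual library code, assuming Adafruit's DHT library and a DHT22 on pin 5.)

    #include <DHT.h>

    const int DHT_PIN = 5;
    DHT dht(DHT_PIN, DHT22);

    void setup() {
      Serial.begin(9600);
      dht.begin();
    }

    void loop() {
      float tempC = dht.readTemperature();  // Celsius by default
      float humid = dht.readHumidity();     // relative humidity, percent

      // The library returns NaN on a failed read -- always check.
      if (isnan(tempC) || isnan(humid)) {
        Serial.println("DHT read failed");
      } else {
        Serial.print("temp=");
        Serial.print(tempC);
        Serial.print(",humidity=");
        Serial.println(humid);
      }
      delay(2000);  // the DHT22 tops out at about one reading every two seconds
    }

Because the reading-and-checking logic lives in its own self-contained sketch, dropping it into the coop project was copy, paste, compile, done.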

And, most important, this radically expanded my notes.  Most especially the "Gotcha" section of an Arduino-on-a-breadboard ("Boarduino") presentation that I might give sometime in the next few months.  Because while I consider this latest round of butt-kicking just another installment of paying my dues, I benefited hugely from the forums, etc.  And so I'm paying that forward.  Mind you, I don't worry about the next generation becoming soft and spoiled.  Because even if I spare someone this particular butt-kicking, there are plenty more lurking out there with each new project. Oh yes, yes there are...

Tuesday, October 24, 2017

Security SNAFU Solidarity

I could have applied by mail for my first Canadian passport.  But I figured that it was worth the heavily-detoured trip into Shediac to have Service Canada double-check my work.  Turns out, my paranoia was rewarded:  the agent caught two errors.

One was just a brain-burp on my part.  The other, though, is on Service New Brunswick.

See, to prevent counterfeiting, our drivers' licenses are watermarked (for lack of a better term) with both grey and silvery-iridescent text/patterns.  Then "personalised" information (number, name, DoB, etc., etc.) is printed over that.  In my case, the watermarking seemed to affect the printing of one number so that it looked very much like another.  Especially given that the numbers are printed in red instead of black.

Fortunately (and unsurprisingly), I had to supplement the application with photocopies of photo-ID.  In black and white, the number in question is far more legible.  Which is how the Service Canada agent caught the mistake.  (Whew!)

Needless to say, as a member of the I/T tribe, I found this more than a little ironic.  I kept a straight face, but I couldn't help but think (sarcastically), "Hmmmm...securing information by making it less usable and its users more error-prone:  Where-oh-where have I seen this before?  Oh, riiiiiiiiight..."

Scarier thing is, my British-born neighbour told me that the UK passports now require biometric data--e.g. retina scans.  I can only begin to imagine how securing digital-based data with meat-based data is going to complicate things in unintended ways.  (True, glaucoma and diabetes don't seem to affect retinal biometric reading too adversely.  But still, I'm totally thinking about that scene from The Avengers.  Of course I am.  Blech.) 


Tuesday, October 10, 2017

Don't egg me on

Our current neighbourhood is a little more communal than the previous one.  Maybe it's an Acadian thing.  But the borrowing and sharing (especially when the gardens are coming in) has an old-time small-town feel.  Thus, I was seriously-and-for-realz not at all surprised to see a text from the across-the-back-lawn neighbour asking (out of the blue) whether Dennis would want to borrow his spare chicken-coop rather than have to scramble one together before the snow flies.

Because that's just what good neighbours do, right?

Dennis will probably build to his own design anyway, but it's good to know that it's there in case the weather makes the tarp-covered Quonset-hut run inadequate for the three remaining birds.  Because of the need for a heated waterer, we'll need to run electricity out to the coop.  Which opens up other possibilities for monitoring things like temperature, humidity, and overall air quality.

The setup I have in mind is Bluetooth-enabled, allowing an Android app. to check in on the current status of those numbers and to set off an alarm if they get out of whack.  And there's less than no sense in keeping that design/code to myself if someone else can get the benefit of it for the cost of a few electronics components.  So I texted our neighbour to see whether they have any Android devices in the house.

Because that's just what good neighbours do, right?

(Well, not one hundred percent:  A second set of feathered guinea-pigs for field-testing benefits me, too.  Let's not get too crazy with the altruism angle here...) 

Nope, texts the neighbour:  They're strictly an Apple household.  (Fooey.)  But then he asks how they would keep the stored data.

Hoo boy:  That's a whole 'nuther basket of eggs.  Because now we're talking back-end:  Databases, a web application, all that jazz.  Granted, that's basically my core competency as a developer.  But it's also scope-creep (on radioactive steroids!) at this proof-of-concept juncture. 

And yet.  It's also useful to know that someone's already thinking long-term.  Assuming that I ever wanted to jump through all the UL-type certification hoops to sell this kind of contraption to backyard chicken-herders.  Mind you, two data-points do not define a market, but the fact that a single text dragged the scope so far beyond the pale tells me that I wasn't thinking "holistically" enough.  Even for this six-chicken neighbourhood.

But, then, that's just what good neighbours do, right?

Friday, August 4, 2017

Frivolous Friday, 2017.08.04: Gamification gone bad

A pal of mine's applying to immigrate to Canada.  Not just the usual huffing, mind you:  We're talking about actually jumping through the hoops here.  And boy howdy, there are plenty of them.  As an American expat proud to call Maritime Canuckistan home, I can totally vouch for that.  Frankly, I suspect that at least some of the paperwork is there just to see how badly you really want it.

For instance...

How long would it take you to generate a list of all your previous addresses since the age of 18?  I was lucky; I'm a pack-rat and I have friends who are pack-rats.  For everything else, there was a vague recollection and Google Street View.  I'm not even kidding.

Do you know where your birth certificate is right now?  How about your marriage license (and/or divorce papers)?  College transcripts?  High school transcripts (again, not kidding)?  Don't have some of them?  Better send away for copies.  (Cha-CHING!)  Then have those copies copied (in colour).  Eh, might as well have 'em notarised while you're at it so they look all official 'n such.

Hey, U.S. residents:  Did'ja file away those annual earnings statements from the Social Security Administration so you can back up your claims about employment history (which I certainly hope you can just snag off your resume, btw.)?  If you tossed them, womp-womp:  The SSA folks do not supply replacements at any price.

Speaking of employment history, if you immigrate as a Skilled Worker, you might be called upon to prove that you have enough years of experience in specific job types (National Occupational Classification codes, a.k.a. NOCs; best familiarise yourself with them--and hope they don't change before you submit your app.--because they are make-or-break).

How do you "prove" your work history?  The preferred format is letters of reference from your current and/or previous employers.  On company letterhead.  Stating your position, your tenure, your responsibilities, and your pay.  Let's get one thing straight:  Your previous employers have ZERO interest in complying with that.  Oh, did one or more of them close?  Maybe bought out by another company?  That sucks.  Also?  You probably want to work up an explanation for your current employer if you need their help in documenting your work history.  I was extremely lucky that my employer was willing to let me work remotely as a contractor. 

Criminal background checks (which in the U.S. is done by having your fingerprints taken by the local constabulary and sending them to the FBI for cross-checking) have a shelf-life.  With a non-trivial processing-time (of course).  So there's some timing you have to do with your actual application.  And if, for some reason, there's a problem with your application, you might have to do it again.

You of course have a current passport, right?  No?  Uh-oh:  Better make an appointment to have your photos taken and take them and your birth certificate and your photo ID and head down to the Post Office to set that in motion.  You usually have to make an appointment for that too, by the bye.  During regular business hours:  Whee!

And, assuming you make it through the first rounds of vetting, you'll be expected to take a medical exam which is only offered at a limited number of offices and correspondingly spendy.  Blood draw, urinalysis, chest X-ray, the whole shootin' match.  Oh, and you have a rather limited time in which to do it.  (Our consultant dropped the ball on our file and we had ten days to scramble that together.)

I could dig up my notes for fuller detail, but that's what sticks in my memory just now.  There's more--trust me.

Bottom line:  You'll bleed toner ink and money and time off from work and even a little bit of real blood before they punch your ticket.  If they punch your ticket.

What with all the obstacles and requirements and a few time constraints, it hit me:  Why not gamify it?  Heck, it's kind of already gamified.  I mean, Immigration, Refugees, and Citizenship Canada already calculates eligibility on a points-based system.  And some immigration programs are run as a lottery anyway, which means you're rolling the bones.  So it's almost like IRCC is asking for it, amirite?

Now.  Apart from being eligible in the first place, the most challenging part, by far, is the paperwork-gathering mentioned above.  Because it sometimes walks the line between weeding out the posers and bureaucratic sadism.  (I mean, seriously, the all-addresses-since-the-age-of-eighteen thing?  Just....whyyyyyyy does that even matter?????)

But, wait!  People will go to limb-and-life-risking lengths to collect completely imaginary stuff in, say, Pokémon Go.  So why not real stuff? 

When I pointed that out to the afore-mentioned friend, their response was basically, "OMG--if I spent as much time on that as I have on Pokémon Go, I'd be in Canada already!"

And that's when the light bulb went on.  And, after roughly thirty seconds of scoping out the DB structure, back-end infrastructure, and riffling my memory for the names of the iOS/Swift developers I know, I pulled the plug on that light.

As it should be.

Because the reality is that I've already put serious thought into a checklist version of this sort of application.   To the point of picking the brains of a (willing) immigration consultant.  And the central issue is that keeping up with the myriad of requirements for dozens of immigration programs (e.g. Skilled Worker, Family Sponsorship, Provincial Sponsorship, Entrepreneur class, etc.) is pretty much a full-time job.  Which doesn't work for a part-time pro-bono project.

Also because, as (to quote Marc Andreessen) software continues to eat the world, making a difference means resisting the urge to fix what's not really broken.  If you want to become an immigration consultant in Canada, the barrier's already fairly low:  about $8K plus your time for the course-work.  Silicon Valley-style "disruption" only makes sense when markets are un-/under-tapped.  And the number of cases the Government of Canada can and wants to handle is a fixed number.  In fact, there's a backlog (hence, the lotteries).

The case could be made that turning the Permanent Resident application process into Pokémon Go might improve the overall quality of applications (see "weeding out the posers" above).  I didn't say it was a good case:  Ain't no AI out there gonna tell you whether, say, a birth certificate looks cromulent.  At best, it makes people more organised before they submit their PR application.  Frankly, I'm not holding my breath until that happens.

Worse, though:  What happens if the "game" goes viral?  And the tsunami of applications hits IRCC's processing centres.  I mean, we already saw their website crash in the wake of a single election last autumn.  (Shudder.)

Worst, the furor will eventually die back.  Then they'll track me down and kick me out of the country.  And, frankly, I couldn't find it in my heart to blame them when they do.

Friday, July 28, 2017

"One app. at a time"

A couple years ago, Dennis bought a laptop on the cheap that swiftly demonstrated its shortcomings as a "portable" Windows machine.

For starters, it's a leviathan:  a Dell Precision M6400, boasting a 17" screen and tipping the scales at 8.54 lb / 3.88 kg.  During its limited use, its Windows 7 OS never could be convinced that it was Genuine Microsoft(TM), despite entering the registration code over and Over And OVER.  It's also a battery hog, running less than two hours on a full charge.  Which is not at all surprising, considering that, were you to flip it upside down while it's running, you could probably pop popcorn over the vents.

It was Dennis's tablet purchase that ultimately consigned this bully-boy to the corner and dust.  But my go-to Linux laptop ("Big Grey") is about eight years old.  Also, it spooked me a week or so back with a bad HDD sector.  So it was time to think about migrating.  Dennis swore that he really-most-sincerely had zero use for the Dell (not even as a doorstop).

So, booyah for a free laptop, amirite?  (Uh-huh:  I can already hear the DIY computer-geeks start to snicker evilly already...)

Normally I run on AMD chips, but this laptop is Team Intel.  Welp, turns out that, somewhere along the line, I missed the memo that the "AMD64" distributions of Linux now apply to 64-bit Intel chips as well.  And I didn't realise my mistake until after making it about halfway through an Ubuntu 16.04 installation/configuration.  (Side note:  Whatever improvements the Ubuntu team may have made to the base product, their installation is still as unpolished as ever, so it took two attempts to make it that far.)  And that was with the wifi drivers working right out of the box.  (A not-so-small mercy which does not go unappreciated.)

That-all basically shot a day.

But, hey, a free laptop's a free laptop, right?

Disgusted, I made the i386 system image dig its own grave by downloading the AMD64 Ubuntu .ISO file and burning it to DVD.  Except that the ISO file was corrupted sometime during the burn.  (Do not go gentle into that good night, i386 -- I gotta admire your spirit.)

Fine.  Screw you, Ubuntu...and the godsforsaken Unity desktop you rode in on.

Debian 9 ("Stretch") is barely over a month old, so I'll get all the latest/greatest stuff, right?  Also, its system requirements are muuuuch friendlier to older hardware than Ubuntu. 

So I dug out my notes -- OF COURSE I HAVE NOTES -- from previous Ubuntu/Debian installs; more a collection of URLs than anything else because why write it down when you can look it up, I ask you.

That was my first (big) mistake.  Not the notes themselves, but forgetting how long it had been since I'd installed and customised a Debian OS.  That would have been "Big Grey," about two Debian versions back, to be precise.  (Raspbian doesn't count b/c the Raspberry Pi is its own beastie, and most certainly not the Swiss Army knife that a desktop/laptop is.)

Now.  One of my oldest friends is a fan of old cars -- the antique auto show in Iola, WI, was the highlight of his summer (and his bonding time with his Dad).  No surprise that it was he who introduced me to the Johnny Cash song "One Piece at a Time."  Lyrics (courtesy of Google) are at the end of this post, but the tl;dr is how a guy schemes to assemble his very own Cadillac from parts smuggled out of the factory.  Over the course of a multi-decade career working in the GM factory.

My Gentle Reader doubtless sees the punch-line coming...and can well understand that the parallels were not lost on me.  Technology moves much faster in I/T than in autos, and I'm not convinced that my "free" laptop didn't have about as many moving parts by the time it was fully functional.

All told, it cost me the better part of three days to set up this "free" laptop.  A few reasons:

Gnome  Apparently, it's a different flavour ("Metacity" vs. just "Gnome Classic") that allows for the civilised amenity of app. launchers on the top bar.  Except Metacity didn't show up in the login options for multiple boots/reboots.  Once it was an option, fighting custom launchers to make them retain the designated icon took a few tries to get right.

Dropbox  Installed from binaries, not repos.  (Boo!)  This version refused to start on startup; I needed to create an autostart file.  Still figuring out how to terminate its process at shutdown and wondering why in the name of Mordor that isn't a part of the .desktop file vocabulary in the first place.

The "AMP" part of "LAMP"  I'm spoiled by having this install from a single terminal command.  I forgot how crazy-simple this made things.  And I was about to be forcibly reminded...

Apache  User directories are enabled differently in 2.4 than they are in 2.2.  So two tries there.

PHP  Well-behaved, though I had to briefly go spelunking for the .conf file.

MySQL  Absolute NIGHTMARE.  Debian 9 actually installs the MySQL fork "MariaDB" and masks it with "mysql" executables.  In a PR juke worthy of K Street, MariaDB is billed as a "drop-in replacement for MySQL."  Uh-huh...yeeeeeah...so, about that...  There's no prompt for a root password during install, which was the root (pun intended) of a morning's worth of hair-pulling.  I could "anonymously" interact with the command-line UI, but all GUIs refused to connect.  Between 2 separate Q&A sites, I managed to piece together the info. I needed to disable the anonymous command-line access and create an uber-sooper-dooper admin. user to take the place of root.  Oh, and installing the mysql binaries from Oracle?  Fuggedabouddit...I couldn't even connect to the server from the command-line UI (much less Workbench), so I went back to MariaDB after forcibly ejecting the borked MySQL installation.

MySQL Workbench  Special mention b/c I literally lost track of how many times I installed/uninstalled it.  Blows up on login, even with a valid username/password.  No surprise--it's been trouble since the days (15 years ago) when "MySQL Query Browser" refused to run at all on Windows 2000.  For now, PhpMyAdmin and the Database tool in NetBeans will, between them, do what I need to do.

PhpMyAdmin  Speaking of which, there's now an extra step to install the missing mbstring and gettext modules.  Otherwise, it worked like a charm after the main authentication issues were sorted out.

NetBeans  Lesson learned: Go with the platform-independent version.  Designate the base JDK folder, not the path to the executable.  (That screw-up was sheer PEBKAC on my part, btw.)  Blessings upon the Moncton Java guru who saved my bacon on that one.  May the fries in his poutine always stay crispy.  (I think that's a legit. Canadian blessing, but don't quote me...)

Fritzing  Is in the Debian repos, but the parts bin is a separate install. (Whyyyyy???)

Geany  Apparently changed the per-language syntax highlighting sometime between versions 1.23 and 1.29.  Have temporarily thrown in the towel for custom keyword highlighting for OpenSCAD.  (On the plus side, this version of OpenSCAD doesn't lose the menu-bar, so booyah.)  But there went most of the afternoon.

And that's just the high points...

Along the way, of course, I learned a few new SysAdmin-type tricks.  Mind you, I have no illusions that I'll remember them, but at least we won't be strangers when inevitably we meet again.  I have a renewed & enhanced appreciation for the "Ask Ubuntu" community and the global treasure that is StackOverflow.  And I would be remiss not to note an array of independent bloggers and the Google algorithm-finders that put them at my fingertips.

In the past, I've had a few sharp things to say about "free" software in the context of modern capitalism.  And, let's not pussy-foot around it:  I use this software gig-in and gig-out to earn my crust.  But there is less than no question that my hard-won knowledge (not to be confused with "wisdom" because that's another beastie) cost me scarcely a drop compared to the Great Lake that is the sum of FOSS efforts going back decades.

Yes, I've drastically expanded and revised my notes.  Mainly because I have a DisplayPort to DVI adapter on the way that will allow me to plug this laptop into the KVM switch (or so I trust) and take the place of the current workstation.  (Sort of a two-for-one win-win.)  If that-all pans out, I might just spring for the SDRAM to bring the memory up to 8GB.  And after that goes down, it's finally Big Grey's turn for a rebuild and (likely) rebirth as a dedicated "maker" machine (Arduino, AVR, and 3D printing).

After that, of course, the value of the information in those notes will incrementally decay from "asset" to "liability."  Next rebuild, I'll know better than to just whip them out thinking, "It's cool--I got this."  "How can you be so sure?" my Gentle Reader might (justifiably) wonder.  Don't be silly, GR:  Of course I added a note IN ALL CAPS at the very top of the document.  If you can count on a recovering tech. writer for anything, it's reading the documentation.

- - - - -

"One Piece at a Time" - Wayne Kemp

Well, I left Kentucky back in forty nine
An' went to Detroit workin' on a 'sembly line
The first year they had me puttin' wheels on Cadillacs

Every day I'd watch them beauties roll by
And sometimes I'd hang my head and cry
'Cause I always wanted me one that was long and black.

One day I devised myself a plan
That should be the envy of most any man
I'd sneak it out of there in a lunchbox in my hand
Now gettin' caught meant gettin' fired
But I figured I'd have it all by the time I retired
I'd have me a car worth at least a hundred grand.

I'd get it one piece at a time
And it wouldn't cost me a dime
You'll know it's me when I come through your town
I'm gonna ride around in style
I'm gonna drive everybody wild
'Cause I'll have the only one there is around.


So the very next day when I punched in
With my big lunchbox and with help from my friends
I left that day with a lunch box full of gears
I've never considered myself a thief
But GM wouldn't miss just one little piece
Especially if I strung it out over several years.

The first day I got me a fuel pump
And the next day I got me an engine and a trunk
Then I got me a transmission and all the chrome
The little things I could get in my big lunchbox
Like nuts, an' bolts, and all four shocks
But the big stuff we snuck out in my buddy's mobile home.

Now, up to now my plan went all right
'Til we tried to put it all together one night
And that's when we noticed that something was definitely wrong.

The transmission was a fifty three
And the motor turned out to be a seventy three
And when we tried to put in the bolts all the holes were gone.

So we drilled it out so that it would fit
And with a little bit of help with an adapter kit
We had that engine runnin' just like a song
Now the headlight was another sight
We had two on the left and one on the right
But when we pulled out the switch all three of 'em come on.

The back end looked kinda funny too
But we put it together and when we got through
Well, that's when we noticed that we only had one tail-fin
About that time my wife walked out
And I could see in her eyes that she had her doubts
But she opened the door and said "Honey, take me for a spin."

So we drove up town just to get the tags
And I headed her right on down main drag
I could hear everybody laughin' for blocks around
But up there at the court house they didn't laugh
'Cause to type it up it took the whole staff
And when they got through the title weighed sixty pounds.

I got it one piece at a time
And it wouldn't cost me a dime
You'll know it's me when I come through your town
I'm gonna ride around in style
I'm gonna drive everybody wild
'Cause I'll have the only one there is around.


Ugh! Yeah, RED RYDER
This is the COTTON MOUTH
In the PSYCHO-BILLY CADILLAC Come on

Huh, This is the COTTON MOUTH
And negatory on the cost of this mow-chine there RED RYDER
You might say I went right up to the factory
And picked it up, it's cheaper that way
Ugh!, what model is it?

Well, It's a '49, '50, '51, '52, '53, '54, '55, '56
'57, '58, '59 automobile
It's a '60, '61, '62, '63, '64, '65, '66, '67
'68, '69, '70 automobile.

Monday, June 26, 2017

Death, be not ironic *

The news is a little stale, mainly because I've been lazy when I've not been shaving yaks.

Jean Sammet passed away a little over a month ago.  And while I bless the New York Times for dispelling a myth I've lived with for about two decades, I'm equally outraged that this was the first I'd heard of Ms. Sammet's work.  When I re-booted my computer programming education in 1996, the curriculum included the COBOL programming language.  (Gen-Y and older folks will remember the Y2K non-event, which was preceded by a sharp demand for such, ahem, vintage languages as businesses threw money at decades of technical debt.)

COBOL was/is indelibly associated with Admiral Dr. Grace Hopper, who is largely responsible for the fact that "programming" no longer involves building software with tweezers, pushing ones and zeroes on and off stacks of memory-addresses.  (Assembly-language is the closest you can get these days.  I tried it once.  Once. [shudder])  One of my teachers -- himself a PhD -- was immensely proud of having met her...and the fact that she, by that time a senior citizen, was exhausting to keep pace with.

But, as it turns out, "Amazing Grace" didn't design so much as a feature of the language.  No question that her work was absolutely foundational to it (and, really, everything since).  But, as Ms. Sammet's obituary points out, Hopper's "Mother of COBOL" moniker is entirely undeserved.  (Hello, Halo Effect.)  The actual credit belongs to Ms. Sammet and five other programmers who slammed out the design in two weeks.  (In the days before Red Bull and foosball tables, if you can believe that!)

What I find interesting about this era in computing history is the sense of a battle for the soul of computer programming.  Dr. Hopper's work was, at its base, driven by an abiding love of mathematics.  She found the bit-twiddling a needless waste of a mathematician's time, and she bucked management to develop a more English-like grammar.  (Woo, skunkworks project!)

In contrast, her fellow force-of-nature, Ms. Sammet, seemed more influenced by working for hardware manufacturers, absorbing the culture of calipers, slide-rules, etc.  And it comes through in her view of software as the product of a more rigorous engineering process.  (A view echoed during my stint at IBM in the early 2000s, when the mantra was "the process is the product.")

In the end, however, neither of these formidable ladies was entirely correct.  Because software was quickly co-opted by business, where neither mathematical precision nor engineering rigour can stand up to the short-term profit motive...and the long-term tendency to kick the can down the road.  Fittingly, the NYT obituary for Ms. Sammet concludes:
"COBOL was initially intended as a short-term solution to the problem of handling business data — a technology that might be useful for a year or two until something better came along. But it has lived on. More than 200 billion lines of COBOL code are now in use and an estimated 2 billion lines are added or changed each year, according to IBM Research."
Apart from popularising the term "bug" -- Adm. Hopper's ubiquitous contribution to the programming lexicon -- you'll sometimes see her quoted along the lines of, "The most dangerous phrase in the English language is, 'we've always done it that way.'"  But there's also the (anonymous) adage, "If it ain't broke, don't fix it."  As programming languages go, "something better" has probably come along in the last fifty years.  Just not "better enough" to justify tossing out the engineering that went into the original COBOL-flavoured solutions.  And that in itself is one HECK of a memorial.

- - - - -

* In case it matters, title is a riff on one of John Donne's poems.

Wednesday, May 31, 2017

You can't copy-and-paste a career

[Warning:  Rant ahead.]

Because filing paperwork that will almost certainly be a waste of paper & printer ink & postage (not to mention the time of all involved) wasn't infuriating enough, this had to land like a fresh cow-pat across my path in Twitter today:  Computer science students should learn to cheat, not be punished for it.

The tl;dr summary was best done by Homer Simpson (quoting from memory here):  "Marge, don't discourage the boy!  Weaseling is an important skill.  It's what sets us apart from animals...except, of course, the weasels."

Because, you see, in The Real World(TM), coders copy and paste all the time.  And coding in school should reflect the less ethically pristine norms of Silicon Valley.  At least, according to a journalist who lists precisely no coding background in his profile.

Oh, and teaching Java as a first language is somehow corroding professional skills.  I say "somehow" because there was literally no explanation for that offered in the main article.  The CrossTalk URL stalls out.  (Pity--it looked much more promising.)  The second related URL links to another article by the same author which argues that JavaScript is better because it isn't as scary and thus doesn't discourage the "fundamentally creative endeavor" that coding is supposed to be.  (No, really, it said that.  I wish I were making that up.)

Because schools, you see, are failing the software industry, and the 10% unemployment rate among UK CS graduates is iron-clad proof that Universities aren't teaching real job skills.

I mean, no real job skills besides sitting in herds pretending to be interested in what the authority figure at the head of the room is saying.  Or to subsist on crap food consumed at irregular hours.  Or the mad, scrambling stampede ahead of arbitrary deadlines (a.k.a. the semester).  Or swallowing the seething rage that comes with individual performance ratings being dragged down by slackers you didn't want on your team in the first place.  Or, not least of all, the almost rhythmic filling and emptying of your memory with the Next New Hotness that we need you on the bleeding edge of so we have someone to tap when we're finally forced to use the industry standard five years hence.

And now a journalist is advocating stealing--notice I didn't say borrowing--code as a professional skill. 

Now.  I spent about a year in the Fourth Estate, and even after more than two decades, I can appreciate how your knowledge has to be the proverbial mile-wide-and-an-inch-deep.  I know that I had to lean on other people to understand the intricacies of red clay vs. blue clay in the street renovations, the problems caused by shoddy contracting on the new high school, and even which number under that pile of jerseys was actually holding the football.  But I leaned on people who actually knew what they were talking about.  Reporting on something from your personal "Hello, World!" perspective is a great view into how beginners view things.  But it does not qualify you to design curriculum for an entire industry.

But!  Surprise twist!  I actually agree in principle that four-year University degrees are doing no one any favours by devoting weeks' worth of time to edge cases like sorting algorithms.  Or discrete mathematics.  Or even much NP-completeness theory beyond the basic epiphany that not all software solutions can be encapsulated by an algorithm.  (Turning a brilliant-but-naive coder loose on something that they don't know can only be approximated or brute-forced will waste one heck of a lot of money.  Let's avoid that, but not go too crazy on knapsack problems, m'kay?)

And I certainly can't argue that coming out of school knowing unit-testing and source control (particularly the Special Hell(TM) of branch merges) is a bad thing.  Replace pop-quizzes with Scrum stand-up check-ins, for all I care.

But school already does a bang-up job of reinforcing some of the worst aspects of the world of work.  (See above snark on "real job skills.")  Stealing code should not be added to those sins.

Borrowing code is an entirely different matter.  By "borrowing" I mean citing the source (e.g. the StackOverflow URL) in the comments.  That accomplishes a few things:
  1. It allows you (or the poor slob who has to maintain your code) to go back to the source for reference.  Which can answer questions like:  "What was the original code meant to accomplish?"  "How old is it?"  "Was the solution up-voted and/or embellished with further useful comment?"
  2. It demonstrates to your team-mates and/or bosses that you don't take credit for other people's work.
  3. If the code completely bombs QA/CI tests, you don't look like quite the idiot you would have had it been your own creation, amirite?  ;~)
The first point is more immediately and practically useful.  The second, however, has more far-reaching implications.  We've seen billion-dollar lawsuits filed in the name of code-stealing.  (Remember SCO Linux?  No?  Okay.  Howsabout the "APIs are copyrightable" legal Wrestlemania between Oracle and Google?)  My industry is notorious for men claiming credit for women's contributions.  And someone thinks that taking credit for other people's work is a skill to reward from the age of 18 on?  Because citing your sources is for academia (and, one hopes, journalism)?  Seriously???
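
For the record, "borrowing done right" is cheap.  Here's a made-up example (hypothetical URL and code, purely for illustration) of what the citation looks like in context:

    // Borrowed and adapted from StackOverflow (illustrative URL only):
    //   https://stackoverflow.com/questions/0000000/trim-whitespace
    // Accepted answer as of 2017-05-31; dropped the locale handling
    // we don't need.  See the thread for the edge-case discussion.
    #include <algorithm>
    #include <cctype>
    #include <string>

    static std::string trim(const std::string &s) {
      auto notSpace = [](unsigned char c) { return !std::isspace(c); };
      auto first = std::find_if(s.begin(), s.end(), notSpace);
      auto last = std::find_if(s.rbegin(), s.rend(), notSpace).base();
      return (first < last) ? std::string(first, last) : std::string();
    }

Four lines of comment:  where it came from, when, and what you changed.  That's the entire discipline.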

The good news is that there is a stupid-simple way to stop universities from turning out unqualified junior programmers.  Seriously.  Paint-chip-eating-stupid-simple.

Ready for it?

Programmer job postings just need to stop requiring CS degrees.

That's it.  Supply, meet demand.  Next problem, please!

Okay, so there's actually some bad news:  The Suits won't stand for that.

Let's take a minute to break that down.  If The Suits absolutely must hire an onshore programmer, that's not chump-change.  Granted, the modern (read: "internet-ified") job search process does a fantastic job of externalising much of that cost on the job-seeker.  But there is some residual internal cost to hiring.  That cost rises dramatically after a newly-acquired warm body is parked in its cubicle-stanchion.  Mainly because most new hires require 60-90 days to evolve the dysfunctions necessary to survive in their unique new corporate micro-climate.  (But I'm not cynical.)  Two to three months of salary plus overhead is not an insignificant investment.  The bean-counters are right to want to minimise the risk that the new organism they've introduced will turn out to be a parasite.

Suits want security and stability.  Disruption, y'understand, is only a good thing when it happens to other people.  For them, post-secondary degrees are a form of insurance -- with the shiny bonus of not having to front the money for it.  (If you're a home-owner, think about your bank requiring you to buy mortgage insurance to protect them in case you lose your ability to make payments:  It's exactly like that.)

Are all companies so risk-phobic?  Of course not.  The current (U.S.) average seems to be that only about half of software developers hold a CS degree.  It's absolutely possible to get a coding job without checking that box.  Just not at places like Google, of course.  Also, that expensive piece of paper, broadly speaking, is leverage for a higher starting salary (upon which future raises and starting salaries at other companies are largely dependent).

Because any stable company -- the kind we work for when we don't feel like gambling our ability to pay next month's rent on the chance of owning a private island -- is generally large enough to have a candidate-filtering mechanism called Human Resources.

I've had the privilege of knowing some very, very sharp HR folks.  Yet nary a one of them can tell you whether or not my GitHub check-ins are crap.  My StackOverflow score?  That's literally just a number...one without an anchor (Pareto distributions and all that).  Obviously, higher is better.  But what's the baseline minimum?  And even a baseline wouldn't settle it:  think about wine for a second.  It's rated on a hundred-point scale, but its scores (among its self-appointed referees) can be all over the map.  And you can still pay top dollar for what tastes like plonk to you.  But HR's gonna extrapolate from an up-vote count?  Yeeeeeeah...

The thing is, if programming were treated like every other profession instead of the Dark Alchemic Art that it emphatically is not, none of this would even be an issue.  And I place the blame squarely on business.  When you can't trust the standard metrics, make your own.  You need demonstrable coding skills?  Have your current developers put together a quiz for candidates.  There's software for that.  Does this person have language-specific certifications?  Great.  Take a quick peek at the GitHub/StackOverflow oeuvres to see whether they've actually been used.  Do they have a blog?  Do they give any presentations to other coders?  Google is your friend.  And you don't even have to leave the office.  Yet.

Once you have a handful of candidates, get your butt off the internet and meet them.  Preferably in a third-party setting.  Then coordinate in-person interviews with those who make the cut.  This is where HR really earns its salary.  While you (or your technical evaluators) are swimming in alphabet soup, they're looking for the social cues:
  • What's their body language?  Closed?  Aggressive?
  • How do they react when they're challenged/corrected?
  • Do they treat people of different genders/ethnicities differently?
  • How much of their attention goes to people who were not "authority figures"?
  • When talking about previous teamwork, what's the "We"-to-"I" ratio?
  • Do they put their feet up on my desk during our 1:1?  (No joke, this legit. happened to a recruiter I worked with.  Needless to write, the candidate was not invited back.)
Of course, I have to wonder why there was even a (public) job posting to fill in the first place.  Employee referral rewards, internship pipelines, on-the-job-training, coding "boot camps," and tuition reimbursement are real, actual things, friends and brethren.  Those are all investments -- most of them not inexpensive.  But, then, so is on-boarding someone whose salary averages $50K+ in Canada...and unintentionally sabotaging an entire team with an incompetent git who looked good on paper.

Sure, the internet is a great (and relatively cheap) way to boost the signal.  But it also affects the signal-to-noise ratio like, whoa.  So the first-line filter (a.k.a. HR) needs some criteria to weed out the self-proclaimed ninjas, rock stars, and unicorns of our Dunning-Kruger age.  Employment histories will list date-ranges and, hopefully, mention the technology stack used by the previous employer.  But there's rarely room to mention how long or how extensively any given technology was used in the field.

The bottom line is that, without training to evaluate code, HR doesn't have the resources to make this call on their own.  At best, HR can do the legwork of plugging code samples into search engines to scan for plagiarism (assuming you care about that).  Lacking proper domain knowledge, they have every incentive to fall back on traditional metrics.  Which includes "outsourcing" the "follow up" / "circular file" judgement-call to the four-star rating-system of (you guessed it!) the good ol' college GPA.

At which point, you (as the hypothetical employer) have effectively lost your right to complain about universities not preparing CS students for real-world coding work.  And fer cryin' in your Double-Double, please don't expect colleges to reward plagiarism.  This world is already too much a kleptocracy, thanks.

Tuesday, January 31, 2017

The presentation I won't give

Today I had lunch with two of the three other Illuminati of the local programmer group.  (If you happened to be in the Codiac Cafe on George Street over the noon-ish hour, yeah, that was us in the shadowy grey robes.  Next time, come up and say "Hi.")

Anyhoo, in addition to hashing out the details of the upcoming meeting/hackathon, we also knocked around ideas for upcoming presentations.  Which involves, of course, recruiting speakers.  Wherein lies the problem.  Because, you see, the Venn diagram of people who know a lot about something interesting and the people who have the ability to get up in front of a crowd of their peers and roll out that information in a structured, digestible way is not what'cha'call a perfect circle, y'know?

But the rub is that the only people to show up for a "How to Give a Presentation" presentation are likely to come from a single demographic -- i.e. the folks who don't want to see the presenter's feelings hurt.

So I'm doing this online.  Think of it as a self-study course.  If you're old enough to remember people working out to a Jazzercise VCR, so much the better.

Now.  I do at least one presentation a year.  Mostly to give back, but partly to look the beast of public speaking in its many eyeballs and say, "I ain't scared 'a you."  (The beastie and I both know that's an "alternative fact" -- read:  straight-up lie -- but that's actually the point here.)  But I have an ace in the hole:  I was doing this at sixteen for giggles.  Okay, not really giggles.  I had a massive crush on one of my high school's Debate alpha-team.  But that crush put me on a path that led to meeting my husband and the Bestest of BFFs.  So who's the real winner here?

Anyhoo.  After six or seven years (depending on how you count) of competitive public speaking, I have a few things to pass down to the up-and-coming generation of geek-speakers.  Normally, I'd issue a "your mileage may vary" kind of disclaimer, but if you haven't done this more than one or two times, stick with the program until you are comfortable enough not to need it.  (This is highly analogous to following a recipe to the letter until you grok what makes it tick, m'kay?)

I'm assuming that you have one of the following:
  •  A topic you're so passionate about that you're bursting at the seams with its awesomeness, or
  • Your boss has "voluntold" you to present on something of relevance to your co-workers
Either way, here are your moves:

1.) Pull out your favourite plain-text editor.  Yes, plain-text.  Heck, Notepad, for all I care.  Because you are emphatically NOT ALLOWED TO FORMAT ANYTHING at this point.  Got that editor fired up?  Groovy.  Now shut up and let your brain barf out a bullet-list.  Single-level only.  Don't think:  Just get it on the screen.  Like, yesterday.

Done?  Fabulous.  Whatever you do, don't fall in love.  Because, about a century before George R. R. Martin, "Murder your darlings" was already legit. literary advice. 

2.) Organise.  Okay, NOW you're allowed to be hierarchical.  But not more than three levels, including section-headers.  That effectively leaves you with two levels of detail.  Why only two?  Because this is soooooooooooo not about you dazzling anyone with nuance; this is about you not boring the ever-loving snot out of the people who are graciously lending you their attention-span.

Cut-and-paste, drag-and-drop until the content feels like it should flow like cream into the brain of someone who knows nothing about the subject.

Nope, don't fall in love here, either.  You're going to both murder and mutilate very shortly.

3.) Show, don't tell.  Time to lean on images.  First, create a folder at the same level as your brain-dump.  Now go search and scroll.  At a bare-bones minimum, you're going to want images to jazz up your section-header pages.  I prefer sly, snarky humour m'self -- big surprise there -- but follow your intended audience's tastes.  Whatever you pick, name the file descriptively.  You're going to need that information at your fingertips.

Can you convert either-ors into flow-charts?  Go.  Side-by-side comparisons as tables?  Dooo eeeeet.  Comparisons/Contrasts as Venn diagrams?  Make it so, Number One.

Are you showing code-samples?  Excellent.  Go type them up, make sure they compile, and then screen-capture them.  Ditto the output. This is your insurance against wi-fi issues.  Also:  You know all those nervous presenters you've seen trying to type code live?  This is precisely why you won't be them.  You're welcome.

4.) Unit-testing.  First, brace yourself for some ugly truths.  Deep breath.  Now, translate your outline (and images) to simple slides.  Simple slides, d'ya hear?!  PUT THAT ANIMATION MENU DOWN.  Animations are for closers.  (Sorry-not-sorry, Alec Baldwin.)

Uh-oh:  Some of those topics don't fit onto a single slide, now do they?  Huh.  Guess you should be thinking about how you're going to break them up into separate topics, mmmm?  'sokay, it's not like you lose points for this.  This is the "mutilation" I talked about.  Good thing you never let that outline give you big chibi puppy-dog-eyes, amirite?

When you're done, convert the whole thing to PDF, and close your slide editor.


5.) First integration test.  Close the door.  Keep your laptop/keyboard within arm's reach.  Open the document in PDF format.  Annnnnnnnd...present!  (No stopping allowed -- just plough through it, already.)

Dread Cthulhu, that was painful.  All that goodness in your head doesn't always quite make it to the spoken word, no?  The good news is that the presentation will never sound that gawdawful again.  The bad news is that you have a bunch more iterative work in front of you.

6.) Cull content.  First things first.  Let's get rid of the stuff that your audience, on second glance, doesn't actually need to know to understand the main points.  Just get rid of it and don't look back.  Yep--murder your darlings.  "Red Wedding" style if necessary.

7.) Re-organise ruthlessly.  Did some sections really seem like non-sequiturs when you were fumbling your way through them?  Move them before they try to consolidate their positions.  Do some still stick out like the proverbial sore thumb?  You might want to re-think their importance.

8.)  Subsequent integration tests.  Follow the recipe for "First integration test."  This is basically analogous to the "childbirth amnesia" women experience between their first and second (and even subsequent) children.  Sure, the first time is generally the worst, but even so...yeeowch.  But, in fairness, perhaps not quite as bad.  Maybe.

Keep iterating:  Cull and re-organise.  Work on something else for awhile and let things simmer on the proverbial back-burner.  The content is starting to fall into its rightful places.

9.) Refine the flow.  No slide-deck, however well put together, will always flow seamlessly from one section into the next.  Or subsection.  Or maybe even between bullet-points on the same slide.  You probably noticed that during the iterations, yes?  That's okay.  Seriously okay.  Because, really, if it were All About the bullet-points alone, you could (and -- let's face it -- should) just email the slide-deck and let people read it at their leisure.  Your value-add is to, quite literally, read between the lines.  And the page-breaks, for that matter.

Condense your bullet-points.  They exist as hooks for your content, the war-stories you're going to tell, the experienced opinions you're going to lay down.  You are there to riff on, not read from, the slides, remember?  Now start bridging the slides with transition material.  Trust me:  This material is more for your sake than even your audience's.

Repeat until your gut tells you it's done. You're not really done, but roll with the sweet illusion for a short time, m'kay?

10.) Beta-test with a trusted (and brutally honest!) peer.  And no, I don't mean your cat.  I'm talking about someone who has a similar background but not the level of expertise in your subject matter.  This will be painful, guaranteed.  But you're already used to that.  If you've picked the right peer, the feedback will be tough to process.  That's okay.  The trick is to triage the big-ticket items (e.g. she was totally lost with this whole section).  Ignore the nit-picky stuff.  No, really.  Yes, it's the easiest to fix.  But in terms of bang-for-the-loonie?  Fuggedabbouddit.

11.) Dance without a net.  By this time (several days in), your creation should be starting to breathe on its own.  At the risk of sounding pervy, take it into the shower with you.  (Not the laptop, silly!)  Blanking between sections is perfectly normal and, really, who cares what the shower curtain thinks?  If you're visually-oriented like me, the images you picked out for your section-header slides will help trigger the unwritten "glue" content.

12.) Dress rehearsal.  At least a day (but no more than two days) before the scheduled presentation, do a couple of runs.  If you can arrange to make them in the same conference room (or whatever) in which you'll give the final product, so much the better.  What's in your head and what comes out of your mouth should be fairly close on the second try.

Pro tip:  Limit the # of rehearsals per day if you don't want your vocal cords to turn on you on The Big Day.  If you're still feeling dicey two days out, try three whispered full rehearsals, but no more.

13.) The real deal.  Honestly, there's nothing that I can say or that you can do that will truly prepare you for all the eyeballs boring into you when you get up.  You will be terrified...and that's okay.  Like Q said to James Bond:  "Never let them see you bleed."  You're going to be riding on obsessive preparation and your think-meat's muscle-memory.  This is precisely why you screen-capped your code samples and output.

Okay...because you've read this far and (apparently) trust my experience on this, I'm going to drop an ugly secret:  You're going to need to go through this process many times -- dozens, in fact -- before you can trust your mind to stuff your lizard-brain into a sound- and chew-proof box.  And the box is the best you can hope for.  That lizard-beastie will always be there.

Sorry 'bout that, but to tell you otherwise would be to lie to you.  Not to mention short-circuit the process of you becoming that rarest of birds:  The geek who can teach.  And we need more of you more than ever.  The opening shots of a war between the people who actually know what's going on and the mouth-breathing ideologues who think they can shoot from the hip have already been fired.  And not by our side.

So, to tweak the closing line from all my presentations:  Now get out there and teach something awesome.