A couple years ago, Dennis bought a laptop on the cheap that swiftly demonstrated its shortcomings as a "portable" Windows machine.
For starters, it's a leviathan: a Dell Precision M6400, boasting a 17" screen and tipping the scales at 8.54 lb / 3.88 kg. During its limited use, its Windows 7 OS never could be convinced that it was Genuine Microsoft(TM), despite entering the registration code over and Over And OVER. It's also a battery hog, running less than two hours on a full charge. Which is not at all surprising: flip it upside down while it's running, and I'm not convinced you couldn't pop popcorn over the vents.
It was Dennis's tablet purchase that ultimately consigned this bully-boy to a corner to gather dust. But my go-to Linux laptop ("Big Grey") is about eight years old. Also, it spooked me a week or so back with a bad HDD sector. So it was time to think about migrating. Dennis swore that he really-most-sincerely had zero use for the Dell (not even as a doorstop).
So, booyah for a free laptop, amirite? (Uh-huh: I can already hear the DIY computer-geeks snickering evilly...)
Normally I run on AMD chips, but this laptop is Team Intel. Welp, turns out that, somewhere along the line, I missed the memo that the "AMD64" distributions of Linux now apply to 64-bit Intel chips as well. And I didn't realise my mistake until after making it about halfway through an Ubuntu 16.04 installation/configuration. (Side note: Whatever improvements the Ubuntu team may have made to the base product, their installation is still as unpolished as ever, so it took two attempts to make it that far.) And that was with the wifi drivers working right out of the box. (A not-so-small mercy which does not go unappreciated.)
That-all basically shot a day.
But, hey, a free laptop's a free laptop, right?
Disgusted, I made the i386 system image dig its own grave by downloading the AMD64 Ubuntu .ISO file and burning it to DVD. Except that ISO file was corrupted sometime during the burn. (Do not go gentle into that good night, i386 -- I gotta admire your spirit.)
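Lesson banked for next time: verify the .ISO against the mirror's published SHA256SUMS file before (and after) burning. A sketch, with a scratch file standing in for the real image and purely hypothetical filenames:

```shell
# Simulate a verification: a scratch file stands in for the downloaded
# .iso, and we generate the checksum list a mirror would publish.
mkdir -p /tmp/iso-check && cd /tmp/iso-check
echo "pretend this is an ISO" > ubuntu-amd64.iso   # stand-in for the real image
sha256sum ubuntu-amd64.iso > SHA256SUMS            # mirrors publish this file next to the .iso
sha256sum -c SHA256SUMS                            # prints "ubuntu-amd64.iso: OK" on a clean copy
```

In principle the same trick works against the burned disc's raw device, though optical media can pad out the read and change the sum; checking the downloaded file before burning is the low-drama version.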
Fine. Screw you, Ubuntu...and the godsforsaken Unity desktop you rode in on.
Debian 9 ("Stretch") is barely over a month old, so I'll get all the latest/greatest stuff, right? Also, its system requirements are muuuuch friendlier to older hardware than Ubuntu.
So I dug out my notes -- OF COURSE I HAVE NOTES -- from previous Ubuntu/Debian installs; more a collection of URLs than anything else because why write it down when you can look it up, I ask you.
That was my first (big) mistake. Not the notes themselves, but forgetting how long it had been since I'd installed and customised a Debian OS. That would have been "Big Grey," about two Debian versions back, to be precise. (Raspbian doesn't count b/c the Raspberry Pi is its own beastie, and most certainly not the Swiss Army knife that a desktop/laptop is.)
Now. One of my oldest friends is a fan of old cars -- the antique auto show in Iola, WI, was the highlight of his summer (and his bonding time with his Dad). No surprise that it was he who introduced me to the Johnny Cash song "One Piece at a Time." Lyrics (courtesy of Google) are at the end of this post, but the tl;dr is how a guy schemes to assemble his very own Cadillac from parts smuggled out of the factory over the course of a multi-decade career in the GM plant.
My Gentle Reader doubtless sees the punch-line coming...and can well understand that the parallels were not lost on me. Technology moves so much faster in I/T, of course, but I'm not convinced that there weren't about as many moving parts by the time this beastie was fully functional.
All told, it cost me the better part of three days to set up this "free" laptop. A few reasons:
Gnome: Apparently, it's a different flavour ("Metacity," vs. just "Gnome Classic") that allows for the civilised amenity of app launchers on the top bar. Except Metacity didn't show up in the login options until after multiple boots/reboots. And once it was an option, fighting custom launchers to make them retain their designated icons took a few tries to get right.
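For my own future reference: a launcher is nothing more than a .desktop file, and in my experience the icon survives when Icon= points at an absolute path. A minimal sketch (every name and path below is a placeholder, not gospel):

```shell
# Write a minimal hand-rolled launcher; all values are placeholders.
mkdir -p ~/.local/share/applications
cat > ~/.local/share/applications/my-editor.desktop <<'EOF'
[Desktop Entry]
Type=Application
Name=My Editor
Exec=geany %F
Icon=/usr/share/icons/hicolor/48x48/apps/geany.png
Terminal=false
EOF
```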
Dropbox: Installed from binaries, not repos (Boo!). This version refused to start on startup; I needed to create an autostart file. Still figuring out how to terminate its process at shutdown, and wondering why in the name of Mordor that isn't part of the .desktop file vocabulary in the first place.
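The autostart file, for the record, is just another .desktop entry dropped into ~/.config/autostart. A sketch, assuming the binary install parked the daemon at ~/.dropbox-dist/dropboxd (adjust the path to wherever yours actually landed):

```shell
# Autostart entry for a binary-installed Dropbox daemon.
# The daemon path is an assumption -- point it at your real install.
mkdir -p ~/.config/autostart
cat > ~/.config/autostart/dropbox.desktop <<'EOF'
[Desktop Entry]
Type=Application
Name=Dropbox
Exec=sh -c "$HOME/.dropbox-dist/dropboxd"
X-GNOME-Autostart-enabled=true
EOF
```

(The sh -c wrapper is there because Exec= lines don't expand $HOME on their own.)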
The "AMP" part of "LAMP": I'm spoiled by having done this install with a single terminal command in the past. I'd forgotten how crazy-simple that made things. And I was about to be forcibly reminded...
Apache: User directories are enabled differently in 2.4 than they were in 2.2. So, two tries there.
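The gotcha, for my notes: the 2.2-era access directives are gone in 2.4, so a userdir stanza copied from old notes silently fails. A sketch of the relevant bit of /etc/apache2/mods-available/userdir.conf (not the whole file):

```
<Directory /home/*/public_html>
        AllowOverride FileInfo AuthConfig Limit Indexes
        Options MultiViews Indexes SymLinksIfOwnerMatch
        # Apache 2.2 wanted:  Order allow,deny / Allow from all
        # Apache 2.4 wants:
        Require all granted
</Directory>
```

Then sudo a2enmod userdir and bounce Apache.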
PHP: Well-behaved, though I had to go briefly spelunking for the .conf file.
MySQL: An absolute NIGHTMARE. Debian 9 actually installs the MySQL fork "MariaDB" and masks it with "mysql" executables. In a PR juke worthy of K Street, MariaDB is billed as a "drop-in replacement for MySQL." Uh-huh...yeeeeeah...so, about that... There's no prompt for a root password during install, which was the root (pun intended) of a morning's worth of hair-pulling. I could "anonymously" interact with the command-line UI, but all GUIs refused to connect. Between two separate Q&A sites, I managed to piece together the info I needed to disable the anonymous command-line access and create an uber-sooper-dooper admin user to take the place of root. Oh, and installing the mysql binaries from Oracle? Fuggedabouddit...I couldn't even connect to the server from the command-line UI (much less Workbench), so I went back to MariaDB after forcibly ejecting the borked MySQL installation.
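For the notes, since it cost a morning: the root (heh) cause is that Debian's MariaDB authenticates root via the unix_socket plugin, so there is no root password to prompt for, and TCP clients (i.e. every GUI) are out of luck. The fix that finally stuck was a password-based superuser; 'admin' and 'changeme' below are placeholders, obviously:

```sql
-- Run these from "sudo mysql" (a root shell gets in via the socket,
-- no password needed). Pick your own username and password.
CREATE USER 'admin'@'localhost' IDENTIFIED BY 'changeme';
GRANT ALL PRIVILEGES ON *.* TO 'admin'@'localhost' WITH GRANT OPTION;
FLUSH PRIVILEGES;
```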
MySQL Workbench: Special mention b/c I literally lost track of how many times I installed/uninstalled it. Blows up on login, even with a valid username/password. No surprise -- it's been trouble since the days (15 years ago) when "MySQL Query Browser" refused to run at all on Windows 2000. For now, PhpMyAdmin and the Database tool in NetBeans will, between them, do what I need to do.
PhpMyAdmin: Speaking of which, there's now an extra step to install the missing mbstring and gettext modules. Otherwise, it worked like a charm after the main authentication issues were sorted out.
NetBeans: Lessons learned: Go with the platform-independent version, and designate the base JDK folder, not the path to the executable. (That screw-up was sheer PEBKAC on my part, btw.) Blessings upon the Moncton Java guru who saved my bacon on that one. May the fries in his poutine always stay crispy. (I think that's a legit. Canadian blessing, but don't quote me...)
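Filed under lessons learned: the setting lives in etc/netbeans.conf inside the NetBeans install folder, and it wants the JDK's top-level directory. A sketch (the path below is a typical Debian location, not gospel):

```
# etc/netbeans.conf -- point at the JDK folder, NOT at bin/java
netbeans_jdkhome="/usr/lib/jvm/java-8-openjdk-amd64"
```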
Fritzing: It's in the Debian repos, but the parts bin is a separate install. (Whyyyy???)
Geany: Apparently the per-language syntax highlighting changed sometime between versions 1.23 and 1.29. I've temporarily thrown in the towel on custom keyword highlighting for OpenSCAD. (On the plus side, this version of OpenSCAD doesn't lose the menu-bar, so booyah.) But there went most of the afternoon.
And that's just the high points...
Along the way, of course, I learned a few new SysAdmin-type tricks. Mind you, I have no illusions that I'll remember them, but at least we won't be strangers when inevitably we meet again. I have a renewed & enhanced appreciation for the "Ask Ubuntu" community and the global treasure that is StackOverflow. And I would be remiss not to note an array of independent bloggers and the Google algorithm-finders that put them at my fingertips.
In the past, I've had a few sharp things to say about "free" software in the context of modern capitalism. And, let's not pussy-foot around it: I use this software gig-in and gig-out to earn my crust. But there is less than no question that my hard-won knowledge (not to be confused with "wisdom" because that's another beastie) cost me scarcely a drop compared to the Great Lake that is the sum of FOSS efforts going back decades.
Yes, I've drastically expanded and revised my notes. Mainly because I have a DisplayPort to DVI adapter on the way that will allow me to plug this laptop into the KVM switch (or so I trust) and take the place of the current workstation. (Sort of a two-for-one win-win.) If that-all pans out, I might just spring for the SDRAM to bring the memory up to 8GB. And after that goes down, it's finally Big Grey's turn for a rebuild and (likely) rebirth as a dedicated "maker" machine (Arduino, AVR, and 3D printing).
After that, of course, the value of the information in those notes will incrementally decay from "asset" to "liability." Next rebuild, I'll know better than to just whip them out thinking, "It's cool--I got this." "How can you be so sure?" my Gentle Reader might (justifiably) wonder. Don't be silly, GR: Of course I added a note IN ALL CAPS at the very top of the document. If you can count on a recovering tech. writer for anything, it's reading the documentation.
- - - - -
"One Piece at a Time" - Wayne Kemp
Well, I left Kentucky back in forty nine
An' went to Detroit workin' on a 'sembly line
The first year they had me puttin' wheels on Cadillacs
Every day I'd watch them beauties roll by
And sometimes I'd hang my head and cry
'Cause I always wanted me one that was long and black.
One day I devised myself a plan
That should be the envy of most any man
I'd sneak it out of there in a lunchbox in my hand
Now gettin' caught meant gettin' fired
But I figured I'd have it all by the time I retired
I'd have me a car worth at least a hundred grand.
I'd get it one piece at a time
And it wouldn't cost me a dime
You'll know it's me when I come through your town
I'm gonna ride around in style
I'm gonna drive everybody wild
'Cause I'll have the only one there is around.
So the very next day when I punched in
With my big lunchbox and with help from my friends
I left that day with a lunch box full of gears
I've never considered myself a thief
But GM wouldn't miss just one little piece
Especially if I strung it out over several years.
The first day I got me a fuel pump
And the next day I got me an engine and a trunk
Then I got me a transmission and all the chrome
The little things I could get in my big lunchbox
Like nuts, an' bolts, and all four shocks
But the big stuff we snuck out in my buddy's mobile home.
Now, up to now my plan went all right
'Til we tried to put it all together one night
And that's when we noticed that something was definitely wrong.
The transmission was a fifty three
And the motor turned out to be a seventy three
And when we tried to put in the bolts all the holes were gone.
So we drilled it out so that it would fit
And with a little bit of help with an adapter kit
We had that engine runnin' just like a song
Now the headlight' was another sight
We had two on the left and one on the right
But when we pulled out the switch all three of 'em come on.
The back end looked kinda funny too
But we put it together and when we got through
Well, that's when we noticed that we only had one tail-fin
About that time my wife walked out
And I could see in her eyes that she had her doubts
But she opened the door and said "Honey, take me for a spin."
So we drove up town just to get the tags
And I headed her right on down main drag
I could hear everybody laughin' for blocks around
But up there at the court house they didn't laugh
'Cause to type it up it took the whole staff
And when they got through the title weighed sixty pounds.
I got it one piece at a time
And it wouldn't cost me a dime
You'll know it's me when I come through your town
I'm gonna ride around in style
I'm gonna drive everybody wild
'Cause I'll have the only one there is around.
Ugh! Yeah, RED RYDER
This is the COTTON MOUTH
In the PSYCHO-BILLY CADILLAC Come on
Huh, This is the COTTON MOUTH
And negatory on the cost of this mow-chine there RED RYDER
You might say I went right up to the factory
And picked it up, it's cheaper that way
Ugh!, what model is it?
Well, It's a '49, '50, '51, '52, '53, '54, '55, '56
'57, '58, '59 automobile
It's a '60, '61, '62, '63, '64, '65, '66, '67
'68, '69, '70 automobile.
Friday, July 28, 2017
Monday, June 26, 2017
Death, be not ironic *
The news is a little stale, mainly because I've been lazy when I've not been shaving yaks.
Jean Sammet passed away a little over a month ago. And while I bless the New York Times for dispelling a myth I've lived with for about two decades, I'm equally outraged that this was the first I'd heard of Ms. Sammet's work. When I re-booted my computer programming education in 1996, the curriculum included the COBOL programming language. (Gen-Y and older folks will remember the Y2K non-event, which was preceded by a sharp demand for such, ahem, vintage languages as businesses threw money at decades of technical debt.)
COBOL was/is indelibly associated with Admiral Dr. Grace Hopper, who is largely responsible for the fact that "programming" no longer involves building software with tweezers, pushing ones and zeroes on and off stacks of memory-addresses. (Assembly-language is the closest you can get these days. I tried it once. Once. [shudder]) One of my teachers -- himself a PhD -- was immensely proud of having met her...and the fact that she, by that time a senior citizen, was exhausting to keep pace with.
But, as it turns out, "Amazing Grace" didn't design so much as a feature of the language. No question that her work was absolutely foundational to it (and, really, everything since). But, as Ms. Sammet's obituary points out, Hopper's "Mother of COBOL" moniker is entirely undeserved. (Hello, Halo Effect.) The actual credit belongs to Ms. Sammet and five other programmers who slammed out the design in two weeks. (In the days before Red Bull and foosball tables, if you can believe that!)
What I find interesting about this era in computing history is the sense of a battle for the soul of computer programming. Dr. Hopper's work was, at its base, driven by an abiding love of mathematics. She found the bit-twiddling a needless waste of a mathematician's time, and she bucked management to develop a more English-like grammar. (Woo, skunkworks project!)
In contrast, her fellow force-of-nature, Ms. Sammet, seemed more influenced by working for hardware manufacturers, absorbing the culture of calipers, slide-rules, etc. And it comes through in her view of software as the product of a more rigorous engineering process. (A view echoed during my stint at IBM in the early 2000s when "the process is the product.")
In the end, however, neither of these formidable ladies was entirely correct. Because software was quickly co-opted by business, where neither mathematical precision nor engineering rigour can stand up to the short-term profit motive...and the long-term tendency to kick the can down the road. Fittingly, the NYT obituary for Ms. Sammet concludes:
"COBOL was initially intended as a short-term solution to the problem of handling business data — a technology that might be useful for a year or two until something better came along. But it has lived on. More than 200 billion lines of COBOL code are now in use and an estimated 2 billion lines are added or changed each year, according to IBM Research."
Apart from the term "bug," Adm. Hopper's ubiquitous contribution to the programming lexicon, you'll sometimes see her quoted along the lines of, "The most dangerous phrase in the English language is, 'we've always done it that way.'" But there's also the (anonymous) adage, "If it ain't broke, don't fix it." As programming languages go, "something better" has probably come along in the last fifty years. Just not "better enough" to justify tossing out the engineering that went into the original COBOL-flavoured solutions. And that in itself is one HECK of a memorial.
- - - - -
* In case it matters, title is a riff on one of John Donne's poems.
Wednesday, May 31, 2017
You can't copy-and-paste a career
[Warning: Rant ahead.]
Because filing paperwork that will almost certainly be a waste of paper & printer ink & postage (not to mention the time of all involved) wasn't infuriating enough, this had to land like a fresh cow-pat across my path on Twitter today: Computer science students should learn to cheat, not be punished for it.
The tl;dr summary was best done by Homer Simpson (quoting from memory here): "Marge, don't discourage the boy! Weaseling is an important skill. It's what sets us apart from animals...except, of course, the weasels."
Because, you see, in The Real World(TM), coders copy and paste all the time. And coding in school should reflect the less ethically pristine norms of Silicon Valley. At least, according to a journalist who lists precisely no coding background in his profile.
Oh, and teaching Java as a first language is somehow corroding professional skills. I say "somehow" because there was literally no explanation for that offered in the main article. The CrossTalk URL stalls out. (Pity--it looked much more promising.) The second related URL links to another article by the same author which argues that JavaScript is better because it isn't as scary and thus doesn't discourage the "fundamentally creative endeavor" that coding is supposed to be. (No, really, it said that. I wish I were making that up.)
Because schools, you see, are failing the software industry, and the 10% unemployment rate among UK CS graduates is iron-clad proof that Universities aren't teaching real job skills.
I mean, no real job skills besides sitting in herds pretending to be interested in what the authority figure at the head of the room is saying. Or subsisting on crap food consumed at irregular hours. Or the mad, scrambling stampede ahead of arbitrary deadlines (a.k.a. the semester). Or swallowing the seething rage that comes with individual performance ratings being dragged down by slackers you didn't want on your team in the first place. Or, not least of all, the almost rhythmic filling and emptying of your memory with the Next New Hotness that we need you on the bleeding edge of so we have someone to tap when we're finally forced to use the industry standard five years hence.
And now a journalist is advocating stealing--notice I didn't say borrowing--code as a professional skill.
Now. I spent about a year in the Fourth Estate, and even after more than two decades, I can appreciate how your knowledge has to be the proverbial mile-wide-and-an-inch-deep. I know that I had to lean on other people to understand the intricacies of red clay vs. blue clay in the street renovations, the problems caused by shoddy contracting on the new high school, and even which number under that pile of jerseys was actually holding the football. But I leaned on people who actually knew what they were talking about. Reporting on something from your personal "Hello, World!" perspective is a great view into how beginners view things. But it does not qualify you to design curriculum for an entire industry.
But! Surprise twist! I actually agree in principle that four-year University degrees are doing no one any favours by devoting weeks' worth of time to edge cases like sorting algorithms. Or discrete mathematics. Or even much NP-completeness theory beyond the basic epiphany that not all software solutions can be encapsulated by an algorithm. (Turning a brilliant-but-naive coder loose on something that they don't know can only be approximated or brute-forced will waste one heck of a lot of money. Let's avoid that, but not go too crazy on knapsack problems, m'kay?)
And I certainly can't argue against coming out of school knowing unit-testing and source control (particularly the Special Hell(TM) of branch merges). Replace pop-quizzes with Scrum stand-up check-ins, for all I care.
But school already does a bang-up job of reinforcing some of the worst aspects of the world of work. (See above snark on "real job skills.") Stealing code should not be added to those sins.
Borrowing code is an entirely different matter. By "borrowing" I mean citing the source (e.g. the StackOverflow URL) in the comments. That accomplishes a few things:
- It allows you (or the poor slob who has to maintain your code) to go back to the source for reference. Which can answer questions like: "What was the original code meant to accomplish?" "How old is it?" "Was the solution up-voted and/or embellished with further useful comments?"
- It demonstrates to your team-mates and/or bosses that you don't take credit for other people's work.
- If the code completely bombs QA/CI tests, you don't look like quite the idiot you would have had it been your own creation, amirite? ;~)
The good news is that there is a stupid-simple way to stop universities from turning out unqualified junior programmers. Seriously. Paint-chip-eating-stupid-simple.
Ready for it?
Programmer job postings just need to stop requiring CS degrees.
That's it. Supply, meet demand. Next problem, please!
Okay, so there's actually some bad news: The Suits won't stand for that.
Let's take a minute to break that down. If The Suits absolutely must hire an onshore programmer, that's not chump-change. Granted, the modern (read: "internet-ified") job search process does a fantastic job of externalising much of that cost onto the job-seeker. But there is some residual internal cost to hiring. That cost rises dramatically after a newly-acquired warm body is parked in its cubicle-stanchion. Mainly because most new hires require 60-90 days to evolve the dysfunctions necessary to survive in their unique new corporate micro-climate. (But I'm not cynical.) Two to three months of salary plus overhead is not an insignificant investment. The bean-counters are right to want to minimise the risk that the new organism they've introduced will turn out to be a parasite.
Suits want security and stability. Disruption, y'understand, is only a good thing when it happens to other people. For them, post-secondary degrees are a form of insurance -- with the shiny bonus of not having to front the money for it. (If you're a home-owner, think about your bank requiring you to buy mortgage insurance to protect them in case you lose your ability to make payments: It's exactly like that.)
Are all companies so risk-phobic? Of course not. The current (U.S.) average seems to be that only about half of software developers hold a CS degree. It's absolutely possible to get a coding job without checking that box. Just not at places like Google, of course. Also, that expensive piece of paper, broadly speaking, is leverage for a higher starting salary (upon which future raises and starting salaries at other companies are largely dependent).
Because any stable company -- the kind we work for when we don't feel like gambling our ability to pay next month's rent on the chance of owning a private island -- is generally large enough to have a candidate-filtering mechanism called Human Resources.
I've had the privilege of knowing some very, very sharp HR folks. Yet nary a one of them can tell you whether or not my GitHub check-ins are crap. My StackOverflow score? That's literally just a number...one without an anchor (Pareto distributions and all that). Obviously, higher is better. But what's the baseline minimum? And would a baseline even mean anything? Think about wine for a second. It's rated on a hundred-point scale, but its scores (among its self-appointed referees) can be all over the map. And you can still pay top dollar for what tastes like plonk to you. But HR's gonna extrapolate from an up-vote count? Yeeeeeeah...
The thing is, if programming were treated like every other profession instead of the Dark Alchemic Art that it emphatically is not, none of this would even be an issue. And I place the blame squarely on business. When you can't trust the standard metrics, make your own. You need demonstrable coding skills? Have your current developers put together a quiz for candidates. There's software for that. Does this person have language-specific certifications? Great. Take a quick peek at the GitHub/StackOverflow oeuvres to see whether they've actually been used. Do they have a blog? Do they give any presentations to other coders? Google is your friend. And you don't even have to leave the office. Yet.
Once you have a handful of candidates, get your butt off the internet and meet them. Preferably in a third-party setting. Then coordinate in-person interviews with those that make the cut. This is where HR really earns its salary. While you (or your technical evaluators) are swimming in alphabet soup, they're looking for the social cues.
Sure, the internet is a great (and relatively cheap) way to boost the signal. But it also affects the signal-to-noise ratio like, whoa. So the first-line filter (a.k.a. HR) needs some criteria to weed out the self-proclaimed ninjas, rock stars, and unicorns of our Dunning-Kruger age. Employment histories will list date-ranges and, hopefully, mention the technology stack used by the previous employer. But there's rarely room to mention how long or how extensively any given technology was used in the field.
The bottom line is that, without training to evaluate code, HR doesn't have the resources to make this call on their own. At best, HR can do the legwork of plugging code samples into search engines to scan for plagiarism (assuming you care about that). Lacking proper domain knowledge, they have every incentive to fall back on traditional metrics. Which includes "outsourcing" the "follow up" / "circular file" judgement-call to the four-star rating-system of (you guessed it!) the good ol' college GPA.
At which point, you (as the hypothetical employer) have effectively lost your right to complain about universities not preparing CS students for real-world coding work. And fer cryin' in your Double-Double, please don't expect colleges to reward plagiarism. This world is already too much a kleptocracy, thanks.
Because filing paperwork that will almost certainly be a waste of paper & printer ink & postage (not to mention the time of all involved) wasn't infuriating enough, this had to land like a fresh cow-pat across my path in Twitter today: Computer science students should learn to cheat, not be punished for it.
The tl;dr summary was best done by Homer Simpson (quoting from memory here): "Marge, don't discourage the boy! Weaseling is an important skill. It's what sets us apart from animals...except, of course, the weasels."
Because, you see, in The Real World(TM), coders copy and paste all the time. And coding in school should reflect the less ethically pristine norms of Silicon Valley. At least, according to a journalist who lists precisely no coding background in his profile.
Oh, and teaching Java as a first language is somehow corroding professional skills. I say "somehow" because there was literally no explanation for that offered in the main article. The CrossTalk URL stalls out. (Pity--it looked much more promising.) The second related URL links to another article by the same author which argues that JavaScript is better because it isn't as scary and thus doesn't discourage the "fundamentally creative endeavor" that coding is supposed to be. (No, really, it said that. I wish I were making that up.)
Because schools, you see, are failing the software industry, and the 10% unemployment rate among UK CS graduates is iron-clad proof that Universities aren't teaching real job skills.
I mean, no real job skills besides sitting in herds pretending to be interested in what the authority figure at the head of the room is saying. Or to subsist on crap food consumed at irregular hours. Or the mad, scrambling stampede ahead of arbitrary deadlines (a.k.a. the semester). Or swallowing the seething rage that comes with individual performance ratings being dragged down by slackers you didn't want on your team in the first place. Or, not least of all, the almost rhythmic filling and emptying of your memory with the Next New Hotness that we need you on the bleeding edge of so we have someone to tap when we're finally forced to use the industry standard five years hence.
And now a journalist is advocating stealing--notice I didn't say borrowing--code as a professional skill.
Now. I spent about a year in the Fourth Estate, and even after more than two decades, I can appreciate how your knowledge has to be the proverbial mile-wide-and-an-inch-deep. I know that I had to lean on other people to understand the intricacies of red clay vs. blue clay in the street renovations, the problems caused by shoddy contracting on the new high school, and even which number under that pile of jerseys was actually holding the football. But I leaned on people who actually knew what they were talking about. Reporting on something from your personal "Hello, World!" perspective is a great view into how beginners view things. But it does not qualify you to design curriculum for an entire industry.
But! Surprise twist! I actually agree in principle that four-year University degrees are doing no one any favours by devoting weeks' worth of time to edge cases like sorting algorithms. Or discrete mathematics. Or even much NP-completeness theory beyond the basic epiphany that not all software solutions can be encapsulated by an algorithm. (Turning a brilliant-but-naive coder loose on something that they don't know can only be approximated or brute-forced will waste one heck of a lot of money. Let's avoid that, but not go too crazy on knapsack problems, m'kay?)
And I certainly can't argue that coming out of school knowing unit-testing, source control (particularly the Special Hell(TM) of branch merges) aren't a bad thing. Replace pop-quizzes with Scrum stand-up check-ins, for all I care.
But school already does a bang-up job of reinforcing some of the worst aspects of the world of work. (See above snark on "real job skills.") Stealing code should not be added to those sins.
Borrowing code is an entirely different matter. By "borrowing" I mean citing the source (e.g. the StackOverflow URL) in the comments. That accomplishes a few things:
- It allows you (or the poor slob who has to maintain your code) to go back to the source for reference. Which can answer questions like: "What was the original code meant to accomplish"? "How old is it?" "Was the solution up-voted and/or embellished with further useful comment?"
- It demonstrates to your team-mates and/or bosses that you don't take credit for other people's work.
- If the code completely bombs QA/CI tests, you don't look like quite the idiot you would have if it had been your own creation, amirite? ;~)
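To make the "borrowing" convention concrete, here's a minimal sketch of what a cited snippet might look like in practice. (The URL, dates, and the function itself are invented for illustration; the point is the shape of the attribution comment, not the specifics.)

```python
def chunked(seq, size):
    """Split a sequence into lists of at most `size` items.

    Borrowed (not stolen!) from a StackOverflow answer:
      Source: https://stackoverflow.com/q/0000000  (placeholder URL)
      Retrieved: 2017-02; accepted answer, up-voted, with useful comments
      Original purpose: paginating arbitrary sequences
    """
    return [seq[i:i + size] for i in range(0, len(seq), size)]


print(chunked(list(range(7)), 3))  # [[0, 1, 2], [3, 4, 5], [6]]
```

Six months from now, the maintainer (possibly you) can follow that trail back to the original discussion instead of reverse-engineering the intent from scratch.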
The good news is that there is a stupid-simple way to stop universities from turning out unqualified junior programmers. Seriously. Paint-chip-eating-stupid-simple.
Ready for it?
Programmer job postings just need to stop requiring CS degrees.
That's it. Supply, meet demand. Next problem, please!
Okay, so there's actually some bad news: The Suits won't stand for that.
Let's take a minute to break that down. If The Suits absolutely must hire an onshore programmer, that's not chump-change. Granted, the modern (read: "internet-ified") job search process does a fantastic job of externalising much of that cost on the job-seeker. But there is some residual internal cost to hiring. That cost rises dramatically after a newly-acquired warm body is parked in its cubicle station. Mainly because most new hires require 60-90 days to evolve the dysfunctions necessary to survive in their unique new corporate micro-climate. (But I'm not cynical.) Two to three months of salary plus overhead is not an insignificant investment. The bean-counters are right to want to minimise the risk that the new organism they've introduced will turn out to be a parasite.
Suits want security and stability. Disruption, y'understand, is only a good thing when it happens to other people. For them, post-secondary degrees are a form of insurance -- with the shiny bonus of not having to front the money for it. (If you're a home-owner, think about your bank requiring you to buy mortgage insurance to protect them in case you lose your ability to make payments: It's exactly like that.)
Are all companies so risk-phobic? Of course not. The current (U.S.) average seems to be that only about half of software developers hold a CS degree. It's absolutely possible to get a coding job without checking that box. Just not at places like Google, of course. Also, that expensive piece of paper, broadly speaking, is leverage for a higher starting salary (upon which future raises and starting salaries at other companies are largely dependent).
Because any stable company -- the kind we work for when we don't feel like gambling our ability to pay next month's rent on the chance of owning a private island -- is generally large enough to have a candidate-filtering mechanism called Human Resources.
I've had the privilege of knowing some very, very sharp HR folks. Yet nary a one of them can tell you whether or not my GitHub check-ins are crap. My StackOverflow score? That's literally just a number...one without an anchor (Pareto distributions and all that). Obviously, higher is better. But what's the baseline minimum? And even if there were a baseline, what would it tell you? Think about wine for a second. It's rated on a hundred-point scale, but its scores (among its self-appointed referees) can be all over the map. And you can still pay top dollar for what tastes like plonk to you. But HR's gonna extrapolate from an up-vote count? Yeeeeeeah...
The thing is, if programming were treated like every other profession instead of the Dark Alchemic Art that it emphatically is not, none of this would even be an issue. And I place the blame squarely on business. When you can't trust the standard metrics, make your own. You need demonstrable coding skills? Have your current developers put together a quiz for candidates. There's software for that. Does this person have language-specific certifications? Great. Take a quick peek at the GitHub/StackOverflow oeuvres to see whether they've actually been used. Do they have a blog? Do they give any presentations to other coders? Google is your friend. And you don't even have to leave the office. Yet.
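For what it's worth, a developer-authored screening question doesn't have to be elaborate. Here's a sketch of the kind of ten-minute exercise your current team could write and grade (the task and names are invented for illustration, not a canonical quiz):

```python
# Candidate prompt: implement dedupe_keep_order so the checks below pass.
def dedupe_keep_order(items):
    """Remove duplicates from `items`, preserving first-seen order."""
    seen = set()
    result = []
    for item in items:
        if item not in seen:
            seen.add(item)
            result.append(item)
    return result


# The "answer key" your evaluators run against each submission:
assert dedupe_keep_order([3, 1, 3, 2, 1]) == [3, 1, 2]
assert dedupe_keep_order([]) == []
assert dedupe_keep_order(["a", "a", "b"]) == ["a", "b"]
```

Unlike a GPA or an up-vote count, a question like this measures exactly the skill you're hiring for, and your own developers can calibrate what a passing answer looks like.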
Once you have a handful of candidates, get your butt off the internet and meet them. Preferably in a third-party setting. Then coordinate in-person interviews with those who make the cut. This is where HR really earns its salary. While you (or your technical evaluators) are swimming in alphabet soup, they're looking for the social cues:
- What's their body language? Closed? Aggressive?
- How do they react when they're challenged/corrected?
- Do they treat people of different genders/ethnicities differently?
- How much of their attention goes to people who were not "authority figures"?
- When talking about previous teamwork, what's the "We"-to-"I" ratio?
- Do they put their feet up on my desk during our 1:1? (No joke, this legit. happened to a recruiter I worked with. Needless to write, the candidate was not invited back.)
Sure, the internet is a great (and relatively cheap) way to boost the signal. But it also affects the signal-to-noise ratio like, whoa. So the first-line filter (a.k.a. HR) needs some criteria to weed out the self-proclaimed ninjas, rock stars, and unicorns of our Dunning-Kruger age. Employment histories will list date-ranges and, hopefully, mention the technology stack used by the previous employer. But there's rarely room to mention how long or how extensively any given technology was used in the field.
The bottom line is that, without training to evaluate code, HR doesn't have the resources to make this call on their own. At best, HR can do the legwork of plugging code samples into search engines to scan for plagiarism (assuming you care about that). Lacking proper domain knowledge, they have every incentive to fall back on traditional metrics. Which includes "outsourcing" the "follow up" / "circular file" judgement-call to the four-star rating-system of (you guessed it!) the good ol' college GPA.
At which point, you (as the hypothetical employer) have effectively lost your right to complain about universities not preparing CS students for real-world coding work. And fer cryin' in your Double-Double, please don't expect colleges to reward plagiarism. This world is already too much a kleptocracy, thanks.
Tuesday, January 31, 2017
The presentation I won't give
Today I had lunch with two of the three other Illuminati of the local programmer group. (If you happened to be in the Codiac Cafe on George Street over the noon-ish hour, yeah, that was us in the shadowy grey robes. Next time, come up and say "Hi.")
Anyhoo, in addition to hashing out the details of the upcoming meeting/hackathon, we also knocked around ideas for upcoming presentations. Which involves, of course, recruiting speakers. Wherein lies the problem. Because, you see, the Venn diagram of people who know a lot about something interesting and the people who have the ability to get up in front of a crowd of their peers and roll out that information in a structured, digestible way is not what'cha'call a perfect circle, y'know?
But the rub is that the only people to show up for a "How to Give a Presentation" presentation are likely to come from a single demographic -- i.e. the folks who don't want to see the presenter's feelings hurt.
So I'm doing this online. Think of it as a self-study course. If you're old enough to remember people working out to a Jazzercise VCR, so much the better.
Now. I do at least one presentation a year. Mostly to give back, but partly to look the beast of public speaking in its many eyeballs and say, "I ain't scared 'a you." (The beastie and I both know it's an alternative fact straight-up lie, but that's actually the point here.) But I have an ace in the hole: I was doing this at sixteen for giggles. Okay, not really giggles. I had a massive crush on one of my high school's Debate alpha-team. But that crush put me on a path that led to meeting my husband and the Bestest of BFFs. So who's the real winner here?
Anyhoo. After six or seven years (depending on how you count) of competitive public speaking, I have a few things to pass down to the up-and-coming generation of geek-speakers. Normally, I'd issue a "your mileage may vary" kind of disclaimer, but if you haven't done this more than one or two times, stick with the program until you are comfortable enough not to need it. (This is highly analogous to following a recipe to the letter until you grok what makes it tick, m'kay?)
I'm assuming that you have one of the following:
- A topic you're so passionate about that you're bursting at the seams with its awesomeness, or
- Your boss has "voluntold" you to present on something of relevance to your co-workers
1.) Pull out your favourite plain-text editor. Yes, plain-text. Heck, Notepad, for all I care. Because you are emphatically NOT ALLOWED TO FORMAT ANYTHING at this point. Got that editor fired up? Groovy. Now shut up and let your brain barf out a bullet-list. Single-level only. Don't think: Just get it on the screen. Like, yesterday.
Done? Fabulous. Whatever you do, don't fall in love. Because, about a century before George R. R. Martin, "Murder your darlings" was already legit. literary advice.
2.) Organise. Okay, NOW you're allowed to be hierarchical. But not more than three levels, including section-headers. That effectively leaves you with two levels of detail. Why only two? Because this is soooooooooooo not about you dazzling anyone with nuance; this is about you not boring the ever-loving snot out of the people who are graciously lending you their attention-span.
Cut-and-paste, drag-and-drop until the content feels like it should flow like cream into the brain of someone who knows nothing about the subject.
Nope, don't fall in love here, either. You're going to both murder and mutilate very shortly.
3.) Show, don't tell. Time to lean on images. First, create a folder at the same level as your brain-dump. Now go search and scroll. At a bare-bones minimum, you're going to want images to jazz up your section-header pages. I prefer sly, snarky humour m'self -- big surprise there -- but follow your intended audience's tastes. Whatever you pick, name the file descriptively. You're going to need that information at your fingertips.
Can you convert either-ors into flow-charts? Go. Side-by-side comparisons as tables? Dooo eeeeet. Comparisons/Contrasts as Venn diagrams? Make it so, Number One.
Are you showing code-samples? Excellent. Go type them up, make sure they compile, and then screen-capture them. Ditto the output. This is your insurance against wi-fi issues. Also: You know all those nervous presenters you've seen trying to type code live? This is precisely why you won't be them. You're welcome.
4.) Unit-testing. First, brace yourself for some ugly truths. Deep breath. Now, translate your outline (and images) to simple slides. Simple slides, d'ya hear?! PUT THAT ANIMATION MENU DOWN. Animations are for closers. (Sorry-not-sorry, Alec Baldwin.)
Uh-oh: Some of those topics don't fit onto a single slide, now do they? Huh. Guess you should be thinking about how you're going to break them up into separate topics, mmmm? 'sokay, it's not like you lose points for this. This is the "mutilation" I talked about. Good thing you never let that outline give you big chibi puppy-dog-eyes, amirite?
When you're done, convert the whole thing to PDF, and close your slide editor.
5.) First integration test. Close the door. Keep your laptop/keyboard within arm's reach. Open the document in PDF format. Annnnnnnnd...present! (No stopping allowed -- just plough through it, already.)
Dread Cthulhu, that was painful. All that goodness in your head doesn't always quite make it to the spoken word, no? The good news is that the presentation will never sound that gawdawful again. The bad news is that you have a bunch more iterative work in front of you.
6.) Cull content. First things first. Let's get rid of the stuff that your audience, on second glance, doesn't actually need to know to understand the main points. Just get rid of it and don't look back. Yep--murder your darlings. "Red Wedding" style if necessary.
7.) Re-organise ruthlessly. Did some sections really seem like non-sequiturs when you were fumbling your way through them? Move them before they try to consolidate their positions. Do some still stick out like the proverbial sore thumb? You might want to re-think their importance.
8.) Subsequent integration tests. Follow the recipe for "First integration test." This is basically analogous to the "childbirth amnesia" women experience between their first and second (and even subsequent) children. Sure, the first time is generally the worst, but even so...yeeowch. But, in fairness, perhaps not quite as bad. Maybe.
Keep iterating: Cull and re-organise. Work on something else for awhile and let things simmer on the proverbial back-burner. The content is starting to fall into its rightful places.
9.) Refine the flow. No slide-deck, however well put together, will always flow seamlessly from one section into the next. Or subsection. Or maybe even between bullet-points on the same slide. You probably noticed that during the iterations, yes? That's okay. Seriously okay. Because, really, if it were All About the bullet-points alone, you could (and -- let's face it -- should) just email the slide-deck and let people read it at their leisure. Your value-add is to, quite literally, read between the lines. And the page-breaks, for that matter.
Condense your bullet-points. They exist as hooks for your content, the war-stories you're going to tell, the experienced opinions you're going to lay down. You are there to riff on, not read from, the slides, remember? Now start bridging the slides with transition material. Trust me: This material is more for your sake than even your audience's.
Repeat until your gut tells you it's done. You're not really done, but roll with the sweet illusion for a short time, m'kay?
10.) Beta-test with a trusted (and brutally honest!) peer. And no, I don't mean your cat. I'm talking about someone who has a similar background but not the level of expertise in your subject matter. This will be painful, guaranteed. But you're already used to that. If you've picked the right peer, the feedback will be tough to process. That's okay. The trick is to triage the big-ticket items (e.g. she was totally lost with this whole section). Ignore the nit-picky stuff. No, really. Yes, it's the easiest to fix. But in terms of bang-for-the-loonie? Fuggedabbouddit.
11.) Dance without a net. By this time (several days in), your creation should be starting to breathe on its own. At the risk of sounding pervy, take it into the shower with you. (Not the laptop, silly!) Blanking between sections is perfectly normal and, really, who cares what the shower curtain thinks? If you're visually-oriented like me, the images you picked out for your section-header slides will help trigger the unwritten "glue" content.
12.) Dress rehearsal. At least a day (but no more than two days) before the scheduled presentation, do a couple of runs. If you can arrange to make them in the same conference room (or whatever) in which you'll give the final product, so much the better. What's in your head and what comes out of your mouth should be fairly close on the second try.
Pro tip: Limit the # of rehearsals per day if you don't want your vocal cords to turn on you on The Big Day. If you're still feeling dicey two days out, try three whispered full rehearsals, but no more.
13.) The real deal. Honestly, there's nothing that I can say or that you can do that will truly prepare you for all the eyeballs boring into you when you get up. You will be terrified...and that's okay. Like Q said to James Bond: "Never let them see you bleed." You're going to be riding on obsessive preparation and your think-meat's muscle-memory. This is precisely why you screen-capped your code samples and output.
Okay...because you've read this far and (apparently) trust my experience on this, I'm going to drop an ugly secret: You're going to need to go through this process many times -- dozens, in fact -- before you can trust your mind to stuff your lizard-brain into a sound- and chew-proof box. And the box is the best you can hope for. That lizard-beastie will always be there.
Sorry 'bout that, but to tell you otherwise would be to lie to you. Not to mention short-circuit the process of you becoming that rarest of birds: The geek who can teach. And we need more of you more than ever. The opening shots of a war between the people who actually know what's going on and the mouth-breathing ideologues who think they can shoot from the hip have already been fired. And not by our side.
So, to tweak the closing line from all my presentations: Now get out there and teach something awesome.
Wednesday, November 16, 2016
Chasms and bridges
Last night's meeting of the local programmers' club (The Moncton User Group) was another departure from the usual lecture-with-Q-and-A format. This past spring, the MUG "illuminati" threw together a three-person panel of experienced software developers and structured the discussion around the theme of "To Be a Great Developer." Turnout was phenomenal. And, after a few "seed" questions (prepared in advance), questions from the audience flowed like wine.
So we tried it again with a different panel, again with excellent turnout. Which included one lone non-programmer--basically, someone looking to get into the proverbial head of his (hypothetical?) technical co-founder. (Needless to write, I was overjoyed at the hustle.)
Another thing they don't teach you in Programmer School(TM) is that the words "I'm looking for a technical co-founder" mean "I expect someone else to do all the development without a guaranteed paycheck." And I'm not necessarily knocking that attitude. Mainly because I don't buy into the idea that any job should be solely about the paycheck. (When it is, that's an unconscionable Management #FAIL.)
For a programmer, there are any number of reasons to step into the alternate-reality bubble of a brand-new startup. Maybe she wants to make a bazillion dollars as a founder. Maybe she's really scratching her own itch, and the non-technical founder is just there to make the process scale. Maybe she wants to stick it to a particularly oppressive oligopoly. Maybe she wants to make something breathtakingly new. Maybe she just wants full control over her greenfield code. Maybe she wants the cachet of working for a sexy startup.
Any of those is a perfectly legitimate reason for a garden-variety software developer to become a technical co-founder. Or, for that matter, join an ambitious start-up at a steep discount. Which is precisely what the non-technical co-founder is looking for as they hustle an actual business into existence.
But here's the thing: Once that bargain between the non-technical and technical co-founder is made, it cannot be changed without steep penalties. And both the technical and non-technical founders have to grok that.
So here's the war-story I couldn't tell last night. (I moderated the discussion, so it would have been out of line.) Disclaimer: Everyone involved has since moved on to other things, so I'm not "tattling." A few years ago, I was approached by someone I knew only through social media. "Do you want to change the world?" was the pitch. Fair enough: The product hit that (narrow) sweet spot between "disruptive" and "in the public interest." (Go on...) There were meet-n-greets and salary-vs.-equity numbers thrown around and blahblahblah. But then the founder, kicking back at their desk, said, "I want to make a @#$%^~* lot of money."
Whoa there, Nellie: That's not the horse I saddled.
For various reasons--mostly unrelated and outside my control--our association never got off the ground. For me, it was an expensive lesson that opened my eyes to the self-involved monomania involved in running a startup. The lesson which I'm "donating" here, however, is something beyond that. I prefer to believe that the bait-and-switch chronicled above was not deliberate. Doing well and doing good are not always mutually exclusive. I prefer to believe that, too.
A lot of ink (real and digital) has been spilled on the notion that vision, creativity, and commitment (the critical DNA of a startup for which money is the spark of life) cannot be bought. Fair point. It's been documented since the 1950s that tying pay to creativity actually makes people less creative. But what's missing from those essays (of which I am guilty) is that just because those loftier qualities can't be bought does not mean that they can't be sold.
Seems like a contradiction, no?
Let's throw back to good ol' Dr. Maslow and his famous hierarchy. Except think of each tier as its own separate country--complete with its own currency.
When a person is working strictly for a paycheck, s/he is being paid primarily in the currency of the "Safety" country. Which, since we came down from the trees and invented trade, means that the currency exchange into the "Physiological" tier is pretty much a 1:1 transaction. An ample and steady paycheck assures that this person stays comfortably in this zone. That's normally not something a startup can offer.
However, someone taking a pay cut to work for a hot start-up, by contrast, is being paid at least partly in the currency of the "Esteem" country. Similarly, someone working to change the world (or to stick it to The Man) is cashing their check at the "Self-Actualisation" bank.
Switching currencies without notice--and especially without paying the exchange rate--is management and leadership suicide. A mass exodus of technical founders and early hires is thoroughly justified: Only naifs or morons should expect otherwise.
For instance, when working the traditional (and increasingly mythical) 9-to-5 job, one expects to spend 40+ hours away from the family one might be supporting. That's the deal. Beyond that? Time-and-a-half overtime was at least partly intended to be a penalty for taking away one's time with loved ones. Similarly, when Yahoo's Marissa Mayer rescinded work-from-home flexibility with no increase in compensation, the furor was completely predictable. That's the Safety-to-Belonging (read: family) exchange rate in action. In the case of time-and-a-half, the exchange rate is agreed on. In the case of Yahoo, it was unacknowledged--hence, the (well-deserved) backlash.
Thus, when a glamourous unicorn startup is acquired by a company with deep pockets, each and every aqui-hire will (rightfully) expect to receive deep-pocket wages. If, for no other reason that the fact that the startup's "glamour discount" on their paycheck is no longer justified. Goodbye, cool factor; hello, boring behemoth. Esteem currency, meet Safety currency. Cha-ching!
And, finally, there's that idealistic, self-motivating Self-Actualisation crowd with their penthouse view of the hierarchy. From where I sit, commuting Safety currency to Self-Actualisation currency seems like pure alchemy. Unless you're Elon Musk, maybe. Otherwise? Morality, meet profit motive. Creativity, meet "...because we've always done it this way." Spontaneity, meet process. Problem solving, meet executive fiat. Lack of prejudice and acceptance of facts, may I introduce bureaucracy and vanity metrics. It's gonna take a huge payday to cover the exchange rate for all the chips that are cashed at that exit event.
Yet the technical co-founder as well as the early hires have some responsibility for maintaining the integrity of the currencies and their personal exchange rates. Mostly that involves respecting one's market value and being willing to take it elsewhere, of course. But also recognising the necessary pitfalls that come with a successful startup.
Geoffrey Moore's Crossing the Chasm details the process of bringing a new-new product (meaning something that didn't previously exist...particularly one that consumers have to actively wrap their brains around) to market. The "chasm" is the no-person's-land between the bleeding edge early adopters and the outer edges of the mainstream. In software development terms, that generally translates to going from simply "Make it work" to "Make it secure" + "Make it fast" + "Make it pretty" + "Make it scale."
That all involves more people. Assuming that the money doesn't run out first, that means more eyeballs on the product and different perspectives on where it is headed. Assuming outside funding, those "different perspectives" are guaranteed to devalue technical perfection in favour of faster time to market. And also guaranteed to cramp your style as CTO, even if you're only in it for your share of the jackpot.
As the company scales up, maybe you're brave enough to hire people just as bright and fearless and self-directing as you are. That's no guarantee that anyone else will be. So not everyone you work with will necessarily be as motivated and/or talented as the original gang. And, strangely enough, senior devs are grumpy about on-boarding new-hires when they still have deadlines to hit. Point new-hires at the wiki? That's been gathering dust for months. Maybe they can just jump in head-first via source control history? Ha! Not even an Ent could make sense of all those branches. And oh, dear sweet FSM, not another Slack channel--you gotta be kidding me. Can I pretty-please just get back to coding The Thing?
(Hint: Nope.)
And then comes the day when you've busted your rear to make month-over-month profits sustainable. Which is precisely when the VCs decide that they need "someone to take us to the next level." Whether that turns out to be a seasoned industry insider or just the douchebro smarmasaurus ex-roommate of one of the VCs, you can pretty much kiss any remaining startup vibe goodbye.
Exactly how much equity will to make up for that in terms of cold, hard cash? That's not a rhetorical question. That's a number that should be revisited regularly and often. Preferably objectively, and not merely in the immediate wake of those little twinges you feel at...
But, as a developer, you get exactly one startup experience before you can't claim that you never knew what hit you. You get exactly one excuse for naively being caught flat-footed with worthless equity and/or a bank balance that hasn't kept pace with your contributions to the company. It's probably better if you get it over with earlier in your career, but that's ultimately up to you.
Because your "reward" for bringing your product across the chasm is to turn around and find the bridge burned behind you. Like I said, all those intangible "perks" living in the upper stories of Maslow's hierarchy can't be bought. But now they have to be cashed out regardless of whether or not you're ready to sell. Just make sure that the payout matches your exchange rate. After all, you might just want to build yourself a new bridge into your next startup.
So we tried it again with a different panel, again with excellent turnout. Which included one lone non-programmer--basically, someone looking to get into the proverbial head of his (hypothetical?) technical co-founder. (Needless to write, I was overjoyed at the hustle.)
Another thing they don't teach you in Programmer School(TM) is that the words "I'm looking for a technical co-founder" mean "I expect someone else to do all the development without a guaranteed paycheck." And I'm not necessarily knocking that attitude. Mainly because I don't buy into the idea that any job should be solely about the paycheck. (When it is, that's an unconscionable Management #FAIL.)
For a programmer, there are any number of reasons to step into the alternate-reality bubble of brand-new startup. Maybe she wants to make a bazillion dollars as a founder. Maybe she's really scratching her own itch, and the non-technical founder is just there to make the process scale. Maybe she wants to stick it to a particularly oppressive oligopoly. Maybe she wants to make something breathtakingly new. Maybe she just wants full control over her greenfield code. Maybe she wants the cachet of working for a sexy startup.
Any of those is a perfectly legitimate reason for a garden-variety software developer to become a technical co-founder. Or, for that matter, join an ambitious start-up at a steep discount. Which is precisely what the non-technical co-founder is looking for as they hustle an actual business into existence.
But here's the thing: Once that bargain between the non-technical and technical co-founder is made, it cannot be changed without steep penalties. And both the technical and non-technical founders have to grok that.
So here's the war-story I couldn't tell last night. (I moderated the discussion, so it would have been out of line.) Disclaimer: Everyone involved has since moved on to other things, so I'm not "tattling." A few years ago, I was approached by someone I knew only through social media. "Do you want to change the world?" was the pitch. Fair enough: The product hit that (narrow) sweet spot between "disruptive" and "in the public interest." (Go on...) There were meet-n-greets and salary-vs.-equity numbers thrown around and blahblahblah. But then the founder, kicking back at their desk, said, "I want to make a @#$%^~* lot of money."
Whoa there, Nellie: That's not the horse I saddled.
For various reasons--mostly unrelated and outside my control--our association never got off the ground. For me, it was an expensive lesson that opened my eyes to the self-involved monomania involved in running a startup. The lesson which I'm "donating" here, however, is something beyond that. I prefer to believe that the bait-and-switch chronicled above was not deliberate. Doing well and doing good are not always mutually exclusive. I prefer to believe that, too.
A lot of ink (real and digital) has been spilled on the notion that vision, creativity, and commitment (the critical DNA of a startup for which money is the spark of life) cannot be bought. Fair point. It's been documented since the 1950s that tying pay to creativity actually makes people less creative. But what's missing from those essays (of which I am guilty) is that just because those loftier qualities can't be bought does not mean that they can't be sold.
Seems like a contradiction, no?
Let's throw back to good ol' Dr. Maslow and his famous hierarchy. Except think of each tier as its own separate country--complete with its own currency.
When a person is working strictly for a paycheck, s/he is being paid primarily in the currency of the "Safety" country. Which, since we came down from the trees and invented trade, means that the currency exchange into the "Physiological" tier is pretty much a 1:1 transaction. An ample and steady paycheck assures that this person stays comfortably in this zone. That's normally not something a startup can offer.
However, someone taking a pay cut to work for a hot start-up, by contrast, is being paid at least partly in the currency of the "Esteem" country. Similarly, someone working to change the world (or to stick it to The Man) is cashing their check at the "Self-Actualisation" bank.
Switching currencies without notice--and especially without paying the exchange rate--is management and leadership suicide. A mass exodus of technical founders and early hires is thoroughly justified: Only naifs or morons should expect otherwise.
For instance, when working the traditional (and increasingly mythical) 9-to-5 job, one expects to spend 40+ hours away from the family one might be supporting. That's the deal. Beyond that? Time-and-a-half overtime was at least partly intended to be a penalty for taking away one's time with loved ones. Similarly, when Yahoo's Marissa Mayer rescinded work-from-home flexibility with no increase in compensation, the furor was completely predictable. That's the Safety-Family exchange rate in action. In the case of time-and-a-half, the exchange rate is agreed on. In the case of Yahoo, it was unacknowledged--hence, the (well-deserved) backlash.
Thus, when a glamorous unicorn startup is acquired by a company with deep pockets, each and every acqui-hire will (rightfully) expect to receive deep-pocket wages. If for no other reason than the fact that the startup's "glamour discount" on their paycheck is no longer justified. Goodbye, cool factor; hello, boring behemoth. Esteem currency, meet Safety currency. Cha-ching!
And, finally, there's that idealistic, self-motivating Self-Actualisation crowd with their penthouse view of the hierarchy. From where I sit, commuting Safety currency to Self-Actualisation currency seems like pure alchemy. Unless you're Elon Musk, maybe. Otherwise? Morality, meet profit motive. Creativity, meet "...because we've always done it this way." Spontaneity, meet process. Problem solving, meet executive fiat. Lack of prejudice and acceptance of facts, may I introduce bureaucracy and vanity metrics. It's gonna take a huge payday to cover the exchange rate for all the chips that are cashed at that exit event.
Yet the technical co-founder as well as the early hires have some responsibility for maintaining the integrity of the currencies and their personal exchange rates. Mostly that involves respecting one's market value and being willing to take it elsewhere, of course. But also recognising the inevitable pitfalls that come with a successful startup.
Geoffrey Moore's Crossing the Chasm details the process of bringing a new-new product (meaning something that didn't previously exist...particularly one that consumers have to actively wrap their brains around) to market. The "chasm" is the no-person's-land between the bleeding edge early adopters and the outer edges of the mainstream. In software development terms, that generally translates to going from simply "Make it work" to "Make it secure" + "Make it fast" + "Make it pretty" + "Make it scale."
That all involves more people. Assuming that the money doesn't run out first, that means more eyeballs on the product and different perspectives on where it is headed. Assuming outside funding, those "different perspectives" are guaranteed to devalue technical perfection in favour of faster time to market. And also guaranteed to cramp your style as CTO, even if you're only in it for your share of the jackpot.
As the company scales up, maybe you're brave enough to hire people just as bright and fearless and self-directing as you are. That's no guarantee that anyone else will be. So not everyone you work with will necessarily be as motivated and/or talented as the original gang. And, strangely enough, senior devs are grumpy about on-boarding new-hires when they still have deadlines to hit. Point new-hires at the wiki? That's been gathering dust for months. Maybe they can just jump in head-first via source control history? Ha! Not even an Ent could make sense of all those branches. And oh, dear sweet FSM, not another Slack channel--you gotta be kidding me. Can I pretty-please just get back to coding The Thing?
(Hint: Nope.)
And then comes the day when you've busted your rear to make month-over-month profits sustainable. Which is precisely when the VCs decide that they need "someone to take us to the next level." Whether that turns out to be a seasoned industry insider or just the douchebro smarmasaurus ex-roommate of one of the VCs, you can pretty much kiss any remaining startup vibe goodbye.
Exactly how much equity will it take to make up for that in terms of cold, hard cash? That's not a rhetorical question. That's a number that should be revisited regularly and often. Preferably objectively, and not merely in the immediate wake of those little twinges you feel at...
- "We didn't think you wanted to be at that meeting."
- "What's our policy/process for that?"
- "Why are you wasting your team's time interviewing candidates?"
- "Can't we offshore some of this?"
- "All the conference rooms are booked."
- "The Board wants to bring in a growth hacker."
- "You need to step back from the code and focus on the team."
But, as a developer, you get exactly one startup experience before you can't claim that you never knew what hit you. You get exactly one excuse for naively being caught flat-footed with worthless equity and/or a bank balance that hasn't kept pace with your contributions to the company. It's probably better if you get it over with earlier in your career, but that's ultimately up to you.
Because your "reward" for bringing your product across the chasm is to turn around and find the bridge burned behind you. Like I said, all those intangible "perks" living in the upper stories of Maslow's hierarchy can't be bought. But now they have to be cashed out regardless of whether or not you're ready to sell. Just make sure that the payout matches your exchange rate. After all, you might just want to build yourself a new bridge into your next startup.
Tuesday, June 28, 2016
Waterfall vs. Agile Development, an illustration from the NB DOT
This post is mostly about the Department of Transportation (specifically the one of New Brunswick), but first I want to thank the Department of Small Mercies for doing me a solid...or two...or three...
I was scheduled for a noon meeting in Moncton today. The initial meeting announcement listed the location as happening on the campus of one of Moncton's two private one-year colleges. Which narrows it down to a single street address, but nevertheless leaves something to be desired in terms of precision.
So, having enough sense of the organiser's temperament to allow myself a bit of smart-arsery, I enquired as to whether the vagueness was a test of my persistence/resourcefulness, upon which my admission to said meeting would depend. It turns out that the organiser is at least my equal in smart-arsery. Which I naturally took as a challenge: "Oh, honey, it is soooo on," I thought, resolving to be in the room, greeting him with a wave and a Cheshire Cat grin, when he arrived.
Thus, I left Grande Digue with a little over half an hour to spare. Outside of Shediac on (westbound) Highway 15, traffic suddenly slowed to the proverbial crawl. First an ambulance and then a squad car passed our line on the left. Somewhere past the 29km marker, a car lay flipped on its roof, but the rubber-necking was kept to a minimum. Including my own, so my Gentle Reader will have to check the news for the details. (P.S.: Bonne chance, whoever you are...)
Following a short speed up, I encountered the stretch of road which had been stripped of asphalt during my previous run into HubCity. Today the asphalt was fresh, a dotted line had been painted down the centre...and traffic again slowed down to < 10km/hour. At least when it wasn't standing completely still.
And this is where our "illustration" really begins. Because that freshly renewed stretch of highway had been necked down to one lane for kilometres. Kilometres of perfectly sound road, missing only the side-line markers and (possibly) the rumble-strips. With nary a worker nor piece of equipment in sight to justify the buffer-state of asphalt that kept traffic to the pace of a snail on quaaludes. (And, as anyone who lives where other folks vacation can attest, there should be a Special Hell(TM) reserved for anyone who pulls those shenanigans during Tourist Season.)
In software development, there are two ways of making A Thing (for lack of a better term).
- The "Waterfall Development" school harks back to the assembly line of the industrial past (presumably an artefact of lumping Software Engineering in with traditional Engineering). Designers hand off to Coders who hand off to Testers who ultimately hand off to whoever packages the code and delivers it to customers. Henry Ford would feel completely at home in this world.
- The "Agile Development" school hews more to the "throw it against the wall and see if it sticks" line of thought. Which, surprisingly, is also based on manufacturing principles developed in the automotive industry. Except that it was done within the limited resources of a scrappy post-WWII Toyoda (now Toyota).
But that mixing of contexts is, alas, precisely what added an extra twenty minutes to a commute that normally takes thirty.
Context: For my Gentle Readers outside of Canada, New Brunswick is what's known as a "have-not Province," and has been since steam replaced sail. Historically, various Governments (Conservative and Liberal) have cushioned budget shortfalls with "equalisation payments" to guarantee a minimum standard of public services (notably those related to health care). But New Brunswick's debt (and its debt to GDP ratio) has crept up under both brand-name parties. And that's before the previous federal Government decided to jump on the austerity bandwagon.
What with that and the talk (a.k.a. threat) of privatising provincial road-building, it's hardly surprising that the bean-counters have taken over with a vengeance.
Now. Road construction is not my domain, as we say in software development. So some of the following will be, to a greater or lesser extent, conjecture. That being said, I can in all fairness call out certain strong similarities between what I do for a living and how the folks in the bright orange jackets make their gelt:
- We only perform our ostensible "work" between interruptions. In their case, it's mainly weather. In mine, it's meetings and administrivia.
- We can't always trust the infrastructure. In their case it's a pocket of soggy clay, erosion from wonky grading on the last job, etc. In my case it's network issues, unexpected upgrades, security holes, what-have-you.
- We can be screwed over twelve ways to Sunday by vendors. 'Nuff said.
- We can be--and too often are--encouraged by the Powers That Be to cut corners and/or kick the can down the proverbial road.
- We have to develop and learn to trust a healthy sense of pessimism to sniff out the edge-cases that could bring everything crashing down.
- We know that nothing is ever going to be 100% perfect 100% of the time -- there will always be "beta" mode as well as maintenance. More on that later.
I mentioned "beta" mode above. A road that's partially open during construction is basically the same thing as an open beta in software development. And in open beta you emphatically do not wait until the "official" launch-date to release all the bug-fixes and missing features. Any product beta-tested in that fashion will lose the interest of the influential early adopters (and probably never see the light of day). No. You shove those things out the door as soon as QA green-lights them. (Granted, the province has the upper hand in this instance, because people will always gravitate back to the shortest time between Points A and B. But my point, I think, stands.)
Yet, whoever was directing today's crew could have easily limited the bottleneck to a mere kilometre of work surface and left everything else open to two lanes of traffic. When that was finished, they could have done exactly the same thing for the next kilometre of surface. And so on...until either the potholes or the budget ran out. In short--the Agile method would have produced the minimum of traffic disruption during the busiest season of the year.
Instead, someone (quite wrongly) chose to operate by the Waterfall method. Again, this is just conjecture, but my strong suspicion is that that someone assumes that handling the resurfacing in larger chunks allows for economies of scale. For all I know, they're correct--at least superficially. And, of course, the Suits luuuuuuvvve their Waterfalls...mainly because they live upstream and it makes them feel more like they're the driving force behind everything. [eyeroll]
But the problem is that, by my estimate, the entirely gratuitous one-lane bottlenecking cost each and every person about an extra fifteen minutes, compared to the length of road actually being worked on. Every missed appointment, every late delivery, every disgruntled tourist -- those come with a cost, too. For sure, that cost won't show up on this year's tax bill. But it will show up in one way or another--make no mistake about that.
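My back-of-the-envelope grumbling can be modelled crudely. The delay-per-kilometre figure below is invented (mine, not the DOT's); the point is only how the totals scale with batch size:

```python
# Rough model of why one-kilometre-at-a-time hurts less. Assumption: each
# driver's delay grows with the length of the single-lane bottleneck, and only
# the stretch actively being resurfaced ever needs to be closed.

DELAY_PER_KM = 1.5   # minutes of per-driver delay per km of bottleneck (made up)
TOTAL_KM = 10        # total stretch to resurface
DAYS = 10            # assume one kilometre resurfaced per day either way

def big_batch_delay():
    """Waterfall: close the whole stretch for the whole job."""
    return DELAY_PER_KM * TOTAL_KM * DAYS

def incremental_delay():
    """Agile: only ever close the kilometre being worked on."""
    return sum(DELAY_PER_KM * 1 for _day in range(DAYS))

print(f"Whole-stretch closure: {big_batch_delay():.0f} delay-minutes per commuter")
print(f"One km at a time:      {incremental_delay():.0f} delay-minutes per commuter")
```

Under these (admittedly cartoonish) assumptions, the big-batch closure costs each commuter ten times the delay of the incremental approach--for exactly the same amount of resurfacing.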
In my particular case, the Patron Saints of traffic lights and 2-hour parking and sheer dumb luck (plus my taste for harmless practical jokes) allowed me to make my meeting just in time. That, however, doesn't mean that I'm not brassed-off. I've been on well-run projects and ones that barely ran at all--in both Waterfall mode as well as Agile. Like I said, they each have their proper context. And it is a sorry manager (and steward of public funds) who chooses the wrong context.
Friday, April 1, 2016
Frivolous Friday, 2016.04.01: Wishful, Wistful Thinking
We code-crunching types tend to think of ourselves as logical and data-oriented, but we're not above pipe-dreams of our own.
Understand that the last few workdays have seen a significant up-tick in traffic between my development stations and the server. Alas, my younger, newer, PC with the snazzier OS is the one with the wifi hiccups. (The 2nd-hand Dell from 2009 shambling along with Debian? Rock solid. Nat'cherly. Get off my lawn, you snot-nose Ubuntu kids! ;~P)
Anyhoo. As I babysit the second...third...fourth database restore of the day, I can look at a single complete stored procedure (and the start of a second one) that represents the sum total of today's "work" that hasn't involved email. :~( And I again internalise the problem with such intermittent bursts of down-time. Namely, how they're good for housekeeping (e.g. backups or quick web searches in response to events like this week's Build2016 announcements), but not heads-down work.
And (mentally) sandwiched between all of today's nickel-and-dime penny-ante (yeah, I just mixed monetary metaphors--roll with it) are the sugar-plum fancies of the (mature) software developer.
In my favourite fairy tale, the record-locking gremlin (or something equally time-wasting) has struck again, the coffee thermos is both empty and cold, and our heroine knows better than to make another pot at this point in the day. And just as her shoulders droop in despair at the lateness of the clock and the shortness of her list of crossed-off tasks, The Programming Fairy Godmother appears.
The Programming Fairy Godmother (or "PFG," as we'll call her from here on out) is dressed in a white silk ball-gown + tiara, both deliberately reminiscent of Ada Lovelace--though sporting horn-rimmed glasses like Admiral Grace Hopper (hipster!). In lieu of a wand, she carries a shining tablet computer (because tablets can do everything, amirite?).
"Despair not, my dear," she says. "In all your career, you have faithfully trod that fine line between over-engineering and creating technical debt. You have sanitised your inputs. You have, despite their appalling lack of appreciation for data integrity, maintained empathy for the end-user. Moreover, you have commented your code, unit-tested, spot-checked, made frequent source-control check-ins, and generally shipped on time. For this you have earned a reward."
"Huh?" says our heroine. "Uh, that's just doing my job. You know, like a real programmer."
"Oh, honey, you have nooooooooooo idea..." replies the PFG, with a near-audible eye-roll. "If you only knew what can bag you six figures plus stock options in Silicon Valley..."
"Good point," replies the programmer, "let's not go there."
The PFG holds up the tablet, which begins to play a PowerPoint retrospective of the programmer's career. There she is as a much-younger student, waiting in line for the printout of her program and output on fan-fold green-bar paper. And regularly sneakernetting--as big floppy discs give way to smaller floppy discs, and then to CDs, and DVDs and flash drives, and occasionally, backup tapes. Somewhere along the line, punctuated by the "RRRRRRRRRR--broingity-broing-broing!" of a modem, the memories of sneakernetting are interspersed with waiting for web pages to download. Also, FTP, SFTP, SCP. Backups. Restores. And, regardless of decade or operating system, our heroine can be seen twiddling her thumbs while her computer boots up. Reboots to install the updates to the Adobe software updater--'nuff said.
If her fairy godmother had meant the presentation to be uplifting, she had failed miserably. The programmer felt the weight of all the, well, waiting. Minutes, hours, and (effectively) whole days of not being able to just tuck into the work at hand with her full attention--all flashed before her in a varied, yet monotonous panoply of wasted time.
Being an Upper Midwesterner only recently relocated to Canada, the programmer bit back her disappointment and considered the possibility that she was being trolled by a supernatural being. "Well, it's good to know that someone still appreciates the virtue of patience," she ventured once the excruciatingly soporific mini-biography had concluded.
The Programmer Fairy Godmother gave the programmer a blank stare. "You're seriously kidding me, right? No, kiddo. I'm giving you that time back."
"Wuuuuuuut?" was the best retort our stunned heroine could muster.
"You heard me," said the PFG. From the standpoint of your cubicle, time stands still until you fill all that otherwise wasted time. Now get off that mocha-latté-padded butt and build something awesome.
So, with scarcely more than a "Thank you!" hollered over her shoulder, the programmer immediately scarpered the heck off. Because her Momma might have raised a lazy, selfish git--but she for sure did not raise a fool. And, despite wasting a, frankly, inexcusable amount of time screwing off on the internet, the programmer managed to manage her time well enough to build A Thing. With (for a change) herself and her friends, and (incidentally) some of the rest of humanity in mind, thank you very much.
When all was said and done, code just plain shipped. Without answering to the mandarins of Marketing or Legal at every turn. Without enduring infinite fractal Groundhogs Day-esque strategy meetings. Without quashing the eternal food-fights over flat vs. skeuomorphic design. (Or, worse, Helvetica vs. Suisse fonts.) Without having to look over her shoulder for hockey-stick inflection-points in the adoption curve.
And, after a couple of iterations (and the inevitable hiccups), everyone involved was appreciably better off for the results, thank you.
Mind you, the programmer was never wined & dined by Y-Combinator or Andreeson-Horowitz. She didn't publish self-congratulatory business self-help woo, much find herself on the TED/SXSWi circuit. She never once did a reddit AMA or made the cover of Inc. or FastCompany...though she may have been interviewed by an intern for TechCrunch. Or maybe it was CNet. She can't remember now, but honestly it doesn't even matter--that piece never saw the light of day in any case.
But I think that we can safely say that this programmer, at least, lived happily ever after.
Understand that the last few workdays have seen a significant uptick in traffic between my development stations and the server. Alas, my younger, newer PC with the snazzier OS is the one with the wifi hiccups. (The 2nd-hand Dell from 2009 shambling along with Debian? Rock solid. Nat'cherly. Get off my lawn, you snot-nosed Ubuntu kids! ;~P)
Anyhoo. As I babysit the...
And (mentally) sandwiched between all of today's nickel-and-dime, penny-ante busywork (yeah, I just mixed monetary metaphors--roll with it) are the sugar-plum fancies of the (mature) software developer.
In my favourite fairy tale, the record-locking gremlin (or something equally time-wasting) has struck again, the coffee thermos is both empty and cold, and our heroine knows better than to make another pot at this point in the day. And just as her shoulders droop in despair at the lateness of the clock and the shortness of her list of crossed-off tasks, The Programming Fairy Godmother appears.
The Programming Fairy Godmother (or "PFG," as we'll call her from here on out) is dressed in a white silk ball-gown + tiara, both deliberately reminiscent of Ada Lovelace--though sporting horn-rimmed glasses like Admiral Grace Hopper. (Hipster!) In lieu of a wand, she carries a shining tablet computer (because tablets can do everything, amirite?)
"Despair not, my dear," she says. "In all your career, you have faithfully trodden that fine line between over-engineering and creating technical debt. You have sanitised your inputs. You have, despite their appalling lack of appreciation for data integrity, maintained empathy for the end-user. Moreover, you have commented your code, unit-tested, spot-checked, made frequent source-control check-ins, and generally shipped on time. For this you have earned a reward."
"Huh?" says our heroine. "Uh, that's just doing my job. You know, like a real programmer."
"Oh, honey, you have nooooooooooo idea..." replies the PFG, with a near-audible eye-roll. "If you only knew what can bag you six figures plus stock options in Silicon Valley..."
"Good point," replies the programmer, "let's not go there."
The PFG holds up the tablet, which begins to play a PowerPoint retrospective of the programmer's career. There she is as a much-younger student, waiting in line for the printout of her program and output on fan-fold green-bar paper. And regularly sneakernetting--as big floppy discs give way to smaller floppy discs, and then to CDs, and DVDs and flash drives, and occasionally, backup tapes. Somewhere along the line, punctuated by the "RRRRRRRRRR--broingity-broing-broing!" of a modem, memories of sneakernetting were interspersed with waiting for web pages to download. Also, FTP, SFTP, SCP. Backups. Restores. And, regardless of decade or operating system, our heroine can be seen twiddling her thumbs while her computer boots up. Reboots to install the updates to the Adobe software updater--'nuff said.
If her fairy godmother had meant the presentation to be uplifting, she had failed miserably. The programmer felt the weight of all the, well, waiting. Minutes, hours, and (effectively) whole days of not being able to just tuck into the work at hand with her full attention--all flashed before her in a varied, yet monotonous panoply of wasted time.
Being an Upper Midwesterner only recently relocated to Canada, the programmer bit back her disappointment and considered the possibility that she was being trolled by a supernatural being. "Well, it's good to know that someone still appreciates the virtue of patience," she ventured once the excruciatingly soporific mini-biography had concluded.
The Programming Fairy Godmother gave the programmer a blank stare. "You're seriously kidding me, right? No, kiddo. I'm giving you that time back."
"Wuuuuuuut?" was the best retort our stunned heroine could muster.
"You heard me," said the PFG. "From the standpoint of your cubicle, time stands still while you fill all that otherwise-wasted time. Now get off that mocha-latté-padded butt and build something awesome."
So, with scarcely more than a "Thank you!" hollered over her shoulder, the programmer immediately scarpered the heck off. Because her Momma might have raised a lazy, selfish git--but she for sure did not raise a fool. And, despite wasting a, frankly, inexcusable amount of time screwing off on the internet, the programmer managed to manage her time well enough to build A Thing. With (for a change) herself and her friends, and (incidentally) some of the rest of humanity in mind, thank you very much.
When all was said and done, code just plain shipped. Without answering to the mandarins of Marketing or Legal at every turn. Without enduring infinite fractal Groundhog Day-esque strategy meetings. Without quashing the eternal food-fights over flat vs. skeuomorphic design. (Or, worse, Helvetica vs. Suisse fonts.) Without having to look over her shoulder for hockey-stick inflection-points in the adoption curve.
And, after a couple of iterations (and the inevitable hiccups), everyone involved was appreciably better off for the results, thank you.
Mind you, the programmer was never wined & dined by Y Combinator or Andreessen Horowitz. She didn't publish self-congratulatory business self-help woo, much less find herself on the TED/SXSWi circuit. She never once did a reddit AMA or made the cover of Inc. or FastCompany...though she may have been interviewed by an intern for TechCrunch. Or maybe it was CNET. She can't remember now, but honestly it doesn't even matter--that piece never saw the light of day in any case.
But I think that we can safely say that this programmer, at least, lived happily ever after.