Wednesday, May 31, 2017

You can't copy-and-paste a career

[Warning:  Rant ahead.]

Because filing paperwork that will almost certainly be a waste of paper & printer ink & postage (not to mention the time of all involved) wasn't infuriating enough, this had to land like a fresh cow-pat across my path on Twitter today:  Computer science students should learn to cheat, not be punished for it.

The tl;dr summary was best done by Homer Simpson (quoting from memory here):  "Marge, don't discourage the boy!  Weaseling is an important skill.  It's what sets us apart from animals...except, of course, the weasels."

Because, you see, in The Real World(TM), coders copy and paste all the time.  And coding in school should reflect the less ethically pristine norms of Silicon Valley.  At least, according to a journalist who lists precisely no coding background in his profile.

Oh, and teaching Java as a first language is somehow corroding professional skills.  I say "somehow" because there was literally no explanation for that offered in the main article.  The CrossTalk URL stalls out.  (Pity--it looked much more promising.)  The second related URL links to another article by the same author which argues that JavaScript is better because it isn't as scary and thus doesn't discourage the "fundamentally creative endeavor" that coding is supposed to be.  (No, really, it said that.  I wish I were making that up.)

Because schools, you see, are failing the software industry, and the 10% unemployment rate among UK CS graduates is iron-clad proof that Universities aren't teaching real job skills.

I mean, no real job skills besides sitting in herds pretending to be interested in what the authority figure at the head of the room is saying.  Or subsisting on crap food consumed at irregular hours.  Or the mad, scrambling stampede ahead of arbitrary deadlines (a.k.a. the semester).  Or swallowing the seething rage that comes with individual performance ratings being dragged down by slackers you didn't want on your team in the first place.  Or, not least of all, the almost rhythmic filling and emptying of your memory with the Next New Hotness that we need you on the bleeding edge of so we have someone to tap when we're finally forced to use the industry standard five years hence.

And now a journalist is advocating stealing--notice I didn't say borrowing--code as a professional skill. 

Now.  I spent about a year in the Fourth Estate, and even after more than two decades, I can appreciate how your knowledge has to be the proverbial mile-wide-and-an-inch-deep.  I know that I had to lean on other people to understand the intricacies of red clay vs. blue clay in the street renovations, the problems caused by shoddy contracting on the new high school, and even which number under that pile of jerseys was actually holding the football.  But I leaned on people who actually knew what they were talking about.  Reporting on something from your personal "Hello, World!" perspective is a great window into how beginners see things.  But it does not qualify you to design curriculum for an entire industry.

But!  Surprise twist!  I actually agree in principle that four-year University degrees are doing no one any favours by devoting weeks' worth of time to edge cases like sorting algorithms.  Or discrete mathematics.  Or even much NP-completeness theory beyond the basic epiphany that not every problem has an exact solution you can compute in a reasonable amount of time.  (Turning a brilliant-but-naive coder loose on a problem they don't realise can only be approximated or brute-forced will waste one heck of a lot of money.  Let's avoid that, but not go too crazy on knapsack problems, m'kay?)
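
(For the curious, here is a minimal sketch of what "brute-forced" looks like in practice:  a 0/1 knapsack solver that simply tries every combination.  The item weights and values are invented for illustration.  It's fine for four items; add a few dozen and the running time goes exponential, which is exactly the epiphany worth having before it eats somebody's budget.)

    public class KnapsackByBruteForce {

        // Try every take-it-or-leave-it combination of items.  Correct, simple,
        // and exponential:  doubling the item count squares the work.  Weights
        // and values below are invented for illustration only.
        static int best(int[] weight, int[] value, int capacity, int i) {
            if (i == weight.length || capacity == 0) {
                return 0;
            }
            int skip = best(weight, value, capacity, i + 1);
            if (weight[i] > capacity) {
                return skip;
            }
            int take = value[i] + best(weight, value, capacity - weight[i], i + 1);
            return Math.max(skip, take);
        }

        public static void main(String[] args) {
            int[] weights = {3, 4, 5, 9};
            int[] values  = {2, 3, 4, 10};
            // Best haul within a capacity of 12 is items 0 and 3:  value 12.
            System.out.println(best(weights, values, 12, 0));
        }
    }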

And I certainly can't argue that coming out of school knowing unit-testing and source control (particularly the Special Hell(TM) of branch merges) is a bad thing.  Replace pop-quizzes with Scrum stand-up check-ins, for all I care.

But school already does a bang-up job of reinforcing some of the worst aspects of the world of work.  (See above snark on "real job skills.")  Stealing code should not be added to those sins.

Borrowing code is an entirely different matter.  By "borrowing" I mean citing the source (e.g. the StackOverflow URL) in the comments.  That accomplishes a few things:
  1. It allows you (or the poor slob who has to maintain your code) to go back to the source for reference.  Which can answer questions like:  "What was the original code meant to accomplish?"  "How old is it?"  "Was the solution up-voted and/or embellished with further useful comment?"
  2. It demonstrates to your team-mates and/or bosses that you don't take credit for other people's work.
  3. If the code completely bombs QA/CI tests, you don't look like quite the idiot you would have if it had been your own creation, amirite?  ;~)
The first point is more immediately and practically useful.  The second, however, has more far-reaching implications.  We've seen billion-dollar lawsuits filed in the name of code-stealing.  (Remember the SCO lawsuits over Linux?  No?  Okay.  Howsabout the "APIs are copyrightable" legal Wrestlemania between Oracle and Google?)  My industry is notorious for men claiming credit for women's contributions.  And someone thinks that taking credit for other people's work is a skill to reward from the age of 18 on?  Because citing your sources is for academia (and, one hopes, journalism)?  Seriously???
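
For the skimmers, here is a minimal sketch of what "borrowing" with a citation looks like in practice.  The URL and the string-splitting task are made up purely for illustration; the point is the paper trail, not the code.

    import java.util.ArrayList;
    import java.util.List;

    public class CsvSplitter {

        // Adapted from a StackOverflow answer.  The URL below is hypothetical,
        // shown only to illustrate the citation habit:
        // https://stackoverflow.com/a/0000000 (retrieved 2017-05-31; accepted
        // and up-voted at the time).  Original split on commas; tweaked here
        // to also trim whitespace around each token.
        public static List<String> splitAndTrim(String csv) {
            List<String> tokens = new ArrayList<>();
            for (String part : csv.split(",")) {
                tokens.add(part.trim());
            }
            return tokens;
        }
    }

Ten seconds of typing, and whoever inherits this in three years knows where it came from, how old it is, and whether the original answer has since been corrected.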

The good news is that there is a stupid-simple way to stop universities from turning out unqualified junior programmers.  Seriously.  Paint-chip-eating-stupid-simple.

Ready for it?

Programmer job postings just need to stop requiring CS degrees.

That's it.  Supply, meet demand.  Next problem, please!

Okay, so there's actually some bad news:  The Suits won't stand for that.

Let's take a minute to break that down.  If The Suits absolutely must hire an onshore programmer, that's not chump-change.  Granted, the modern (read: "internet-ified") job search process does a fantastic job of externalising much of that cost onto the job-seeker.  But there is some residual internal cost to hiring.  That cost rises dramatically after a newly-acquired warm body is parked in its cubicle-stanchion.  Mainly because most new hires require 60-90 days to evolve the dysfunctions necessary to survive in their unique new corporate micro-climate.  (But I'm not cynical.)  Two to three months of salary plus overhead is not an insignificant investment.  The bean-counters are right to want to minimise the risk that the new organism they've introduced will turn out to be a parasite.

Suits want security and stability.  Disruption, y'understand, is only a good thing when it happens to other people.  For them, post-secondary degrees are a form of insurance -- with the shiny bonus of not having to front the money for it.  (If you're a home-owner, think about your bank requiring you to buy mortgage insurance to protect them in case you lose your ability to make payments:  It's exactly like that.)

Are all companies so risk-phobic?  Of course not.  The current (U.S.) average seems to be that only about half of software developers hold a CS degree.  It's absolutely possible to get a coding job without checking that box.  Just not at places like Google, of course.  Also, that expensive piece of paper, broadly speaking, is leverage for a higher starting salary (upon which future raises and starting salaries at other companies are largely dependent).

Because any stable company -- the kind we work for when we don't feel like gambling our ability to pay next month's rent on the chance of owning a private island -- is generally large enough to have a candidate-filtering mechanism called Human Resources.

I've had the privilege of knowing some very, very sharp HR folks.  Yet nary a one of them can tell you whether or not my GitHub check-ins are crap.  My StackOverflow score?  That's literally just a number...one without an anchor (Pareto distributions and all that).  Obviously, higher is better.  But what's the baseline minimum?  And even if there were a baseline, what would it actually tell you?  Think about wine for a second.  It's rated on a hundred-point scale, but its scores (among its self-appointed referees) can be all over the map.  And you can still pay top dollar for what tastes like plonk to you.  But HR's gonna extrapolate from an up-vote count?  Yeeeeeeah...

The thing is, if programming were treated like every other profession instead of the Dark Alchemic Art that it emphatically is not, none of this would even be an issue.  And I place the blame squarely on business.  When you can't trust the standard metrics, make your own.  You need demonstrable coding skills?  Have your current developers put together a quiz for candidates.  There's software for that.  Does this person have language-specific certifications?  Great.  Take a quick peek at the GitHub/StackOverflow oeuvres to see whether they've actually been used.  Do they have a blog?  Do they give any presentations to other coders?  Google is your friend.  And you don't even have to leave the office.  Yet.
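
And to be clear about what "a quiz" means:  it doesn't have to be elaborate.  Here is a minimal sketch of the kind of exercise an in-house team might put in front of candidates; the task and the checks are invented for illustration.

    public class ScreeningExercise {

        // Candidate task (invented for illustration):  return true if the input
        // reads the same forwards and backwards, ignoring letter case.
        public static boolean isPalindrome(String s) {
            int left = 0;
            int right = s.length() - 1;
            while (left < right) {
                char a = Character.toLowerCase(s.charAt(left));
                char b = Character.toLowerCase(s.charAt(right));
                if (a != b) {
                    return false;
                }
                left++;
                right--;
            }
            return true;
        }

        // A few of the checks the hiring team could run against a submission.
        public static void main(String[] args) {
            System.out.println(isPalindrome("Level"));   // expected: true
            System.out.println(isPalindrome("Ladder"));  // expected: false
            System.out.println(isPalindrome(""));        // expected: true
        }
    }

Half an hour of a senior developer's time, reusable for every candidate, and nobody has to guess what a StackOverflow score is supposed to mean.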

Once you have a handful of candidates, get your butt off the internet and meet them.  Preferably in a third-party setting.  Then coordinate in-person interviews with those who make the cut.  This is where HR really earns its salary.  While you (or your technical evaluators) are swimming in alphabet soup, they're looking for the social cues:
  • What's their body language?  Closed?  Aggressive?
  • How do they react when they're challenged/corrected?
  • Do they treat people of different genders/ethnicities differently?
  • How much of their attention goes to people who aren't "authority figures"?
  • When talking about previous teamwork, what's the "We"-to-"I" ratio?
  • Do they put their feet up on my desk during our 1:1?  (No joke, this legit happened to a recruiter I worked with.  Needless to write, the candidate was not invited back.)
Of course, I have to wonder why there was even a (public) job posting to fill in the first place.  Employee referral rewards, internship pipelines, on-the-job training, coding "boot camps," and tuition reimbursement are real, actual things, friends and brethren.  Those are all investments -- most of them not inexpensive.  But, then, so is on-boarding someone whose salary averages $50K+ in Canada...and unintentionally sabotaging an entire team with an incompetent git who looked good on paper.

Sure, the internet is a great (and relatively cheap) way to boost the signal.  But it also affects the signal-to-noise ratio like, whoa.  So the first-line filter (a.k.a. HR) needs some criteria to weed out the self-proclaimed ninjas, rock stars, and unicorns of our Dunning-Kruger age.  Employment histories will list date-ranges and, hopefully, mention the technology stack used by the previous employer.  But there's rarely room to mention how long or how extensively any given technology was used in the field.

The bottom line is that, without training to evaluate code, HR doesn't have the resources to make this call on their own.  At best, HR can do the legwork of plugging code samples into search engines to scan for plagiarism (assuming you care about that).  Lacking proper domain knowledge, they have every incentive to fall back on traditional metrics.  Which includes "outsourcing" the "follow-up" / "circular file" judgement call to the four-star rating-system of (you guessed it!) the good ol' college GPA.

At which point, you (as the hypothetical employer) have effectively lost your right to complain about universities not preparing CS students for real-world coding work.  And fer cryin' in your Double-Double, please don't expect colleges to reward plagiarism.  This world is already too much a kleptocracy, thanks.