Wednesday, January 17, 2018

One of the electronics/computer repair shops in Moncton has been branching out into selling Raspberry Pis, Arduinos, and a whole whack of, ahem!, "accessories" for them. Business is apparently brisk: one wall became two, then three. I alternate between referring to them as "Candyland" and "The Widget-Pushers."
So I emailed the Alpha-Widget-Pusher last week to see whether he had a source for the ATMega328P DIP chip I've been having a difficult time tracking down...at least from Canadian suppliers. And since I'm probably going to be spending some time moving these from breadboard to breadboard, it's a good excuse to find a better chip-puller than the twist-prone cheapie that I picked up from Princess Auto. So I tacked a request for a recommendation onto the email.
The reply was that their Alpha Chip-Puller is mostly using tools made in the 1930s. For which, mad respect. Plus a small side of jealousy. Take care of the tools and the tools will take care of you -- that's one of my tenets, and not just in electronics.
I griped to Dennis about how, after decades of Kanban, TQM, ISO-9001, and every other flavour-of-the-month underfunded QA push, they still "don't make 'em like they used to." That's something you shouldn't gripe about to a Recovering Manufacturing Engineer(TM). At least not unless you want an earful about how ISO-9001 only guarantees that your processes are executed as documented.
But he's right. A CNC machine capable of 0.001% tolerances can't stop the cheap steel it just milled from warping or cracking in use. Consistency is not quality...or at least it's only a part of it. In fairness, in some cases consistency is a Very Big Deal(TM). (Remember Intel's division-error and the resulting freak-out? Even the one-in-nine-billion odds were too much for everyone.) But consistency is a hard, easily-measurable metric. Thus its overweighted role as a surrogate for quality.
True quality, of course, is squishier, less easily shoe-horned into a database or aggregated into a colour-coded chart/graph. Let's face it: no one picks a tool off the wall at the big-box hardware store or out of a mail-order box and thinks, "My great-grandchild will maybe use this to build __x__ one day." So I very much doubt that, unless they were hand-made, the manufacturers of those antique tools gave much thought to their whereabouts in 2018.
That's not to say that no one cared about quality. In the Great Depression, people were more apt to look on non-daily expenditures as "investments," especially tools that would better allow them to mend and "make do" vs. having to buy new. Double that for the craftsperson putting food on the table. Print advertisements and radio jingles, doubtless, headlined "quality"* as a major selling-point.
But tracking down product made decades ago and measuring its longevity? Even scraping and comparing offerings on eBay is nothing any MBA could consider a "metric." No, the value is solely in the eye of the beholder--specifically, the craftsperson who imagines the spirit of previous generations of craftspeople echoing in their work with the tool in question.
I suppose that some savvy marketer could pitch tools as "future heirlooms." 'S'matter'a'fact, I'm kind of surprised they haven't already. (And if they haven't, for the noodley love of the Flying Spaghetti Monster, shhhhhhhhhhh...) Mind you, any such campaign will be waged by some giga-conglomerate off-shoring its manufacturing while writhing its last in the strangling grip of a Wall Street vampire squid. But maybe--just maybe--its last gasp will blow on whatever embers remain of the crafts(wo)man ethic.
Melodramatic? Eh, probably. But even for someone whose tool-belt can be so very ephemeral--wherever you are, PC-Write, know that I still love you!--it does matter. Particularly post-Christmas as I stare at a brass-rimmed steel thimble perched on my monitor-stand. Mom doesn't expect to have the fine motor control necessary for hand-sewing ever again. So I'm the heiress of a *whack* of embroidery floss and her thimble. Which turns out to be her Mom's thimble. It's clearly been stepped-on, and there's a patina of rust on the inside. Alas, my bone-structure takes after my other, more petite, Grandmother. But that's nothing a little padding and a skinny rare-earth magnet can't deal with. And it will be dealt with, and shortly. Enshrining tools on a shelf is the same as burying them. I'd like a bit of Grandma to live on, even if only in my crazy projects.
- - - - -
* A sound captured by Mark Knopfler in his dangerously catchy throwback tune "Quality Shoe." (You, Gentle Reader, have been warned.)
Wednesday, May 31, 2017
You can't copy-and-paste a career
[Warning: Rant ahead.]
Because filing paperwork that will almost certainly be a waste of paper & printer ink & postage (not to mention the time of all involved) wasn't infuriating enough, this had to land like a fresh cow-pat across my path on Twitter today: Computer science students should learn to cheat, not be punished for it.
The tl;dr summary was best done by Homer Simpson (quoting from memory here): "Marge, don't discourage the boy! Weaseling is an important skill. It's what sets us apart from animals...except, of course, the weasels."
Because, you see, in The Real World(TM), coders copy and paste all the time. And coding in school should reflect the less ethically pristine norms of Silicon Valley. At least, according to a journalist who lists precisely no coding background in his profile.
Oh, and teaching Java as a first language is somehow corroding professional skills. I say "somehow" because there was literally no explanation for that offered in the main article. The CrossTalk URL stalls out. (Pity--it looked much more promising.) The second related URL links to another article by the same author which argues that JavaScript is better because it isn't as scary and thus doesn't discourage the "fundamentally creative endeavor" that coding is supposed to be. (No, really, it said that. I wish I were making that up.)
Because schools, you see, are failing the software industry, and the 10% unemployment rate among UK CS graduates is iron-clad proof that Universities aren't teaching real job skills.
I mean, no real job skills besides sitting in herds pretending to be interested in what the authority figure at the head of the room is saying. Or subsisting on crap food consumed at irregular hours. Or the mad, scrambling stampede ahead of arbitrary deadlines (a.k.a. the semester). Or swallowing the seething rage that comes with individual performance ratings being dragged down by slackers you didn't want on your team in the first place. Or, not least of all, the almost rhythmic filling and emptying of your memory with the Next New Hotness that we need you on the bleeding edge of so we have someone to tap when we're finally forced to use the industry standard five years hence.
And now a journalist is advocating stealing--notice I didn't say borrowing--code as a professional skill.
Now. I spent about a year in the Fourth Estate, and even after more than two decades, I can appreciate how your knowledge has to be the proverbial mile-wide-and-an-inch-deep. I know that I had to lean on other people to understand the intricacies of red clay vs. blue clay in the street renovations, the problems caused by shoddy contracting on the new high school, and even which number under that pile of jerseys was actually holding the football. But I leaned on people who actually knew what they were talking about. Reporting on something from your personal "Hello, World!" perspective is a great view into how beginners view things. But it does not qualify you to design curriculum for an entire industry.
But! Surprise twist! I actually agree in principle that four-year University degrees are doing no one any favours by devoting weeks' worth of time to edge cases like sorting algorithms. Or discrete mathematics. Or even much NP-completeness theory beyond the basic epiphany that not all software solutions can be encapsulated by an algorithm. (Turning a brilliant-but-naive coder loose on something that they don't know can only be approximated or brute-forced will waste one heck of a lot of money. Let's avoid that, but not go too crazy on knapsack problems, m'kay?)
And I certainly can't argue that coming out of school knowing unit-testing and source control (particularly the Special Hell(TM) of branch merges) is a bad thing. Replace pop-quizzes with Scrum stand-up check-ins, for all I care.
But school already does a bang-up job of reinforcing some of the worst aspects of the world of work. (See above snark on "real job skills.") Stealing code should not be added to those sins.
Borrowing code is an entirely different matter. By "borrowing" I mean citing the source (e.g. the StackOverflow URL) in the comments. That accomplishes a few things:
- It allows you (or the poor slob who has to maintain your code) to go back to the source for reference. Which can answer questions like: "What was the original code meant to accomplish?" "How old is it?" "Was the solution up-voted and/or embellished with further useful comments?"
- It demonstrates to your team-mates and/or bosses that you don't take credit for other people's work.
- If the code completely bombs QA/CI tests, you don't look like quite the idiot you would have if it had been your own creation, amirite? ;~)
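To make that concrete, here's a minimal sketch (in Python, with a placeholder URL and vote count -- I'm illustrating the habit, not citing a real answer) of what borrowed-with-attribution code might look like:

```python
from itertools import chain

# Adapted from a Stack Overflow answer (hypothetical URL, for illustration):
# https://stackoverflow.com/q/0000000 -- "Flatten a list of lists in Python"
# Retrieved 2017-05-31; accepted answer, ~40 up-votes at the time.
# Change from the original: swapped a nested comprehension for
# itertools.chain.from_iterable, purely for readability.
def flatten(list_of_lists):
    """Flatten one level of nesting."""
    return list(chain.from_iterable(list_of_lists))

print(flatten([[1, 2], [3], [4, 5]]))  # [1, 2, 3, 4, 5]
```

A few lines of comment, and the poor slob maintaining it knows where the code came from, how old it is, and what got changed. Cheap insurance.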
The good news is that there is a stupid-simple way to stop universities from turning out unqualified junior programmers. Seriously. Paint-chip-eating-stupid-simple.
Ready for it?
Programmer job postings just need to stop requiring CS degrees.
That's it. Supply, meet demand. Next problem, please!
Okay, so there's actually some bad news: The Suits won't stand for that.
Let's take a minute to break that down. If The Suits absolutely must hire an onshore programmer, that's not chump-change. Granted, the modern (read: "internet-ified") job search process does a fantastic job of externalising much of that cost onto the job-seeker. But there is some residual internal cost to hiring. That cost rises dramatically after a newly-acquired warm body is parked in its cubicle-stanchion. Mainly because most new hires require 60-90 days to evolve the dysfunctions necessary to survive in their unique new corporate micro-climate. (But I'm not cynical.) Two to three months of salary plus overhead is not an insignificant investment. The bean-counters are right to want to minimise the risk that the new organism they've introduced will turn out to be a parasite.
Suits want security and stability. Disruption, y'understand, is only a good thing when it happens to other people. For them, post-secondary degrees are a form of insurance -- with the shiny bonus of not having to front the money for it. (If you're a home-owner, think about your bank requiring you to buy mortgage insurance to protect them in case you lose your ability to make payments: It's exactly like that.)
Are all companies so risk-phobic? Of course not. The current (U.S.) average seems to be that only about half of software developers hold a CS degree. It's absolutely possible to get a coding job without checking that box. Just not at places like Google, of course. Also, that expensive piece of paper, broadly speaking, is leverage for a higher starting salary (upon which future raises and starting salaries at other companies are largely dependent).
Because any stable company -- the kind we work for when we don't feel like gambling our ability to pay next month's rent on the chance of owning a private island -- is generally large enough to have a candidate-filtering mechanism called Human Resources.
I've had the privilege of knowing some very, very sharp HR folks. Yet nary a one of them can tell you whether or not my GitHub check-ins are crap. My StackOverflow score? That's literally just a number...one without an anchor (Pareto distributions and all that). Obviously, higher is better. But what's the baseline minimum? And would a baseline even mean anything? Think about wine for a second. It's rated on a hundred-point scale, but its scores (among its self-appointed referees) can be all over the map. And you can still pay top dollar for what tastes like plonk to you. But HR's gonna extrapolate from an up-vote count? Yeeeeeeah...
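If the Pareto aside feels hand-wavy, here's a toy sketch -- the shape parameter is made up, and this is emphatically not StackOverflow's actual score distribution -- of why a raw score has no anchor:

```python
import random

random.seed(42)

# Toy model: reputation scores drawn from a Pareto-ish heavy-tailed
# distribution. The shape parameter (1.2) is invented for illustration.
scores = sorted(int(random.paretovariate(1.2) * 10) for _ in range(10_000))

median = scores[len(scores) // 2]
p99 = scores[int(len(scores) * 0.99)]
mean = sum(scores) / len(scores)

# With a heavy tail, the mean sits far above the typical (median) user,
# so "higher than X" tells you almost nothing without the whole curve.
print(f"median={median}, mean={mean:.0f}, 99th percentile={p99}")
```

In a heavy-tailed world, the "average" user and the typical user live on different planets. So an HR cut-off (say, "reputation over 500" -- a number I just pulled out of the air) is mostly filtering on noise.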
The thing is, if programming were treated like every other profession instead of the Dark Alchemic Art that it emphatically is not, none of this would even be an issue. And I place the blame squarely on business. When you can't trust the standard metrics, make your own. You need demonstrable coding skills? Have your current developers put together a quiz for candidates. There's software for that. Does this person have language-specific certifications? Great. Take a quick peek at the GitHub/StackOverflow oeuvres to see whether they've actually been used. Do they have a blog? Do they give any presentations to other coders? Google is your friend. And you don't even have to leave the office. Yet.
Once you have a handful of candidates, get your butt off the internet and meet them. Preferably in a third-party setting. Then coordinate in-person interviews with those that make the cut. This is where HR really earns its salary. While you (or your technical evaluators) are swimming in alphabet soup, they're looking for the social cues:
- What's their body language? Closed? Aggressive?
- How do they react when they're challenged/corrected?
- Do they treat people of different genders/ethnicities differently?
- How much of their attention goes to people who were not "authority figures"?
- When talking about previous teamwork, what's the "We"-to-"I" ratio?
- Do they put their feet up on my desk during our 1:1? (No joke, this legitimately happened to a recruiter I worked with. Needless to write, the candidate was not invited back.)
Sure, the internet is a great (and relatively cheap) way to boost the signal. But it also affects the signal-to-noise ratio like, whoa. So the first-line filter (a.k.a. HR) needs some criteria to weed out the self-proclaimed ninjas, rock stars, and unicorns of our Dunning-Kruger age. Employment histories will list date-ranges and, hopefully, mention the technology stack used by the previous employer. But there's rarely room to mention how long or how extensively any given technology was used in the field.
The bottom line is that, without training to evaluate code, HR doesn't have the resources to make this call on their own. At best, HR can do the legwork of plugging code samples into search engines to scan for plagiarism (assuming you care about that). Lacking proper domain knowledge, they have every incentive to fall back on traditional metrics. Which includes "outsourcing" the "follow up" / "circular file" judgement-call to the four-star rating-system of (you guessed it!) the good ol' college GPA.
At which point, you (as the hypothetical employer) have effectively lost your right to complain about universities not preparing CS students for real-world coding work. And fer cryin' in your Double-Double, please don't expect colleges to reward plagiarism. This world is already too much a kleptocracy, thanks.
Tuesday, January 31, 2017
The presentation I won't give
Today I had lunch with two of the three other Illuminati of the local programmer group. (If you happened to be in the Codiac Cafe on George Street over the noon-ish hour, yeah, that was us in the shadowy grey robes. Next time, come up and say "Hi.")
Anyhoo, in addition to hashing out the details of the upcoming meeting/hackathon, we also knocked around ideas for upcoming presentations. Which involves, of course, recruiting speakers. Wherein lies the problem. Because, you see, the Venn diagram of people who know a lot about something interesting and the people who have the ability to get up in front of a crowd of their peers and roll out that information in a structured, digestible way is not what'cha'call a perfect circle, y'know?
But the rub is that the only people to show up for a "How to Give a Presentation" presentation are likely to come from a single demographic -- i.e. the folks who don't want to see the presenter's feelings hurt.
So I'm doing this online. Think of it as a self-study course. If you're old enough to remember people working out to a Jazzercise VCR, so much the better.
Now. I do at least one presentation a year. Mostly to give back, but partly to look the beast of public speaking in its many eyeballs and say, "I ain't scared 'a you." (The beastie and I both know that's an "alternative fact" -- a.k.a. a straight-up lie -- but that's actually the point here.) But I have an ace in the hole: I was doing this at sixteen for giggles. Okay, not really giggles. I had a massive crush on one of my high school's Debate alpha-team. But that crush put me on a path that led to meeting my husband and the Bestest of BFFs. So who's the real winner here?
Anyhoo. After six or seven years (depending on how you count) of competitive public speaking, I have a few things to pass down to the up-and-coming generation of geek-speakers. Normally, I'd issue a "your mileage may vary" kind of disclaimer, but if you haven't done this more than one or two times, stick with the program until you are comfortable enough not to need it. (This is highly analogous to following a recipe to the number until you grok what makes it tick, m'kay?)
I'm assuming that you have one of the following:
- A topic you're so passionate about that you're bursting at the seams with its awesomeness, or
- Your boss has "voluntold" you to present on something of relevance to your co-workers
1.) Pull out your favourite plain-text editor. Yes, plain-text. Heck, Notepad, for all I care. Because you are emphatically NOT ALLOWED TO FORMAT ANYTHING at this point. Got that editor fired up? Groovy. Now shut up and let your brain barf out a bullet-list. Single-level only. Don't think: Just get it on the screen. Like, yesterday.
Done? Fabulous. Whatever you do, don't fall in love. Because, about a century before George R. R. Martin, "Murder your darlings" was already legit literary advice.
2.) Organise. Okay, NOW you're allowed to be hierarchical. But not more than three levels, including section-headers. That effectively leaves you with two levels of detail. Why only two? Because this is soooooooooooo not about you dazzling anyone with nuance; this is about you not boring the ever-loving snot out of the people who are graciously lending you their attention-span.
Cut-and-paste, drag-and-drop until the content feels like it should flow like cream into the brain of someone who knows nothing about the subject.
Nope, don't fall in love here, either. You're going to both murder and mutilate very shortly.
3.) Show, don't tell. Time to lean on images. First, create a folder at the same level as your brain-dump. Now go search and scroll. At a bare-bones minimum, you're going to want images to jazz up your section-header pages. I prefer sly, snarky humour m'self -- big surprise there -- but follow your intended audience's tastes. Whatever you pick, name the file descriptively. You're going to need that information at your fingertips.
Can you convert either-ors into flow-charts? Go. Side-by-side comparisons as tables? Dooo eeeeet. Comparisons/Contrasts as Venn diagrams? Make it so, Number One.
Are you showing code-samples? Excellent. Go type them up, make sure they compile, and then screen-capture them. Ditto the output. This is your insurance against wi-fi issues. Also: You know all those nervous presenters you've seen trying to type code live? This is precisely why you won't be them. You're welcome.
4.) Unit-testing. First, brace yourself for some ugly truths. Deep breath. Now, translate your outline (and images) to simple slides. Simple slides, d'ya hear?! PUT THAT ANIMATION MENU DOWN. Animations are for closers. (Sorry-not-sorry, Alec Baldwin.)
Uh-oh: Some of those topics don't fit onto a single slide, now do they? Huh. Guess you should be thinking about how you're going to break them up into separate topics, mmmm? 'sokay, it's not like you lose points for this. This is the "mutilation" I talked about. Good thing you never let that outline give you big chibi puppy-dog-eyes, amirite?
When you're done, convert the whole thing to PDF, and close your slide editor.
5.) First integration test. Close the door. Keep your laptop/keyboard within arm's reach. Open the document in PDF format. Annnnnnnnd...present! (No stopping allowed -- just plough through it, already.)
Dread Cthulhu, that was painful. All that goodness in your head doesn't always quite make it to the spoken word, no? The good news is that the presentation will never sound that gawdawful again. The bad news is that you have a bunch more iterative work in front of you.
6.) Cull content. First things first. Let's get rid of the stuff that your audience, on second glance, doesn't actually need to know to understand the main points. Just get rid of it and don't look back. Yep--murder your darlings. "Red Wedding" style if necessary.
7.) Re-organise ruthlessly. Did some sections really seem like non-sequiturs when you were fumbling your way through them? Move them before they try to consolidate their positions. Do some still stick out like the proverbial sore thumb? You might want to re-think their importance.
8.) Subsequent integration tests. Follow the recipe for "First integration test." This is basically analogous to the "childbirth amnesia" women experience between their first and second (and even subsequent) children. Sure, the first time is generally the worst, but even so...yeeowch. But, in fairness, perhaps not quite as bad. Maybe.
Keep iterating: Cull and re-organise. Work on something else for a while and let things simmer on the proverbial back-burner. The content is starting to fall into its rightful places.
9.) Refine the flow. No slide-deck, however well put together, will always flow seamlessly from one section into the next. Or subsection. Or maybe even between bullet-points on the same slide. You probably noticed that during the iterations, yes? That's okay. Seriously okay. Because, really, if it were All About the bullet-points alone, you could (and -- let's face it -- should) just email the slide-deck and let people read it at their leisure. Your value-add is to, quite literally, read between the lines. And the page-breaks, for that matter.
Condense your bullet-points. They exist as hooks for your content, the war-stories you're going to tell, the experienced opinions you're going to lay down. You are there to riff on, not read from, the slides, remember? Now start bridging the slides with transition material. Trust me: This material is more for your sake than even your audience's.
Repeat until your gut tells you it's done. You're not really done, but roll with the sweet illusion for a short time, m'kay?
10.) Beta-test with a trusted (and brutally honest!) peer. And no, I don't mean your cat. I'm talking about someone who has a similar background but not the level of expertise in your subject matter. This will be painful, guaranteed. But you're already used to that. If you've picked the right peer, the feedback will be tough to process. That's okay. The trick is to triage the big-ticket items (e.g. she was totally lost with this whole section). Ignore the nit-picky stuff. No, really. Yes, it's the easiest to fix. But in terms of bang-for-the-loonie? Fuggedabbouddit.
11.) Dance without a net. By this time (several days in), your creation should be starting to breathe on its own. At the risk of sounding pervy, take it into the shower with you. (Not the laptop, silly!) Blanking between sections is perfectly normal and, really, who cares what the shower curtain thinks? If you're visually-oriented like me, the images you picked out for your section-header slides will help trigger the unwritten "glue" content.
12.) Dress rehearsal. At least a day (but no more than two days) before the scheduled presentation, do a couple of runs. If you can arrange to make them in the same conference room (or whatever) in which you'll give the final product, so much the better. What's in your head and what comes out of your mouth should be fairly close on the second try.
Pro tip: Limit the # of rehearsals per day if you don't want your vocal cords to turn on you on The Big Day. If you're still feeling dicey two days out, try three whispered full rehearsals, but no more.
13.) The real deal. Honestly, there's nothing that I can say or that you can do that will truly prepare you for all the eyeballs boring into you when you get up. You will be terrified...and that's okay. Like Q said to James Bond: "Never let them see you bleed." You're going to be riding on obsessive preparation and your think-meat's muscle-memory. This is precisely why you screen-capped your code samples and output.
Okay...because you've read this far and (apparently) trust my experience on this, I'm going to drop an ugly secret: You're going to need to go through this process many times -- dozens, in fact -- before you can trust your mind to stuff your lizard-brain into a sound- and chew-proof box. And the box is the best you can hope for. That lizard-beastie will always be there.
Sorry 'bout that, but to tell you otherwise would be to lie to you. Not to mention short-circuit the process of you becoming that rarest of birds: The geek who can teach. And we need more of you more than ever. The opening shots of a war between the people who actually know what's going on and the mouth-breathing ideologues who think they can shoot from the hip have already been fired. And not by our side.
So, to tweak the closing line from all my presentations: Now get out there and teach something awesome.
Anyhoo, in addition to hashing out the details of the upcoming meeting/hackathon, we also knocked around ideas for upcoming presentations. Which involves, of course, recruiting speakers. Wherein lies the problem. Because, you see, the Venn diagram of people who know a lot about something interesting and the people who have the ability to get up in front of a crowd of their peers and roll out that information in a structured, digestible way is not what'cha'call a perfect circle, y'know?
But the rub is that the only people to show up for a "How to Give a Presentation" presentation are likely to come from a single demographic -- i.e. the folks who don't want to see the presenter's feelings hurt.
So I'm doing this online. Think of it as a self-study course. If you're old enough to remember people working out to a Jazzercise VCR, so much the better.
Now. I do at least one presentation a year. Mostly to give back, but partly to look the beast of public speaking in its many eyeballs and say, "I ain't scared 'a you." (The beastie and I both know it's a
Anyhoo. After six or seven years (depending on how you count) of competitive public speaking, I have a few things to pass down to the up-and-coming generation of geek-speakers. Normally, I'd issue a "your mileage may vary" kind of disclaimer, but if you haven't done this more than one or two times, stick with the program until you are comfortable enough not to need it. (This is highly analogous to following a recipe to the number until you grok what makes it tick, m'kay?)
I'm assuming that you have one of the following:
- A topic you're so passionate about that you're bursting at the seams with its awesomeness, or
- Your boss has "voluntold" you to present on something of relevance to your co-workers
1.) Pull out your favourite plain-text editor. Yes, plain-text. Heck, Notepad, for all I care. Because you are emphatically NOT ALLOWED TO FORMAT ANYTHING at this point. Got that editor fired up? Groovy. Now shut up and let your brain barf out a bullet-list. Single-level only. Don't think: Just get it on the screen. Like, yesterday.
Done? Fabulous. Whatever you do, don't fall in love. Because, about a century before George R. R. Martin, "Murder your darlings" was already legit. literary advice.
2.) Organise. Okay, NOW you're allowed to be hierarchical. But not more than three levels, including section-headers. That effectively leaves you with two levels of detail. Why only two? Because this is soooooooooooo not about you dazzling anyone with nuance; this is about you not boring the ever-loving snot out of the people who are graciously lending you their attention-span.
Cut-and-paste, drag-and-drop until the content feels like it should flow like cream into the brain of someone who knows nothing about the subject.
Nope, don't fall in love here, either. You're going to both murder and mutilate very shortly.
3.) Show, don't tell. Time to lean on images. First, create a folder at the same level as your brain-dump. Now go search and scroll. At a bare-bones minimum, you're going to want images to jazz up your section-header pages. I prefer sly, snarky humour m'self -- big surprise there -- but follow your intended audiences tastes. Whatever you pick, name the file descriptively. You're going to need that information at your fingertips.
Can you convert either-ors into flow-charts? Go. Side-by-side comparisons as tables? Dooo eeeeet. Comparisons/Contrasts as Venn diagrams? Make it so, Number One.
Are you showing code-samples? Excellent. Go type them up, make sure they compile, and then screen-capture them. Ditto the output. This is your insurance against wi-fi issues. Also: You know all those nervous presenters you've seen trying to type code live? This is precisely why you won't be them. You're welcome.
4.) Unit-testing. First, brace yourself for some ugly truths. Deep breath. Now, translate your outline (and images) to simple slides. Simple slides, d'ya hear?! PUT THAT ANIMATION MENU DOWN. Animations are for closers. (Sorry-not-sorry, Alec Baldwin.)
Uh-oh: Some of those topics don't fit onto a single slide, now do they? Huh. Guess you should be thinking about how you're going to break them up into separate topics, mmmm? 'sokay, it's not like you lose points for this. This is the "mutilation" I talked about. Good thing you never let that outline give you big chibi puppy-dog-eyes, amirite?
When you're done, convert the whole thing to PDF, and close your slide editor.
5.) First integration test. Close the door. Keep your laptop/keyboard within arm's reach. Open the document in PDF format. Annnnnnnnd...present! (No stopping allowed -- just plough through it, already.)
Dread Cthulu, that was painful. All that goodness in your head doesn't always quite make it to the spoken word, no? The good news is that the presentation will never sound that gawdawful again. The bad news is that you have a bunch more iterative more work in front of you.
6.) Cull content. First things first. Let's get rid of the stuff that your audience, on second glance, doesn't actually need to know to understand the main points. Just get rid of it and don't look back. Yep--murder your darlings. "Red Wedding" style if necessary.
7.) Re-organise ruthlessly. Did some sections really seem like non-sequiturs when you were fumbling your way through them? Move them before they try to consolidate their positions. Do some still stick out like the proverbial sore thumb? You might want to re-think their importance.
8.) Subsequent integration tests. Follow the recipe for "First integration test." This is basically analogous to the "childbirth amnesia" women experience between their first and second (and even subsequent) children. Sure, the first time is generally the worst, but even so...yeeowch. But, in fairness, perhaps not quite as bad. Maybe.
Keep iterating: Cull and re-organise. Work on something else for awhile and let things simmer on the proverbial back-burner. The content is starting to fall into its rightful places.
9.) Refine the flow. No slide-deck, however well put together, will always flow seamlessly from one section into the next. Or subsection. Or maybe even between bullet-points on the same slide. You probably noticed that during the iterations, yes? That's okay. Seriously okay. Because, really, if it were All About the bullet-points alone, you could (and -- let's face it -- should) just email the slide-desk and let people read it at their leisure. Your value-add is to, quite literally, read between the lines. And the page-breaks, for that matter.
Condense your bullet-points. They exist as hooks for your content, the war-stories you're going to tell, the experienced opinions you're going to lay down. You are there to riff on, not read from, the slides, remember? Now start bridging the slides with transition material. Trust me: This material is more for your sake than even your audience's.
Repeat until your gut tells you it's done. You're not really done, but roll with the sweet illusion for a short time, m'kay?
10.) Beta-test with a trusted (and brutally honest!) peer. And no, I don't mean your cat. I'm talking about someone who has a similar background but not the level of expertise in your subject matter. This will be painful, guaranteed. But you're already used to that. If you've picked the right peer, the feedback will be tough to process. That's okay. The trick is to triage the big-ticket items (e.g. she was totally lost with this whole section). Ignore the nit-picky stuff. No, really. Yes, it's the easiest to fix. But in terms of bang-for-the-loonie? Fuggedabbouddit.
11.) Dance without a net. By this time (several days in), your creation should be starting to breathe on its own. At the risk of sounding pervy, take it into the shower with you. (Not the laptop, silly!) Blanking between sections is perfectly normal and, really, who cares what the shower curtain thinks? If you're visually-oriented like me, the images you picked out for your section-header slides will help trigger the unwritten "glue" content.
12.) Dress rehearsal. At least a day (but no more than two days) before the scheduled presentation, Do a couple runs. If you can arrange to make them in the same conference room (or whatever) in which you'll give the final product, so much the better. What's in your head and what comes out of your mouth should be fairly close on the second try.
Pro tip: Limit the # of rehearsals per day if you don't want your vocal cords to turn on you on The Big Day. If you're still feeling dicey two days out, try three whispered full rehearsals, but no more.
13.) The real deal. Honestly, there's nothing that I can say or that you can do that will truly prepare you for all the eyeballs boring into you when you get up. You will be terrified...and that's okay. Like Q said to James Bond: "Never let them see you bleed." You're going to be riding on obsessive preparation and your think-meat's muscle-memory. This is precisely why you screen-capped your code samples and output.
Okay...because you've read this far and (apparently) trust my experience on this, I'm going to drop an ugly secret: You're going to need to go through this process many times -- dozens, in fact -- before you can trust your mind to stuff your lizard-brain into a sound- and chew-proof box. And the box is the best you can hope for. That lizard-beastie will always be there.
Sorry 'bout that, but to tell you otherwise would be to lie to you. Not to mention short-circuit the process of you becoming that rarest of birds: The geek who can teach. And we need more of you more than ever. The opening shots of a war between the people who actually know what's going on and the mouth-breathing ideologues who think they can shoot from the hip have already been fired. And not by our side.
So, to tweak the closing line from all my presentations: Now get out there and teach something awesome.
Tuesday, June 28, 2016
Waterfall vs. Agile Development, an illustration from the NB DOT
This post is mostly about the Department of Transportation (specifically the one of New Brunswick), but first I want to thank the Department of Small Mercies for doing me a solid...or two...or three...
I was scheduled for a noon meeting in Moncton today. The initial meeting announcement listed the location as happening on the campus of one of Moncton's two private one-year colleges. Which narrows it down to a single street address, but nevertheless leaves something to be desired in terms of precision.
So, having enough sense of the organiser's temperament to allow myself a bit of smart-arsery, I enquired as to whether the vagueness was a test of my persistence/resourcefulness, upon which my admission to said meeting would depend. It turns out that the organiser is at least my equal in smart-arsery. Which I naturally took as a challenge: "Oh, honey, it is soooo on," I thought, resolving to be in the room, greeting him with a wave and a Cheshire Cat grin, when he arrived.
Thus, I left Grande Digue with a little over half an hour to spare. Outside of Shediac on (westbound) Highway 15, traffic suddenly slowed to the proverbial crawl. First an ambulance and then a squad car passed our line on the left. Somewhere past the 29km marker, a car lay flipped on its roof, but the rubber-necking was kept to a minimum. Including my own, so my Gentle Reader will have to check the news for the details. (P.S.: Bonne chance, whoever you are...)
Following a short speed up, I encountered the stretch of road which had been stripped of asphalt during my previous run into HubCity. Today the asphalt was fresh, a dotted line had been painted down the centre...and traffic again slowed down to < 10km/hour. At least when it wasn't standing completely still.
And this is where our "illustration" really begins. Because that freshly renewed stretch of highway had been necked down to one lane for kilometres. Kilometres of perfectly sound road, missing only the side-line markers and (possibly) the rumble-strips. With nary a worker nor piece of equipment in sight to justify the buffer-state of asphalt that kept traffic to the pace of a snail on quaaludes. (And, as anyone who lives where other folks vacation can attest, there should be a Special Hell(TM) reserved for anyone who pulls those shenanigans during Tourist Season.)
In software development, there are two ways of making A Thing (for lack of a better term).
- The "Waterfall Development" school harks back to the assembly line of the industrial past (presumably an artefact of lumping Software Engineering in with traditional Engineering). Designers hand off to Coders who hand off to Testers who ultimately hand off to whoever packages the code and delivers it to customers. Henry Ford would feel completely at home in this world.
- The "Agile Development" school hews more to the "throw it against the wall and see if it sticks" line of thought. Which, surprisingly, is also based on manufacturing principles developed in the automotive industry. Except that it was done within the limited resources of a scrappy post-WWII Toyoda (now Toyota).
Each school has its proper context. But mixing those contexts is, alas, precisely what added an extra twenty minutes to a commute that normally takes thirty.
Context: For my Gentle Readers outside of Canada, New Brunswick is what's known as a "have-not Province," and has been since steam replaced sail. Historically, various Governments (Conservative and Liberal) have cushioned budget shortfalls with "equalisation payments" to guarantee a minimum standard of public services (notably those related to health care). But New Brunswick's debt (and its debt to GDP ratio) has crept up under both brand-name parties. And that's before the previous federal Government decided to jump on the austerity bandwagon.
What with that and the talk (a.k.a. threat) of privatising provincial road-building, it's hardly surprising that the bean-counters have taken over with a vengeance.
Now. Road construction is not my "domain," as we say in software development. So some of the following will be, to a greater or lesser extent, conjecture. That being said, I can in all fairness call out certain strong similarities between what I do for a living and how the folks in the bright orange jackets make their gelt:
- We only perform our ostensible "work" between interruptions. In their case, it's mainly weather. In mine, it's meetings and administrivia.
- We can't always trust the infrastructure. In their case it's a pocket of soggy clay, erosion from wonky grading on the last job, etc. In my case it's network issues, unexpected upgrades, security holes, what-have-you.
- We can be screwed over twelve ways to Sunday by vendors. 'Nuff said.
- We can be--and too often are--encouraged by the Powers That Be to cut corners and/or kick the can down the proverbial road.
- We have to develop and learn to trust a healthy sense of pessimism to sniff out the edge-cases that could bring everything crashing down.
- We know that nothing is ever going to be 100% perfect 100% of the time -- there will always be "beta" mode as well as maintenance. More on that later.
I mentioned "beta" mode above. A road that's partially open during construction is basically the same thing as an open beta in software development. And in open beta you emphatically do not wait until the "official" launch-date to release all the bug-fixes and missing features. Any product beta-tested in that fashion will lose the interest of the influential early adopters (and probably never see the light of day). No. You shove those things out the door as soon as QA green-lights them. (Granted, the province has the upper hand in this instance, because people will always gravitate back to the shortest time between Points A and B. But my point, I think, stands.)
Yet, whoever was directing today's crew could have easily limited the bottleneck to a mere kilometre of work surface and left everything else open to two lanes of traffic. When that stretch was finished, they could have done exactly the same thing for the next kilometre of surface. And so on...until either the potholes or the budget ran out. In short--the Agile method would have produced the minimal amount of traffic disruption during the busiest season of the year.
Instead, someone (quite wrongly) chose to operate by the Waterfall method. Again, this is just conjecture, but my strong suspicion is that that someone assumes that handling the resurfacing in larger chunks allows for economies of scale. For all I know, they're correct--at least superficially. And, of course, the Suits luuuuuuvvve their Waterfalls...mainly because they live upstream and it makes them feel more like they're the driving force behind everything. [eyeroll]
But the problem is that, by my estimate, the entirely gratuitous one-lane bottlenecking cost each and every person an extra fifteen minutes or so, relative to the length of road actually being worked on. Every missed appointment, every late delivery, every disgruntled tourist -- those come with a cost, too. For sure, that cost won't show up on this year's tax bill. But it will show up in one way or another--make no mistake about that.
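(For the spreadsheet-inclined, here's the shape of that arithmetic in Python. Every number below is invented purely for illustration--the point is the formula, not the figures.)

    # Hypothetical figures -- plug in your own.
    vehicles_per_day = 8_000        # summer traffic on a resurfaced stretch
    people_per_vehicle = 1.5
    delay_hours = 15 / 60           # the gratuitous fifteen minutes
    value_per_person_hour = 25.00   # CAD; pick your favourite estimate

    daily_externalised_cost = (vehicles_per_day * people_per_vehicle
                               * delay_hours * value_per_person_hour)
    print(f"${daily_externalised_cost:,.2f} per day, billed to no one")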
In my particular case, the Patron Saints of traffic lights and 2-hour parking and sheer dumb luck (plus my taste for harmless practical jokes) allowed me to make my meeting just in time. That, however, doesn't mean that I'm not brassed-off. I've been on well-run projects and ones that barely ran at all--in both Waterfall mode as well as Agile. Like I said, they each have their proper context. And it is a sorry manager (and steward of public funds) who chooses the wrong context.
Monday, March 21, 2016
The Accidental Internship
In truth, I was planning to post this last Frivolous Friday. Because, when it comes right down to it, the joke's really on me.
You see a lot of the same faces at the various Moncton technology-related events. However, they're rarely the same faces from one kind of event to the next. But as with middle- and high-school cliques, you'll find a scant handful of individuals who can comfortably exist in multiple contexts. In this case, one of those individuals is a student shortly due to graduate from one of the local colleges.
Said college does not require (but will give credit for) what amounts to internships. Mind you, those are not plentiful even in a booming economy (into which category New Brunswick emphatically does NOT fall). Now, it's stupefying to conceive of a situation in which businesses have abused the system egregiously enough to (gasp!) force a change in requirements--under a Conservative government, no less! But that, in fact, has actually happened in Canada. Thus, we can count on several months of a reality in which unpaid grunt-work is not, in fact, the norm for certain career-tracks. At least not until the lawyers find all the loopholes and labour conditions disintegrate even further into the stuff of Ayn Rand pr0n, anyway.
For all that, I (as a freelancer) seriously doubt that the dearth of internship opportunities can be blamed on the same oppressive regime which tyrannically imposes a lower corporate tax rate than that of the United States. No. The reality is that "branching off" tasks is actually pretty darned complex. There's the up-front cognitive investment, certainly. One has to define tasks and metrics, for starters. Also, to be prepared to do it all oneself in cases of extreme failure. And Dread Cthulhu help anyone who has employees. Because the employees who are perpetually "swamped" are inevitably the ones who "don't have time" to train anyone else to take the load off. (Yeah, people are awesome, amirite?)
But there's a third barrier to the wide availability of internships, and I'll get to that in a bit.
In the meantime, I have the luxury of not having the second problem. So when a local college student (an older gentleman with an amiable disposition and ridiculous amounts of hustle) informed me that (despite all his networking) he didn't have an internship lined up, I airily suggested that he make his own with a local non-profit. Then I (very) briefly outlined a project that's about three or four rungs down on my personal pro-bono "To Do" list. And followed up with the vague, "Oh, well, if nothing else turns up, you know where to find me..."
That was last Tuesday night.
By last Thursday, he and his partner had decided that they liked my idea better than what they had been working on, and would I care to take a look at the project draft they'd hammered out in the interim? And, oh, by the way, the app. has to be turned in by April 15th...
Colour me wryly amused. But also colour me a bit wiser. Not about making airy, open-ended suggestions (when I should darned well know better) to ambitious college students. Pffffft--that's just paying it forward after making my own (second) internship all those years ago. (Plus, knowing me, I'll always be that kind of stupid anyway.)
Uh-uh. The take-away for me is that if interns don't scare the ever-loving beejeebers out of you, you're in serious trouble. Because, as it turns out, these two aspiring programmers aren't too shabby. The UI specs that landed in my Inbox this evening had use cases I wouldn't have thought of until the second (maybe third?) draft. That's not to say these folks are bound for the next Y-Combinator cohort, but daaaaaaaaang.
And is that, I have to wonder, what prevents a lot of businesses from pipelining students into their ranks while the latter are still in school? Make no mistake: It's emphatically NOT FUN to be reminded of the fact that someone else has the luxury (yea, even requirement) of using the Cool Tools(TM) while you're being paid to maintain boring legacy code with infuriating pay-per-seat software that a long-gone manager once scored a sweet discount on.
Sure, you can pooh-pooh their naïveté ("Erhmehgerhd--where is your validation code!?!?!?"). But chances are, they'll internalise that lesson faster than even a seasoned coder will absorb how Android configuration files are laid out. In a world where coders halfway around the globe sell their time/expertise at dime-store rates, it's the ability to, well, triage knowledge that's the key to survival. Oh, and having some hustle certainly doesn't hurt either.
Be afraid--be very afraid. And for the love of Mammon, on-board some of these people if you know what's good for you.
Thursday, November 19, 2015
Permission vs. Excuse
We all know that one person. Likely we know multiple flavours of that one person, but let's just generalise for simplicity here.
That one person of whom I speak has the Great Idea. Or above-average talent. There's no reason why they shouldn't do/make The Thing (whatever it is) that would make the world better. Except that they're being stalked by failure. And so your offers to link them with people who could help them are sabotaged, if not rebuffed outright. Your encouragement disappears into a black hole. The time, you see, is never quite right. There are too many people willing and able to take advantage of them.
We might, even to a fractional degree, be that one person at some point in our lives. You know, making up excuses before we even do or make The Thing.
The bad news is that we will fail. The good news is that we will fail more than once.
Which sounds illogical until you consider how empowering that is. Give yourself permission to fail, and you will be more than equipped for the next failure. And the next. And so on.
Giving yourself permission to fail is not the same as making excuses ahead of time. The saying goes that it's better to seek forgiveness than to ask permission. This is one of the glaring exceptions to that rule. Even so, seeking forgiveness for failure is likewise not at all the same as making excuses.
Excuses are pernicious things, and can rob us twice. They allow us to declare bankruptcy on our responsibility for The Thing not working out the first time. But they can also give us a pass on learning from failure. Which makes it less likely that we will try another way to make/do The Thing...or The New Thing.
But with the permission to fail, it's incumbent on us to clearly define "failure" from the outset. How will we know what failure looks like? What's the plan for changing course to avoid a crash? What can we salvage in the event we crash anyway? Sure, those questions take some CPU cycles. Then again, so does sweating the timing of The Thing, or paranoia that someone will steal The Thing from you.
Now, if my Gentle Reader will excuse me, I have to go practice what I preach and fire off an email or two so that I can get back to The Thing in earnest.
Thursday, October 22, 2015
Quality isn't just another value-add
Previously, I mentioned that the Q-and-A period at this past Tuesday's Moncton User Group meeting ("Security Enterprise Architecture for Developers") basically resulted in two epiphanies for me. The first, and more junior, one I riffed on during the previous entry.
My question to Jamie Rees had to do with "selling" security's value to your client as part of the application development process. (Because we all know we should make apps. secure from the ground up, right? But, then, we also know that we're supposed to floss every night, too. And we all know how that plays out for most people. Your faithful blogger included.)
I was thinking of I/T security in terms of risk management. To wit: A data breach costs money. If the data relates to financial information (credit card numbers, Social Security / Social Insurance numbers, bank account numbers), the company whose data was leaked is typically on the hook for years of credit monitoring for each person affected.
Then there are the lump-sum costs. Things like the hit to customer goodwill (which sounds really squishy, but there are accountants who specialise in quantifying that in hard cash). Finding and patching the weaknesses in the system does not come cheap either. Depending on the industry, third-party audits might be required. And if the security lapses were super-egregious, heads will roll, which entails (at a minimum) the costs of hiring and training.
So I figured that this could be gelled down to a simple formula:
Average cost per breach * Likelihood of breach per year = Annual risk
If that "annual risk" (quantified in dollars) is greater than the budgeted amount for security in the Statement of Work, it should be a no-brainer, right?
Jamie Rees had a few thoughts and suggestions, including the nugget that in security more than anything else, you have to protect yourself from entropy. Because waiting until a crisis to fix a hole means that you will focus on that crisis alone. But once that energy's expended, organisational fatigue will guarantee that there will be few (if any) resources spent on proactively preventing the next crisis. (Sound familiar?)
But as I was digesting this all down for my notes, someone else raised a hand and asked the question as it should have been phrased in the first place: "How do I sell my clients on security without selling fear?" And, wham! Synapses linked up, proverbial light bulbs went on. (For all I know, the heavens opened to the sound of angel-choirs. But Suite 201 of the Venn Centre is really, really well-soundproofed, so don't quote me on that.)
For the record, Mr. Rees's answer boiled down to mapping security to the project goals. (Like, I might add, y'do for everything. We're All About the goals, not the features here.)
But what hit me was that security, really, is just another facet of Quality Assurance. A very specific facet, it's true--and one perhaps almost large enough to overshadow its general category.
But the thing that Quality Assurance has proven over and over since Dr. Deming pioneered the discipline is that quality ultimately pays for itself. Namely, because focusing on quality forces you to take a hard look at your organisation and its processes. A relentless focus on quality allows far less room for the politics of personality--which includes the always-regrettable "rock star" culture. And it has no mercy for the "But we've always done it that way" argument.
So, when pitching my services to future clients, you can bet that I will be pointing out how developing an application for their business buys them process consulting from an unbiased 3rd party as part of the package. And all for the low, low cost of higher quality. :~)
Wednesday, October 21, 2015
When questions >= answers
So October's meeting of the Moncton User Group (@monctonug) was a bit different from the usual classroom-esque schtick. Granted, the presenter gave a prepared spiel, followed by a Q&A period. But, thanks to technical difficulties (i.e. the wrong laptop connectors), there was no PowerPoint. Which can be a good thing, and in this case drove the content more into the realm of "war stories."
But I had two epiphanies, one small, and another not-so-wee. Because I need to mop up some things before decamping for a meeting, I'm going to just focus on the wee one.
Anyone who's ever attended a conference (other than to collect tchotchkes and gawp at booth-babes...or just play hooky on the company tab) knows that the presentations are the hamburger patty, but everything else is the bun and toppings. In other words, you don't just eat the patty. (Not unless you have a lot of food allergies, I suppose.)
Naturally, the networking is a big deal. But so's the chance to pull the content of the presentation into your own context. Normally that's done through Q&A. But sometimes, as I discovered last evening, someone else's question is even more clarifying than your own. Which is exactly what happened when someone (with a lot more business development experience than I currently have) followed up my question with, frankly, the one I should have asked in the first place. I don't like using the phrase "refined my thinking" because I think it's usually a fig leaf for "why didn't I think of that?" Mind you, that's actually what happened, but it triggered a whole new riff in my head.
That riff is the subject of the next post. But I thought that this insight might encourage folks to click that EventBrite link the next time they're sitting on the fence about a learning opportunity. Remember, the burger is more than the patty. You're there for the burger.
Tuesday, September 29, 2015
Hating on Physics
Ugh -- two hiccups in one morning. (Mercifully, there was just enough coffee left in the thermos for them both.)
Firstly, a rough head-count of wires suggests that we may not be able to get away with a mini-breadboard for the MPL's robot -- at least, not for the "brains" part up top. Which means more weight. And, more annoyingly, a chassis redesign.
Secondly, a client asked me to verify that data had been imported by a scheduled job. Alas, when I peeked into the database, I found that only one record from roughly 800 had made it over. Going back to the original data dump (like y'do), I quickly realised that the code was fine, but all but one record was seriously hosed thanks to a freaky Excel export. (In programming, the shorthand for this type of situation is "GIGO": Garbage In, Garbage Out.)
The fact that the code is, technically, doing what it's supposed to do is cold comfort. The fact remains that we have another edge-case to take into account (and gracefully handle, natch'erly) in the next release. Which means time and budget that doesn't go into what we want to accomplish. Booyah, legacy code.
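(For the curious, the defensive fix looks something like the sketch below. The column names are invented stand-ins--the client's actual data is nobody's business--but the shape is the point: count and quarantine the garbage instead of letting it vanish silently.)

    import csv

    def import_rows(path):
        """Validate each row before import; quarantine the garbage."""
        imported, rejected = [], []
        with open(path, newline="") as f:
            # start=2: row 1 of the file is the header
            for lineno, row in enumerate(csv.DictReader(f), start=2):
                try:
                    record = {
                        "account_id": int(row["account_id"]),  # hypothetical column
                        "balance": float(row["balance"]),      # hypothetical column
                    }
                except (KeyError, TypeError, ValueError) as err:
                    rejected.append((lineno, str(err)))  # report it, don't bury it
                else:
                    imported.append(record)
        return imported, rejected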
So, as a human -- particularly a human who works from home -- I'm allowed to say a few unprintable things. But I'm not allowed to resent the newly-discovered edge-case. That would be as ridiculous as hating the physics of not enough breadboard wires for all the robot-widgets. And I'm certainly not allowed to carry on as if those problems didn't exist. Or worse, go solve the problems that interest me.
See, that's what bugs me so much about wanna-be "leaders" like The Donald or Carly Fiorina. Forget the incendiary rhetoric. It's when the former responds to a hard, documentable fact (e.g. that 40+% of illegal immigrants arrive in the U.S. via plane, not sneaking through the desert) with "I don't believe it." Go pound sand, Trump: Reality doesn't care what you believe.
Likewise, the latter doubling down on her claims about the breitbarted "Center for Medical Progress" video, inventing footage that never existed in the original scam. (I thought that fanfic had reached its nadir when Twilight fan E.L. James penned the 50 Shades trilogy, but...daaaaaaaaang. That's a whole 'nuther level of gaslighting your own gender, Ms. Fiorina.)
Gosh, I can't imagine how the former has been through bankruptcy (at least) four times, or how the latter very nearly drove her company off a cliff... [insert uber-sarcastic eyeroll]
Look. I don't get to ignore reality--much less hate on it. That would lead to solving the wrong problems...assuming they're even problems in the first place. At best, that's a waste of time and money. In the middle, the real problems are neglected. And in the worst-case, that additionally creates new problems.
I assume that, like me, my Gentle Reader is paid to solve real problems within the constraints of the real world. We should accept nothing less of ourselves. And expect even better from those who want to have the power to declare war, spend our tax dollars, etc. Values can be debated; facts cannot. Anyone who cannot accept that should have their "adult card" taken away and be given a Sims account so they can live in their own reality without bothering anyone else's.
Monday, August 24, 2015
Innovation iconoclasm: Beyond the cult of the start-up
I stalled out about a quarter of the way through Geoffrey Moore's Dealing With Darwin, which has nothing to do with the quality of the book (which is excellent), and everything to do with my ability to be distracted. Now I'm picking it up again as my "nightcap" reading.
It kind of hit a nerve when I learned that the office where I spent the best years of my career has been broken up into functional "pods" (for lack of a better term). That's on the heels of a shake-up that saw a spike in my LinkedIn social circle. Zo wellz--the silver lining is that it spares me the risk of nostalgia. I mean, yeah, I can be nostalgic about the days when 20 or so of us were proudly referred to by the boss as "The Island of Misfit Toys." But we were also working on a scrappy new bet-the-branch-office product then. That takes a particular alchemy--not mere chemistry, which is far too predictable--of personalities to pull off.
And then at some point, you wake up to find that you and your product are in a more mature market--which is a whole 'nuther game. That's what Dealing With Darwin is about. And probably why it's overlooked on most business reading lists. After all, Moore (and his colleagues) are known for the consulting work that preceded and spun out of Crossing the Chasm. The latter focuses entirely on growing a product market from the adventurous early adopters to the more sceptical mainstream (by bridging the gap between their very divergent needs).
That early stage company bringing something brand-new to a market with no mental map for what they're making/selling is what (nearly) everyone associates with "innovation," right?
Darwin, however, hammers home the much-neglected truth that there are more species of innovation than the brand-new product. Things like the following require "innovation":
- Adding new features to an existing product (e.g., a camera to a phone)
- Making an existing product do more with fewer resources (e.g., lower-power computer chips)
- Tapping a new (unexpected) market for an existing product (e.g., Viagra was originally a failed treatment for high blood pressure)
- Up-scaling an existing product/market for higher profit-margins (Starbucks, Apple Computer, Whole Foods)
- Streamlining and standardising supply chains and work-flow (e.g., Ford Motor Company, McDonald's, Dell Computer, etc.)
- Re-tooling work-flows and supply-chains to emphasise quality and reduce the cost of mistakes (e.g. Toyota Motor Corporation)
- Reducing transaction friction/overhead with the end-consumer (e.g., Zipcar, Netflix)
- Abdicating responsibility for labour and safety laws by declaring your employees "contractors" and yourself a "technology platform" (e.g. Uber, TaskRabbit)
- Itemising core services and adding surcharges for them (e.g., airlines, banks)
The point is that established companies can't rest on their proverbial laurels. Any good idea will have copycats--some better and faster than others--and the "first to market" advantage has a limited shelf life. After that, it takes management discipline (and probably no small amount of luck) to stay ahead. Which will yield higher returns--investing R&D dollars into making a better product, or into reducing costs? Or maybe (just maybe) is it time to start exiting the race to the bottom and bring that skunkworks project into the light of day?
Those are not easy questions to tackle, particularly with all the baggage and politics of an established money-making track-record. While individuals may too often throw away tangible good in pursuit of phantoms, organisations are not so often guilty. Maybe it would be easier if we'd more readily recognise innovation when it wears business casual in Toronto instead of just a Red Bull-stained hoodie in Silicon Valley.
Monday, July 27, 2015
The Roman(ce) of organisation
I was a little under the weather a few days ago--just enough to take the edge off. Sometimes the internet (by which I mean being sucked into the ADHD maelstrom of social media or the news) is the wrong kind of distraction. Instead, I ended up going down a different (albeit rather more focused) rabbit-hole: Roman history of the classical period.
Dennis is fascinated by Roman history. But, then, he's more or less the military historian of the household. And, let's call a spade a shovel here--the Roman empire was basically organised around conquest more than trade.
A lot of noise is made about the proverbial bread and circuses (particularly by the libertarian right) being the eventual downfall of Roman civilisation. That's far, far too facile. Bread and circuses? Really? When the military and the government come to operate hand-in-glove? And the military largely consists of large private armies loyal to a single leader? And the empire has to invade new territories to pay off the soldiers recruited after conquering those soldiers' territory?
That's basically a Ponzi scheme--with all the hallmarks of a tin-pot dictatorship besides. We all know how those schemes--and sometimes even the dictators--eventually end. (Shocking precisely no one, Rome once had four Caesars in a single year.)
Plus, Rome's dependence on slavery--on average, one in three residents was a slave--gave little incentive for advancing technology. And the slave trade was one of the more lucrative aspects of war-mongering. Which, of course, left even less incentive for inventing labour-saving devices.
Now. I'm not saying that Dennis isn't correct. The Romans were straight-up fascinating. Most notably for their organisational skills. (To me, the obsessive organisation of the Roman armies is far more interesting than tactics, battles or campaigns. Making vs. breaking and all that...)
After all, it's not like war-mongering made them unique in the ancient Western world (or probably any other place and time in human history, really). Neither did slavery: Romans were rank amateurs compared to, say, the Spartans. Nor did religion--theirs was a mongrel mish-mash based somewhat on the cosmology borrowed from Greece, but with plenty of Etruscan leftovers and cults imported from as far afield as Persia (modern Iran). (And that doesn't even count those pesky mono-theists like the Hebrews and Christians who showed up in the middle of the story.) Culture? Nope--classical Rome had a decided inferiority complex: "Real" intellectualism came from (or imitated) Greece.
But nonetheless, time has been kind to the Roman legacy in the Western world. Take away the letters "J," "K," "U," and "W," and you have the Latin alphabet. (Mercifully, however, their numbering system has largely been left by the wayside in favour of the more sensible Arabic one. Want to know why Star Wars takes place "a long time ago"? Because they're using Roman numerals for the episode numbers. Just sayin'.) Until a couple of centuries ago, their language was still the lingua franca of the educated elite. The five major Romance languages--Spanish, Portuguese, French, Italian, and Romanian--are derived from Latin. (And let's not forget that roughly one in three English words derives from French.) The Western calendar (with some corrections and innovations like the leap year) is largely their legacy.
Part of that is, of course, the legacy of Empire--particularly one that stretched from the middle of the U.K. to Istanbul. Flexibility in adapting to local customs and politics is a virtue, but flexibility in language and standards (e.g. measurements, dates) is the short road to administrative suicide. On the flip side, in a world where people are used to squabbling with their nearest neighbours from time immemorial, adopting the systems of the Empire means that no one actually has to compromise. (Human beings are awesome, amirite? [eyeroll])
In a milieu where even the longest-lived empires (e.g. the classical Greeks) didn't survive much more than a few hundred years, the Romans--despite the serious flaws in their civilisation--could hold out for roughly a millennium.
And yet today, "organisational skills" is such a throwaway term--the kind of fluff one puts on a resume or emphasises in an interview as part of the ritualised employment dance. As a liberal arts graduate, I've had a quarter-century to shake my head over how "communication skills" are treated much the same way--universally demanded, but rarely valued. It was only after reflecting on the classical Romans that I realised that talent in organisation falls into the same category.
In that light, I suppose it comes as no surprise (in an age of rock star CEOs) that we all know who Julius Caesar is, but take for granted the much larger legacy of often anonymous census-takers, accountants, lawyers, tribunes, consuls, engineers, scribes, et al. who gave cohesiveness to what otherwise might have been a flash in the pan based on a handful of military victories.
Monday, July 20, 2015
Beware the fair-weather meteorologist
Today, as a favour to a client, I spent about an hour on the phone with a consultant who was helping him get reimbursed for some of the cost of "our" application via a government program. The aim of the program is to promote research and development into new products/designs...even when those things don't necessarily pan out.
This app. certainly qualifies in that regard: At the start, we had no idea that it would even work. Two years and a few significant adjustments later, we're still rolling the bones on each iteration.
My part in this interview was basically to tell the story of each stage of the application, following a boilerplate format something like this:
- What was the problem we were facing?
- What did we do to address the problem?
- Did it work according to expectation?
- If so, by how much?
- Or if not, by how much?
- What unexpected problems/obstacles (if any) cropped up?
Uhhmmmm, there's no such thing as those kinds of things. I only sit on problems long enough for one of the following to happen:
Option A: I have (at a minimum) a strong gut-level diagnosis plus a plan to verify/disprove that hypothesis.
Option B: No immediate diagnosis is forthcoming, but I know exactly where I'm going to start looking.
Arriving at either option shouldn't take very long. Half an hour, maybe. Basically, experience/instinct kicks in...or it doesn't and my Google-fu skills get a workout.
But that's not really the important part. What's important is choosing to work with clients who understand that freaking out at problems only makes them harder to solve. Not only the problem(s) at hand, but future problems as well.
Sure, freak-outs guarantee that you (as a client) won't hear about a lot of the small problems--at least not the kind that are easily and quietly fixed. Maybe you'll sleep better for that. At least until the less-easily-fixable small problems fester into big ones. All because energy that could have been spent on diagnosis is instead channeled into suppressing the symptoms. (And let's not even count the time wasted on blame-slinging and finger-pointing.) I have no sympathy for a client relationship run on a "no bad news" principle. It's tantamount to only accepting sunny forecasts from the meteorologist. Or buying stocks from perpetually bullish brokers. Die of pneumonia in the poor-house, and who's to blame, really? Exactly.
Yes, I realise that no one likes to report (much less hear about) project line-items slipping a schedule date, or costing more than initially budgeted. Or that off-the-shelf software will require more customisation than advertised. Or that a Zero-day security hole has to be patched and rushed out the door in the middle of a release cycle. Or that Google/Facebook/Bing/Microsoft/Apple/PayPal/Etc. has arbitrarily changed their terms of service. Or that a massive DDOS attack is measurably impacting an app.'s response times. Whatever. Stuff happens. Professionals deal with it.
In lieu of falling into a "no bad news" mindset, here's an alternative. You (again meaning the client) can't insist that there will be no bad news. At least without everybody knowing you're on holiday from Reality.
But you can--within reason--insist on no surprises. Obviously, things like security flaws, third party service failures, etc. aren't always predictable. (Hence, the "within reason" qualifier.) Developers, you can insist on working in a No Freak-out Zone as the price of landing no surprises. (Pro tip: If you're not sure about the client's commitment to the "no surprises" credo, find a small, no-brainer problem to test the waters. Bring it--and your plan to fix it--to the client in person or over the phone. None of this passive-aggressive end-of-the-week-hoping-they've-already-left email nonsense. Nope. Own it.)
Eliminate the freak-outs, minimise the surprises, and a lot more of the problems will sort themselves out much sooner. After over a decade in software, I can pretty much promise you that.
Monday, June 8, 2015
Another thought on diversity
A couple months ago, I was both amused and encouraged by the example of camaraderie among people who can be notorious for their religious wars factionalism.
Today I had occasion to muse upon diversity again, albeit from a different angle.
I'm working on the database of an application that will--if all pans out--eventually be available to professionals in Canada and the United States (at a minimum). The superficial view is that we can get by with just an English-language version. At least during an invitation-only beta. In the long run, however, French and Spanish will have to be added.
If my Gentle Reader is not a programmer, the only thing that s/he needs to understand is that there are three significant classes of numbers in data design (and programming for it):
- One
- Zero
- Any number greater than one
So long as every client has exactly one of everything, a programmer creating a table can get away with a spreadsheet-like model. For instance, each customer would have their data on a single row: Account ID, First Name, Last Name, Middle Initial, Current Balance, and so on.
Zero complicates matters. For instance, I had a chequing account years before my driver's license. Which sometimes made it awkward at the counter, at least in the U.S., because that magic rectangle of plastic was somehow a talisman against the cheque bouncing. (In essence, one driver's license number was expected, but it was just plausible that zero was an acceptable number.) Usually the clerk and I worked something out that involved my University ID and my Social Security card. To some extent, that prepared me for the limbo between the time Dennis & I pulled up stakes for Canada and the time we obtained our Social Insurance Numbers (which things like utilities, banks, etc. expect to set up accounts). Again, it was just possible that zero was an acceptable number, but extra hoop-jumping (and sometimes expense) was involved.
To the programmer, it means some extra brain-work. One has to remember to set database tables to accept no (a.k.a. NULL) values. One also has to remember to allow "None of the above"-type options in drop-down boxes. Also to give "none" or NULL a pass when validating data passing in or out of the database. Extra brain-work, naturally, translates to extra time, and of course extra expense.
For numbers over one, though? Hoo-boy...that's a whole different ball-game. Let's circle back (in management-speak) to the language thing. For this particular application, we can set up each client to have a preferred language. But--particularly among those who speak French and/or Spanish--it's more than likely that a client will be fluent in more than one language.
If we were still storing client info. in a spreadsheet, we'd simply have a "Preferred Language" column and another titled something like "Second Language." In most cases, that column would just be wasted space. And wait a minute: What about people who are fluent in more than two languages? Case in point: One of my step-in-laws speaks six languages. (He's originally from the Netherlands and now lives in the U.S., which pretty much guarantees four languages.)
That's the point where a programmer trades in the spreadsheet for something called a relational database. The database replaces the single table of clients with three: One for basic client info. (all that "one" stuff like name, age, balance, etc.), one small one that merely lists the supported languages, and another table that links clients and languages. The beauty of that arrangement is threefold:
And, as it turns out, that's not the worst metaphor for coping with the world, either. Just sayin'.
Today I had occasion to muse upon diversity again, albeit from a different angle.
I'm working on the database of an application that will--if all pans out--eventually be available to professionals in Canada and the United States (at a minimum). The superficial view is that we can get by with just an English-language version. At least during an invitation-only beta. In the long run, however, French and Spanish will have to be added.
If my Gentle Reader is not a programmer, the only thing that s/he needs to understand is that there are three significant classes of numbers in data design (and programming for it):
- One
- Zero
- Any number greater than one
So long as each customer has exactly one of everything, a programmer creating a table can get away with a spreadsheet-like model. For instance, each customer would have their data on a single row: Account ID, First Name, Last Name, Middle Initial, Current Balance, and so on.
Zero complicates matters. For instance, I had a chequing account years before my driver's license. Which sometimes made it awkward at the counter, at least in the U.S., because that magic rectangle of plastic was somehow a talisman against the cheque bouncing. (In essence, one driver's license number was expected, but it was just plausible that zero was an acceptable number.) Usually the clerk and I worked something out that involved my University ID and my Social Security card. To some extent, that prepared me for the limbo between the time Dennis & I pulled up stakes for Canada and the time we obtained our Social Insurance Numbers (which utilities, banks, etc. expect before they'll set up accounts). Again, it was just possible that zero was an acceptable number, but extra hoop-jumping (and sometimes expense) was involved.
To the programmer, it means some extra brain-work. One has to remember to set database table columns to accept missing (a.k.a. NULL) values. One also has to remember to allow "None of the above"-type options in drop-down boxes. Also to give "none" or NULL a pass when validating data passing in or out of the database. Extra brain-work, naturally, translates to extra time, and of course extra expense.
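For the programmers in the audience, here's a minimal sketch of that brain-work, using Python's built-in sqlite3 module. (The table and column names are invented for illustration; the point is just that the nullable column and the validation logic both have to give "none" a pass.)

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# The spreadsheet-like model: one row per customer. The "exactly one"
# columns are declared NOT NULL; the driver's-licence column is left
# nullable, because zero is a perfectly legitimate answer.
conn.execute("""
    CREATE TABLE customers (
        account_id      INTEGER PRIMARY KEY,
        first_name      TEXT NOT NULL,
        last_name       TEXT NOT NULL,
        drivers_licence TEXT        -- NULL means "none of the above"
    )
""")

# Python's None is stored as SQL NULL.
conn.execute(
    "INSERT INTO customers (first_name, last_name, drivers_licence) "
    "VALUES (?, ?, ?)",
    ("Jane", "Doe", None),
)

# Validation has to give NULL a pass instead of rejecting the record.
for name, licence in conn.execute(
        "SELECT first_name, drivers_licence FROM customers"):
    print(name, "no licence on file" if licence is None else licence)
```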
For numbers over one, though? Hoo-boy...that's a whole different ball-game. Let's circle back (in management-speak) to the language thing. For this particular application, we can set up each client to have a preferred language. But--particularly among those who speak French and/or Spanish--it's more than likely that a client will be fluent in more than one language.
If we were still storing client info. in a spreadsheet, we'd simply have a "Preferred Language" column and another titled something like "Second Language." In most cases, that column would just be wasted space. And wait a minute: What about people who are fluent in more than two languages? Case in point: One of my step-in-laws speaks six languages. (He's originally from the Netherlands and now lives in the U.S., which pretty much guarantees four languages.)
That's the point where a programmer trades in the spreadsheet for something called a relational database. The database replaces the single table of clients with three: one for basic client info. (all that "one" stuff like name, age, balance, etc.), one small one that merely lists the supported languages, and a third that links clients and languages. The beauty of that arrangement is threefold (a sketch of the three tables follows the list):
- There's no wasted space from that unused "Second Language" column. That may not seem like a big deal for a few hundred records, but when you scale into the tens of thousands (or millions), it really adds up.
- The solution is infinitely scalable--it can accommodate the linguistic xenophobe as well as the Babel Fish with no hacks required.
- (Bonus) The data stays cleaner. If the "Preferred Language" and "Second Language" are text fields, then someone will inevitably misspell language names. That just corrupts the data. In the relational model, the third (i.e. "association") table links clients and languages by their ID numbers. Which not only takes up less hard drive space (numbers are cheaper to store than sets of letters), but makes it much, muuuuuch harder to screw things up.
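And, for the code-inclined, a minimal sketch of that three-table arrangement, again using Python's built-in sqlite3 module (all table and column names are invented for the example):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- All the "exactly one" client facts live in one table.
    CREATE TABLE clients (
        client_id INTEGER PRIMARY KEY,
        name      TEXT NOT NULL
    );

    -- One small lookup table of supported languages. Storing each
    -- name exactly once is what keeps misspellings out of the data.
    CREATE TABLE languages (
        language_id INTEGER PRIMARY KEY,
        name        TEXT NOT NULL UNIQUE
    );

    -- The association table links clients to languages by ID number.
    -- Zero rows, one row, or six rows per client: all fine, and no
    -- wasted "Second Language" column.
    CREATE TABLE client_languages (
        client_id   INTEGER NOT NULL REFERENCES clients,
        language_id INTEGER NOT NULL REFERENCES languages,
        PRIMARY KEY (client_id, language_id)
    );
""")

conn.execute("INSERT INTO clients (client_id, name) VALUES (1, 'Polyglot Client')")
conn.executemany(
    "INSERT INTO languages (language_id, name) VALUES (?, ?)",
    [(1, "English"), (2, "French"), (3, "Spanish"), (4, "Dutch")],
)
conn.executemany(
    "INSERT INTO client_languages (client_id, language_id) VALUES (?, ?)",
    [(1, lang_id) for lang_id in (1, 2, 3, 4)],
)

# List one client's languages via the association table.
for (language,) in conn.execute("""
    SELECT l.name FROM languages AS l
    JOIN client_languages AS cl USING (language_id)
    WHERE cl.client_id = 1
"""):
    print(language)
```

Note that adding a fifth (or seventh) language for a client is just one more row in the association table--no schema change required.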
And, as it turns out, that's not the worst metaphor for coping with the world, either. Just sayin'.
Wednesday, May 20, 2015
Scratching at scale, Part II
So there's an itch. And it turns out that the most effective way to scratch it is with a computer of some sort, typically a desktop, laptop, tablet, or smart phone.
One of the dirty secrets of software development is that, even in relatively small organisations, the sheer number of users tends to stretch the usage of the software in directions that might not have been anticipated on the whiteboard.
The person (or people) with the initial itch faces something of a paradox when designing a software product meant for mass-use. And that paradox boils down to two contradictory mandates when bringing the product into the real world:
- Commandment 1: Know Thy Itch.
- Commandment 2: It's Not About Thou.
In the earliest phases of product design, everyone--and I mean everyone--involved absolutely must understand the itch. Top to bottom. Backward and forward. Inside and out. No excuses. Why? Because misunderstandings make for bad assumptions. And bad assumptions solve the wrong problems (at best) or non-existent problems (at worst). Either way, they waste time and money.
In practical terms, that means that everyone, including the designers and coders, needs to be allowed to experience, first-hand, the pain-points. Even if it's in a shadowing capacity. Even if they don't actually work at the organisation. Actually, make that especially if they don't actually work at the organisation. Mainly because there's no substitute for first-hand experience. But also because the outside perspective can see past the politics, strip the personalities away from the problem, and get to its essence.
After that point, it should merely be a matter of implementation, right? Not so fast.
Software development is expensive--let's not pretend otherwise. It's expensive in terms of the time the organisation spends on the development process as well as the quoted budget. Thus, the temptation is always there to leverage that investment by making it available for others. Those "others" could be different parts of the organisation. Or they could be working for companies in the same (or a similar) industry. I've seen it happen both ways.
That's the point where Commandment #2 kicks in. Maybe the software will do 95% of what these other folks need--but the catch is that the missing 5% is mission-critical. At that point, the original group has to decide whether or not to add that functionality and, if so, how to fund the additional work.
Even when the monetary cost is shifted to those requesting the changes, there's also a loss involved--namely a slight loss of control over the product. It's not logical, but this can be more painful for some than writing the cheque. But it's understandable--all the work that the original team put into researching, brainstorming, evaluating, testing, training, etc. is something to be proud of. Particularly when the product is useful enough that others are interested in it. But now these unappreciative barbarians want to change it!
Again, the reaction is perfectly understandable. Unshockingly, it even happens to your faithful blogger from time to time. But that reaction is also the warning that you've strayed away from Pride and wandered into Ego. See, the process of initially scratching the itch requires checking egos at the door. In practical terms, that means things like...
- Not taking "That's how we've always done it." for an answer.
- Not allowing office politics to shape the design.
- Not using information to play favourites.
- Not pawning off responsibility on those who have no authority.
- Not postponing tough decisions.
So, in the end, Commandments One and Two really aren't contradictory. They're both about zeroing in on the pain-points, and making the pain go away. The shortest route to that more often than not cuts through personality and power-dynamics and especially egos. Compared to that, the challenges posed by technology should be a breeze, right?
Wednesday, February 25, 2015
Strategic advantage
Being a programmer (or "programmeuse," in local parlance) has certain perks. One of them being that you can crunch your own data without too much fuss...or even use Excel in ways that the laws of Mathematics and Boolean Logic never intended. Another being that you can help other folks do the same--with the added bonus that they think you're a genius.
But one definite downside is the frustration of discovering broken code...even when it's code that you didn't write. Particularly code that's broken in front of the whole Internet 'n Everybody. By which I of course mean websites with glaring errors. From the standpoint of someone browsing the site in question, it's easy to blame the lazy/incompetent web developer(s).
As a programmer, however, one has more tools available for sussing out the real problem. One only has to submit the "Contact" form (assuming it works) or an email (assuming it doesn't) to let the website owner know:
- Hey, there's a problem with your website.
- Here is the specific error message (or bug).
- This is how you reproduce the problem yourself.
- (The biggie) This is likely the root cause. Debug here first.
But is a "Thanks for letting us know that you had a problem; we're looking into it" too much to ask? You'd think not. But then you'd be wrong.
Please don't think that I'm sulking. This isn't the first time that this sort of thing has happened to me. The feedback took less than five minutes of my day: I don't begrudge that--particularly when it involves keeping my fellow web-mongers honest. That's to the benefit of the entire guild, right?
Yet I had an actual reason for visiting the website in question--namely, following up on a conversation I had with someone earlier in the week. (For the record, they're not a client or even a potential client. That relationship is sacred, and thus off-limits for blogging.)
Now, I don't judge organisations (particularly those that operate on a proverbial shoestring) by website errors...although errors certainly raise some warning signals regarding the client review process. But I think it's perfectly fair to judge organisations on how quickly they respond to negative (though hopefully constructive) feedback. Lack of response is symptomatic of any number of organisational ills, including (but not limited to):
- Generic indifference
- Bureaucratic sclerosis
- Chronic under-staffing/under-training
- Lack of planning/budget for asset maintenance (and websites are assets, dagnabbit!)
- A culture that penalises mistakes
Programmers have an unfortunate reputation of being too code-centric to be successful businesspeople. But, boy howdy, this is one case when grokking the bits definitely provides a leg up on other trades. [pats self fatuously on head for making it through Programmer School]
Wednesday, January 28, 2015
Dear Client: Please don't apologise for your website
Today I was on the phone with the lawyer who's been helping me overhaul my service contract. Every business niche is unique, and software development may well be a special snowflake--what with its dependencies on software written by someone else and all. (For future reference: When civilisation falls because Skynet or The Matrix or The Borg hacked in through an NSA/GCHQ/CSIS/Whomever-mandated crypto. back-door, it's totally not my fault--got it? Also? Don't say I didn't tell you so.)
As the chat wound down, I asked him about some of the other services mentioned on his website. At which point he rather apologetically explained that the site had been done in Wordpress. As if the only legit. websites are done in artisanal HTML5. Hand-coded exclusively from free-range organic fair-trade ones and zeroes. Or something. I'm not entirely sure I grok the motivation behind the disclaimer, frankly.
Well, maybe I do...a little. My guess is that it's a revenant of the pre-WYSIWYG* (e.g. pre-Dreamweaver) days of making web pages. You know, back when web development shops could call themselves "boutiques," with a straight face...and nobody snickered.
Folks, seriously-and-for-realz now: If I cared how the actual bytes of anyone's website were generated, I'd be a web browser. What (as a consumer and geek) I do care about is:
- Do I know what your organisation is even about? Because, in the hands of an appropriately-qualified idiot, any publishing platform can spew fluent marketing gibberish with a legalese accent. (Or vice-versa.) Mere technology is no match for this species of idiot...much less a committee of idiots.
- How long has it been since the website content was updated? Similarly, no technology in the world will save you from not caring enough to update the content when your organisation changes or has something new to say.
- (And speaking of updates...) Pretty-please have a plan in place to update the underlying software. Unless you know that the person who set it up had to make some hacks under the hood, there's no reason maintenance updates should break anything. (And if hacks were made, that should have been addressed up front.) Otherwise, you're leaving yourself wide open for an embarrassing security breach. Don't do that.
- - - - -
* WYSIWYG (pronounced "Whiz-ee-wig") is an acronym for "What You See Is What You Get."
** CMS == "Content Management System."
Wednesday, January 21, 2015
Welp. That was different.
If you take a Statistics 101 course, you should emerge with three basic concepts burned into your synapses. Bonus points for more than three, of course, but the minimum required to justify your time and tuition are:
- Correlation does not equal causation
- Any fewer than 32 data-points, and you've got some 'splainin' to do
- Standard deviations (particularly on the bell or "normal" curve)
(There's also a slightly embarrassing story about me at a politically-themed business-over-breakfast event, but that's for another time...)
But (belatedly) following the recommendation of a native, I signed up for the Chambre de commerce du Grand Shediac yesterday and showed up at tonight's reception, which included the State of the City address by the Mayor. Granted, I was treading water (okay, technically "jellyfish floating"--look it up) during the French portions of his address, but it was informative nonetheless.
My first clue should have been that I was left to my own devices during the "mingling" part of the evening. That gave me the luxury of scanning the room to see who was making the rounds, which cliques stayed clumped together--that sort of thing. (Extroverted alpha-salespeople, IMO, would do well to shut up and hang back long enough to do this. Trust me: you can learn a lot.)
Mind you, I wasn't above introducing myself to the singleton looking lost or bored and striking up a conversation. (For the record, that had a 2-out-of-3 payoff: I met someone who grew up two houses down from mine plus a fellow transplant to the area, but awkwardly bashed my pathetic French against someone's much-better-but-still-limited English. Zo wellz....)
During the wind-down, I happened to be in the vicinity of the Chamber's Directrice, who took me under her wing long enough to introduce me to the President, and then the Mayor introduced himself while I was chatting with two insurance agents. (Aside: Kilogram for kilogram, anyone who's not only read Team of Rivals but watches the movie at least once a year probably has my vote if s/he decides to run for Prime Minister. Just sayin'.)
But despite the attention lavished upon this (still appallingly unilingual) newcomer, the afore-mentioned firehose was noticeably absent. That was almost surreal. But appreciated all the same. I don't miss the sense of being fresh meat. Or, perhaps more aptly, the sense of being chum tossed into the shark-tank while my fins are still twitching.
So big ups to the Shediac Chamber for a very positive data-point. Both the introverted and extroverted parts of my character join me in saying "thanks."
Wednesday, October 29, 2014
Something I didn't expect to learn at Programmer School
There are any number of good things you can say about attending a small Programmer School like the one that made me a professional geek. Alas, one of them was not the civic planning. Specifically, the fact that the school library and computer lab were situated next to the common area (a.k.a. the cafeteria)...directly across from the day care.
Now, I've never needed help with being distracted. (Squirrel!) So I wasn't too thrilled with the arrangement. I found little sympathy, however, when I grumbled to one of my professors, a father of three: "Eh. I figure if they're screaming, they're still alive," quoth he.
Sigh. Nobody understands me. Except maybe squirrels.
But I can admit that my prof. had a point, at least as it relates to project management. As both an employee and a freelancer, I've never been known to sit on problems that come up once our tidy project design meets up with messy reality. (Although I normally try to have at least one workaround in my back pocket before I actually raise the red flag.)
After a couple of hard lessons, I've also learned not to drop off the radar even when the project is hitting its milestones ahead of schedule. Once upon a time, I considered weekly status reports a sign that my boss was a paranoid control-freak who didn't trust me to be the professional they were paying me to be.
As a freelancer, however, I've come to the opposite view. Someone who isn't interested in status "because you're the one who understands all that technical stuff" is a red flag. Because if you don't want to be bothered with good news, what happens if there's any bad news to handle? Software, like any other (designed) product, is nothing more than the sum of the decisions made between the initial brainstorm and the final bytes. Not all of those decisions can be made in the heady optimism of the kick-off meeting. And some of those decisions could even be mid-project course-corrections.
A potential client who expects me to work in a vacuum and deliver exactly what s/he wanted makes me nervous. But the flip side is that a software developer who expects to work that way should make you (as the potential client) more nervous still. In a freelancer, that behaviour is symptomatic of someone afraid of criticism, who might just let the clock (and budget) run out until your decision boils down to take-it-or-leave it.
Look. I know we're all busy. But everyone on this road-trip is responsible for making sure we all arrive where we want to be, even when they're not technically driving. Road signs matter. Detour signs, in particular, do not exist to be ignored. Once in a great while, we may even have to pull over for a minute and dig out the map and compass when we're not where we expect to be. In the long run, we'll save time and gas. And, unlike the Blues Brothers, our road trip won't end up in the hoosegow.
Monday, October 20, 2014
Generations
Both my parents spent the majority of their careers working in a hospital environment. (If you want a good working definition of corporate benevolence, it would be in how my Dad's supervisor said absolutely bupkis about how long it took Dad to return from his maintenance jobs while Mom and I were in the maternity ward and nursery, respectively. 'Nuff said.)
Both parents, however, are at the stage of life where they're more likely to experience hospitals from a customer's, rather than an employee's perspective. I called Mom today to check in on her progress after surgery a couple months back. (No, it's not the first time I've called her since then. Even I'm not such a horrid child as that.) For the record, she's back out in the yard, shelling walnuts, fully up-to-date on the doings of the wild critters and feral cats, etc. Business as usual, in other words.
Mom mentioned that the hospital where she'd worked until a couple years back had asked her if she was interested in volunteering. She said "no." Which didn't surprise me--she has enough going on right now, even without recuperating from her third surgery in three years. But then I had an ear-full about everything her former employer has outsourced since she started working there--which, for context, was during the Carter Administration.
Food service, IIRC, was the first to go. Now housekeeping has been outsourced. So has billing. Because, of course, nutrition has nothing to do with health. And neither does cleanliness. (My Gentle Reader naturally picked up on the sarcasm there.) And when, thanks to data-sharing limitations, my Mom is batted like a ping-pong ball between Accounts Receivable and at least two insurance companies when she's still half-whacked-out on painkillers, I'm going to take a dim view of outsourced billing. [sharpens fingernails] [bares teeth]
I have a lot of fond memories of visiting that place when I was growing up: The smell of acetone, the superball-bouncy scrambled eggs in the cafeteria, the stately pace of the elevators, the brain-in-a-jar in the Histology lab (true story). But I can also understand why Mom turned them down, too. So far as I can tell, it's still a clean, orderly, almost nurturing place. But the ghosts of the nuns who built and poured their devotion into it become more transparent with every contractor who slings an ID-card lanyard around their neck.
Fast-forward a generation--meaning me--and press the "zemblanity" button, and there's today's news about IBM selling its semiconductor business to GlobalFoundries. It's certainly not unprecedented, given IBM's sell-off of its PC/laptop business to Lenovo a few years back and some of its server business earlier this year. Except that this isn't your typical offshoring:
GlobalFoundries will take over IBM manufacturing facilities in New York and Vermont, and the company "plans to provide employment opportunities for substantially all IBM employees at the two facilities who are part of the transferred businesses, except for a team of semiconductor server group employees who will remain with IBM."
Thus, at least in the near term, GlobalFoundries will employ IBM expats at IBM facilities to make a profit at what, for IBM, was a money-pit. And IBM's taking a sesqui-billion-dollar hit on the deal besides. Slow-clap for IBM management there. (Sarcasm again, btw.)
Granted, I haven't seen the inside of the Blue Zoo since Lou Gerstner was yanking the platinum ripcord on his golden parachute. And, even then, being "contractor scum" insulated me from the insane (unpaid) overtime and pension-jigging and attrition-by-early-retirement and other assorted idiocies inflicted by the biscuit-salesman. But my frustration really boils down to one similar to Mom's. Namely, that the definition of "core competence" has become dangerously strict.
Now, I'm certainly not arguing in favour of vertical monopolies. The fact that Monsanto is allowed to GMO a seed specifically optimised for the petro-chemical atrocities they market frankly blows my mind. As Bruce Sterling put it, "Teddy Roosevelt would jump down off Mount Rushmore and kick our ass from hell to breakfast for tolerating a situation like this." And he's absolutely right--even when he was talking about software monopolies.
Maybe I've just been out of the server space for too long. For all I know, pushing mission-critical data off-site to cloud servers doesn't give CIOs the willies it would have given them a decade ago. Maybe Microsoft has finally earned enough enterprise-computing street-cred to muscle out the Big Iron in the server-room.
But I do know that outsourcing always entails friction and a certain amount of bridge-burning. In the case of Mom's ex-employer, it's orders of magnitude easier and less expensive to retrain (or, if necessary, fire) an under-performing employee than it is to cancel a contract and find a replacement when work isn't up to snuff. When you're a technology company that's weathered two decades of commoditisation in both hardware and software by optimising one for the other, throwing away that balance makes no strategic sense to me.
When I look at the list of things IBM flags as higher-margin (cloud, data and analytics, security, social and mobile), there is not one of them that I would flag as being "owned" by Big Blue. (Cloud? Amazon. Data? Oracle, with Microsoft hot on their heels. Analytics? Everybody's a "big data" guru these days. Security? Nope. Social? Please...who isn't gunning for Facebook? Mobile? Are we seriously expecting an IBM phone?)
I owe IBM credit for making me a decent technical writer and for teaching me basic white-collar survival skills. Oh, and for disabusing me of the notion that working inside the belly of the leviathan is more secure than working outside it. But apart from my comrades and the cafeteria/coffee-shop/cleaning ladies, there's no love lost for the big blue behemoth on my end.
Yet it galls me to see a company that lionised its R&D department (and the patent lawyers who filed every brain-wave thereof) hitching their wagons to other people's horses. Or, perhaps more aptly, jumping onto other bandwagons. Because bandwagon-passengers forfeit their right to drive, right?