Thoughts on computers, companies, and the equally puzzling humans who interact with them
Monday, December 22, 2014
Okay, I'm just bowing to the inevitable and calling it a year (more or less) on blogging here. Whichever holiday my Gentle Reader chooses to observe around the Winter Solstice, I hope it's a lovely one.
Catch y'all on the flip side!
Wednesday, December 17, 2014
It takes a village to raise a programmer
Today I was at the computer repair shop, having a fried hard drive replaced in a laptop. When a client arrives with the diagnosis already made, repair folks are only being smart when they question it. I assured the gentleman at the front counter that I had booted into the laptop's BIOS and found no hard drive listed, and that I had also tried running two flavours of Linux off CDs/DVDs--both freaked out when they found no hard drive.
At that, he took my word for it, and I walked out with a new hard drive installed in less than fifteen minutes. (For the record, the laptop is now running Ubuntu, with one PEBKAC* wireless issue since overcome.)
The gentleman seemed surprised that I was a Linux user, and asked what prompted me to use that operating system. Not wanting to drag politics into it, I explained that it makes web programming in the PHP/MySQL space a little more seamless when work is promoted to a server.
It turns out that he's starting to learn web programming just now, but that it isn't always coming easy. So I handed him my business card and told him that he was perfectly welcome to call on me for help when he was stuck. Just as I was able to call upon my elders in my times of need. (And, 's'matter'o'fact, still do.)
While I savour the few opportunities I've had to repay my elders for what they've taught me, it's equally delicious to be able to smooth the path for someone else. Sometimes it's not even a matter of catching a logic-bug or pointing to a more appropriate API or anything even remotely technical. Sometimes just knowing that someone won't consider your question "stupid" is more important than the answer itself.
I do hope to hear from him. Because that means that another craftsman is joining the trade, and auto-didacts who aren't afraid to ask for help are an asset to the guild.
- - - - -
* "PEBKAC" == "Problem Exists Between Keyboard And Chair"
Monday, December 15, 2014
Unwelcome magic
I'm trying to rebuild an old-ish (and rather banged-up) personal computer with a new operating system. Because, you know, friends don't let friends use Vista. [insert trollface]
So far, it's Gremlins: 4, Me: 0. (I've had to break out the paperclip--'nuff said.) I'm starting to sympathise a bit with the sub-species of programmer who despises hardware. In my case, I was less than amused to realise that sometimes the System Administrator has to deal with their own version of "syntactic sugar."
For non-coders, "syntactic sugar" is shorthand in a programming language--a convenience that hides what's actually happening, with results or side-effects that can be counter-intuitive (at least to a newbie). In the worst-case scenario, "it just works" despite seeming to miss critical input or function invocations.
Other coders--including my betters--will disagree, but I generally dislike that in a language. It jacks up the slope in the learning curve, in my less-than-humble-opinion. Arthur C. Clarke aside, the magic of "it just works" is antithetical to the scientific reasoning process that's supposed to govern computer programming (and debugging).
Outside of the coding sphere, I tripped over something suspiciously similar. My Debian installation DVD didn't include a hardware driver compatible with the USB wifi stick I was using. Fortunately, I was able to find the missing driver files on the Debian website and copy them to a flash drive. I plugged the drive into the back of the PC and continued. From that point on, I had wireless access.
Convenient, right? To me, it was actually kind of creepy. First off, the installer never asked me to specify the location of the wifi driver. Secondly, the driver was just one of many files bundled into a larger .DEB archive.
Magic.
Ugh.
I realise that there are times--particularly when it's not part of your core competency and the SysAdmin has already left for the day/week--when "stuff just working" is a good thing. But it's unlikely that this is the last time this particular PC and I will tussle. In which case "magic," however convenient, may actually be counter-productive in the long run.
That and I flatter myself that a programmer who isn't interested in pulling back the curtain on the controls is no programmer at all.
Sunday, December 14, 2014
Silly Sunday, 2014.12.14: Tribalism in computing
Once upon a time (roughly coinciding with the Carter and Reagan Administrations in the U.S.), I actually liked to travel by airline. By which I mean that there was a time when I flew at least twice a year and didn't understand why airline jokes were a staple of stand-up comedy.
Enter three decades of cost-cutting, capped by the collective bed-wetting in response to a "black swan" event like 9/11.
Part of the reason this de-evolution is particularly painful for yours truly has to do with my transition from Liberal Arts graduate to programmer. The Liberal Arts bit pretty much ensures my awareness of the history of my profession. To wit: Twentieth century computing was initially driven by things being launched from Point A to Point B. Initially, those things were ballistic in nature. But eventually, the assumption was that the landing would be somewhat less...ahem..."dramatic."
One convenient aspect of visiting Washington, D.C., is that the major museums are (mostly) clumped in one place--namely along the mall. That includes the National Air and Space Museum. There behind glass, I was tickled to find a retro metal IBMonster that once crunched flight & reservation data. Anyone who's discovered a new branch in their family tree understands that feeling.
And so it was that last week, when I was flying through YYZ (Pearson Airport in Toronto, ON), my proverbial cold comfort for a 4+-hour delay (other than an extra bag of almonds) was the knowledge that the blame rested entirely with the hardware side.
That lasted until one of my fellow passengers wryly informed the flight attendant that, according to the in-flight display, we were already en route to our final destination.
D'oh!
And so, to my chagrin, I was reminded that what sociologists call tribalism can be just as rife in the left-brain-dominated world of computing as it can be anywhere else. And just as challenging to overcome as any other bias. Thus do computers make us more human. In this isolated instance, anyway.
Wednesday, December 10, 2014
Of Google and Gestalt
Maybe I have an odd definition of "amusing," but the near-juxtaposition of two Slashdot article picks today made me smile.
The first item is a thought-provoking rebuttal of Elon Musk and Dr. Stephen Hawking's warnings about the dangers of artificial intelligence (AI). Its salient point is that intelligence is not autonomy--i.e. a machine cannot truly have free will. Certainly, our reliance on AI--as with computers and networks in general--makes us especially vulnerable to its failure(s). We're sometimes vulnerable to its successes, too. (Think obsolete livelihoods, cybercrime, etc.) And when some fool decides to abdicate critical decisions to an algorithm? Yeah--I think that most of us know the end of that movie.
There's also the phenomenon known as "the uncanny valley," wherein computer-generated (human) images come oh-so-close-but-no-cigar to being life-like, and we actually react more negatively to them than to something more cartoonish. (Aside: If you're among those who were creeped out by The Polar Express but think that the minions of Despicable Me are adorable, now you know why.) In Star Trek: The Next Generation, the android Data notes that he has been programmed not only to blink, but to do so at semi-random intervals so as not to trigger that vague sense of unease associated with the uncanny valley.
And, even being a programmer, I have to admit to being creeped out myself by the accuracy of voice recognition in some automated phone systems. In the end, it may well be that the market's response to the uncanny valley will forestall an AI bot takeover before the technology is even capable of making it a threat.
In short, we are (probably) a long, long way off from renegade replicants and time-travelling hit-men for a genocidal AI. Or so The Matrix wants us to believe... ;~)
At this point, it's tempting to congratulate ourselves for being such inimitably complex carbon-based beasties. Until we consider the second Slashdot item, which brings home how easy it is to impersonate a human browsing a website. And not only a human, but a wealthy human. In related news, Google made headlines last week for stealing a march in the arms-race against the bots--or, more aptly, the people who code them. (Though I do have to wonder whether the visually impaired will be the collateral damage of that escalation.)
That's the contrast that made me smile, albeit wryly. To wit: The bar for "humanity" is set so high in one area of software development, but so low in another. (Embarrassingly, that latter area is my own.)
As Mr. Etzioni pointed out, part of culture's freak-out over threats of AI is our fear of irrelevance. Or...do we also fear that we've passed some inflection-point where our lazy, self-fetishising parochialism leaves us open to a palace coup by our digital serfs? Personally, machine learning doesn't worry me half so much as humans who refuse to learn. But if my Gentle Reader is more of the Musk/Hawking camp, perhaps we can agree that the only viable response is to insist on a higher bar for humanity.
Monday, December 8, 2014
Snippets
Back in college (the first time), I was into what some might consider an "extreme" sport. Oh, there was no athletic ability involved--not unless you count hauling cases of newspaper/magazine clippings or occasionally sprinting across a campus...in high heels. Once in a full-on snowstorm. High heels are great for digging into some kinds of snow--trust me on this.
Basically, I was on the speech team. And while I joined thinking that my focus would be on interpretive readings of poetry/prose/drama, I was shanghaied into something called "Impromptu." The rules for that type of speaking were straight-forward:
- You had absolutely no idea what you were going to talk about until it was your turn.
- You would be given a topic, typically in the form of a quotation or proverb.
- You had seven minutes to prepare and deliver your speech; you were docked for going over seven minutes--or running significantly under.
- While you were prepping, the judge for your round would call out elapsed time in 30-second increments, and give you a countdown of time remaining by hand while you were speaking.
How does one train to face the unknown? My team more or less had a structure for each speech, which (apart from the obvious benefit) helped you pace yourself when you were actually speaking. But the Team Elders--which included the reigning national champion in Impromptu speaking (no pressure there)--also kept a full desk-drawer of note-cards containing quotations. Sometimes it seemed like every sound-bite uttered by Every Famous Dead Person Ever was in that drawer. To practice, you'd draw a card at random and find yourself riffing on anyone from Karl Marx to Groucho Marx.
My afore-mentioned Elders, as well as the official coaches, also liked to move the proverbial goal-posts. For instance, after the first semester of my Freshman year, I wasn't allowed to use a note-card to jot down my outline--from then on, I had to keep it all in my head. Still later in my career, the Assistant Coach clamped down on my prep time. Most folks spent between one and two minutes of the allotted seven in preparation. I wasn't allowed more than one in practice. Of course, the consequence of having less time to prepare was that I had to fill that much more time with content to hit the seven-minute goal. Eeep!
Like I said, it's one heck of an adrenaline-kick--even compared to public speaking when you have a prepared, memorised presentation. (And keep in mind that PowerPoint hadn't even been invented at that point.) Apart from reading, writing, and math, there are few skills as valuable as learning to hide your terror while you're on stage...or even being put on the spot in a meeting. And being able to riff while still keeping the salient points at your synaptic fingertips is a huge--yea, even ginormous--bonus to that basic skill.
But the afore-mentioned drawer of quotations wasn't just a means to an end, either. Being largely the handiwork of a couple of political junkies and a six-year student about to graduate with a Philosophy degree, you can imagine the breadth of topics. Combining that with the discipline of organising my thoughts on the fly and making them sound good when they came out, the experience was a liberal arts education in microcosm. And it is quite possibly the single most useful thing I gained from my four-year degree. (Also, that's how I met Dennis, because he competed on the speech team of a neighbouring university. If that's not a win, I don't know what is.)
Nearly three decades later, I'm fairly certain that this training has improved how I do my day-to-day work. See, programming is no different from any other expertise in that the proverbial 10,000 hours of practice rule really applies. But unlike, say, mastering the violin, the range of things you need to unlearn and learn anew to stay on top of the game doesn't diminish with time. Moreover, much of that unlearning/relearning is done on the clock, on the spot.
That's where the Team Elders--in this case, the mavens of an unfamiliar (to me) technology/language/platform--come into play. Their code snippets, typically by the time they're blogged or accepted/up-voted on StackOverflow, are like the nuggets of received wisdom I once pulled out of the desk drawer. Likewise, much of what I do afterwards is to fit them into the framework of the moment. All while the clock is ticking, of course. At the end of this process, if I've done my job correctly, there's the sense that I've added context and relevance to the snippet's pithy brilliance. And I can vouch for being better off for the experience--maybe a little smarter, maybe a little more efficient, maybe even both.
So don't ever let anyone tell you that a Liberal Arts degree doesn't have practical value. I'm living proof that it does. Oh yeah, and I moonlighted on the Debate team, too--so I've been trained to argue for a reeeeallllly loooooong time. ;~)
Wednesday, December 3, 2014
Digital rubbernecking
As a programmer, I write a fair amount of code that is paranoid--specifically, of anything coming in from the Web. There is also the added overhead of dealing with encrypted data--passwords, email addresses and the like. The rest I more or less leave to the folks who set up the servers on which my code and data live. Ditto the folks who set up the routers and networks and who invent/improve the encryption algorithms.
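By "paranoid," I mean something like the following sketch--generic PHP, with connection details and table/column names invented for illustration rather than lifted from any client project: validate everything that arrives from the browser, and keep even the validated values out of the SQL text by using placeholders.
    <?php
    // Treat everything arriving from the browser as hostile until proven otherwise.
    $email = filter_input(INPUT_POST, 'email', FILTER_VALIDATE_EMAIL);
    if ($email === false || $email === null) {
        http_response_code(400);
        exit('Invalid email address.');
    }

    // Prepared statements keep user-supplied data out of the query text itself.
    // Connection details and the "subscribers" table are placeholders.
    $pdo  = new PDO('mysql:host=localhost;dbname=example', 'app_user', 'app_pass');
    $stmt = $pdo->prepare('SELECT id FROM subscribers WHERE email = :email');
    $stmt->execute(array(':email' => $email));
    $subscriberId = $stmt->fetchColumn();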
That's not to say that I'm not fascinated by issues around security and cryptology. It's just that I know that I have no aptitude for it--particularly what you'd call "thinking like a hacker." And hacking post-mortems are like candy for me.
Which, in the wake of last week's Sony mega-hack, basically makes me a rubbernecker. (In my defence, I don't rubberneck in real life; gawping at disasters online doesn't slow down the traffic or hinder first responders.)
Oh, and is this ever a train-wreck. Partly, it's the sheer scope. Thirty-eight million files stolen--some of those files whole databases.
- Movies leaked to torrent sites before their release date
- Scripts for movies not even in production yet
- Source code--presumably for games
- Legal assets like contracts, non-disclosure agreements, etc.
- Salaries, including highly embarrassing discrepancies in executive level pay
- Human resources data, including social security numbers, addresses, birthdays, phone numbers, etc.
- Sales and financial data going back years.
- I/T infrastructure maps, complete with security credentials
Also, though, there was the initial blank-wall response, and now the possibility of fingering the wrong wrongdoer. North Korea was the prime suspect from the get-go. That assessment has been disputed and even criticised by the infosec. community, but that's Sony's story and they're sticking to it. You have to admit, being targeted by a rogue government makes for better security theatre than falling victim to an inside job carried out by pissed-off plebeians.
Oh, and passwords weren't encrypted and the hackers managed to nab SSL root certificates that won't expire for years? #headdesk
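(For contrast--and purely as a sketch, since I know nothing about Sony's actual stack--PHP has had a sane, boring way to do this since 5.5: store only a salted, one-way hash, never the password itself or a reversible encryption of it. The values below are illustrative only.)
    <?php
    // At sign-up: keep only the salted bcrypt hash.
    $storedHash = password_hash('correct horse battery staple', PASSWORD_DEFAULT);

    // At login: compare the submitted password against the stored hash.
    if (password_verify('correct horse battery staple', $storedHash)) {
        echo 'Login OK';
    }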
It's impossible not to look, right? There are just so many flavours of "screwed" involved here--for the short-, medium-, and long-term:
- Revenue lost to piracy
- Further revenue loss if said pirated content sucks and no one wants to pay to see it
- A pretty-much-unquantifiable loss in competitive advantage to its competitors in the entertainment and gaming industries
- Equipment and staffing costs for scanning, then scrubbing or replacing, every single computer currently on--or that has possibly touched--the network
- Nothing short of an identity theft nightmare for thousands of employees and contractors: Sony footing the bill for any reasonable amount of credit-monitoring and remediation will easily run into the millions of dollars
- The productivity-killing morale-buster for employees now freaking out about their current job or their future credit rating
- Possible (probable?) massive class-action lawsuits, particularly if North Korea doesn't turn out to be the villain after all
- The inevitable stock price bobbles, particularly as the after-shocks play out
If it sounds like I'm blaming the victim, I am--but only sorta-kinda. Yes, it's tempting to see this as karma for a company that had no problem infecting paying customers with malware--basically using them as conscripts in their battle against piracy--thus leaving them open to other hackers. And, honestly, Sony's response when the news broke might just be the douchiest thing you'll read all day...assuming you're not following Timothy Loehmann / Daniel Pantaleo apologists on Twitter, of course:
NPR was one of the first to report on the scandal on November 4, 2005. Thomas Hesse, Sony BMG's Global Digital Business President, told reporter Neda Ulaby, "Most people, I think, don't even know what a rootkit is, so why should they care about it?"
Obviously, I don't work for or at the company, so please don't think I'm speaking with any evidence-based authority here. But the circumstantial evidence points to a management mindset in which security was viewed as an expense to be minimised, rather than an asset to be built and leveraged as a competitive advantage.
If true, investors and other stakeholders should take that cavalier attitude--toward their own crown jewels as well as the personal data of others--as a sign-post on the road to extinction.
Because ultimately, Sony lives in a fully digital world. Movies no longer exist as spools of celluloid. Except for audiophiles, 21st century music is not served up on fragile black vinyl platters. Most games do not play out with wooden/plastic/metal markers on cardboard these days. The upside of that world is that copies of intellectual property can be made for mere fractions of pennies. The downside of that world is that copies can be made for mere fractions of pennies.
Playwright GB Shaw claimed that because the "reasonable man" adapts himself to his environment while the "unreasonable man" adapts his environment to suit himself, all progress must therefore be driven by the unreasonable man. But that assumes two finite ends of a continuum--a continuum that ignores the possibility of unreasonability shading into delusion.
Further back in time, the earliest tragedy tracked the ruin of the great, often precipitated by the hubris with which they met forces or events beyond their control. You'd think that an entertainment company would take that wisdom to heart. The warnings of Euripides and Sophocles ring true even today. But companies like Sony bear no resemblance to the travelling companies of players in centuries past. They're little more than accounting machines, slicing revenues into royalties and residuals.
But you're smart enough to actually listen to your security folks, am I right, Gentle Reader? Please tell me I'm right. Because as much as I do enjoy a good hacking post-mortem (in the same way some people enjoy a good murder mystery), I'd really rather not be rubbernecking at the hacking of someone I know. Thanks.
Monday, December 1, 2014
Two styles of product development
Whew. That was a close call. For a few minutes there, I thought that the house had also eaten James Webb Young's A Technique for Producing Ideas. (Mind you, I can't find the Penguin History of Canada when I need it, so the house still has some 'splainin' to do. But that's another story.)
I have a librarian friend who--unsurprisingly--organises her home bookshelves according to Library of Congress numbering. I'm not so professional m'self, but even so, my History shelves are parsed according to a system...albeit one which likely makes no sense to anyone else. And the delineation between "literature" vs. mere "fiction" admittedly tends to factor in book size to an inordinate degree.
But given my usual categorisation instincts, the fact that I didn't immediately think to look for Young's book in the "software development" neighbourhood is disgraceful, truth be told. Particularly as that's also where Strunk & White and a dot-com-vintage Chicago Manual of Style live. (Anyone who thinks that being a software developer starts and ends at the bytes is--in Web 2.0 parlance--"doin it rong." So sayeth my bookshelf: Your argument is invalid.)
A Technique for Producing Ideas is a slim slip of a book--almost a pamphlet on steroids, really. It dates from the advertising world of the 1940s--notably before "the tee-vee" became the most valued fixture in America's living room. But even a grab-your-belt-and-fly-by-the-seat-of-your-pants autodidact like Don Draper would have known it chapter-and-verse. And it's no less relevant today, when all we First World pixel-pushing proles (allegedly) need to do to hose the backwash of globalisation off our blue suede shoes is "innovate." (This is where I'd love to link to Rory Blyth's brilliant, scathing "Innovidiots" blog-post, but it looks like it's offline indefinitely.)
Absent Mr. Blyth's take on the subject, I think our biggest problem with the term "innovation" is its intimidating suggestion of the blank page. And I don't think I'm making a straw-man argument when I say that, popularly, innovation is too often conflated with creating something ex nihilo. Intellectually, you can look at the portfolio of the most valuable consumer products company on the planet (Apple): graphical user interfaces, MP3 players, smartphones, and tablet computers--and know that Woz/Jobs didn't invent a single one of them.
That insight doesn't necessarily help when you're staring into a looming abyss of enforced downtime--yay, holidays. It helps even less to remember that Sir Tim Berners-Lee had the first web server and browser talking HTTP to each other by Christmas Day, 1990. No pressure there... [grumble]
So to bring things down to the scale of a manageable metaphor, you mainly just need to decide whether you're ultimately the waffle-iron or the crock pot when it comes to making something.
Waffles have the advantage of being very specific--anyone who's been by the grocery store freezer case should have the basic idea down. But the parameters, to a certain extent, are relatively fixed: Starch--typically flour--for body, eggs for structure, some sort of leavening (typically baking soda/powder or yeast) for loft, and milk to make the batter pourable. Too much of one thing and you could have a weird-looking hockey-puck or (literally) a hot mess. Moreover, modern electric/electronic waffle irons typically impose limits on temperature.
Within those basic parameters, however, you can make some amazing waffles. (In my world, read "Dennis" for "you.") Making a "sponge" of yeast-leavened batter the night before, and only adding the eggs in the morning, for instance, makes for a revelation in texture. Likewise, eggs can be split into yolks for the main batter, while the whites are frothed and gently folded in afterwards. A touch of vanilla or almond extract? Yes, please. Topped with lingonberry syrup (because you live close enough to Newfoundland/Labrador that it's a staple in Maritime grocery stores)? Bring it.
Waffles are incremental innovation in a nutshell. Evolution, y'understand.
In contrast, there's the crock pot. True, milk and/or eggs probably won't be staples of most recipes. But apart from those, you have a lot of latitude...assuming you respect the laws of Physics. A crock pot will happily simmer organic vegetarian vegetable soup all day. A crock pot will just as happily caramelise cocktail wienies and bottled BBQ sauce into artery-clogging, potentially carcinogenic ambrosia. A crock pot doesn't judge.
In tonight's metaphor, that latitude is what pushes the crock-pot toward the "revolution" end of the invention spectrum.
I'm not particularly partial to either--in fact, I'm delighted when an idea that I consider commoditised is successfully re-invented/re-imagined. LCD monitors, LED light bulbs, thermostats, etc.
But whether you ultimately choose to make waffles or some slow-cooked goodness, the end-goal is the same. Sure, maybe the first few attempts you'll end up feeding to the dog or what-have-you. But ultimately, you have to muster the confidence to serve it to company. Because just as there is no art without an audience, there is no invention without an end-user.
Saturday, November 29, 2014
Silly Saturday, 2014.11.29: Nerd vs. Snob
Dennis & I bottled a kit's worth of red wine last weekend, a blend of Shiraz and Cabernet Sauvignon. Unlike (most) whites, reds typically need a little time to settle into their new digs. Our future dinner-companion was no exception. "The fruit and the tannins really haven't melded yet," I pronounced after a sampling sip, "Does saying that make me a wine snob?"
"Yes," pronounced Dennis. And I laughed, because I knew he was rattling my cage. (He has standing orders to shoot me if I ever become a wine-snob, and I know darned well that he'd be too sneaky to let me know what was coming.)
I will cop to being a wine nerd...or at least a wanna-be wine nerd--no question. But it wasn't until a bit later in the afternoon that the difference between wine snob and wine nerd really occurred to me. What's more, I think that my nerd-snob distinction pretty much applies to anything about which one can be a nerd or snob.
Simply, this: A nerd tries not to let their judgment interfere with learning; a snob tries not to let learning interfere with their judgment.
That's not to say that a snob won't keep up with the material; it's just that their grading-system is pretty well set for life. And it's also not to say that a nerd doesn't have standards. "Two-buck Chuck" is still plonk. And over ice? [insert uncontrollable twitching]
"Yes," pronounced Dennis. And I laughed, because I knew he was rattling my cage. (He has standing orders to shoot me if I ever become a wine-snob, and I know darned well that he'd be too sneaky to let me know what was coming.)
I will cop to being a wine nerd...or at least a wanna-be wine nerd--no question. But it wasn't until a bit later in the afternoon that the difference between wine snob and wine nerd really occurred to me. What's more, I think that my nerd-snob distinction pretty much applies to anything about which one can be a nerd or snob.
Simply, this: A nerd tries not to let their judgment interfere with learning; a snob tries not to let learning interfere with their judgment.
That's not to say that a snob won't keep up with the material; it's just that their grading-system is pretty well set for life. And it's also not to say that a nerd doesn't have standards. "Two-buck Chuck" is still plonk. And over ice? [insert uncontrollable twitching]
Wednesday, November 26, 2014
The flip-side of an old engineering adage
The description of software development as an "engineering" discipline is, to me, one of those "for lack of a better term" bits of taxonomy. Sure, there are marked similarities. In an increasingly networked world, it can truly be said that lives are riding on things working as designed--even when those "things" are merely electrical impulses shuttled across networks or stored as bits on silicon or magnetic platters.
There's another area where software engineers and all other flavours of engineers certainly do overlap. That's in the common adage, "'Better' is the enemy of 'done.'" In management, it's the mantra of those who have to keep an eye on cash-flow. Below management level, it's the mantra of people who have this filthy habit of wanting to spend time with their families and/or significant others.
Don't get me wrong: I'm totally about 40 hour workweeks and keeping the company solvent. I consider anything less an #epicfail on the part of management.
Yet, what rarely (if ever) is addressed is that the adage has an equal-and-opposite truth:
"Done" is the enemy of "better."
If you didn't instinctively grok the essence of that, I can pretty much guarantee that you will the first time you have to get Version 2.0 out the door. All those corners you had to cut? They're still as sharp as they ever were. All those values you hard-coded as 1:1 relationships because you didn't have time to populate association tables? Consider them diamond-hard-coded now. Yeah--have fun dynamiting those out.
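In PHP/MySQL terms, that particular corner-cut looks something like the sketch below (table, column, and connection names are all invented for illustration): Version 1.0 bakes the "relationship" into the code to ship on time; Version 2.0 has to dynamite it out and move it into the association table where it belonged all along.
    <?php
    $customerId = 1001;  // illustrative

    // Version 1.0: "every customer has exactly one region" -- hard-coded to ship on time.
    $customerRegion = array(1001 => 'NA', 1002 => 'EU');
    $region = $customerRegion[$customerId];

    // Version 2.0: the relationship finally lives in an association table, e.g.
    //   CREATE TABLE customer_region (
    //       customer_id INT NOT NULL,
    //       region_code CHAR(2) NOT NULL,
    //       PRIMARY KEY (customer_id, region_code)
    //   );
    // so a customer can belong to any number of regions. Connection details are placeholders.
    $pdo  = new PDO('mysql:host=localhost;dbname=example', 'app_user', 'app_pass');
    $stmt = $pdo->prepare('SELECT region_code FROM customer_region WHERE customer_id = :id');
    $stmt->execute(array(':id' => $customerId));
    $regions = $stmt->fetchAll(PDO::FETCH_COLUMN);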
Now, I would never say that those first-iteration shortcuts were ill-judged. After all, this is Version 1.0 we're talking about. One-point-oh is a unique and invariably rude & mercurial beastie. Version 2.0 is our attempt to domesticate it. Flea-dip. De-worming. The dreaded trip to the vet's office. If we play our cards right, it won't eat any (more) couch cushions. If we're very lucky, it won't lick its nethers in the middle of the living room...at least not while company's over.
Problem is--and oh, Friends and Brethren, I confess that I also have the stain of this sin upon my soul--too many times we don't take Version 2.0 as seriously as we do 1.0. First-generation products are castles in the air that have been successfully brought to earth. That's a massive and commendable achievement. But thinking of Version 2.0 purely in terms of "features we didn't have time for in Version 1.0" is not a forgivable sin for anyone who aspires to the title of "Software Engineer." After all, no sane engineer would dream of adding turrets and towers to a castle built on sand.
For programmers, the work authorisation for Version 2.0 is an opportunity to pay down some (technical) debt, not just wear the silver numbers off a brand-new debit card. And in actuality, I'm preaching to the choir for the vast majority of programmers. It's the folks who commission programmers to make new stuff for them that I'm hoping to convince here.
Clients: Your programmers saved you a wad of cash up-front when you weren't sure whether that wild-haired idea you had on the StairMaster would actually pan out. But its weekend of garage-tinkering in its boxer shorts is done; let's get this thing showered, shaved, and dressed for real work. Don't be surprised when that takes extra money and time. Whether you were aware of it or not, you're the co-signer on the afore-mentioned 1.0 debt.
That probably comes off as sounding brutal. But I can assure you that it's a mother's kiss in comparison to the sound of a potential customer clicking on a competitor's product instead.
Monday, November 24, 2014
When sorcery and software don't mix
It's been nearly a decade since I had to wear the "Sys. Admin." hat full-time, but apparently the karma that goes with that role hasn't entirely worn off. Today was the first time I realised that this can sometimes be a mixed blessing.
Let me back up for a bit and first define what I mean by "Sys. Admin. karma." Let's say you work in an office environment and your computer is, for lack of a better term, "being stupid." Maybe you've already rebooted, or maybe that would throw the proverbial monkey-wrench into your current workflow. Either way, you're hosed, and it's time to call in someone whose job it is to un-hose you.
Back in the Day(TM), in another country, in another industry, that would have been me...when I wasn't babysitting servers or refurbishing workstations for the new folks being shoehorned into a rapidly-expanding staff. Now, my office was tucked away from most of everyone else--probably because I shared it with four servers and, hoo-boy, were they loud. So by the time I'd crossed my floor to the stairwell and trotted over to the far end of the lower floor, the problem had a good chance of fixing itself. Memory/CPU usage had stopped spiking, a file lock had been relinquished, whatever.
Being Upper Midwesterners, my co-workers would typically apologise profusely for "bugging" me, usually after swearing up and down that the problem had been there just a minute ago, really-and-for-true.
That's Sys. Admin. karma. The phenomenon is not limited to I/T of course--as anyone who has had their car's disconcerting squeak/rattle disappear on the way to the mechanic can attest.
When I changed jobs back to developer, I was spoiled for several years by having The Sys. Admin. Who Walks on Water there. But for my own projects, particularly after going freelance, I'm pretty much on my own. So it was today, fresh off a status call with a client. We'd both noticed that there'd been no actionable traffic to/from his web app. That's weird for a Monday. But then again, it's a slow week in the U.S. due to the Thanksgiving holiday.
Or so I rationalised.
For a short while.
Inevitably, paranoia got the best of me, so I logged in to peek at the database. Sure enough, data was still being crunched; it's just that nothing had tripped the required threshold. So I emailed the client to let him know that, so far as I could see, everything was cool.
Not fifteen minutes later, the app. spit out a couple of emails indicating action items.
It was pure coincidence, of course. (No, really. Pinky-swear.) Yet the human mind could easily translate the juxtaposition of me telling my client that everything was cool and the sudden appearance of app.-generated emails into a cause-and-effect relationship.
Technically, that's synchronicity.
But--is that necessarily a bad thing? From an outside perspective, I only had to log in, barely poke around, and the inscrutable Server Gods blessed the client with a couple of emails. Magic! w00t! Five points for Hufflepuff!
Problem is, the root of magic is the audience seeing an action (or set of actions) result in something seemingly impossible...or at least counter-intuitive. In the absence of complete information about inner workings, folks will construct their own narrative. Professionally, the magician has two jobs: 1.) Conceal the actual process between the action(s) and results, which includes 2.) Preventing the audience from forming unwanted hypotheses about cause and effect.
But since the days when we stared into the darkness outside the firelight in hope that the darkness wasn't staring back at us, our species has mastered few skills quite like narrative-generation. (Which probably explains why statistics--more honoured in the misuse than the use--have a bad name.) Thus, one person's magician is another's charlatan--or, worse, practitioner of the Dark Arts.
In my case, my client could suspect that I quietly fixed some bug under the guise of "sanity-checking" that the app. hadn't stalled out. And, in the face of suspiciously close timing, I couldn't in fairness call that unreasonable.
Right now I'm trusting to nearly a year and a half's work with said client that he doesn't, in fact, suspect me of server-side sleight-of-hand. Mind you, I do still occasionally take joy in finding the magic in what I do for a living. But I know that I'll never have the marketing chops to peddle it. Then again, if I can earn that kind of trust from someone with a very different skill-set, that's a higher form of magic than anything I could coax from a compiler, no?
Friday, November 21, 2014
Frivolous Friday, 2014.11.21: Maslow's Hierarchy for programmers
                /\
               /  \
              /    \
             /      \
            /        \
           /          \
          /   Source   \
         /   Control    \
        ------------------
       /     Debugger     \
      ----------------------
     /       Runtime        \
    --------------------------
   /         Compiler         \
  ------------------------------
 /         Code Editor          \
----------------------------------
amirite?
- - - - -
* If you weren't required to take something like Psych. 101 as part of your GenEd. requirements, here's the original: http://www.simplypsychology.org/maslow.html
Wednesday, November 19, 2014
Working "unplugged"
So there's been a bunch of chatter in recent years about reducing distraction at work...at least when we're not being all collaborative-y, getting loopy on dry-erase-marker fumes and all that knowledge-pollen we're sharing while singing "Kumbaya" in some corporate war-room.
Turning off your cell phone, powering down the IM client, signing out of social media, even checking out of the digital Hotel California otherwise known as email--that's how I usually understand "unplugged" working.
But what would happen if we also pulled the plug on our workstations? (Assuming that they can run off batteries, of course--regular PCs don't take kindly to that sort of thing.) Human value-systems shift drastically when something previously taken for granted becomes scarce. I can't imagine that electrical current is any different.
I, working on this 2009-vintage Dell, would be lucky to see half the battery life of, say, a new-ish MacBook Air. But...would that make a difference in productivity? That's the interesting question.
Maybe, when we know that we need to go heads-down on some chunk of work, we should think about turning that battery indicator into an hourglass. (And, yes, I'm thinking about that scene from The Wizard of Oz.) Scarcity clarifies...and while not the mother of invention, it is frequently the midwife.
Monday, November 17, 2014
Yin and Yang, web style
I don't know whether there's a French equivalent, but in my little corner of l'Acadie, I've been living the web development equivalent of the English adage that "the cobbler's children have no shoes." Meaning that I have yet to put any content out on the business domain I've been hosting for over two and a half years.
That's finally changing. Anyone who knows my "business" Twitter account (bonaventuresoft) recognises the sailing ship avatar. It's a nod to the ships so closely entwined with New Brunswick's history. (I just want to go on record now as saying that NB has the coolest flag of any of the Canadian provinces/territories. Sorry, everybody else.)
But I digress.
To me, the sailing ship evokes the 18th century, and its Colonial-era aesthetic here on the East Coast, so I was trying to capture that look and feel as much as feasible for the eventual home of the bits that will make up www.bonaventuresoftware.ca. Obviously, I don't want to completely replicate the newspapers and almanacs of those times in every respect. Multi-column layouts and packed text are antithetical to the web browser experience. And--for cryin' in yer syllabub--keep that stupid "s" that looks like an "f" out of the 21st century.
Now, calligraphy is one of my nerderies; alas, my grasp of its fundamentals is only applicable up to about the year 1600 or so. But of the several favours I owe Tantramar Interactive, one is pointing me to the correct font-family for the job.
In Design is a Job, Mike Monteiro points out that the increasing sophistication of web browsers (and their sorta-kinda-maybe-lip-service-commitment to W3C standards) renews the demand for the skills of traditional print-oriented graphic artists. In particular, programmers who code solely for newer browsers now have fine-grained control over layout, typography, gradients, transparency, etc. And goodbye to "Any font you like, as long as it's Arial or Roman." (Woo-hoo!)
Which is great news for those of us who use the web day-in and day-out. For those of us who make things for the web, but do not have a graphics background, not so much. And thus much of my time on this project has been spent fighting small skirmishes with things like indentation. And tilting at windmills like aligning the Roman numerals used in a list.
(This is why I'm always happy to hand layout work off to web designers/developers who actually grok cascading stylesheets (CSS). And give them ample kudos for not only understanding the difference between box and inline flow, but also knowing where the sharp edges of each version of each browser are.)
But it's not all slogging on this project. A closer look at the Atom syndication of this blog handed me an unexpected gift, namely that it includes the labels (e.g. "Software Development," "Innovation," etc.) that I can optionally apply to any blog post. That gives me a way to automatically cross-post specific topics from this blog to the business site. (In a shocking development, potential clients may not actually base their purchasing decisions on a freelancer's ability to filk tunes or scour the internet for Elvis photos. Or even for their world-class sarcasm. Who knew?)
What makes that possible is that, while HTML tags (and their attributes) are used by web developers to format content, Atom feeds (like their RSS forebears) use tags to give meaning to content. (Yes, I know that, originally, HTML was intended to denote things like paragraphs, headings, tables, etc. It didn't take too long before that was honoured more in the breach than the observance. Now, with a little CSS, you can create lists that run horizontally rather than vertically.)
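(For the curious, the cross-posting trick really does boil down to a few lines of PHP. This is only an illustrative sketch--the feed URL and label below are placeholders, not my actual setup--but it shows the idea: Blogger hangs each post's labels off the entry as <category term="..."/> elements, so filtering is trivial.)

<?php
// Illustrative sketch only: placeholder feed URL and label.
// Fetch the blog's Atom feed and keep just the entries carrying a given label.
$atomNs  = 'http://www.w3.org/2005/Atom';
$feedUrl = 'http://example.blogspot.com/feeds/posts/default'; // placeholder address
$wanted  = 'Software Development';

$feed = simplexml_load_file($feedUrl);
if ($feed === false) {
    die("Could not fetch or parse the feed.\n");
}

foreach ($feed->children($atomNs)->entry as $entry) {
    $atom   = $entry->children($atomNs);
    $labels = array();
    foreach ($atom->category as $category) {
        // Each label rides along as a <category term="..."/> element.
        $labels[] = (string) $category->attributes()->term;
    }
    if (in_array($wanted, $labels, true)) {
        // In real life this would be formatted and pushed to the business site;
        // echoing the title is enough to show the filter works.
        echo (string) $atom->title, "\n";
    }
}
?>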
Thus, in the context of one very minor project, it's hard to ignore the two directions in which the web has been--and will continue to be--pulled. The left brain prefers XML content that is what it says it is (titles, publication dates, etc.). The right brain wants HTML5 that's pretty and friendly to any number of screen sizes.
And while I'm fairly disciplined about marking up my HTML to make it XML-like, I know by now that the promise of XHTML will never be realised. But maybe convergence isn't the point. Maybe it's a yin and yang thing...and maybe that's also fine. Because the software trade needs both left and right brains.
Friday, November 14, 2014
Frivolous Friday, 2014.11.14: Live-blogging my NADD* navel-gazing
In software development, there's a term known as "yak-shaving," which refers to all the time-consuming stuff you didn't budget for, but which has to be done before you can get down to the serious business of coding. Or maybe debugging. Scott Hanselman's definition is the most cited.
Today, my client and I are basically in evaluation mode for code that's been rolled out in pieces over the last few weeks. So far--knock on wood--nothing's gone kerblooey, so I set aside a "half-day" to get my house in order for upcoming development projects.
W00t! New dev. tools! It's Christmas morning!
... But first, I really should install all those system/software patches to which I've mostly been giving lip service (if that).
... But there are some kernel-level updates for the Debian laptop (my primary computer for development and emailing clients)--it'd be a good idea to back up email first. In two places, because this is mission-critical stuff. Okay, patched and turned off.
... But the brand-new Windows 7 installation is complaining that the standalone (not OEM) copy of Windows is not valid. Some updates fail. So does the online attempt to prove to MS that my copy came from Best Buy and not the back of a windowless van. So I dig out the DVDs and re-type the activation code...no thanks to a certain Office Cat #1 who shall remain nameless. Reboot, lather, rinse, reboot.
... Ubuntu workstation updated w/o any static. Good baby. [pats top of case]
... But the Windows 8 laptop is not prompting me to download and install updates, despite being off for well over a week. Okay, where are they hiding Windows Update in Windows 8? Found it. Go do something else while those downloads take their sweet time. Install. Reboot, which installs more patches. Okay, you do that, Windows 8. Sure, let's take Windows Defender out for a run while we're at it.
... So...time to turn this shiny Win 7 installation into a real software development box. Geany? Check. Mercurial? Check. NetBeans? Whoops--I need the JDK first. Which fails on the first download. Try again. Okay. Now NetBeans. Cool.
... But out of the box, a lot of Geany's default settings are the polar opposite of my preferences. Go away, message window and sidebar. 80-character line wrapping, please. Show white-space, and tab-indent at three spaces, thank you. That will be all for now, Windows 7.
... But last night I learned that Microsoft had released a new freebie version of Visual Studio. So let's figure out how to enable IIS on Windows 8 (no biggie). Microsoft wants you to create an account before they'll give you free stuff. Fair enough. But, no, Dennis doesn't have a login I can borrow. Dig out the password file to make sure I don't have one. Go to account page; try to come up with an ugly but memorable (to me) password.
... Knowing how much Microsoft will email me, I select my usual spam email. Sign in to said email account to activate account. Decide that as long as I'm at it, I should delete all the cron job emails that landed when I was testing an enhancement last week. Okay, now whack a bunch of other automated emails. Account activated.
... But there's very little point in IIS + Visual Studio w/o a database, so off to find the download for the freebie edition of SQL Server, particularly since I'm already logged into my Microsoft account.
... SQL Server download is lickety-split. Expected download time for Visual Studio climbs to over 24 hours. Kill that. Crud. I already closed the download window in the browser. Go find it again.
... Waitaminnit...isn't Firefox supposed to be dumping my history & cookies on every shutdown? Go check settings on that.
... Dang. I should have thought of the Visual Studio / SQL Server thing before I turned off the Windows 7 box. Turn that box back on. (Have I mentioned that I heart KVM switches?)
... Whoops, except that I heard Windows 7 make the startup beep, which means it rebooted itself, which requires me to (quickly) switch back to it on the KVM or it won't pick up the keyboard, mouse, and monitor and I'll have to crash it. Okay, as long as I'm here, I might as well log in with my shiny new Microsoft account and get my freebies. Some runaround from SQL Server...maybe I should have just sneakernetted the executables from Windows 8?
... Um, no, I will not wait six days for a 6.x GB file that you're streaming through a digital eyedropper. Yeah, sneakernet...something's not right.
... So....what, precisely, just happened after I installed what's supposed to be SQL Server 2014 Express? There's absolutely nothing in Program Files, and I can't even find its daemon in Services. Fine. Uninstall, and hopefully I didn't clutter up the registry too badly. Back over to Windows 8 to figure out what's going on here.
... Oh, for the love of Cthulhu, how many updates does Windows 7 need to install? Guess we're shutting down. Again.
... Time for dinner now. (Yeah, technically, I'm just sharpening my razor and not shaving the yak in this step.)
... Okay, back at it. Cool. Now, where were we? Right--figuring out what I actually installed when I thought I was installing SQL Server Express 2014. Ah. Got it. Don't take the default options. Let's try this again. No, you can't contact me at my business phone number, Microsoft. You're on the West Coast; you don't even know my time zone exists.
... Oooof...this is going to take awhile to download. Maybe this would be a good time to snag jQuery & jQuery Mobile, maybe even see what we can do about setting up for Apache Cordova development, since that looks like it will be in the cards shortly.
... jQuery Mobile was almost a no-brainer except for Windows freaking out about unzipping a file in the inetpub folder. Now download and install Node.js (crazy-simple) and Git for Windows (because Cordova uses both under the hood).
... Oh, there's a free eBook for Git? Groovy. Gimme summa that goodness. Fetch the tablet and cable and import the .EPUB into the Aldiko eReader. (Bonus: This is Windows -> Android. Ergo, simple little file copy.)
... Command line...npm install -g cordova. Oh, fun little retro touch of -/|\ spinner! Totally brings me back to the 90s. (Good times, the 90s...except for the part about graduating as a Liberal Arts major into that pre-dot-com recession. But, hey, I could eat half a pan of brownies and burn off the calories drinking a pot of coffee.)
... Oh! Looks like SQL Server Express (the bells-and-whistles version) is done downloading. Let's use the Win 7 box as the guinea-pig on this. Copy to USB drive...eject...copy to Win7. Mostly take the defaults while installing. Aaaaaannnnd wait...
... Meanwhile, back on Windows 8, let's snag MySQL and MySQL Workbench. Oh, Workbench can be installed alongside MySQL. Ossum. Let's do that.
... Or not. Visual Studio (still downloading...allegedly for another hour to go) has MySQL connectors, and MySQL knows this. It also wants Excel (not gonna happen) and Python (that we can do). Long story short, though, this isn't going to all happen tonight.
... Checking in with Windows 7, it's finished installing SQL Server Express 2014...plus a piece of SQL Server 2008. It's taking awhile to launch, which I put down to initialisation issues--and toggle back to Windows 8 to install Python.
... While Python is installing, peek in at Windows 7, and find that SQL Server Management Studio will at least launch--if slowly. Elect to install yet another round of updates (106 of them, as it turns out--I am not making this up) and shut that machine down for the night.
... The Visual Studio installer still--allegedly--has a half-hour to go.
And now it's nearly 11:00, and rebuilding a Raspberry Pi (which--my bad--I more or less rooted b/c I was misinformed about the permissions one needs to patch Raspbian) will have to wait until later this weekend. Determining how well MacOS will run in a VMWare instance on Ubuntu is looking like it might even have to wait until next weekend--assuming I can finagle a legit copy, of course. (Dirty secret: To download a copy from the Mac App Store, you have to use, well, a Mac. Open-source, commodity hardware hippies like m'self have to do a little horse-trading, y'understand...)
A small part of me desperately misses the SysAdmin Who Walks on Water. But the vast majority of me fully appreciates what an amazing time it is to be a developer here in the developed world. As I hope that the above (bit)stream-of-consciousness fully demonstrates, the only real problem is the embarrassment of riches one has at the other end of one's broadband connection.
And I would be remiss if I didn't credit the commercial software behemoths for what they contribute to the ecosystem. Microsoft is mentioned above...a lot...but Oracle--miraculously, despite every incentive--has yet to kill off MySQL. Java/NetBeans are still free-as-in-beer in 2014 (also thanks to Oracle's noblesse oblige). Apple has loosened the screws--a bit--on how one can generate the bits for an iOS app. One hopes they will eventually have no choice but to come back to ground in other respects--particularly if they don't stop treating Mac developers like Untermenschen.
- - - - -
* NADD is a term coined by (the oft-quoted) author Michael Lopp, and stands for "Nerd Attention Deficit Disorder." NADD stands in surreal contrast to the monomaniacal state of concentration we geeks are known to achieve when debugging or taking a firehose of interesting data straight to the brain.
Tuesday, November 11, 2014
A nerdy Remembrance
I decided last evening to break the Monday-Wednesday-Frivolous-Friday pattern of this blog to make a tech-related post relevant to Remembrance/Veterans Day. Arguably, drone warfare is the ne plus ultra (and probably the reductio ad absurdum besides) of the earliest military apps--namely, target acquisition. But having already covered that plus some origins of the wireless communication that also make drone strikes possible, I was casting about for a fresh twist on the intersection of military and technological history.
WWII buff husband (and kindred geek soul) to the rescue!
I want to make it absolutely, Waterford-crystal-clear that my intent tonight isn't to glorify military applications of technology. But when Dennis mentioned something called the Norden bomb-sight, I was intrigued. Because, after all, the whole point of the contraption was ultimately to drop fewer bombs overall by having them land on people who (arguably) "deserved" them...as opposed to collateral civilian damage. (Which, in turn, requires fewer missions in which "the good guys" will fly into harm's way.) For that to have a reasonable chance of happening, altitude, airspeed, the vector of the aircraft (relative to the target), the speed and vector of the wind (relative to the aircraft) all have to be taken into account. (Remember that the next time you're assigned quadratic equations by the dozen in Algebra. Also, please be grateful that you don't have to solve them while being shot at.)
What Dennis described over dinner frankly sounded like what you'd see nine months after a gyroscope, a telescope, and a slide-rule all woke up during a hazy weekend in Vegas. That's not too far off in some ways, though later incarnations of the device also plugged into the plane's autopilot and even controlled the bombs' release because its precision was thrown off by human reaction-times.
Not surprisingly, this was top-secret stuff, at least until toward the end of The Good War. Norden-made (and rival Sperry-made) bomb-sights cooled their gears in safes between flights, and were supposed to be destroyed in the event of impending capture--even at the cost of American lives. British allies offered to trade the Crown Jewels--meaning sonar technology, rather than the shiny gew-gaws currently on display in the Tower of London--for its use.
Ultimately, however, it was an evolutionary dead-end in technology. It was sold to the Nazis by spies, but never used by a Luftwaffe that preferred dive-bombing. American bombers eventually adopted the carpet-bombing tactics of their RAF counterparts. So why care? (Well, besides the fact that I'm kind of a sucker for analog computing...though I have yet to learn how to use a slide-rule. Bad me.) Alas, it's also a grand textbook-quality example of a technology's life-cycle.
- Usability issues? Check. Earlier versions of the device almost required an advanced degree in Mathematics...and pure math nerds could arguably be more useful at Bletchley Park or Nevada. (To its credit, such issues were addressed in future iterations.)
- Prima-donna egos? Check. Its eponymous developer, Carl Norden, had originally worked with his future rival Elmer Sperry, but the two geniuses had parted ways before the onset of the First World War.
- Over-hyped promises that didn't hold up under field conditions? Check. Jet-stream winds, cloud/smog cover, higher-than-anticipated altitudes (required to avoid detection and anti-aircraft fire) and a host of other issues put the lie to marketing claims of dropping bombs into "pickle-barrels," and users were forced to develop workarounds (literally) on-the-fly. (Worse, failures have been too often blamed on operator error. Oh-so not cool, yo.)
- Engineering vs. Management? Check. Mercifully, the Navy paired up Army Col. Theodore Barth as yin to Norden's yang. The two became not only a formidable combination but good friends.
- Politics? Check, check, and check. Army vs. Navy. U.S. vs. U.K. Not to mention Round II of Sperry vs. Norden, when the former was called on to take up the slack in the latter's ability to keep up with wartime demand.
- Prolonged obsolescence due to bureaucratic inertia? Check. When last sighted (pun intended), Norden bomb-sights were dropping reconnaissance equipment in Vietnam.
Then, too, as a programmer--particularly one married to a recovering manufacturing engineer--I flatter myself that I have some appreciation of the problems of scaling something new. Sometimes it seems like the real world is nothing but edge cases. At the same time, once it latches onto something better than the former status quo, it typically relinquishes it only after a bitter fight. I sympathise--really, I do.
Yet, if the Norden example is how old men behave when they send young people into possible death and mayhem (with PTSD, addiction, divorce, homelessness, neglect, labyrinthine bureaucracy, and who-knows-what evils to come), the least we can do for current soldiers and future veterans is to give them better old men (and nowadays, old women).
So do me a favour and keep your poppy handy for the next time you head to the polls, okay? For those of us who don't work directly for veterans/soldiers week-in and week-out (i.e., most of us), that's the only kind of "remembrance" that truly matters.
- - - - -
Bibliography:
- skylighters.com: http://www.skylighters.org/encyclopedia/norden.html
- Wikipedia: https://en.wikipedia.org/wiki/Norden_bombsight
- twinbeech.com: http://www.twinbeech.com/norden_bombsight.htm (Lots of photos + a contrarian view here.)
Monday, November 10, 2014
Software innovation, security, and the chain of plus ça change
I've been thinking of sending a client a copy of Geoffrey Moore's Crossing the Chasm to give him an insider's perspective on launching a new software offering. Whenever I do that sort of thing, though, I always re-read the book myself, in case it comes up in discussion. It's a fantastic book--don't get me wrong--but it's making me grind my teeth because my copy is the 2nd edition from 1998. That the company/product references are stale isn't so bad--c'mon, I'm a History grad. It's the feeling that I might be missing insights relevant to our broadband, mobile-driven, social media phase of the internet age.
Moore's non-tech references have sent me scurrying out to Wikipedia a few times so far. One of those references was a quote by Willie Sutton, a prolific--though gentlemanly--bank-robber of the mid 20th century. One of Sutton's nick-names was "the actor," because his preferred M.O. was impersonating people who would have a legitimate reason to be in the bank, jewelry store, etc. as a non-customer. Additionally, one of his prison escapes involved dressing as a guard. The true brazenness of that escape was in how, when he and his fellow escapees were caught in the glare of a searchlight as they were putting a ladder against the wall, Sutton shouted, "It's okay!" and the gang was allowed on its merry way.
Sutton caught my interest not because of his apocryphal quote, but because of his later career as a security consultant, writer, and general casher-in on his notoriety. Toward the end of his life, he was even featured in advertisements for photo-IDed MasterCards, which (tongue-in-cheek) advised bank customers to "tell them Willie sent you."
It was impossible to miss the parallels with the only slightly less flamboyant Kevin Mitnick, over whom the media and law enforcement of the Clinton Administration worked themselves into a hysterical lather*.
Mitnick repeatedly stresses that his "successes" were more due to social engineering than software engineering. To quote an interview with CNN:
"A company can spend hundreds of thousands of dollars on firewalls, intrusion detection systems and encryption and other security technologies, but if an attacker can call one trusted person within the company, and that person complies, and if the attacker gets in, then all that money spent on technology is essentially wasted. It's essentially meaningless."
In other words, the art of impersonation strikes again. Also like Sutton, Mitnick's career after "going straight" (in the parlance of gangster movies) involves hiring out his expertise to people who want to defend themselves against people just like him. And, of course, writing books.
Which--in the cluttered curiosity shop I fondly call a memory--calls to mind parallels even further afield in time and space. My Gentle Reader will not be shocked to learn that the "father of modern criminology" and history's first private detective was a reformed criminal. (Also unsurprising: Vidocq's appeal for storytellers and novelists, which presumably didn't dent the sales of his own ghost-written autobiography.)
Then, too, in this part of Maritimes Canada, I only have to drive a few hours to view the remains of 17th- and 18th-century star forts in various states of preservation/restoration. The star fort has its origins in the 15th century (as a response to the innovation of cannon). But the example of Fort Anne in Annapolis Royal, Nova Scotia brings to memory the name of the Marquis de Vauban. Vauban's career under Louis XIV was doubtless capped by his gig as Marshal of France. But that career was made as an expert in both breaking and defending such fortifications. (In other words, he was a one-man arms race. I'm sort of shocked that he didn't write an autobiography, too.)
Doubtless, My Lord de Vauban would strongly object to being compared with the above rogues, however they might have redeemed themselves to society. Yet the parallel is undeniably apt, even for an age defended by earthen walls rather than firewalls. The best defender is an accomplished (though hopefully reformed) offender, it seems.
Long--and meandering--story short, I'm probably fretting needlessly about missing any new insights on ideas that have been relevant since 1990 (when Crossing the Chasm was first published). As we've seen, very rarely is there anything truly new under the proverbial sun. But, hey, as long as I'm already making a trip to the bookstore anyway...
- - - - -
* "While in Federal custody, authorities even placed Mitnick in solitary confinement; reportedly, he was deemed so dangerous that if allowed access to a telephone he could start a nuclear war by just whistling into it." - Forbes. 2013.04.11
Friday, November 7, 2014
Frivolous Friday, 2014.11.07: What is your computer age?
It's probably a textbook case of priming, but after a Facebook exchange with pal Larry earlier this week, the "What is your real age?" (cough!) "quiz" (cough!) seemed to pop out of my time-line like baby Surinam sea toads hatching from their Mom's back.
Larry was taking exception to the fact that the cringe-worthy use of "literally" by people who really mean "figuratively" is receiving official recognition. Doubtless, the Romans seeing Alaric's Visigoths on the horizon felt much the same.
The English major who inhabits those corners of my soul still perfumed by old books and fresh ink is not unduly concerned. After all, this is the natural order of things. The person who lives where two languages blend smiles and agrees. The History major squawks, "Just be thankful you're statistically likely to live long enough to witness it!"
My inner I/T Geek just rolls her eyes and thinks, "Oh, honey, please."
I'm already feeling d'un certain âge as it is. Granted, I've thus far been spared the horror of hearing my all-time favourite tune from high-school re-booted by a dreckish pop-star/boy-band who won't be around long enough to be filked by Weird Al. But it's bad enough hearing covers of crap that should have stayed buried alongside the first Reagan Administration. (Ditto movies. I mean, seriously-and-for-realz-y'all, was Footloose not actually bad enough the first time around???)
But compared to measuring age by computer advances, that pales to #FFFFFF. Go back to the line in Apollo 13, where Tom Hanks' character talks of "computers that only take up one room." I was born when they were still multi-room. Gordon Moore had already made what must have seemed like pie-in-the-sky predictions about the computing capacity of the future, at least to the specialists in the know.
But advances in miniaturisation meant that permanent storage (a.k.a. hard drives) had actually been removable for several years. What's more, you could actually own them instead of renting them from IBM, who needed a cargo plane to ship 5 megabytes to you a decade or so earlier.
My step-sisters had "Pong" in the late 70s, but it wasn't until the (very) early 1980s when the middle school computer lab teacher humoured me by actually letting me store my (admittedly) pathetic attempt at re-creating "Space Invaders" onto cassette tape. Our TRS-80s and TRS-80 IIIs didn't actually have removable storage. For normal programming assignments, you printed out your BASIC program and its output in 9-pin dot-matrix on 15cm wide silvery paper (that picked up fingerprints like nobody's business), stapled it to your flow-chart, and turned off the computer (losing your program for good).
By high school, we had the mercy of Apple IIes and (360 KB) 5.25" floppy drives--i.e. no retyping programs from scratch if you screwed up. And 5.25" floppies it remained through college--CDs were what music came on...if you weren't still copying LPs onto cassette for your Walkman. I carried two of those floppies in my backpack. One was the DOS boot disk, and the other the disk that held both my word processor (PC-Write) and my papers. Later, I schlepped three whole 5.25" floppies. That was after PC-Write freaked out and somehow sucked my Senior project into its Help section. (True story. The tech. in the University lab could only say, "Wow, I've never seen that happen before," and my BFF the MIS major quizzically enquired, "Didn't you have a backup?" and I looked at her like, "What's a backup?" And my boyfriend spent part of Spring Break helping me type it all back in. I married that guy--small wonder, hey?)
Nowadays, I still carry a backpack. It's green, not stressed denim. It's slightly smaller, because I'm not toting tombstone-weight textbooks. Yet it carries the equivalent of over 186,000 5.25" floppy disks. (In case you're wondering, that thumb drive is actually a step-sibling to the one that lives in my purse. So, yes, I have actually learned my lessons about having a backup. Go, me. [eyeroll]) And that's not counting what's archived to cloud drives at speeds that would have been pure science fiction to the high school modems with their cradles for telephone receivers. (Or, for that matter, even in the days when we were using AOL CDs for coffee-coasters.)
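(For the curious: that figure assumes a 64 GB thumb drive, those same 360 KB floppies, and binary units throughout--my assumptions for the sake of the arithmetic, since I didn't name the drive's capacity above. The back-of-the-envelope Python looks like this:)

# Rough check on the "over 186,000 floppies" figure.
# Assumptions: a 64 GB thumb drive, 360 KB floppies, and binary units
# throughout (1 KB = 1024 bytes, 1 GB = 1024**3 bytes).
floppy_bytes = 360 * 1024
thumb_drive_bytes = 64 * 1024**3
print(thumb_drive_bytes // floppy_bytes)  # 186413 floppies and change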
So, while I was born into a time that left the first quarter or so of my life oblivious to personal computing, that kind of obliviousness is now pretty much impossible for folks in the developed world...to say nothing of the constant churn computing introduces into daily life. And, when you spend your workdays mostly under the hood, setting your clock to the pace of hardware, software, and networking evolution is a sure way to feel ancient in a hot hurry. (And, for cryin' in yer Sleeman's, don't even think about measuring time by the lifespans of technology companies.)
Fortunately, for anyone who considers I/T a craft or a calling, rather than a ticket to a middle-class lifestyle, it's a wellspring of endless possibility. Perhaps even a fountain of youth for those who opt to drink deeply enough.
Thursday, November 6, 2014
Treating attention as a resource
In "The Reichenbach Fall" episode of the BBC series Sherlock, Moriarty trash-talks, "In the kingdom of locks, the man with the key is King. And, honey, you should see me in a crown!"
In the digital kingdom, locks mostly come in three forms: firewalls, encryption algorithms, and of course username/password combinations. Keeping the baddies' fingerprints off our email addresses, credit card numbers, nude selfies, what-have-you is the point.
But, genius that he was (or is?--we won't know until 2016), Moriarty didn't mention another species of baddie to whom locks were also immaterial. Namely the counterfeiter. Throwing back to Sherlock Holmes, this time the original incarnation: "...the counterfeiter stands in a class by himself as a public danger." Why? Counterfeiting is really a double-crime because, if undetected, it ultimately debases the value of the real deal.
You can make the case that spam, clickbait, and SEO shenanigans fall into this category, particularly when they're done convincingly. And they will only become better at pick-pocketing our attention. Doubtless, there are already children in pre-school born with immunities to NewsMax skeeziness coded into their DNA, so I shudder to think of the evolutionary counter-strike coming soon to a browser window near you...me...all of us. ;~)
We demand an internet with locks in place to prevent our money (and any personal brand we might cultivate online) from being stolen. We resent the bandwidth siphoned off our data plans by spam & ads. "You wastin' my minutes" denotes the waste of both money and time. But I find it absolutely bizarre that we do not guard our attention with the same jealousy we apply to our money and time. To a degree, it's understandable. Multi-tasking is a prized skill--has been at least since Julius Caesar built his reputation for dictating four letters at once...all while other people were yakking at him, no less.
The conventional wisdom is that a knowledge economy is our future. (Personally, I'm not buying it, but that's a whole 'nuther blog post for a whole 'nuther time.) If you subscribe to that notion, though, you pretty much have to trade the adage that "time is money" for the more accurate "attention is money." In an economy powered by three shifts of people standing in front of machinery, attention took care of itself. Lack of attention on the part of the worker generally ended in maiming or death and possibly a starving family afterward. That was a world of time-clocks and piece-rates.
Powering this more nebulous economy of knowledge-y thingamajigs, however, is not a series of discrete steps performed by interchangeable labour. There can--and should--be a process in place, certainly. Metrics, too, one hopes. But the emphasis is on collaboration, not a waterfall assembly-line. (Hence the dreaded open-office layout.) And, at least in theory, a key differentiator will be the quality of worker--not only as individual talent, but also how well their atom bonds with other atoms in a team's molecule.
But, at some point, all that cross-pollination is supposed to gestate into some innovative-y, disruptive-y, paradigm-shift-y thing that will make the company the next Google or Apple. And that absolutely, non-negotiably, requires focus--a.k.a. uninterrupted attention. You know how those last couple hours on a Friday (when the office has mostly cleared out) can be more productive than the entire rest of the week? Behold, the power of attention.
Yet here we are in 2014, when the most politically important meetings are too often the useless ones. In 2014, we still believe that Silicon Valley is "innovating" when apps like Snapchat are handsomely rewarded for dumping still more navel-gazing bits into the internet. (And somebody, for the love of Cthulhu, pretty-please tell me that "Yo" is dead. Please?) In 2014, neither Twitter nor (especially) Facebook has added a "Snopes this before you look like a moron, m'kay?" button. (Okay, maybe that last one's wishful thinking, and sites like Snopes and Politifact might not appreciate the surge in traffic anyway.)
Let's pretend for a minute that the knowledge economy isn't just another hand-wavy term made up by MBAs to make us believe that there's light at the end of the offshoring tunnel. If that's actually true, then we need to treat attention--ours and others'--as the coin of the realm. Prioritise the technologies that can boost signal and/or cancel out noise.
Given my druthers, I'd like to see the counterfeiters put out of business for good--specifically the greater good of the internet. It would free up resources--including mine--to solve more pressing problems. For the time being, however, here in the kingdom of firehoses, the woman with the sieve is Queen.
Monday, November 3, 2014
Of grey hairs and last leaves
Last week, I was teaching some old code new--or at least more sophisticated--tricks. At one point, I tripped over a particular tactic that I thought I'd more or less "outgrown" by the time the code was originally written. Every programmer who's been at a job long enough to replace the batteries in their cordless mouse knows this feeling.
Now, there was absolutely nothing wrong with the "old" code from either a functional or performance standpoint; it basically boiled down to style. Alas, for all the wrong reasons--what future maintainers of this code might think heading the list--I "modernised" the syntax. That's bad for two distinct reasons:
1.) I'm basically lying to myself about myself. I could write a whole blog about how not cringing at your old code is a sign that you've stagnated. Doubtless, that's been done many times over. So I won't. (Though, for all I know, I already have. I'm a little hazy on most of this blog's early history, truth be told.)
2.) That change had to be tested to make sure I didn't introduce any errors or other unintended side effects into the code. That wastes budget and time. (A made-up sketch of the kind of swap I mean follows below.)
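To give the flavour (an invented Python example, not the actual client code), here's roughly the sort of style-only swap in question, along with the check I then owed the project to prove that nothing had changed:

# Hypothetical stand-in for a style-only "modernisation."
def total_owing_old(invoices):
    # How I wrote it back then: a plain accumulator loop.
    total = 0
    for invoice in invoices:
        if not invoice["paid"]:
            total += invoice["amount"]
    return total

def total_owing_new(invoices):
    # How I'd reflexively write it today: a generator expression.
    return sum(inv["amount"] for inv in invoices if not inv["paid"])

# Same inputs had better produce the same answer...which still has to be
# verified, on somebody's budget.
sample = [{"amount": 40, "paid": False}, {"amount": 25, "paid": True}]
assert total_owing_old(sample) == total_owing_new(sample) == 40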
Don't get me wrong: I believe in making code as easy as possible to skim. That's no more than professional courtesy to future maintainers of that code...who will probably look an awful lot like a slightly greyer version of me. Who, by the bye, should be old enough to appreciate what a little salt-and-pepper and laugh-lines around the eyes can add. (Case in point--George Clooney. A decade ago, I rolled my eyes at all the swooning. Now? Hawt.) In code, it's a signal that this isn't greenfield work; that's a valuable insight.
One more thought: Trying to disguise the age of a code-base isn't doing newer programmers any favours, either. The digital equivalent of an As-I-am-now-so-shall-you-be memento mori is good for perspective. In our trade, that lesson's better internalised sooner than later.
And if I should live to be
The last leaf upon the tree
In the spring,
Let them smile, as I do now,
At the old forsaken bough
Where I cling.
- Oliver Wendell Holmes, Sr., The Last Leaf
Friday, October 31, 2014
Frivolous Friday, 2014.10.31: The Programmer's In-House of Horrors
Now that software development has become more mainstream--yea, even glamourised--the stereotype of the introverted nerd in the remotest office-corner typing cryptic incantations to invoke the blessings of the Server Gods has finally been put to rest. (And good shuttance!)
Mind you, I'm certainly not claiming that programmers have lost all the personality quirks that come with the trade. In fact, after over two decades in the workforce, I can't help but think that a course in Abnormal Psychology should be mandatory for every college graduate. For MIS/CS majors, this should probably be a minor. But it's never going to happen, of course. So for those still in programming school, here's the Rogue's Gallery that just might be lurking in a cubicle near you at any given point in your career.
"Charlie Victor" - The nickname comes from the CTRL-C and CTRL-V keyboard shortcuts for cutting and pasting. If you're finding code that is all over the map style-wise, I strongly recommend Googling a bunch of snippets. Don't be surprised to find StackOverflow, W3CSchools, or possibly 4GuysFromRolla (depending on the language) at the top of the list. In that case, you're not working with a real software developer; you're working with Red Green.
The Magpie - Can't stay away from shiny stuff. Which is fantastic if they're scratching their own itches--more power to them, then. But for a software shop with a legacy code-base, the platforms, languages, & APIs are typically chosen on the "second mouse gets the cheese" principle. Rightly so, I might add. The worst part of working with the magpie, however, is by the time everyone else on the team is up to speed with the new hotness, they've already moved on to the new new hotness.
The Baby Duck - For all practical purposes, the antithesis of the Magpie. Covered elsewhere.
The Cowboy - When I worked in more of a startup scenario, I once had a co-worker directly editing files on the beta server while I was demo-ing them to a client. 'Nuff said. (He's actually a pretty decent guy; it was more a product of us flying by the seat of our threadbare pants for years on end. That's a management issue more than anything.)
The Warrior-Priest(ess) - I owe this spot-on analogy to a client who was also the loudest critic of my team's work. He was referring in particular to the UNIX "warrior-culture, where one must strive and suffer for one's knowledge" (verbatim quote). This was pre-Ubuntu, pre-ServerFault, mind you, so he totally had a point. Alas, it's not only UNIX. The Warrior doesn't care what you learned; they're only interested in how many scars it left. (It's a frighteningly specific incarnation of the sunk cost fallacy. You, on the other hand, are sane enough to only do that sort of thing with, like, your house or car.)
The Firehose - This is the polar opposite of the Warrior. Ask a simple question, be digitally water-boarded with links to Wikipedia articles, YouTube how-tos, tutorial blog-posts, whatever. Don't be surprised if you climb back out of that rabbit-hole as clueless as you were falling in. (Strangely enough, I once worked with someone who could toggle between Warrior and Firehose any given day.)
The Eternal Prototyper - Real world constraints? Never heard of 'em. Oh, you have to support varying levels of hierarchy in the live version of this navigation widget? Sorry-not-sorry--they only hard-coded two. But boy, did it ever look snappy in Safari on their Mac. Sucks to be you watching it blow chunks in IE8. But, hey, they're the bleeding-edge genius and you're just the bit-twiddling code-monkey, right?
The Goth - Like the Rolling Stones, this species of coder wants it all painted black. At least the screen of their IDE, anyway. Outside of I/T, it probably looks like a terror of the space-bar and "Enter" key...maybe even the "Tab" key. To other coders, symptoms of the pathology include horrors like single-character variable names--all declared inline, nat'cherly. Also? Transcontinental method-chaining. Nested callbacks with more layers than a matryoshka doll. Code so dense you need shades and a lead hoodie for the Hawking Radiation. (A wee invented specimen follows after this gallery.)
Pollyanna - "Validating input is just time that could be spent adding new features. Hackers would never bother with our small-potatoes website. Besides, our users are all internal--they know what they're doing, right?"
(Everyone Else: Twitches uncontrollably whilst donning tin-foil hat.)
The Road-Tripping Parent - Remember "We'll get there when we get there!"? Admittedly, there may be a time and a place for this kind of push-back. Typically, it's during the triage that happens after someone feeds that cute, innocuous little feature request after midnight and it cannonballs into the swimming pool. That being said, having a more accurate sense of status than a Windows progress bar is a core function of a developer's job. And, as much as I'm most certainly not a card-carrying member of the Steve Jobs Fan Club (Team Woz, baby!), he nailed it: "Real artists ship."
Captain Flypaper - You know the "Let's throw it against the wall and see if it sticks" schtick? Yeah, well, some programmers apparently work in flypaper-upholstered cubicles, because they expect every bit of code they write to stick on the first throw. (Mercifully, unit-testing + automated builds are becoming the norm, which should either push these folks into a more disciplined workflow...or another career.)
The Helicopter Parent - Unless the individual in question is a natural-born passive-aggressive control-freak, this coder is largely the product of working with some or all of the above. And when they're the senior geek or team lead who's expected to save the Earth week-in-and-week-out, it's difficult not to sympathise. Until, of course, you're on the receiving end of, "[sigh]...Well, that's not how I would do it..." Or you find that rewrites of your code have been quietly checked into the VCS. Again, this sort of thing typically boils down to an original sin committed at management levels. But it's still a heckuva morale-buster for any coder within its blast-radius.
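As promised back at the Goth's entry, here's a wee specimen of the breed--entirely invented, and in Python only because it's handy--next to the version the rest of us would rather inherit:

# Exhibit A: one line, one-letter names, no air to breathe.
f = lambda d: {k: sum(x for x in v if x > 0) for k, v in d.items() if v}

# Exhibit B: the same logic, written for humans who enjoy daylight.
def positive_totals_by_key(data):
    """Sum the positive values for each non-empty list in the mapping."""
    totals = {}
    for key, values in data.items():
        if not values:
            continue
        totals[key] = sum(value for value in values if value > 0)
    return totals

# Both produce the same result; only one can be skimmed at 17:00 on a Friday.
sample = {"a": [1, -2, 3], "b": []}
assert f(sample) == positive_totals_by_key(sample) == {"a": 4}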
Before wishing everyone a safe remainder to their Hallowe'en, I'd like to give credit for the idea of nick-naming programmer personality quirks to Michael Lopp, who runs the Rands in Repose website. (Sample: "Free Electron" + the comments.) If you don't feel like wading through years and years of blogging on the subject of working with geeks, you can save yourself a bunch of time by just buying the two books he's written. They're laugh-out-loud funny, and I re-read them every couple of years for grins.
Wednesday, October 29, 2014
Something I didn't expect to learn at Programmer School
There are any number of good things you can say about attending a small Programmer School like the one that made me a professional geek. Alas, one of them was not the civic planning. Specifically, the fact that the school library and computer lab were situated next to the common area (a.k.a. the cafeteria)...directly across from the day care.
Now, I've never needed help with being distracted. (Squirrel!) So I wasn't too thrilled with the arrangement. I found little sympathy, however, when I grumbled to one of my professors, a father of three: "Eh. I figure if they're screaming, they're still alive," quoth he.
Sigh. Nobody understands me. Except maybe squirrels.
But I can admit that my prof. had a point, at least as it relates to project management. As both an employee and a freelancer, I've never been known to sit on problems that come up once our tidy project design meets up with messy reality. (Although I normally try to have at least one workaround in my back pocket before I actually raise the red flag.)
After a couple of hard lessons, I've also learned not to drop off the radar even when the project is hitting its milestones ahead of schedule. Once upon a time, I considered weekly status reports a sign that my boss was a paranoid control-freak who didn't trust me to be the professional they were paying me to be.
As a freelancer, however, I've come to the opposite view. Someone who isn't interested in status "because you're the one who understands all that technical stuff" is a red flag. Because if you don't want to be bothered with good news, what happens if there's any bad news to handle? Software, like any other (designed) product, is nothing more than the sum of the decisions made between the initial brainstorm and the final bytes. Not all of those decisions can be made in the heady optimism of the kick-off meeting. And some of those decisions could even be mid-project course-corrections.
A potential client who expects me to work in a vacuum and deliver exactly what s/he wanted makes me nervous. But the flip side is that a software developer who expects to work that way should make you (as the potential client) more nervous still. In a freelancer, that behaviour is symptomatic of someone afraid of criticism, who might just let the clock (and budget) run out until your decision boils down to take-it-or-leave it.
Look. I know we're all busy. But everyone on this road-trip is responsible for making sure we all arrive where we want to be, even when they're not technically driving. Road signs matter. Detour signs, in particular, do not exist to be ignored. Once in a great while, we may even have to pull over for a minute and dig out the map and compass when we're not where we expect to be. In the long run, we'll save time and gas. And, unlike the Blues Brothers, our road trip won't end up in the hoosegow.
Monday, October 27, 2014
Change for time
For a couple of Frivolous Fridays, I've riffed on the subject "If geeks ran the world." It was meant to be nerdy, goofy fun, with maybe some wistful wishful thinking thrown in. Overall, I like to think that the world would be a better place: More logical yet more experimental, and probably more caffeinated.
But there's one point about which I'm serious. Like, global thermonuclear Armageddon serious. And that would be in how we (meaning as a planet) would learn to deal with dates and times under a geek regime, particularly when the majority of those geeks are from I/T.
At some point in your programming career--hopefully earlier than later--you realise that working with dates and times will always suck. No matter how good the language or framework or API you work with, the very best you can expect is that it will be less painful. Not because there's less inherent suckage, but because the date-time APIs provide a topical anesthetic to numb you enough to finish the job. Not unlike popping the first-string quarterback full of cortisone after a brutal sack so he can finish the game.
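A tiny, invented Python illustration of the species of papercut I mean: mix a timestamp that knows its timezone with one that doesn't, and the standard library--quite sensibly--refuses even to compare them.

from datetime import datetime, timezone

naive = datetime(2014, 11, 2, 1, 30)                       # no timezone attached
aware = datetime(2014, 11, 2, 1, 30, tzinfo=timezone.utc)  # pinned to UTC

try:
    naive < aware
except TypeError as err:
    # "can't compare offset-naive and offset-aware datetimes"
    print("The anesthetic only goes so far:", err)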
Boiled down to its essence, the concept of time originally revolved around the moment when the sun stood highest in the sky. Dates, of course, were based on the length of day relative to the length of not-day.
Dates, at least in European history, have been an occasional pain. If you're morbidly curious, look up "Julian Calendar" and "Gregorian Calendar." Basically, from 1582 to 1918, the calendar date generally depended on who was in power in your country/territory during the late 16th century and, more importantly, how they felt about the Pope. (Exception: France tried decimalising the calendar during the Revolution...and again for a couple of weeks in 1871. I wish I were making that up.)
Which, cross-border contracts aside, wasn't much of an issue for most folks living in an agrarian economy. At least not until the mid-19th century, when telegraph lines were strung across whole continents...and then between them. As we all appreciate, the world has been shrinking ever since.
The study of timepieces from the sundial to the atomic clock is a fun bit of nerdery, I'll grant you. (For instance, I hang an aquitaine off my belt at SCA events...and keep the wristwatch discreetly tucked under poofy Renaissance sleeves.)
You can accuse me--perhaps rightly--of too much optimism in thinking that our species will manage to stop squabbling and money-grubbing and gawping at glowing rectangular screens long enough to venture permanently into other regions of the galaxy. If and when that happens, the (occasionally wobbly) spinning of a water-logged, gas-shrouded rock in its decaying orbit around a flaming gas-ball will cease to be relevant.
But for our immediate purposes, I don't see any practical reason why the earth can't all be on the same time, all the time. Let's say that GMT/UTC/Zulu time is the only time zone. (Yeah, I know--that means that the British can still pretend that they're the centre of the world. But it's a trade-off I'm willing to make, however much it riles my American-born-and-raised sensibilities...)
The upshot is that you break for lunch at eight a.m. in NYC and five a.m. in L.A.--is that legitimately a big deal? I'd argue not. No more having to hope that you and your client both made the same mental adjustments when you agreed on a meeting time. For programmers and their friends who keep the servers running, no having to convert everything to UTC to make stuff apples-to-apples. And, best of all, good shuttance to the pointless nuisance that is Daylight Savings Time, while we're at it.
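To make the "apples-to-apples" chore concrete (an invented Python sketch, with meeting times I just made up), this is the normalising busywork a one-zone planet would simply delete:

from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+

# Two offices jot down the same meeting in their own local time.
new_york = datetime(2014, 10, 27, 12, 0, tzinfo=ZoneInfo("America/New_York"))
london = datetime(2014, 10, 27, 16, 0, tzinfo=ZoneInfo("Europe/London"))

# Normalise both to UTC before daring to compare them.
print(new_york.astimezone(timezone.utc))  # 2014-10-27 16:00:00+00:00
print(london.astimezone(timezone.utc))    # 2014-10-27 16:00:00+00:00
print(new_york == london)                 # True -- same instant, different wall clocks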
'Course, the arm-twisting politics of making that happen will probably be even more cantankerous than the religious grudges that gave 16th- to 18th-century European history a lot of its uglier moments. Alexander Pope wrote "'Tis with our judgements as with our watches; none go just alike, yet each believes his own." Now, identical judgements can be a terrifying thing (think mob rule) and I'm not holding out for it in a world of billions anyway. But our watches going just alike would make life easier in any number of ways. And not just for us programmers who have to write code for a 24-timezone world.