Thursday, October 22, 2015

Quality isn't just another value-add

Previously, I mentioned that the Q-and-A period at this past Tuesday's Moncton User Group meeting ("Security Enterprise Architecture for Developers") basically resulted in two epiphanies for me.  The first, and more junior, one I riffed on during the previous entry.

My question to Jamie Rees had to do with "selling" security's value to your client as part of the application development process.  (Because we all know we should make apps. secure from the ground up, right?  But, then, we also know that we're supposed to floss every night, too.  And we all know how that plays out for most people.  Your faithful blogger included.)

I was thinking of I/T security in terms of risk management.  To wit:  A data breach costs money.  If the data relates to financial information (credit card numbers, Social Security / Social Insurance numbers, bank account numbers), the company whose data was leaked is typically on the hook for years of credit monitoring for each person affected.

Then there are the lump-sum costs.  Things like the hit to customer goodwill (which sounds really squishy, but there are accountants who specialise in quantifying that in hard cash).  Finding and patching the weaknesses in the system does not come cheap either.  Depending on the industry, third-party audits might be required.  And if the security lapses were super-egregious, heads will roll, which entails (at a minimum) the costs of hiring and training.

So I figured that this could be gelled down to a simple formula:

Average cost per breach * Likelihood of breach per year = Annual risk

If that "annual risk" (quantified in dollars) is greater than the budgeted amount for security in the Statement of Work, it should be a no-brainer, right? 

Jamie Rees had a few thoughts and suggestions, including the nugget that in security more than anything else, you have to protect yourself from entropy.  Because waiting until a crisis to fix a hole means that you will focus on that crisis alone.  But once that energy's expended, organisational fatigue will guarantee that there will be few (if any) resources spent on proactively preventing the next crisis.  (Sound familiar?)

But as I was digesting this all down for my notes, someone else raised a hand and asked the question as it should have been phrased in the first place:  "How do I sell my clients on security without selling fear?"  And, wham!  Synapses linked up, proverbial light bulbs went on.  (For all I know, the heavens opened to the sound of angel-choirs.  But Suite 201 of the Venn Centre is really, really well-soundproofed, so don't quote me on that.)

For the record, Mr. Rees's answer boiled down to mapping security to the project goals.  (Like, I might add, y'do for everything.  We're All About the goals, not the features here.) 

But what hit me was that security, really, is just another facet of Quality Assurance.  A very specific facet, it's true--and one perhaps almost large enough to overshadow its general category.

But the thing that Quality Assurance has proven over and over since Dr. Deming pioneered the discipline is that quality ultimately pays for itself.  Namely, because focusing on quality forces you to take a hard look at your organisation and its processes.  A relentless focus on quality allows far less room for the politics of personality--which includes the always-regrettable "rock star" culture.  And it has no mercy for the "But we've always done it that way" argument.

So, when pitching my services to future clients, you can bet that I will be pointing out how developing an application for their business buys them process consulting from an unbiased 3rd party as part of the package.  And all for the low, low cost of higher quality. :~)

Wednesday, October 21, 2015

When questions >= answers

So October's meeting of the Moncton User Group (@monctonug) was a bit different from the usual classroom-esque schtick.  Granted, the presenter gave a prepared spiel, followed by a Q&A period.   But, thanks to technical difficulties (i.e. the wrong laptop connectors), there was no PowerPoint.  Which can be a good thing, and in this case drove the content more into the realm of "war stories."

But I had two epiphanies, one small, and another not-so-wee.  Because I need to mop up some things before decamping for a meeting, I'm going to just focus on the wee one.

Anyone who's ever attended a conference (other than to collect tchotchkes and gawp at booth-babes...or just play hooky on the company tab) knows that the presentations are the hamburger patty, but everything else is the bun and toppings.  In other words, you don't just eat the patty.  (Not unless you have a lot of food allergies, I suppose.) 

Naturally, the networking is a big deal.  But so's the chance to pull the content of the presentation into your own context.  Normally that's done through Q&A.  But sometimes, as I discovered last evening, someone else's question is even more clarifying than your own.  Which is exactly what happened when someone (with a lot more business development experience than I currently have) followed up my question with, frankly, the one I should have asked in the first place.  I don't like using the phrase "refined my thinking" because I think it's usually a fig leaf for "why didn't I think of that?"  Mind you, that's actually what happened, but it triggered a whole new riff in my head.

That riff is the subject of the next post.  But I thought that this insight might encourage folks to click that EventBrite link the next time they're sitting on the fence about a learning opportunity.  Remember, the burger is more than the patty.  You're there for the burger.

Friday, October 16, 2015

A red, green, and blue silver lining

The Arduino platform was created more for design students than professional code-slingers.  Which is cool -- that's a noble rationale, actually.  But it comes back to bite you in one way.  See, the folks who adapted its (C++-ish) programming language from the Java-ish "Processing" language aren't really able to support one reality of the modern programmer's life.  And that's unit-testing.

Again, I'm not criticising, because as it turns out, it's sort of a net win.

But first some background/terminology for non-programmers.  "Unit testing" is the act of testing the smallest possible chunks of one's code.  In modern development environments, you tend to write a function hand-in-glove with its test.  The test defines what the expected outcome is for any given set of inputs.  Then you (or some more automated process) run the test against your actual code and verify that the function did what it was expected to do.
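
For the code-inclined, here's the flavour of the thing--a toy function and its test, written in plain C++.  (The function and its values are invented purely for illustration.)

    #include <cassert>

    // The "unit" under test:  convert a brightness percentage (0-100)
    // into the 0-255 range that an LED driver expects.
    int percentToDuty(int percent) {
        return (percent * 255) / 100;
    }

    // The test:  for known inputs, verify the expected outputs.
    int main() {
        assert(percentToDuty(0) == 0);
        assert(percentToDuty(100) == 255);
        assert(percentToDuty(50) == 127);  // integer division rounds down
        return 0;  // reaching this line means every assertion passed
    }

Run that and, if it exits quietly, the function keeps its promises.  If an assertion fails, the program aborts and tells you exactly which expectation was violated.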

(Aside:  Later during the application development process, there are other forms of testing, such as integration testing--did your feature break somebody else's function?--plus performance-testing, and vulnerability scans, etc.  For the purposes of this blog item, we're only focusing on unit-testing.)


In the Arduino world, which relies on the micro-controller taking input from physical sensors, or outputting to physical widgets, simulating that in software would be a boil-the-ocean endeavour.  That's mainly due to the ever-growing number of widgets that are compatible with the Arduino.  For a community-supported foundation running on the proverbial shoe-string, that's too much to ask.  And, in any case, I think I can safely say that the community would muuuuuuch prefer to see those limited resources devoted to the core platform.

So for a complex project that involves multiple widgets, one solution is to write your code as separate projects, test it in a more atomic (meaning indivisible) fashion, and roll the tested code into the master project.  For example, my "master" project is a lighting system for the Office Finches.  There are a number of elements that go into that:
  • Two sets of bright white LEDs that are controlled by an integrated circuit because the Arduino itself doesn't have enough "pins" (i.e., input/output ports) for the number of lights needed.
  • One triplet of red, blue, and green lights that can fade in and out.  With 256 possible brightnesses available for each LED, this comes out to over 16.7 million combinations of red, green, and blue.  The idea is to gradually fade these colours in and out to compensate for the lack of full-spectrum daylight.
  • A real-time clock to determine what time it is to turn the lights on in the morning and when to turn them off at night.  (The Arduino, having a fairly primitive microcontroller, does not have an internal clock in the sense that we know it.)
  • A passive infrared sensor that is activated (only in the dark) to determine whether one (or more) of the Office Finches has fallen off her/his perch.
  • One triplet of "soft" white LEDs to be used as an "emergency" night-light.  They are gradually raised if motion is detected by the infrared sensor over a few seconds, and gradually lowered when the finches have settled back in.
  • A photo-sensitive resistor to determine whether it's dark enough to warrant the emergency night-lighting.
As I write, I'm testing the red-green-blue lighting.  The "night-light" feature has been tested, as it was the nucleus of the whole setup.  Tomorrow will probably be devoted in part to deciphering some really old (in computer years) documentation for the real-time clock and setting that up for unit-testing (again, in a separate project).
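
To give a sense of scale, one of those "atomic" test projects is barely more than a screenful.  Here's a stripped-down version of the fade test for a single colour channel.  (The pin number is placeholder wiring--any PWM-capable pin will do.)

    // Stand-alone test project:  fade one channel of the RGB triplet
    // up and down on a PWM-capable pin.
    const int RED_PIN = 9;  // placeholder wiring

    void setup() {
      pinMode(RED_PIN, OUTPUT);
    }

    void loop() {
      // Ramp the brightness from 0 to 255, then back down again.
      for (int level = 0; level <= 255; level++) {
        analogWrite(RED_PIN, level);
        delay(10);
      }
      for (int level = 255; level >= 0; level--) {
        analogWrite(RED_PIN, level);
        delay(10);
      }
    }

Once that behaves on real hardware, the logic gets rolled into the master project and the test project goes on the shelf.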

Normally, having multiple copies of the same functionality is frowned upon in software development.  The reason:  you have to remember to deploy bug fixes or improvements across all those copies, which costs time for the coders as well as the testers.

But in this case, I rarely modify the "test" copies of the code once they're verified and rolled into the master project.  Which leaves me with bite-sized bits of single-function code that can be used as reference for (or imported lock-stock-and-barrel into) future projects.  As long as I name them descriptively and document the functionality (which of course I do--'cuz that's how we roll chez fivechimera), it's All Good.


In this case, what I'd normally consider a "primitive" lack of testing infrastructure turns out to be a net win.  It's another one of those things they don't teach you in Programmer School.  That's not the point, after all--their job is to mold you into a cog that can be plugged into the machinery...at least for the first year or three.  But eventually, you figure out when and where "the rules" can (productively) be broken.  That's one of those career inflection-points where you slough off another bit of your "programmer" skin to reveal the "software developer" underneath.

Tuesday, October 13, 2015

Sax and Violins


Okay, not really.  No violins, anyway.  But Dennis did follow through on his persistent whim of taking up the saxophone.  Last week he picked up a used one and just now is starting to get a feel for the reed and stops.  "What's the 'Stairway to Heaven' of the sax?" I wondered out loud as he was assembling the sax and clipping on the neck-strap.

Neither of us had a good answer.  Which at first surprised me--I mean, doesn't every instrument have its own Stairway?  For instance, the piano has "The Entertainer."  Drums have the solo from "In-A-Gadda-Da-Vida"; the harp has "The March of Brian Boru," and bagpipes "Scotland the Brave."  But nothing comes to mind (my mind, anyway) as the calling-card of the saxophone.  Except maybe the theme of The People's Court.  Or possibly a Kenny G pastiche.  But, then, it's not like I could carry a tune in a two-handled bucket, so what do I know?

But then it occurred to me that any number of non-musical skills have their own Stairway.  It represents an inflection-point on the learning curve--namely, the spot where the student can start feeling confident about being competent.  Unsurprisingly, a software developer is no different in that respect.  Particularly when the developer can expect to be in perpetual student mode, scrambling up at least one learning curve at any given time.

For instance, in mainframe-based systems (meaning text-only terminals--or, even more retro--green-bar paper), the Stairway app. was something like an accounting report crunched from flat files.  (Mercifully, I haven't had the tedium of [shudder] counting columns, specifying output formats, and fretting about data overflows since Bill Clinton and Jean Chrétien were in office.)

Likewise, back in the days of Visual Basic/C++, an angel got its wings when you deployed a multi-form app. that shuttled data back and forth to some sort of permanent storage.  (Alternatively, outside the Microsoft Universe, it likely involved Lotus Notes.)  As pure client-server topologies went out of fashion in favour of the web, so (mercifully) did this skill-set (most especially debugging your way through "DLL Hell").

Before 21st century content management systems, having a home-brew collection of HTML pages for photos of your pets, vacation photos, and a mouldering blog was the Stairway of web programming.  Or, if you were coding for a business, it was an online catalogue with a shopping-cart duct-taped on.  (Nowadays, in this age of WordPress, all bets are off...but if it doesn't involve HTML5 plus jQuery and/or AJAX, you might still be a n00b.)

On the server side, the Stairway once was any database-backed application.  That's still the bread-and-butter of many developers, although with the emphasis on REST, it has more of a roll-your-own-API kind of feel to it.  Double that if you're providing/fielding data that could be consumed or produced by a variety of devices.

For mobile development, it seems to be the venerable list app. with cloud storage, drag-and-drop functionality, and probably some cool icons for classification.  (Disclaimer:  I'm not beyond the "Hello, Android" phase m'self, so don't take my word for that.)

Database?  If you're using a standard relational database, you should know your way around a stored procedure (with input and output parameters) that punches at least two of the SELECT, INSERT, UPDATE, or DELETE buttons.  Bonus points for advanced use of aggregate functions, conditional sorting, or pagination of results.

And, finally, in the strange world of physical computing, things get real when you have to power at least one of your widgets with a supply that's not the Arduino / Raspberry Pi / Beaglebone / Etc.  Extra credit for using interrupts or making it talk to another device that's not the PC that programmed it.
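
(For the curious, the "interrupts" part looks something like the following on an Arduino--the pin choice and behaviour here are invented for illustration.  Instead of polling the sensor on every pass through the main loop, you ask the microcontroller to tap you on the shoulder when something happens.)

    // Minimal interrupt sketch:  set a flag when a sensor pulls pin 2 HIGH,
    // rather than polling for it.  (On an Uno, only pins 2 and 3 support
    // external interrupts.)
    const int SENSOR_PIN = 2;
    volatile bool motionSeen = false;  // "volatile" because the ISR shares it

    void onMotion() {
      motionSeen = true;  // keep interrupt handlers short and sweet
    }

    void setup() {
      pinMode(SENSOR_PIN, INPUT);
      attachInterrupt(digitalPinToInterrupt(SENSOR_PIN), onMotion, RISING);
      Serial.begin(9600);
    }

    void loop() {
      if (motionSeen) {
        motionSeen = false;
        Serial.println("Motion!");
      }
    }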

Obviously, there is a world full of other tunes to play with any instrument.  And now a solar system of things (given that humans sent a camera-spaceship out to Pluto and all) for coders to work at.  But just as musicians don't become (or stay!) musicians without practice, so it is with software development.   (Although, significantly, I have yet to see a title like Learn the Trombone in 24 Hours in the bookstore.  And, of course, there's no such thing as Autotune for programmers.  Grrrrrrr.)

My point is that when you buy an album from an artist or band, you're not actually paying for the individual notes or even, really, the songs.  You're paying for a bit of the inspiration that made them tackle those particular songs in the first place.   You're paying off the equipment.  You're paying for the collaboration of many talents.  You're paying those who mentored them, either directly or indirectly.  And, mostly, you're paying for countless hours of experimentation and error, for "Just one more try and we can call it a day" all-nighters, for head-banging frustration and moments of sheer hopelessness.  In a word, you're buying craftsmanship (and the commitment it requires).

And, although I'm not remotely musical, that sounds remarkably like software development. 

Wednesday, October 7, 2015

Debunking a dangerous meme

While I applaud the publicity for the efforts of a local-ish (to me) researcher to study the patterns of computer hackers, I always cringe when press articles focus on money and personal info.

No question that getting into your bank account is a valuable thing for a hacker.  Stealing your identity is theft one step removed:  The hacker (or thief who buys the info. from said hacker) aims to steal your identity to steal from others.

My point is that it doesn't end there. 

To wit:  You could be flat, busted broke, and have a negative credit score, and hackers will still be interested in you as long as you have a functioning computer connected to the internet.

Here are just a few ways you can still be victimised, even when you think that you are "safe" because you're not Warren Buffett:

Spam - the original flavour.  All the email addresses in your contacts list?  Those can be stolen and spammed.  I'm sure your Mom, your boss, and/or your BFF will all appreciate that...

Spam - now with new and improved Sleaze Factor(TM).   If the hacker (or hacker's client) isn't spamming your friends with dodgy V1@gr@ or Nigerian Prince come-ons, they're trying to trick your contacts into infecting their own computers.  (War story:  The one and only virus that happened on my 2-year SysAdmin gig happened because a normally vigilant someone expected an email with an attachment and double-clicked it.  Bottom line:  It can happen to anyone.  And I mean anyone.)

Spam - the social media version.  If you have a social media account, those passwords can be sniffed and stolen.  Love those sleazy DMs you sometimes get on Twitter or Facebook that are followed by an embarrassed apology from a friend who's just wrested back control over their account?  Yeah, me neither.  Want to be the one making those apologies to your aunt?  Me neither.

Borgifying your computer.  You may never think you have a fast enough CPU or half the RAM you could use.  But believe-you-me, you have more than enough power to (unwittingly) help someone mine Bitcoins.  And, holy moly, if you think your computer is slow now...

Borgifying your computer + bogarting your bandwidth.  Remember the distributed denial of service (DDoS) attack that nearly took down the XBox network last Christmas?  That dick move was brought to the world not only by hackers, but by hordes of infected computers (otherwise known as a "botnet.")  Also, remember all that spam we were just talking about?  Yeah, that's likely being pumped through compromised computers as well.  Did Netflix streaming just sputter out?  Oh, your ISP just billed you b/c you went over your monthly bandwidth ration?  Sucks to be you...not to mention everyone else on the receiving end of your computer's shenanigans.

So, it could be just me being cynical about the human race (see afore-mentioned SysAdmin stint), but the whole "I don't have anything to hack" meme is being used as an excuse not to keep computers patched.  And that, even a decade down the road from babysitting networks, just pisses me off.

As much as I despise the codified knee-jerk hysteria that masquerades as cybersecurity legislation, sometimes I wish that people could be legally barred from having admin. rights on their own computers after Computing While Lazy.  Because when our digital lives are eating so deeply into our meatspace time, responsibility comes with the power to instantaneously connect with people all over the planet. 

Tuesday, October 6, 2015

Playing in a different league

Author/Blogger Michael Lopp (http://randsinrepose.com/) claims that, on a semi-regular basis, he crafts a short-list--on a Post-it note--of people with whom he'd found a company.  Then he folds that Post-it into a small cube and swallows it.

That anecdote actually came up over dinner chez fivechimera last evening, with both Dennis & me brainstorming our respective lists.  How someone makes the list is actually a bit of a balancing act.  It's a matrix, really.  On one axis is the general skills fit--does this person have chops?  That's the easy part.  The other axis is the price of those chops in the coin of personality friction.  (How often am I going to butt heads with this person?  How many people are they going to drive away by being insufferably right at the wrong time in our trajectory?)

Yeah, I know that all non-geeks (plus a healthy percentage of the geeks I know) are nodding, reliving the pain of every socially inept interaction they've had with that person...maybe even those people.  Uh-huh:  The retentive completionist who passive-aggressively drags on the schedule until their pet feature is ready.  The Tamarian whose Rosetta Stone turns out to be Firefly quotes.  The SysAdmin who clearly missed their calling as a black market dealer.  The "idea hamster" who can't execute their way out of a proverbial wet paper bag.  Et cetera.

But unless you're relatively new in the world of work (or you're extremely unlucky), you know a few who make the cut.  And if you're not as lazy as I am, you keep in touch.  This morning's spike in my LinkedIn traffic was not a coincidence.  Which, to my chagrin, prompted a comparison between last night's brainstorming and fantasy sports leagues. 

Mind you, it's not an entirely frivolous exercise, because it inherently forces one to acknowledge one's limitations (in time and talent).  For example, here's my particular fantasy roster (names omitted):
  • Network/IT Support - Yes (with a second-string backup)
  • QA - Yes (also with a second-string)
  • Graphics and Design - Yes
  • UI/UX Developer (web and mobile) - No
  • Tech Support - Yes (with multiple levels of backup)
  • Project Management - Tentatively yes
  • Sales - Nope--and that's the 363.64-kg gorilla of my (hypothetical) staffing problems
But the uncomfortable fact remains that until I have a working prototype with which to pitch the team (to say nothing of potential clients and/or investors), my roster in the Fantasy Startup League is just that--fantasy.  It's not out there, sweating and bruising itself on the ice or the astroturf.  It's not even in the weight room or breaking down video of last week's key plays/misses.  Nope--it's sipping a beer in its armchair, shouting off-the-cuff opinions (which may or may not be founded in reality) at the screen.

The folks out there who are fighting fires, losing sleep, duct-taping, putting the "work" in "networking," rolling the bones--all the while listening for the pacing of the wolf outside the door?   No matter how small the company, they're playing in the big league.  And don't think that I don't understand the difference.

Friday, October 2, 2015

Machines and meatspace

In internet parlance, the term "meatspace" refers to the non-networked world.  You know, good, old-fashioned reality reality--that messy place where we have to look people in the eye and remember our manners as we talk to them.  Programmers, particularly those who do database/back-end coding, too often have to be reminded of its existence...at least after the business rules have been codified in the software design/specs.

Don't get me wrong--even programmers who don't write web pages or phone apps. still have to contend with real-world constraints.  For instance, on my last project, I ran up against the limits of a fairly slow processor on a cloud server.  

Now, cloud hosting companies, as a rule of thumb, keep their hardware in matched sets.  That keeps everything standardised.  It's not an OCD thing--the apples-to-apples parity means that they can scale servers up and down by adding & removing processors and blocks of memory almost on the fly.  Upgrades that would have taken days even fifteen years ago now can be done in a matter of minutes with mere seconds of downtime.  That's a Big Deal.

But for all the benefits of hosting in a virtual private server environment, it does come with a downside:  the hosting companies have a vested interest in commoditisation, which generally pushes them toward the low end of performance.  In most instances, throwing more hardware at the code is all you need.  After all, industrial-strength software such as the Apache web server or the MySQL database will spread out its requirements over multiple CPUs.

But woe betide the programmer crunching hundreds of thousands of records with a lowly script.  In my case, the PHP executable can only use one processor at a time, so the workaround is to run multiple copies of the script in parallel.  In which case, 1 + 1 does not necessarily equal two.  Simply because processing data in parallel means that separate processes are in constant danger of stepping on each other's metaphorical toes.  So some extra protections (lock files, record flags in the database) had to be spliced into the code as insurance that, say, the same record wouldn't be processed twice.
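
(My scripts were PHP, but the lock-file trick translates to pretty much any language.  Here's the gist, sketched in C++ with the error handling pared to the bone; the lock path is a placeholder.)

    // Gist of the lock-file guard:  only one process at a time gets past
    // flock(); any sibling process blocks here until the lock is released.
    #include <sys/file.h>  // flock()
    #include <fcntl.h>     // open()
    #include <unistd.h>    // close()
    #include <cstdio>

    int main() {
        int fd = open("/tmp/batch.lock", O_CREAT | O_RDWR, 0644);
        if (fd < 0) return 1;  // couldn't even create the lock file

        flock(fd, LOCK_EX);  // wait here for the exclusive lock
        // ...crunch the batch of records, safe in the knowledge that
        // no other copy of this program is doing the same...
        std::printf("Lock acquired; processing records.\n");
        flock(fd, LOCK_UN);  // let the next process take its turn

        close(fd);
        return 0;
    }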

But as I've side-stepped into physical computing, limitations are hammered home even more forcibly.  For instance, the Arduino and the Raspberry Pi only have a limited number of pins to work with.  Not to mention considerations like voltages (5 vs. 3.3 vs. ???) and number of milliamperes that can be squeezed through them.  And now, picking up 3D modeling (in the form of OpenSCAD), I even have to take pesky things like gravity seriously...at least for anything that's going to be printed in plastic.  (Which is everything--CGI isn't my bag.)  See, printing plastic into thin air doesn't work so well.  And that the tolerances aren't completely perfect.  And, most importantly, that saving my notes from the Geometry/Trigonometry classes that I took a decade ago was a really good idea.

I suppose that there's a silver lining, though.  As noted earlier, the vast majority of the code I've ever written is the below-the-waterline part of the iceberg.  So having to respect the limits of things besides database normalisation and Boolean logic and all that is good for you.  The same discipline applies, after all.  Code is written to be re-used.  Functions/modules are documented.  Input is validated (where possible; OpenSCAD doesn't provide exception-handling).  Code is checked into source control.  Alas, there's no such thing as unit-testing, but you can't have everything, I guess...