Wednesday, January 7, 2015

Third dimension of debugging

It's pretty much become an I/T cliché that software peeps blame the hardware, and hardware peeps blame the software, for any particularly nasty gremlins.

The phenomenon dates, more or less, from the PC boom of the 1980s and 1990s.  Not only were there more players in the hardware market, but the standards were somewhat more in flux.  (Anyone foolish enough to whine about accidentally buying a mini-USB cable to charge their micro-USB phone within earshot of any SysAdmin of a certain age can--and should--expect to be summarily wedgied.)

The standards issues leaked into software development as well.  For instance:  Even during my late-90s stint in Programmer School, I remember losing points on a test because I assumed the sizes of the C data types were those used by Borland's C-language compiler rather than Microsoft's.  (If that seemed like gibberish to you, don't sweat it; just trust me when I say that it was kind of a big deal Back In The Day(TM).)
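For the curious, here's roughly what that looks like in practice.  This is a minimal C sketch (compile it yourself; the whole point is that the answers vary by compiler and platform):

    #include <stdio.h>

    /* The C standard only promises *minimum* sizes for the basic
     * types.  A 16-bit DOS compiler (Borland's, say) typically
     * reported a 2-byte int, while 32-bit Microsoft compilers
     * reported 4 -- and sizeof(long) still differs between 64-bit
     * Windows and 64-bit Linux today. */
    int main(void)
    {
        printf("int:    %u bytes\n", (unsigned) sizeof(int));
        printf("long:   %u bytes\n", (unsigned) sizeof(long));
        printf("void *: %u bytes\n", (unsigned) sizeof(void *));
        return 0;
    }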

Then along came the (commercial) internet.  And then broadband internet.  Which, in addition to opening unparalleled opportunities for data exchange, opened unparalleled avenues for data theft, denial-of-service attacks, and the digital equivalent of graffiti.  Thus did using a PC on a network lose its dewy-eyed innocence. 

Enter the firewall.  In layperson's terms, it's a device that can be configured to allow or deny network traffic based on where it's coming from, where it's destined for, and which channels (a.k.a. "ports") it's using.  (Firewalls--most famously Windows Firewall--can be software-only and designed to protect a single computer, but that's another story for another time.)

As an analogy, think of an airport operating when zombies have taken over another part of the world.  Flights originating from entire continents might be turned away altogether.  Flights originating from "authorised" locations might be allowed to land, but are still restricted to certain gates.  Similarly, planes might not be allowed to fly to questionable destinations, or are only allowed to use certain terminals or runways.  Any overrides/exceptions to those rules, of course, are potential weak points in the system that heighten the risk of zombies taking over the entire world.
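For the programmers in the audience, the core of that gatekeeping is surprisingly small.  Here's a deliberately over-simplified sketch in C--names and fields invented for illustration; real firewalls add protocols, connection state, logging, and a hundred other wrinkles--of the first-match-wins rule evaluation that packet filters typically use:

    #include <stdbool.h>
    #include <stddef.h>
    #include <stdint.h>

    /* Hypothetical, stripped-down rule: match on source network and
     * destination port, then allow or deny. */
    typedef struct {
        uint32_t src_net;   /* network address (host byte order) */
        uint32_t src_mask;  /* netmask for that network          */
        uint16_t dst_port;  /* 0 = any port                      */
        bool     allow;
    } rule_t;

    /* First matching rule wins; if nothing matches, the default is
     * deny -- i.e., the flight gets turned away. */
    bool permit(const rule_t *rules, size_t n,
                uint32_t src_addr, uint16_t dst_port)
    {
        for (size_t i = 0; i < n; i++) {
            bool net_ok  = (src_addr & rules[i].src_mask)
                               == rules[i].src_net;
            bool port_ok = rules[i].dst_port == 0
                               || rules[i].dst_port == dst_port;
            if (net_ok && port_ok)
                return rules[i].allow;
        }
        return false;
    }

And those overrides/exceptions I mentioned?  They're just extra "allow" rules wedged in ahead of the strict ones--which is precisely why every one of them widens the attack surface.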

Then, too, security for the computers inside the same network has evolved, mainly to minimise the damage a rogue operator can do.  Folders and whole file systems, servers, databases, what-have-you can require separate passwords, or only allow you to do certain things (such as read but not update).  This is analogous to how airports are segmented into terminals, have restricted areas, etc.  It even, to some extent, mimics the redundancies in security one experiences with a connecting flight on an international trip.  Usually, that's a more seamless process than the idiocies of post-9/11 Security Theatre.
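And on the "only allow you to do certain things" front, the operating system will happily tell a program exactly which things those are.  A tiny, POSIX-flavoured C illustration (with a made-up path, naturally):

    #include <stdio.h>
    #include <unistd.h>

    int main(void)
    {
        /* Hypothetical path -- substitute a real one from your server. */
        const char *path = "/var/reports/summary.txt";

        /* access() asks whether the current user may read or write
         * the file: the "read but not update" distinction in its
         * rawest form. */
        printf("read:  %s\n", access(path, R_OK) == 0 ? "allowed" : "denied");
        printf("write: %s\n", access(path, W_OK) == 0 ? "allowed" : "denied");
        return 0;
    }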

Usually.

This week has not been one of those times.  Thus do I find myself doing the legwork for a junior support tech at a shared hosting provider's Help Desk.  (Grrrrrr.)   And realising anew how much more I'm going to have to stay on top of the networking & security aspects of maintaining a server than I have in years past.  It's not so much that networking/security technology has taken a quantum leap and I'm playing catch-up.  No, it's mainly a human issue.  To wit:  When I can no longer trust the word of the wizards behind the consoles, it adds another dimension to debugging.  Debugging, I might add, that sometimes has to be done under fire in production.

These days (responsive web design and the usual Android/iOS tug-o-war excepted), hardware ranks relatively low on the list of usual suspects when a gremlin pops up.  Network and database permissions, on the other hand, have shot up that list to a near-permanent slot in second place.  Suspect #1, naturally, is always my own code.   But when I haven't touched said code since early Dec., and scheduled jobs suddenly stop working around New Year's Eve  (holidays being a favourite time for techs. to sneak in updates)?   Ding!Ding!Ding!Ding!Ding!Ding!  Ladies and gentlemen, I believe we have a new winner.

All of which sounds like whining on my part, but I'm really not.  (Pinky-swear!)  I'm merely hoping that this facet of a custom software developer's trade helps to explain why it's not a trivial activity.  And why we don't do it for free.  (Note:  Code written to scratch our own itches before we let other people make copies of it is not the same thing as "free.")  And why we don't appreciate hearing, "This should be easy for you."  And also why we are so darned insistent that our clients actually try apps. out for themselves before turning them loose on the world.  Because "It works on my network" is the new version of the dreaded "It works on my machine."